I’ve finally wrapped up a few bigger tasks in the last few weeks (except for that whole Innovate 2013 conference planning, support for the HS 1:1 roll out, and putting together an accreditation plan), so I’ve been able to make it to classrooms more often. This isn’t about evaluating your work; it’s about evaluating mine. We’re devoting more time to professional learning structures, and they are only as successful as the impact they have on student learning.
You’ve set goals with your administrators that anchor their observations, so I’m using a short, informal structure: a three-minute walk-through focused only on the learning environment as it aligns to our Professional Growth and Supervision Plan principles. I’m also tracking the integrity of our documentation of student learning in Rubicon Atlas.
In three minutes, I’m building a school-wide body of evidence based on the following questions implicit in our teacher evaluation rubric:
- Do students know what they will (should) learn from engaging in the task?
  - Why I look for this: How can we empower kids to advocate for their learning if they don’t know the opportunities or the criteria for success?
- Is there accountability to high expectations of behavior and engagement in learning?
  - Why I look for this: The one doing the talking, the writing, the modeling, the problem-solving, the lab (etc.) is the one doing the learning. How can kids construct meaning unless they dig into the work and grapple with ideas?
- What is the level of cognitive demand?
  - Why I look for this: How are we treating kids as thinkers? Are they engaging in tasks that require critical thought and the “higher levels” of Bloom’s taxonomy, or do we expect only lower-order thinking skills, such as simply collecting or recalling information?
- Is what is happening in the classroom aligned to what we say is happening in the classroom?
  - Why I look for this: Can we track the story of a cohort’s learning? Without the story, we cannot assess where (and why) there are strengths and challenges with understanding, determine what to replicate or replace, or hold ourselves and students accountable so we can continue to build a cohesive learning experience.
How do I use this data? Data never provides an answer; its strength resides in our ability to ask better questions. I think this can inform our work in many ways. First, it helps me target who needs support, and it helps principals deepen dialogue with all teachers in their division. Most importantly, I hope it pushes our reflection as a community. For example, how do we support kids in building skills of collaboration when our most prevalent classroom configuration is whole-class instruction? How do we celebrate the strides we’ve made in documenting the learning we put before students?
I recognize that three minutes may not capture the powerful learning that happens within a full block. This is about patterns and follow-up. This is about how I can serve Graded better.
A two-week snapshot of 22 classrooms