Mike Caulfield’s post made me realize I’ve done a pretty poor job of publicizing what we’re trying to do lately in good old HCPS. So here’s my attempt to put this out there for people to spot holes, misdirection, etc.
Setting Best Practice
We’re turning to video for best practice more and more. We’re doing that for individual subjects, certain pedagogical techniques, major tools like IWBs (no luck so far), and for other things you might not expect, like administrator training.
We have a whole series dedicated to illustrating the art of the post-observation interview for our administrators. It’s a three-step process, so for some of them we now have the planning conference video, the class itself, and then the post-observation wrap-up interview. That allows administrators to see the steps of the process, put themselves in the classroom to gather data, and then watch how that administrator wraps things up. This video series supports a larger web-based structure that has different observational tools, process maps, etc. We tend to use the videos when the administrators are brought together. They’ll watch a video, break it down in small groups, and then be directed back to the main site for the tools/process maps/documentation.
So that’s the relatively simple part.
Here’s where we’re attempting to merge a lot of things into a flowing, loosely linked communal animal.
What we’ve had in the past are courses usually based on applications, like this one I did on Google Earth many moons ago. There’s some attempt there to provide content-based examples, but there are lots of failure points. In some ways the discussion of the pedagogy gets in the way of how to use the application, and vice versa. Additionally, there’s no way to communicate, rate, or contribute anything back. People could find errors, or crazy ways to make things far better, and I’d never know, nor would anyone else. This is a dead site, and there are many like it scattered across the HCPS intranets. They have additional problems, like being hard to find, being rarely updated, and often ending up off-kilter in terms of best practice (both demonstrated and as examples).
So here are our steps to getting this right.
Step one – Set best practice district-wide.
Framework – We’re doing this through the TIPC, a condensed version of the P21 and ISTE standards mixed with some student-centered and constructivist beliefs. This provides a pretty decent tool for self-reflection and is extensible enough to be used to develop walk-through tools.
Best Practice Examples – We’re taking this on in two ways.
Video – We’re videotaping exceptional teachers and lessons and making them available in a variety of formats, with and without annotations, to illustrate key points. We’ll also be working on combining the video with the resources from the lesson when possible.
Content – We also wanted to reward and publicize the teachers who were creating exemplary lessons according to the TIPC. That led to the Henrico 21 awards. There will be 15 winners from K-12 who will be recognized publicly and given a cash prize. Their winning entries will be published and shared according to some of the ideas I laid out here.
We had about 600 submissions to this contest, with about 7 GB of student artifacts as well, and that doesn’t include a lot that was on linked web pages. We’re in the judging stages right now. It’s been very interesting to think about these submissions as a way to snapshot where we might be as a district in terms of understanding and implementing the TIPC.
How Do I? – We have created modules for each category of the TIPC (they still need a lot of work, but here’s one). These modules focus on why these skills are important, stress the conceptual understanding, and have some minor examples of how they might be used, along with some basic tutorials.
Paralleling these modules, we have tool-based databases that tie each tool to the concepts and to tutorials on how to use the tools themselves. The databases also need some serious work.
So clearly there’s a lot of work still to be done and I haven’t even gotten to the interweaving portion yet.
Linking It Together
Conceptually, I don’t think the majority of K-12 teachers browse for information on increasing creativity in the classroom. They’re more likely to be looking for the things the state says are important, like SOLs. So we put up our best practice examples with SOL tags. Then the goal is to look at those lessons and anticipate the support a teacher attempting them will likely need. That’s where we link in the conceptual frameworks and the tool tutorials. We also want the reverse happening: if a teacher is participating in a course on creativity, we want that module to link in the best practice lessons and have the tutorials available as well. Cross-pollination is the goal, both because it’s more likely to get teachers involved and doing things right and because we have too many pieces of content living on lonely islands.
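If it helps to picture that linking in miniature, here’s a rough sketch in Python. The lesson names, SOL tags, and file paths are entirely made up, and this isn’t how our site is actually built; it just shows the two lookups I’m describing: a lesson tagged with SOLs pulling in its concept modules and tool tutorials, and the reverse lookup from a module back to exemplar lessons.

```python
# A rough sketch of the cross-linking idea, not the actual HCPS site.
# All lesson names, tags, and file paths below are invented for illustration.

from dataclasses import dataclass


@dataclass
class Lesson:
    title: str
    sol_tags: list        # state standards teachers actually search for
    tipc_categories: list  # e.g. "research", "collaboration"
    tools: list            # e.g. "Google Earth"


# Hypothetical content inventory.
lessons = [
    Lesson("Mapping Trade Routes", ["WHII.4"], ["research"], ["Google Earth"]),
    Lesson("Collaborative Lab Reports", ["BIO.1"], ["collaboration"], ["Google Docs"]),
]
concept_modules = {"research": "modules/research.html",
                   "collaboration": "modules/collaboration.html"}
tool_tutorials = {"Google Earth": "tutorials/google-earth.html",
                  "Google Docs": "tutorials/google-docs.html"}


def supports_for(lesson):
    """Everything a teacher attempting this lesson would likely need."""
    return {
        "concept_modules": [concept_modules[c] for c in lesson.tipc_categories],
        "tool_tutorials": [tool_tutorials[t] for t in lesson.tools],
    }


def lessons_for_module(category):
    """The reverse direction: a concept module surfaces exemplar lessons."""
    return [l.title for l in lessons if category in l.tipc_categories]


print(supports_for(lessons[0]))             # lesson -> framework modules + tool tutorials
print(lessons_for_module("collaboration"))  # module -> exemplar lessons
```

In the real thing the linking would live in the site’s database rather than in code, but those two lookups are the whole idea: lesson out to its supports, and module back to the lessons that exemplify it.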
The other aspect here is an attempt to create a community talking about these lessons and teaching in general. There will be commenting and rating for the elements. More importantly, I’m going to make a concerted effort to get people commenting. I feel this is key and can be an amazingly powerful aspect of a site like this. I also have no illusions about how difficult that is going to be.
Checking for Change
So we need to see if doing all this stuff is making a difference, and that is where the Reflective Friends process comes in. Despite the awful name, it’s a pretty intense and powerful process. We did this for three high schools this year and will be spreading it to seven or so next year. Basically –
General data collection about the learning environment and how well aligned it is with the T-PAC model and 21st Century classrooms.
In this option, data collection tools will be used that look for the presence of the following indicators:
* Student centered learning activities
* Critical thinking and problem-solving embedded in the learning activities
* Opportunities for communication and collaboration in the learning activities
* Learning activities that foster student creativity and innovation
* Opportunities for students to find, evaluate, organize and synthesize information
* Use of technology as a tool for meeting all of the above indicators

These data will help school faculty and staff determine their baseline level on each indicator and identify areas for future professional development and training.
Data are gathered through classroom observations and teacher/student interviews, then compiled and shared with the administration. We then meet to analyze the data and discuss next steps. It’s really an interesting and intense process.
The results of the Henrico 21 submissions next year will also help us gauge where we are as a district.
I think it’s time to stop now. I’m not sure how clear I was, but it feels good to get a chunk of this out of my head. It’s a lot to do and think about, that’s for sure.
Thanks so much for this. There’s a lot of this that would just be impossible in HE, but much of it was interesting. In particular, the high number of submissions for the contest — was that related to the money, or was the money a distraction? And were the submissions generally quality?
There’s not much I wouldn’t do for that level of participation. Painful things. Borderline illegal things. How on earth did you get that response? And based on your knowledge of HE, could the same levers work here?
I also love this comment of yours — that the submissions were, if nothing else, a snapshot of where you were — BRILLIANT! So the varying quality of them is a good outcome, and level of participation is more important than percent that are quality. This is a really key insight to me, that a contest could serve both a benchmarking goal and a best practice initiative.
(Incidentally, are you aware of the “good practice” terminology? It’s kind of a geeky distinction, and one which I usually punt on foregrounding, but many theorists believe what we need in Ed. is “good practice”, not best practice, since the classroom situation always has this emergent element. Not usually terminology I pull out, but something I try to keep in mind.)