Monthly Archives: April 2010

Step Up Your Vocab: The Musical


cc licensed flickr photo shared by bionicteaching
This is pretty simple and likely to be a lot of fun. It probably fits best in an English classroom.1

I’m not sure how I’d start this . . . but I think I’d go this route. I’d show the kids a bunch of article headlines and quotes complaining about the deterioration of today’s society and how today’s music sucks. This is really just to get them riled up and interested in proving they’re not the brain-dead people being described.

The kids pick their favorite song and go find the lyrics.

Then you have the kids run the lyrics through something like this site, which calculates reading levels. This one isn’t great for the purpose but it’ll do for this demonstration. We just want some sort of number that quantifies the sophistication of the lyrics.

The challenge for the kids is to push the reading level as high as possible while maintaining the spirit of the song and its rhyme scheme (if any).

So they have to figure out what makes the reading level go up or down and then apply what they learn. They’ll be working with vocabulary, sentence structure, etc. The students will have to think about the essence of the song and struggle with the subtleties of word choice.
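To make that concrete: the linked site doesn’t publish its algorithm, so as an assumption this rough sketch uses the standard Flesch-Kincaid grade level formula (0.39 × words per sentence + 11.8 × syllables per word − 15.59) with a crude vowel-group syllable counter. The numbers won’t match the site’s, but the levers are the same ones the kids would be pulling: longer sentences and longer words push the grade up.

```python
import re

def flesch_kincaid_grade(text):
    """Approximate Flesch-Kincaid grade level for a chunk of lyrics.

    Syllables are estimated by counting vowel groups, so treat the result
    as a ballpark number, not the score the linked site would report.
    """
    sentences = [s for s in re.split(r"[.!?\n]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    if not sentences or not words:
        return 0.0
    syllables = sum(max(1, len(re.findall(r"[aeiouy]+", w.lower()))) for w in words)
    # 0.39 * (words per sentence) + 11.8 * (syllables per word) - 15.59
    return 0.39 * (len(words) / len(sentences)) + 11.8 * (syllables / len(words)) - 15.59

# Swapping in longer words pushes the grade up; shorter ones pull it down.
print(round(flesch_kincaid_grade("I want it that way"), 1))
print(round(flesch_kincaid_grade("I desire that particular circumstance"), 1))
```

Pulling a formula like this apart is exactly the footnote exercise below: which pieces of the score can the kids actually manipulate, and what do those pieces really measure?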


1 Although breaking down the pieces of the “reading level” algorithm as an exercise in logical thinking would be interesting in science or maybe math.


Jason Tester Presentation


cc licensed flickr photo shared by bionicteaching

I saw a presentation today delivered by Jason Tester (Institute for the Future). It was one of the more interesting presentations I’ve seen in a while.

The whole presentation was framed around superhero skills for the immediate future. The names are a little silly, but he had some fresh websites and I think the ideas are solid. Tester is not an education expert, so he didn’t try to make tight ties to how these skills play out in schools.

  • mobbability – work in large groups while maintaining a role; organize and collaborate with many people simultaneously
  • influency – the ability to be persuasive in multiple social contexts
  • ping quotient – a measure of your responsiveness to other people’s requests for engagement and your propensity/ability to reach out to others via a network
  • protovation – fearless innovation in rapid, iterative cycles
  • open authorship – creating content for public consumption and modification
  • emergensight – the ability to prepare for and handle surprising results and complexity (or create the right conditions for them)
  • signal/noise management – filtering meaningful patterns and commonalities from the stream of information coming into our lives
  • cooperation radar – the ability to sense, almost intuitively, who would make the best collaborators on a particular task
  • longbroading – thinking in terms of higher-level systems, cycles, the big picture


One of the things that struck me was an art project based around emotion maps (see image above). I’d seen this before on Boing Boing or somewhere, but today it sparked an idea. It may be because I’ve been dealing with trying to figure out how to gather data to assess our 1:1 initiative.1 The basic idea is that the artist built a GPS device that tracked people’s emotional states via some sort of skin temperature reading, so it picked up emotional highs and lows. The individual could also text in information.

Here’s what I thought would be interesting for a school. Think of the map above but done in a school. There are a variety of ways it could work even without the thermal readings and GPS data. Even if you did the data collection with a simple number system, you could map the people to the time codes and then to places, or do it through an online database that skipped a step or two of that process. There are definitely ways to work it out.

Think of what you could do with an emotional map of your school and/or class. Overlay that data with location and various teachers/subjects/standards and you’d be able to have some really interesting conversations.
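To make the low-tech version a little more concrete, here’s a rough sketch of what the “simple number system” might look like in code. Everything in it is hypothetical – the 1-5 mood ratings, the bell schedule, the room names – but it shows the core move: join self-reported readings to places and subjects by time code, then aggregate.

```python
from dataclasses import dataclass
from datetime import time
from collections import defaultdict
from statistics import mean

# Hypothetical record shapes for the low-tech version: students report a
# 1-5 mood rating plus the time; the bell schedule supplies the place and
# subject for that time code.

@dataclass
class MoodReading:
    student: str
    at: time
    rating: int  # 1 = low, 5 = high

@dataclass
class Period:
    start: time
    end: time
    location: str
    subject: str

schedule = [
    Period(time(8, 0), time(8, 55), "Room 114", "English"),
    Period(time(9, 0), time(9, 55), "Room 201", "Algebra"),
]

readings = [
    MoodReading("s01", time(8, 20), 4),
    MoodReading("s02", time(8, 40), 2),
    MoodReading("s01", time(9, 15), 1),
]

# Join readings to places/subjects by time, then average per place.
by_place = defaultdict(list)
for r in readings:
    for p in schedule:
        if p.start <= r.at <= p.end:
            by_place[(p.location, p.subject)].append(r.rating)

for key, ratings in by_place.items():
    print(key, round(mean(ratings), 2))
```

An online form that stamps the time automatically would skip the time-code matching step entirely, which is the “skipped a step or two” version mentioned above.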

It’s late and I’ll try to make this more exact in a forthcoming post on the ultimate data dashboard for educators. There’s also a fair chance I’ll post more on Tester’s presentation.


1 Long, involved post on that and data dashboards in the future. It’s currently molding with a couple hundred other drafts.

If you’re going to do this 21st century thing . . .

Here’s my advice.

Get your leadership on board with the same vision of what this looks like. That doesn’t mean “Yeah, P21 sounds good” or “I like ISTE’s version.” If your administration is going to push this as something that needs to be done, they actually have to know what they’re talking about in detail. General comments about doing things 21st-century style won’t cut it. The vision has to come from the top and it has to be focused on your county/school. Even if you could just apply some framework out of the box, you’d lose quite a lot of value.

So now you’ve got your vision.

You’re going to need some way to assess where you are now; otherwise there’s no way to see if you’re getting better or where you need to focus specifically. If you really think about it, you can make a tool that will perform a variety of functions with only minor alterations. This is a good idea for all kinds of reasons – for instance, you won’t have to spend enormous amounts of time making brand new tools, and the commonalities will make the data more comparable and consistent. Think about observations, quick walk-throughs . . .

Now you’re going to need to norm the observation tool. If you don’t, you might as well not make one at all. Each member of the leadership group has to be able to look at a variety of lessons individually and agree on the key components that make a lesson 21st century, as well as how those components rate using the tool. This is actually a really powerful and useful conversation. It’s also a great way to test your tool before releasing it into the wild. I found it useful to hand-pick lessons that illustrate issues that came up during the design of the tool. Have some best practice to evaluate, but make sure you include some examples that are contentious and that people will argue about. You might not have a lot of video lying around for people to analyze, so you might browse . . .
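One way to tell whether the norming is actually working is to put a number on how often raters land on the same score. Here’s a minimal sketch, assuming each leadership member rates the same hand-picked lessons on one component of the tool; the rater names and scores are made up. Low agreement on a contentious lesson is exactly the conversation you want to have before the tool goes live.

```python
from itertools import combinations

# Hypothetical scores: rater -> lesson -> rating on one component of the
# tool (say, a 1-4 scale). Real norming would cover every component.
scores = {
    "rater_a": {"lesson1": 3, "lesson2": 2, "lesson3": 4},
    "rater_b": {"lesson1": 3, "lesson2": 3, "lesson3": 4},
    "rater_c": {"lesson1": 2, "lesson2": 3, "lesson3": 4},
}

def exact_agreement(a, b):
    """Share of lessons two raters scored identically."""
    lessons = scores[a].keys() & scores[b].keys()
    hits = sum(scores[a][les] == scores[b][les] for les in lessons)
    return hits / len(lessons)

for a, b in combinations(scores, 2):
    print(a, b, round(exact_agreement(a, b), 2))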

You’re also going to want to think hard about how you’re going to deliver this to your staff. You probably want a little sell on this – not a “Did You Know?” Try to find, or make, something better and more focused on why teachers would want to change. I’d recommend doing a similar norming exercise. Get people talking and discussing what this is, how the tool works, etc.

A few questions that came up for us.

  • Is technology a necessary component?1
  • What is the difference between innovation and creativity?2
  • How do you assess creativity/innovation?3

There’s a lot more to do but that’s not a bad start. Things get interesting when you’re assessing the data, using it to determine professional development, and then publicizing best practice.


1 It seems strange, but you’ll see great lessons that involve all the thinking but don’t use any technology. Does that matter to your group? Maybe it doesn’t, but it’s best to figure this stuff out.

2 We had a rough time even defining innovation.

3 Is it relative to the individual? Is it judged against the class average? I still don’t know how you look at this except by individuals. Not that I really care, because I don’t think much of the grading aspect of things, but grading always comes up.

Best Practice, Shared Resources, Tools and Community

Mike Caulfield’s post made me realize I’ve done a pretty poor job of publicizing what we’re trying to do lately in good old HCPS.1 So here’s my attempt to put this out there for people to spot holes, misdirection, etc.

Setting Best Practice

We’re turning to video for best practice more and more. We’re doing that for individual subjects, certain pedagogical techniques, major tools like IWBs (no luck so far), and other things you might not expect, like administrator training.

We have a whole series dedicated to illustrating the art of the post-observation interview for our administrators. It’s a three-step process, so for some of them we now have the planning conference video, the class itself, and the post-observation wrap-up interview. That allows administrators to see the steps of the process, put themselves in the classroom to gather data, and then watch how that administrator wraps things up. This video series supports a larger web-based structure that has different observational tools, process maps, etc. We tend to use the videos when the administrators are brought together. They’ll watch a video, break it down in small groups, and then be directed back to the main site for the tools/process maps/documentation.

So that’s the relatively simple part.

Here’s where we’re attempting to merge a lot of things into a flowing, loosely linked communal animal.

What we’ve had in the past are courses usually based on applications, like this one I did on Google Earth many moons ago.2 There’s some attempt there to provide content-based examples, but there are lots of failure points. In some ways the discussion of the pedagogy gets in the way of how to use the application, and vice versa. Additionally, there’s no way to communicate, rate, or contribute anything back. People could find errors or crazy ways to make things far better and I’d never know, nor would anyone else. This is a dead site, and there are many like it scattered across the HCPS intranets. They have additional problems, like being hard to find, being rarely updated, and often ending up off-kilter in terms of best practice (both demonstrated and as examples).

So here are our steps to getting this right.

Step one – Set best practice district wide.

Framework – We’re doing this through the TIPC. It’s a condensed version of the P21 and ISTE standards mixed with some student-centered and constructivist beliefs. This provides a pretty decent tool for self-reflection and is extensible enough to be used to develop walk-through tools.

Best Practice Examples – We’re taking this on in two ways.
Video – We’re videotaping exceptional teachers and lessons and making them available in a variety of formats, with and without annotations to illustrate key points. We’ll also be working on combining the video with the resources from the lesson when possible.

Content – We also wanted to reward and publicize the teachers who were creating exemplary lessons according to the TIPC. That led to the Henrico 21 awards. There will be 15 winners from K-12 who will be recognized publicly and given a cash prize.3 Their winning entries will be published and shared according to some of the ideas I laid out here.

We had about 600 submissions to this contest, with about 7 GB of student artifacts as well – and that doesn’t include a lot that was on linked web pages. We’re in the judging stages right now. It’s been very interesting to think about these submissions as a way to snapshot where we might be as a district in terms of understanding and implementing the TIPC.

How Do I? – We have created modules for each category of the TIPC (they still need a lot of work but here’s one). These modules focus on why these skills are important, stress the conceptual understanding, and have some minor examples of how they might be used along with some basic tutorials.

Paralleling these modules, we have tool-based databases that tie each tool to the concepts and to tutorials on how to use the tool itself. The databases also need some serious work.

So clearly there’s a lot of work still to be done and I haven’t even gotten to the interweaving portion yet.

Linking It Together – Conceptually, I don’t think the majority of K-12 teachers browse for information on increasing creativity in the classroom. They’re more likely to be looking for the things the state says are important, like SOLs. So we put up our best practice examples with SOL tags. Then the goal is to look at those lessons and see what the likely support needs will be for a teacher attempting them. That’s where we link in the conceptual frameworks and the tool tutorials. We also want the reverse happening: if a teacher is participating in a course on creativity, we want that module to link in the best practice lessons and have the tutorials available as well. Cross-pollination is the goal, both because it’s more likely to get teachers involved and doing things right and because we have too many pieces of content living on lonely islands.
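To show the cross-linking idea a little more concretely, here’s a rough sketch of the kind of tagging that would let a teacher search by SOL and get handed the supporting modules and tutorials along with the lesson. The lesson records, category names, and URLs below are invented placeholders, not our actual database.

```python
# Invented lesson records showing the cross-linking idea: each lesson is
# tagged with SOLs, TIPC categories, and the tools it uses, so a teacher
# searching by SOL also gets the matching modules and tool tutorials.

lessons = [
    {"title": "Civil War podcast project",
     "sols": ["USI.9"],
     "tipc": ["communication", "creativity"],
     "tools": ["Audacity"]},
    {"title": "Ecosystem mapping mashup",
     "sols": ["LS.11"],
     "tipc": ["research", "critical thinking"],
     "tools": ["Google Earth"]},
]

tool_tutorials = {  # placeholder URLs
    "Audacity": "https://example.org/tutorials/audacity",
    "Google Earth": "https://example.org/tutorials/google-earth",
}

tipc_modules = {  # placeholder URLs
    "creativity": "https://example.org/modules/creativity",
}

def lookup_by_sol(sol):
    """Return lessons tagged with a given SOL plus their linked support content."""
    results = []
    for lesson in lessons:
        if sol in lesson["sols"]:
            results.append({
                "lesson": lesson["title"],
                "tutorials": [tool_tutorials[t] for t in lesson["tools"] if t in tool_tutorials],
                "modules": [tipc_modules[c] for c in lesson["tipc"] if c in tipc_modules],
            })
    return results

print(lookup_by_sol("USI.9"))
```

The same tags work in the other direction: a creativity module can query for every lesson carrying its TIPC category, which is the reverse linking described above.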

The other aspect here is an attempt to create a community talking about these lessons and teaching in general. There will be commenting and rating for the elements. More importantly, I’m going to make a concerted effort to get people commenting. I feel this is key and can be an amazingly powerful aspect of a site like this. I also have no illusions about how difficult that is going to be.

Checking for Change – So we need to see if doing all this stuff is making a difference, and that is where the Reflective Friends process comes in. Despite the awful name,4 it’s a pretty intense and powerful process. We did this for 3 high schools this year and will be spreading it to 7 or so next year. Basically –

General data collection about the learning environment and how well aligned it is with the T-PAC model and 21st Century classrooms.

In this option, data collection tools will be used that look for the presence of the following indicators:

* Student centered learning activities
* Critical thinking and problem-solving embedded in the learning activities
* Opportunities for communication and collaboration in the learning activities
* Learning activities that foster student creativity and innovation
* Opportunities for students to find, evaluate, organize and synthesize information
* Use of technology as a tool for meeting all of the above indicators

These data will help school faculty and staff determine their baseline level on each indicator and identify areas for future professional development and training.

Data are gathered through classroom observations and teacher/student interviews, then compiled and shared with the administration. We meet to analyze the data and discuss next steps. It’s a really interesting and intense process.
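For what a “baseline level” on each indicator might look like in practice, here’s a minimal sketch: treat each walk-through as a checklist of which indicators were observed, then report the percentage of observed classes showing each one. The observation data below is invented, and the real Reflective Friends tools are more involved than a simple checklist.

```python
from collections import Counter

# Invented walk-through data: each observation is the set of indicators
# the observer saw evidence of in that class.
indicators = [
    "student centered",
    "critical thinking/problem solving",
    "communication/collaboration",
    "creativity/innovation",
    "information fluency",
    "technology as a tool",
]

observations = [
    {"student centered", "communication/collaboration"},
    {"critical thinking/problem solving", "technology as a tool"},
    {"student centered", "critical thinking/problem solving", "creativity/innovation"},
]

seen = Counter(i for obs in observations for i in obs)
for indicator in indicators:
    pct = 100 * seen[indicator] / len(observations)
    print(f"{indicator}: {pct:.0f}% of observed classes")
```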

The results of the Henrico 21 submissions next year will also help us gauge where we are as a district.

I think it’s time to stop now. I’m not sure how clear I was, but it feels good to get a chunk of this out of my head. It’s a lot to do and think about, that’s for sure.


1 When I have to leave Jim Groom-length comments, there’s a problem.

2 Yes, it makes me cry too.

3 Yeah, I’m not sure money is the right path here either.

4 Critical Friends was seen as too intimidating. Seriously.

BattleDecks (Queensland Rules)


cc licensed flickr photo shared by bionicteaching

Way back when, I really, really wanted to do something with this whole BattleDecks idea. Two years or so later, I’ve finally stepped up to the plate.

These were the rules my students got when they came in.

Your group will be assigned to one of the following topics.

1. Internet safety
2. 21st century skills
3. Technology integration
4. The future of education

Rules

* Your presentation will last between 2 and 3 minutes
* You can only choose from these 20 pictures1
* You must use 10 of the provided images
* You can add no more than 10 written words in the entire slide show2

You have 30 minutes to prepare.

So it’s not exactly like the BattleDecks setup, but it’s close enough, and I think the rule changes allow for more processing and require a little less improv/comedy talent. The students had a lot of fun and the presentations were pretty decent. I think they’d improve the more we did this sort of thing.

My plan for next semester is to do this weekly or every two weeks as a review of the previous class. I’m kicking around a few additional ideas to keep it interesting.

  • having the students pick the images for the next group
  • giving fewer options on the photos
  • adding a required phrase or series of words
  • allowing for one wild card image

I like this activity for all sorts of reasons. It allows for a lot of processing and creativity, and it gets people thinking about presenting in a different way. It also works for all sorts of content.


1 The images were chosen fairly randomly from a selection I had taken that happened to be on my computer. I tried to pick images that were interesting and open to multiple interpretations.

2 I need to come up with a better way to say this. I had to explain it multiple times.