Category Archives: Examples

AKA "draw with pen"

Keynote “Flowing Ribbon” Tutorial

This is just one of those weird little things that might come in handy for someone someday, plus I’m always happy to see software do things that are just a bit off standard.

Part of our online summer courses is creating course trailers. One of the instructors wanted to portray the stages in a person’s life as connected by a moving ribbon that links different representational photographs together. A cool idea and one that I wanted to support. Given we’re dealing with a large number of people, the goal was to do something that was quick and relatively easy.

I may yet choose another piece of1 software but I managed to do the example above in Keynote (Apple’s version of PowerPoint). I’m pretty sure you could do it in PPT as well.

AKA "draw with pen"

AKA “draw with pen”


Step one is to put the image in and draw some lines with the vector tool. It’ll be easiest if you end the ribbon in one of the main directions of movement (up, down, left, right). In this case I chose down.

The vector drawing tool in Keynote is quite different from Illustrator, Photoshop, or anything else I’ve ever used. I kind of like it, but it’s different.

[Screenshot]
Once you have that set up, select the line and click on “Build In” and choose the “Wipe” option. In this case, I want it to move from the left. I’ll do the same thing with the other line but this time the wipe will be from the top.

[Screenshot]
I set the build orders to be automatic and sequential. That gets us a decent ribbon flow on that slide but now we need to shift from one slide to the other while keeping the continuity of the ribbon flow.

[Screenshot]
I set the slide transition to Push and set it to go from bottom to top (against the flow of the wipe). You might notice I put the next image directly against the top of the slide. That just makes life easier. You could carefully line up your ribbons on different slides, but this was far easier.

[Screenshot]
The second slide is modified a bit differently. You might notice the line seems longer than it should be. That’s because two items are moving. The line wipes like before, but this line is longer (it extends under the picture up to the top of the page) and the picture itself is animated so that its “build out” is “Move Out” from bottom to top.2


1 “Piece of” sounds really weird.

2 The language here doesn’t help directions.

Two Different Time Lapse Experiments

Time Lapse Trip from Richmond VA to Tampa FL from Tom Woodward on Vimeo.

This is my first attempt with the GoPro. I think it’s set to one shot every 30 seconds. You can see me fiddling with the settings a few different times if you’re masochistic enough to watch it all the way through. The battery ran out early in the trip, which resulted in me using it without the stand. That helps explain the repeated drifting as the USB cord1 slowly pulls it towards the driver’s side. It is interesting to see a 12.5 hour trip condensed down to 4 minutes or so. I may do it on the way back but pointed mostly towards the sky or maybe at the kids.

Knowing where the stops were makes me wonder if something similar would make for an interesting take on Dan Meyer’s original graphing stories.


On the other end of the time lapse spectrum is this attempt to condense one of my attempts to fix a photo from the reddit pic request group. This one is kind of amusing to me in that you can see me googling some stuff for a sick child in the middle and finishing up with some posts to reddit and flickr. There is no sound, but it’d be pretty easy to narrate if you wanted to make it more instructional.

The screenshots were generated based on this post. It is a copy/paste terminal command that will take a full desktop screenshot every X seconds.

# takes a full desktop screenshot every 4 seconds (change the sleep value to adjust the interval; assumes the ~/Desktop/screencapture folder already exists)
i=1;while [ 1 ];do screencapture -t jpg -x ~/Desktop/screencapture/$i.jpg; let i++;sleep 4; done

Semi Tutorial

In either of the cases above you end up with a fistful of images. It used to be that you could set a default duration for still images in Final Cut. I can’t figure out how to do that in FCP X. There are lots of people claiming (and even showing) other ways to bulk change the length of static images once they are in FCP. Several of those ways didn’t work for me. What did work was to drag/drop the images into the timeline and then . . .


cc licensed ( BY SA ) flickr photo shared by Tom Woodward

Just click yes or change the settings to your preferred setup.


cc licensed ( BY SA ) flickr photo shared by Tom Woodward
Once things have calmed down a bit, select all the images, then ctrl-click/right-click. You’ll select “Create a compound clip”.


cc licensed ( BY SA ) flickr photo shared by Tom Woodward

Then it’s Modify>Retime>Fast and your choice of speeds. You can also go back and enter a custom speed to get things to around the duration you want.


1 How strange to be able to easily charge USB devices in the car and even stranger to have the need to do so.

If it seems like playing . . .

[Photo]

If it seems like I’m playing lately it is because I am. The last week or so has been an exploration of all sorts of fairly odd things: Markov chains, Twitterbots, McRibs,1 and photo walks, to name a few.

These are easy things to dismiss as trivial. It’s not necessarily obvious how these strange wanderings connect back to outcomes that other people may want or how they mesh with the idea of online learning at VCU. I believe that’s because we’ve created a belief that (in many things) we know both where we are (point A) and where we want to go (point B) and that whatever gets us between those two points most “efficiently” is the best path. I’m going to try to justify the value of a wandering path by pulling in pretty disparate examples2 from across time and space alongside some recent examples of these wanderings coming to fruition.

Similar patterns of over-narrowing happen in lots of areas. People tend to think they know lots of things they don’t.3 I see elements of this narrowing in the echo chamber, the specification-focused patterns of today’s world4, and the general lack of joy evident in work and school.5


Here’s a fairly typical pattern for me.

Stage One – September 2011

[Screenshot]
On Sept. 22, 2011 at 10:48 PM6, I took a picture of a random artifact from my youth and put it on insta-facebook-agram so one of my friends could see it. Based on his enthusiastic interest, I decided to figure out how to make a digital version, which I had as a working product by Sept. 24th.

Stage Two – August 2013

Nearly two years later (roughly August 8), I stuck my head in a meeting and heard a complaint about a Buck Institute tool. I knew from my vast experience with PHP fortune sticks two years earlier that I could make something like what was requested. The details of that exploration are in this post. The short version is that I opted to learn a chunk of javascript because I couldn’t manage to rotate the variable independently in PHP. That got me involved with javascript libraries, which in turn opened up a few other avenues of creation. A few days later, I made a gif randomizer, which let me know I could randomize images as easily as text.

A few more days and I built some bean-shaped math manipulatives I’d seen while helping to unpack supplies at an elementary school. This pushed me into touch libraries because the students most likely to use them would have iPads instead of laptops. That same basic concept (you can make user-movable things on the internet) branched into sight word refrigerator poetry after a conversation with a high school English teacher. This was finished up around August 27.

The same basic concepts came back again on September 19 when I built this getting-to-know-you page for the ITRTs after seeing this Dan Meyer post from back on August 7. I agree it’s not the most wonderful example of technology but it did open up a few conversations and I had an ITRT ask if they could move the red dots. My response was sure if they could figure out how to do it. This allowed me to squeeze in a quick Firebug/Inspect Element/CSS conversation and bring up that this is how some middle school kids had been tricking their parents about their online grades.7

Stage Three – November 2013

[Screenshot]
Jump to another job and a few months later. A conversation between Gardner, Jon, and me wanders to the idea of algorithmic construction, Twitterbots, and human attempts to derive understanding8. In many ways the #ds106 generator is a combination of all of these wanderings (and more). Like the best fortunes (sticks or otherwise), there is ambiguity to parse and use to construct meaning. There is also the combination of paths I’d taken previously: Google –> StackOverflow and Github. Little bits and pieces adding up.

At the same time we’re working on a logo for Online @ VCU. Thinking about what we hoped to accomplish and what I’d like the logo to represent led me to think of networks and connections, which resulted in some flashbacks to interesting things I’d seen digging through javascript libraries back when I was trying to drag beans around. I’d remembered trying to do something with D3 and collapsible force nodes (just because I thought they looked interesting). That led to the idea that our logo should represent a networked relationship. On top of that, being interactive would be an interesting plus, and . . . for the next step, an organic logo that is built by the actions of our users would be pretty interesting.

I don’t know if it’s heutagogy or combinatory play9 or maybe something else I haven’t learned a word for yet. Things like this happen all the time. It all adds up and comes together in beautifully unexpected ways, and it’s not just information or skills; it’s people as well. In the end I really believe it is about interesting intersections, but how will you know what to blend if you don’t wander around a good bit? My goal is to be interesting by being interested.10

Currently, I’m also prying at using an IFTTT recipe to capture my Twitter stream in a spreadsheet for analytical experiments and possible use later as a variable generator.

I remain, I hope, usefully deranged.


1 Strangely, McRibs and I have coincided before.

2 All the links are from my Diigo links rather than looked up for this post. I mention that because it’s an example of what I mean by seemingly aimless wanderings coming together at points in time.

3 It could be that my belief that we don’t know what we think we know is derived from a similar confirmation bias but I usually see that pattern cycling towards confidence rather than less . . . but I would think that wouldn’t I? . . . this does get meta pretty quickly.

4 Or 1945

5 It is not a coincidence that both teachers and students go insane with happiness when school is canceled for snow. Consider just how thankful both groups are to have one day of not going to school.

6 The NSA knows everything about me.

7 Crazy on so many levels. The students would pull up their online grades and change them in the browser. It’s like super ninja level whiteout.

8 Algorithmic oracles and Markovian driven divination both have a nice ring to them.

9 Not a victim of Churchillian Drift- Einstein did say that.

10 Amiable weirdness would be acceptable as well.

Markov Tweet Generator Code, Path, & Potential

DS106 Markov Tweet Generator

The following is how I adapted the Markov chain generator from Hay Kranen. Thanks to the comments1 I found below Hay’s post,2 I came across this Markov + Shakespeare version, which inspired me to figure out the “post-to-Twitter” option.3

Anyway, the much cleaner version is up and running. It now allows you to push the results to Twitter although I’m still adjusting this a bit. The code for the page I modified is below. It’s still slower than I’d like but it’ll do for now.

The fact that I can go from a conversation one day to a fairly finished product the next is the piece that amazes me about computers and the Internet. I cannot stress enough that I don’t know how to write PHP. I feel that’s a statement of empowerment. This project took about three hours of work. 95% of that was searching/research and breaking it and then fixing it.4 Someone who knew what they were doing could probably knock it out in ten minutes.

Now how is this more than just random #ds106 amusement? I think the generator works a little like this example about machine imagined artworks5. So there’s a chunk of human constructed meaning from machine assembled pieces. It doesn’t always work but that’s part of why I like having a human layer between generation and Twitter publishing (although I may still automate it when that makes sense time wise). This does generate interesting assignments and juxtaposes them in ways that are similar to the remix assignments idea but with an additional dose of randomness that I like. It also brings older conversations, student products, and links back into the conversation that’s occurring now.

The built-in @ convention of Twitter has a lot of possibility as well. That element of personalization and specificity could bring people back into the game/course/conversation in ways that reenergize both the participant and the community. There are lots of ways this might be attractive even absent the Markov element. I think there’s value in trying to pull people back into public conversations via methods like this. If participants in a class (MOOC-ish or otherwise) opted in, you could randomly (and judiciously) @ them to engage in conversations around different concepts, posts, products etc. It’d be a balance to avoid being boring and/or spammy, but it might be the prompt needed to have a longer term engagement with a course/community.6
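
Sticking with the PHP from the page below, here is a purely hypothetical sketch of what that opt-in @ layer might look like. The handle list, the 1-in-5 throttle, and the placeholder $markov string are invented for illustration; nothing here is how the current page actually works.

[code]
<?php
// Hypothetical sketch of the opt-in @ idea. Handles are made up; on the real
// page $markov would be the string the generator below produces.
$optedIn = array('participant_one', 'participant_two', 'participant_three');
$markov  = "example output from the generator below";

$prompt = $markov;
if (mt_rand(1, 5) === 1) {                      // only @ someone occasionally (judicious, not spammy)
    $handle = $optedIn[array_rand($optedIn)];   // pick an opted-in participant at random
    $prompt = '@' . $handle . ' ' . $markov;
}
$prompt = substr($prompt, 0, 140);              // keep it inside Twitter's 140 characters

// same twitter.com/share approach the generator page uses
echo '<a href="http://twitter.com/share?text=' . urlencode($prompt) . '">Tweet</a>';
?>
[/code]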

[code]
<?php
/*
    PHP Markov Chain text generator 1.0.1
    Copyright (c) 2008, Hay Kranen <http://www.haykranen.nl/projects/markov/>

    License (MIT / X11 license)

    Permission is hereby granted, free of charge, to any person
    obtaining a copy of this software and associated documentation
    files (the "Software"), to deal in the Software without
    restriction, including without limitation the rights to use,
    copy, modify, merge, publish, distribute, sublicense, and/or sell
    copies of the Software, and to permit persons to whom the
    Software is furnished to do so, subject to the following
    conditions:

    The above copyright notice and this permission notice shall be
    included in all copies or substantial portions of the Software.

    THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
    EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES
    OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
    NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT
    HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY,
    WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
    FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR
    OTHER DEALINGS IN THE SOFTWARE.
*/
require 'markov.php';

/*
Order seems to be the level of chaotic magnitude in the combination from the source. Higher numbers seem to be less disorder.
Length is the number of characters in the returned string.
Text is the path to the text file containing your source data. I did use curl previously to pull directly from a website but found parsing out the html to be problematic.
*/
    $order  = 6;
    $length = 120;
    $text = file_get_contents("ds106tweets.txt");
    $markov_table = generate_markov_table($text, $order);
    $markov = generate_markov_text($length, $markov_table, $order);

    if (get_magic_quotes_gpc()) $markov = stripslashes($markov);

?>
<!doctype html>
<html>
<head>
    <meta http-equiv="Content-Type" content="text/html;charset=utf-8"/>
    <title>PHP Markov chain DS106 twitter text generator created by Hay Kranen and butchered by me</title>
    <link rel="stylesheet" type="text/css" href="markov.css" />
</head>
<body>
  <h2>Markov Chain DS106 Tweet Text Generator</h2>

<div id="goodhands"></div>
<!-- This is where the remixed content gets printed. -->
    <div id="quote"><?php echo $markov; ?>
    </div>
<!-- This is where the content is passed to create the twitter URL. The tweet variable uses urlencode to keep the weird characters from messing everything up. -->

    <div id="tweet">
    <?php
    $tweet = urlencode($markov);
    $onetwitter = "<a href=\"http://twitter.com/share?url=http%3A%2F%2Ftinyurl.com/m7lpcrz&text=";
    $ender = "\">";
    echo $onetwitter . $tweet . $ender;
    ?>Tweet</a>
    <a href="http://bionicteaching.com/trials/markov/index.php">Refresh</a><br>
    </div>

</body>
</html>
[/code]
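
For anyone curious what markov.php is actually doing with $order and $length, here is a rough character-level sketch of the general technique. It is not Hay Kranen’s implementation, just a minimal illustration of the idea the comments above describe, assuming the same ds106tweets.txt source file.

[code]
<?php
// Rough character-level sketch of the technique markov.php implements.
// $order = size of the character window; $length = characters returned.

function sketch_markov_table($text, $order) {
    $table = array();
    $len = strlen($text);
    for ($i = 0; $i < $len - $order; $i++) {
        $key  = substr($text, $i, $order);       // the current $order-character window
        $next = substr($text, $i + $order, 1);   // the character that followed it in the source
        $table[$key][] = $next;
    }
    return $table;
}

function sketch_markov_text($length, $table, $order) {
    $key = array_rand($table);                   // random starting window
    $out = $key;
    while (strlen($out) < $length) {
        if (empty($table[$key])) break;          // dead end: no observed continuation
        $choices = $table[$key];
        $out .= $choices[array_rand($choices)];  // append one observed next character
        $key  = substr($out, -$order);           // slide the window forward
    }
    return $out;
}

$source = file_get_contents("ds106tweets.txt");
echo sketch_markov_text(120, sketch_markov_table($source, 6), 6);
?>
[/code]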


1 Comments matter and help stitch together the Internet.

2 which is from 2008 I might add- long tail etc. etc.

3 Note to self and other clueless people: urlencode is just a bit easier as a way to clean up the text than trying to think through a str_replace. That’s a fairly awesome example of the fact that I have no idea what I’m doing. I only happened across that function (?) by chance on some random StackOverflow post and it was as if the world just fell into place.

4 I consider that testing.

5 Serendipitously posted on the same day I had the conversation that inspired this and which I read last night (h/t Boing Boing).

6 You could get all meta-data and create profiles of interest to help algorithmically connect people with posts they might like etc. etc. but that starts to feel a bit different to me.

Markov Chains, Horse e-Books and Margins

In discussing trajectories, elements of engineered serendipity, and “thought vectors in concept space” with Gardner and Jon yesterday, the following occurred.

Gardner shared this video (which is well worth watching and I rarely have the patience for videos).

That led to a discussion about creating and using a MOOC/hashtag-specific Twitterbot (like horse e-books but real1) using Markov Chains2 to create algorithmically driven conversations/connections that occur in the margins of intention and result.3

So I began messing with the idea last night. Given I have a completely illusory knowledge of programming, I looked for people to tell me how to do this. I found the metaphor a minute tutorial, which will help me out with the Twitterbot end of things in the near future. I also found this PHP-based Markov generator, which does very nearly what I want absent the Twitter-ing part.

I did want to automate the connection to a particular Twitter hashtag rather than adding the content manually, so I started wandering around looking for ways to do that. Step one was trying to use curl. I eventually semi-melded some curl examples with the Markov generator. I was using the Twitter search for #ds106 as the source initially. With curl you are pulling the raw HTML, so I got some interesting pieces but a fair amount of code fragments as well. Stuff like . . .

I liked the code to some degree but figured a larger audience would probably ignore it. So I harassed Alan, Jim, and Martin early this morning and got access to the #ds106 Twitter spreadsheet archive. I pulled it down as a txt file and used it for the source material. That started to get cleaner results like . . .

You can mess around with the semi-working (just refresh the page and hit resubmit form; I did say semi-working) manual/random #ds106 tweet generator over here.
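
For reference, here is a rough sketch of the two ways of feeding the generator described above, reusing the generate_markov_table()/generate_markov_text() functions from Hay Kranen’s markov.php. The curl call and strip_tags() are my guess at the “semi-melded” approach rather than the exact code (and the old Twitter search URL is only illustrative; it no longer returns plain HTML), while the text-file version reflects pulling the archive down as ds106tweets.txt.

[code]
<?php
// Sketch of the two source-loading approaches described above.
require 'markov.php';   // Hay Kranen's generator: generate_markov_table() / generate_markov_text()

// Approach 1: pull the search page with curl. Stripping tags still leaves script
// contents and other fragments behind, which is where the stray code in the early results came from.
$ch = curl_init('https://twitter.com/search?q=%23ds106');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$source = strip_tags(curl_exec($ch));
curl_close($ch);

// Approach 2: use the plain-text export of the #ds106 spreadsheet archive instead. Much cleaner.
$source = file_get_contents('ds106tweets.txt');

$order = 6;
$table = generate_markov_table($source, $order);
echo generate_markov_text(120, $table, $order);
?>
[/code]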


1 Really fake, I mean. I guess.

2 I’m not really sure if that should be pluralized or not.

3 There’s a whole additional piece where you think about larger scale curricular design which incorporates random elements and assignments that use algorithms to push people in new directions. That starts to get really interesting. I am considering how the assignment and maybe a browser plugin could create contextual variables based on what site you were on at the moment that would then be incorporated into the larger assignment- kind of a #ds106 remix on contextual steroids.