Yesterday, I decided I’d look for four-leaf clovers while getting in and out of my car. Not hanging out searching, just opening my eyes and paying a bit more attention. Wikipedia tells me there’s one four-leaf clover per 10,000 three-leaf clovers.
What surprises me, given their relative rarity, is just how many four-leaf clovers seem to be out there.
It’s the same with interesting things. If you just start looking around, you end up amazed at how many interesting things surround you daily that you never noticed.
One interesting thing leads to another. It gets harder to pay attention to mundane things like crossing the road because there are so many interesting things to see and think about.
I tried to take pictures representing each question I had while walking to work the other day. I only decided to do it about halfway in, but it was interesting to see it snowball once I made it intentional. The results are embedded below as a set. Additional questions are sometimes in the descriptions and won’t be visible in the embedded view.
Dodge Caravan takes on a very odd feel if you read it literally. I decided to start capturing all the car/bike names I came across that were also actual words. I’m working on a categorization system for them. It’s another interesting way to shift how you process the stuff that normally just flows on by.
This is my first attempt with the GoPro. I think it’s set to one shot every 30 seconds. You can see me fiddling with the settings a few different times if you’re masochistic enough to watch it through. The battery ran out early in the trip, which meant using it without the stand. That helps explain the repeated drifting as the USB cord1 pulls it slowly towards the driver’s side. It is interesting to see a 12.5-hour trip condensed down to 4 minutes or so. I may do it on the way back, but pointed mostly towards the sky or maybe at the kids.
On the other end of the time-lapse spectrum is this attempt to condense one of my attempts to fix a photo from the reddit pic request group. This one is kind of amusing to me in that you can see me googling some stuff for a sick child in the middle and finishing up with some posts to reddit and flickr. There is no sound, but it’d be pretty easy to narrate if you wanted to make it more instructional.
The screenshots were generated based on this post. It is a copy/paste terminal command that will take a full desktop screenshot every X seconds.
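The linked post isn’t reproduced here, but the general shape of that kind of command is easy to sketch. This is my own hedged version, not the one from the post: the interval, output directory, and filenames are all assumptions, and `screencapture` is the built-in macOS tool (on Linux you might swap in ImageMagick’s `import -window root`).

```shell
#!/bin/sh
# Hypothetical sketch: save a full-desktop screenshot every INTERVAL seconds.
INTERVAL=1                       # bump this to 30 (or whatever) for real use
OUTDIR="${OUTDIR:-$HOME/screenshots}"
mkdir -p "$OUTDIR"
COUNT=0
while [ "$COUNT" -lt 3 ]; do     # bounded here for illustration; use `while true` to run until ctrl-C
  # -x silences the shutter sound; swallow errors so the loop keeps going
  # even if the capture tool isn't available
  screencapture -x "$OUTDIR/shot-$(date +%Y%m%d-%H%M%S).png" 2>/dev/null || true
  COUNT=$((COUNT + 1))
  sleep "$INTERVAL"
done
echo "attempted $COUNT screenshots in $OUTDIR"
```

Feeding the resulting folder of images into FCP (or ffmpeg) is what turns it into the time lapse.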
In either of the cases above you end up with a fistful of images. It used to be that you could set a default duration for still images in Final Cut. I can’t figure out how to do that in FCP X. There are lots of people claiming (and even showing) other ways to bulk-change the length of static images once they are in FCP. Several ways didn’t work for me. What did work: drag/drop the images into the timeline, then . . .
All this from the random ramblings of a robot algorithm.
(Because asides are what this post is about, after all), you may recall some attempts I was making to use an IFTTT recipe to pull my Tweets into a Google spreadsheet to mess with them a bit more.
I decided to see how often I’d get close to the full 140 characters. In playing around with the chart types, I decided to visualize it with the radar chart. I was just curious what it would look like. No real reason. Strangely, it has completely frozen the chart. I can’t remove it or interact with it in any way. It looks like the first image on my end and gives the second, humorous (to me) error message on the published view.
It’s always interesting when you break something. Usually it’s best if it’s fixable, but I don’t mind too much in this case. There’s something fairly attractive about breaking a web service in this way.
It’s supposed to represent the role of mind/emotion in creating engagement, but the very fact that I feel compelled to explain that probably means I’m not doing a great job, and I wonder about the degree to which I’m joking. There are elements here I may end up making work, though. I can parse a few out for a #ds106 assignment as well . . .
Anyway, the much cleaner version is up and running. It now allows you to push the results to Twitter although I’m still adjusting this a bit. The code for the page I modified is below. It’s still slower than I’d like but it’ll do for now.
The fact that I can go from a conversation one day to a fairly finished product the next is the piece that amazes me about computers and the Internet. I cannot stress enough that I don’t know how to write PHP. I feel that’s a statement of empowerment. This project took about three hours of work. 95% of that was searching/research and breaking it and then fixing it.4 Someone who knew what they were doing could probably knock it out in ten minutes.
Now how is this more than just random #ds106 amusement? I think the generator works a little like this example about machine imagined artworks5. So there’s a chunk of human-constructed meaning from machine-assembled pieces. It doesn’t always work, but that’s part of why I like having a human layer between generation and Twitter publishing (although I may still automate it when that makes sense time-wise). This does generate interesting assignments and juxtaposes them in ways that are similar to the remix assignments idea but with an additional dose of randomness that I like. It also brings older conversations, student products, and links back into the conversation that’s occurring now.
The built-in @ convention of Twitter has a lot of possibility as well. That element of personalization and specificity could bring people back into the game/course/conversation in ways that reenergize both the participant and the community. There are lots of ways this might be attractive even absent the Markov element. I think there’s value in trying to pull people back into public conversations via methods like this. If participants in a class (MOOC-ish or otherwise) opted in, you could randomly (and judiciously) @ them to engage in conversations around different concepts, posts, products, etc. It’d be a balance to avoid being boring and/or spammy, but it might be the prompt needed for longer-term engagement with a course/community.6
PHP Markov Chain text generator 1.0.1
Copyright (c) 2008, Hay Kranen
License (MIT / X11 license)
Permission is hereby granted, free of charge, to any person
obtaining a copy of this software and associated documentation
files (the "Software"), to deal in the Software without
restriction, including without limitation the rights to use,
copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the
Software is furnished to do so, subject to the following
conditions:
The above copyright notice and this permission notice shall be
included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES
OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT
HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY,
WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR
OTHER DEALINGS IN THE SOFTWARE.
Order seems to control how chaotically the source gets recombined. Higher numbers mean less disorder; the output sticks closer to the original text.
Length is the number of characters in the returned string.
Text is the path to the text file containing your source data. I did use curl previously to pull directly from a website, but found parsing out the HTML to be problematic.
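Since I can’t write PHP, here’s the same idea sketched in Python instead: an order-n, character-level Markov chain matching the three parameters above (order, length, and source text). The function name and details are mine, not Hay Kranen’s, so treat it as an illustration of the technique rather than his implementation:

```python
import random

def markov_generate(text, order=3, length=200, seed=None):
    """Character-level Markov chain: map every `order`-character window
    in the source to the characters that follow it, then walk the map."""
    rng = random.Random(seed)  # seedable for repeatable output

    # Build the transition table from the source text.
    table = {}
    for i in range(len(text) - order):
        key = text[i:i + order]
        table.setdefault(key, []).append(text[i + order])

    # Walk it: start from the opening window, append one plausible
    # next character at a time, sliding the window as we go.
    key = text[:order]
    out = list(key)
    while len(out) < length:
        choices = table.get(key)
        if not choices:            # dead end: restart from the opening window
            key = text[:order]
            continue
        nxt = rng.choice(choices)
        out.append(nxt)
        key = key[1:] + nxt
    return "".join(out)

source = "the quick brown fox jumps over the lazy dog and the quick cat naps"
print(markov_generate(source, order=3, length=60, seed=42))
```

A higher `order` makes each window longer, so the output hews closer to the source; a lower one produces more of that pleasant disorder.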
1 Comments matter and help stitch together the Internet.
2 which is from 2008, I might add (long tail, etc.)
3 Note to self and other clueless people: urlencode is just a bit easier for cleaning up the text than trying to think through a str_replace. That’s a fairly awesome example of the fact that I have no idea what I’m doing. I only happened across that function (?) by chance in some random StackOverflow post, and it was as if the world just fell into place.