
Code as Poetry, Time as a Variable – Options in the Ether

Backstory: Driving into work I was listening to NPR and they were interviewing Nikki Giovanni, a poet from Virginia Tech. In high school I was one of those people who really suffered through reading The Red Wheelbarrow and other non-rhyming poems. They irritated me in the same way people seem to be annoyed by White Paintings or 4’33”. In any case, in college I took lots of English classes. One of those classes was on poetry with Donna Hickey. The class selection was driven more by fitting my schedule and a vague notion that I might minor in English than by any real interest in poetry. The first day of class she had everyone list their favorite poets. I don’t recall what people chose, but I remember feeling like my choices of Shel Silverstein and Dr. Seuss were not in the same category. In any case, I had a great deal of fun with the course, using poems as games and puzzles to think around and through. I later took a graduate poetry course with Dr. Hickey and made my first digital liberal arts website around 2001 or 2002. It focused on breaking down various Richard Hugo poems and creating attempts at multimedia experiences. That stuff is all long gone from the UR website. I might have backup […]


Rampages Numbers

In responding to some data requests, I delved into the WP tables to pull some rampages data. All users ever . . . I need to set up something more automated, but for this I dumped the MySQL tables as CSVs and then imported them into Google Sheets. With very minimal functions, I got this data. My favorite function combo is to pull the unique values from a range with =UNIQUE and then run =COUNTIF against those values. It’s a really quick way to get graphs like these.
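A minimal sketch of that combo, with illustrative cell references: the first formula (in, say, C2) lists each distinct value from column A, and the second (in D2, filled down beside the UNIQUE list) counts how often each value appears.

```
=UNIQUE(A2:A)
=COUNTIF(A2:A, C2)
```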

Rethinking Our Project Page (and other stuff)

Image from page 211 of “Bulletin” (1961-1962) flickr photo by Internet Archive Book Images shared with no copyright restriction (Flickr Commons) I’ve been lucky enough to hire two awesome people who have started over the last month or so. We’re also getting a new supervisor on July 3rd. That’s given me a bit of breathing room and a reason to start re-thinking some things. One of those things is how we pull together the documentation of our work. Can we document what we do in a way that creates more people interested in doing these things? Can we do a much better job of bringing active faculty to the forefront? Can we serve end-of-year reporting needs for various data elements? Can we gather data we might reflect on regarding our own processes? How do we knit all this together from various services without a lot of extra work? The Old: I’ve done this more than a few times. The latest incarnation at VCU was the examples page (pictured above). It’s semi-decent but was done in haste. It tries to affiliate tools and instructional concepts with examples. Conceptually, it’s pretty close to TPACK in that way. It has done a marginal job thus far. It houses examples and people can browse them. It doesn’t […]

Trump Dump Data & Playing with TAGS

Image from page 60 of “Children’s ballads from history and folklore” (1886) flickr photo by Internet Archive Book Images shared with no copyright restriction (Flickr Commons) Driven mad by curiosity after this Matthew Baldwin tweet, I built this little thing. It uses Martin Hawksey’s amazing TAGS to gather the Tweets in Google Sheets and then displays them with Vue.js (which I’m sort of learning). It led me to realize that I could extend TAGS without much effort. My first attempt was to write two custom functions to get favorites and retweets. Turns out that was pretty straightforward given all Martin’s work. The TAGS element (TAGS.get) links me into Martin’s library and that’s that. Very easy once you know, and then you’re just navigating the Twitter API. Turns out I can do something similar to get the Twitter bio. Initially, I just stuck these in like you’d use normal functions, =getRT(A1) or whatever, and they soon ate up all the processing time allowed for my Google Scripts. That caused other things to break. Lesson learned. I then opted to set the value in the script and write that value to the cell. This little script runs on the spreadsheet change trigger and checks to see if there are any blank cells in the Favorites column (column U) […]
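A rough sketch of that trigger pattern, not the actual script from the post: the sheet name, the tweet-ID column, and the lookup helper below are placeholders, and the real lookup runs through TAGS and the Twitter API.

```javascript
// Sketch: runs on the spreadsheet's "on change" trigger and fills any blank
// cells in the Favorites column (U) by writing the looked-up value directly,
// instead of leaving a custom function to recalculate constantly.
function fillMissingFavorites() {
  var sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('Archive'); // TAGS archive sheet
  var lastRow = sheet.getLastRow();
  if (lastRow < 2) return;

  var ids  = sheet.getRange(2, 1, lastRow - 1, 1).getValues();  // assumes tweet IDs in column A
  var favs = sheet.getRange(2, 21, lastRow - 1, 1).getValues(); // column U = 21

  for (var i = 0; i < favs.length; i++) {
    if (favs[i][0] === '') {
      var count = lookupFavoriteCount(ids[i][0]);
      if (count !== null) {
        sheet.getRange(i + 2, 21).setValue(count);
      }
    }
  }
}

// Placeholder for the TAGS/Twitter API call the post describes.
function lookupFavoriteCount(tweetId) {
  return null; // swap in the real lookup here
}
```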

Importing CSV into Google Sheets via Google Script

flickr photo shared by OSU Special Collections & Archives : Commons with no copyright restriction (Flickr Commons) As part of a future project, I’m looking for easy, automated ways to push/pull CSV files around. Initially I thought I’d just use the =IMPORTDATA("http://theurl.com/data.csv") function, but I realized it had some drawbacks that made it less ideal, which led to the script below. Using Google Script triggers, this script could be set to retrieve a CSV file every X amount of time and write it to a Google Sheet. The upper portion is adapted from this answer. In any case, it opens up some decent automation options and would keep the data fresh for easy-access charts and graphs in Google.
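A minimal sketch of that approach, with a placeholder URL and sheet name; attach it to a time-driven trigger so it re-imports on whatever schedule makes sense.

```javascript
// Sketch: fetch a remote CSV and write it into a sheet, replacing the old copy.
function importCsvFromUrl() {
  var url = 'http://example.com/data.csv';                      // placeholder URL
  var csvText = UrlFetchApp.fetch(url).getContentText();
  var data = Utilities.parseCsv(csvText);                        // 2D array of rows/columns
  if (!data.length) return;

  var sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('csv-import');
  sheet.clearContents();                                         // overwrite rather than append
  sheet.getRange(1, 1, data.length, data[0].length).setValues(data);
}
```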

Rampages Stats vs My Own Data

flickr photo shared by NASA on The Commons with no copyright restriction (Flickr Commons) It’s that time of year when you try to prove to your institution that the work you do matters . . . and I am prepared to make it rain datums. I’m not sure how valuable this will be to others, but who knows, and it helps me to write it down. My former team of 5 is now just me, so this stuff is all mine and encompasses everything from manual password resets to full-on custom development work with faculty. Can you tell when the semester started? That’s an increase of about 2,400 sites in about 3 weeks. I’ve got our weekly Twitter/Google Sheet setup spitting this out and writing it to a spreadsheet, so this data was handy. I’ve been playing with logging data for a variety of reasons. For instance, I now tag my rampages support emails in Gmail and that logs them to a spreadsheet each night. I’m at least mostly consistent doing that because it’s a very lightweight action on my part. I can then get an idea of how stuff is really playing out rather than relying simply on my perception of things (although that matters too). The chart above represents rampages email support over the last 30 days as of […]
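The nightly logging takes very little, roughly the sketch below: a Gmail label, a log sheet (both names here are placeholders), and a time-driven trigger that fires once a day.

```javascript
// Sketch: count the day's threads under a Gmail label and append a row
// (date, count) to a log sheet. Run it from a daily time-driven trigger.
function logSupportEmails() {
  var threads = GmailApp.search('label:rampages-support newer_than:1d'); // placeholder label
  var sheet = SpreadsheetApp.openById('SPREADSHEET_ID')                  // the log spreadsheet
                            .getSheetByName('support-log');
  sheet.appendRow([new Date(), threads.length]);
}
```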

Scraping with Google Spreadsheets Across Instagram, Flickr, YouTube etc.

I remain kind of amazed by how many little tricks can be done with Google Sheets. After seeing Alan’s post today, I wondered how much of the data I could pull (assuming we had the right user names and knew the services . . . really the harder part) just using Google Sheets. Turns out we could get a pretty good amount. The following is a mix of XPath, regex, and APIs. I started with as little real programming as possible and gradually increased sophistication. These are just meant to give a rough idea of how much stuff you’ve got in the various spaces. Flickr. The URL: http://flickr.com/photos/bionicteaching The function: =IMPORTXML(C2,"//*[@class='photo-count']") This uses a basic Google Sheets function to grab the photo-count content. The XPath targets the element whose class is photo-count. Vimeo. The URL: http://vimeo.com/twwoodward The function: =INDEX(IMPORTXML(C3,"//*[@class='stat_list_count']"),1) Pretty similar to the example above but with the addition of INDEX. That handles the fact that multiple items share the stat_list_count class and we only want the first match. SoundCloud. The URL: http://soundcloud.com/cogdog The function: =REGEXEXTRACT(IMPORTXML(C4,"//*[@name='description']/@content"),"([0-9]+) Tracks") This gets a bit fancier. IMPORTXML brings in a large chunk of content from the page, but it wasn’t structured in a way that gave me the exact information I wanted. REGEX […]

YouTube View Count to Tweet

I was just messing around a bit more, mashing up the earlier YouTube API post with today’s Twitter post. This Google Script will get the view count of a particular video (Gangnam Style) and send out the total views and the time elapsed watching the video (assuming I did the math right). It has no real purpose (other than that the scale of the number amuses me), but it’s one of those things whose concepts I’ll be able to apply more usefully in the future. That’s a gut feeling but I’m pretty confident. Don’t forget to add the Twitter library as directed here: in the Apps Script project, include the library MKvHYYdYA4G5JJHj7hxIcoh8V4oX7X1M_ with the identifier set to Twitter.
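A rough sketch of the idea, not the post’s actual script: pull the view count through the YouTube advanced service (it has to be enabled for the project), turn views into cumulative watch time, and build the message. The video ID and per-play duration are assumptions, and the actual tweet would go out through the Twitter library mentioned above.

```javascript
function tweetViewCount() {
  var videoId = '9bZkp7q19f0';        // Gangnam Style (assumed ID)
  var secondsPerPlay = 4 * 60 + 13;   // roughly 4:13 per view (assumed)

  // Fetch the current view count for the video
  var response = YouTube.Videos.list('statistics', { id: videoId });
  var views = Number(response.items[0].statistics.viewCount);

  // Convert total views into years of humanity watching
  var yearsWatched = (views * secondsPerPlay) / (60 * 60 * 24 * 365);
  var message = 'Gangnam Style: ' + views + ' views, roughly ' +
      Math.round(yearsWatched) + ' years of watching.';

  Logger.log(message); // swap in the Twitter library call to actually send it
}
```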

WordPress Reflection Plugin – Step 1 – Counting Links

The idea that technology ought to help students reflect on their use of technology seems to make sense. As we have more and more students engaging in online writing, little things come to light. Take the humble/magical hyperlink, for example. We often look at the use of hyperlinks as a marker of progress in digital fluency. Are students using the thing that makes the web so webby? Can we help make that a point of reflection for them? I had a conversation with Laura a while back about pulling out URLs and looking at their use over time by students. Clearly, these aren’t purely quantitative things. You’ll never say “Six links? Failure!” or even “Seventy-four links? That’s an A+.” Not that I would ever think that about you, but this is on the Internet and I don’t want anyone tying hyperlink numbers to Bloom’s levels and then linking to me. But it would be interesting to look back over your writing and see when you use lots of links and when you don’t. So, at the moment, that’s what this plugin does. It’ll do some more tricks in the future but these are early days. The plugin as it sits now (below) will do three things. It’ll run a regex on the post and store all the URLs […]
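The plugin itself is PHP, but the core step is just a regex pass over the post content that pulls out every linked URL so they can be counted and stored per post. Roughly, as a JavaScript illustration (the pattern and field choices here are mine, not the plugin’s):

```javascript
// Illustration only: extract the URLs from a post's HTML so they can be
// counted and stored for later reflection.
function extractLinks(postContent) {
  var matches = postContent.match(/href=["'](https?:\/\/[^"']+)["']/g) || [];
  return matches.map(function (m) {
    return m.replace(/^href=["']|["']$/g, ''); // strip the href="…" wrapper
  });
}

// Example
var html = '<p>See <a href="https://example.com">this</a> and <a href="https://example.org/page">that</a>.</p>';
Logger.log(extractLinks(html)); // [https://example.com, https://example.org/page]
```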

Scraping Instagram #2

flickr photo shared by Marco Gomes under a Creative Commons ( BY ) license Remember last night when I posted about scraping data from Instagram? I woke up this morning about 5:30 (literally with a start), astounded by how easy the solution to archiving the paginated returns was. So before I even left for work, I managed to get this working much better than my previous attempt. I stripped out all of the previous GitHub stuff once I realized I didn’t really need it. It had provided a nice crutch and let me know I sort of knew what I was doing. The explanation of what’s going on is in the comments interspersed in the code below. There’s a much cleaner way to do this where I don’t duplicate so much code. I could just call the part that builds the CSV twice. I may do that at some point, but I think having it all in one place will help people new to this sort of thing see what’s going on more clearly. This is fun stuff. I need to do more of it, and more consistently. In the past, I’d do some programming for a few days and then not do any for a number of months. That makes for slow progress and frustration. I’m going to […]
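A sketch of that pagination loop, not the post’s actual script: the endpoint and JSON shape (a data array plus pagination.next_url) are assumptions based on the old Instagram API, and the fields pulled per item are placeholders.

```javascript
// Sketch: keep fetching while the response hands back a next-page URL,
// collect the rows, then write the whole batch to the sheet in one go.
function scrapeAllPages(startUrl) {
  var rows = [];
  var url = startUrl;

  while (url) {
    var json = JSON.parse(UrlFetchApp.fetch(url).getContentText());
    json.data.forEach(function (item) {
      rows.push([item.id, item.created_time, item.link]); // whatever fields matter
    });
    url = json.pagination && json.pagination.next_url;     // undefined ends the loop
  }

  if (!rows.length) return;
  var sheet = SpreadsheetApp.getActiveSpreadsheet().getActiveSheet();
  sheet.getRange(1, 1, rows.length, rows[0].length).setValues(rows);
}
```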