Three O’s for Software Management

In my continuing saga1 of trying to explain how managing software costs time and energy, I decided to use the letter O. In a perfect world, I’d do an explainer video with muppets with O as the letter of the day.

Onboarding

Whether you are preparing to bring something on or bringing it on, that’s an expensive time. Even if the software is free, you need to do the reviews for security, privacy, and accessibility.2 That’s a chunk of human time and expertise. In the system I hope for, we’re also setting the goals and what we’ll look at to see if it’s meeting those goals. That’s another chunk of work and discovery: figuring out what data we have access to, possibly creating or altering surveys, establishing audiences, etc. Then we’ve got to define our technical, pedagogical, and training support expectations and alignment to various positions/groups (not individual people). And finally, we’ll have to document all these decisions and publicize them.

All new software needs . . .

  1. to be reviewed for security, privacy, and accessibility (this requires skill and time)
  2. to have a purpose and data that helps determine if that purpose is being met
  3. to have defined support levels for technology, pedagogy, and training that is associated with specific positions
  4. to have this information consistently published and publicized
  5. to have specific contract review dates established
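Just to make the checklist concrete, here’s a rough sketch of what a per-application record might look like if you tracked it in code. All the names and fields here are made up for illustration, not any real system:

```python
from dataclasses import dataclass, field
from datetime import date
from typing import Optional

# Hypothetical sketch: one record per application, mirroring the
# five checklist items above.
@dataclass
class AppRecord:
    name: str
    spa_reviewed: bool = False          # security, privacy, accessibility
    purpose: str = ""                   # why we have this tool
    success_metrics: list = field(default_factory=list)
    support_owners: dict = field(default_factory=dict)  # area -> position/group
    published_at: Optional[date] = None
    contract_review_dates: list = field(default_factory=list)

    def ready_to_publish(self) -> bool:
        """True only when every checklist item has been filled in."""
        return bool(self.spa_reviewed and self.purpose
                    and self.success_metrics
                    and self.support_owners
                    and self.contract_review_dates)

lms = AppRecord(
    name="Example LMS",
    spa_reviewed=True,
    purpose="Course delivery",
    success_metrics=["weekly active users", "help desk ticket volume"],
    support_owners={"technology": "IT help desk",
                    "pedagogy": "instructional design",
                    "training": "faculty development"},
    contract_review_dates=[date(2026, 3, 1)],
)
print(lms.ready_to_publish())  # → True
```

The point of a structure like this is that "consistently published and publicized" becomes easy: you can generate a public page straight from the records instead of hand-writing documentation per tool.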

Ongoing

Now that we’ve got an application in the system, we’ve got to look at it regularly to see if it’s doing what it’s supposed to. Maybe this is once a semester, once a year, or some other pattern . . . but the people who set the goals need to look at the data on a regular basis.

I can easily see a series of calendar events being needed for each application. It’d be nice to get the ongoing reviews synced up so that we could make that more efficient. I could see getting all the people associated with particular software families together and running through the data.

This shouldn’t be a huge amount of work. It might generate work though. Do we see trends in usage rates? Help desk tickets? What do we do about that?
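The syncing idea above is easy to sketch: batch applications into families and generate shared review dates so one meeting covers the whole family. Everything here (family names, tools, cadence) is invented for the example:

```python
from datetime import date, timedelta

# Hypothetical sketch: generate recurring review dates so a whole
# software family gets looked at in one sitting.
def review_dates(start: date, months_between: int, count: int):
    """Yield approximate review dates, stepping ~months_between apart."""
    d = start
    for _ in range(count):
        yield d
        d += timedelta(days=30 * months_between)

families = {
    "assessment": ["QuizTool", "SurveyTool"],
    "communication": ["ChatTool"],
}

for family, apps in families.items():
    for d in review_dates(date(2025, 1, 15), months_between=6, count=2):
        print(f"{d}: review {family} family ({', '.join(apps)})")
```

From there it’s one short step to pushing these as actual calendar events, which is where the trend questions (usage rates, help desk tickets) get asked on schedule instead of when someone remembers.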

Occasional

This category is for things like long-term contract reviews for an LMS. What are the irregular patterns we need to document and be aware of for each application?

In year 3 of a 5-year contract, what are we considering to see if we open the door to a full LMS request for proposal vs. continuing on for another 3 to 5 years? How often do we update our security, privacy, and accessibility reviews for long-term contracts? How do we make sure that happens? This kind of irregular interaction can be a huge amount of work.
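That "year 3 of 5" decision point is worth flagging in advance rather than discovering late. As a sketch (the dates and the 60% fraction are just illustrative assumptions), you can compute it from the contract dates:

```python
from datetime import date

# Hypothetical sketch: flag the point partway through a contract where
# the renew-vs-RFP conversation should start. For a 5-year contract,
# 0.6 of the way through lands at the start of year 3's back half.
def decision_point(start: date, end: date, fraction: float = 0.6) -> date:
    """Date by which a renew-vs-RFP decision process should begin."""
    return start + (end - start) * fraction

print(decision_point(date(2023, 7, 1), date(2028, 7, 1)))  # → 2026-07-01
```

Dropping a date like this into the same calendar as the ongoing reviews is what keeps the occasional work from becoming a surprise.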

Another O

Obvious. All this stuff feels painfully obvious. I’m confident this is part of some well-established framework that was created in 1952. Maybe the words don’t start with O’s. I guess they’re part of ITIL and maybe I just need to spend more time in those places. It’s a challenge for me in some ways, a bit like when I took a leadership course in college. I found the ideas so obvious that I just wasn’t willing to cite them. This didn’t go well given that the person teaching the course had a doctorate in leadership and was in the school of leadership. I highlight this as an indication that I do not always choose the most reasonable path, and I don’t regret the majority of those choices.

1 Sounds cooler than whatever word means ongoing slog.

2 S.P.A. if you please, S.P.A. if you don’t please . . .
