The Future of Search

It has been interesting to see the excitement surrounding Wolfram|Alpha.

The new “Computational Knowledge Engine” called Wolfram|Alpha has gone through a full media cycle before it has even been unleashed on the world. It has been hyped as a “Google Killer” and denounced as snake oil, and we’re still at least a few days from release.

The simple goal behind the engine is to connect searchers with precise information. Wolfram|Alpha’s search magic comes through a combination of natural language processing and a giant pool of curated data.

That quote is from Radio Berkman (a very interesting podcast out of Harvard Law), and they’ve got an interview with its creator, Stephen Wolfram, as well. Watch the abbreviated 10-minute version below.

I’m not sure how well the idea of a curated semantic web will work (although I can understand the urge). Still, this shows a different way to think about searching for information, taking it beyond search and closer to exploration.

It’s similar in some ways to David Huynh’s Parallax project (of Simile Exhibit fame), which has been out for quite a while now. Video of that is below.

Freebase Parallax: A new way to browse and explore data from David Huynh on Vimeo.

While the media may portray Google as being totally out of the mix here, keep in mind the GoogleLookup function available in their spreadsheet program. Don’t get me wrong, it’s limited and somewhat frustrating, but it shows that they’re thinking along similar lines and playing with the same ideas.
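If you haven’t tried it, the formula takes an entity and an attribute and pulls a value off the web, along the lines of the sketch below. Google doesn’t publish a full list of the attributes it recognizes, so treat these particular examples as illustrative guesses:

=GoogleLookup("France", "population")
=GoogleLookup("Mount Everest", "height")
=GoogleLookup("Earth", "mass")

When it works, the cell fills in with Google’s best guess at the answer; when it doesn’t, you get an error, which is where the frustration comes in.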

Simple search isn’t enough. The data and the connections are too complex.

Where all of this matters educationally is that search, and how you’ll be able to interact with the information you find, is going to change drastically because of the size and sophistication of the information we’re dealing with. Finding is going to become less a separate phase and more a component of use.

No doubt you’ll still be able to look for simple answers, and that’ll be as necessary as ever, but being able to leverage the artificial intelligence and squeeze the potential out of the relationships in the data you find will be a major difference maker between individuals who thrive in this environment and those who don’t.

The computer can show you possible connections, but it can’t force you to see their importance or relevance. In that way it increases the need for people to grasp sophisticated relationships, to analyze and make connections in interesting and useful ways. By making the data more accessible and showing that it’s related, it makes the ability to draw those connections a basic skill.

It’s also important to see how this devalues certain skills. The obvious connections are going to be made by the computer. That shifts the value to cognitive jumps: the ability to assimilate and relate seemingly unrelated material, and to understand connections at the macro level. There’s beauty in being freed from the grunt work, but it makes the idea, the conceptual portion, that much more important.

Many of those worrying about white-collar jobs being outsourced to China and India haven’t thought it through far enough. If it’s repeatable logic, computers are going to end up doing it. In some cases, they already are.¹

Excerpts from Robot makes discovery all by itself (click for full article):

Meanwhile, some software programs can analyze data to generate hypotheses or conclusions, but they don’t interact with the physical realm. Adam is the first automated system to complete the cycle from hypothesis, to experiment, to reformulated hypothesis without human intervention.
…
They armed Adam with a model of yeast metabolism and a database of genes and proteins involved in metabolism in other species. Then they set the mechanical beast loose, only intervening to remove waste or replace consumed solutions. The results appear Thursday in Science.
…
Still chugging along on its own, it designed experiments to test its hypotheses, and performed them using a fully automated array of centrifuges, incubators, pipettes, and growth analyzers.

After analyzing the data and running follow-up experiments — it can design and initiate over a thousand new experiments each day — Adam had uncovered three genes that together coded for an orphan enzyme.
King’s group confirmed the novel findings by hand.

Granted, all of this is at an early stage, but you can see glimpses of the future. And by “future” I mean 10 years (or less) down the road.²

What does it mean for education? It means a lot. We’d better get moving.


¹ That sounds melodramatic, but the cost of computational power is plunging while wages are rising. If you can be replaced cheaply, you will be.

² Try imagining any of the things in this post in 1999.
