Attack of the self-thinking computer

September 10th, 2008

On Monday, United Airlines’ stock plummeted from around $12 to $3. At first glance it was perplexing: the stock just dropped, and United’s holding company, UAL, hadn’t announced anything big enough to cause a 75% drop in price. Well… they had announced something that would have, in 2002.

Silicon Valley Insider has a great analysis of exactly what happened, so go read that and come back here.

Something about this story feels like a piece of luddite science fiction. Just because the Chicago Tribune didn’t put a dateline on a story in 2002, a corporation with enough problems of its own gets its stock hammered, all because some software was doing exactly what it was designed to do: dishing up popular stories to people who want to read such things. Suddenly United’s stock is very cheap for no logical reason. So much for rational investing, no? Actually, it shows that markets move according to the best available information. But when that information is dished out by inflexible software programs, how efficient is it really?

There is so much information available to any one person, and so much potentially useful information, that we need tools to help us sort through it. We can no longer rely on individuals at newspapers to decide what’s newsworthy, because even they can’t sort through it all. Instead, we must use tools that can extract the signal from the noise and give us the information that matters, the information we need to know.

But every website, every company, every entity online has its own systems for managing this information. And even within a single organization there can be anywhere from several to hundreds of different ways of sorting information: by topic, by popularity, and so on. These tools don’t exist in a vacuum – the internet has no dark matter. So when Google crawls a website at an ungodly hour, when all two visitors happen to be reading the same old story, the page gets cached and Google’s search results change. Later in the morning, when a reporter at Bloomberg sees the page pop up in their Google Alerts, the wrong story goes out on the wire, all because the randomness of human action interfered with logical code.

This kind of thing is an informational flaw, a data mutation. Kinda like a biological mutation, only cooler, because it’s easier for us to toy with information than with living organisms. This could cause all kinds of mutations: viral videos, bands becoming suddenly popular, a spontaneous political movement, leaps in technological advancement. All because of the interaction of automated systems.
