Two communications satellites collided in low Earth orbit two days ago, and, while no one expects a stray solar array to crash into your backyard as a result, the impact of this collision can and should be heard across the business world, especially as we all ponder a brave new world replete with seemingly infinite data and, allegedly, the analytical resources to understand those data.
According to published reports, Iridium, the owner of the American satellite destroyed in the collision, claimed that the crash with the defunct Russian satellite was unavoidable, the result of an increasingly crowded sky and a lot of bad luck. Or was that really what happened?
How about placing the blame where it belongs: on a massive, and potentially very costly, data analysis failure. One that, if we believe the common wisdom about how easy it is to accumulate massive amounts of data and turn them into “actionable information,” should have been relatively easy to avoid. Except it wasn’t.
If you sense an object lesson, one of reality’s little dope slaps across the collective knuckle-headed wisdom of our industry, you’re catching my drift. Read on.
Here’s what happened from a data analysis standpoint. There are 18,000 objects tracked by the Pentagon in inner space, some of which are space debris, others working satellites. Every day, Iridium said, it receives 400 “conjunction” reports from the Pentagon about possible collisions with Iridium’s 66 satellites. Inner space is an enormous three-dimensional area, in which these 18,000 objects carom about in largely predictable orbits. This is a large but finite data analysis problem. It should be easy to do something to keep those 18,000 objects from running into one another. Heck, that data warehouse has to be smaller than Walmart’s. So analyzing the problem should be a no-brainer, right?
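To give a sense of the scale involved, the core computation behind a conjunction report can be sketched in a few dozen lines. What follows is a deliberately simplified illustration, not how the Pentagon actually does it: real screening propagates orbits with models like SGP4 and weighs positional uncertainty, while this sketch just samples hypothetical position functions over time and flags any pair that passes within an assumed threshold distance.

```python
import itertools
import math

def distance_km(p, q):
    # Euclidean distance between two (x, y, z) positions in kilometers.
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def screen_conjunctions(objects, times, threshold_km):
    """Brute-force conjunction screen.

    `objects` is a list of (name, position_fn) pairs, where position_fn(t)
    returns an (x, y, z) position in km at time t (seconds). Returns one
    (name_a, name_b, t, separation_km) alert per pair whose separation
    drops below threshold_km at any sampled time.
    """
    alerts = []
    for (name_a, pos_a), (name_b, pos_b) in itertools.combinations(objects, 2):
        for t in times:
            d = distance_km(pos_a(t), pos_b(t))
            if d < threshold_km:
                alerts.append((name_a, name_b, t, d))
                break  # one alert per pair is enough for a daily report
    return alerts

# Toy example: two made-up objects on crossing straight-line tracks
# (real orbits are curved; straight lines keep the illustration short).
objects = [
    ("sat-A", lambda t: (7000.0, 7.5 * t, 0.0)),
    ("sat-B", lambda t: (7000.0, 750.0 - 7.5 * t, 1.0)),
]
print(screen_conjunctions(objects, times=range(0, 120), threshold_km=5.0))
```

The point of the sketch is the shape of the problem, not the physics: screening every pair of 18,000 objects is on the order of 160 million comparisons per time step, which is a large but entirely finite computation for modern hardware.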
Wrong, according to Iridium. The company “didn’t have information prior to the collision to know that the collision would occur,” in the words of company spokeswoman Liz DeCastro.
Ms. DeCastro’s words were echoed by the Pentagon, which basically said it can’t track all those 18,000 objects all the time. Though it does manage to send Iridium, on which some of the Pentagon’s communications depend, those 400 reports per day. Something is being tracked, apparently.
So, this story raises a couple of questions. If the 400 conjunction reports are useless, as this collision seems to prove, why are they being collected in the first place? And, if the Pentagon can’t track all those 18,000 objects all the time, what are these conjunction reports about anyway? A waste of taxpayers’ money? Perhaps.
But the more salient questions for our world of enterprise software are the following: Was it really not possible to do some analysis that could have prevented this collision? And, as we collectively build data collection capabilities across the enterprise, sucking up every datum from the factory floor, the extended supply chain, and the consumer world, what are we doing about our ability to really analyze those data? I would warrant that the failure of Wall Street’s much-vaunted quants to keep their own knickers out of the fire they helped create was a major proof point of the failure of our collective data analysis capabilities. And here’s another: despite all the watching, data gathering, and notification, there was nothing Iridium or the Pentagon, apparently, could do to stop the collision from happening.
Ms. DeCastro chose her words carefully: There wasn’t any information Iridium could have used to prevent this collision. There was, however, a payload’s worth of data, in real time, that was intended to help prevent this kind of disaster. The fact that Iridium, which has a financial interest in keeping its satellites in the air and its customers communicating, and the Pentagon, which has a lot of complex interests in keeping space debris from hurting its spy-in-the-sky capabilities, not to mention any future space-bound weapon systems, could not analyze the available data and predict this kind of outcome in an “actionable” way is an object lesson for the business intelligence world: Too much data is just as bad as too little data, especially when you get the same lousy result.
So, if you think those conjunction report-equivalents you’re producing by the hundreds are going to keep your satellite from crashing into someone else’s, think again. Those reports could end up being as useful as the Pentagon’s conjunction reports: an excuse for failure and an example of the futility of the world of too much data. Further proof that we have a long way to go before our analytical capabilities catch up with our data gathering capabilities. A very long way indeed.