Trend watch: Data management and business intelligence technologies
A comprehensive collection of articles, videos and more, hand-picked by our editors
Science fiction writer Arthur C. Clarke's third law of prediction holds that any sufficiently advanced technology is indistinguishable from magic. If so, that's both good and bad for IT and data managers, as well as business users: good, because it captures the transformative effect technology can have; bad, because dressing up a new technology as magic is tempting for vendors. Which brings us to the present: thanks partly to big data technologies, we could be on the verge of a second machine age, with exponential change on par with the first -- but it could also turn out to be an illusion.
Few technologies fall utterly flat, but many fail to justify their initial hype. Object-oriented databases, neural networks and artificial intelligence are just some examples of technologies that didn't live up to their proponents' promises, though they ultimately did amount to something. Now, despite some success stories, the jury is still out on big data as a transforming technology.
Once you go beyond online advertising and marketing, successful big data use cases thin out. Other killer applications may yet emerge: there are promising signs in agriculture, application management and other data-driven disciplines. But it hasn't quite happened yet.
As always, human factors are in play. Digitization is piling up data of all kinds at a prodigious rate. But the useful analysis of that data typically still relies on human judgment, and people have neither grown an extra brain nor exhibited giant leaps in IQ.
The technologist's answer is to automate intelligent operations through machine learning and predictive analytics. The latter is a particularly charged topic -- foretelling the future has always been a challenge. What's different this time around that would let predictive analytics find wider utility?
MIT professor predicts bright predictive future
That and other big data topics were much discussed at last month's MIT Sloan CIO Symposium in Cambridge, Massachusetts. According to Erik Brynjolfsson, an MIT professor and head of the MIT Center for Digital Business, predictive technology is improving at a significant clip. He thinks Internet-scale data is a key difference today. "We do a lot of predictive analytics in my group, especially using Google searches," he said. "I think it's like collective ESP."
Google searches are a window not so much into what people find, but what they're thinking of and what they want, Brynjolfsson said. He added that his group's work has shown you can use that information to predict how people will behave.
"For instance, we can predict with good accuracy what housing prices are going to be in the future," Brynjolfsson said. That is a prediction of a general trend, not of what a specific individual will do, he acknowledged. But he cited recommendation engines and online advertising as predictive analytics success stories that are tailored to increasingly narrow demographics.
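The kind of trend forecast Brynjolfsson describes -- relating search interest in one period to housing prices in a later one -- can be sketched, at its simplest, as a least-squares fit. His group's actual models are not detailed here; the one-quarter lag and all the numbers below are invented for illustration.

```python
# Illustrative sketch only: fit a least-squares line relating search-interest
# volume in one quarter to a housing-price index the following quarter,
# then extrapolate. All data is made up for demonstration.

def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var
    b = mean_y - a * mean_x
    return a, b

# Hypothetical quarterly data: search volume (leading indicator) and the
# next quarter's price index (lagging outcome).
search_volume = [100, 120, 140, 160, 180]
price_index_next_q = [200, 210, 225, 235, 250]

a, b = fit_line(search_volume, price_index_next_q)

def predict(volume):
    """Forecast next quarter's price index from this quarter's search volume."""
    return a * volume + b

print(round(predict(200), 1))  # forecast for a quarter with search volume 200
```

The point of the sketch is the shape of the approach, not the model: search behavior acts as a leading signal, and even a crude fit can extrapolate a general trend without predicting any individual's behavior.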
"Whether it is who is going to click on an ad, or who is going to buy a book … that's working pretty well," Brynjolfsson said. He sees broader use of big data analytics in the future too, for example, to help predict who is going to get diabetes or other diseases. The data required for that is already accumulating, thanks to the use of loyalty cards that track food shopping preferences.
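At its crudest, the "who is going to click" prediction Brynjolfsson mentions can be illustrated by estimating click-through rates per audience segment from historical counts. Real ad systems use far richer models; the segments and event log below are hypothetical.

```python
# Illustrative sketch only: estimate per-segment click-through rates from a
# (segment, clicked) event log and rank segments by rate. Data is made up.
from collections import defaultdict

impressions = defaultdict(int)
clicks = defaultdict(int)

# Hypothetical event log of ad impressions and whether each was clicked.
events = [
    ("sports_fans", True), ("sports_fans", False), ("sports_fans", True),
    ("book_buyers", True), ("book_buyers", True), ("book_buyers", True),
    ("commuters", False), ("commuters", False), ("commuters", True),
]

for segment, clicked in events:
    impressions[segment] += 1
    if clicked:
        clicks[segment] += 1

def click_rate(segment):
    """Observed click-through rate for a segment."""
    return clicks[segment] / impressions[segment]

# The segment most likely to click, under this simple frequency model.
best = max(impressions, key=click_rate)
print(best, round(click_rate(best), 2))
```

Narrowing demographics, in this framing, just means conditioning the counts on finer-grained segments -- the same idea, with more data behind each estimate.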
Notes on the new machine age
Brynjolfsson recently co-authored The Second Machine Age: Work, Progress, and Prosperity in a Time of Brilliant Technologies, which outlines reasons why something new may be happening in milieus like predictive analytics. He and Andrew McAfee, also of MIT, wrote that incremental advances in both personal and business technologies may be adding up to great effect.
They said upcoming changes based on digital technology could be as momentous as those at the beginning of the steam engine era -- when James Watt's dedicated tinkering suddenly led to efficient steam engines that drove the Industrial Revolution forward, eventually leading to what came to be called the Machine Age in the late 19th and early 20th centuries.
The authors focused on several elements that they say indicate a new machine age is unfolding:
- Digitization has given computers more and more to crunch on.
- Exponential improvements like those seen in smartphone processing chips are creating more diverse data and more untethered clients.
- Different technologies feeding on those advances are now being combined in forms where the value exceeds the sum of the parts.
Brynjolfsson and McAfee also cited the 2011 victory of IBM's Watson system on the TV game show Jeopardy!, the results of DARPA's autonomous vehicle challenge in 2004, the development of Apple's Siri personal assistant, and other examples as indicators that we're on course for a second machine age.
If you suggest to Brynjolfsson that effective big data and predictive analytics may be farther off than some software vendors imply, he's philosophical.
"We have come upon an inflection point to do that kind of thing -- whether it is Siri or Google Now or predictive analytics," he said. It's important to remember, he added, that flipping an on-off switch won't usher in the new era; it's continuous improvement in making small predictions that will help illuminate the way. That might not sound like magic, but it could become reality.
Learn about fast-rising cognitive computing -- its pros and cons
Catch up on your machine learning
Check out coverage of MIT's 2013 CIO Symposium
How DevOps may help to cope with changing ML models