Time to get in the game on data quality strategy, processes
As companies increasingly rely on data analytics to help drive decision-making, good data quality is becoming even more important, lest errors lead to costly business missteps. And it appears that many organizations at least recognize the need to up their data quality game.
In a Magic Quadrant report published in November 2016, Gartner analyst Saul Judah and two colleagues wrote that they saw growing demand for data quality tools from large and midsize businesses. They predicted that worldwide revenue would top the $2 billion mark by 2018, up from $1.35 billion in 2015, making data quality one of the fastest-growing segments in the enterprise software market.
But it's too soon to declare victory on data quality processes. Only 26% of the 368 companies surveyed for the Gartner report said they enforced data quality standards at an enterprise level, and a mere 8% had created formal metrics to track the impact of their data quality efforts; 22% used informal metrics, but 59% weren't measuring progress at all, the analysts wrote.
The analysts also cited a relatively low number of inquiries from Gartner clients about data quality in big data environments. Merv Adrian, another Gartner analyst, echoed that observation during a presentation on the pace of big data deployments at the 2017 Pacific Northwest BI & Analytics Summit in Grants Pass, Ore.
"The number of organizations I talk to that aren't focused on [data quality] is pretty scary to me," Adrian said. "They just haven't thought it through." He proposed a new mantra on the need for good data quality in the big data analytics process: "No data quality, no insights."
This handbook offers insight and advice to help IT and analytics teams position themselves for a win on effective data quality procedures that support accurate analytics.