The current worldwide financial crisis has spawned numerous new regulations in the U.S. and Europe. The likes of Dodd-Frank, Basel III and Solvency II have joined existing laws and standards such as Sarbanes-Oxley. This regulatory regimen imposes stronger risk management rules and requires companies to affirm that their financial results and other corporate reports are accurate, which, in turn, implies that the data on which such things are based is correct.
Therefore, you might reasonably expect that data quality measures and levels would have gotten a real boost over the past few years. After all, companies that didn't adequately address their data quality issues in the past have been pushed harder to do so by government edict. But has that actually happened?
In 2009, my consultancy, The Information Difference, conducted a survey on the state of business data quality in organizations. We re-ran the survey this year, sponsored by SAP and with 210 respondents -- half from North America, 41% from Europe, the rest from elsewhere. More than half were from organizations with annual revenues above $1 billion. Unfortunately, the results point -- again -- to a rather dismal state of affairs.
In the 2009 survey, 66% of respondents reckoned that their overall data quality was "good" or better; in the new survey, that figure fell slightly, to 63%. Conversely, the proportion of respondents with an active data quality program has increased, from 37% to 57%. That suggests there has indeed been a marked increase in the level of activity around data quality management, but it's troubling that the net effect of all the expanded effort is a slight decrease in the overall perception of data quality.
Good data quality help is hard to find
Perhaps the responding companies should call in the experts to lend a hand? Truth be told, many of them have: In 2009, just 27% used outside consultants to help with their data quality improvement initiatives, but this year that rose to 53%, effectively doubling the use of external expertise. It doesn't appear, though, that the increased assistance has had any broad beneficial impact thus far.
Why is data quality such an intractable problem? The two biggest barriers identified by the 2013 survey respondents were: "Management does not see this as an imperative," and "It's very difficult to present a business case." Those are the same responses identified as the main barriers in 2009. Another consistent finding is that only about one-fifth of the survey takers have built a business case for a data quality program, and the percentage saying they have done so slipped slightly in the latest survey, from 22% to 21%.
One reason why building a business case and enlisting the support of senior management are such enduring issues can be seen in the answer to another survey question. When asked whether their companies measure the cost of poor data quality, 57% of this year's respondents said they had made no attempt to do so -- only a little better than the 63% in 2009. This is a sorry state of affairs. It's unlikely that many major data management initiatives are going to win over C-level executives and get funded without a business case, and any decent one will include hard-dollar data on current costs and expected benefits.
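The hard-dollar figure such a business case needs can be as simple as volume times defect rate times remediation cost. A minimal sketch, using entirely hypothetical numbers for illustration:

```python
# Illustrative hard-dollar estimate of the cost of poor data quality,
# the kind of figure a business case should include. All inputs are
# hypothetical assumptions, not survey data.
records_per_year = 500_000   # assumed annual transaction volume
error_rate = 0.02            # assumed share of records with defects
cost_per_error = 25.0        # assumed rework/remediation cost per defect

annual_cost = records_per_year * error_rate * cost_per_error
print(f"Estimated annual cost of poor data quality: ${annual_cost:,.0f}")
```

Even a rough estimate of this kind gives executives a baseline against which to weigh the cost of a data quality program.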
The survey also probed the scope of existing data quality assurance initiatives with this question: Did they cover all corporate data, enterprisewide? Only 28% of the 2013 respondents said their programs did. The most popular domain covered by quality efforts was name and address data (of customers or suppliers), which maps to what vendors of data quality tools have focused on; product data and financial data were next on the popularity list.
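The name-and-address focus of commercial tools typically comes down to normalizing records and flagging likely duplicates. A minimal sketch of that kind of check, assuming a simple list of customer records (all names and addresses hypothetical):

```python
# Sketch of a basic name-and-address duplicate check, the sort of task
# data quality tools in this domain automate. Records are hypothetical.
import re
from collections import defaultdict

def normalize(record):
    """Canonicalize a name/address pair for comparison."""
    name = re.sub(r"\s+", " ", record["name"].strip().lower())
    addr = re.sub(r"\s+", " ", record["address"].strip().lower())
    addr = addr.replace("street", "st").replace("avenue", "ave")
    return (name, addr)

def find_duplicates(records):
    """Group record indices whose normalized name/address match."""
    groups = defaultdict(list)
    for i, rec in enumerate(records):
        groups[normalize(rec)].append(i)
    return [ids for ids in groups.values() if len(ids) > 1]

customers = [
    {"name": "Jane Doe",  "address": "12 Main Street"},
    {"name": "jane  doe", "address": "12 Main St"},
    {"name": "John Roe",  "address": "4 Oak Avenue"},
]
print(find_duplicates(customers))  # -> [[0, 1]]
```

Real tools add fuzzy matching, reference data and postal validation on top of this, but the core idea is the same: standardize, then compare.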
Data quality picture not a very pretty one
Big data was another survey topic, and 55% of the respondents said they believed that data quality was "very relevant" for big data projects. Yet that would seem to be whistling in the wind, given the poor state of affairs that the survey revealed about the quality of "small data" -- i.e., what most organizations actually have now.
And there is a business price to pay, even if companies don't always quantify it. Survey respondents were asked to volunteer real-world stories of data quality problems affecting business operations, which made for fascinating reading. They reported everything from the relatively minor "duplicate billing" and "customer goods shipped to incorrect locations," to major mishaps -- for example, "cancellation of a contract due to an undeliverable reminder letter" and "CEO had to revise price/earnings over concern that basic measurement is out of control."
Overall, the survey paints a not-so-pretty picture of enterprise data quality. Despite the increased emphasis on regulation, and all the horror stories about the consequences of poor data quality, and a significant increase in the percentage of companies expending effort and money on data quality measures, reported quality levels haven't improved over the past four years. Indeed, they've worsened slightly.
Companies need to do better. And all of the government regulators pinning their hopes on shiny new risk management models and elaborate sign-off assurances to avoid another financial meltdown would be well advised to consider just how reliable the information is that they're being fed by large corporations.
Andy Hayler is co-founder and CEO of London-based consulting company The Information Difference Ltd. and a frequent keynote speaker at conferences on master data management, data governance and data quality. He also reviews restaurants and blogs about food on the website Andy Hayler's Restaurant Guide.