When engineering and construction services contractor BergerABAM built a cable trench at the Port of Seattle three years ago, it acted on data that was off by the smallest of margins: The company made the trench 2.5 inches wide instead of the required 2.52 inches. That missing two-hundredths of an inch caused more than $1 million in extra costs on the project, according to media reports. Overall, estimates of the added costs and business losses resulting from inaccurate data run into the billions of dollars per year.
Clearly, the stakes of data quality are high -- and they’re only getting higher in today’s fast-paced and technology-driven business environment. In an era of online systems and automated enterprises, there’s less margin for error on data accuracy. And with organizations depending more and more on business intelligence and analytics applications to drive decision making, an effective data quality process that starts with the business users who enter data into transaction systems is a must.
Add to that the fact that real- or near-real-time operations now function as the central nervous system in many companies. If a customer uses a mobile device to purchase, say, a set of cross-country skis from an online retailer, the seller must process reams of data within seconds -- from the details of the transaction to customer service support data to information about additional purchases and more. Compare the possible business rewards of that data being correct with the problems that could result if some of it is inaccurate. That gulf could ultimately separate the business winners from the losers.
“More enterprises are realizing that effective real-time operations give them a competitive advantage,” said Andres Perez, president and senior information management consultant at IRM Consulting Ltd. in San Antonio. But he and other data management analysts indicated that such a realization should be accompanied by a full appreciation of the value of good data quality throughout an organization.
And that, they said, means rallying the troops to launch a comprehensive data quality program. From the top of a company to the bottom, efforts to maintain stellar levels of data quality should be viewed as a required part of the job. “Everybody in an organization needs to be aware that they influence data quality, and they should do their utmost to contribute to positive data quality efforts,” said Ted Friedman, an analyst at Gartner Inc. in Stamford, Conn.
The words ownership and accountability should never be left unspoken in data quality conversations, analysts said. Responsibility for implementing a corporate data quality strategy doesn't reside solely in IT or any single department or business unit; cooperation is needed from all quarters. In many cases, such programs involve operational changes and cultural shifts, which can pose organizational challenges. But the alternative -- dismissing data quality or trying to prop it up with minimal resources -- all too often involves prohibitive risks, from reduced productivity to lost business, customer defections and reduced profits.
Data quality software can help streamline quality improvement processes, but Friedman said to beware of employing technology as the sole instrument of data quality management. Data quality problems rarely can be resolved by software tools alone. The real gains come from successfully incorporating business users into the mix of a data quality initiative.
The required business-process changes often encounter resistance, particularly if the expected benefits aren’t apparent to users. As a result, there’s a need for detailed education and explanation as part of a data quality training program, analysts said. In some cases, they added, IT and data quality managers as well as data stewards assume that business users already understand the value of the data quality process and the risks of letting faulty data creep into systems. Showing them the cause-and-effect relationship between data quality and business results should help drive home the point.
If only business users didn’t have day jobs. Assuming responsibility and accountability for data quality can also seem onerous if the required actions further complicate those jobs. The solution? Include representatives from business units in the process of drawing up new data quality standards, said William McKnight, president of McKnight Consulting Group in Plano, Texas. While doing so comes with its own challenges, such as the inherent difficulty of bridging the IT-business divide, “the business side getting involved in figuring out the rules is critical,” McKnight said.
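To make the idea of business-defined rules concrete, here is a minimal sketch of how one such standard might be captured in code -- for instance, a tolerance check of the kind that could have caught the Port of Seattle trench error. The function name, field names and tolerance value are all hypothetical illustrations, not taken from any specific tool or from the projects described here.

```python
# Hypothetical sketch: encoding a business-agreed data quality rule as a check.
# The expected value and tolerance would come from the business side, per McKnight.

def within_tolerance(measured_in, expected_in=2.52, tolerance_in=0.005):
    """Return True if a measurement falls within the agreed tolerance (inches)."""
    return abs(measured_in - expected_in) <= tolerance_in

# Running the rule over incoming records flags the bad entry before it
# reaches downstream systems.
measurements = [2.52, 2.5, 2.521]
results = [within_tolerance(m) for m in measurements]
# results -> [True, False, True]; the 2.5-inch entry is flagged for review.
```

The point of a sketch like this is not the code itself but who writes the rule: the expected value and tolerance are business decisions, which is why business-unit representatives need a seat at the table when standards are drawn up.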
Longer term, business users should be included in overseeing data quality improvement efforts through a data governance program that gives central roles to departmental managers or workers, analysts said.
What’s vital, they added, is to make sure that data quality is a priority up and down an organization and to do what’s necessary to ensure that it gets the required attention.
“More and more businesses need to invest resources for quality, useful information,” said Lyn Robison, another Gartner analyst. “A business that can’t produce useful information is like an airplane that can’t fly. How useful is that? Not very.”
Roger du Mars is a freelance writer based in Redmond, Wash. He has written for publications such as Time, USA Today and The Boston Globe, and he previously was the Seoul, South Korea, bureau chief of Asiaweek and the South China Morning Post.