If corporate executives are faced with a choice between a proactive data quality plan that includes active participation by business users and traditional, IT-led approaches to fixing data quality problems, which will get the nod? The answer might well depend on whether IT managers can get the execs to understand the business ramifications of bad data.
Reactive responses to data quality issues are commonplace but often ill-conceived, according to data management analysts. Most pointedly, they said, trying to identify and resolve data errors that have found their way into an organization’s systems takes up valuable time and can cast data quality professionals into a purgatory of inefficient scrambling.
Not being proactive about dealing with data quality shortcomings can be particularly problematic in today’s business environment, with technology driving business processes and data being processed at breakneck speed. “Good enough is not good enough” -- the maxim of perfectionists -- now applies to data quality like never before. Or, at least, it should, said analysts such as William McKnight, president of McKnight Consulting Group in Plano, Texas.
What corporate executives definitely understand is the importance of making good decisions. But do they recognize the connection between the data quality process and sound decision making? “Decisions based on quality data can be quality decisions, and the value of a quality decision is almost immeasurable,” McKnight said.
The forward-thinking approach is to establish and maintain top-grade data quality from the start of the data entry process, with business users hewing to internal data quality standards as they input and update information. To win the support of senior executives for such efforts, IT managers should plan to demonstrate the cause-and-effect relationship between data quality and the bottom line and offer a comparison between the costs of being reactive and the potential benefits of a proactive data quality strategy.
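The proactive approach described above, catching bad values at the point of entry instead of cleansing them later, can be sketched in a few lines of code. The field names and rules below are hypothetical illustrations, not any organization's actual standards:

```python
import re

# Hypothetical entry-time standards; a real data quality program would
# define these rules centrally and keep them versioned.
RULES = {
    "customer_id": lambda v: bool(re.fullmatch(r"C\d{6}", v)),
    "email": lambda v: bool(re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", v)),
    "state": lambda v: v in {"TX", "CT", "MD", "TN", "WA"},
}

def validate_record(record):
    """Return the list of fields that violate the entry standards.

    An empty list means the record is clean and may be saved; anything
    else is rejected before the bad data ever reaches downstream systems.
    """
    errors = []
    for field, check in RULES.items():
        value = record.get(field, "")
        if not isinstance(value, str) or not check(value):
            errors.append(field)
    return errors
```

Rejecting a record at this gate costs one round trip with the business user who typed it; finding the same error months later, after it has propagated into reports and decisions, is the reactive scramble the analysts warn against.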
In many organizations, that might require a convincing explanation. “Spelling out to corporate executives how data quality is central to the business strategy by showing concrete anecdotal and empirical evidence is a good idea,” said Lyn Robison, a research vice president at Gartner Inc. in Stamford, Conn. “You would think that they would already connect these dots, but they have a lot of things on their minds.”
In addition, busy executives frequently assume that improving data quality isn’t feasible or that developing and implementing a robust data quality program is a daunting task, according to Robison. “Information science is a relatively new discipline, and to corporate executives it can seem like black magic, when in fact improving data is relatively easy,” he said.
Eyeing quality on the data assembly line
Larry English, president of consultancy Information Impact International Inc. in Brentwood, Tenn., recommends that organizations look at information management and data quality based on the principles advocated by W. Edwards Deming, an author and consultant who was one of the catalysts of Japan’s manufacturing revolution in the 1950s. Applying Deming’s theories, English views data as a raw material from which polished information is derived and presented to business users -- a concept that he said can help frame discussions about data quality best practices with corporate executives.
“Whenever I go into an organization, I try to get into the executive office and help them understand the absolutes [of information quality] so they understand that if they have process failures it’s going to create problems and losses across the organization,” English said. That, he added, can clinch the deal for marshaling a comprehensive data quality plan that reaches from the top to the bottom of a company.
In addition, business leaders should be made to understand that using data quality software to find and fix data errors represents only a piece of the data quality puzzle. Data profiling tools and other technologies can help facilitate data quality efforts, said David Loshin, president of Knowledge Integrity Inc., a consultancy in Silver Spring, Md. But, he added, “automated processes alone don’t solve data quality issues.”
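Loshin’s point can be made concrete with a minimal sketch of what a data profiling tool computes. This is an illustrative example, not any vendor’s actual product: it reports completeness, cardinality, and value-pattern frequencies for a column, statistics that surface anomalies for a person to investigate rather than fixing anything automatically.

```python
import re
from collections import Counter

def profile_column(values):
    """Summarize one column the way a basic profiling tool would:
    completeness (share of non-empty values), cardinality (distinct
    count), and the most common value 'shapes'."""
    total = len(values)
    non_null = [v for v in values if v not in (None, "")]
    # Reduce each value to a shape pattern: digits become 9, letters A,
    # so "75093" -> "99999" and "2090-1" -> "9999-9".
    patterns = Counter(
        re.sub(r"[A-Za-z]", "A", re.sub(r"\d", "9", v)) for v in non_null
    )
    return {
        "completeness": len(non_null) / total if total else 0.0,
        "distinct": len(set(non_null)),
        "top_patterns": patterns.most_common(3),
    }
```

A rare pattern in the output, say one phone number shaped unlike the other ten thousand, flags a likely error, but deciding what the correct value should be still requires a business user. That division of labor is why the tools facilitate, rather than replace, a data quality program.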
Once corporate executives are convinced of the need to approve and fund a program that gives business users some of the responsibility for data quality, they might also need to spearhead the initiative to make sure that everyone in an organization gets on board.
Internal resistance to new data quality standards and mandates should be expected; analysts said that the best way to foster enterprisewide cooperation and a sense of esprit de corps among business users is to have corporate leaders explain the importance of high-quality data to the troops and then continue to discuss data quality on a regular basis. Lack of cooperation primarily stems from a lack of communication, they added.
Roger du Mars is a freelance writer based in Redmond, Wash. He has written for publications such as Time, USA Today and The Boston Globe, and he previously was the Seoul, South Korea, bureau chief of Asiaweek and the South China Morning Post.