Organizations need to focus on being more proactive when designing or updating data quality management strategies, according to IT industry consultants and technology professionals.
Experts say organizations spend a great deal of valuable time cleaning up data quality problems in the data warehouse and business intelligence (BI) environment, when a more proactive strategy could save time, ensure more reliable information and ultimately help business workers make better decisions.
"You have to be proactive," said Suvendu Datta, a data warehouse team leader with a large insurance company. Datta's team is in the process of implementing a new data warehouse, along with new data quality policies. "I think you should take care of data quality [in] the design phase."
But experts warn that the decision to design a more proactive data quality management strategy comes with a special set of challenges. For one, it's difficult to justify an investment in improving the data capture and validation processes that occur "upstream" from the data warehouse, said Rob Karel, a data management analyst with Cambridge, Mass.-based Forrester Research Inc.
Upstream data quality projects can take years to perfect, Karel said, and the return on investment (ROI) may be difficult to measure. Instead, most organizations opt for data quality tools that are geared toward normalizing data in the warehouse. Such "downstream" tools are easier to justify because they produce tangible results more quickly.
Yet while these downstream data quality tools do their jobs reasonably well, Karel thinks they're no substitute for capturing data properly in the first place. By taking the time to engineer more effective data collection and governance techniques, he said, organizations will greatly reduce inefficiencies and increase the level of confidence in business information.
"The data in the data warehousing environment [isn't] created there," Karel said, "so any BI or data warehousing professional trying to build out a solution is at the mercy of the data coming in. They're really just chasing their tail when implementing batch data quality within the data warehouse."
Karel explained that many data quality problems arise at the point of data entry. For example, a call center agent might accidentally type in the wrong address or phone number of a new customer. These errors can then be propagated to any number of other source systems, such as customer relationship management or enterprise resource planning applications. It's a tough problem to address, he said, because business units charged with collecting customer information tend to value speed over accuracy.
"The reason people don't do [upstream data quality] is that it's a heck of a lot more invasive. You're actually changing the customer experience or changing the call center process or changing the account management process," Karel said. "That could impact [customer] experience or increase average handle time."
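Upstream validation of the kind Karel describes can start small. The sketch below is purely illustrative -- the field names and format rules are hypothetical, not drawn from any system mentioned here -- but it shows the basic idea: reject obviously malformed values at the capture form, before they can propagate to CRM, ERP or the warehouse.

```python
import re

# Hypothetical validation rules for a customer-capture form.
# Catching bad values at entry keeps them out of every downstream system.
US_PHONE = re.compile(r"^\d{3}-\d{3}-\d{4}$")
US_ZIP = re.compile(r"^\d{5}(-\d{4})?$")

def validate_customer(record: dict) -> list[str]:
    """Return a list of field-level errors; an empty list means clean."""
    errors = []
    if not record.get("name", "").strip():
        errors.append("name: required")
    if not US_PHONE.match(record.get("phone", "")):
        errors.append("phone: expected NNN-NNN-NNNN")
    if not US_ZIP.match(record.get("zip", "")):
        errors.append("zip: expected NNNNN or NNNNN-NNNN")
    return errors

clean = {"name": "Ada Lovelace", "phone": "615-555-0142", "zip": "37201"}
dirty = {"name": "Ada Lovelace", "phone": "5550142", "zip": "372"}
assert validate_customer(clean) == []
assert validate_customer(dirty) == ["phone: expected NNN-NNN-NNNN",
                                    "zip: expected NNNNN or NNNNN-NNNN"]
```

Checks like these are exactly what Karel warns can lengthen call-handle time, which is why they are usually kept to a handful of high-value fields.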
Create a more proactive data quality management strategy
Despite the obstacles, there are several steps organizations can take that will result in more proactive data quality management, according to experts.
The first step is to focus on the system that is causing the most headaches, according to Karel. For example, if a CRM application is rife with data quality issues, it is probably a good idea to focus on the information collection processes associated with that application.
"When you're trying to be proactive, don't try to identify every single point of data capture and update," Karel said. "Try to identify which systems or applications or processes or teams are impacting the highest volume of valuable data and start using that as your pilot."
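One way to put that advice into practice is a quick profiling pass that ranks source systems by the volume of records failing basic checks. The example below is a sketch under stated assumptions: the system names and the single completeness check stand in for whatever feeds and rules an organization actually has.

```python
from collections import Counter

# Hypothetical extract: (source_system, record) pairs from several feeds.
records = [
    ("crm", {"email": "a@example.com"}),
    ("crm", {"email": ""}),
    ("crm", {"email": None}),
    ("erp", {"email": "b@example.com"}),
    ("billing", {"email": ""}),
]

# Count records per source that fail a basic completeness check.
failures = Counter(
    source for source, rec in records if not (rec.get("email") or "").strip()
)

# Rank sources by failure volume to pick a pilot for upstream fixes.
for source, count in failures.most_common():
    print(f"{source}: {count} incomplete record(s)")
```

The source at the top of the list is the natural candidate for the pilot Karel describes.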
But be careful not to take a "siloed" approach. When focusing on individual processes, be sure to keep enterprisewide data governance goals in mind.
"Work collaboratively with the process owners [and] use that as a testing ground," the analyst said. "If you're able to implement that effectively, then you can extend the same logic to other touchpoints."
It's also important to seek out the root causes of data quality problems and not simply the precipitating causes, added information quality consultant Larry English, the founder and principal of Nashville, Tenn.-based Information Impact International Inc. For example, if that call center agent repeatedly makes mistakes when capturing customer information, the root cause is likely a lack of proper training.
"If we don't solve the root-cause problem, the process improvements that we attempt to implement may fail," English said. "We must implement error-proofing techniques in the processes, both manual [and electronic] processes."
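Error-proofing in English's sense often means making the invalid entry impossible rather than correcting it later. As a minimal, hypothetical sketch (the state list is just an example), replacing a free-text field with a fixed set of choices eliminates a whole class of typos at capture time:

```python
from enum import Enum

# Hypothetical constrained-choice field: the agent picks a code
# rather than typing a state name into free text.
class State(Enum):
    TN = "Tennessee"
    NJ = "New Jersey"
    MA = "Massachusetts"

def set_state(value: str) -> State:
    """Accept only recognized state codes; anything else raises
    immediately instead of silently entering bad data."""
    return State[value.upper()]

assert set_state("tn") is State.TN
# set_state("T N") would raise KeyError -- the error surfaces at entry.
```

The same pattern applies to electronic processes: a database check constraint or an API schema plays the role of the dropdown.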
Rethinking data quality management
Richard Ordowich is another consultant who believes that data quality management strategies tend to be too reactive. A senior partner with Princeton, N.J.-based STS Associates Inc., Ordowich says organizations that want to be more proactive must adopt a culture of quality -- and not just data quality.
"If the organization has established a quality culture then evolving to a data quality program is possible," Ordowich said in an email interview. "If the organization has no quality culture, then the concepts of data quality will be abstract, distracting and annoying."
Simply writing new information capture procedures and putting them into a handbook will not be enough to change the culture of the company, the consultant continued. Instead, it takes training, collaboration and a willingness to make fundamental changes.
"Manuals will be put on the shelf and ignored without the cultural adoption," he wrote. "Establish a culture of quality before embarking on data quality."
Once the culture of quality is established, organizations can begin the task of implementing proactive data quality measures. Analysts often say this is a good time to make sure that the responsibility for data quality falls on the shoulders of business units -- the people who create the data. But Ordowich questions this conventional wisdom. Instead, he thinks organizations should consider making it the chief financial officer's responsibility.
"CFOs have been dealing with data quality since the early days of the profession. They are the most aware of the potential consequences of bad data since the result to them may be jail," he wrote. "If you can’t convince the CFO for the need for data quality, then perhaps the organization is not ready or the ROI is not adequately justified."