IT fix-it approach not enough for effective data quality strategy

Relying on data warehousing and integration teams to clean up data errors is a common practice. But preventing data quality problems is better than trying to cure systems of them later on.

In many organizations, the prevailing view of data quality management is that someone else will take care of it. Front-line business users aren’t asked to focus on avoiding data errors, so they don’t make it a priority as they enter and update information in corporate systems. The data quality problems that result from such laissez-faire attitudes are left to be cleaned up later, often when data from different transaction systems is being consolidated for loading into data warehouses.
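The prevention-over-cure idea can be made concrete with a small sketch: validate a record at the point of entry and reject it with specific error messages, rather than letting the bad value reach a transaction system. This is a hypothetical example -- the field names (`name`, `email`, `country`) and the simple email pattern are assumptions for illustration, not rules from the article.

```python
import re

def validate_customer_record(record):
    """Check a customer record at the point of entry and return a list
    of problems, so bad data never reaches downstream systems."""
    errors = []
    if not record.get("name", "").strip():
        errors.append("name is required")
    email = record.get("email", "")
    if not re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", email):
        errors.append(f"invalid email: {email!r}")
    if not record.get("country"):
        errors.append("country is required")
    return errors

# A malformed email is caught when the user enters it, instead of
# being cleaned up months later during warehouse loading.
problems = validate_customer_record(
    {"name": "Ann Lee", "email": "ann.lee", "country": "US"}
)
```

Even a check this simple shifts responsibility to the front line, which is exactly the cultural change the analysts quoted below are arguing for.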

That might not cause big business problems for companies -- not for the fortunate ones, at least. But as freelance writer Roger du Mars reported in June, the overall cost of bad data can likely be counted in the billions of dollars each year. And implementing an effective data quality strategy is becoming increasingly important because of a combination of factors that leave less room for data mistakes: the proliferation of online systems, increased automation of business processes, and user demands for real- or near-real-time access to data. What to do? For one thing, get end users more involved in the data quality process, according to data management analysts.

“Everybody in an organization needs to be aware that they influence data quality, and they should do their utmost to contribute to positive data quality efforts,” Gartner analyst Ted Friedman told du Mars. Of course, backing from the top of a company is a prerequisite to imposing new data quality standards all the way down to the bottom of the org chart. Time to put on a sales hat: It would be nice if corporate executives reflexively understood the business downsides of low-quality data, but IT managers likely will have to connect the dots for them to get the go-ahead for an all-in data quality initiative.

One of the big downsides that should be part of the internal sales pitch is the detrimental effect of faulty data on business intelligence (BI) processes. As mentioned above, fixing data glitches and inconsistencies is part of the data warehouse loading stage in many organizations. But data errors often slip through the BI data integration net. Friedman, talking this time to freelancer Alan R. Earls, said he frequently consults with companies that aren’t getting the expected value from their BI investments “because the quality of the data is not good enough, and they haven’t done the right things to fix that.”
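One way to keep errors from slipping through the loading stage silently is a quality gate that splits incoming rows into loadable records and rejects routed for review. The sketch below is a hypothetical illustration of that pattern -- the row fields (`order_id`, `amount`) and the required-field rule are assumptions, not anything Friedman or Earls describe.

```python
def load_with_quality_gate(rows, required_fields):
    """Split incoming rows into loadable records and rejects during
    the warehouse loading stage, so errors don't pass through unseen."""
    loadable, rejects = [], []
    for row in rows:
        missing = [f for f in required_fields if not row.get(f)]
        if missing:
            rejects.append({"row": row, "missing": missing})
        else:
            loadable.append(row)
    return loadable, rejects

rows = [
    {"order_id": "1001", "amount": 25.0},
    {"order_id": "", "amount": 99.0},  # missing key -- held back for review
]
good, bad = load_with_quality_gate(rows, ["order_id", "amount"])
```

The point is not the check itself but the routing: a reject queue makes bad data visible to the business, which is where the BI value Friedman describes gets lost when errors are absorbed quietly.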

What’s the state of your data quality strategy? Email me and let me know if your organization has unlocked the secrets to effective data quality management or is still looking for the key.


Twitter: @sDataManagement


Next Steps

Breitburn Partners pursues data quality improvements
