Using data quality metrics to build a data quality business case

Get expert tips on calculating data quality metrics to document the financial impact of poor data quality and help build a business case for a data quality improvement program.

Most organizations depend on high-quality data for proper execution of operational and transactional processes and to get reliable results from the reports and analytic queries used to identify opportunities for improved efficiency or business growth. So it’s critical to ensure that data quality is sufficient to meet your business needs and objectives. An enterprise data quality program will introduce processes for assessing, reporting on, reacting to and controlling the business risks associated with poor data quality.

To what extent does flawed data affect your organization’s business? Because of the dynamic nature of data, which typically is generated through numerous business processes and information feeds that are combined, stored and used across various systems, it can be difficult to establish ways to assess the impact of low-quality data. In fact, the job of correlating business impacts with data failures may appear too large to manage. But compare it to eating an elephant: it seems big, but if you carve it into small enough chunks, it can be done one bite at a time.

To communicate the value of good data quality and build a business case for a data quality improvement project, IT managers and data management professionals should get out their carving knives and chop up the process into these four steps for creating data quality metrics:

  • Identifying business problems that are attributable to poor data quality.
  • Identifying potential data quality remediation tasks and their estimated costs.
  • Assessing the loss of business value resulting from the quality problems – i.e., the “value gap.”
  • Calculating the “opportunity value,” which measures the difference between the lost value and the remediation costs.

Proposed remediation tasks can then be prioritized based on those calculations. The net result is a set of estimated return on investment metrics that can be used to help establish a business case for a data quality management program. The same metrics can then be used as performance targets to continually monitor the success of the program, a process that will be discussed in a related article.

Beginning the process of developing data quality metrics

Assessing how poor data quality impedes business success involves identifying data issues, categorizing their associated business impacts and then prioritizing the issues according to their severity from a business standpoint. A simple taxonomy can provide primary categories for classifying and evaluating the negative impacts related to data errors:

  • Financial impacts, such as increased costs, decreased revenues or higher penalties, fines and other direct costs stemming from data quality problems.
  • Confidence-based impacts, such as reduced satisfaction ratings on customer, employee and supplier surveys, as well as impaired decision-making processes.
  • Employee productivity issues, such as increased workloads, decreased throughput and longer processing times.
  • Corporate risk and compliance problems associated with credit assessments, the evaluation of investment and competitive risks, and compliance with government regulations and industry rules or guidelines.

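As a simple illustration, the taxonomy can be captured in code so that each reported issue carries both an impact category and a business-assigned severity. The following Python sketch is purely illustrative; the class names, issue descriptions and five-point severity scale are hypothetical placeholders, not a prescribed model:

    from dataclasses import dataclass
    from enum import Enum

    class ImpactCategory(Enum):
        """Primary categories for classifying the business impacts of data errors."""
        FINANCIAL = "financial"                  # costs, revenues, penalties and fines
        CONFIDENCE = "confidence"                # satisfaction ratings, decision-making
        PRODUCTIVITY = "productivity"            # workloads, throughput, processing times
        RISK_COMPLIANCE = "risk and compliance"  # credit, investment, regulatory exposure

    @dataclass
    class DataQualityIssue:
        """A reported issue tagged with its impact category and business severity."""
        description: str
        category: ImpactCategory
        severity: int  # hypothetical scale: 1 (minor) to 5 (critical)

    issues = [
        DataQualityIssue("Duplicate customer records inflate mailing costs",
                         ImpactCategory.FINANCIAL, severity=4),
        DataQualityIssue("Inconsistent product codes delay monthly reporting",
                         ImpactCategory.PRODUCTIVITY, severity=3),
    ]

    # Prioritize by business severity, most critical first.
    for issue in sorted(issues, key=lambda i: i.severity, reverse=True):
        print(f"[{issue.category.value}] severity {issue.severity}: {issue.description}")
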
Although data quality teams often focus on the financial impact category, the other categories often are compromised by poor data quality as well. Determining which of the affected indicators are most important to your organization can help you start preparing a data quality business case that will resonate with corporate executives and business managers.

The next step in creating data quality metrics to buttress your business case involves understanding the root causes of the data quality issues and determining how they can be addressed. That means reviewing how data flows through individual business processes to determine where errors are being introduced. (Many data quality problems are caused by process failures, which can complicate things – but correcting a faulty business process typically will result in more effective quality improvements than correcting bad data downstream.)
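
One hypothetical way to locate where errors enter a process is to apply the same validation rule to snapshots of the data at successive handoff points; the stage where the failure count jumps is a candidate root cause. The rule, field names and sample records below are invented for illustration:

    def postal_code_present(record: dict) -> bool:
        """Hypothetical validation rule: a customer record must carry a postal code."""
        return bool(record.get("postal_code"))

    def audit_stage(stage_name: str, records: list) -> None:
        """Report how many records violate the rule at a given handoff point."""
        failures = sum(1 for r in records if not postal_code_present(r))
        print(f"{stage_name}: {failures} of {len(records)} records fail the check")

    # Invented snapshots of the same records at two points in the business process.
    after_order_entry = [{"postal_code": "20850"}, {"postal_code": "02134"}]
    after_crm_merge = [{"postal_code": "20850"}, {"postal_code": ""}]

    audit_stage("after order entry", after_order_entry)
    audit_stage("after CRM merge", after_crm_merge)
    # A jump in failures after the merge points to that step as the likely source.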

Once the source of a data error is identified, IT managers and data analysts should consider alternatives for eliminating the cause, instituting preventive techniques or taking some other corrective action. Each of these alternatives requires an investment of both money and resources – for example, to buy any data quality tools that are needed for the remediation work and to fund training sessions and ongoing maintenance of the software. Staff time will also be required for fully analyzing data quality problems and then designing and implementing the data quality program. Developing an estimate of the anticipated expenses will establish a baseline cost for the remediation work.
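
A baseline cost estimate can be as simple as summing the anticipated line items. All of the figures in this sketch are placeholders; actual tool, training, maintenance and staffing costs will vary by organization:

    # Hypothetical line items for one remediation effort; none are benchmarks.
    LOADED_HOURLY_RATE = 95  # assumed fully loaded staff cost per hour

    remediation_costs = {
        "data quality tool licenses": 50_000,
        "training sessions": 8_000,
        "first-year software maintenance": 10_000,
        "analysis and design": 120 * LOADED_HOURLY_RATE,   # staff hours x rate
        "implementation": 300 * LOADED_HOURLY_RATE,
    }

    baseline_cost = sum(remediation_costs.values())
    print(f"Estimated baseline remediation cost: ${baseline_cost:,}")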

Looking beyond monetary factors for your data quality metrics

Next, work to calculate the organizational costs of the various business impacts and project those costs over a year’s time. Not all of the indicators will necessarily be quantifiable in monetary terms, so in some cases you may need to use non-financial metrics, such as risk exposure ratings or the percentage of compliant records. As long as there is a measurable impact that can be associated with the occurrence of the data error, documenting it establishes a correlation.
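
For instance, a recurring impact can be annualized, and a non-financial indicator such as the percentage of compliant records can be computed directly from profiling counts. The numbers in this sketch are hypothetical:

    # Annualize a recurring monthly impact reported by the affected teams.
    monthly_rework_cost = 12_500          # hypothetical estimate
    annual_value_gap = monthly_rework_cost * 12

    # A non-financial metric: the share of records satisfying a compliance rule.
    records_checked = 48_000
    records_compliant = 45_120
    compliance_rate = records_compliant / records_checked

    print(f"Annualized value gap: ${annual_value_gap:,}")
    print(f"Compliant records: {compliance_rate:.1%}")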

Sympathetic business leaders – for example, executives whose operations are being affected by data quality problems – may be able to help with these value gap estimates. But because of the challenges of determining the actual costs of data errors, be conservative. Provide estimates that are believable and supportable – and that can be used to establish achievable data quality improvement goals.

Of course, the business case cannot account only for the potential upside benefits of a data quality program – the costs associated with the remediation tasks must also be factored in.

One thing to keep in mind: when multiple quality problems can be fixed via the same solution, there are potential economies of scale that allow the required investments in both internal resources and data quality software to be amortized across the identified issues.
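
In that case, the shared investment can be divided across the issues it resolves. A minimal sketch, assuming an even split (a weighted allocation would work equally well); the costs and issue names are invented:

    # One shared fix (e.g., an address standardization tool) covers several issues.
    shared_solution_cost = 60_000  # hypothetical combined software and staffing cost
    issues_covered = ["duplicate customers", "invalid addresses", "bad product codes"]

    # Spread the shared investment evenly across the issues it resolves.
    allocated_cost = shared_solution_cost / len(issues_covered)
    for issue in issues_covered:
        print(f"{issue}: allocated remediation cost ${allocated_cost:,.0f}")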

Prioritizing the remediation tasks as part of the business case can essentially be boiled down to simple arithmetic: for each data issue, calculate the opportunity value as the value gap minus the remediation cost. Sorting the issues by their opportunity value highlights the ones whose remediation would provide the most potential upside to the organization. While this model is a simple starting point, other dimensions can be integrated into the calculations as well, such as time to value, available skills and the staff learning curve that would be required to fix a particular problem.
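
That arithmetic is straightforward to express. In this sketch, the issue names and dollar figures are hypothetical; only the formula itself, opportunity value equals value gap minus remediation cost, comes from the process described above:

    # Each entry: issue name, annualized value gap, estimated remediation cost.
    issues = [
        ("duplicate customer records", 150_000, 60_000),
        ("invalid shipping addresses", 90_000, 25_000),
        ("inconsistent product codes", 40_000, 55_000),
    ]

    # Opportunity value = value gap - remediation cost.
    ranked = sorted(
        ((name, gap - cost) for name, gap, cost in issues),
        key=lambda pair: pair[1],
        reverse=True,
    )

    for name, opportunity in ranked:
        print(f"{name}: opportunity value ${opportunity:,}")
    # A negative result flags a fix that would cost more than it returns.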

Taken as a whole, this process provides a quantifiable measurement of the value proposition for instituting a data quality program. And by casting these data points within the context of execution feasibility, the IT managers and data analysts who are hoping to sell corporate executives on a data quality initiative should be able to assemble a business case that projects a reasonable return on the required investments.

About the author:
David Loshin is the president of Knowledge Integrity, Inc., a consulting company focusing on customized information management solutions in technology areas including data quality, business intelligence, metadata and data standards management. Loshin writes for numerous publications, including both SearchDataManagement.com and SearchBusinessAnalytics.com. He also develops and teaches courses for The Data Warehousing Institute and other organizations, and he is a regular speaker at industry events. In addition, he is the author of Enterprise Knowledge Management – The Data Quality Approach and Business Intelligence: The Savvy Manager's Guide. Loshin can be reached via his website: knowledge-integrity.com.
