A never-ending story: Improving data quality and integration for BI

Data quality and integration need continual attention in business intelligence systems. But with the right strategy, you can avoid setbacks and keep moving forward.

Data quality demands never let up in business intelligence environments. Forrester Research Inc. analyst Boris Evelson set the scene aptly in a November 2012 blog post: No large organization "can ever hope to clean up all of its data -- it's always a continuous journey," he wrote. "The key is knowing what data sources feed your BI applications and how confident you are about the accuracy of data coming from each source." In addition to working on improving data quality as much as possible, Evelson recommended that BI teams create a "data confidence index" and assign scores to the transactional records in data warehouses to give business users an indication of how much trust they should put in the accuracy of information.
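Evelson's confidence-index idea can be sketched in a few lines of code: each BI data source carries a confidence score, and warehouse records are tagged with the score of the source they came from. This is only an illustrative sketch; the source names, scores, and function are hypothetical, not part of any product or of Evelson's own design.

```python
# Hypothetical "data confidence index": each source gets a score, and
# records inherit the score of the source that produced them.
SOURCE_CONFIDENCE = {
    "crm_system": 0.95,       # cleansed and validated nightly
    "web_clickstream": 0.70,  # raw, lightly validated
    "partner_feed": 0.55,     # external, no quality guarantees
}

def tag_with_confidence(record, source):
    """Attach the source's confidence score to a warehouse record."""
    tagged = dict(record)
    tagged["confidence_index"] = SOURCE_CONFIDENCE.get(source, 0.0)
    return tagged

order = tag_with_confidence({"order_id": 123, "amount": 250.0}, "crm_system")
print(order["confidence_index"])  # 0.95
```

A BI front end could then surface the score next to each report, giving business users the at-a-glance trust indicator Evelson describes.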

The same lack of let-up is true of data integration requirements: There's always more data from new sources needing to be pulled into BI systems, especially when big data is part of the picture. That further adds to the data quality burden shouldered by BI managers as well as their IT and data warehouse counterparts. In a story we published in September 2013, Michele Goetz, another Forrester analyst, said organizations looking to integrate data warehouses and big data systems to feed BI and analytics applications should start by creating a "contextual services" layer that underpins the effort. That should include data quality and data governance policies along with elements such as a metadata repository and an enterprise-wide glossary of business terms, according to Goetz.
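The elements Goetz lists can be pictured as a thin gatekeeping layer in front of the warehouse: a business glossary that defines shared terms and a metadata repository that records which datasets have passed quality checks. The sketch below is an assumption about how such a layer might look in miniature; the dataset names, fields, and function are invented for illustration.

```python
# Hypothetical "contextual services" elements: a business glossary and
# a metadata repository consulted before a dataset feeds BI applications.
GLOSSARY = {
    "customer": "A party that has purchased at least one product.",
    "churn": "A customer with no purchase in the trailing 12 months.",
}

METADATA = {
    "dw.sales.orders":  {"owner": "finance",   "quality_checked": True},
    "hadoop.raw.clicks": {"owner": "marketing", "quality_checked": False},
}

def approved_for_bi(dataset):
    """Only datasets that passed quality checks may feed BI reports."""
    meta = METADATA.get(dataset)
    return bool(meta and meta["quality_checked"])

print(approved_for_bi("dw.sales.orders"))    # True
print(approved_for_bi("hadoop.raw.clicks"))  # False
```

In a real deployment this layer would be backed by governance tooling rather than in-memory dictionaries, but the principle is the same: integration jobs consult shared context before data reaches BI users.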

While you might not be able to definitively declare victory on BI data quality and integration, you'll surely hear about any tactical defeats. SearchDataManagement and its sister site SearchBusinessAnalytics have published a selection of articles offering advice on how to avoid such unpleasant occurrences. In one, consultant Lyndsay Wise provides tips on implementing a successful BI data quality strategy. Another details the BI woes caused by data errors and inconsistencies, while a third story looks at the data integration challenges created by big data and real-time BI applications. A fourth delves more deeply into the new approaches to integrating and cleansing data required in big data environments. In addition, information management expert Andy Hayler assesses the stagnant state of corporate data quality levels, and consultant David Loshin outlines a five-step process for improving data quality. Good luck staying on the plus side of your organization's BI data quality and integration ledger.

Craig Stedman is executive editor of SearchDataManagement. Email him at cstedman@techtarget.com and follow us on Twitter: @sDataManagement.

This was first published in February 2014