We once had a financial services client that was committed to Six Sigma. The client believed in the goal of "zero defects" for all of its corporate data. An admirable goal, yes, but also a very expensive one. To consistently hit a 99.9 percent data accuracy rate, the client was forced to "over-process" the data. In addition to the requisite data quality automation, that processing consumed extensive time from business subject matter experts, data stewards and data governance sponsors. And at the end of the day, the business itself was using mere summary information most of the time, making all that record-level processing overkill.
In this case, perfect was the enemy of good. Yes, sometimes "good enough" is good enough. (Peter Drucker is spinning as I write this, but just the same.) The key is to understand your business requirements and then drill them down into data requirements. Those requirements will tell you conclusively what "good enough" really is, as the sketch below illustrates.
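To make that idea concrete, here is a minimal sketch, in Python, of what "drilling business requirements down to data requirements" can look like in practice. The field names, accuracy targets and validity checks are entirely hypothetical, not from the client engagement described above: the point is simply that each field gets an accuracy target justified by how the business actually consumes it, and the data is judged against that target rather than against zero defects.

```python
# Hypothetical "good enough" accuracy targets derived from business
# requirements: fields feeding regulatory reports need near-perfect
# accuracy, while fields consumed only in summary dashboards can
# tolerate more noise. All names and numbers here are illustrative.
ACCURACY_TARGETS = {
    "account_balance": 0.999,   # feeds regulatory reporting
    "customer_segment": 0.95,   # used only in aggregated summaries
    "phone_number": 0.90,       # nice to have; rarely used downstream
}

def field_accuracy(records, field, is_valid):
    """Share of records whose value for `field` passes the validity check."""
    values = [r.get(field) for r in records]
    if not values:
        return 0.0
    return sum(1 for v in values if is_valid(v)) / len(values)

def good_enough(records, field, is_valid):
    """Compare measured accuracy against the business-driven target."""
    measured = field_accuracy(records, field, is_valid)
    target = ACCURACY_TARGETS[field]
    return measured >= target, measured, target

# Usage sketch with toy data
records = [
    {"account_balance": 100.0, "customer_segment": "retail"},
    {"account_balance": None, "customer_segment": "retail"},
]
ok, measured, target = good_enough(
    records, "customer_segment", lambda v: v in {"retail", "commercial"}
)
print(f"customer_segment: {measured:.2%} measured vs {target:.2%} target "
      f"-> {'good enough' if ok else 'needs remediation'}")
```

The design choice the sketch encodes is the column's argument: remediation effort goes only to fields whose measured quality falls below the target the business actually needs, rather than pushing every field toward perfection.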