Not even a worldwide recession could hold down the data quality software and tools market, which continues to grow as more organizations recognize the importance of quality data to master data management, business intelligence and other information management initiatives, according to Gartner's latest Magic Quadrant report.
More organizations are also coming to the realization that poor data quality is a pervasive issue that puts a drag on performance, revenue and profits, said Andreas Bitterer, an analyst at the Stamford, Conn.-based firm, who co-wrote the report with fellow Gartner analyst Ted Friedman.
"Companies are realizing bad data is having a negative impact on performance," Bitterer said. "There's not one company on this planet that doesn't have a data quality problem."
The data quality and data integration markets also continued to converge in 2008, a trend Gartner identified in last year's data quality Magic Quadrant report, though not as quickly as Bitterer expected. It will still be several years before data quality and data integration software are routinely bundled in a single platform, he said, but the market is headed in that direction.
"Whenever you move data [be it through ETL, data federation or other methods] while you have [the data] in your hands, you should make sure whatever arrives at the other end is consistent and complete," Bitterer said.
The data quality market, which Gartner estimates stood at between $400 million and $500 million as of the end of last year, is largely divided between big, incumbent vendors that offer a wide breadth of functionality -- including data profiling, standardization, cleansing, matching and enrichment -- and small niche vendors with more targeted but less expansive capabilities, according to the report.
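The core functions named above can be sketched in a few lines. Everything in this example -- the field names, the cleansing rules, the sample records -- is invented for illustration and implies nothing about any vendor's implementation.

```python
import re
from collections import Counter

records = [
    {"name": "ACME Corp.", "phone": "(919) 555-0100"},
    {"name": "Acme Corporation", "phone": "919.555.0100"},
    {"name": "", "phone": "555-0199"},
]

def profile(rows):
    # Profiling: count missing values per field.
    missing = Counter()
    for row in rows:
        for field, value in row.items():
            if not value:
                missing[field] += 1
    return dict(missing)

def standardize_phone(phone):
    # Standardization: reduce a phone number to bare digits.
    return re.sub(r"\D", "", phone)

def normalize_name(name):
    # Cleansing and matching: compare names on a normalized key.
    name = re.sub(r"[^\w\s]", "", name.lower())             # drop punctuation
    name = re.sub(r"\b(corp|corporation|inc)\b", "", name)  # drop legal suffixes
    return re.sub(r"\s+", " ", name).strip()

print(profile(records))                     # {'name': 1}
print(standardize_phone("(919) 555-0100"))  # 9195550100
print(normalize_name("ACME Corp.") == normalize_name("Acme Corporation"))  # True
```

Commercial tools layer far more sophisticated rules, reference data and fuzzy matching on top of this, but the division of labor between the functions is the same.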
The large, incumbent vendors, notably SAP BusinessObjects and IBM, "increasingly focus on data quality capabilities as complementary to various components of their portfolios," Bitterer and Friedman wrote in the report. "While they sell data quality tools in a standalone manner [as individual products], these tools are increasingly sold as part of a larger transaction involving related products [such as data integration tools and MDM solutions]."
Data quality analysis tools for non-IT workers, like data quality dashboards and visualization applications, also continued to develop as organizations and vendors alike increasingly recognize data quality as a business problem, not just an IT problem, Bitterer said.
DataFlux software and tools top data quality rankings
DataFlux, the Cary, N.C.-based subsidiary of SAS Institute, topped Gartner's data quality vendor rankings this year. The vendor's broad data quality capabilities are easily integrated with other applications and relatively easy to use, especially for non-IT workers, the report found.
"With its 1,200 customers, DataFlux has become the enterprise-wide data quality standard in many large accounts," Bitterer and Friedman wrote. "The company has one of the highest ratios of reinvesting revenue in R&D and enjoys a maintenance renewal rate of over 95%."
Joining DataFlux in the leaders' quadrant were Trillium Software, IBM, Informatica and SAP BusinessObjects. Gartner's Magic Quadrant methodology places vendors that meet its inclusion criteria into one of four quadrants based on "completeness of vision" and "ability to execute." The quadrants are niche vendors, visionaries, challengers and leaders.
Trillium Software was cited for its "diversity of use cases, including those within BI activities, MDM solutions and in support of data governance programs," while Informatica landed in the leaders' quadrant on the strength of its data profiling functionality and domain-agnostic data parsing, standardization and matching capabilities, according to the report.
The niche players' quadrant was also crowded this year, with six vendors sharing the space: DataLever, Uniserv, Innovative Systems, DataMentors, Netrics and Datactics. Like other smaller data quality vendors, they offer mature if narrow capabilities but are likely to struggle to gain market share from their large, incumbent competitors, according to the report.
"With the increasing trend toward embedding data quality capabilities in business applications, data integration tools and other software offerings from larger vendors, these small competitors will face significant challenges as they attempt to survive and grow," the report stated.
Other vendors meeting Gartner's evaluation requirements included visionaries Human Inference and Datanomic, and Pitney Bowes Business Insight, the lone challenger.
Data quality not just an IT problem anymore
Data quality is as much a business issue as an IT issue, Bitterer said, so companies evaluating data quality vendors should heavily weigh the software's ease of use, especially for non-IT workers.
"The content is owned by the business," Bitterer said, and in most cases the business is the group that understands the nuances of the data best, those aspects most critical to data quality.
Therefore, the business must play a significant role in any data quality initiative, with tools to match their abilities and understanding. Vendors have largely made this connection already, with most offering data quality dashboards and other visualization tools to help non-IT users monitor and manage data quality projects, Bitterer said. And many organizations have already adopted them.
Bitterer also recommends that companies consider data quality vendors' location, as many vendors are stronger in European markets than in the U.S. and vice versa. Data quality is often affected by regional variations – the way an address is formatted, for example – and a vendor located in its customer's geographic area is likely to understand those differences better, he said.
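A toy example of the kind of regional variation he describes: the same address elements appear in a different order in the U.S. and Germany. The two formats below are simplified assumptions for illustration, not a full postal standard.

```python
def format_address(parts, country):
    # U.S. convention: street, city, state ZIP.
    if country == "US":
        return f"{parts['street']}, {parts['city']}, {parts['state']} {parts['postal']}"
    # German convention: the postal code precedes the city,
    # and no state appears in the address.
    if country == "DE":
        return f"{parts['street']}, {parts['postal']} {parts['city']}"
    raise ValueError(f"no formatting rule for {country}")

us = {"street": "100 Main St", "city": "Cary", "state": "NC", "postal": "27513"}
de = {"street": "Hauptstr. 5", "city": "Berlin", "postal": "10115"}
print(format_address(us, "US"))  # 100 Main St, Cary, NC 27513
print(format_address(de, "DE"))  # Hauptstr. 5, 10115 Berlin
```

A vendor that has only ever parsed one of these layouts will mis-split the other, which is why regional presence matters when evaluating address-handling tools.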
Finally, Bitterer said that all companies evaluating data quality vendors should consider both the large, incumbent vendors as well as the smaller, niche players. In some cases, companies with more targeted data quality issues can get the functionality they need from the smaller vendors at a much better price than from their larger counterparts.
"The companies in the upper right corner [of the Magic Quadrant] are a lot more expensive than the ones in the lower left corner," Bitterer said. "Not everybody needs the 38-ton truck."
Don't miss the other installments in this data quality management guide
Managing data quality programs during a recession
Trends in the data quality market
Avoiding data quality pitfalls and using data quality tools for discovering new opportunities
Q/A: Identifying data quality problems with a data quality assessment
FAQ: Best practices/tips for data quality