The data quality and data integration tools markets continue to converge, as organizations increasingly realize that the accuracy of data is just as important as delivering it, according to Gartner's recent Magic Quadrant.
A number of data integration vendors, recognizing customer demand, have recently acquired smaller data quality firms to plug holes in their product offerings, a trend that began three to four years ago and continues today, said Ted Friedman, vice president and distinguished analyst with the Stamford, Conn.-based analyst firm and co-author of the study. Data quality tools today are more often sold in packaged suites rather than as separate point solutions, he said. Smaller data quality vendors, meanwhile, are finding it harder to break through.
"The market has moved away from seeking point solutions as companies have begun thinking of the data quality problem as very far-reaching and pervasive," Friedman said. "[But] unless you're ensuring the quality of what's being delivered, then you're just like a faster manure spreader. You're just taking the crap data that's sitting in most companies' systems and you're spreading it around the organization."
The report also identified a proliferation of "domain agnostic" data quality tools. What began as a market focused on customer data quality has evolved into one that addresses a greater breadth of data domains, Friedman said. Companies want to ensure the quality of product, financial and other non-customer-related data, and vendors have responded by updating products with capabilities to address various data types. Vendors that remain focused on just one data domain, the report said, will increasingly find it difficult to compete.
"Demand in the market has shifted. Yes, customer data quality remains hugely important, but people have data quality issues in a lot of other domains as well," Friedman said. "The newer players in the space have come to recognize this, and so they've delivered technology that is not so fully optimized for just customer data, but it's suitable for use in a range of different domains. And even some of the larger and older providers that have played in the past in the customer data space have begun to evolve their offerings to be more useful in other domains."
Business Objects, for instance, last summer unveiled its Universal Data Cleanse (UDC) tool, an add-on to its Data Quality XI suite. UDC extends Business Objects' customer data quality capabilities to product and regional data domains.
Data quality Magic Quadrant: Business Objects, DataFlux come out on top
The report placed Business Objects, which has dual headquarters in San Jose, Calif., and Paris; and Cary, N.C.-based DataFlux atop the leaders' quadrant. Gartner's Magic Quadrant methodology places vendors that meet its inclusion criteria into one of four quadrants based on "completeness of vision" and "ability to execute."
Leaders are those vendors that excel in both ability to execute and completeness of vision; challengers have the ability to execute but lack strong vision; visionaries are market-thought leaders, but they struggle with functionality issues; and niche players concentrate on just one or two specific segments of the data quality market, but do it well.
The report cited Business Objects for its large base of data quality customers, most of which were obtained through its 2006 acquisition of data quality vendor Firstlogic. Business Objects' data quality offerings "provide a good breadth" of functionality, including data profiling and data cleansing capabilities, the report stated. Friedman said that customers should be wary, however, until SAP, which acquired Business Objects in January, more fully lays out plans for its data quality tools.
DataFlux, which is owned by The SAS Institute, delivers technology that is easy to use and deploy, with attractive pricing, Friedman said. It won high marks for its largely domain-agnostic product offerings and its vision. "They're thinking beyond just the core data quality functionality," he said. "They're thinking about helping their customers build out comprehensive capabilities for data governance."
Also landing in the leaders' quadrant are IBM Corp., Trillium Software and Informatica Corp. Seven vendors -- DataLever Corp., Uniserv GmbH, Innovative Systems Inc., DataMentors Inc., Datanomic Ltd., Netrics Inc. and Datactics Ltd. -- are placed in the niche players' quadrant.
Interestingly, the challengers' and visionaries' quadrants have only one vendor each – Pitney Bowes Software and Human Inference, respectively – a result of continued market consolidation, Friedman said.
Data quality tools buying advice
Companies looking to purchase data quality tools should first evaluate their "ecosystem of IT technologies," Friedman recommended. "If you're heavily invested in IBM's technology or SAP's technology, then there's a strong argument for considering the data quality functionality from those providers," he said. "Best-of-breed approaches are getting harder and harder to implement because the vendors are less interested in supporting interfaces to competitors."
Friedman also suggested evaluating vendors whose offerings are domain agnostic and can deliver data quality technology in a service-oriented fashion. With the typical data quality tools investment costing between $200,000 and $400,000, creative pricing models should also be a top evaluation criterion, he said.
"Look for pricing models where pricing is done not in a traditional way, where you license the software for a very large lump sum up front and then pay a small bit of maintenance ongoing," Friedman said. Instead, customers should push vendors to consider alternative pricing models, including lease-oriented models, which allow "customers to get into the software at initially lower price points."