There are some new faces in Gartner's Magic Quadrant for Data Quality, as more vendors emerge to capitalize on
a market that is evolving to include unstructured and international data.
Six vendors were added to the annual study of data quality management software, according to Ted Friedman, co-author of the study and vice president and distinguished analyst with the Stamford, Conn.-based analyst firm. That reflects a changing market, not a change in Gartner's methodology, he said: the inclusion criteria were unchanged from last year's Magic Quadrant for data quality tools. The additional vendors, many of them European, simply grew or expanded their products or presence enough to qualify for the study, which places companies into one of four quadrants based on their ability to execute at customer sites and the completeness of their strategic vision.
The 15 vendors are all vying for a small but growing market. Gartner forecasts that the data quality market will be worth $677 million by 2011, representing a compound annual growth rate of 17.6%. The growth, new tools and emerging trends give data quality management software buyers a lot to consider.
"Interestingly, much of the innovation is coming from outside the United States," the report says. "As a result, the veteran data quality tool vendors are being challenged by entrants that have an international focus and propensity toward designing and deploying domain-agnostic data quality services (standalone or embedded in applications), based on a centrally managed set of business rules."
Organizations are increasingly demanding these domain-agnostic tools, Friedman said, because they need to manage more than just customer data, the historic sweet spot for data quality. Compliance concerns and external reporting requirements are driving more interest in financial data quality management. Product data is also increasingly being addressed with data quality tools, he said, owing partly to more initiatives around product information management (PIM) systems, recently ranked in another Gartner Magic Quadrant. And there's a growing desire to use data quality tools for enterprise content management (ECM) systems, Friedman said -- a particularly interesting trend since ECM deals primarily with unstructured data.
"We're beginning to see the application of techniques like content analytics and text mining to discern some structure from the unstructured data," Friedman said. "Then you can perform data quality operations upon the structured bits that you've distilled down from the unstructured. But the thinking on this is still in the formative stages. There may be a whole different set of data quality dimensions that might be relevant to the content world."
Another key trend is that organizations are expanding their global reach and requiring tools that can handle international data differences, Friedman said. That requires such features as support for Unicode and rules engines that can deliver local address validation and other functions across multiple languages and formats -- an area where some of the international vendors are doing well.
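One concrete reason Unicode support matters: the same international name can be encoded in byte-different ways, so naive comparison treats identical records as distinct. A minimal sketch, using only Python's standard library (not any vendor's engine):

```python
import unicodedata

def normalize_name(name):
    """Normalize international text so visually identical strings
    compare equal: NFC composes combining characters into single
    code points, and casefold applies language-aware case rules."""
    return unicodedata.normalize("NFC", name).casefold()

# "Müller" typed with a precomposed ü vs. u + combining diaeresis
a = "M\u00fcller"
b = "Mu\u0308ller"
print(a == b)                                  # byte-different: False
print(normalize_name(a) == normalize_name(b))  # equal after normalization
```

Matching and deduplication rules built on un-normalized text silently miss exactly these cases, which is why Unicode handling is a baseline requirement for international data.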
Data quality also continues to "bleed over" into other technology areas, such as data integration, Friedman said. That trend is manifesting itself in the tools market, as some vendors add or enhance features like application-programming interfaces (APIs) and service-oriented architecture (SOA) support -- and some of the larger players incorporate data quality functions into data integration platforms and other products.
In the long term, the data quality and integration markets may merge, Friedman said, but that's probably "quite a few years out," he added. For now, he said, organizations still have data quality needs that transcend integration projects, such as standalone quality improvement initiatives and data stewardship programs.
Data quality management software product rankings
The leadership quadrant of the study, which includes vendors with a strong vision and proven ability to execute with customers, included familiar names. Redwood City, Calif.-based Informatica Corp. moved up to the leaders quadrant from the visionaries space, where it appeared last year -- joining IBM; Cary, N.C.-based DataFlux; Paris-based Business Objects; and Trillium Software, a division of Harte-Hanks based in Billerica, Mass., which were all named leaders last year.
The challengers quadrant of vendors with a proven ability to execute with customers, but a less clear vision, included Lanham, Md.-based Group 1 Software Inc. and a newcomer to the quadrant, Germany-based Fuzzy Informatik. While Group 1 improved its relative position from last year, when it was in the niche quadrant, it still focuses mostly on customer data, a stance that could hinder its long-term progress, Friedman said.
Human Inference, headquartered in the Netherlands, was the sole name in the visionaries quadrant, reserved for vendors with less customer execution experience but a strong strategic vision. It also appeared in this quadrant last year, thanks to strength in European banking and insurance markets, as well as some unique technology differentiators, according to the report.
Most of the names new to the study appeared in the niche quadrant of vendors with less proven customer experience and vision -- but enough traction to be viable in the market. Two of the vendors in this quadrant appeared here last year: Pittsburgh-based Innovative Systems Inc.; and Boulder, Colo.-based DataLever. New names included Germany-based Uniserv GmbH; Wesley Chapel, Fla.-based DataMentors Inc.; Princeton, N.J.-based Netrics Inc.; U.K.-based Datanomic; and Northern Ireland-based Datactics Inc.
Data quality management software evaluation and buying advice
In addition to his past buying and evaluation advice, such as choosing domain-agnostic data quality tools, Friedman said there are some new requirements that buyers should evaluate. The interface and ease of use of data quality tools are now extremely important, he said, since businesspeople -- not just IT staff -- should be involved in data quality programs. And, he added, SOA technology opens up new opportunities, and another potential requirement, for buyers.
"The ability to expose the data quality functionality as services to be plugged into lots of [systems] grows more and more important," Friedman said. "There's a lot of value in establishing a single set of business rules that define data quality for your enterprise -- and leveraging those in a universal way from all of the different applications that are working with these critical master data objects."