Product announcements in the data quality market have sounded a strikingly similar note recently, as vendors try to position their products in the wider enterprise.
On Monday, Cary, N.C.-based DataFlux Corp. previewed Version 8, the newest release of its core data quality technology platform, which enables a faster, more efficient path to a single version of the truth, according to the company. The same day, Redwood City, Calif.-based Informatica Corp. released Data Quality 3.1 and Data Explorer 5.0, which are tightly interwoven with its PowerCenter data integration platform. That was followed shortly by Lanham, Md.-based Pitney Bowes Group 1 Software Inc.'s introduction of a new Customer Data Quality Platform. A few weeks ago, San Jose, Calif.-based Business Objects Inc. announced general availability of Data Quality XI for Siebel Customer Relationship Management applications, and IBM recently unveiled its Information Server, still in beta, which it says includes enhanced data quality functionality.
These products are designed to help companies make data quality an integral part of daily business, rather than a periodic batch-cleansing exercise, explained Ted Friedman, research vice president with Stamford, Conn.-based Gartner Inc.
"All these announcements in some shape or form are pushing toward the idea of making data quality more pervasive across the enterprise," Friedman said. "These products continue to evolve in a positive direction, and people are going more for integration with other things, like data integration tools or packaged applications."
Informatica, DataFlux add and enhance functionality
Informatica's goal is to infuse data quality into the data integration lifecycle, according to Karen Hsu, senior product marketing manager. The new Data Quality 3.1 and Data Explorer 5.0 are based on technology from Informatica's acquisition of Similarity Systems and offer "end-to-end" integration with its PowerCenter data integration platform, she said. The products include new features for PowerCenter developers, as well as user-friendly interface enhancements that make it easier for business users to get involved in the data quality process. For example, there are new "workbenches," designed for business users to define rules and metrics, as well as visual "data quality scorecards" for monitoring integrity and quality. Data Explorer has enhanced auto-profiling and will make it easier for companies to set up complex batch profiling processes, Hsu said. The new products will be available this month, either as PowerCenter options or as standalone systems.
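To make the scorecard idea concrete, here is a minimal, vendor-neutral sketch (not Informatica's actual API; the field names and rules are hypothetical) of the kind of per-field metrics such a scorecard might track: completeness (how many values are populated) and validity (how many populated values pass a rule).

```python
# Illustrative sketch only -- generic data quality scorecard, not any
# vendor's product. Computes per-field completeness and rule-based
# validity percentages over a batch of records.
import re

RECORDS = [
    {"name": "Ada Lovelace", "email": "ada@example.com", "zip": "27513"},
    {"name": "Grace Hopper", "email": "", "zip": "1234"},
    {"name": "", "email": "alan@example.com", "zip": "94063"},
]

# Hypothetical validity rules, keyed by field name.
RULES = {
    "email": lambda v: re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", v) is not None,
    "zip": lambda v: re.fullmatch(r"\d{5}", v) is not None,
}

def scorecard(records):
    """Return completeness and validity ratios (0..1) for each field."""
    card = {}
    for field in records[0]:
        values = [r[field] for r in records]
        present = [v for v in values if v]          # non-empty values
        completeness = len(present) / len(values)
        rule = RULES.get(field)
        validity = (
            sum(rule(v) for v in present) / len(present)
            if rule and present else None            # None = no rule defined
        )
        card[field] = {
            "completeness": round(completeness, 2),
            "validity": round(validity, 2) if validity is not None else None,
        }
    return card

print(scorecard(RECORDS))
```

A dashboard-style scorecard in a commercial tool would track metrics like these over time, but the underlying arithmetic is this simple.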
Informatica isn't the only company trying to make data quality more accessible across the enterprise.
DataFlux's Version 8, slated to be generally available in the first quarter of 2007, includes enhanced features for business rules, an updated metadata exploration tool, new data quality accelerators, and expanded international features, including support for double-byte character sets. The goal is to help make data quality initiatives more efficient, according to Tony Fisher, president and CEO of DataFlux. For example, the updated dfPower Explorer enables organizations to analyze metadata within different applications to determine where similar records live, which can help quickly identify the relevant content for data projects. The new DataFlux accelerators are pre-built workflows, templates and best practices for common data quality tasks around customers, products and criminal watch-list compliance, Fisher said. Finally, an expanded business rules engine helps companies create and monitor customized rules to ensure that data meets internal standards for quality and integrity.
The enhanced metadata discovery features, particularly in the new DataFlux product, caught Friedman's attention. Many companies are challenged by "pulling together a complete metadata view of their world," and this kind of feature could really help, Friedman said. His interest was also piqued by the business rules monitoring features incorporated into both the DataFlux and Informatica products.
"Data quality has to be an everywhere-and-all-the-time sort of a thing," Friedman said. "From that point of view, I like the idea of interjecting rules into the data flow to monitor for cases where quality has strayed outside of the boundaries of what is expected."
While DataFlux, Informatica and IBM are moving in similar directions to make data quality more pervasive, Friedman noted, Group 1's Customer Data Quality Platform appears to remain focused solely on customer data. This makes sense given its ownership by Pitney Bowes, but it's a notable difference from its competitors, which are designed to be "domain agnostic." The product's focus seems to run contrary to the market trend of companies wanting data quality tools for a broad range of subject areas, not just customer data, Friedman said.
Advice for buyers
In fact, one of Friedman's primary recommendations in Gartner's 2006 Magic Quadrant for Data Quality Tools was that buyers should look for domain-agnostic data quality tools. That and his other advice still stands, he said, but he also noted that the lines between data integration and data quality vendors are rapidly blurring -- a trend to take into consideration when purchasing either type of tool.
"We see data quality as being fundamental to doing any sort of data integration," Friedman said. "Therefore, if you're buying ETL or other data integration tools, you want to look to [integration] vendors that have such partnerships or have data quality technology directly in their portfolio."