
Where have all the data quality vendors gone?

The data quality market is under scrutiny as pure-play vendors get acquired, assimilated and transformed. Their future is in question.

Are pure-play data quality vendors a dying breed?

As technology vendors and IT organizations alike place ever greater emphasis on information quality, standalone tool vendors seem to be disappearing. Some have been swallowed up by major IT vendors. Other former pure-play data quality vendors are focusing on new markets. The number of vendors simply offering data quality tools appears to be shrinking rapidly. Not to worry -- the market is just changing, according to Ted Friedman, research vice president at Stamford, Conn.-based Gartner Inc.


"The trend is toward standalone data quality vendors going away, but we're not going to get to a place where there are none," Friedman said.

Nine pure-play vendors appeared on Gartner's latest Magic Quadrant for data quality tools, and some tools are morphing into the foundation of business applications and processes, he said. The last few months have seen Business Objects acquire Firstlogic, Informatica acquire Similarity Systems, and Hyperion acquire UpStream for its financial data quality tools. Also, SAS-owned DataFlux Corp., based in Cary, N.C., has begun touting customer data integration and product information management tools. The industry is moving toward data quality that is integrated with an organization's range of applications and processes, Friedman said.

Integrated data quality is OK in some cases, according to Danette McGilvray, president and principal of Granite Falls Consulting Inc., a Fremont, Calif.-based information quality management consulting firm. If a company is doing a data migration project and its extract, transform and load (ETL) tool has integrated data quality functions, it will be easier to incorporate data quality activities, McGilvray said. That approach may not address data quality problems across an organization, however, which is why pure-play tools are still needed.
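The integrated approach McGilvray describes can be pictured as a small validation step inside an ETL transform. The sketch below is illustrative only -- the field names and cleansing rules (trimming whitespace, uppercasing state codes, flagging incomplete records) are assumptions, not features of any specific vendor's tool:

```python
def transform(row):
    """Standardize a record and flag it if required fields are missing.

    A hypothetical data quality step embedded in an ETL pipeline's
    transform phase, as described above.
    """
    cleaned = {
        "customer": (row.get("customer") or "").strip(),
        "state": (row.get("state") or "").strip().upper(),
    }
    # A record is usable only if it names a customer and carries
    # a two-letter state code.
    cleaned["valid"] = bool(cleaned["customer"]) and len(cleaned["state"]) == 2
    return cleaned

rows = [{"customer": " Acme ", "state": "nc"},
        {"customer": "", "state": "CA"}]
print([transform(r) for r in rows])
```

Because the check runs inside the migration itself, it only protects that one data flow -- which is McGilvray's point about platform-bound tools leaving other operational data unserved.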

"There need to be different ways of accessing data quality tools," McGilvray said.

If data quality functions are available only through certain platforms, other operational groups suffering from quality problems could find it more difficult to use the tools effectively, McGilvray explained. For example, if a customer deploys data quality with its customer data integration system, product or operational data might not benefit from the new functions. She is concerned that with all the acquisitions, integration and new tool sets from vendors, the standalone data quality tools that are needed won't see new development and will be difficult for companies to find.

At least one vendor has pledged that its data quality tools won't get sidelined. Even though DataFlux has launched new data integration software, it still plans to develop its tactical, standalone data quality products, according to Daniel Teachey, the company's corporate communications director.

"Between the profiling side, quality side, and the ID management or matching side, those three basic foundations will continue to be the drivers for everything we do," Teachey said.

Further, as established pure-play vendors shift focus, new players will sprout up to take their place, Friedman added. The paradigm is moving from periodic cleansing projects to real-time data quality as a component of a Web services-based architecture, he said, and that will change the market. Hosted data quality vendors such as Little Rock, Ark.-based Acxiom Corp. and Lanham, Md.-based Group 1 Software Inc. will probably need to transform or find their roles in the market changing, Friedman said. A customer usually cannot manage its own business rules with hosted services, which will become an important capability as data quality grows more embedded in business processes, he explained.

The market is also placing greater emphasis on measurement, metric development and data profiling tools, Friedman noted. Profiling tools assess data sets for completeness, conformity and a host of other custom metrics, and they are important resources when combining multiple databases. Moreover, he said, there is a growing acceptance of the need for data quality best practices beyond tools -- such as data governance and stewardship.
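The completeness and conformity metrics Friedman mentions can be sketched in a few lines. The record layout and the five-digit ZIP rule below are illustrative assumptions, not taken from any vendor's profiling product:

```python
import re

# Sample records with deliberate quality defects: one missing name
# (hurts completeness) and one malformed ZIP (hurts conformity).
records = [
    {"name": "Acme Corp", "zip": "27513"},
    {"name": "Globex",    "zip": "275A3"},
    {"name": None,        "zip": "10001"},
]

def completeness(rows, field):
    """Fraction of rows where the field is present and non-null."""
    return sum(1 for r in rows if r.get(field) is not None) / len(rows)

def conformity(rows, field, pattern):
    """Fraction of non-null values matching the expected pattern."""
    values = [r[field] for r in rows if r.get(field) is not None]
    return sum(1 for v in values if re.fullmatch(pattern, v)) / len(values)

print(completeness(records, "name"))          # share of names filled in
print(conformity(records, "zip", r"\d{5}"))   # share of well-formed ZIPs
```

A real profiler computes dozens of such metrics per column and tracks them over time, but the principle -- score each field against an expectation and report the rate -- is the same.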

"If you focus only on deploying the tools, you're likely to fail," Friedman said. "There's the whole organizational side -- like having a corporate sponsor of data quality in the business and ideas like data stewardship, which is about putting accountability for data quality in the business. The people side of it is absolutely huge."
