
Time to tool up on data integration software?

The market for packaged data integration tools is expected to grow by nearly 10% annually. But a lot of holdout organizations might need convincing to move beyond manual coding.

Data integration software is a growth market, according to Gartner Inc. The consulting and market research company forecasts a compound annual growth rate of 9.6% in worldwide sales through 2018, which would boost the size of the market to $3.6 billion, up from $2.2 billion in 2013. In its 2014 Magic Quadrant for Data Integration Tools report, analysts Eric Thoo and Mark A. Beyer wrote that the expected increase in uptake is being driven partly by organizations' need to make their information architectures more flexible so they can take advantage of new data types, more data sources and faster data delivery capabilities -- all hallmarks of big data applications.

But another reason there's room for the market to grow is that a lot of potential users have yet to buy into the concept of packaged integration software at all. For example, 44% of 285 respondents to a 2014 TechTarget survey on data integration said their organizations weren't using any commercial tools. Manual coding of integration routines still holds sway in such environments.

It could get harder to sustain that approach as business needs push companies to augment batch-oriented extract, transform and load integration with things like real-time integration, data virtualization and complex event processing. The cloud and big data also add new integration options -- and issues.
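For readers weighing the manual-coding status quo, the batch ETL routines in question typically follow the same three-stage pattern that packaged tools automate. A minimal sketch in Python, assuming a hypothetical CSV source and an in-memory target store (a real deployment would load into a database or warehouse):

```python
# Minimal batch ETL sketch: extract raw rows, transform them into a
# consistent shape, load them into a target store. The CSV source and
# in-memory "warehouse" dict are illustrative assumptions.
import csv
import io

def extract(source: str) -> list:
    """Read raw rows from a CSV source."""
    return list(csv.DictReader(io.StringIO(source)))

def transform(rows: list) -> list:
    """Normalize fields: trim whitespace, standardize case, cast revenue."""
    return [
        {"region": r["region"].strip().upper(),
         "revenue": float(r["revenue"])}
        for r in rows
    ]

def load(rows: list, warehouse: dict) -> None:
    """Aggregate cleaned rows into the target store, keyed by region."""
    for r in rows:
        warehouse[r["region"]] = warehouse.get(r["region"], 0.0) + r["revenue"]

raw = "region,revenue\n east ,100.5\nWEST,200\neast,50"
warehouse = {}
load(transform(extract(raw)), warehouse)
print(warehouse)  # {'EAST': 150.5, 'WEST': 200.0}
```

Hand-written scripts like this are easy to start with; the maintenance burden appears as sources, formats and delivery-speed requirements multiply, which is the gap packaged tools aim to fill.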

You can get insight and advice on all of these topics on SearchDataManagement and SearchBusinessAnalytics. For example, consultant Rick Sherman makes the case for using automated data integration tools in a Q&A and writes about what's holding back their adoption by more organizations. Rick van der Lans, another consultant, discusses why the increasingly distributed nature of data calls for new approaches to data integration, such as the possible use of data virtualization software. We explore that technology more deeply in a guide to data virtualization, including stories on deployments at pharmaceutical maker Pfizer Inc. and semiconductor developer Qualcomm Inc.

On the big data front, we assess the required integration processes and skills and look at the tools that are becoming available for doing big data integration. Also, consultant David Loshin offers tips on how to successfully integrate big data systems and data warehouses. And we spotlight the importance of solid integration and data management capabilities to a healthcare company's efforts to analyze a diverse mix of data.

Data integration might not be glamorous -- but it's the glue that holds organizations together. And the business need for that is only likely to increase in the years ahead.

Craig Stedman is executive editor of SearchDataManagement. Email him at cstedman@techtarget.com and follow us on Twitter: @sDataManagement.

Next Steps

Consultant Ron Bodkin offers advice on breaking down big data silos

Data integration has a big role to play in addressing global problems

Efforts to integrate big data require a solid understanding of the available information

This was last published in January 2015


Join the conversation

2 comments

What's more prevalent in your organization: commercial data integration technology or manual integration coding?
I think data integrity is of growing importance in the cloud era. However, I also suspect that a lot depends on what kind of business you are in. A system that handles high-availability, real-time financial data, for example, has a higher need for accuracy than a system with slower-running processes (which can therefore take a bit more time with its data integrity checking). I'm still surprised at the 10% number; I would expect companies that use ETL techniques to be more common.
