Demand for data virtualization software will increase significantly over the next three years as organizations look for faster alternatives to traditional data integration techniques, according to a new report from Cambridge, Mass.-based Forrester Research Inc.
Despite somewhat inauspicious beginnings, data virtualization tools are maturing nicely, according to the report, which cites several Forrester surveys and client interviews. While early incarnations of the products often fell short of user expectations, Forrester says performance improvements and new capabilities are increasingly making them look like a smart investment.
"Forrester estimates that less than 20% of IT organizations have incorporated data virtualization technology into their integration tool kits and even fewer are realizing its true potential," Forrester analyst Brian Hopkins wrote in the report. "Over the next 18 to 36 months, we expect this market attitude to change as technology advancement, more third-party integration, and new usage patterns lead to increasing awareness of data virtualization’s potential."
Data virtualization technology creates a middleware layer that is used to connect data from disparate sources. It also provides a centralized view of information that can be accessed and used by people, business intelligence (BI) dashboards and applications.
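In practice, that middleware layer behaves like a federated query engine: consumers query a single virtual view while the underlying rows stay in their source systems. A minimal sketch of the idea in Python (the source names and schemas here are invented for illustration, not drawn from any vendor's product):

```python
# Two "source systems" stay where they are; a virtual view joins them
# on demand instead of copying the data into a warehouse first.

# Hypothetical sources: a CRM and an order database.
crm_records = [
    {"customer_id": 1, "name": "Acme Corp"},
    {"customer_id": 2, "name": "Globex Inc"},
]
order_records = [
    {"order_id": 101, "customer_id": 1, "total": 250.0},
    {"order_id": 102, "customer_id": 1, "total": 75.5},
    {"order_id": 103, "customer_id": 2, "total": 120.0},
]

def virtual_customer_orders():
    """Yield a unified view: each order enriched with the customer name.

    Nothing is materialized up front; rows are assembled only when a
    consumer (a person, a BI dashboard, an application) iterates.
    """
    names = {c["customer_id"]: c["name"] for c in crm_records}
    for order in order_records:
        yield {
            "order_id": order["order_id"],
            "customer": names[order["customer_id"]],
            "total": order["total"],
        }

view = list(virtual_customer_orders())
```

Real products express this as SQL views or data services rather than application code, but the principle is the same: the join happens at query time, and no second copy of the data is created.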
Forrester says concerns about traditional integration choices -- including extract, transform and load (ETL) tools and database consolidation projects -- will increasingly drive users to data virtualization.
"Integration by ETL creates data quality problems and delays information delivery," Hopkins wrote. "Integration by DBMS consolidation is high impact, expensive, and risky."
Despite the progress they've made in recent years, however, data virtualization software vendors will continue to find it difficult to change some users' perceptions about the technology.
Andrew Kerber, a senior database administrator at a Kansas City, Mo.-based e-commerce company, said he thinks data virtualization can easily lead to security, privacy and regulatory compliance problems.
"I think it just raises so many questions that I just don't see it happening any time soon," Kerber said. "It just strikes me as very dangerous."
Data virtualization software adds new capabilities
"Virtualization technology is already capable of passing user credentials from virtual to underlying physical data stores, providing record-level security when application authorization architectures support it," the report reads. "An emerging capability for masking values in virtual data stores delivers additional benefits as firms expose this functionality in their data services."
According to the report, other new capabilities being added to data virtualization tools include "big data adaptors," which allow users to expose pared-down versions of enormous data sets; improved discovery tools, which ease the process of building virtual data stores; and prebuilt integrations designed to simplify connecting to third-party data sources.
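The "big data adaptor" pattern of exposing a pared-down version of an enormous data set can be approximated by pushing an aggregation down toward the source and surfacing only the summary. A hypothetical sketch in plain Python, not any vendor's API:

```python
from collections import defaultdict

def pared_down_view(rows, group_key, value_key):
    """Expose an aggregate summary of a large data set instead of the
    raw rows, the way an adaptor might surface per-key totals to the
    virtual layer rather than billions of detail records."""
    totals = defaultdict(float)
    for row in rows:
        totals[row[group_key]] += row[value_key]
    return dict(totals)

# Stand-in for an enormous source data set.
events = [
    {"region": "east", "revenue": 10.0},
    {"region": "west", "revenue": 5.0},
    {"region": "east", "revenue": 2.5},
]
summary = pared_down_view(events, "region", "revenue")
```

The point of the pattern is that consumers of the virtual view get a workable slice of the data without the virtualization layer having to move the full data set.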
Getting ready for data virtualization technology
Forrester recommends that organizations begin the process of introducing data virtualization into their integration environments.
The best way to start, according to the report, is to take a close look at any data integration projects that have been put off because of other priorities and determine whether they are good candidates for virtualization. Forrester says a good question to ask when evaluating those projects is: Do we really need to move data via ETL here?
As organizations begin adopting the software, Forrester says, it's also important to develop integration patterns that expose information through the virtualization layer.
"As you successfully deploy virtual data sources, develop service-oriented architecture integration patterns that solution architects can easily leverage in projects," the report reads. "This will drive adoption of the technology in the context of your larger [Information-as-a-Service] strategy."