Published: 24 Jan 2012
While still relatively new to the IT scene, data virtualization tools have evolved beyond the usual growing pains associated with emerging technologies, and organizations are increasingly putting them to use in corporate applications, according to IT professionals and data management analysts. The key, they said, is finding the right applications for the technology.
Data virtualization software provides a means of integrating information from different data sources, often on the fly, as an alternative to consolidating the data in a data warehouse or data marts. The most popular reasons to deploy the technology include obtaining the proverbial "single source of the truth" on corporate data, enabling real-time or near-real-time business intelligence (BI) and supporting high-performance transaction processing applications, according to a report issued this month by Forrester Research Inc. in Cambridge, Mass. But the list of potential use cases doesn't end there.
Forrester said that organizations are also tapping data virtualization to power enterprise search applications, feed data to mobile applications, make the results of "big data" analytics activities available to business users, federate views of data across multiple internal domains, improve information security, and integrate internally housed data with cloud applications, business partner data and data from social media sites.
San Diego-based wireless technology vendor Qualcomm Inc. has been using data virtualization tools for more than two years now, mainly to provide business users with quick access to aggregated views of information stored in various systems, such as its enterprise resource planning and customer relationship management applications.
Doors open to broader data virtualization uses
But Qualcomm has been broadening its use of the technology, according to Mark Morgan, a senior IT manager at the company. Most recently, he said, it turned to data virtualization to compile product logistics data into a new portal application created to help increase the productivity of program managers who lead the design and delivery of Qualcomm's wireless chipsets.
Data virtualization, explained
Data virtualization technology adds an abstraction, or services, layer to IT architectures that enables information from multiple, heterogeneous data sources to be integrated in real time, near real time or batch, as needed.
Data virtualization tools can be used to support many types of applications and processes, such as real-time business intelligence and reporting, enterprise search and high-performance transaction processing. Unlike data federation tools, which typically offer read-only access to aggregated information, data virtualization software gives users the ability to write back changes to the original data sources via the services layer.
Adapted from The Forrester Wave: Data Virtualization, Q1 2012, published by Forrester Research Inc.
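The services-layer idea described in the sidebar can be illustrated with a minimal sketch. This is a hypothetical toy, not the API of any actual data virtualization product: the `VirtualizedView` class and the in-memory `SOURCES` dictionaries stand in for real ERP and CRM systems. It shows the two behaviors the sidebar highlights, aggregating a logical record from heterogeneous sources on the fly, and, unlike read-only federation, routing a write back to the source that owns the field.

```python
# Hypothetical sketch of a data virtualization services layer.
# The source names and records below are illustrative stand-ins
# for real back-end systems (e.g., ERP and CRM applications).

SOURCES = {
    "erp": {101: {"product": "chipset-A", "status": "design"}},
    "crm": {101: {"account": "OEM-1", "region": "NA"}},
}

class VirtualizedView:
    def __init__(self, sources):
        self.sources = sources  # name -> {key: record} per back-end system

    def read(self, key):
        # Aggregate one logical record from every source at query time;
        # nothing is copied into a warehouse or data mart.
        merged = {}
        for table in self.sources.values():
            merged.update(table.get(key, {}))
        return merged

    def write(self, key, field, value):
        # Write-back: push the change to whichever source owns the field,
        # the capability that distinguishes this from read-only federation.
        for table in self.sources.values():
            if key in table and field in table[key]:
                table[key][field] = value
                return True
        return False

view = VirtualizedView(SOURCES)
record = view.read(101)                   # combined ERP + CRM view
view.write(101, "status", "production")   # change lands in the ERP source
```

Because callers only ever see the merged view, a back-end source can be swapped or restructured behind the layer without changing the consuming application, which is the "abstraction layer" benefit Qualcomm's Morgan describes below.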
Qualcomm, whose products are built into devices like smartphones, e-readers and tablet PCs, outsources its chip manufacturing to business partners. As a result, the design and manufacturing process can get complex, Morgan said. The new portal, called Oasis, was designed to reduce some of that complexity by giving the program managers easy access to information that they can use to stay abreast of project developments and ensure that deadlines are met. But to make that possible, he said, the portal needs to pull together data "from many, many different sources," which is where data virtualization comes in.
The company also uses data virtualization tools to integrate sales data from a cloud-based Salesforce.com application with information in internal systems. In addition, Morgan said the technology is helping Qualcomm to speed up IT development work and modifications.
"Having an abstraction layer on top of our projects allows us to not only be more agile in how we run our projects, but also in how we manage our data sources [and] how we deliver that data," he said. "As the business changes and evolves over time, [that] abstraction layer becomes an increasingly valuable component of our architecture that enables some changes to happen without a lot of disruption to the existing work we've done."
Data virtualization technology can also make it easier to provide users in disparate locations with access to large amounts of corporate information, without needing to create potentially error-prone duplicate copies of the data, according to John Boyer, who heads the BI custom development and enablement team at The Nielsen Co., a New York-based consumer research outfit.
Boyer, who is also co-author of the book Business Intelligence Strategy: A Practical Guide for Achieving BI Excellence, said Nielsen once looked into the prospect of moving 5 to 10 TB of data between data centers in Florida and Ohio. "The research we did found that the fastest way to move that data was with FedEx," he quipped. "But with [data virtualization], we were able to access the data in place rather than actually move it."
Data virtualization tools all grown up ...
When data virtualization tools first emerged about six years ago, adoption was concentrated in the financial services, telecommunications and government sectors, according to the new Forrester Wave report. But over the past 24 months, there has been a significant increase in adoption by companies in various industries, including health care, insurance, retail, manufacturing, e-commerce and entertainment, said Noel Yuhanna, a data management analyst at Forrester and co-author of the report.
Yuhanna said the technology's growing popularity can be attributed to technical strides, such as increased support for unstructured data, cloud data and data services development -- but also to the notion that data virtualization is the right technology for certain applications.
With the pace of business quickening, the number of data sources exploding and business users seeking fast access to both structured and unstructured information, data virtualization can help IT managers avoid some sleepless nights when there's a good fit for the technology, Yuhanna said. "We have so much data in organizations today and it's difficult to get a hold of," he noted.
...but tread carefully
Nevertheless, if you do decide to adopt data virtualization tools, Yuhanna recommends beginning with a relatively simple initiative, such as a real-time BI project that requires the aggregation of information from just two or three data sources. "You don't want to boil the ocean and take on a big data virtualization project all at once," he said.
Morgan said prospective users should also be cognizant of the system performance issues that can arise from adding a new data virtualization layer to an organization's IT stack. Because the virtualization layer sits in front of important applications, those applications should be monitored closely for performance hits, he cautioned.
Application performance isn't likely to be "a showstopper," Morgan said, but it's "something that you have to pay attention to and be aware of."