
IT pros reveal benefits, drawbacks of data virtualization software

Attendees at Composite Software's Data Virtualization Day conference had plenty of advice for companies considering a data virtualization project.

NEW YORK -- Performance concerns are bound to pop up when organizations mull over the prospect of deploying data virtualization technology. But with recent advances in hardware and the right approach to tuning the software, it is certainly possible to mitigate those fears, according to attendees at this week's Data Virtualization Day 2012 conference.

The annual conference, which is hosted by data-virtualization software maker Composite Software Inc., brings together current and prospective data virtualization users, consultants and IT industry analysts. Several took time between conference sessions to talk about the biggest benefits and toughest obstacles associated with the technology. They also had plenty of advice for anyone considering a data virtualization project.

Ted Hills, an enterprise information architecture executive at Bank of America Corp., said he has definitely seen performance concerns arise when IT professionals discuss data virtualization. But he has also seen the technology evolve over the years to incorporate new and improved features that make performance management tasks less of a headache.

For more on data virtualization software

Read about Qualcomm's decision to deploy data virtualization software

See what Pfizer has to say about implementing data virtualization software

Bank of America began to gradually roll out data virtualization software about three years ago as part of an effort to enable faster business intelligence reporting capabilities, among other things.

"I very much see the fear in technologists that a newer technology like Composite won't perform well at scale or in production, and I think it's a matter of people trying it," Hills said. "I won't say performance is not an issue. It's always an issue. But with Composite Information Server, you have all of the same tools and techniques available for tuning as you do for a traditional DBMS [database management system]."

For example, Hills continued, Composite users can take advantage of "hints" and "explain plan" performance-tuning functions. In database management, a hint is a directive embedded in a SQL statement that influences how the query optimizer builds its execution plan. An explain plan shows the steps the database will take to process a query, letting developers spot inefficiencies before the query runs.
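The exact syntax varies by product, but the explain-plan idea is easy to demonstrate with any database that can report its query plan. The snippet below is a minimal sketch using Python's built-in sqlite3 module, not Composite's tooling, to ask the engine how it intends to satisfy a query before running it:

```python
# A minimal sketch of the "explain plan" idea using Python's built-in
# sqlite3 module. Composite Information Server has its own tooling;
# this just illustrates the general technique against a throwaway table.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER)")
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")

# EXPLAIN QUERY PLAN asks SQLite to describe how it would run the query
# without actually executing it -- here it should report an index search.
for row in conn.execute(
        "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = ?", (42,)):
    print(row)

conn.close()
```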

"It's not magic," Hills said of database virtualization. "It's just technology, and it works just fine at production levels."

Hardware advances lead to improved performance

Performance is a concern with data virtualization software, but it's one that can be effectively dealt with, according to Niraj Juneja, principal of business consulting at Infosys Ltd., a Lisle, Ill.-based systems integration firm.

Two IT industry trends in particular have already begun helping organizations improve the performance of data virtualization environments, Juneja said: the rise of commodity servers and the more recent emergence of in-memory computing. "In-memory [technology] can cache all of the databases behind [a data virtualization implementation] and serve the data much faster," he explained.
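Composite's caching internals aren't described here, but the general pattern Juneja is pointing at is a read-through cache: query results from slow backend sources are held in memory and reused until they expire. The sketch below is a hypothetical Python illustration; the fetch function, TTL and names are all assumptions:

```python
# A hedged sketch of the in-memory caching idea: results from slow backend
# sources are kept in RAM so repeat queries are served without touching the
# source again. fetch_fn stands in for whatever actually queries the
# underlying database; the TTL value and naming are assumptions.
import time

class ReadThroughCache:
    def __init__(self, fetch_fn, ttl_seconds=300):
        self._fetch = fetch_fn          # callable that hits the real source
        self._ttl = ttl_seconds
        self._store = {}                # query -> (expires_at, result)

    def get(self, query):
        entry = self._store.get(query)
        if entry and entry[0] > time.monotonic():
            return entry[1]             # fresh hit: served from memory
        result = self._fetch(query)     # miss or stale: go to the source
        self._store[query] = (time.monotonic() + self._ttl, result)
        return result

# Usage: wrap a (hypothetical) backend call once, query it many times.
cache = ReadThroughCache(lambda q: f"rows for {q!r}")
print(cache.get("SELECT * FROM customers"))  # hits the backend
print(cache.get("SELECT * FROM customers"))  # served from memory
```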

Juneja, who has helped companies deploy data virtualization software, cautioned that while additional servers can ensure solid performance, it's sometimes easy to let hardware costs grow out of control. "One of our clients actually ended up buying a lot of high-performance boxes just to make it work," he said. "The cost of hardware can go up significantly when you do data virtualization."

But when it's up, running and tuned properly, data virtualization software can offer several benefits, Juneja said. They include a relatively painless implementation and the ability to quickly begin realizing a return on investment. The key to achieving success with data virtualization, he said, is to begin with a small project, tune the system appropriately and gradually expand the scope of the deployment from there.

It's a point that made sense to Richard Lemieux, a data architect with a large insurance company, who was attending the conference to learn more about the technology. Lemieux's company is seriously considering rolling out data virtualization to shorten the time it takes to produce business intelligence reports for business analysts and other users.

Lemieux's company recently acquired another firm and is struggling to develop a "360-degree customer view." He thinks data virtualization can help.

"Our users ask us for one column on one table [for reporting purposes], and it's going to take us seven weeks. Then the next week, they ask for a different one and we just can't keep up. Our staff is too small," Lemieux said. "So, yes, this could really help us a ton if we could get ideas out as fast as the users can generate them."

More data virtualization advice

Jason Hull, a data integration specialist with the northeast division of cable television and broadband provider Comcast Corp., echoed Juneja's suggestion that it's important to start small with data virtualization to avoid performance concerns.

"Find that one project that you can succeed on, one small thing that you can make work," Hull said. "Then try and get that sponsorship, once you have that success, and continue to go from there and grow."

Located in Manchester, N.H., Hull's division of Comcast uses Composite software as a "one-stop" data access layer for business users and developers. The company also uses Informatica Corp. extract, transform and load (ETL) software.

"As we bring new objects in through the ETL layer, we first bring them into Composite so that all of our ETL goes through the Composite layer," Hull explained.  "Instead of worrying about if it is actually a SQL Server, a flat file or a Web service, Informatica just sees Composite."

Informatica vs. Composite

Comcast was already using Composite when Informatica introduced a new data virtualization offering of its own. But Comcast chose to stick with Composite, rather than standardize on Informatica for both ETL and data virtualization.   

"We were already invested in Composite and saw no reason to migrate over to Informatica. We tried to do some benchmarking and worked with Informatica, and they couldn't prove that they were going to give us any added value," Hull said. "Informatica is great at ETL. I'd say they're even great at [developing] their metadata repository related to all of that ETL. But Composite is definitely the leader in data virtualization."


Mark Brunelli is the news director for SearchDataManagement.com. Follow him on Twitter: @Brunola88.
