The market for data virtualization software has been picking up considerable speed in recent years as organizations seek new ways to quickly access both structured and unstructured data and integrate that information into highly effective and actionable dashboards and reports. The technology made headlines just this week when networking giant Cisco announced plans to purchase data virtualization vendor Composite Software Inc. in a deal worth $180 million.
Denodo Technologies is another leading data virtualization software vendor, one focused on virtualizing data for use in both business intelligence and application development activities. SearchDataManagement got on the phone recently with Suresh Chandrasekaran, Denodo's senior vice president for North America, to learn more about the company. Chandrasekaran talked about how Denodo's approach differs from Composite's and the rest of the competition's. He also discussed the growing demand for data virtualization software and how the technology fits in with big data implementations. Here are some excerpts from that conversation, which have been edited for length and clarity.
Can you tell me a little bit about the history of Denodo Technologies?
Suresh Chandrasekaran: We are an 11-year-old company. The company was founded in Spain, and for the first two or three years, it was sort of a services company. It gradually expanded and began doing a lot of work for telcos and financial services companies, aggregating internal and external information. In late 2006 to early 2007, the founder decided to move the company's headquarters to Palo Alto, and I met him through some venture capitalists. I've been involved with the company since its founding in the U.S., you might say.
How is Denodo different from its competitors in the data virtualization space?
Chandrasekaran: One of the key things about Denodo is that we have a much broader vision of what data virtualization is than has been typical in this market. The data virtualization market sort of grew out of data federation and enterprise information integration technologies. Those technologies pretty much grew out of database groups, or maybe extract, transform and load (ETL) people saying: 'Well, wouldn't it be nice if I could get to some of this additional data outside the warehouse more quickly?' Early products included IBM's data federator and MetaMatrix, which is now part of Red Hat. But in a sense, data virtualization is kind of an alternative to the warehouse or an extension of the warehouse -- a more agile way than ETL.
Could you give an example of how your approach is different from that?
Chandrasekaran: For example, Composite Software is our main competitor in this space. They are pretty much focused on the business intelligence [BI] side -- and this is all wonderful. We like that and we have lots of customers like that. And therefore, Composite's data source support was always focused on databases, warehouses, maybe some flat files. Denodo's focus has been significantly broader, both at the bottom and at the top. For example, large companies typically have sort of an integration stack to feed BI. They also have another integration stack to feed processes and applications. This would be Web services endpoints, enterprise service bus, VPN, SOA, service registries, and then they have application development frameworks. They also have knowledge management, content management and unstructured data. What Denodo has offered is, at one level, another way of integrating data in a real-time, on-demand way. But more broadly, what is very important here is that we have very strong customers who use us just as much for Agile application development or this idea of data services -- what we call universal data publishing -- which means that we expose all of your data assets at the bottom.
Has Denodo made any mistakes along the way?
Chandrasekaran: We made a couple of mistakes about four years ago when we expanded our sales and marketing and everything in a big way. The market wasn't ready. So we sort of cut back. But we think the market is ready right now, and so we're going to put more fuel in the gas tank.
What makes you think the market is ready now?
Chandrasekaran: Well, I'll give you some indicators. Four years ago, I visited a Data Warehousing Institute conference, and the mindset there was, 'Don't tell me about anything else. I have a data warehouse. Tell me how I can make it more efficient and faster and more this and more that.' But this year, the title of the conference was 'Going Beyond the Data Warehouse.' They had a full-day course on data virtualization, and this was the fourth time they've done it. They did it in Orlando. They did it in Boston. They did it in San Diego, and they did it again in Vegas, and they want to continue it. So that is just one example of how we see data virtualization coming into the mainstream.
How does data virtualization fit into the world of big data technologies?
Chandrasekaran: In the big data space, I think too much emphasis is going into the storage technologies and the processing technologies. Not enough time and thinking is going into the question: once I get that working, how am I going to let the rest of the enterprise benefit from it? Therefore, we think that data virtualization should be part of that thinking process right from the beginning. Questions to consider include: How are we going to leverage our cloud programs, and how are we going to leverage our big data -- not just feed them data but access them in a holistic way? At the source level, therefore, we are constantly increasing our connectivity.