IBM unveiled a new version of its flagship data integration product -- IBM InfoSphere Information Server 8.5 -- at its Information on Demand conference last week in Las Vegas. Big Blue also took the wraps off the latest version of its mainstay database management system, IBM DB2.
SearchDataManagement.com was at the conference and sat down with Bernie Spang, IBM’s director of information management product strategy, to get more details about the new releases. Spang talked about the history of InfoSphere Information Server and DB2’s new capabilities, and he explained one of the reasons why IBM is so interested in acquiring data warehouse appliance vendor Netezza. Here are some excerpts from that conversation:
Could you give me a quick history lesson on the IBM InfoSphere product line?
Bernie Spang: It actually has multifaceted origins. The DataStage and QualityStage cleansing and ETL capabilities come from the Ascential acquisition a number of years ago. The federation and replication capabilities that are part of InfoSphere Information Server have a heritage back in IBM under different names at different times.
What are some of the new capabilities in InfoSphere Information Server 8.5?
Spang: One of the exciting things about the InfoSphere Information Server is the tool set that comes along with it for accelerating the production of integration jobs, as well as new fast-track capabilities and new business glossary capabilities [that] enable the collaboration between business and IT on what the meaning of data is and how it flows together.
What is the new InfoSphere Blueprint Director?
Spang: That gives users the ability to capture the best practices for designing, building and laying out an integration job to ensure that you’re actually in line with business needs and you’re pulling the right information together along the way. It’s another layer of collaboration that we’ve built into the product, and it allows users to see the quality metrics associated with each piece of data as it moves through the process.
What does Blueprint Director look like to the end user?
Spang: It’s a visual environment where you’re laying out the integration and you’re defining it and then you can use the fast-track capability to generate the ETL jobs. It’s that visual toolset for defining your integration project. And it ties with the Business Glossary, where the business users and IT are agreeing on the definition of terms.
What features have you added in the new version of DB2?
Spang: IBM DB2 Version 10 is a new product that we’re delivering this week. [It offers] out-of-the-box performance improvements of up to 40% for some workloads [and] greater scalability. The other interesting thing is a new capability that we’re calling DB2 time travel query – the ability to query information in the present, in the past and in the future. If you’ve loaded information, like new pricing information for next quarter, you can do queries as if it were next quarter. If you have business agreements or policies that run over a term, you can do queries in the future based on the policies that will be in effect at that time. Companies already do this today, but largely by writing application code. By pushing it down into the database software, we’re greatly simplifying the process and greatly reducing the amount of code.
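To make that concrete, here is a minimal sketch of what a "query the future" might look like with DB2 10's temporal SQL, using a business-time period. The table and column names are illustrative, not from IBM's documentation:

```sql
-- Hypothetical pricing table with a BUSINESS_TIME period
-- (product_price, eff_start, eff_end are made-up names).
CREATE TABLE product_price (
    product_id INTEGER       NOT NULL,
    price      DECIMAL(10,2) NOT NULL,
    eff_start  DATE          NOT NULL,
    eff_end    DATE          NOT NULL,
    PERIOD BUSINESS_TIME (eff_start, eff_end)
);

-- Load next quarter's pricing alongside the current price.
INSERT INTO product_price VALUES (42, 19.99, DATE '2010-10-01', DATE '2011-01-01');
INSERT INTO product_price VALUES (42, 24.99, DATE '2011-01-01', DATE '9999-12-31');

-- Query "as if it were next quarter": the database, rather than
-- application code, resolves which row is in effect on that date.
SELECT product_id, price
FROM product_price
FOR BUSINESS_TIME AS OF DATE '2011-02-15'
WHERE product_id = 42;
```

The point Spang makes is that without the `FOR BUSINESS_TIME AS OF` clause, an application would have to carry its own effective-date filtering logic in every query.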
IBM is in the process of acquiring Westboro, Mass.-based data warehouse appliance vendor Netezza and its field-programmable gate array (FPGA) processor technology. What exactly is the value of this technology?
Spang: Processing speed is reaching the laws of physics [in terms of its] ability to continue to grow, while at the same time the need to process more information and do more transactions is growing unabated. So how do you get those next-generation performance improvements? You put the pieces together and highly optimize them for specific workloads. That means you have to have the software optimized for the hardware even down to the processor level. The field programmable gate array allows you to actually program at a chip level, [and that leads to] much greater speeds than having it written in software running on a general-purpose processor.