Gartner's 2007 data warehouse DBMS Magic Quadrant found that the market is returning to tried-and-true IT mantras.
This year is notably different from a few years ago, when the data warehousing market was experiencing an unusual trend for the IT industry, according to Mark Beyer, research director with the Stamford, Conn.-based analyst firm and co-author of the study. Companies were willing to spend significant money, time and effort on data warehousing in order to achieve ideal implementations, Beyer said. This year, he found that companies are going back to IT basics, wanting to "do better with less." That is, they want to spend less money and time but achieve better results. That's due in part to a maturing market, he explained. Data warehousing has been around for about 18 years now, and customers expect that software vendors have successfully and cost-effectively solved problems with data warehouse physics, mixed workloads and hardware platforms -- issues Beyer discussed in last year's study. There's another reason customers are less forgiving than in years past.
"The data warehouse is becoming mission-critical, which means that it needs to be highly available 24/7, and it needs to be disaster recoverable," Beyer said. "Operational applications are being connected to data warehouse data in order to do their inline analytics. If an operational system is brought down by the warehouse, the warehouse will never be forgiven."
So vendors are responding to another classic IT mantra, Beyer said, delivering systems that are "easy to understand, cheap to implement and fast to market." The market is demanding data warehousing appliances -- preconfigured software and hardware bundles that are self-managed, self-expanding, balanced environments. The majority of vendors on the 2007 study have a data warehouse appliance, Beyer said.
But along with demands for simplicity, vendors must also respond to increasingly complex processing requirements and environments, he said. Organizations have more data miners who are hitting data warehouses with more ad hoc queries. Queries are often more sophisticated, as analysts look for small "micro-trends" and seek more detailed insight into large datasets, he explained. This forces data warehouses to deliver faster, more efficient processing. And, matching the corporate trend of globally distributed organizations, Beyer is also seeing more "distributed data warehouses," which utilize a single logical model covering multiple physical locations.
Data warehouse DBMS vendor rankings show subtle shifts
In the Leaders quadrant, for vendors with high marks for both vision and execution, were Dayton, Ohio-based Teradata Corp., Oracle, IBM and Microsoft. The positions are a slight change from last year, with Oracle pulling ahead of IBM.
The Challengers quadrant, of vendors with customer execution experience but less vision, was empty this year.
The Niche quadrant featured the same three names as last year, with slightly different rankings. This space includes vendors that met the inclusion criteria for the study but had less execution experience or vision than those in the other sections. Sweden-based MySQL AB switched places with U.K.-based Kognitio Ltd., appearing above it this year. Westmount, Quebec-based Sand Technology Inc.'s position remained about the same.
Gartner's data warehouse buying best practices
The maturity of the data warehousing DBMS industry means that buyers should demand reference customers from similar industries, with similar implementations, Beyer said.
Buyers should always complete "proof of concept" (POC) projects with their final shortlist of vendors, he said. The study recommends several best practices, such as using as much real source system data as possible during POCs, making data-loading part of the process and involving many users to better simulate production-level workloads. Finally, the study recommends holding back some sample queries, not giving them all to the vendor in advance, so that complex queries can be tested without any system pre-tuning.
"POCs are becoming more prevalent in the market," Beyer said. "Everyone claims to be faster. So a lot of companies are saying, 'Prove it!' "