Even as recently as 10 years ago, it would have been hard to imagine enterprise-class database management systems running primarily in main memory. But RAM prices have steadily declined to the point where doing that is no longer prohibitively expensive, and the plummeting cost of memory has opened new opportunities for configuring database systems to take advantage of increased main-memory capacities.
And it's no longer just startup companies developing in-memory databases designed to support high-performance processing needs. Leading database and software vendors -- IBM, Oracle, Microsoft, SAP, Teradata -- are marketing database technologies that support in-memory processing, putting money behind their belief that mainstream organizations are ready to consider incorporating such software into IT systems.
In-memory databases provide accelerated application performance in two ways. First and foremost, maintaining data in main memory instead of significantly slower disk-based storage minimizes or even eliminates the data latency typically associated with database queries. Second, alternative database architectures enable more efficient use of the available memory. For example, many in-memory technologies use a columnar layout in tables instead of a row-based orientation. Values aligned along columns are more suitable for compression, and the ability to rapidly scan all column values speeds query execution.
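The columnar idea can be sketched in a few lines. This is an illustrative toy, not any vendor's implementation: the table, column names and data are invented for the example. It shows why values stored contiguously by column compress well (adjacent duplicates collapse under run-length encoding) and why a query that touches one column can skip the rest of each row.

```python
# Toy table, first stored row-wise (one tuple per record).
rows = [
    ("2013-07-01", "US", 100),
    ("2013-07-01", "US", 250),
    ("2013-07-01", "DE", 75),
    ("2013-07-02", "US", 300),
]

# The same table in a columnar layout: one contiguous list per column.
columns = {
    "date":    [r[0] for r in rows],
    "country": [r[1] for r in rows],
    "amount":  [r[2] for r in rows],
}

def run_length_encode(values):
    """Collapse adjacent duplicates -- common within a column -- into (value, count) pairs."""
    encoded = []
    for v in values:
        if encoded and encoded[-1][0] == v:
            encoded[-1] = (v, encoded[-1][1] + 1)
        else:
            encoded.append((v, 1))
    return encoded

# The date column shrinks from 4 stored values to 2 runs.
print(run_length_encode(columns["date"]))   # [('2013-07-01', 3), ('2013-07-02', 1)]

# A query like SUM(amount) scans only the one column it needs,
# never touching the date or country data.
print(sum(columns["amount"]))               # 725
```

In a row-oriented layout, the same sum would have to read past the date and country fields of every record; the columnar layout turns the query into a single tight scan over one array.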
A winning in-memory hand?
Conceptually, it's hard to argue against application speedup and optimized organization of data. But in the real world, when should IT and data management practitioners recommend that transaction processing or business analytics needs warrant the investments in technology, resources and new skills required to transition to an in-memory framework?
The practical aspects of that question involve weighing the need for increased database performance versus the associated costs of acquiring and deploying an in-memory platform. Even though RAM costs have decreased dramatically, systems with large-scale memory configurations will still carry a healthy price tag compared with database servers that stay with disk storage only. Corporate and business executives might experience sticker shock when they see the in-memory bill. To make an in-memory database technology purchase pay off, you need to find applications with characteristics that make them a good fit.
The answer lies partly in assessing your organization's demand for processing increased data volumes and the business value that could be delivered as a result of reduced database response times. Consider this supply chain management example fueled by in-memory software: Enabling real-time analysis of a variety of data streams -- inventory data from warehouses and retail locations, information about items in transit on trucks or rail cars, updates on traffic and weather conditions -- could help drive faster decisions on routing and distribution to ensure that goods get to where they need to be, when they need to be there. A resulting increase in sales clearly could justify the in-memory investment.
Look inside before going in-memory
It's also a good idea to take the overall characteristics of your organization into account. In-memory databases are worth considering if one or more of the following terms can be used to describe the environment you work in.
Open to investing in IT. Corporate execs must be willing to spend money on hardware with enough memory to satisfy the processing needs of business applications, even though systems scaled out to support in-memory computing carry a higher price than disk-heavy database servers do.
Analytically agile. In-memory systems can power reporting and analysis applications that help improve business processes -- and results -- by enabling end users to make informed decisions on a shorter cycle. For example, transitioning from weekly to hourly sales forecasting can lead to the creation of real-time product pricing models that increase profitability -- as long as pricing decisions can also be communicated and executed rapidly.
Supportive of mixed-use development. Allowing transactional and analytical applications to simultaneously access the same database is another way to provide real-time analytics capabilities. But resource conflicts can cause performance problems with a conventional relational database, largely due to the latency associated with finding and accessing data records stored on disk. With an in-memory configuration, latency becomes less of an issue.
Data-aware. In-memory technology can also be a valuable tool when a large percentage of data access calls touch only a small part of a database. According to a 2013 white paper from data warehouse and database vendor Teradata, 43% of queries against data warehouses it studied accessed just 1% of the available information, while 92% of queries used only 20% of the data at hand. Identifying "hot" data that's accessed frequently and keeping it in memory should greatly reduce query response times.
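The "hot data" pattern behind those statistics can be sketched with a small least-recently-used cache in front of a slower store. This is a minimal sketch under assumed conditions, not a feature of any particular product; the class name, capacity and workload are invented for illustration. The point is that with a skewed access pattern, a memory-resident cache holding a small fraction of the data absorbs the bulk of the lookups.

```python
from collections import OrderedDict

class HotDataCache:
    """Keep the most recently used rows in memory; evict the coldest when full."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.cache = OrderedDict()       # key -> row, coldest entry first
        self.hits = self.misses = 0

    def get(self, key, fetch_from_disk):
        if key in self.cache:
            self.hits += 1
            self.cache.move_to_end(key)  # mark as recently used
            return self.cache[key]
        self.misses += 1
        row = fetch_from_disk(key)       # slow path: go to disk
        self.cache[key] = row
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)  # evict the least recently used row
        return row

# Skewed workload: 90 lookups hammer 5 "hot" keys, then 10 touch cold keys.
cache = HotDataCache(capacity=10)
workload = [k % 5 for k in range(90)] + list(range(100, 110))
for key in workload:
    cache.get(key, fetch_from_disk=lambda k: ("row", k))

print(cache.hits, cache.misses)  # prints: 85 15
```

Here a cache holding just 10 entries serves 85 of 100 lookups from memory; only the first touch of each key, plus the cold tail, goes to disk. That is the intuition behind pinning frequently accessed data in memory to cut query response times.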
In summary, organizations whose business processes can benefit from real-time data availability, simultaneous mixed-use applications, and noticeably faster reporting and analytics are good candidates for deploying in-memory databases. In some scenarios, the decision to do so is a no-brainer. But in most cases, the consideration of in-memory software must be aligned with IT spending priorities and corporate business objectives -- including a demonstrable understanding of how key areas of corporate performance could be improved by the faster transaction processing and quicker access to reports and ad hoc query results that in-memory processing makes possible.
About the author:
David Loshin is president of Knowledge Integrity Inc., a consulting and development services company that works with clients on big data, business intelligence and data management initiatives. He also is the author or co-author of numerous books, including The Practitioner's Guide to Data Quality Improvement. Email him at email@example.com.