The mainstay of the modern enterprise has always been the transactional database, where operational data is processed...
and stored. Transactional databases come in many shapes and sizes, but their similarities now outweigh their differences, which can make selecting one a chore for many organizations.
In the past, considerations such as hardware requirements, available features, maximum table sizes and deployment costs typically were the key factors that led to the choice of one database management system (DBMS) over another. With so much commonality among the products available today, those choices have become harder to make on features alone.
What’s more, the open source movement has spawned databases that for all intents and purposes are free – at least from the standpoint that there’s no initial cost to purchase the database system software. Other expenses remain much the same, though: the cost of hardware, IT facilities, programmers and support staff, maintenance contracts and so on.
That has created a conundrum for organizations looking to deploy new databases: Is the overall cost of an open source database much different from that of a commercial product once all of the tangible and intangible elements are included? Answering that question is best left to return on investment and total cost of ownership calculations, but it does raise an issue that needs to be taken into account when a company is developing or updating its database strategy.
There are other issues as well. First and foremost, economic pressures are driving many database system software trends, from increased efficiency within the DBMS to lower maintenance demands and integrated data continuity capabilities.
The new economics of database system software
Increased database efficiency can help lower capital expenditures and operational costs by reducing hardware demands or enabling users to maximize their CPU utilization with server virtualization technologies. Simply put, more efficient databases require less server hardware. Improved resiliency and the automation of many database maintenance tasks can further reduce operational costs. A database that can, say, automatically extend tables on demand requires less administration and, thus, fewer dollars to be spent on the salaries of database administrators.
One of the biggest concerns facing organizations with significant investments in transactional databases and DBMS technologies is data continuity. Database vendors are now offering capabilities designed to improve continuity while reducing the reliance on third-party disaster recovery products. That includes technologies such as integrated data mirroring, data replication, automated transaction rollbacks, integrated backup and other tools that can help a database deliver 24/7 availability. Such enhancements can offer a big payback to companies.
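To make the rollback idea concrete, here is a minimal sketch using Python's built-in sqlite3 module as a stand-in for a transactional DBMS (the table and values are invented for illustration). A failure partway through a transaction is undone automatically, leaving the data in its prior consistent state -- the same behavior that integrated rollback and backup features automate at enterprise scale.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance INTEGER NOT NULL)")
conn.execute("INSERT INTO accounts VALUES (1, 100), (2, 50)")
conn.commit()

try:
    with conn:  # opens a transaction; commits on success, rolls back on error
        conn.execute("UPDATE accounts SET balance = balance - 70 WHERE id = 1")
        # Simulate a mid-transaction failure (e.g., a crash or constraint violation)
        raise RuntimeError("simulated failure before the transfer completes")
except RuntimeError:
    pass

# The partial debit was rolled back; the balance is unchanged.
balance = conn.execute("SELECT balance FROM accounts WHERE id = 1").fetchone()[0]
print(balance)  # 100
```

Production databases perform the equivalent recovery from their transaction logs, without any application-level handling.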
Another trend affecting the database planning process is cloud enablement. With the growth of cloud-based applications and services, many databases are undergoing evolutionary changes in areas such as how they interact with applications, their support for disconnected transactions and how they deal with high-latency, semi-persistent connections. And of course, cloud databases and Database as a Service technologies are emerging as potential alternatives to traditional on-premises database system software.
However, data security is rapidly becoming the biggest challenge for companies on database projects – particularly as large transactional databases are Web-enabled, putting ones that aren’t properly secured at risk of being breached by hacker tools and other intrusion mechanisms. Databases need to be well integrated with security tools to authenticate end users, encrypt transactions and prevent data leakage. The trend here is for database vendors to offer application programming interfaces, security modules and other technologies to enforce security policies without hampering legitimate transactions.
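One basic example of the kind of safeguard a database API can provide -- chosen here for illustration, not drawn from any particular vendor's toolkit -- is parameterized queries, which bind user input as data rather than splicing it into the SQL text, closing off a common intrusion path. The table and input below are hypothetical.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin'), ('bob', 'clerk')")

user_input = "alice' OR '1'='1"  # a classic injection attempt

# Concatenating user_input into the SQL string would match every row.
# The ? placeholder treats the entire string as a single literal value.
rows = conn.execute("SELECT name FROM users WHERE name = ?", (user_input,)).fetchall()
print(rows)  # [] -- no user is literally named "alice' OR '1'='1"
```

Vendor security modules layer authentication, encryption and policy enforcement on top of basics like this, which is why integration between the database and the surrounding security tools matters.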
A new union: mixing database system software and virtualization tools
Integrated support for virtualization is also growing. While server virtualization became a big IT trend in 2008 and 2009, many IT managers have held off on putting databases inside virtual machines. Now that virtualization technology has matured, more organizations are looking to take advantage of it with databases in order to reduce their hardware costs.
But IT departments need to make sure that their database vendors are providing the management, monitoring and resiliency tools needed to deal with the intricacies of database virtualization. In addition, virtualization can further change the database security environment: Existing enforcement models based on network-perimeter protections and network monitoring procedures might not be able to cope with the dynamic nature of virtualized systems.
Luckily, many of the traditional rules of thumb for selecting database system software do still apply. The final choice should take into account metrics such as transaction load, transaction processing performance, database size and potential scalability needs, costs (both tangible and intangible), and compatibility with the existing IT environment, applications and infrastructure.
Of course, there are also other factors to consider, such as technical support, training and the availability of database design and management tools. The key is to understand where ongoing and emerging database technology trends will lead your organizational requirements so you can start to narrow the field, and then use the metrics outlined above to help you vet the DBMS options that could meet your needs.
About the author: Frank Ohlhorst is an award-winning technology journalist, professional speaker and IT business consultant with more than 25 years of experience in the technology market. He has written articles for a variety of technology and business publications, and he worked previously as executive technology editor at eWeek and director of the CRN Test Center.