This article originally appeared on the BeyeNETWORK.
Here is an example of the classic chicken-and-egg dilemma: Do we build a data warehouse to support a business need for performance or productivity analysis, or do we conduct performance and productivity analysis because we have a data warehouse? The deeper question is more fundamental: does the business drive the technology, or the other way around?
I suggest this puzzler because of two recent sets of events. The first arises from current customer activity at two different sites. Both of these clients want to improve their reporting, although with different drivers. Client A is interested in learning as much as possible from their data: trends, opportunities for up-sells and cross-sells, best customer analysis, lifetime customer value – all sorts of interesting stuff. However, there are some constraints in the organization: most of the data resides in databases hosted by offsite vendors, senior management support is limited, and staff are not trained in analysis. In fact, they do not even have any processes in place to react to discovered knowledge. Consequently, if they did classify their customers and identify key marketing segments, it is unlikely that they would be able to take any action. Additionally, they have no performance metrics defined to track any benefits or lift.
Client B is interested in evaluating their current (minimal) processes for productivity reporting and improving those processes. They are spending time analyzing their current approaches to capturing metrics – looking at the reports that are generated, who is providing the data that feeds those reports, etc. The intention is to understand the processes that they employ to perform their reporting and look for opportunities to improve those processes. However, they are not evaluating how these metrics correspond to their operational or program performance. They think they want to build a data warehouse because it will more easily facilitate the construction and delivery of their current (less than meaningful) reports.
The second is the steady stream of people who approach me at conferences and workshops, asking me to help them figure out how to justify an investment in developing data quality or business intelligence programs. While both the IT and business partners agree that data quality and business intelligence projects are of value, their organizations require a business case demonstrating a return on any technology investment. Yet it is often difficult to demonstrate the hard value of certain activities that we all (seem to) believe are unquestionably wise.
Let’s consider each of these situations a little more closely. Client A desires results, but has a limited commitment to adjusting the organizational structure and governance to effectively exploit discovered knowledge. Client B desires measurements, but has a limited appetite for the pre-analysis necessary to link system or program metrics to line-of-business productivity. And what the multitude of ROI-seekers need is training in effectively articulating the business value of their intended technology investment.
The common thread here lies in the desire of individuals within an organization to jump into adopting a technology before the organization itself is properly prepared to use it. For example, Client A would benefit from organizational management training focusing on performance activities, performance metrics, baselining and setting expectations for exploiting business intelligence. This client could benefit from documenting their underlying business activities, understanding the different ways those activities are measured and how they can be improved, and what the expectation is for any performance improvement. Client B’s issues could be better addressed by spending some time reviewing the organization’s various lines of business, examining their business objectives and clearly specifying how system and operational performance is tied to success in achieving those business objectives.
Finally, I have two observations regarding ROI. While the process of developing a business case by demonstrating a return on investment is based on sound principles, there are some technical and governance programs whose intangible values far outweigh the potential benefits predicted by the ROI model. Of course, in some situations there are very clear, hard benefits, and in those cases, the development of the business case should be straightforward. But when trying to adapt the ROI approach to justify doing something the right way in the first place, are we perhaps stretching the concept of ROI a little thin?
In all of these cases, it appears that determining and justifying the technical solution may be premature. Before attempting to justify any kind of technical solution, these organizations would be better served by clearly identifying business objectives through careful assessment and baselining. Once the objectives have been identified, they should also identify ways to measure how well those objectives are being met. By completing this process, they will be best positioned to see how the technology program actually does provide added value.