This article originally appeared on the BeyeNETWORK.
The concept of master data management (MDM) holds great appeal, especially in light of the cost and effort saved when consolidation reduces the complexity of replicated or redundant data and functionality. Yet while there are many good products supporting an MDM environment, the selection and purchase of a core MDM system should be the culmination of a series of preparatory steps taken once the organization is ready to make the transition. But how do we know whether the organization is properly prepared to make that change?
Readying the organization for MDM involves a bit of homework, but recognizing the value of these preliminary tasks will simplify the process once the decision to move forward has been made. This homework consists of considering three aspects of the application infrastructure and how they correspond to the use of data:
- Documentation of business processes and mapping to application functionality
- Definition and use of common information concepts and their instantiations
- Capability/maturity assessment of organizational data quality and governance
Business Process Models
One of the key business drivers for MDM is the consolidation of different data sets representing the same notions into a single (possibly virtual) resource shared across many business applications. Determining an approach for managing master data will be based on the ways that the applications use the data, and the ways that applications use data should be related to how they are intended to achieve business objectives. In reality, though, often the business process is confused with the business application; in other words, the implementation of an application to support a business process supplants the process itself.
It is, therefore, no wonder that when many staff members are asked to describe their business activities, they instead describe the details of the applications designed to support those activities. Yet decisions made at implementation time may have created artificial constraints and dependencies that could be easily addressed via MDM. At the same time, a careful review of the information process flows relating to business activities provides the basis for recognizing which entities are used across the enterprise and, correspondingly, which composing data elements are critical to business success.
Mapping the business process flow separately from the application implementation provides three benefits. First, it segregates the business process from its implementation and provides a clearer view of the business dependencies on information without the implementation dependencies getting in the way. Second, its abstract view allows the analyst to identify business information concepts whose true meanings may have been masked as a byproduct of the implementation. Third, the business process model will highlight replicated tasks applied to common concepts across different applications, setting the stage for identification of master data objects and corresponding master data services and prioritization for consolidation.
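As a purely hypothetical illustration of that third benefit, a simple cross-reference of business processes to the information entities they touch can surface candidate master data objects. The process and entity names below are invented for the sketch; an actual inventory would come from the business process models themselves:

```python
from collections import Counter

# Hypothetical map of business processes to the information entities
# they use; all names here are illustrative only.
process_entities = {
    "order entry":        {"customer", "product", "order"},
    "billing":            {"customer", "order", "invoice"},
    "customer service":   {"customer", "order"},
    "inventory planning": {"product", "supplier"},
}

# Entities referenced by more than one business process are candidates
# for treatment as shared master data objects.
usage = Counter(e for ents in process_entities.values() for e in ents)
candidates = sorted(e for e, n in usage.items() if n > 1)
print(candidates)  # → ['customer', 'order', 'product']
```

Even a toy cross-reference like this makes the point: the entities that recur across processes are exactly the ones whose replication MDM is meant to consolidate.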
Common Information Concepts
Because the application infrastructure grew organically, different systems evolved over different time periods, designed by different systems designers using different methods and carrying different personal biases. While each system is designed to meet the operational needs of its immediate business clients, the uncoordinated use of business terms common to the organization and industry allows variance to creep in. In most companies, many people use the same business terms, with each person's understanding of their meaning based on individual experience.
What emerges quickly during any data integration exercise, though, is that there is enough variance in “implied semantics” to prevent a true consolidation. One application’s view of what a “customer” is may differ enough from another application’s view to render any report based on the combined data a work of fiction. Because this variance will lengthen the time to an acceptable master data consolidation, it is worthwhile, prior to deploying the MDM program, to collect the fundamental business terminology in common use, document its numerous understandings, research authoritative sources for actual definitions, and harmonize those definitions to determine alignment (or perhaps misalignment) with actual use.
This collection of business information concepts and corresponding authoritative definitions is the fundamental layer in the master data metadata stack, which will (over time) be extended to incorporate all relevant metadata about each data element, entity, master data object, master data service and corresponding business rules. However, before embarking on the MDM metadata journey, alignment of business terminology will shake out any show-stopping issues associated with corporate information concepts prior to deciding on a deployment platform for consolidation, migration and implementation.
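One minimal way to picture this fundamental metadata layer is a glossary entry that records each application's definition of a term and flags whether the definitions agree. This is only a sketch; the fields, the agreement test, and the example definitions are invented for illustration, not a prescribed MDM metadata model:

```python
from dataclasses import dataclass, field

@dataclass
class GlossaryTerm:
    """Illustrative business glossary entry: one term, many source definitions."""
    term: str
    definitions: dict = field(default_factory=dict)  # source system -> definition text

    def is_harmonized(self) -> bool:
        # Crude alignment test: every source agrees on a single definition.
        # Real harmonization would reconcile wording, not just compare strings.
        return len(set(self.definitions.values())) <= 1

# Two hypothetical applications with subtly different views of "customer".
customer = GlossaryTerm("customer", {
    "sales app":   "a party that has placed at least one order",
    "billing app": "a party with an active account, whether or not it has ordered",
})
print(customer.is_harmonized())  # → False: the definitions must be reconciled
```

Terms that fail such an alignment check are precisely the show-stopping issues worth shaking out before committing to a deployment platform.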
Data Quality and Governance Capability and Maturity
The third critical piece of homework involves an assessment of the organizational staff’s capabilities with respect to data governance and data quality management. No master data management program can be successful in the absence of data quality management overseen via a data governance program. However, instituting MDM without having clearly defined roles and responsibilities for data stewardship, data governance and data quality management is likely to result in an environment that may have a successful migration and consolidation, but will be subject to increasing data quality “entropy” as time goes on.
Prior to kicking off the MDM program, it is beneficial to perform a maturity assessment of the organization’s data quality management capabilities. At the same time, the planners should project the level of data governance maturity necessary to support the future master data (and services) environment. This enables the definition of a road map for instituting the appropriate data governance infrastructure as well as the appropriate set of data quality management guidelines and protocols for continuous quality management of the master data asset.
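A gap analysis between current and projected maturity can be sketched very simply. The dimensions and level scores below are hypothetical placeholders; a real assessment would use the organization's own maturity framework:

```python
# Hypothetical maturity scorecard (1 = initial, 5 = optimized); the
# dimensions and target levels are invented for illustration.
current = {"data stewardship": 2, "data governance": 1, "quality measurement": 3}
target  = {"data stewardship": 4, "data governance": 3, "quality measurement": 4}

# The gap per dimension suggests where the road map must invest first.
gaps = {dim: target[dim] - current[dim] for dim in current}
priority = sorted(gaps, key=gaps.get, reverse=True)
print(priority)  # → ['data stewardship', 'data governance', 'quality measurement']
```

The point of even a rough scorecard is that the road map is driven by the gaps, not by the absolute maturity levels themselves.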
Priming the Organization for MDM
Master data management can be a highly effective technique for organizing and aligning enterprise-wide information and service use. However, jumping to the selection of an MDM implementation without readying the organization for the transition creates a high risk of failure. At the same time, understanding the business process models, the core information concepts and the organization's ability to continuously maintain measurably high quality information is not only beneficial for a master data management program, but also provides insight into the de facto application infrastructure for any future application migration or renovation project.
David Loshin is the President of Knowledge Integrity, Inc., a consulting and development company focusing on customized information management solutions, including information quality solutions consulting, information quality training and business rules solutions. He is the author of The Practitioner's Guide to Data Quality Improvement, Master Data Management, Enterprise Knowledge Management: The Data Quality Approach and Business Intelligence: The Savvy Manager's Guide, and a frequent speaker on maximizing the value of information. David can be reached at firstname.lastname@example.org or at (301) 754-6350.