The following is an excerpt from Enterprise Master Data Management: An SOA Approach to Managing Core Information, by Allen Dreibelbis, Eberhard Hechler, Ivan Milman, Martin Oberhofer, Paul van Run and Dan Wolfson. It is reprinted here with permission from IBM Press; Copyright 2008. Read the chapter below to learn about the business benefits of managed master data, or download a free .pdf of "Business benefits of managed master data" to read later.
Chapter 1 is split into four parts. This is Part 4.
Table of contents
- Part 1: Introduction to enterprise master data management
- Part 2: Why do we need master data management systems?
- Part 3: What is a master data management system?
- Part 4: Business benefits of managed master data
- Chapter 2: Master data management as a service-oriented architecture enabler
- Podcast: Listen to a podcast with Dan Wolfson, co-author of this book
Business Benefits of Managed Master Data
In this section, we take a look at some benefits of using managed master data within an MDM System. We will categorize these benefits using the same three principles we discussed earlier, that is, that an MDM System:
- Provides a consistent understanding and trust of master data entities
- Provides mechanisms for consistent use of master data across the organization
- Is designed to accommodate and manage change
In Chapters 6–8, we focus on Solution Blueprints and discuss specific examples of the business benefits of MDM within particular industries and problem areas.
1.4.1 Consistent Understanding and Trust of Master Data Entities
One of the main objectives of MDM enablement of an enterprise is to improve the quality of the master data elements for the entire business. Here, we look at some different aspects of data quality and how MDM can help improve them. With higher quality in the data comes a more consistent understanding of master data entities. A broader discussion on data quality can be found in Chapter 9.
"Bad data costs money." High-quality data is required for sound operations, decision making, and planning. Data quality is high if the data correctly represents the real-life constructs to which it refers. A traditional development rule states that $1 spent in design costs $10 in development and $100 in support. The same applies to data quality: Money and effort spent designing for higher data quality early on provides a good return on investment when compared to fixing data issues later on. Even this is a fraction of the cost of having to deal with data quality issues after they have caused business problems. In reality, the $100 measurement is probably not even close to covering the cost of bad data. The Data Warehousing Institute estimates that bad data costs U.S. businesses over $600 billion a year. The cost of bad data manifests itself in misplaced shipments, product returns, lost marketing opportunities, cost for immediate and near-term system repairs, loss of customer trust, loss of customers, and an adverse impact on sales and thus market share.
An MDM Solution needs to have the capabilities to increase the quality and thereby the trust in an enterprise's data compared to the previous state of unmanaged master data. Accuracy and completeness of master data, its consistency, its timeliness, its relevance, and its validity are the primary contributors to data quality. Accuracy of the data is defined as the degree of conformity that a stored piece of information has to its actual value. Because of the central positioning of master data within the enterprise, the accuracy of master data, in particular, plays a large role in determining the overall data quality. Data that can be validated while a customer service representative is on the phone with a client can be made more accurate than it was before by just updating a "validated-on-date" field. There is no change to the actual data, but it is now known to be accurate. Thus, knowing when the data has changed can be at least as important as the change itself. The accuracy of the data is also dependent on the context. Data that is sufficiently accurate in one context may not be accurate enough for another context. For example, a value for "age" may suffice for marketing purposes, whereas a legal document may require a date of birth.
Higher accuracy of data leads to greater efficiencies in business processes that use the data. Because of its high degree of reuse throughout the enterprise, this relationship is especially true for master data. Major improvements in data accuracy can be achieved through MDM functionality such as matching/de-duplication and structured data stewardship.
Matching and de-duplication refers to taking data from multiple source systems, multiple channels, or different interactions and matching them up with existing data in the MDM System. Before we begin the matching process, we first validate and standardize the data to improve the accuracy of the matching. Matching typically involves creating a candidate list of possible matches (also referred to as bucketing) based on the data already in the system and then comparing these candidates against the incoming record. By calculating a score for each comparison, the matches can be ranked. Typically, a threshold value indicates that certain records indeed match, and now these records can be collapsed together into a new combined record using a set of survivorship rules for each of the attributes of that record.
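The candidate-list, scoring, threshold, and survivorship steps just described can be sketched in a few lines of Python. The records, bucketing key, scoring weights, and threshold below are toy assumptions for illustration; real MDM engines use probabilistic, attribute-level comparison functions.

```python
# Hypothetical existing master records; field names are illustrative.
existing = [
    {"id": 1, "name": "JOHN SMITH", "zip": "10001", "email": "js@example.com"},
    {"id": 2, "name": "JANE DOE", "zip": "94105", "email": None},
]

def bucket_key(record):
    # Candidate-list ("bucketing") key: same ZIP code and first letter of name.
    return (record["zip"], record["name"][0])

def score(a, b):
    # Naive weighted comparison; each matching attribute adds to the score.
    s = 0
    if a["name"] == b["name"]:
        s += 50
    if a["zip"] == b["zip"]:
        s += 30
    if a.get("email") and a.get("email") == b.get("email"):
        s += 20
    return s

MATCH_THRESHOLD = 70  # assumed cut-off for declaring a definite match

def merge(winner, incoming):
    # Survivorship rule per attribute: keep the existing non-empty value,
    # fill gaps from the incoming record.
    merged = dict(winner)
    for attr, value in incoming.items():
        if merged.get(attr) in (None, "") and value:
            merged[attr] = value
    return merged

def match_and_merge(incoming, existing):
    candidates = [r for r in existing if bucket_key(r) == bucket_key(incoming)]
    ranked = sorted(candidates, key=lambda r: score(incoming, r), reverse=True)
    if ranked and score(incoming, ranked[0]) >= MATCH_THRESHOLD:
        return merge(ranked[0], incoming)  # collapse into a combined record
    return incoming  # no match above threshold: treat as a new master record
```

Records scoring between zero and the threshold would, in practice, be routed to a data steward for manual review rather than silently merged or rejected.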
Data stewardship involves human interaction to determine match and no-match cases where automated processing cannot provide a guaranteed match. This process means manually assessing the possible matching and de-duplication and subsequently deciding on the survivorship of the individual attributes. Data stewardship and the management of data quality are discussed further in Chapter 9.
Data validation is a technique to improve the accuracy and precision of the data by using a set of rules to determine whether data is acceptable for a system. Examples are data formats, range validations, limit checks, and checksums on a data element, or cross-checks on multiple data elements. In an MDM context, there may be a validation rule to describe the format of a product identifier or to validate that a person can only have, at most, one legal name.
Because the master data in an MDM System often comes from multiple source systems, and because the validation rules across those source systems are seldom fully synchronized, the MDM System provides its own superset of data validation rules. Data validation can be enforced as part of an Extract, Transform, Load (ETL) process (more on this later) when data is loaded into the MDM System. However, data validation also needs to be part of online transaction processing to validate data when MDM services are invoked. Centralizing data validation in an MDM System achieves a higher level of data validation for the enterprise as a whole.
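Such rules can be expressed as simple predicates. In the sketch below, the field names, the product-identifier format, and the age range are illustrative assumptions, not taken from any particular MDM product or standard.

```python
import re

# Per-field validation rules: format check and range check.
RULES = {
    "product_id": lambda v: bool(re.fullmatch(r"[A-Z]{2}-\d{6}", v or "")),
    "age":        lambda v: isinstance(v, int) and 0 <= v <= 130,
}

def validate(record):
    """Return a list of (field, problem) pairs; empty means the record passes."""
    errors = [(f, record.get(f)) for f, rule in RULES.items()
              if not rule(record.get(f))]
    # Cross-field rule: a person may have at most one legal name.
    legal_names = [n for n in record.get("names", []) if n.get("type") == "legal"]
    if len(legal_names) > 1:
        errors.append(("names", "more than one legal name"))
    return errors
```

The same rule set can then be invoked both from the ETL load path and from the online MDM services, which is exactly the centralization benefit described above.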
Completeness of master data is determined by the degree to which it contains all of the relevant entities, attributes, and values required to represent the real-life master constructs such as customers, products, or accounts. Typical questions asked in this regard would be whether all of the entities for a given master construct are present, including all of the required attributes for these master data entities and their values. For example, in an MDM System managing customer information, are all of the required addresses (shipping, billing, or vacation) available? Do they contain the required attributes (address line, city, postal/zip code), and are the values for these attributes provided?
Completeness is also dependent on business context—what is required in one context might be optional in another. For example, in the case of a life insurance customer, smoking status is a mandatory attribute, and the master data record for this customer would not be complete without a valid value for this mandatory attribute. The same customer record in the context of car insurance can be considered complete without this attribute. An MDM System servicing both verticals would need to be flexible enough to support this contextual distinction. MDM Systems typically have many different contributors of data within an enterprise. This enables an MDM System to maintain a more complete picture of the master data than any of the contributing systems on their own, because each keeps only a subset of the total data as required for their business purpose. In other cases, the MDM System might be the system where the collective data from the source systems is augmented with additional information not kept in any of the source systems.
Borrowing from the general notion of consistency in formal logic, we can define master data to be consistent when data retrieved through two different locations, channels, applications, or services cannot contradict itself. In other words, at no time should the manner by which the data is accessed have an effect on the information it represents. In a consistent environment, the values for data should be the same.
This might seem obvious, but as we saw in the previous section, the initial scattering of unmanaged master data across the enterprise is often natural and more or less unavoidable. What is also unavoidable is that the quality of that master data differs from system to system. It is therefore entirely possible that these sources disagree on a particular aspect of a master data object. For example, what one system has stored as a billing address might be kept as a shipping address in the other. Similarly, the date of birth of a customer in one source system may be different from that in another. A billing system might have accurate account and address information but would not be the trusted source for date of birth or e-mail address, while an online self-service system would have more accurate e-mail information but not necessarily the best postal addresses.
Even when we only focus on a single system, there can be large variations in data quality due to variations in the level of data consistency. All of these types of inconsistencies need to be addressed by the MDM System, both at the time of deployment and throughout its lifecycle. Being centrally positioned within the enterprise, an MDM System is in a unique position to improve the consistency of master data for an entire enterprise. Consistency is also determined by the level of standardization, normalization, and validation that was performed on the data. Data standardization ensures that the data adheres to agreed-upon guidelines. For example, address standardization determines what an address should look like for a specific geography and gives a fixed format based on a postal code look-up. Many other elements, such as first and last names, can also be standardized. Standardization greatly improves the ability for computer systems to locate and manipulate data elements.
Data normalization describes the organization of data elements in related subcomponents. This can be thought of in the traditional context of the database modeling technique of normalization but also on a much smaller scale, for example, parsing a personal name such as "MARIA LUZ RODRIGUEZ v. de LUNA" into the correct data structures.
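Both ideas can be sketched briefly. The abbreviation table and surname-particle list below are toy assumptions; production systems rely on locale-aware postal standardization and name-parsing services rather than hand-rolled rules.

```python
# Toy address standardizer: strip punctuation, uppercase, and expand
# common abbreviations, approximating a postal-standardization service.
ABBREVIATIONS = {"ST": "STREET", "AVE": "AVENUE", "RD": "ROAD", "N": "NORTH"}

def standardize_address(line):
    tokens = line.upper().replace(".", "").split()
    return " ".join(ABBREVIATIONS.get(t, t) for t in tokens)

# Toy name normalizer: split a personal name into given names and a
# surname that may begin with particles such as "de" or "van".
PARTICLES = {"de", "del", "la", "van", "von", "v.", "vda."}

def parse_name(full_name):
    tokens = full_name.split()
    i = len(tokens) - 1
    # Walk backwards while the preceding token is a surname particle.
    while i > 0 and tokens[i - 1].lower() in PARTICLES:
        i -= 1
    return {"given": " ".join(tokens[:i]), "surname": " ".join(tokens[i:])}
```

For example, `standardize_address("123 n. main st.")` yields `"123 NORTH MAIN STREET"`, and `parse_name` keeps the particles `"v. de"` attached to the surname `LUNA` in the name cited above.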
As we saw earlier in Section 1.3.5, some MDM implementation styles are more prone than others to show some level of data inconsistency. In essence, as soon as master data appears in multiple places, there is a potential for data inconsistencies. This is true even if these sources are managed replicas. Replication technology is very good at being able to keep multiple copies synchronized—but there is often some amount of lag between copies as changes take place.
The timeliness of master data is another important factor determining its quality. Master data changes relatively slowly compared to other forms of business data. For example, we have observed that in financial institutions, customer data changes around three percent per day and that contract-related information can change eight percent per day. Address and phone numbers for individuals seem to change, on average, every 2.5 years. Product information in retail can change quickly as retailers introduce seasonal products into their catalogs.
These changes often take time to propagate through the enterprise and its systems. With this propagation comes a delay between the data being changed and the availability of this change to the data consumers—the longer the delay, the greater the potential loss in data quality.
A typical example of this is a traditional data warehouse where data is extracted from source systems, cleansed, de-duplicated, and transformed for use in an analytical context. Because many data warehouses are used for off-line decision support, it is common for them to be updated on a daily or sometimes weekly basis. Thus, the data is always somewhat out of date with respect to the operational systems that feed it. Such warehouses may not be suitable for operational usage. An MDM System may take on the task of maintaining this cleansed version of the master data on an ongoing basis, while serving as a source for the data warehouse. In this case, the MDM System is providing on-line access to this cleansed data.
Another factor in the timeliness of the data is its freshness. Captured data typically deteriorates at a certain rate because the real-life constructs it represents change over time, and not all of these changes make it back into the captured data. For example, on average about 20% of all Americans change their address every year.
Timeliness thus affects both consistency and accuracy. Propagation delay impacts the consistency of the information across systems, whereas freshness is one indication of how accurate the data remains.
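Using the address example above, a back-of-the-envelope freshness estimate is possible: if a fraction of records goes stale each year, the share still accurate after a given time decays geometrically (assuming independent changes at a constant rate, which is a simplification).

```python
def fraction_still_accurate(annual_change_rate, years):
    # Geometric decay: each year, (1 - rate) of the remaining records
    # stay accurate; compounding over t years gives (1 - rate) ** t.
    return (1 - annual_change_rate) ** years

# With ~20% of addresses changing per year, after the 2.5-year average
# change interval cited above, only about 57% remain accurate.
print(round(fraction_still_accurate(0.20, 2.5), 2))  # 0.57
```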
Data relevance is a measure of the degree to which data satisfies the needs of the consumer. It measures how pertinent, connected, or applicable some information is to a given matter. What is obvious from this definition is that relevance is also context-sensitive. Master data that is relevant in one context or to one user might be irrelevant in another context or to another user. If all relevant information is captured for the different consumers of the information, then the information can be considered to be complete.
For example, the physical dimensions of a grocery item in a product information system are relevant to someone in shipping but irrelevant to a translator who works on creating a Spanish version of the product catalog. In the description of completeness, we discussed the example of a smoker status and its relevance to different lines of business in an insurance company. Data relevance determines why and what we measure or collect. To ensure data relevance, the "noise" (unnecessary information) factor of the data needs to be reduced. In the case of operational data, relevance is usually determined during the definition and requirements phase of an MDM project. For example, during this phase of an MDM implementation, a gap analysis can be used to determine which relevant data elements need to be added to an existing data model. If data from multiple source systems is combined in an MDM System, then the relevance of the data elements from a single system to the enterprise as a whole needs to be determined.
The MDM System does not necessarily need to contain the sum of all of the parts. Certain pieces of data might be irrelevant from an enterprise point of view. These additional data elements may continue to be maintained in the line-of-business systems. Relevance may also change over time. As business needs change, what is relevant today may change tomorrow. Accommodating these changes is a natural part of the evolution of an MDM System.
We can trust data when we know that it has met an appropriate set of standards for accuracy, cleanliness, consistency, and timeliness—that is, when we know that data stewards manage the data and that the data is protected from unauthorized or unmanaged updates. The more we know about the data, the more we understand the data itself and what is meaningful (and what is not), and the more we learn how to gauge our trust in the information. We can learn about the data and data quality using data profiling tools, and we can begin to understand the provenance or lineage of the data through a combination of automated and manual techniques.
We can aggregate master data from across the enterprise, clean it, reconcile it, and then manage it so that we control who is allowed to see it and who is allowed to change it. By actively managing master data in this way, we can assert our trust in this master data. When we believe that we can trust the data—that we manage and maintain a collection of trusted information—then we can be an authoritative source of that data for other users and applications.
1.4.2 Consistent Use of Master Data Across the Organization
It is not just the quality and consistency of the data that is important—it is also the consistent usage of that master data throughout the enterprise. MDM Systems offer a consistent, comprehensive view of master data across the organization. Typically, this unified view is not available before an MDM implementation takes place. In this pre-MDM situation, master data is typically spread out across multiple, autonomous line-of-business systems. These systems could be of a homogeneous or heterogeneous nature. An example of a homogeneous situation will be presented in Chapter 8, where we will see a solution blueprint using an MDM System to provide a consistent view of product data across SAP systems from multiple geographies. A heterogeneous example could be one where customer data is stored both in Siebel CRM and in a custom-built billing system. The benefits of using MDM in this situation come from the data quality improvements we saw earlier and from cost savings and efficiencies we will describe later, as well as from improved support for regulatory compliance.
1.4.2.1 Cost Savings and Efficiencies
Cost reduction and avoidance is another benefit of tackling MDM. There are many operational savings and efficiencies that can be achieved by implementing reusable services supporting key processes such as name and address change in CDI or product information changes in PIM. In an unmanaged master data environment, transactions like these typically need to be applied to every application that contains such data, often by manually rekeying information. Depending on the MDM implementation style in use, such processing can be drastically reduced by only updating the coexistence or operational MDM hub and automatically forwarding such changes to the interested applications, or by having these other applications consult the MDM System directly for this master data. This reduces the effort required to propagate changes through the enterprise, ensures that all relevant systems are updated, and improves the quality of the propagation by guaranteeing that every system receives the same update and that no rekeying errors occur.
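The hub-and-forward pattern described here can be sketched as a small publish/subscribe loop. The class and method names below are invented for illustration; a real hub would deliver changes over messaging middleware or service calls rather than in-process callbacks.

```python
# Sketch of hub-based change propagation: apply the update once at the
# master record, then notify every subscribed downstream application.
class MdmHub:
    def __init__(self):
        self.records = {}        # master data keyed by entity id
        self.subscribers = []    # callbacks standing in for downstream systems

    def subscribe(self, callback):
        self.subscribers.append(callback)

    def update(self, entity_id, changes):
        record = self.records.setdefault(entity_id, {})
        record.update(changes)
        # One update here replaces rekeying the change into each system.
        for notify in self.subscribers:
            notify(entity_id, dict(record))
```

A single address change applied through `update` reaches every subscriber with identical content, which is the source of both the effort reduction and the elimination of rekeying errors.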
Other cost avoidance opportunities can be identified by focusing on the benefits of having an enterprise-wide view of the master data entities available in the MDM Solution. This enterprise view allows for the discovery of relationships between entities that were previously only distributed across multiple systems. In the case of CDI, we can therefore bring together all of the information about a party from across all of the different systems, including all of the addresses at which the client can be reached, all of the products the client owns (obtained from different lines of business), family relationships, or additional identifiers such as driving license or passport number. All of this information can be used to optimize dealings with the client and provide a better customer experience in dealing with the company—thereby increasing customer retention.
MDM can lead to a reduction in data storage costs and total cost of ownership of a solution by removing redundant copies of master data, although this benefit mainly occurs in a consolidation and transactional hub style of MDM. The data volumes occupied by master data in a typical enterprise are very significant and there can be substantial savings in storage costs. In a registry or coexistence style MDM implementation, the storage requirements typically increase, because all of the existing unmanaged master data copies are still maintained, together with the new data storage requirements for the MDM hub. Another caveat here is that in many cases an MDM System starts to store more master data than was originally available in unmanaged form. In essence, the master data in the MDM System is augmented with data previously not recorded in the enterprise. Data fields may need to be larger—for example, the ID field needs to be longer because more entities appear in a single system. Typical examples can be found in the area of privacy preferences or e-mail addresses that were previously not recorded in the older source systems.
Enterprise resources such as money, labor, advertising, and IT systems are typically scarce commodities that need to be applied in those areas where they can offer maximal return on investment. Unmanaged master data hinders this resource allocation because the information required to drive decisions is scattered among many systems. Questions such as "Who are my most valuable customers?," "Which are my best selling products?," and "Is there fraud—and if so, where?" require a managed set of master data.
In traditional operating environments, such decisions are based on information from a data warehouse, which often is the only location where data was available in cleansed, unduplicated form. Data warehouses, however, have inherent design characteristics to optimize them for analytics and reporting, and they are generally not designed to support operational transactions. In addition, this data often has a higher degree of latency and is therefore somewhat stale. Without MDM, a customer who just bought a high-end product over the Web and is calling in to the call center might not appear as a very valuable client to the customer service representative. Without managed master data, it is very difficult to get a complete view of such a customer or product and determine its value to the enterprise. Consequently, resources can't be optimally applied, and it is difficult to provide a higher level of service.
Supporting all channels with managed master data delivers common, consistent information that allows customer service representatives to give the same discount to a client on the phone as the one he or she just handled by mail or on the Web site. Managed master data allows for the product description in the printed catalog to match the one on the Web site and printed on the product. The consistency MDM offers leads to cost savings because it reduces the effort required to process this data at a channel level. This reduction in effort is a significant improvement over the effort required to keep consistency across channels in an unmanaged master data environment.
The overall picture that MDM creates of the master data is more complete than any of the pictures in the contributing systems, and it is therefore more useful for due diligence processes or to detect potential fraud. In fact, we can use MDM to proactively uncover fraud and to create alerts or take appropriate actions. After the master data is managed by an MDM system, we can determine relationships between master data entities that were not detectable before. For example, it is very valuable to detect that a new prospective client is co-located at an address of another customer with a very similar name who is on the bankruptcy list, or to figure out that the manager in charge of purchasing is married to one of your biggest vendors.
MDM allows for the streamlining and automation of business processes for greater efficiency. Furthermore, MDM centralizes the master data within the enterprise and enables the refactoring and reuse of key business processes around that master data. For example, CDI facilitates the development of enterprise-wide New Account Opening processes, and PIM enables the development of enterprise-wide New Product Introduction processes.
1.4.2.2 Regulatory Compliance
Many newspaper articles commence with "Since Enron and September 11, 2001" when discussing regulatory compliance, but regulatory bodies have been around for much longer than this. The original Anti-Money Laundering (AML) controls were implemented in the Bank Secrecy Act of 1970 and have been amended up to the present. The Basel Committee first came together in 1975 as a result of the failure of Bankhaus Herstatt.
Since the two mentioned incidents, however, the pace and rigor of new regulations has increased significantly. In addition, it is very rare for one of these regulations to be withdrawn and disappear. Since 1981, over 100,000 regulations have been added in the United States. Consequently, the number of regulations that a modern enterprise needs to adhere to is continually increasing, as are the expenses associated with compliance. According to a study by the Financial Executives Institute, companies should expect to spend an average of $3 million to comply with section 404 of the Sarbanes-Oxley Act (SOX). Forrester Research estimates the five-year cost of a Basel II implementation for the largest banks to be $150 million. Obviously, there are vast differences between all these regulations: by industry and by geography, and also by the strictness, penalties, and consequences of noncompliance. Some of the most well-known regulations are Basel II, Sarbanes-Oxley (SOX), the Patriot Act, Office of Foreign Assets Control (OFAC) watch lists, Solvency II, "Do not call" compliance from the Federal Communications Commission (FCC), Anti-Money Laundering (AML), and HIPAA.
Some of these regulations are global, while others are specific to North America or Europe, but most have equivalent regulations across all geographies. Obviously, implementing point solutions for each of these regulations separately is not viable, and enterprises have to look at approaching this situation in a more holistic manner. Fortunately, even though they have different policies, many regulations share common objectives that require that authoritative data is used in business processes and that proper controls over key data are in place. Organizations are therefore starting to establish regulatory frameworks instead of addressing each of these regulatory compliance initiatives with a dedicated point solution. This approach also better positions them to adapt to changes in regulation or the introduction of new regulations in the future. Achieving regulatory compliance initially does not add anything to a company's bottom line; it is a pure cost, incurred as a "penalty avoidance" measure. However, the solutions that are put in place to achieve compliance can be leveraged to drive many other advantages and differentiators in the market. Thus, as you can see in Figure 1.17, the more maturely compliance is handled, the more business value for the stakeholders can be derived from it.
To illustrate the relationship between compliance and master data more concretely, we now describe a few of the high-impact regulatory policies.
Figure 1.17 Maturation from risk mitigation and penalty avoidance to leveraging risk and compliance as a competitive advantage.
- CDI—Know Your Customer (KYC)
KYC is a compliance policy related to the Bank Secrecy Act and the USA Patriot Act and to international standards such as Solvency II and the International Accounting Standards. It requires financial institutions to diligently identify their clients and obtain certain relevant information required to enact financial business with them. One aspect of KYC is to verify that the customer is not on lists of known fraudsters, terrorists, or money launderers, such as the Office of Foreign Assets Control's (OFAC) Specially Designated Nationals list. Another is to obtain an investment profile from customers to identify their risk tolerance before selling them investment products. CDI systems are designed to store and maintain identifying pieces of information, such as driving licenses, passports, and Social Security Numbers on the parties in the system. Through the de-duplication functionality available in an MDM System, companies have a much better chance of correctly identifying two parties as being one and the same. CDI systems can store KYC party profile information like the questionnaire answers obtained for a financial profile of the client, and CDI systems can more easily check a company's entire list of customers, vendors, employees, and so on, against any of the known felon lists.
Privacy is defined as a basic human right in the "Universal Declaration of Human Rights," and although data privacy legislation and regulation is not very strict in the United States, it is much more rigidly defined and enforced in Canada and especially in Europe. The European Commission's "Directive on the Protection of Personal Data" states that anyone processing personal data must comply with the eight enforceable principles of good practice. These principles state that data must be:
- Fairly and lawfully processed
- Processed for limited purposes
- Adequate, relevant, and not excessive
- Accurate and, where necessary, kept up to date
- Not kept longer than necessary
- Processed in accordance with the data subject's rights
- Kept secure
- Not transferred to countries without adequate protection
Verification of these principles within an enterprise requires strict management and governance of the company's master data. This governance includes both the referential and persisted storage of the data as well as management of the processes handling this data. These and other requirements originating from data privacy legislation and regulations can be serviced by using MDM System features. Access to the data needs to be restricted to those who have the rights to administer it through user authorization and authentication, and data entitlements. Private data cannot be kept indefinitely, so archiving and data deletion features need to be present. Preferences for "do not mail," "do not call," and "do not e-mail" need to be available within the MDM data models in order to comply, for example, with the "National Do Not Call Registry" in the United States. These kinds of data augmentations are often easier to implement in a centralized MDM System than in silo-based administrative systems.
- CDI/Account, Credit Risk Mitigation
Unfortunately, not everybody pays their bills. Credit risk is the risk of loss due to non-payment of a loan or other line of credit. Offering more credit to a particular client increases the credit risk for the company. Risk is offset against the potential gains that can be made on the loan. However, in many cases, companies cannot even clearly determine the credit risk to which a particular customer exposes them, because they do not have sufficiently cleansed and de-duplicated customer data. The same customer might exist in the system multiple times and carry credit on every instance, thereby increasing the creditor's risk. Using an MDM System to keep customer data clean and de-duplicated can help lower risk exposure for the company by providing a clear picture of the risk exposure of a particular client. Another common occurrence is that some of a company's vendors are also its clients. Realizing that these two parties are one and the same can increase a company's negotiation position when credit is drawn on one side of this relationship and offered on the other.
- PIM and Regulatory Compliance
Product information is also a heavily regulated asset. Packaging information, export and customs information, ingredients lists, warning labels, safety warnings, manuals, and many other types of product information all have to adhere to format, content, and language rules that are very industry- and geography-specific. Centrally storing this data in a PIM system helps the compliance process, but it is the collaborative workflow processes around maintaining this data in a PIM system that enable proper control of product master data to ensure regulatory compliance.
- PIM/RFID—Regulatory Compliance—Traceability in the Pharmaceutical Industry
The Prescription Drug Marketing Act (PDMA) of 1988 in the United States mandates that drug wholesalers that are not manufacturers or authorized distributors of a drug must provide a pedigree for every prescription drug they distribute. This regulation was created to prevent drug counterfeiters from entering illegal and potentially dangerous products into U.S. commerce. While the implementation of this regulation is still being contested in court and was postponed in 2006, several states have stepped up their individual pedigree legislation. Most notably, California has adopted a requirement that the drug's pedigree be available in electronic form. Electronic Product Codes (EPC) and Radio Frequency Identification (RFID) are two promising technologies that are being used in this area. By building applications around these technologies, individual shipment lots can be tracked to ensure their pedigree. Hooking this transactional data up to the product information stored in a PIM system allows for a full 360-degree view of a product, its detailed information, and its passage through the supply chain.
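Conceptually, an electronic pedigree is a chain of custody events keyed by an EPC. The sketch below checks that each hand-off starts where the previous one ended; the event fields are illustrative and do not follow the EPCIS standard schema:

```python
# Hypothetical pedigree: custody events for one EPC-identified lot.
pedigree = {
    "urn:epc:id:sgtin:0614141.107346.2017": [
        {"from": "Manufacturer A", "to": "Distributor B"},
        {"from": "Distributor B",  "to": "Wholesaler C"},
        {"from": "Wholesaler C",   "to": "Pharmacy D"},
    ]
}

def chain_is_continuous(events):
    """A pedigree is suspect if any hand-off does not start where the
    previous one ended -- a gap where counterfeit goods could enter."""
    return all(prev["to"] == cur["from"]
               for prev, cur in zip(events, events[1:]))

epc = "urn:epc:id:sgtin:0614141.107346.2017"
print(chain_is_continuous(pedigree[epc]))  # True: unbroken custody
```

Linking each EPC to its product master record in the PIM system is what turns this transactional trail into the 360-degree product view described above.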
- Account Domain and Regulatory Compliance
Today, all financial institutions globally are required to monitor, investigate, and report transactions of a suspicious nature to their central banks. They must perform due diligence in establishing the customer's identity and the source and destination of the funds. The account domain of an MDM System can be used to provide references to accounts that exist elsewhere, in other back-end administrative account systems, and to show how they are related to parties in the system. This information can be very useful in identifying possible cases of money laundering. Alternatively, the MDM System can be used as the system of record for account information, in which case the account exists and is managed solely within the MDM System. In this second case, the transactions that are being performed on such an account need to be monitored for possible fraudulent behavior.
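As a toy illustration of the monitoring obligation, the sketch below flags deposits over a reporting threshold and a simple "structuring" pattern (several deposits just below the threshold). The threshold and rules are invented for the example, not any jurisdiction's actual regulation:

```python
# Invented monitoring rules for an MDM-managed account.
REPORT_THRESHOLD = 10_000
STRUCTURING_WINDOW = 3  # consecutive deposits to inspect

def flag_suspicious(deposits):
    flags = []
    # Rule 1: any single deposit at or above the reporting threshold.
    for i, amount in enumerate(deposits):
        if amount >= REPORT_THRESHOLD:
            flags.append((i, "over reporting threshold"))
    # Rule 2: a run of deposits kept just under the threshold.
    for i in range(len(deposits) - STRUCTURING_WINDOW + 1):
        window = deposits[i:i + STRUCTURING_WINDOW]
        if all(0.9 * REPORT_THRESHOLD <= d < REPORT_THRESHOLD for d in window):
            flags.append((i, "possible structuring"))
    return flags

print(flag_suspicious([9_500, 9_800, 9_900, 12_000]))
```

Real anti-money-laundering monitoring is far more sophisticated, but the de-duplicated party-to-account relationships held by the MDM System are what let such rules see all of a customer's accounts at once.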
1.4.3 Accommodate and Manage Change
In this section, we take a look at various aspects of managing and accommodating change within an organization, viewed from an MDM perspective.
1.4.3.1 Reducing Time to Market
Marketplaces are increasingly volatile, competitive, and risky. For businesses to participate in these markets, they must be able to respond rapidly to directional, structural, and relationship changes within their chosen industries and market sectors. Reducing the time to market for their New Product Introduction is a critical objective in this pursuit. Time to market is defined as the amount of time it takes to bring a product from conception to the point where it is available for sale. Across industries, different phases in the product development process are identified as the start or end point of the Time to Market process. In some industries, the start is defined as the moment a concept is approved; in others, it is when the product development process is actually staffed. The definition of the end of the Time to Market measurement is also open to interpretation. In some industries, it may be defined as the handover from product engineering to manufacturing; in others, it may be the moment the product is in the client's hands. Regardless of the scope of the process, what is important to a business is the relevant measurement of its Time to Market against that of its direct competitors. Getting to the market first is important for various reasons. It allows an enterprise more freedom in setting the product price, because no competitive products are available until the competition catches up. It may also allow an enterprise to obtain an early foothold and capture an initial market share before its competitors, allowing the organization to profile its brand as the industry leader in that area.
It is critical for a successful and optimal execution of the New Product Introduction that consistent, high-quality information about the product is available to all parties involved in the NPI process. Unmanaged, scattered master data about products leads to inconsistencies, inaccuracies, and therefore to delays in Time to Market for the product, providing opportunities to competitors to react and get to market first. MDM is a key enabler to the management of these collaborative workflows. By obtaining a consistent, cleansed, and accurate version of the product data in a PIM system, many NPI processes can be improved. Steps in the NPI product development process typically include checking, review, approval, and control of product structures. It is therefore critical to manage the related product information in the same manner and to provide a consistent implementation of the NPI process in the PIM MDM System.
1.4.3.2 Revenue Enhancement and Other New Opportunities
MDM provides a higher level of insight into master data, and this can be used to identify opportunities for revenue improvement. Increased insight into high-value customers through profiles, or account and interactions information, can be used to identify candidates for up-sell or cross-sell opportunities. Increased insight into master data around products can then be used to identify which up-sell and cross-sell opportunities exist when selling a particular product to that customer and which bundling opportunities can be leveraged.
Events relating to master data can be analyzed to identify revenue opportunities. For example, residence changes and other life events can alert sales to potentially changing customer needs. Without MDM, there is no enterprise-wide ability to recognize and communicate such events and thus no sales actions are taken, no e-mail campaigns are directed based on such events, and no outbound telephone calls or Web offers are made. All of these result in missed revenue opportunities.
1.4.3.3 Ability to Rapidly Innovate
Companies cannot grow through cost reduction and reengineering alone. Innovation is the key element in providing aggressive top-line growth and increasing bottom-line results (see  for details). Innovation is the successful implementation of creative ideas within an organization. Innovation begins with a creative idea by an individual or a team, and while the initial idea is a necessary input, it is not sufficient to guarantee innovation (see  for details). To achieve innovation, the implementation of the idea needs to be successful. It is in the implementation of those creative ideas where MDM can help an organization innovate. Innovation within an enterprise can take many different forms. Product, service, and process innovation are some of the more obvious types, but marketing innovation, business model innovation, organizational innovation, supply chain innovation, and financial innovations are other examples of innovations that can contribute to increased success. All of these innovations have dependencies on the master data available within the enterprise and the processes surrounding them.
1.4.3.4 Product or Service Innovation
If a company wants to introduce an innovation around a product or a service it offers, or if it wants to start offering a new product or service, an MDM System can help centrally manage the related changes that need to be made to the product master data. Where no MDM System exists, product data may be scattered across the enterprise. Integrating across these multiple copies and ensuring all copies are properly updated acts as an inhibitor to innovation.
A product innovation may require updates to the MDM System as well. For example, what was previously a valid value for a product attribute might now be invalid, requiring a change to data validation routines. Alternatively, the innovation might require additional attributes to be kept as part of the product information, requiring changes to the data structures and metadata information. In many ways, the advantages MDM can provide here are similar to those it offers in streamlining the new product introduction process, as we have seen earlier. Another usage of MDM for product innovation is product bundling. Many organizations have separate lines of business that manage their individual product lines, often backed by isolated IT systems geared specifically to supporting one particular type of product. A common example here would be a telecommunications company that sells landline and mobile subscriptions, cable or satellite TV, and high-speed Internet access. Such a company might want to provide product innovation by offering its clients bundled products with associated discounts. None of the existing administrative systems may be suited for this purpose—however, an MDM System managing a combination of customer, product, and account information would be a logical starting point to enable such a purpose. It can provide reference links to all of the administrative systems that manage the bundle components and oversee the terms and conditions of the bundle.
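The bundling idea can be sketched as an MDM-managed bundle that holds only references to components mastered in separate administrative systems, plus the terms of the bundle itself. The system names, product IDs, and prices below are invented for the example:

```python
from dataclasses import dataclass, field

# Hypothetical reference to a product mastered in another system.
@dataclass
class ComponentRef:
    system: str        # administrative system that masters this product
    product_id: str
    list_price: float

@dataclass
class Bundle:
    """The bundle itself -- its components and discount terms -- is the
    only thing the MDM System masters directly."""
    bundle_id: str
    discount_pct: float
    components: list = field(default_factory=list)

    def bundle_price(self) -> float:
        total = sum(c.list_price for c in self.components)
        return round(total * (1 - self.discount_pct / 100), 2)

triple_play = Bundle("TP-01", discount_pct=15, components=[
    ComponentRef("landline-billing", "LL-200", 30.0),
    ComponentRef("mobile-crm",       "MB-550", 45.0),
    ComponentRef("broadband-oss",    "BB-100", 40.0),
])
print(triple_play.bundle_price())  # 97.75
```

No existing administrative system needs to change: each continues to master its own product line, while the MDM System owns the cross-system bundle.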
1.4.3.5 Process Innovation
SOA in general and MDM specifically are enablers of process change and innovation for the enterprise (we will go into more detail on the relationship between SOA and MDM in the next chapter). MDM delivers data management services to the enterprise that closely align with business tasks that manage master data. Therefore, the definitions of these services can be directly used in the conception and process modeling phase of a process innovation project. Additionally, the implementation of these services enables the enterprise to realize the process innovations much more quickly than was previously possible. Previously, process innovations would have resulted in extensive impact on the scattered master data elements in the enterprise. Where does the process retrieve its customer name and address information from? Where can it find the related products sold to those customers? How are both of these source systems organized so the data can be retrieved effectively? All of these questions and this complexity would have to be dealt with each time an innovation in business process was considered. Because the implementation of the MDM System has already resolved these issues, it is now easier to change existing business processes and support process innovation.
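The questions in the paragraph above map naturally onto a small, stable service interface. The following sketch shows such an interface with a stub implementation; the service names and signatures are invented for the example, not from any particular MDM product:

```python
from typing import Protocol

# Illustrative MDM data services whose names align with business tasks,
# so process models can reference them directly.
class PartyService(Protocol):
    def get_name_and_address(self, party_id: str) -> dict: ...
    def get_products_for_party(self, party_id: str) -> list: ...

class InMemoryPartyService:
    """Stub implementation; a real MDM System would resolve these calls
    against the master data store or back-end systems of record."""
    def __init__(self):
        self._parties = {"P-1": {"name": "Acme Ltd", "city": "Boston"}}
        self._holdings = {"P-1": ["gold-card", "savings-plus"]}

    def get_name_and_address(self, party_id):
        return self._parties[party_id]

    def get_products_for_party(self, party_id):
        return self._holdings[party_id]

# A new or changed business process consumes the same stable services,
# never the scattered source systems behind them.
svc: PartyService = InMemoryPartyService()
print(svc.get_products_for_party("P-1"))  # ['gold-card', 'savings-plus']
```

A redesigned process reuses `get_name_and_address` and `get_products_for_party` unchanged; the complexity of locating and joining the underlying sources was paid once, when the MDM System was implemented.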
1.4.3.6 Market Innovation
Marketing is focused on creating, winning, and retaining customers. Marketing innovations deal with the identification and development of new ways of achieving this, including new product designs or packaging, new product promotions, or new media messages and pricing. Successful marketing innovations depend heavily on the quality and timeliness of the market and enterprise data they use to do their analysis. This is where an MDM implementation can be very beneficial to these innovation initiatives. The MDM Customer Data Integration system contains the most accurate, up-to-date, and complete view of the current customers, vendors, and prospects. The MDM Product Information Management system contains the most complete view of the products available in the enterprise. Combined with data from other master data domains, this is a wealth of information vital to marketing analysis and market innovations. The data warehouse is another typical source of data consulted for this purpose. Because cleansed data from the MDM System is an excellent data source for a data warehouse, its usage for marketing innovations is complementary to that of MDM.
1.4.3.7 Supply Chain Innovation
As communications and transportation have rapidly advanced, opportunities for change in the supply chain have increased dramatically. The Internet has brought together suppliers and buyers that were previously unaware of each other and has opened world markets for even the smallest organizations. Internet advancements have also opened up the market for labor to be employed where it can be most economically sourced. All of these new possibilities offer opportunities for supply chain innovation. To implement these innovations, enterprises need to optimize their ability to switch between in-house and outsourced parts of their supply chain, cultivating the ability to quickly switch from one supplier to the next or from one distribution model to the next. When master data is not well managed, making such changes can lead to serious business errors. Well-defined and well-managed master data, combined with well-run MDM processes, allows for quicker implementation of supply chain innovations. New sources of master data can be incorporated in the MDM infrastructure quickly, and because all of the data will go through the same data quality processes and the same MDM business processes, the overall stability of the corporate master data won't be affected negatively. It is also much easier to supply master data to new elements within the supply chain requesting it. Because of its hub architecture, the number of changes that need to be applied to application interactions is much smaller than in a traditional network infrastructure where unmanaged master data exists in many different systems. In Figure 1.18, the addition of a new distribution channel (A) leads to fewer changes (dotted arrows) in the enterprise application infrastructure when an MDM System is present.
Figure 1.18 Supply Chain with and without MDM.
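The arithmetic behind the hub advantage in Figure 1.18 is straightforward: with unmanaged master data, every system may need a point-to-point feed to every other system, while a hub needs only one link per system. A back-of-the-envelope sketch:

```python
# Connection counts for n systems exchanging master data.
def point_to_point_links(n):
    # Every pair of systems may need its own interface.
    return n * (n - 1) // 2

def hub_links(n):
    # Each system connects once, to the MDM hub.
    return n

n = 10
print(point_to_point_links(n))  # 45 interfaces to maintain
print(hub_links(n))             # 10 interfaces to maintain

# Adding an 11th system (e.g., a new distribution channel) costs 10 new
# point-to-point links, but only 1 new link to the hub.
print(point_to_point_links(11) - point_to_point_links(10))  # 10
```

The worst case (a fully connected mesh) rarely occurs in practice, but even a partial mesh grows much faster than the hub's linear cost as systems are added.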
1.4.3.8 Accommodating Mergers and Acquisitions (M&A)
As we saw earlier, mergers and acquisitions are a common cause of the existence of unmanaged master data in an organization. The successful integration of data is heavily dependent on the data governance practices and the data quality standards of the participating organizations. If data governance is not practiced well within one of the participants, it is going to be very hard to identify the data sets that need to be consolidated. In many cases, much of the data is kept in places that IT is not even aware of, such as in spreadsheets or local databases. If the quality of the data is not adequate, the confidence in the data is low and consolidation is going to be problematic even when adopting an MDM strategy. More often than not, the anticipated cost of such a data consolidation effort is underestimated and partly to blame for a high number of failed mergers and acquisitions. The Boston Consulting Group estimated that more than half of the mergers and acquisitions between 1992 and 2006 actually lowered shareholder value.
1.4.3.9 Introduction of New Requirements
Business changes continuously, not just for the reasons described earlier, but often just to keep up with competitors and stay in business. In the change patterns described in this section, the role of an MDM System is to accommodate business change more easily and more rapidly within the enterprise, and to do so in a more controlled and governed manner.
First, let's consider changes to usage patterns and the user community. The user community that manages and retrieves master data changes over time. Common examples can be found in the many self-service Web sites where customers can change name, address, and phone number information online or through the integration of a vendor portal for a retailer. These self-service and portal-based usages add whole new communities of end users to the enterprise. Using an MDM System, these new user communities can be integrated into the total user community by assigning them the appropriate security rights and privileges, providing them with suitable user interfaces, and managing the additional workload by scaling up the MDM System to an appropriate level.
Management of master data is governed by many business rules—rules that determine data validity, entity lifecycles, decision-making processes, event handling, matching, and merging. Over time these rules tend to change. Some rules are relaxed, others are tightened, others are corrected, and still more change because of external influences (e.g., regulatory or legislative changes). When using an MDM System, these business rules are encapsulated within services, so service consumers should not need to change their integration logic as rules change. Also, because business rules are componentized inside the MDM System, they are easy to change. Such changes require appropriate governance, as we will describe later in this book.
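One way to picture componentized business rules is a small rule registry behind a single stable validation service: rules can be added, tightened, or relaxed without any change to the consumers of the service. The rules below are invented examples:

```python
# Illustrative rule registry inside the MDM System.
rules = {}

def rule(name):
    def register(fn):
        rules[name] = fn
        return fn
    return register

@rule("postal_code_present")
def postal_code_present(record):
    return bool(record.get("postal_code"))

def validate(record):
    """The stable service interface consumers integrate against:
    returns the names of the rules the record fails."""
    return [name for name, check in rules.items() if not check(record)]

print(validate({"name": "Acme"}))  # ['postal_code_present']

# A regulatory change tightens the rules -- no consumer code changes:
@rule("country_present")
def country_present(record):
    return bool(record.get("country"))

print(validate({"name": "Acme", "postal_code": "02134"}))  # ['country_present']
```

The consumer always calls `validate`; which rules run behind it is a governed, internal concern of the MDM System.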
It is hard to predict future usage patterns of any IT system, and MDM is no exception. Thus, an MDM System must allow for new services to be built that address new requirements without disturbing any of the existing integrations. It is important that the MDM System is able to quickly support such new services to reduce IT project implementation time. An MDM System typically contains tooling to create new service definitions augmenting the default set of services provided with the product license.
Few products change as frequently and dramatically as software products. In an enterprise context, there is a never-ending stream of new applications, new versions, new releases, and new fix packs of all of the software products being used within the organization. These must fit into the IT infrastructure with minimal interruption to the business. An MDM System can help accommodate some of these types of changes in a number of ways. First, an MDM System must respect backward compatibility. In other words, it must make sure that newer versions of the MDM software, with new and improved functionality, can be introduced without affecting existing integrations with the rest of the IT infrastructure. The new system must support the existing services, and the existing integrations and data, in order to not disrupt the business after upgrade. Secondly, MDM Systems typically run on a stack of other software products such as database management systems (DBMS), application servers, and messaging middleware. The MDM System isolates the end user of the services from the details of the underlying infrastructure. Clients of the system should not require software upgrades if an underlying stack component is upgraded. Thirdly, MDM Systems typically run on a variety of different hardware platforms. This enables the enterprise to select the platform most suitable for its needs. Infrastructure stack dependencies can become opportunities if the same MDM System can run on Microsoft® Windows® and IBM z/OS® mainframes. Finally, the usage of SOA architecture (see Chapter 2) with service-based interfaces isolates the MDM users and client software from the details of the underlying implementation and related changes.
As we saw earlier in the discussion on mergers and acquisitions, MDM can play an important role in bringing data from the participating companies together into one managed location for master data. But adding additional sets of data is not only related to mergers and acquisitions. In many cases, enterprise MDM enablement is phased in by addressing only one subset of the applications and their data in the enterprise at a time. After an initial load involving a few enterprise systems, the rest is phased in iteratively over time. Even in relatively mature deployments, batch loads into the MDM Systems are fairly common.
Facilitating quick and accurate migrations by creating initial or delta loads is an important capability of an MDM System, allowing the business to leverage the advantages of MDM more rapidly and to react efficiently to changing environments.
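The essence of a delta load can be sketched as comparing a new source extract against the records already in the MDM System and emitting only the inserts and updates. The record shapes and keys below are invented for the example:

```python
# Illustrative delta-load computation for an incremental batch load.
def compute_delta(mdm_snapshot, source_extract):
    inserts, updates = [], []
    for key, rec in source_extract.items():
        if key not in mdm_snapshot:
            inserts.append(rec)       # record is new to the MDM System
        elif mdm_snapshot[key] != rec:
            updates.append(rec)       # record exists but has changed
    return inserts, updates

mdm = {"C-1": {"name": "Acme", "city": "Boston"}}
extract = {
    "C-1": {"name": "Acme",   "city": "Cambridge"},    # changed
    "C-2": {"name": "Globex", "city": "Springfield"},  # new
}
inserts, updates = compute_delta(mdm, extract)
print(len(inserts), len(updates))  # 1 1
```

In a real deployment each insert and update would still flow through the same matching, data quality, and governance services as online changes, so batch loads do not degrade the quality of the master data.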
Conclusion
MDM is a broad subject that touches on many of the concepts of enterprise information architecture. MDM strives to untangle and simplify the complex systems that have evolved to manage core business information by logically consolidating this information into managed yet flexible MDM Systems. Acting as either a system of record or a system of reference, MDM Systems can provide authoritative data to all enterprise applications.
As we have described throughout the chapter, successful MDM Systems:
- Provide a consistent understanding and trust of master data entities
- Provide mechanisms for consistent use of master data across the organization
- Are designed to accommodate and manage change
These are the key principles of MDM that we will continue to detail throughout the remainder of the book.
The business drivers behind MDM are compelling—from regulatory compliance to improving the responsiveness of an organization to change. By providing authoritative information as a set of services, MDM is also a key enabler for broader enterprise strategies, such as SOA. The following chapter will dive into the details of SOA and the role of MDM Systems in an SOA environment.
This was first published in July 2008