Data quality management roadmap

Jeff Pettit explains how to reap the benefits of highly productive data for years to come.

This article originally appeared on the BeyeNETWORK

In the seventies, a small integrated electronics company started in California’s Bay Area. Although its technology for fitting more and more integrated circuits into a smaller space eventually allowed room-sized computers to shrink into today’s laptops, the same exponential growth of information became a cancerous tumor threatening the company’s very ability to make good decisions with quality data. Something as trivial as how the founders created part numbers for their products, and for the components that went into those products, eventually grew into a stifling roadblock to managing product decisions.

In the beginning, putting embedded meaning in the part number seemed harmless. For example, the first digit was the model number and the third digit was the speed grade, helping the customer differentiate the models and options available. However, when this “intelligent” product code intended for the customer became the internal identifier for ordering materials, manufacturing and shipping thousands of products, the cat was out of the bag. Ironically, the world’s first computer-on-a-chip company was stymied trying to represent thousands of permutations of form, fit and function in a 25-digit part number. This misguided path eventually allowed many attributes unrelated to form, fit and function, such as where the product was manufactured, to creep into the part number without control.
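The failure mode can be sketched with a toy example. All codes, positions and function names below are invented for illustration; they are not the company’s actual scheme:

```python
# Toy illustration of an "intelligent" part number whose fixed positions
# carry embedded meaning. Every code and position here is a hypothetical
# stand-in, not the company's real numbering standard.

def model(part_number: str) -> str:
    # Position 1 encodes the model family.
    return part_number[0]

def speed(part_number: str) -> str:
    # Position 3 encodes the speed grade.
    return part_number[2]

# Once a loosely managed standard drifts, two groups can read the same
# position differently -- the kind of 13th-position conflict described
# in this article:
def plant_for_group_a(part_number: str) -> str:
    # Group A treats position 13 as the manufacturing plant.
    return part_number[12]

def revision_for_group_b(part_number: str) -> str:
    # Group B treats position 13 as the revision letter.
    return part_number[12]
```

Both groups’ parsers “work,” yet they report contradictory facts about the same part, and any change to an encoded attribute forces a new part number for a physically identical product.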

So where did this end up? They ended up with hundreds of part numbers for one real-world product and thousands of parts in a simple bill of materials. Different groups commandeered the loosely managed standard so that, for example, the 13th position of the part number meant something different to one group than to another. This broke down communications between business groups, blocked progress in advanced planning techniques, and resulted in millions of dollars wasted on bad decisions and millions more spent trying to work around the problem. The company had inadvertently dug a hole so deep that it would take nearly a decade to climb out.

The information used to represent a company’s product structure is the very backbone of any manufacturer. It became apparent that fixing this would take strong information management techniques and governance structures that would outlast any executive or organizational structure. This was not something that could be fixed by this year’s initiative, a new business intelligence (BI) tool or even a large enterprise resource planning (ERP) vendor or package.

What did the first computer-on-a-chip company do? Like the actual product that founded the company, they realized that the information about the product needed to be well designed and controlled. This new blueprint for the product structure not only needed to be highly specified, but it also would be used to drive policy, help decision making and serve as a standard guide to reconstruct the backbone of the company.

This was not easy. Even the most basic disciplines needed to rebuild the company’s product structure were not in place. Vertical organizations lacked the scope of authority to manage this truly enterprise-wide entity. To compound the problem, organizations could come and go on a yearly basis. IT was preoccupied with implementing the next ERP system, and the business was busy compensating for bad decisions made with poor quality information.

Finally, it took a non-partisan, cross-organization volunteer team founded on the idea that the true requirements for the actual life cycle of the product and all of its parts could be established, and that this blueprint would guide all future reconstruction and usage.

Representatives from each part of the company’s value chain participated in defining the product structure. This was a true horizontal team spanning the entire company: raw material purchasers, factory workers, packagers and distributors, along with those who planned each stage of the product’s life cycle from design to customer support, from strategic to tactical.

Once the design was finished, the next step was establishing the authority to make sure it was followed year after year. For an effort this large, it took multiple executive vice presidents to agree to and sign not only the standard guiding blueprint, but also the policy governing its construction and resource funding.

The following list exemplifies the magnitude of the change required to restructure the company’s product information:

  • Cross-functional information design teams

  • Standard definitions and business rules for the top 80% of the entities and attributes, signed off by value chain executives

  • Standard conceptual and logical information models, signed off by value chain executives

  • Executive agreement for all current and future efforts to manage product structure information to adopt the new design and policy

  • Governing processes used to control yearly funding of reconstruction efforts

  • Governing processes to control the disparate attempts to solve product structure issues

  • Multiyear programs to migrate planning systems one at a time

  • Multiyear programs to migrate execution systems one at a time

  • Cross-functional governing teams to judge ongoing compliance to the design

  • Educational classes on the new processes, policies and governing models

Key Learnings

  • Master data such as your company’s product structure is used horizontally across many parts of the company. Build an enterprise-strength, cross-functional set of design requirements with all the key stakeholders, including terms, definitions, business rules and authorized sources.

  • Large efforts to redesign your company’s master data will take a governing process that outlasts any organizational structure. You will need senior executive support and policy written at the same level as the company’s security, hiring or financial policies.

  • The design must be widely known and accepted across the business.

  • Turn independent efforts to solve individual silos’ problems into incremental efforts that follow the single master design for your master data.

  • Quality master data does not come from after-the-fact reverse-engineering efforts to homogenize or harmonize disparate sources; it ultimately comes from good, solid cross-functional designs and a means to realize the ultimate sources with the least amount of time and money.
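These learnings can be sketched in code: a meaning-free identifier plus explicitly modeled attributes governed by shared business rules. The field names and rules below are illustrative assumptions, not the company’s actual standard:

```python
# Hedged sketch of master data designed in rather than reverse-engineered
# out: an opaque surrogate key, with attributes and business rules modeled
# explicitly. All names and rules here are hypothetical examples.
from dataclasses import dataclass

@dataclass(frozen=True)
class Product:
    part_id: str      # opaque surrogate key; carries no embedded meaning
    model: str
    speed_grade: str
    plant: str        # non-form/fit/function data lives in attributes,
                      # never in the identifier

# One standard, cross-functionally agreed business rule (example only).
ALLOWED_SPEED_GRADES = {"A", "B", "C"}

def validate(product: Product) -> list:
    """Return rule violations; an empty list means the record conforms."""
    errors = []
    if product.speed_grade not in ALLOWED_SPEED_GRADES:
        errors.append(f"unknown speed grade {product.speed_grade!r}")
    if not product.part_id.isdigit():
        errors.append("part_id must be a meaningless numeric surrogate")
    return errors
```

Because the identifier carries no meaning, changing where a part is built updates one attribute instead of forcing a new part number, and every group applies the same signed-off rules instead of its own reading of position 13.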

This truly enterprise approach to enterprise data finally enabled us to use advanced planning solutions for long- and short-term planning, aligning supply and demand across the supply chain and creating a cleaner view of the product from the customer’s perspective. It helped not only top-line revenue, but also bottom-line profit by reducing the number of systems and processes needed to handle the old, obsolete embedded-meaning part numbers. The company has even worked with a large ERP vendor to redefine its master data efforts to include the ability for an extendable enterprise design. This approach can be applied to all master data. This Fortune 500 company is now redesigning other master data subjects such as customer, supplier and employee. The model has also helped redefine transactional data, especially planning transactions such as forecasting, planning and scheduling.

True high quality information does not happen by accident or mutate from poor legacy designs. Design quality in now, and you will reap the benefits of highly productive data for years to come.
