SAVANNAH, Ga. -- A city where stylish new golf resorts mingle with historic architecture seemed a fitting backdrop for The Data Warehousing Institute (TDWI) to hold its first Master Data Management (MDM) Insight event. Attendees learned about new practices for aging IT infrastructures and heard about common problems that MDM early adopters have encountered -- some unexpected, some par for the course.
Many of the 104 attendees at last week's three-day event had little experience with MDM, though all plan to start an MDM implementation within the next year or two (a qualification requirement for attending the hosted, travel-expenses-paid conference). The event confirmed that MDM is still in the early-adopter stage, but the pace of MDM adoption is picking up significantly, according to Wayne Eckerson, conference co-chair and director of research for Renton, Wash.-based TDWI. There are many business drivers fueling MDM interest, he said, and a diverse array of ways to accomplish it.
"There's such a range of potential domains for MDM. The most popular are customer, product and supplier -- but we're also seeing interest in using MDM for financial hierarchies and organizational hierarchies, as well as data warehouse dimensions," Eckerson said. "But people are struggling to get the business to justify and fund these projects that most people 'in the trenches' view as vitally important to ensure the success of various business endeavors."
Master data management peer discussions revealing
Frustration around building a business case for MDM was indeed evident among many attendees, especially during the "peer-to-peer" sessions. After keynote sessions from industry luminaries Jill Dyché and Evan Levy, analyst presentations, panel discussions and MDM case studies, attendees were randomly assigned to small groups of about 20, each with a moderator and a few discussion topics. One group's conversation moved naturally from the difficulty of getting MDM projects off the ground when executives constantly reprioritize IT projects, to commiserating over the oft-heard, irksome contention that a department's "data is different and special" and shouldn't be part of an MDM initiative, to sharing ideas for getting started with MDM.
One organization getting ready to take the MDM plunge is South Carolina Federal Credit Union, based in North Charleston. Bonnie Karst Ciuffo, chief information officer, and Elizabeth Brown, vice president of planning, enterprise architecture and quality, and their team were originally working on a data warehouse evaluation, they said, but discovered that MDM would be an important part of that project. The event was well timed, Ciuffo said, since they'll be putting their requirements together by April, selecting a vendor by August, and planning to start an implementation in 2009. The credit union's main concern is keeping MDM costs under control, Ciuffo said, so she was glad to learn about common challenges.
"We don't like to be on the bleeding edge, but we like to be on the leading edge," she said. "Evidently, other people have already 'bled' -- hopefully, we won't have to suffer those wounds."
The underestimated master data management implementation challenge
One MDM challenge emerged consistently, according to TDWI's Eckerson. Cross-system analysis -- understanding the data destined for an MDM initiative -- is a huge time and cost sink, early adopters report. Case study presenter Barry Briggs, chief technology officer of Microsoft IT, reported that 10 people spent 100 days working on source system data analysis, while presenter Miguel Albrecht, director of the European Patent Office, said he had 60 people working on it. It's not really a new issue, Eckerson explained, but the task may be particularly underestimated in MDM implementations.
"It's similar to the problems we face in data warehousing," he said, "which is you don't really know what's in that source data, yet your goal is to reconcile and standardize it so you can make it available to other systems. In MDM, you need to do that work so you can reconcile semantics and create a golden record, or at least a link between records, so that other operational systems can access it."
This is especially challenging in MDM, Eckerson explained, because operational source systems tend to have just enough data to run adequately in their own silo, but that data proves to be inadequate or incomplete when organizations try to reconcile and match it with theoretically similar data in other systems. The good news is that data-profiling and data-discovery tools are available, which can make the job easier, Eckerson said. He added, however, that the problem seemed to be an eye-opener for many attendees, who hadn't fully considered the time and money investment required for source system analysis.
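The matching-and-merge work Eckerson describes can be pictured in miniature. The sketch below is purely illustrative and not any MDM product's behavior: it assumes two hypothetical source systems (a CRM and a billing system), matches customer records on a normalized email address, and applies a simple "first non-empty value wins" survivorship rule to produce a golden record. Real tools use fuzzier, multi-field match rules and richer survivorship logic.

```python
def normalize(value):
    """Trim and lowercase so superficial differences don't block a match."""
    return (value or "").strip().lower()

def match_key(record):
    """Match on normalized email; an assumed rule for this sketch only."""
    return normalize(record.get("email"))

def merge(records):
    """Survivorship: for each field, keep the first non-empty value seen."""
    golden = {}
    for rec in records:
        for field, value in rec.items():
            if field not in golden or not golden[field]:
                golden[field] = value
    return golden

def build_golden_records(*systems):
    """Group records from all systems by match key, then merge each group."""
    groups = {}
    for system in systems:
        for rec in system:
            groups.setdefault(match_key(rec), []).append(rec)
    return {key: merge(recs) for key, recs in groups.items() if key}

# Hypothetical silo data: each system has "just enough" for its own needs.
crm = [{"name": "Pat Smith", "email": "PAT@EXAMPLE.COM", "phone": ""}]
billing = [{"name": "Patricia Smith", "email": "pat@example.com",
            "phone": "555-0100"}]

golden = build_golden_records(crm, billing)
# The two records collapse into one entry keyed by the normalized email,
# with the missing phone number filled in from the billing system.
```

Even this toy version shows why source analysis dominates the cost: the match rule and survivorship order only work once someone has studied what each system actually stores, which is exactly the 10-person, 100-day effort the case studies described.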