The data management industry is being transformed by new concepts -- and familiar best practices -- according to those on the front lines.
Close to 1,000 data professionals and thought leaders gathered recently at the DAMA International Symposium and Wilshire Meta-data conference in Denver to celebrate the organization's 20th anniversary and share their best practices and thoughts about data management. The hot topics -- governance, universal data models, and semantics -- weren't necessarily what industry watchers might have expected.
DAMA members are information architects, data modelers, database administrators, and other data specialists, and the conference has a feel that's different from other industry events, said John Schley, DAMA International president and a data administration specialist with Columbus, Ohio-based Nationwide Mutual Insurance Company.
"It's more of a sharing of ideas and experiences, and learning from peers," Schley said. "Where other conferences have speakers and then people that attend, at DAMA it's hard to tell who is a speaker and who is not. There's a lot of peer-to-peer networking and sharing. It's a reality show meets conference."
Data governance is the new reality
Data governance and compliance are the new reality, many attendees said, changing the way they work. The emphasis on governance gives data management more visibility in the corporate world. Data quality is taken more seriously, data integration is a necessity, and security is an imperative, not a luxury, attendees said. A competitive global marketplace and laws such as Sarbanes-Oxley bring the promise of increased resources -- but also the pitfalls of higher stakes.
"Regulatory pressures mean that people who run businesses have more incentive to really understand their business," said Alan Swan, vice president of data management with Philadelphia, Pa.-based ACE Insurance Company.
"Sarbanes-Oxley and the other new laws holding executives responsible for data are really changing things," said Tom Kretz, a member of the Chicago DAMA chapter. "The business side is getting invested and interested, and once that happens, the funding is there."
Data modeling has new meaning
Some attendees are using newfound resources to go back to basics. Rather than new technology, their focus is shifting to best practices in architecture -- both for data systems and for organizations. The new Zachman Framework track this year attracted attendees interested in moving modeling from a technology discipline to a corporate philosophy. People were also very interested in universal data modeling, a concept championed by Len Silverston, founder of Castle Rock, Colo.-based Universal Data Models LLC and author of the best-selling book series The Data Model Resource Book, Volumes 1 and 2.
"There's pressure in data management to deliver faster," Silverston said. "People are realizing that we can't reinvent the wheel. I agree that businesses are very different, but the underlying data of most businesses have a lot of similarity. People are all dealing with contact information, accounting and budgeting information, and other common information."
Based on his experiences in the industry -- including a $150 million data modeling project -- Silverston has developed a set of universal data models and resources that includes 230 models covering everything from customer to product data. The models are both "generally applicable and holistic -- a wide-angle lens" for looking at an organization's data, he said. Companies might customize the models, but starting with a universal data model increases the longevity and quality of the design and enables common, consistent communication, Silverston explained. Models are also the foundation for integration practices such as master data management, and common, universal data models can streamline merger and acquisition activities, he added.
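To make the idea concrete, here is a minimal sketch (not taken from Silverston's published models) of the kind of generalization a universal data model encourages: instead of separate, duplicated "customer" and "supplier" entities, people and organizations become a single party structure, and context-specific notions become roles attached to a party. All names here are hypothetical illustrations.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a universal "party" structure: one generic
# entity for any person or organization, with context-specific notions
# like "customer" or "supplier" modeled as roles rather than as
# separate, near-duplicate entities.

@dataclass
class Party:
    party_id: int
    name: str
    roles: list = field(default_factory=list)

    def add_role(self, role_type: str) -> None:
        """Attach a context-specific role to this party."""
        self.roles.append(role_type)

# The same party can play multiple roles without duplicating its data.
acme = Party(party_id=1, name="Acme Corp")
acme.add_role("customer")
acme.add_role("supplier")
```

Because every project starts from the same generic structure, teams share a vocabulary and avoid redesigning the same entities from scratch -- the "don't reinvent the wheel" point Silverston makes above.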
"Universal data models help get projects started faster," said Tim Gattone, data warehouse architect with Englewood, Colo.-based Echostar Communications Corp.
"The concept of universal data modeling gives us the opportunity to reduce our enterprise to a single view. It also gives practitioners a common way of speaking to each other," said Karen Vitone, team lead for data architecture at Stamford, Conn.-based Pitney Bowes.
Semantics will add meaning to vast data stores
More data has been created in the last three years than in all the past 40,000 years, and total data will quadruple in the next two years, according to research from the University of California, Berkeley, said Stephen Brobst, chief technology officer of Dayton, Ohio-based Teradata, a division of NCR. That explosion of data requires the ability to store, secure and manage the physical data, DAMA president Schley said, but it also demands that the stored data be useful and meaningful.
"We've gotten to the point where business really needs value out of their data stores. They have invested a lot of resources and the payback is in semantics," Schley said.
"Semantic modeling enables code generation and minimizes time-consuming hand coding," said Diana Wild, organization secretary of the Metadata Professional Organization (a special interest group of DAMA).
Semantics is a set of best practices, technology and standards intended to help organizations lower data management costs and gain new insight by creating "formal, machine-understandable definitions" of business terms, explained Dave McComb, president of Fort Collins, Colo.-based Semantic Arts Inc.
In the semantic world, organizations use a small number of easily defined concepts to build formal definitions of meaning for more complex business terms, McComb explained. A company might start with a term such as "contract" and use that to build a definition for "customer." It's related to metadata, but rather than define such attributes as where data is stored, semantics defines what information means, McComb said. These declarative, machine-understandable descriptions ultimately help systems infer meaning about data relationships that aren't explicitly defined.
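A toy sketch of that idea, under assumed data and names (not McComb's actual methodology or tooling): rather than hand-labeling which parties are customers, an organization declares a formal definition -- a customer is any party that holds a contract -- and lets the system infer the classification from the declared facts.

```python
# Hypothetical illustration of a declarative, machine-understandable
# definition. The facts are simple records; "customer" is never stored
# anywhere -- it is inferred from the definition below.

contracts = [
    {"contract_id": 101, "party": "Acme Corp"},
    {"contract_id": 102, "party": "Acme Corp"},
]
parties = ["Acme Corp", "Beta LLC"]

def is_customer(party: str) -> bool:
    """Formal definition: a customer is a party to at least one contract."""
    return any(c["party"] == party for c in contracts)

# The classification is derived, not declared: Acme Corp is inferred
# to be a customer because contracts naming it exist; Beta LLC is not.
customers = [p for p in parties if is_customer(p)]
```

The payoff of the declarative style is that when the definition of "customer" changes, only the rule changes -- no stored labels need to be rewritten, which is the kind of inference over undeclared relationships McComb describes.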
"Semantics has the potential to become a $50 billion industry," McComb said. "It will lower the cost of information integration and information discovery, and enable a new generation of applications that are built around shared understanding and meaning."