Certified Data and the Certification Process for Financial Institutions

Legislative and regulatory initiatives are driving financial institutions to aggregate data and build data management infrastructures on a much larger scale.

This article originally appeared on the BeyeNETWORK

Basel II, Basel IA, Consolidated Supervised Entities, the Bank Secrecy Act, Gramm-Leach-Bliley, the USA PATRIOT Act and Sarbanes-Oxley are just some of the legislative and regulatory initiatives driving financial institutions to aggregate data and build data management infrastructures on a scale not previously conceived. Predictably, more such initiatives will be passed in the future.

In many of my articles, I’ve stressed the virtues of a robust data quality architecture and processes. In this month’s article, Alex Le (Principal, Knightsbridge Solutions) and I explore these activities further. We believe that success hinges on a concept called “certified data” and the ability to certify data at various levels according to use and risk.

Certified data is defined as data that has been subjected to a structured quality process to ensure that it meets or exceeds the standards established by its intended consumers. Such standards are typically documented via service level agreements (SLAs) and administered by an organized data governance structure. The following chart is a simplified, high-level view of the certification process.

Figure 1: Simplified view of the certification process.

The Certification Process
There are a few key ideas embodied in this working definition. First, certified data is an extension of data quality. Second, certification is not a binary standard implemented uniformly across the entire enterprise. Third, the quality of data can be measured; therefore, standards (SLAs) can be set so that data is certified based on its intended use.
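
To make these ideas concrete, here is a minimal sketch in Python of how an SLA-based certification check might look. The attribute names, intended uses and thresholds are hypothetical illustrations, not drawn from any particular regulation:

```python
from dataclasses import dataclass

@dataclass
class QualitySLA:
    attribute: str     # e.g., "completeness" or "validity"
    intended_use: str  # the consumer or initiative the SLA serves
    threshold: float   # minimum acceptable score, from 0.0 to 1.0

def is_certified(measured_score: float, sla: QualitySLA) -> bool:
    # Data is certified for an intended use when its measured quality
    # meets or exceeds the threshold agreed in the SLA.
    return measured_score >= sla.threshold

# The same data may be certified for one use but not another:
aml_sla = QualitySLA("completeness", "AML reporting", threshold=0.99)
marketing_sla = QualitySLA("completeness", "marketing analytics", threshold=0.90)

score = 0.95  # hypothetical measured completeness of a customer data set
print(is_certified(score, aml_sla))        # False -- not certified for AML
print(is_certified(score, marketing_sla))  # True  -- certified for marketing
```

Note how certification is a relation between data and a use, not a single enterprise-wide flag.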

These are indeed powerful concepts. The certification process enables organizations to reduce the investment required to source data for all legislative and regulatory initiatives, now and in the future. It also holds out the promise that existing technology investments (tools and infrastructure) can be leveraged and scaled, reducing investment further.

Data Certification in an L&R-Compliant Arena
Implementing a data infrastructure to support any of these legislative and regulatory obligations can be a daunting challenge, considering the complexity of the data sources and the quality criteria required for certification. More often than not, the initial legislative and regulatory data warehouses and data marts failed because inadequate certification criteria left the data without traceability, depth or reliability.

Rough translation: Many efforts are kicked off with the idea that the requisite data can be sourced from existing, intermediate data stores used today to run the business. In addition, some believe that sourcing via COBOL or SQL extracts into task-specific tools (anti-money laundering, for example) or into a robust reporting tool (SAS, Cognos, Business Objects, etc.) is a rapid, cost-effective solution. Unfortunately, this turns out to be a clear road to non-compliance.

The main problems financial institutions face in creating compliant legislative and regulatory data stores are time and transparency. Given the sheer number and diversity of data sources required for each initiative, the ability to trace data all the way back to the ultimate source system, and to track all changes along thousands of data streams, requires more planning, infrastructure and effort than initially thought. Management often finds it hard to believe that most of their successful operational systems are neither internally consistent nor compliant with the new legislative and regulatory directives. Many of the data sources are survivors of a previous era: flat files, network databases, hierarchical databases. These systems are frequently undocumented, and their design and development teams have long since left the organization.
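
A minimal sketch of the kind of lineage metadata that makes data traceable back to its ultimate source system. The structures and field names are our illustrative assumptions; in practice this metadata would live in a metadata repository or lineage tool:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class LineageStep:
    system: str          # system or data store the data passed through
    transformation: str  # what was done to the data at this step

@dataclass
class DataElement:
    name: str
    ultimate_source: str                          # original system of record
    lineage: List[LineageStep] = field(default_factory=list)

# Hypothetical trace of a balance figure from a legacy flat file
# through staging into a regulatory data mart:
balance = DataElement("customer_balance", ultimate_source="legacy flat file")
balance.lineage.append(LineageStep("staging area", "type conversion, currency normalization"))
balance.lineage.append(LineageStep("regulatory data mart", "aggregation by customer"))

for step in balance.lineage:
    print(f"{step.system}: {step.transformation}")
```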

Moreover, external data sources are increasingly being brought into the legislative and regulatory arena to supplement the institution’s data. These sources arrive in various formats and must be matched with internal sources. As the number of systems needing to be integrated grows, the complexity of the integration process and the requisite transparency increase rapidly. Consider the following figure:

Figure 2: Data conflicts emerge as the number of systems needing to be integrated grows.

As the diagram above demonstrates, the complexity of integrating and tracing data lineage across many different data sources grows far faster than the number of sources, quickly becoming unmanageable.
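
One common back-of-the-envelope way to quantify this: with n systems integrated point to point, the number of potential pairwise interfaces grows as n(n-1)/2, while a shared, certified staging layer needs only n feeds. The hub-and-spoke comparison is our illustration, not something prescribed by the figure:

```python
def point_to_point(n: int) -> int:
    # every pair of systems may need its own interface
    return n * (n - 1) // 2

def hub_and_spoke(n: int) -> int:
    # each system feeds one shared, certified staging layer
    return n

for n in (5, 20, 100):
    print(f"{n} systems: {point_to_point(n)} point-to-point interfaces "
          f"vs. {hub_and_spoke(n)} feeds to a shared hub")
# 5 systems: 10 interfaces vs. 5 feeds
# 20 systems: 190 interfaces vs. 20 feeds
# 100 systems: 4950 interfaces vs. 100 feeds
```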

Getting a Handle on the Solution 
Earlier, we postulated that certifying data appropriately for its intended use is crucial to success in the legislative and regulatory arena, both now and in the future. We would further suggest that the certification of data, for all functions within a financial institution, represents the single best practice within enterprise data and information management today.

To help clarify how all data management pieces fit together, let’s review the following Process Formula and Construct:

Figure 3: Image of the certified data formula.

The Process Component

We can then break down the Process Component into the following:

Figure 4: Image of the broken down process component.

Within this process, we can use automated tools to examine data from various perspectives, check its quality and evaluate its level of certification.
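
For illustration, here is a minimal sketch of two such automated checks, completeness and format validity, run over a toy record set. The rules and data are hypothetical, standing in for what a commercial profiling tool would do at scale:

```python
import re

records = [
    {"account_id": "ACC-1001", "balance": "2500.00"},
    {"account_id": "", "balance": "1200.50"},      # missing key value
    {"account_id": "ACC-1003", "balance": "n/a"},  # invalid format
]

def completeness(rows, field):
    # Share of rows where the field is populated.
    return sum(1 for r in rows if r[field]) / len(rows)

def validity(rows, field, pattern):
    # Share of rows where the field matches an expected format.
    return sum(1 for r in rows if re.fullmatch(pattern, r[field])) / len(rows)

print(round(completeness(records, "account_id"), 2))          # 0.67
print(round(validity(records, "balance", r"\d+\.\d{2}"), 2))  # 0.67
```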

Attributes of Certified Data
As we have stated, certified data is data that has been certified for a specific intended use. The following are attributes that should be tested to determine the quality of data as part of the overall certification process:

Figure 5a: Various attributes of certified data.

The attributes of data quality all work together to create a detailed data quality matrix that can be measured and used as the basis of service level agreements for data.

Figure 5b: Table of certified data attributes.
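
To make the matrix idea concrete, here is a minimal sketch that compares measured attribute scores against SLA thresholds; the attribute names and numbers are hypothetical:

```python
measured = {"completeness": 0.97, "accuracy": 0.92, "timeliness": 0.99}
sla_thresholds = {"completeness": 0.95, "accuracy": 0.95, "timeliness": 0.98}

# Compare each measured attribute against its agreed SLA threshold.
report = {
    attr: "pass" if measured[attr] >= threshold else "fail"
    for attr, threshold in sla_thresholds.items()
}
print(report)
# {'completeness': 'pass', 'accuracy': 'fail', 'timeliness': 'pass'}
```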

The Governance Component
Data governance and data quality are often grouped together in a single data management plan. In fact, governance should be put in place to ensure that the definition, creation, maintenance and quality of data are managed well enough to satisfy its intended users. Doing so treats data as a high-quality, certified, shared business resource.

Figure 6: View of the Governance Process.

A successful implementation of the governance component provides a framework to increase the level of data integration not only within business units, but also across business units and at the enterprise level. Implementing this data certification framework will result in:

  • Improved data understanding, i.e., the development and sharing of definitions.
  • Increased data and process sharing, which will result in reduced data redundancy.
  • Improved data quality across the data life cycle.
  • Improved information access, i.e., the transformation of data into risk information.

Conclusion
To date, most organizations that have tackled legislative and regulatory data quality issues have tended to implement tactical solutions that improve quality within a single application or business process. While this approach may mitigate the problem for part of the organization in the short term, such limited initiatives generally fail to achieve broad, long-term data quality improvements. As a result, problems with historical data and reporting requirements recur over the long term.

Solving the data quality issue requires an enterprise-wide approach that addresses organizational, cultural, process and technology infrastructures. This is the “certification process.” While it may seem a daunting task, it can be achieved on a phased basis; real legislative and regulatory data quality improvement demands a long-term view.

Corporate data is a key strategic asset, which means that merely collecting and storing data is not sufficient. Like any other asset, data must be managed; there are costs and benefits associated with it, and if the process is right, a large return exists. If data is not managed appropriately, however, the quality of the asset degrades, and the cost of maintaining the data comes to outweigh its benefits. The goal of legislative and regulatory data quality management is to provide the infrastructure to transform raw data into consistent, accurate and reliable corporate information. Once established, it must be maintained.
