This article originally appeared on the BeyeNETWORK.
Operational best practices in data governance are those best practices that are applicable during the ongoing practice of data governance rather than during the development phase. While a number of best practices listed in Part 1 of this two-part article also apply once the program is up and running – for instance, “Clear Communications” should be part of the initial program design, and then sustained across data governance efforts – operational best practices typically do not come into play until the data governance program is in operation.
Operational Best Practices
Create an Environment of Accountability
Accountability is a foreign – or at least rarely practiced – philosophy at many companies. Responsibility and ownership are discussed, but rarely is accountability assumed or assigned. Accountability is one of the foundational elements upon which data governance is built. Every participant in a data governance council or committee should be accountable for a set of discrete decisions. There should be clear ownership. For instance, data stewards are accountable for specific data. The data quality manager is accountable for the accuracy and meaning of data under governance. Without formalized accountability, there can be no governance.
Of course, accountability should also include measurement and reward. Every member of a data governance council should have his or her job description modified to include the accountability component of governance and some mechanism for compensating for successful outcomes.
Build a Functioning Data Governance Council
A data governance steering committee or council is the central decision-making body for data governance. Members are accountable for creating and enforcing policies and procedures, establishing governance processes, settling disputes over data, refining workflows and other data governance decisions. Without a formally assigned and approved council, there will be no coordination of the work of data governance.
Note that while creating a data governance council is not considered a best practice during the development phase of building the program, it is necessary for the long-term and sustained success of operational data governance.
Institutionalize Data Stewardship
Before data can be formally managed, stewardship of the data must be formally established. Data stewardship must be formally recognized and assigned in order to sanction the role and define its scope. Its definition will vary according to what data is most important to the organization.
Data stewards will be accountable for all aspects of the data under their control. They will establish data quality thresholds; monitor the cost of poor data quality; decide who should have access to the data, for what purpose and under what conditions; collect and verify all applicable metadata; and act as a representative of the data in resolving all usage conflicts.
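The threshold-setting responsibility above can be sketched in code. This is a minimal illustration, not a prescribed implementation; the field names, checks and minimum pass rates are hypothetical stand-ins for whatever a steward would actually define.

```python
# Hypothetical sketch: steward-defined data quality thresholds.
# Each rule pairs a validity check with the minimum pass rate the
# steward has set for that field.

def pass_rate(values, is_valid):
    """Fraction of non-missing values that satisfy the check."""
    checked = [v for v in values if v is not None]
    if not checked:
        return 0.0
    return sum(1 for v in checked if is_valid(v)) / len(checked)

# field -> (validity check, minimum acceptable pass rate)
THRESHOLDS = {
    "email": (lambda v: "@" in v, 0.98),
    "age":   (lambda v: 0 <= v <= 120, 0.99),
}

def evaluate(records):
    """Return the fields that fall below their quality threshold."""
    failures = {}
    for field, (check, minimum) in THRESHOLDS.items():
        rate = pass_rate([r.get(field) for r in records], check)
        if rate < minimum:
            failures[field] = rate
    return failures
```

Publishing the output of a check like this, per field and over time, also feeds the transparency and progress-measurement practices discussed later in this article.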
Intentionally Segregate Duties
Woe to the company that allows the corporate fox to mind the henhouse: in no case should a data steward also have responsibility for any aspect of managing business processes. Doing so puts the data steward in the untenable position of managing both the data and the process in which it was created. The same holds for data quality verification: data stewards should be involved in the tactics of data correction and cleansing, but the business should define the rules for its respective data sets. The penalties for doing otherwise have filled countless column inches in newspaper articles in recent years. As in corporate governance, data governance must support the segregation of duties in all respects.
Incidentally, corporate compliance programs fall under the same best practice. Those responsible for designing the compliance program should have no part in administering it, and vice versa.
Like good government, data governance should be driven from processes that have established effective checks and balances to ensure reasonable decisions and fair and even treatment of all stakeholders.
All of the processes that make data governance work – how committees operate, how compliance with policies and procedures is administered, how waivers are granted, how data quality is maintained – should be written using the principle of checks and balances.
Always Be Transparent
Everything data governance does should be done in the open. Indeed, the council should take the proactive step of publishing everything in such a way that no one has to search for artifacts or research decisions. The perception of a cabal or skunkworks operation can shut down an otherwise effective data governance program.
This best practice is probably nowhere more important than in the two critical areas of compliance verification and the management of data quality.
Publish the bad along with the good. No matter how egregious the data error or rule violation, bad news should be made public knowledge (at least within the company). Doing so may generate wrath in the short term but will, in the long term, foster trust. The mere act of publishing a problem will not create that trust by itself. What will is a governance program designed to find the problem in the first place, provide mechanisms to fix it, and ensure that the same problem does not occur again.
Transparency extends to the measurement of data governance progress. In order to manage data governance and demonstrate that it is generating the results expected of it, strong, well-designed progress measures need to be established. The results of the process of measuring the progress of governance should be made available to all stakeholders.
Address Data Quality Problems at the Source
It is always better to understand the root cause of data quality issues. Clearly, addressing problems at their source is not always possible, economically justifiable or politically palatable. It is, however, always the “best” place to address them. It should be the first place at which the attempt is made.
For departmental data governance programs, this may be a tall order because authority may not reach beyond the department. In such cases, every effort should be made to gain concurrence with whoever is responsible for effecting the source system fix. It is the responsibility of the members of the data governance organization, especially the affected data stewards, to bring whatever powers of persuasion they have to bear on the situation.
Make Metadata Part of Data Governance
As data management professionals know, collecting and organizing metadata to make it useful to all who need it is probably the most labor-intensive aspect of a data governance program. Because of the labor requirement, it is also the most frequently abandoned activity. This is unfortunate because running a data governance program without metadata is like building a house without a blueprint. It can be done, but the result in both cases will be less than optimal.
Without technical metadata, you cannot enforce naming conventions. Without lineage metadata, you cannot control usage and you cannot conduct an impact analysis on a proposed change. Without data quality and business rules, you have no idea of the accuracy of the data you are using and, therefore, should have less than total confidence in the conclusions being drawn from the data.
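The impact analysis mentioned above becomes mechanical once lineage metadata exists. The sketch below assumes a simple representation – a graph mapping each dataset to the datasets derived from it – with hypothetical dataset names; real metadata repositories are far richer, but the traversal idea is the same.

```python
# Hypothetical lineage metadata: each dataset maps to the datasets
# derived from it. An impact analysis for a proposed change is then
# a downstream traversal of this graph.

LINEAGE = {
    "crm.customers":          ["warehouse.dim_customer"],
    "warehouse.dim_customer": ["reports.churn", "reports.revenue"],
    "erp.orders":             ["warehouse.fact_orders"],
    "warehouse.fact_orders":  ["reports.revenue"],
}

def impacted_by(dataset):
    """Every downstream dataset affected by a change to `dataset`."""
    seen, stack = set(), [dataset]
    while stack:
        for child in LINEAGE.get(stack.pop(), []):
            if child not in seen:
                seen.add(child)
                stack.append(child)
    return seen
```

A change to a source such as `crm.customers` thus surfaces every report that would be affected – exactly the question a governance council must answer before approving the change.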
In short, without metadata, you are flying blind. It is time to bite the bullet and use data governance as the trigger to get metadata back on track.
Remember Change Management
Data governance is all about taking control of the problems surrounding data. One of the quickest ways of getting control over the environment is to implement change management. This should be a single system for the management of all changes, whether they are directly related to data or not. Processing all changes through a single change management system will ensure that all issues are processed in the same way, and any hidden data-related requirements will come out as changes are processed through the system.
Implicit in any well-designed change management system is an issue escalation and resolution process, and a feedback mechanism that keeps the originator of the issue informed about progress.
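The escalation and feedback mechanisms above can be modeled as a small state machine. This is a hypothetical sketch – the states, transitions and notification mechanism are illustrative assumptions, not features of any particular change management product.

```python
# Hypothetical change-management workflow. Every change, data-related
# or not, moves through the same states, and the originator is
# notified at each step (the feedback mechanism).

ALLOWED = {
    "submitted": {"approved", "rejected", "escalated"},
    "escalated": {"approved", "rejected"},   # escalation path
    "approved":  {"implemented"},
}

class ChangeRequest:
    def __init__(self, originator, description):
        self.originator = originator
        self.description = description
        self.state = "submitted"
        self.history = ["submitted"]

    def transition(self, new_state):
        if new_state not in ALLOWED.get(self.state, set()):
            raise ValueError(f"cannot move from {self.state} to {new_state}")
        self.state = new_state
        self.history.append(new_state)
        self.notify()

    def notify(self):
        # Keep the originator informed of progress.
        print(f"to {self.originator}: change is now {self.state}")
```

Because every change follows the same allowed transitions, disputes over data changes always reach the council through the same escalation path, and the history of each request remains auditable.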
Building and operating a data governance program rests on nothing less than an interlocking web of capabilities provided by the best practices described here. Leveraging established best practices is the only way to tackle the four forces of data volume, regulatory requirements, data security and the innovative use of data discussed in Part 1 of this article. As companies launch new data governance programs, they must pay conscious attention to these four forces. Failing to follow best practices will leave the resulting data governance program at risk of not serving the business as effectively as it should.