How To Establish An Effective Data Quality Policy
Moving Beyond Day-to-Day Data Cleansing
In the financial services industry, regulation on due process and fit-for-purpose data has grown increasingly prescriptive, and the risks of failing to implement a data quality policy can be far-reaching. In this article, Boyke Baboelal of Asset Control looks at how organizations can overcome these challenges by establishing and implementing an effective data quality framework consisting of data identification and risk assessment, data controls and data management.
Too many financial services organizations fail to implement effective data quality and risk management policies. When data comes in, they typically validate and cleanse it before distributing it more widely. The emphasis is on preventing erroneous data from reaching downstream systems. That is important, but by focusing on ad hoc incident resolution, organizations struggle to identify and address recurring data quality problems in a systematic way.
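The validate-and-cleanse step described above can be pictured as a set of screening rules applied to each incoming record. The sketch below is purely illustrative: the field names, thresholds, and issue labels are assumptions for the example, not a description of any vendor's actual API.

```python
# Minimal sketch of rule-based screening for an incoming price record.
# All field names and tolerances are illustrative assumptions.

def validate_record(record, prev_close=None):
    """Return a list of issue labels for one record; an empty list means clean."""
    issues = []
    # Completeness: required fields must be present and non-empty.
    for field in ("instrument_id", "price", "timestamp"):
        if not record.get(field):
            issues.append(f"missing:{field}")
    # Accuracy: a price, if present, must be positive.
    price = record.get("price")
    if price is not None and price <= 0:
        issues.append("invalid:price")
    # Plausibility: flag large day-over-day moves for review, not auto-rejection.
    if price and prev_close:
        move = abs(price - prev_close) / prev_close
        if move > 0.20:  # illustrative 20% tolerance
            issues.append("suspect:large_move")
    return issues
```

A clean record yields an empty list and can be distributed; anything else is routed to exception handling, which is exactly the ad hoc incident resolution the article cautions against relying on exclusively.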
To rectify this, they need the ability to carry out continuous analysis aimed at understanding their data quality and reporting on it over time. Very few organizations across the industry currently do this, and that is a significant problem. After all, however much data cleansing an organization does, if it fails to track what was done in the past, it will not know how often specific data items contained gaps or suffered completeness and accuracy issues, nor where those issues are most heavily clustered. Financial markets are in constant flux and can be fickle, so the rules that screen data need to be reassessed periodically.
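The kind of over-time analysis described here starts with simply logging every validation outcome instead of discarding it after the incident is resolved. A minimal sketch, assuming a flat in-memory log (a real deployment would persist this to a database or reporting store):

```python
# Minimal sketch of recording validation outcomes so recurring issues can be
# analyzed over time. The log structure and names are illustrative assumptions.
from collections import Counter

quality_log = []  # one tuple per detected issue: (run_date, instrument_id, issue)

def log_issues(run_date, instrument_id, issues):
    """Append each detected issue for one record to the historical log."""
    for issue in issues:
        quality_log.append((run_date, instrument_id, issue))

def issue_hotspots(top_n=3):
    """Count issues per (instrument, issue type) to show where problems cluster."""
    return Counter((inst, issue) for _, inst, issue in quality_log).most_common(top_n)
```

Aggregating the log this way answers exactly the questions posed above: how often a given data item had gaps or accuracy issues, and where those issues cluster, which in turn tells you which screening rules to reassess.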
Source: Disaster Recovery Journal