Financial services firms are in the early stages of implementing data quality programs focused on risk management and regulatory requirements as well as operational efficiency, according to a new survey sponsored by The Depository Trust & Clearing Corporation (DTCC).
The DTCC 2013 Data Quality Survey, conducted by Element22, a data management advisory, design and technology solutions firm serving the financial services industry, identified emerging data quality practices and showed how those practices can be used to create benchmarks against which future progress can be measured.
Data Quality Challenges
More than half of the firms surveyed said they would like a clearer sense of how data quality affects their business. They attribute this gap to the fact that many firms do not have a formalized data quality program, instead addressing data quality on an ad hoc basis.
The majority of firms surveyed currently use proprietary, in-house systems, indicating that no predominant technology solution dominates data quality work. The survey also found that an industry-standard index framework designed to establish data quality-specific benchmarks would benefit firms at the enterprise level and facilitate the development of other data management and data quality standards.
“Firms must continue to take a data-centric view which includes data management and, specifically, data quality programs to effectively manage risk and lower costs,” said John Yelle, DTCC Vice President, Data Services. “The DTCC 2013 Data Quality Survey is valuable not only in assessing where the industry stands in developing data quality programs across a variety of firms, but it also highlights the progress on the data management maturity curve.”
To read more about the data quality survey, visit: