
Apr 17, 2013 • DTCC Connection

New regulations require cleaner data

Mark Davies, General Manager and Head of Avox

The impact of ongoing efforts by financial regulators and firms themselves to monitor and mitigate risk has seeped into almost all areas of a firm’s operations, including the management and maintenance of data. The post-financial-crisis overhaul of global systems is prompting an audit of data, specifically the data firms hold about themselves and their counterparties or clients, known as business entity reference data.

This “data exploration” is being driven by the cumulative effect of several individual pieces of regulation, including the European Market Infrastructure Regulation (EMIR) and Solvency II in Europe, and the Dodd-Frank Act and the Foreign Account Tax Compliance Act (FATCA) in the US, all of which are likely to have an impact globally. The primary goal of these measures, with the exception of FATCA, is to improve risk management in the financial system.

Business entity reference data describes the legal structure of a firm or entity. It includes corporate hierarchies, registered address information, industry sector codes and company identifiers. It is used for Know-Your-Customer (KYC) activities in the banking and insurance industries, and for the settlement and reporting of trades across the financial markets. Aside from its use in the client onboarding process, interest in entity reference data has mainly come from those operating in the back or middle office. However, one of the repercussions of the new regulatory environment is a significantly broader interest in this data, from the compliance and risk departments all the way up to board members and investors.
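As a rough illustration of what such a record contains, the Python sketch below shows a minimal entity record. It is purely illustrative; the field names, sector code and sample identifier are assumptions, not an Avox or DTCC schema.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class EntityRecord:
    """Minimal, illustrative shape of a business entity reference data record."""
    legal_name: str                                  # registered legal name
    registered_address: str                          # registered office address
    country_of_incorporation: str                    # ISO 3166-1 alpha-2 country code
    industry_sector: Optional[str] = None            # e.g. a SIC or NACE sector code
    immediate_parent: Optional[str] = None           # next entity up the corporate hierarchy
    identifiers: dict = field(default_factory=dict)  # mappings to external company identifiers

example = EntityRecord(
    legal_name="Example Holdings plc",
    registered_address="1 Example Street, London",
    country_of_incorporation="GB",
    industry_sector="6419",                          # hypothetical sector code
    immediate_parent="Example Group Holdings Ltd",
    identifiers={"LEI": "5493001KJTIIGC8Y1R12"},     # code in LEI format, illustrative only
)
```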

A growing concern across the industry is the realisation that, all too often, business entity reference data is unclear, inaccurate and maintained using legacy processes, often manual ones, which cannot keep pace with or satisfy the demands of today’s financial markets. Records have become stale over time, compounded by the fact that entities may have undergone multiple changes to their legal structures which are not reflected in the information held by firms. Maintaining one version of the ‘truth’ is hard enough, but when you consider the complex system architectures that many firms run as a consequence of mergers and product or geographic silos, it becomes almost impossible to separate good data from bad.

The less-than-pristine quality of entity data in the financial markets is causing both reputational and operational harm. During the collapse of Lehman Brothers in 2008, it became apparent that the entity reference data held by firms was not current or useful enough to paint an accurate picture of market risk and exposure. As a result, regulators did not have the information required to identify the build-up of systemic risk prior to the default, and after the default counterparties to the bank were unable to quickly or accurately identify which of their trades were with Lehman group entities or other exposed parties.

New mechanisms are being developed to help monitor risk, including the Legal Entity Identifier (LEI), a unique identification system for parties to financial transactions. The creation of a global LEI system has been endorsed by the G20, and positive progress has been made towards its implementation. In the US, a provisional LEI solution known as the CFTC Interim Compliant Identifier (CICI) has been active since August 2012 for the reporting of interest rate and credit default swaps, a requirement under the Dodd-Frank Act. Similar reporting rules for these contracts will come into effect in Europe within the next six months under EMIR.
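To make the identifier concrete: an LEI is a 20-character alphanumeric code defined by ISO 17442, whose last two characters are check digits verified under the ISO 7064 MOD 97-10 scheme. The Python sketch below is a minimal, illustrative checksum validator, not an official LEI utility; the sample code used in the example is in LEI format but is not asserted to identify any particular entity.

```python
import re

def lei_checksum_ok(lei: str) -> bool:
    """Check the two trailing check digits of a 20-character LEI (ISO 17442).

    Letters are mapped to numbers (A=10 ... Z=35), the digits are concatenated,
    and the resulting integer must be congruent to 1 modulo 97 (ISO 7064 MOD 97-10).
    """
    lei = lei.strip().upper()
    if not re.fullmatch(r"[0-9A-Z]{20}", lei):
        return False
    numeric = "".join(str(int(ch, 36)) for ch in lei)  # '0'-'9' -> 0-9, 'A'-'Z' -> 10-35
    return int(numeric) % 97 == 1

# Example codes in LEI format (illustrative only)
print(lei_checksum_ok("5493001KJTIIGC8Y1R12"))  # True: check digits verify
print(lei_checksum_ok("5493001KJTIIGC8Y1R13"))  # False: check digits do not verify
```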

Firms are under increased pressure to make sure that the data they hold reflects the correct legal structure of their business and their counterparties, and the LEI helps in that effort. Currently, organisations use a variety of methods to gather, organise and manage entity identifier information across the huge number of de facto standards that exist today. This inevitably leads to duplications, inconsistencies, erroneous mappings and, therefore, increased risk.

Before the LEI is implemented and the raft of new rules and regulations comes into effect, firms should ensure that their data has been verified as both correct and meaningful, a process known as data validation, and then properly maintained on an ongoing basis. This is to ensure that firms, and ultimately the whole of the financial system, are making decisions on the basis of clean, correct and useful data.

In recent years, it has become incumbent upon regulators and participants in the financial services industry to make sure that the financial markets are underpinned by a robust infrastructure. An essential lesson from the crisis is that data is absolutely critical to these efforts, and that firms must take individual responsibility for their data. Many institutions, particularly larger banks, have already begun the process of analysing their current data architecture, determining where entity data is present, matching their internal records, and identifying quality issues within that data. But many have not, and for these firms, processes relating to business entity reference data will require attention, sooner rather than later.
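As a sketch of what that matching and issue-identification work can involve, the Python below reconciles internal entity records against a verified reference set using a crudely normalised legal name, flagging records that have no match or whose identifiers disagree. The record layout, field names and matching rules are assumptions for illustration, not a description of any firm’s actual process.

```python
import re

def normalise_name(name: str) -> str:
    """Crude normalisation of a legal name for matching purposes."""
    name = name.lower()
    name = re.sub(r"[^\w\s]", "", name)                                # drop punctuation
    name = re.sub(r"\b(ltd|limited|plc|inc|llc|gmbh|sa)\b", "", name)  # drop common suffixes
    return " ".join(name.split())

def reconcile(internal_records, reference_records):
    """Match internal records to verified reference records by normalised name,
    returning (legal_name, issue) pairs for unmatched or conflicting records."""
    reference = {normalise_name(r["legal_name"]): r for r in reference_records}
    issues = []
    for rec in internal_records:
        ref = reference.get(normalise_name(rec["legal_name"]))
        if ref is None:
            issues.append((rec["legal_name"], "no match in verified reference data"))
        elif rec.get("lei") and ref.get("lei") and rec["lei"] != ref["lei"]:
            issues.append((rec["legal_name"], "identifier disagrees with reference data"))
    return issues

internal = [{"legal_name": "Example Holdings PLC", "lei": "5493001KJTIIGC8Y1R12"},
            {"legal_name": "Orphan Trading Ltd.", "lei": None}]
reference = [{"legal_name": "Example Holdings plc", "lei": "5493001KJTIIGC8Y1R12"}]
print(reconcile(internal, reference))
# [('Orphan Trading Ltd.', 'no match in verified reference data')]
```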

Originally published on Thomson Reuters GRC. ©Thomson Reuters
