This year, regulatory reporting requirements introduced under the Dodd-Frank Act—impacting
financial institutions engaged in cross-border derivatives transactions with US counterparties—have
brought attention to the management of client data. The U.S. reporting requirements form part of the
global effort to reduce systemic risk and improve transparency in the derivatives markets, and in a
few short months, Europe will follow suit with similar but not identical rules that necessitate the
reporting of listed and OTC derivatives transactions to a trade repository. Notwithstanding some of
the technical nuances in the reporting mandates of the various jurisdictions, the increased focus on
trade reporting reveals thematic similarities that institutions should consider when
reviewing the impacts of, and action plans for, compliance. Namely, more than ever before, client or
counterparty data management is becoming a critical function of the new derivatives markets,
increasing the focus on this area well beyond the current client on-boarding and Know-Your-Customer
(KYC) processes. In the context of the increased burden of reporting requirements,
ineffective management of counterparty data has the potential to disrupt client relationships. To
mitigate this, the different functions within an institution—including the front, middle and back
office—will have to adopt a more integrated approach to client data management in order to
support the compliance function, ensure a successful and streamlined implementation and, in turn,
minimise the impact on the customer experience.
Industry Concerns
Broadly speaking, the industry has supported the G20 goal to make the financial markets more stable
and less risky by reforming the over-the-counter derivatives market and introducing new
requirements, such as reporting. However, for institutions, working out which specific rules they
must comply with has been, in truth, a challenging exercise. Industry participants
have already expressed concern about the complexity of the reporting requirements, a charge that
has been levelled at almost all areas of the Dodd-Frank Act and the European Market Infrastructure
Regulation (EMIR). The concept behind reporting, of course, is a simple one: know the economic details of
your trade and who you traded with; maintain accurate, up-to-date information about that trade and
your counterparty; and report this information to a trade repository for the purposes of monitoring the
build-up of systemic risk.
In practice, coordinating compliance with the requirements is much more complex. Each
regulator requires a minimum set of data fields to be reported; under EMIR, additional data sets
include extensive counterparty information with complex classifications, trading information and
collateral data. For reporting firms, the task of cross-referencing this data is daunting, but it urgently
requires attention. Significant upgrades need to be made to institutions’ data recordkeeping and
capture processes, and this, in turn, is prompting a wholesale examination of data quality and
maintenance across the industry.
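To make the cross-referencing task concrete, the sketch below shows one way an internal trade record might be mapped onto a minimal set of repository fields, with reporting blocked when counterparty data is incomplete. It is purely illustrative: the InternalTrade structure, the to_repository_record helper and the field names are hypothetical, and the real field sets mandated by each jurisdiction are far more extensive.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class InternalTrade:
    # Hypothetical internal representation of an OTC derivatives trade.
    trade_id: str
    counterparty_lei: str   # Legal Entity Identifier of the counterparty
    asset_class: str        # e.g. "IR", "FX", "CR"
    notional: float
    currency: str
    execution_date: date

def to_repository_record(trade: InternalTrade) -> dict:
    """Map an internal trade onto illustrative repository field names.

    The keys below are placeholders, not an actual regulatory schema;
    each regulator defines its own (and much larger) minimum field set.
    """
    missing = [f for f in ("trade_id", "counterparty_lei") if not getattr(trade, f)]
    if missing:
        # Incomplete counterparty data blocks reporting and must be remediated.
        raise ValueError(f"cannot report trade: missing {missing}")
    return {
        "UTI": trade.trade_id,                  # unique trade identifier
        "CounterpartyID": trade.counterparty_lei,
        "AssetClass": trade.asset_class,
        "NotionalAmount": trade.notional,
        "NotionalCurrency": trade.currency,
        "ExecutionDate": trade.execution_date.isoformat(),
    }
```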
Successful implementation of the reporting regulations will be determined by an institution's ability to
identify, classify and report its trades and clients quickly and accurately according to the reporting
criteria. The most efficient firms will work to enrich and enhance the required data with clients' or
counterparties’ legal entity data. The word ‘successful’ is important not only from a compliance
perspective, but also from the customer perspective, since reporting has the potential to affect the
customer experience and client relationship either negatively or positively. Essentially, if
regulatory reporting is not effectively managed, and if accurate information pertaining to a particular
client cannot be obtained in a timely manner and without duplicate requests, it has the potential to
disrupt relationships with that client. If reporting is well managed operationally across the entire
organisation, negative impacts can be mitigated and the client experience could even be improved,
since the relationship will be based on an efficient collection process and true, meaningful and useful
information.
Holistic Approach To Data Systems
The foundation of successful implementation of reporting is integration, wherever possible, on three
fronts: between departments and business functions; between processes and technology systems; and
with other areas of regulatory compliance.
Firstly, different functions within an institution must share their understanding of how regulations will
impact their use of data. Often, compliance functions wait until they have a clear understanding
of all aspects of a final regulation before communicating the details to other departments. This means
that the wider business, including operations departments, does not fully appreciate the size and scope
of the changes needed to respond to a regulation until very late in the day. An institution-wide
approach is needed to develop and execute an action plan that takes into account each
function's client data requirements, with the aim of producing the best regulatory response, ensuring
compliance and safeguarding the client experience.
Secondly, there must be integration between processes and technologies. Currently, institutions store
data and documentation in technology silos and across disparate operational processes. For example,
KYC data used in the on-boarding function, required for customer due diligence and for reporting
aimed at curtailing money laundering, is rarely shared with other parts of the
institution, which instead rely on their own data sources. Disparate systems and multiple views of a
single customer can increase the challenges of complying with new regulations, introducing
unnecessary complexity and costs, as well as frustrating clients with duplicate requests for
information. This approach is inefficient and increases the risk of error, and it needs to be addressed,
ideally ahead of implementation. The majority of firms will recognise this challenge, but all too
often aggressive regulatory deadlines and internal complexities push these organisations towards
tactical fixes that perpetuate the problem. Ultimately, those institutions that are able to produce
an integrated, single view of the customer benefit from a foundation for effective and efficient
compliance, risk management and reporting.
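As an illustration of what a single view of the customer might involve in practice, the sketch below merges duplicate client records from siloed systems into one consolidated record keyed on the Legal Entity Identifier (LEI). It is a minimal, hypothetical example: the record layout and the "latest update wins" merge rule are assumptions, not a description of any institution's architecture; real master-data systems apply per-attribute survivorship rules and manual stewardship.

```python
from collections import defaultdict

def consolidate_clients(records: list[dict]) -> dict[str, dict]:
    """Build a single view of each client from records held in separate silos.

    Each input record is assumed to carry an 'lei' key plus arbitrary
    attributes and a 'last_updated' ISO 8601 date string. For each
    attribute, the value from the most recently updated record wins.
    """
    by_lei: dict[str, list[dict]] = defaultdict(list)
    for rec in records:
        by_lei[rec["lei"]].append(rec)

    golden: dict[str, dict] = {}
    for lei, recs in by_lei.items():
        merged: dict = {}
        # Oldest first, so later (fresher) records overwrite stale values.
        for rec in sorted(recs, key=lambda r: r["last_updated"]):
            merged.update(rec)
        golden[lei] = merged
    return golden

# Example: the same counterparty as seen by two hypothetical silos.
silos = [
    {"lei": "5493001234567890AB12", "name": "Acme Capital Ltd",
     "domicile": "GB", "last_updated": "2012-03-01"},
    {"lei": "5493001234567890AB12", "name": "Acme Capital Limited",
     "kyc_status": "approved", "last_updated": "2013-01-15"},
]
print(consolidate_clients(silos))
```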
Finally, the Dodd-Frank Act and EMIR should be viewed within the context of the wider regulatory
reform programme, including FATCA, MiFIR and AIFMD. While these regulations are different in
purpose, scope and technical requirements, they all share a common core of information and
necessitate efficient and up-to-date client data processes. Institutions should therefore identify
opportunities to develop a flexible, scalable model in their data management systems and processes
that can address multiple regulatory requirements around a common framework.
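One way to picture such a common framework is to hold a single validated client record and derive each regime's required view from it, rather than maintaining parallel data sets per regulation. The outline below is hypothetical only: the record fields and the per-regulation extractors are invented for illustration and do not reflect the actual field sets of EMIR, FATCA or any other regime.

```python
from dataclasses import dataclass

@dataclass
class ClientRecord:
    # A single validated record from which regime-specific views are derived.
    lei: str
    legal_name: str
    jurisdiction: str
    entity_category: str   # e.g. "financial" / "non-financial"
    us_tax_status: str     # placeholder FATCA-style classification

# Each regulation consumes a different slice of the same common record;
# adding a regime means adding an extractor, not a new data set.
REGULATORY_VIEWS = {
    "EMIR": lambda c: {"CounterpartyID": c.lei,
                       "CounterpartyType": c.entity_category},
    "FATCA": lambda c: {"LegalName": c.legal_name,
                        "USTaxStatus": c.us_tax_status},
}

def report_view(client: ClientRecord, regime: str) -> dict:
    """Derive the (illustrative) field set a given regime requires."""
    return REGULATORY_VIEWS[regime](client)
```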
Quality Is Key
Data quality has attracted increased industry attention as a result of the cumulative effect of
forthcoming regulations. The problem of poor data has grown over time: duplication has become
excessive; records have gone dormant; and, at the same time, clients have undergone multiple changes to their
legal structures and company details that, subsequently, may not have been updated in the information
held by institutions. Maintaining one version of the ‘truth’ is hard enough but when you consider the
complex system architecture that many firms run as a consequence of mergers and product or
geographic silos, it becomes almost impossible to separate good data from bad. At best, this results in
a significant operational clean-up effort and, at worst, it leads to heightened business risk.
Therefore, in order to ensure compliance with regulatory requirements, data must be validated on an
ongoing basis. However, each institution’s ability to manage this process is being held back by the
need to focus resources on core business operations, tested by budgetary constraints, and undermined
by the sheer volume of data that institutions interact with. Moreover, data management, despite
perceptions, cannot survive on automated processing alone; unlike other areas of processing,
automation is not the holy grail of client data management. Instead, it requires dedicated research and
expertise, knowledge of where data can be sourced and checked, the ability to adapt to hundreds of
languages and, ultimately, human oversight to find the definitive answer. For these reasons, firms are
turning to external providers to help with the process.
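To illustrate why human oversight remains part of the loop, the sketch below shows a simple ongoing-validation pass that checks records automatically where it can and routes the rest to a manual review queue. Everything here is hypothetical: the one-year staleness threshold and the queue are illustrative stand-ins for a real data-stewardship workflow, though the 20-character length check does reflect the ISO 17442 LEI format.

```python
from datetime import date, timedelta

STALE_AFTER = timedelta(days=365)  # assumed refresh policy, for illustration

def validate_record(record: dict, today: date) -> list[str]:
    """Return a list of issues; an empty list means the record passes."""
    issues = []
    lei = record.get("lei", "")
    if len(lei) != 20:               # LEIs are 20-character ISO 17442 codes
        issues.append("malformed or missing LEI")
    last = record.get("last_verified")  # assumed to be a datetime.date
    if last is None or today - last > STALE_AFTER:
        issues.append("verification is stale")
    return issues

def run_validation(records: list[dict], today: date) -> list[dict]:
    """Automated pass: anything that fails goes to a human review queue."""
    review_queue = []
    for rec in records:
        problems = validate_record(rec, today)
        if problems:
            rec["issues"] = problems
            review_queue.append(rec)  # a data steward investigates these
    return review_queue
```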
Collaboration Will Help
Institutions should also look to their peers to ease the burden of regulation when it comes to the
process of collecting and reporting the granular information required by regulators. Sharing data
cleansing and maintenance efforts across the industry via a shared pool increases efficiency and
reduces costs. At the same time that institutions are outsourcing their requirements around data
cleansing, validation and maintenance they are, perhaps in equal measure, also turning to each other,
and embracing a collaborative approach to data management. Indeed, a recent report on wholesale
banking by Oliver Wyman and Morgan Stanley made the interesting observation that banks faced
with a drastic need to cut their cost bases should focus on more collaborative projects.
Integration and collaboration should be top of mind, since successful implementation will depend
on an institution's ability to organise itself internally so that every function impacted can access
up-to-date, accurate information about its clients and their status. And,
ultimately, ongoing compliance will be helped by having a single point of access to the validated
information needed to meet regulatory requirements—reducing the complexity of compliance and
allowing firms to concentrate on core business performance.
This article first appeared in Derivatives Week, 7 June 2013.