
Many Firms Can't Leverage Valuable Ops Data

Special by FTF News | 4 minute read | February 7, 2023

Many financial services firms can “only leverage a small portion of the data they possess on a modern data platform to generate insight — the rest is simply used for processing activities,” says Kapil Bansal, DTCC Managing Director, Head of Business Architecture, Data Strategy & Analytics, via an FTF News Q&A.

Bansal is speaking for DTCC about the provider’s new white paper, Data Strategy & Management in Financial Markets. The report focuses on why an evolution is coming in the exchange and management of financial data, including the data essential for securities operations.

Q: How would you describe the way data is exchanged and managed across financial markets and firms now? Why does this situation have to change?

A: As financial markets have become more electronic, data has moved center stage. Furthermore, increased regulatory requirements, the need for accurate and timely business insights, operational efficiency, and capital/balance sheet efficiencies are requiring market participants to think about data management more strategically and holistically.

In support of this, the industry has developed a broad range of methods, formats, and standards for data exchange across organizations.

Due to heterogeneous formats and disparate systems, most financial institutions only leverage a small portion of the data they possess on a modern data platform to generate insight — the rest is simply used for processing activities.

Meanwhile, several aspects of post-trade processes remain manual, which means potentially valuable data never gets analyzed or even stored in a modern technology platform.

These inefficiencies in data management cause operational risk at various points in the trade lifecycle, increasing the processing time and often requiring costly reconciliations. To harness and deliver upon the true value of data, these areas must be addressed.

Q: It seems as if Ops departments at financial services firms have historically not gotten involved in data procurement or distribution, as that would have been handled by another group. Is that dynamic changing? If so, why is it changing?

A: We believe that effective data management is a key theme across the entire value chain in capital markets, from price discovery to settlement and other post-trade processes.

Historically, data management challenges have been viewed and addressed in silos. This is changing. After all, data challenges in operations areas may result in financial impact and inefficiencies.

By effectively managing post-trade data, firms can benefit from cost savings and lower capital costs, as well as achieve enhanced data insights for front-office functions.

Q: In the past, the silos often represented the power hierarchy within a firm. In this respect, data was power because it could be withheld or controlled in a quid pro quo way. Is it possible that digitalized and interconnected marketplaces can change the old silo dynamic?

A: Silos lead to redundant costs and other operational and business challenges, such as reconciliation issues and the lack of a holistic view across business lines and entities.

Silos can also result in duplicative IT architectures that increase costs and impact resiliency. At the same time, data management and data quality become even more challenging due to the fragmented nature of silos, potentially resulting in regulatory risk for an organization.

To address these challenges, more and more firms are exploring digital technologies and cloud capabilities. However, the implementation of any fintech should be part of a broader business-driven strategy and not solely an IT-driven project.

Q: The report spotlights interconnected data ecosystems that could serve as a new infrastructure layer. Could you provide some examples of these data ecosystems and their connections? What would cause groups within a firm to share data from their legacy systems?

A: Several market participants are developing virtual, cloud-based “data sharing” capabilities or data “marketplaces.”

Organizations are investing in technology to eliminate internal data silos and to enable the provisioning of data to their clients via APIs [application programming interfaces] and standard data formats. Legacy systems modernization, while important, should be viewed as a longer-term initiative.

In the interim, business leaders can drive value by delivering enhanced data insights to their clients. Effective data management and data privacy are critical enablers for data sharing.
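The normalization step behind such API-based data provisioning can be sketched in a few lines. This is a hypothetical illustration, not DTCC's implementation: the field names, the two invented "legacy" systems, and the standard schema are all assumptions chosen to show how heterogeneous internal records can be mapped onto one consistent format before being served to clients.

```python
import json

# Hypothetical sketch: two internal systems describe the same trade with
# different field names (the heterogeneous-formats problem described above).
# A thin normalization layer maps each source record onto one standard
# schema before the data is provisioned to clients via an API.

STANDARD_FIELDS = ["trade_id", "isin", "quantity", "settle_date"]

def normalize(record: dict, mapping: dict) -> dict:
    """Map a source record's field names onto the standard schema."""
    return {std: record[src] for std, src in mapping.items()}

# Field-name mappings for two invented legacy systems.
LEGACY_A = {"trade_id": "TradeRef", "isin": "ISIN",
            "quantity": "Qty", "settle_date": "SettleDt"}
LEGACY_B = {"trade_id": "id", "isin": "instrument",
            "quantity": "units", "settle_date": "settlement_date"}

a = normalize({"TradeRef": "T1", "ISIN": "US0378331005",
               "Qty": 100, "SettleDt": "2023-02-09"}, LEGACY_A)
b = normalize({"id": "T2", "instrument": "US0378331005",
               "units": 250, "settlement_date": "2023-02-09"}, LEGACY_B)

# Both records now share one schema and can be serialized consistently
# in an API response.
payload = json.dumps([a, b])
```

Once records share a schema, downstream consumers integrate against one format instead of one per source system, which is the friction reduction the answer describes.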

Q: The report makes the argument that efficient data management, cloud-enabled capabilities and more automation of data management tasks will help firms speed up the development of new products. Could you provide an example of how that would work?

A: Cloud infrastructure and its business capabilities are rapidly maturing. Almost all organizations are investing in cloud capabilities and APIs.

Effective data management ensures data is maintained at the required quality benchmark and has the appropriate level of controls and privacy. Leveraging cloud capabilities and APIs, organizations can accelerate the build-out of data products.

Proofs of concept can be done at a much faster pace, and clients can integrate with less friction and lower development costs.

Q: Why will the greater acceptance of open source standards lead to operational efficiency?

A: Market participants typically interact with each other in an “all-to-all” model.

At the same time, there is significant activity across domestic and international markets, each with varying regulatory requirements. The result is often non-harmonized, non-standardized data sets.

In response, the industry and regulators are focused on achieving more consistent data standards. For example, in the derivatives trade space, we have seen regulators across jurisdictions focus on better aligning data standards to simplify trade reporting and deliver new insights.

These standards, once introduced, could help organizations streamline their trade reporting infrastructure leading to cost efficiencies as well as delivering increased transparency.
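A concrete way to see the streamlining benefit is validation: once every jurisdiction expects the same identifiers, one check can replace several bespoke ones. The UTI (Unique Transaction Identifier) and LEI (Legal Entity Identifier) are real harmonized data elements in derivatives trade reporting, but the code below is a simplified hypothetical sketch, not any regulator's or DTCC's actual validation logic; the format rules are reduced to basic pattern checks for illustration.

```python
import re

# Simplified illustration of validating harmonized data elements.
# LEI: 20 characters, 18 alphanumeric followed by 2 check digits
# (check-digit arithmetic omitted here for brevity).
LEI_RE = re.compile(r"^[A-Z0-9]{18}[0-9]{2}$")
# UTI: up to 52 alphanumeric characters (simplified).
UTI_RE = re.compile(r"^[A-Z0-9]{1,52}$")

def validate_report(report: dict) -> list:
    """Return a list of problems; an empty list means the report passes."""
    problems = []
    if not LEI_RE.match(report.get("reporting_lei", "")):
        problems.append("invalid or missing LEI")
    if not UTI_RE.match(report.get("uti", "")):
        problems.append("invalid or missing UTI")
    return problems

ok = validate_report({"reporting_lei": "5493001KJTIIGC8Y1R12",
                      "uti": "ABCDEF1234567890"})
bad = validate_report({"reporting_lei": "short", "uti": ""})
```

With aligned standards, a firm reporting into multiple jurisdictions maintains one validation path like this rather than a separate rule set per regulator, which is where the cost efficiency comes from.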

This article originally appeared in FTF News on January 27, 2023.

Kapil Bansal - Managing Director, Head of Business Architecture, Data Strategy & Analytics at DTCC