
Data Strategy & Management in Financial Markets

By DTCC Connection Staff | 3 minute read | January 18, 2023

Financial markets have long been all about data. Investors with better data – more complete, timelier, in more digestible and actionable form – have an advantage over those who are less well-informed. Data helps decision makers pursue alpha, generate insights or manage risk. It is essential to negotiating and confirming the terms of proposed trades, to executing transactions, and to settling them. Indeed, a huge part of trading strategies, risk management, compliance, and regulatory oversight depends on data about historical transactions and client activities.

The recent acceleration in the adoption of modern technologies (especially cloud), an increased willingness to collaborate, and evolving business demand all have the potential to drive large-scale change across the industry. Data will remain vital – but how it’s stored, organized and exchanged is changing rapidly.

Our new white paper, Data Strategy & Management in Financial Markets, examines pain points related to data exchange (i.e., how organizations make data available to one another) and data management (i.e., how an organization handles data), looks at how key technologies can address these pain points, and offers hypotheses about the future. The paper also discusses what data providers and data consumers should focus on to maximize benefits.

Assessing Current Pain Points

Based on our analysis and discussions with our clients and partners, DTCC believes that today’s data infrastructure is inefficient for four primary reasons:

  • Overlapping Standards: Data exchange is dictated by overlapping standards and formats, requiring a high degree of maintenance. These standards typically assume point-to-point communication and rely on asset class-specific, inflexible formats and bespoke data models, which limits the ability to explore the interlinkages of the data.
  • Missing Metadata: The way data is stored prevents users from exploiting it fully. Information in the form of metadata (i.e., descriptive data that captures attributes of the underlying data) is often missing or embedded in the specific data stores of applications, which significantly limits how broadly the data can be used and re-used in new ways (a minimal illustrative sketch follows this list).
  • IT & Operational Complexity: Today’s data infrastructure contributes to significant non-financial risk. The inefficiencies in the way data is exchanged and stored described above have contributed to IT and operational complexity, with substantial implications for operational risk (and cost).
  • Lack of Data Quality: Many data sets are not of sufficient quality to support decision making, let alone automated decision making. For all the reasons outlined above, data quality is often difficult to ascertain.
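
To make the metadata point concrete, here is a minimal sketch of what a machine-readable metadata record kept alongside a data set (rather than buried inside an application) might look like. The record type, field names and example values are hypothetical and introduced only for illustration; they are not drawn from the white paper.

```python
from dataclasses import dataclass, field
from datetime import datetime


# Hypothetical example: a minimal metadata record that travels with a data set,
# so consumers can discover, interpret and assess the data without digging into
# the source application.
@dataclass
class DatasetMetadata:
    name: str               # business-friendly data set name
    owner: str              # accountable data owner or steward
    description: str        # what the data represents
    schema: dict            # field name -> data type
    source_system: str      # lineage: where the data originates
    last_updated: datetime  # freshness indicator
    quality_checks: list = field(default_factory=list)  # checks applied before publication


# Illustrative catalog entry for a fictitious end-of-day equity trade data set.
trades_metadata = DatasetMetadata(
    name="equity_trades_daily",
    owner="post-trade-data-team",
    description="End-of-day equity trade records, one row per executed trade.",
    schema={
        "trade_id": "string",
        "isin": "string",
        "quantity": "integer",
        "price": "decimal",
        "trade_date": "date",
    },
    source_system="settlement_platform",
    last_updated=datetime(2023, 1, 17),
    quality_checks=["no duplicate trade_id", "price > 0", "valid ISIN checksum"],
)

print(f"{trades_metadata.name}: {len(trades_metadata.schema)} documented fields")
```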

Driving the Next Decade of Change

In the paper, we distill our views about the future for data exchange and data management into four hypotheses on trends that we believe will drive the next decade of change in how data is used in financial markets:

  • Accessible & Secure Data: Data users will have unprecedented flexibility in choosing what data to receive and how they receive it, breaking free from the constraints of exchanging fixed sets of fields at pre-defined time intervals. To enable this, data governance, privacy, and security will need to take center stage.
  • Interconnected Data Ecosystems: Industry participants will successfully free their own data from legacy systems and will not only master pooling it into their own data ecosystems, but, where useful and scalable, will connect them with others to create a new infrastructure layer. This will reduce duplication of data and allow for co-development of innovative data insights.
  • Focus on Insights: More efficient data management, rationalized data-related technology stacks (including cloud computing) and automation of routine data tasks will free up capacity to focus on deriving insights from vast stores of data. With the right tools in place, creating data products and insights will get simpler, not harder, and will require fewer specialized resources.
  • Open-Source Data Standards: We expect that the industry will continue to adopt more standard data models, i.e., shared ways to understand and describe data sets (see the illustrative sketch after this list). The most viable use cases will be in reference data and transaction reporting. The benefit would be less redundancy and better data quality across the financial industry.
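
As a rough illustration of what a standard data model could look like in practice, the sketch below defines a minimal reference-data record together with validation rules that producers and consumers could apply identically. The fields and checks are assumptions made for this example, not part of any existing industry standard.

```python
from dataclasses import dataclass


# Hypothetical example: a shared reference-data model. If producers and consumers
# agree on one description of a "security" record, each firm no longer needs its
# own bespoke mapping for the same fields.
@dataclass(frozen=True)
class SecurityReference:
    isin: str         # ISO 6166 international securities identification number
    issuer: str       # legal name of the issuing entity
    asset_class: str  # e.g. "equity", "corporate_bond"
    currency: str     # ISO 4217 currency code
    country: str      # ISO 3166-1 alpha-2 country of issue

    def validate(self) -> None:
        """Basic structural checks that a shared standard could mandate."""
        if len(self.isin) != 12:
            raise ValueError(f"ISIN must be 12 characters, got {self.isin!r}")
        if len(self.currency) != 3:
            raise ValueError(f"Currency must be an ISO 4217 code, got {self.currency!r}")


# Any two firms exchanging this record can validate it against the same rules.
record = SecurityReference(
    isin="US0378331005",
    issuer="Apple Inc.",
    asset_class="equity",
    currency="USD",
    country="US",
)
record.validate()
print("Record conforms to the shared model:", record.isin)
```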

Embedding the Foundations for Tomorrow

To enable these changes, the white paper suggests that institutions that produce and consume significant amounts of data embed key principles into their data operating models, including:

  • Establish Robust Foundational Data Management Capabilities: These include having a thorough understanding and catalog of data, breaking down data silos and implementing robust data quality practices.
  • Build Strong Data Governance: This includes the right set of data privacy and security standards to enable data collaboration with partners.
  • Evolve to Industry-Wide Data Models: Institutions will need to work together to establish trusted venues for experimentation and co-creation. Firms should explore where there is mutual benefit from collaborative data environments across firms and the industry to advance interoperability.

We are looking to build the future together with our clients and partners. It will take consultation and coordination. Many of the ideas here will be supplemented – and a few will probably be superseded – by insights we gain from our partners and clients, and through collaboration with them.

Kapil Bansal - Managing Director, Head of Business Architecture, Data Strategy & Analytics at DTCC
