Simultaneously, very little has changed, and almost everything has changed.
Since the launch of the Amsterdam stock exchange in 1611, trading in the shares of a company, a mechanism that was largely human-centric for centuries, has relied heavily on data. While the trading process has evolved significantly since this first incarnation of market structure and, today, is almost entirely technology-dependent, it still requires data created by the interaction of market actors in capital markets. Indeed, beyond the dramatic transformation in trade order processing over the last few decades of the modern capital markets era, and even recent frenzies around “meme” stocks, the quality, timeliness, diversity, and sourcing of data – market-oriented and otherwise – have never been more critical than they are right now.
But, at base, all market actors today – from analysts and portfolio managers to traders and risk managers within proprietary trading firms, hedge funds, and traditional asset managers – look to high-quality market data and sophisticated data management platforms for tactical and strategic competitive advantages.
Of course, this is all much easier said than done. Today’s intensely data-driven ecosystem presents challenges to market participants: How do they choose which datasets they ingest and from whom? How do they manage the huge volumes of data they acquire while extracting maximum value? And how do technologists make an increasingly complex and highly scaled tapestry of data more consumable for their internal and external clients?
Better Data, Better Results
Demand for high-quality and highly granular data is soaring. From interest rate movements on money market instruments to equity trading volumes by security to volumes of global credit derivatives contracts and many other examples, precise monitoring of market activity enables improvements in trading outcomes, risk management, and overall performance.
This demand for greater market transparency intensifies when market conditions are rapidly shifting and highly volatile. Access to high-quality data helps firms navigate the increasingly complex liquidity challenges that arise during “normal” periods and crises alike.
Traders Magazine spoke to Tim Lind, Managing Director, Head of Data Services at DTCC, about the growing importance of market data in shaping firms’ trading strategies.
Editor: Why is access to quality market data important given current market conditions?
Lind: Today’s market environment is characterized by volatile price movements, surges in volumes, and complex interactions across asset and product classes. In particular, high-volume days have become increasingly common. Early last year, such periods were driven by the uncertainties around Covid-19 and the pandemic’s economic repercussions. In 2020 and 2021, many high-volume days were caused by soaring retail activity, which was enabled by commission-free trading applications. Retail investors gained unprecedented access to frictionless, highly gamified U.S. equity trading platforms, and many believe the pandemic lockdowns further contributed to market volatility, with people working from home and, in many cases, having more disposable income.
This roller coaster of activity has increased the demand for reliable, empirically based market data that can help market participants better understand how forces like greater retail participation in equity and equity-linked securities trading can impact markets and lead to fundamental changes in market structure that institutional players will need to consider.
Editor: What are the implications for data generation of this retail activity, particularly on the equities side?
Lind: Prior to the shift to commission-free trading, the retail segment was estimated to have represented 10% of total market volumes. Today, some experts estimate that retail participation constitutes more than 25% of trading volumes.
The growth of the retail investor segment, and the corresponding increase in the trading of meme stocks, has not only encouraged volatility and changed how equity markets behave but also impacted the overall demand for data. We can see these changes in the trade data that DTCC processes: much higher volumes overall, increased trading of low-priced stocks, more odd lots, and higher off-exchange trading volume.
Today’s retail participants don’t behave like institutional investors, especially if they want to spark some degree of chaos for institutional short sellers. As was seen with GameStop earlier in the year, access to daily transactional data has become increasingly important to buy- and sell-side firms looking to parse the details of this retail trading activity and adjust their risk models accordingly.
The current retail activity also has implications for liquidity. Does retail represent a source of liquidity for institutions to tap into, or will it raise concerns about inaccessible liquidity in unlit markets and questionable best execution for retail investors? The answer to this difficult question must be fact-based and supported by market data.
Editor: How has technology affected equities trading?
Lind: I’m constantly amazed by how fast markets are evolving. Technology innovations and increased computing power are largely driving this rapid change. In trading, we see faster speeds, more precision, and unprecedented scalability.
Alphacution, a research firm focused on modeling and benchmarking impacts of technology on global financial markets, describes the latest wave as the fourth generation of the quantitative revolution in trading technology. The first generation brought options and data feeds; the second generation added pairs trading and exchange models; the third generation was HFT [high-frequency trading] and smart order routing. Now we have the commission-free platform effect: frictionless, commission-free, gamified – mainly based on a strategy of “mobile-first.”
The dissemination of market information through social media, combined with an exploding industry of alternative data sources that provide non-traditional insights into market sentiment and consumer behavior, also shapes the fourth generation. The analysts, traders, and portfolio managers who can anticipate and act on this turbulent mix of news, rumor, and trading activity are best poised for a strong performance – and they’re looking to authoritative market data to help them achieve it.
Editor: Precise empirical data is the starting point, but what else do firms need to extract maximum value from market data?
Lind: A robust and efficient data management program that feeds relevant, high-quality data into analytics and decision-making engines is essential. Getting data into a usable form, including integration with specific security identifiers, putting data into historical context, and developing innovative sentiment models are integral to any data management program.
Editor: But isn’t data management a considerable challenge, given the sheer volume of information now available from third-party suppliers?
Lind: Absolutely, it is. That’s why firms need to ensure they acquire their most critical datasets from verified trading or clearing activity across key asset categories – equities, fixed income, and derivatives. It’s important to seek out data from a reliable provider with a strong track record in data collection and delivery. One of the primary benefits of automating and digitizing back-office processes is that every event that occurs in the trading and settlement process leaves an electronic footprint. The benefits of this automation are now accruing to the front and middle offices, as trading, risk, and price discovery rely on the empirical trade observations initially captured by the back office.
Data needs to be delivered in formats that allow easy input into proprietary models and at frequencies – whether intraday, daily or weekly, depending on the asset class – that let users make timely trading and risk management decisions. The time value of data is becoming increasingly relevant as investors deploy technology to identify fleeting price differences at a massive scale.
Editor: DTCC has a growing data services business. What distinguishes your value proposition from competitors?
Lind: Our data business is built primarily on the objective of providing greater market transparency for investors and institutional financial services participants. DTCC’s unique role in processing transactions across multiple asset classes, coupled with the sheer scale of its operations, sets us apart. We processed $2.33 quadrillion of securities in 2020. Approximately half of the capital markets activity globally flows through DTCC’s services when you consider our footprint in the US, global OTC derivative reporting, and institutional trade confirmation between buy and sell side. From these processing activities, DTCC generates enormous volumes of data not available elsewhere.
We brand our trade observation products “Kinetics” because the term describes the forces of energy, motion, and mechanics of capital markets. For sophisticated investors and traders, trade observations captured in the Kinetics portfolio provide new insights into the market in terms of liquidity, valuation, risk, momentum, correlation, contagion, and concentration. We’ve leveraged the massive scale of trade observations to create a suite of data products designed to enhance users’ understanding of current and historical market activity.
In addition to being comprehensive, the data in DTCC’s data offerings is extraordinarily accurate. Trades must be matched between counterparties, novated by a clearinghouse, or settled in a depository before they are eligible for any data product. Data captured from other trade reporting sources is often duplicated, lacks sufficient granularity, or is so latent that it limits its utility for providing market insights.
Editor: What products are included in your transaction-based data suite?
Lind: The Kinetics portfolio currently includes five products and we’re soon adding a sixth. They’re built on intraday, daily, and weekly transaction information sourced from DTCC’s National Securities Clearing Corporation (NSCC), Depository Trust Company (DTC) securities depository, and Trade Information Warehouse (TIW) processing infrastructure for credit derivatives. We also make historical trade data available to enhance users’ long-term trend analysis and modeling.
Editor: What else do Kinetics products cover?
Lind: U.S. and global equities, short-term fixed income trading like commercial paper, and credit default swap trade activity. In the future, we will be looking to launch a new product targeting U.S. Treasury repurchase agreements [repos], greatly improving the information available on this fragmented and opaque market. In the coming months, we’ll also be expanding the portfolio to add more fixed income asset classes and new ETF data observations unique to DTCC services.
Kinetics products can help firms derive new market signals, refine risk factors, more efficiently compile market aggregates, and enhance price discovery across asset classes to support risk and investment strategy decisions.
Let’s take DTCC Equity Kinetics, for instance: before the start of each trading day – and throughout the day using a recently developed intraday snapshot – users can view, track, and analyze aggregate U.S. equities trading volumes by security for the most active brokers and for anonymous peer groups.
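The kind of aggregation described here, share volume rolled up by security and by broker, can be sketched in a few lines. This is purely illustrative: the record layout, symbols, and broker names below are hypothetical and do not reflect the actual Kinetics schema or delivery format.

```python
from collections import defaultdict

# Hypothetical trade records: (symbol, broker, shares).
trades = [
    ("GME", "BrokerA", 500),
    ("GME", "BrokerB", 1200),
    ("AAPL", "BrokerA", 300),
    ("GME", "BrokerA", 250),
]

def aggregate_volumes(records):
    """Sum share volume per (symbol, broker) pair and per symbol overall."""
    by_symbol_broker = defaultdict(int)
    by_symbol = defaultdict(int)
    for symbol, broker, shares in records:
        by_symbol_broker[(symbol, broker)] += shares
        by_symbol[symbol] += shares
    return dict(by_symbol_broker), dict(by_symbol)

per_broker, totals = aggregate_volumes(trades)
print(totals["GME"])                   # 1950
print(per_broker[("GME", "BrokerA")])  # 750
```

A production pipeline would of course work from the provider’s delivered files and identifiers, but the roll-up logic, per-security totals alongside per-broker breakdowns, is the same idea.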
The breadth of trading activity covered by the Kinetics suite and the integrity of our source data makes these products unique. It’s not surprising that DTCC has become the data provider of choice for a growing number of dealers, hedge funds, asset managers, and other firms engaged in trading and portfolio management.
Learn more about the DTCC Kinetics Suite of data and contact us today.
This article originally appeared in Traders Magazine on September 30, 2021.