Data lies at the heart of the financial markets, providing the fuel for efficient and successful operations. Investors and financial firms with access to the best quality data – in terms of its completeness, timeliness, and accuracy – have an advantage over those less well informed.

By GreySpark’s Elliott Playle, Research Analyst; Mark Nsianguana, Manager; and Rachel Lindstrom, Senior Manager

The increasingly digitised capital markets have opened a plethora of new market data channels, which can compromise data quality, control, and management across the industry unless they are well managed. Financial firms need to check the quality of this abundant market data and identify the risks and issues it poses, or face operational disaster. This is a challenge, as some financial instruments can generate thousands of market data points per day. In this article, GreySpark Partners, in partnership with ITRS, explores the importance of deploying a robust market data monitoring (MDM) framework.

As well as the data itself, firms must ensure the infrastructure supplying the market data is operating smoothly and without latency issues. Given the scale of this task, monitoring market data feeds across many different price points is by no means easy.

Historical Context

‘Open outcry’ trading was the mechanism for price discovery in most financial markets until nearly the end of the 20th century. Traders – whether trading for themselves or on behalf of clients – were physically present at a central location, publicly announcing their bids and offers. When a trader heard an attractive bid or offer, they accepted it, and the transaction price was then reported to the entire trading community via screens in the trading pit.

Over the past 25 years, a major transition has occurred in how market participants interact. Technological advances led to the development of a continuous electronic auction system, known as the ‘electronic order book’, which replaced the open outcry system. This change improved many basic functions of the capital markets: bids and offers from buyers and sellers could be matched more rapidly, and the whole process of trading was accelerated. Since then, middle and back offices have been increasingly automated to speed up post-trade processes and enable faster, more efficient front-office activities. As a result, market data became almost universally digital, laying the foundations for today’s market data visibility.

Initially, proprietary networks were the primary means by which data was distributed. However, with the wider availability of, and growing trust in, internet connectivity, proprietary lines are no longer necessary for ingesting market data. In a further development, many banks now opt for cloud-based data solutions. Over the last 10 years, technological change has accelerated across the capital markets, with traders at many of the largest firms using data-driven, high-frequency trading algorithms that are highly automated and AI-based. The platforms that facilitate this, meanwhile, present technical challenges for those tasked with monitoring price discovery and trading activities.

Figure 1: Market Data Flow in a Trading Environment
Source: GreySpark analysis

Daily monitoring of market data feeds by IT support is needed to ensure that data feed ‘health’, accuracy, and resilience are not compromised. Typically, this involves an ongoing dialogue between traders and IT support to troubleshoot connectivity and latency issues. These issues can compromise the reliability of data; for example, higher than anticipated latency could result in traders making decisions based on outdated market data.
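
To make the point concrete, the sketch below shows one way a support team might flag a stale feed: it compares the venue’s publish timestamp against the local receive time and raises an alert when the gap exceeds a threshold. The 50 ms threshold and the assumption of synchronised clocks are illustrative, not prescriptions.

```python
import time

# Illustrative threshold: alert when a feed's data is more than 50 ms old.
# Assumes local and exchange clocks are synchronised (e.g. via NTP or PTP).
STALENESS_THRESHOLD_MS = 50.0

def check_feed_health(feed_name: str, exchange_timestamp_ms: float) -> bool:
    """Flag a feed whose publish-to-receive gap exceeds the threshold."""
    latency_ms = time.time() * 1000.0 - exchange_timestamp_ms
    if latency_ms > STALENESS_THRESHOLD_MS:
        print(f"ALERT: {feed_name} is {latency_ms:.1f} ms behind -- "
              f"traders may be acting on stale prices")
        return False
    return True
```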

The data used by financial firms stems from a wide range of sources and is often only permissioned for use across certain business departments. Figure 1 shows the simplified workflow of the market data journey from its delivery by external providers through various trading platforms, emphasising its importance in informing trading decisions. The integration of Order Management Systems (OMS), trading algorithms, and execution/FIX gateways illustrates the complexity of modern trading environments, where real-time market data is essential for efficient and effective trading.

Emerging Trends in MDM

Across the capital markets globally, data-as-a-service, data marketplaces, and alternative trading venues are three of the most impactful trends to have emerged in the last decade.

1. Data-as-a-Service (DaaS)
DaaS models have been growing in importance for financial firms, due in part to their ability to resolve historical bandwidth issues. As shown in Figure 2, nearly half of financial services firms’ data is now stored in the cloud. Market participants can access market data directly from data vendors via DaaS plug-ins, reducing server-related limitations. Financial services firms are increasingly embracing DaaS models, while key market data providers and exchanges are choosing to offer DaaS plug-in solutions and centralised data distribution models.

2. Data Marketplaces
There has been a notable rise in the use of data marketplaces over the last 10 years. Data marketplaces bring together data providers and data consumers, facilitating the buying and selling of data. Providers generate revenues from selling their data, while consumers can get access to specific data that suits their needs, creating a win-win for both parties. In the early days, data marketplaces focused on intraday and historical data. Today, more data marketplaces offer real-time – or close to real-time – market data too. A significant advantage of using a data marketplace is that it will often provide standardised data to its clients.

Figure 2: Financial Services Firms’ Data Storage Locations
Source: International Banker

3. Alternative Trading Venues
While larger financial firms can trade on exchanges, smaller companies find it challenging to become exchange members and thereby gain the option to trade on the venue itself without an intermediary making their trades. Consequently, there has been a rise in alternative trading venues that allow direct market access for these smaller players. Entities such as the Better Alternative Trading System (BATS) have purchased seats across various markets, enabling smaller traders to access those markets directly through BATS. A further advantage of this arrangement is that smaller firms can purchase market data from the venues to which BATS, or its like, facilitates their direct access.

Despite having the option to purchase market data across a multitude of channels, financial firms still control the data that enters their systems. The large amount of data they can purchase in various formats, however, makes ensuring data quality challenging. In many cases, a lack of market data monitoring causes duplications, extended latencies, and ultimately higher costs for the firm as it troubleshoots the resulting issues.

Market Data Latency Consequences

Market data latency can leave traders on the wrong side of the market. If the order message routed to the broker’s server is delayed, the price the trader attempted to capture may have moved away by the time the order arrives. The broker may then be unable to fill the order because a better price is available elsewhere, or may fill it at a price that differs from the one shown on the trader’s screen. For example, there can be a latency of up to 3 milliseconds between Tokyo and London, so firms can easily end up trading after other market participants have seen the real-time data. According to one report, a one-millisecond latency advantage can be worth more than USD 100 million annually to a brokerage firm. The importance of matching or improving on the latency experienced by other market participants cannot, therefore, be downplayed.
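
A back-of-the-envelope calculation illustrates why milliseconds matter. All the figures below – the 3 ms delay, the adverse price drift per millisecond, and the order volumes – are assumptions chosen purely for illustration.

```python
# Illustrative slippage estimate: every number here is an assumption.
latency_ms = 3.0          # delay before the order reaches the venue
drift_per_ms = 0.02       # adverse price move (USD/share) per ms in a fast market
shares_per_order = 1_000
orders_per_day = 10_000

slippage_per_share = latency_ms * drift_per_ms               # USD 0.06
daily_cost = slippage_per_share * shares_per_order * orders_per_day
print(f"Estimated daily slippage: USD {daily_cost:,.0f}")    # USD 600,000
```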

The capital markets industry continues to grapple with market data latency issues and outages. Deutsche Börse experienced an outage in October 2020 following a failure in one of its data management systems, leading to the postponement of a Danish bond auction. In October 2023, the London Stock Exchange experienced an outage caused by an issue with its matching engine, halting trading in all instruments except constituents of the FTSE 100 and FTSE 250 indices. In the same month, options traders in the US were blighted by an outage of the main post-trade data feed provided by the Options Price Reporting Authority (OPRA), which is widely used to gauge the market prices of options contracts.

A Closer Look at Market Data Challenges

The process of monitoring market data is an increasingly demanding technical task that presents four key challenges:

1. Latency
With the deluge of data coming into market-maker systems, it is difficult for financial firms to determine which data arrives first. Ideally, firms need to see how quickly they are receiving data on a source-by-source basis. Optimal execution can only be achieved if a trader has access to the best-quality market data more quickly than their peers: no matter how fast a trade is executed, if the underlying data is delayed relative to other market participants, the firm will be left on the wrong side of the market. Comparing the price, quality, and latency of data from various sources is therefore essential to obtaining the best possible execution.
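
One simple way to make that comparison, sketched below under the assumption that each tick carries the venue’s publish timestamp, is to keep a rolling record of observed latency per source and rank sources by it. The vendor names and timings are hypothetical.

```python
from collections import defaultdict
from statistics import mean

# Observed publish-to-receive latencies (ms), keyed by data source.
latencies_ms: dict[str, list[float]] = defaultdict(list)

def record_tick(source: str, exchange_ts_ms: float, receive_ts_ms: float) -> None:
    latencies_ms[source].append(receive_ts_ms - exchange_ts_ms)

def fastest_source() -> str:
    """Rank sources for an instrument by average observed latency."""
    return min(latencies_ms, key=lambda s: mean(latencies_ms[s]))

# The same tick arriving from two hypothetical vendors:
record_tick("vendor_a", exchange_ts_ms=1_000.0, receive_ts_ms=1_004.2)
record_tick("vendor_b", exchange_ts_ms=1_000.0, receive_ts_ms=1_001.7)
print(fastest_source())  # vendor_b
```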

2. Bandwidth
It is essential that firms know how much data they can accept via their feeds on a second-by-second basis, as exceeding that capacity can have serious operational consequences. A data overload, for instance, can cause system congestion that degrades operational efficiency.
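
A rolling message-rate counter is one minimal way to watch a feed against a known capacity; the sketch below warns when throughput approaches an assumed per-second limit. The 90% warning level is arbitrary and illustrative.

```python
import time
from collections import deque

class BandwidthMonitor:
    """Track a feed's messages per second against a known capacity."""

    def __init__(self, capacity_msgs_per_sec: int):
        self.capacity = capacity_msgs_per_sec
        self.arrivals: deque = deque()

    def on_message(self) -> None:
        now = time.time()
        self.arrivals.append(now)
        # Keep a one-second rolling window of arrival times.
        while self.arrivals and now - self.arrivals[0] > 1.0:
            self.arrivals.popleft()
        # Warn at 90% of capacity (an arbitrary, illustrative level).
        if len(self.arrivals) > 0.9 * self.capacity:
            print(f"WARNING: {len(self.arrivals)}/{self.capacity} msgs/sec -- "
                  f"feed nearing congestion")
```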

3. Data Anomalies
While partly automated, market data monitoring still requires some human intervention. Many firms aggregate incoming market data from different providers to create their own benchmarks for price discovery. However, given the increased volume and variety of data sources, firms are more likely to experience data anomalies and to find them difficult to monitor. A price spike, for example, can be detected by comparing incoming prices against a benchmark, such as the security’s average price over the past hour. Such aberrations may not be obvious to traders in real time, but automated market data monitoring detects errors in the data far more effectively and efficiently than humans can.
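
The sketch below shows the benchmark idea in its simplest form: keep a rolling window of recent prices, treat their mean as the benchmark, and flag any print that deviates from it by more than a set percentage. The window size and 5% tolerance are illustrative assumptions, not recommended parameters.

```python
from collections import deque
from statistics import mean

class PriceSpikeDetector:
    """Flag prices that deviate sharply from a rolling benchmark
    (here, the mean of recent prices as a stand-in for an hourly average)."""

    def __init__(self, window: int = 3_600, max_deviation: float = 0.05):
        self.prices: deque = deque(maxlen=window)
        self.max_deviation = max_deviation  # e.g. 5% from the benchmark

    def on_price(self, price: float) -> bool:
        anomalous = False
        if len(self.prices) >= 10:  # need some history before judging
            benchmark = mean(self.prices)
            if abs(price - benchmark) / benchmark > self.max_deviation:
                anomalous = True
                print(f"ANOMALY: {price:.2f} vs benchmark {benchmark:.2f}")
        if not anomalous:
            self.prices.append(price)  # keep suspect prints out of the benchmark
        return anomalous
```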

4. Cost
Typically, a financial firm obtains data from a multitude of sources, including exchanges, investment banks, and brokers. Market data can cost mid-sized firms more than USD 20 million per year. Poor data management that allows duplicated data to enter a firm’s systems incurs needless extra market data costs: a bank could, for example, be storing market data for the same asset on three different internal systems. These issues not only risk poor trade execution; they also make it difficult to determine the correct value of a security and cause confusion around data usage and reporting requirements.
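
Duplicate detection itself can be straightforward once ticks held in different internal systems can be compared. The sketch below fingerprints each tick by its identifying fields and reports any fingerprint held more than once; the field names are hypothetical.

```python
# Hypothetical field names; real feeds will differ.
def fingerprint(tick: dict) -> tuple:
    return (tick["symbol"], tick["exchange_ts"], tick["price"], tick["size"])

def find_duplicates(*systems) -> set:
    """Report ticks stored (and paid for) more than once across systems."""
    seen, dupes = set(), set()
    for system in systems:
        for tick in system:
            fp = fingerprint(tick)
            if fp in seen:
                dupes.add(fp)
            seen.add(fp)
    return dupes
```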

The Importance of Market Data Monitoring

Without market data monitoring, financial firms can leave themselves and their clients at risk of being on the wrong side of the market. For some financial firms, data that is milliseconds old is akin to yesterday’s news, but traders in other markets can operate successfully with data latencies measured in seconds. Monitoring of market data latency should therefore use specific benchmarks for different trading types.

Market data monitoring must allow firms to view the prices and latencies of all the channels via which they receive market data. It should provide real-time comparisons between data providers and ensure the optimal data source is used for each specific use case, so that firms make the best real-time decisions and executions for themselves and their clients. Due to the sheer size, speed, and variety of sources of market data now at firms’ disposal, market data monitoring is itself increasingly challenging: the higher volume and speed of data feeds increase the likelihood of duplication and anomalies, which are too costly to overlook.

A market data monitoring solution allows firms to combine data and feeds onsite, ensuring they receive and see the data they want and only the data they want. It should provide an overview of data feeds, including latencies between different data vendors, to enable comparisons and ensure data quality. A market data monitoring solution must also be able to quickly detect data duplicates and their source, pinpoint anomalies in pricing and volumes, and determine data provenance. Due to the fast-paced nature of the financial markets, market data monitoring solutions that detect data anomalies using specialist algorithms and alert support teams in real time are beneficial for optimising trading decisions.

The Missing Piece of the Puzzle

The importance of market data monitoring in determining a firm’s financial success cannot be overstated. Financial firms’ decisions are only as good as the data inputs they receive and the data infrastructure they utilise. Making sense of the noise has become more difficult than ever, with numerous vendors and data channels making it challenging to identify the best data. Only a robust market data monitoring framework can ensure that the deluge of ingested data is of the highest quality and meets the benchmarks for acceptable latency.

ITRS Geneos empowers financial services firms with real-time IT monitoring and alerting. It enables them to gain a competitive advantage from IT performance and drives resilient operations, delivering intelligent insights that reduce the business impact of outages and application issues in production. ITRS Geneos supports the most diverse and interconnected IT estates at scale without impacting system performance. It monitors transactions, processes, applications, and infrastructure across on-premises, cloud, and highly dynamic containerised environments, giving firms full control over their IT ecosystems and ensuring minimal latency 24×7. ITRS Geneos ensures the quality and validity of market data and helps minimise the latency of data feeds, powering highly informed trading decisions and strengthening the relationship between market data providers and the trading ecosystem.