Market Data Maelstrom

20 Jun, 2011 (updated June 6th, 2017) | News

Automated Trader

Market data volumes and the race for low latency have created plenty of challenges for high- (and low-) frequency trading firms as they attempt to trade on modern markets. One of these is being able to spot faults at trading speed.

GreySpark looks at the various innovations that have emerged over the past five years and discusses the implications they may have for market integrity. Looking at out-of-band systems for daily transaction cost analysis and best execution, GreySpark maintains that out-of-band metrics work for scalability and capacity issues, but not for quality issues.

They go on to compare market data on US and European exchanges: in the USA the market data specifications are relatively stable and the volumes are huge, so FPGA systems can work well. In Europe the exchanges all differ and the specifications change regularly, so using FPGAs is more of a challenge.

The article then turns to ‘stale data’. According to GreySpark, there are various tactics for recognizing stale data: sampling the feed, measuring delays directly, comparing prices with other feeds, or comparing time-stamps.
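Two of those tactics, comparing time-stamps and comparing prices with other feeds, can be sketched in a few lines. This is a minimal illustration, not GreySpark's implementation: the function names and thresholds below are hypothetical, and real values would depend on the venue and instrument.

```python
import time

# Hypothetical thresholds -- real values depend on the venue and instrument.
MAX_AGE_S = 0.25              # timestamp lag before a tick is considered stale
MAX_PRICE_DIVERGENCE = 0.005  # fractional divergence vs. a reference feed (0.5%)

def stale_by_timestamp(tick_ts, now=None, max_age=MAX_AGE_S):
    """Time-stamp check: flag a tick whose exchange timestamp lags wall-clock time."""
    now = time.time() if now is None else now
    return (now - tick_ts) > max_age

def stale_by_cross_feed(price, reference_price, max_divergence=MAX_PRICE_DIVERGENCE):
    """Cross-feed check: flag a tick that diverges too far from another feed's price."""
    return abs(price - reference_price) / reference_price > max_divergence
```

In practice a monitor would combine several such checks, since each has blind spots: the timestamp check depends on clock synchronisation between the exchange and the consumer, while the cross-feed check fails if both feeds go stale together.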
