Consolidated Market Data Feeds Gain Traction in Algo Trading and Fixed Income
By Ivy Schmerken, Editorial Director
Broker-dealers and high-frequency trading firms have complained about U.S. stock exchanges charging high fees for direct data feeds, accusing the exchanges of having a monopoly and demanding transparency into their costs.
Brokers and electronic market makers argue they are forced to buy the fastest and most detailed proprietary data feeds to compete and/or to provide best execution to retail and institutional clients.
Exchanges claim that brokers have alternatives, such as the securities information processor known as the SIP, an aggregated data feed mandated by Congress in 1975, which shows the best bids and offers and last trades and calculates the national best bid and offer (NBBO).
But many professional traders and algorithmic investors need more speed and detail than the top of book prices in the SIP offers.
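To make the SIP's role concrete, the sketch below shows a minimal NBBO aggregation over per-exchange top-of-book quotes. The venues, prices, and sizes are illustrative, and real SIPs (the CTA and UTP plans) also handle timestamps, odd lots, and quote conditions; this is only the core idea of taking the highest bid and lowest offer across venues.

```python
# Minimal sketch of NBBO aggregation across exchange top-of-book quotes.
# Venue names and prices below are hypothetical, not live market data.

quotes = {
    "NYSE":   {"bid": 100.01, "bid_size": 300, "ask": 100.04, "ask_size": 200},
    "Nasdaq": {"bid": 100.02, "bid_size": 100, "ask": 100.05, "ask_size": 400},
    "Cboe":   {"bid": 100.00, "bid_size": 500, "ask": 100.03, "ask_size": 100},
}

def nbbo(quotes):
    """Return (best_bid_venue, best_bid, best_ask_venue, best_ask)."""
    bid_venue = max(quotes, key=lambda v: quotes[v]["bid"])  # highest bid wins
    ask_venue = min(quotes, key=lambda v: quotes[v]["ask"])  # lowest offer wins
    return bid_venue, quotes[bid_venue]["bid"], ask_venue, quotes[ask_venue]["ask"]

print(nbbo(quotes))  # -> ('Nasdaq', 100.02, 'Cboe', 100.03)
```

Here the national best bid is Nasdaq's 100.02 and the national best offer is Cboe's 100.03; the SIP disseminates this consolidated quote rather than every venue's full order book, which is why depth-hungry strategies turn to direct feeds.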
The debate over direct feeds vs. the SIP has been going on for decades and came to a head in October, when the Securities and Exchange Commission held public hearings on market data after rejecting price increases for two market data products from NYSE and Nasdaq, The Wall Street Journal reported in "Wall Street Fractures Over Stock Exchanges' Data Sales."
Amid all of the fire and fury of this battle, there are signs that buy- and sell-side firms have shifted some of their execution needs for data over to consolidated market data feeds.
The consolidated market data feed business pulled in over $1.4 billion in revenues, an increase of 17% from two years ago, according to research by Greenwich Associates. More than 75% of those revenues flow to the three largest vendors – Refinitiv, Bloomberg and ICE Data Services. Revenues include real-time market data across all asset classes, including the SIP for U.S. stocks, international exchanges, fixed income, foreign exchange, and end-of-day reference prices.
By comparison, the combined stock market revenues from ICE, Nasdaq and CBOE were around $560 million in 2017, reported The Wall Street Journal on Oct. 25, but this does not include connection fees, port charges and other fees that amplify the cost to market participants.
However, on Jan. 2, Bloomberg reported a more comprehensive figure for exchange market data revenues that presumably includes futures and options. “Data collected by exchanges totaled $2.2 billion in 2017, up from $1.6 billion in 2014,” reported Bloomberg, citing the Committee on Capital Markets Regulation and public disclosures.
Despite competition from direct data feeds and growing concerns about costs, the consolidated market data feed business is thriving, wrote Greenwich in September when it announced a study of 201 market data professionals and users globally.
While consolidated feeds are mainly displayed on market data terminals, client web sites, and order management and execution management systems (OMS/EMS), they also play a role in “non-displayed” machine-driven activities, such as algorithmic trading, noted the research firm.
Consolidated Feeds Expand into Algo Trading
More than 80% of surveyed respondents use consolidated data for market data terminal desktop applications, 49% use it for OMS/EMS applications, 46% for algorithmic trading applications and 38% for analytical engines, according to the Greenwich study.
“There’s been a big increase in their usage in algorithmic trading applications and analytical engines,” said Dan Connell, managing director of market structure & technology at Greenwich, on the November webinar “Market Data Ecosystem: Consolidated Feeds.”
Because latency is no longer the only game in town, consolidated data is used for execution purposes by many strategies that are not latency sensitive. Algorithmic trading applications also use a combination of direct exchange feeds and consolidated data.
Firms executing single-stock arbitrage, ETF arbitrage and HFT strategies require the direct feeds, while those with portfolio executions and macro strategies can use the SIP.
“I am not surprised at the use of algo applications and analytical applications,” said Brennan Carley, Global Head of Enterprise at Refinitiv (formerly the Financial & Risk business of Thomson Reuters). Carley said that customers “are not just looking for data, they’re using that data in quantitative research, data science and AI, and so forth.”
Though some of the algorithmic trading applications are execution-oriented, like VWAP and TWAP algorithms, Refinitiv is also seeing activity in wealth management applications such as robo-trading and robo-advisory, which may be part of those algorithmic trading figures, said Carley.
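As a sketch of the benchmark these execution-oriented algorithms target, the snippet below computes a VWAP (volume-weighted average price) over a set of trades. The trade data is illustrative, and this is the benchmark calculation only, not any vendor's slicing logic; a VWAP algorithm schedules child orders so the parent order's average fill price tracks this figure.

```python
# Minimal sketch of a VWAP benchmark calculation.
# The (price, size) trade prints below are hypothetical.

def vwap(trades):
    """trades: iterable of (price, size) tuples; returns the
    volume-weighted average price = total notional / total volume."""
    notional = sum(price * size for price, size in trades)
    volume = sum(size for _, size in trades)
    return notional / volume

trades = [(100.00, 200), (100.10, 300), (99.95, 500)]
print(round(vwap(trades), 3))  # -> 100.005
```

TWAP differs only in weighting: it averages over equal time slices rather than by traded volume, which is why it is often used when volume profiles are unreliable.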
Experts on the webinar expect to see machine-driven applications gain at the expense of market data terminals particularly as trends such as artificial intelligence evolve.
“More and more AI technologies are in the forefront and Wall Street and the institutional side [is] picking up on those AI technologies and therefore that is more math-based and less screen-based,” said Rui Carvalho, Global Head of Feeds, ICE Data Services.
Narrowing the Latency Gap
A key factor is that vendors have narrowed the gap in latency between consolidated data feeds and direct exchange access feeds. In addition, consolidated feeds are making inroads into the direct market-access space. “You might see a slight increase perhaps on the DMA because we are narrowing the gap. We might be picking up some ex-DMA as well,” said Carvalho.
Expansion of algorithmic trading and DMA to a broader range of markets is also helping the consolidated market data feed to move beyond cash equities into other asset classes. “The game [of trading] in New York, Chicago and London has become a game of microseconds,” said Carley. The strategies of programmatic and algorithmic trading are not as latency sensitive in other centers, he said. “It’s more about coverage, it’s more about minimizing impact in those markets. They haven’t been taken over as much by HFT,” he said.
Over the past 10 years, the use cases for consolidated market data have grown and diversified. “First, everyone thought that single-digit microseconds were required for every trading strategy but that wasn’t the case,” said Tony McManus, Chief Information Officer, Bloomberg Enterprise Data. “Secondly, consolidated market data feeds have become faster over the years,” said McManus, explaining that vendors can pick up the data closer to the exchange matching engines. “Most of the consolidators have a concept of local ticker plants or proximity and therefore you get a much lower delta between a consolidated feed and a direct feed,” he said.
Third, as trading strategies have become more sophisticated, clients want more than bid-ask data, so a lot of the value-added data that consolidators supply through normalized feeds provide broader applicability to trading strategies, he said.
Another catalyst has been the increase in automated trading on the credit and rates sides, said McManus, adding that pricing engines are very prevalent across Bloomberg’s customer base. While getting direct feeds off of exchanges has been straightforward for many years, gathering fixed-income data from MTFs, brokers, banks and other sources is much harder, said McManus.
Additionally, Europe’s MiFID II regulation has been a driving force behind the consumption of consolidated market data, said Carley. “The delta over the past two years has been in credit and rates,” said Carley, adding that the introduction of Approved Publication Arrangements (APAs) and MTFs has spawned a lot more data. As a case in point, Refinitiv has onboarded around 40 venues across Europe, he noted.
Demand for consolidated market data is also rising as firms need historical data and a wide variety of content, said Carley. Much of the growth in revenues has to do with demand for fixed income and foreign exchange data, said speakers.
Traditionally in fixed income, firms were tied to a provider based on their trading venue, trading counterparties or universe of trading instruments, which gave them only a subset of the full coverage, said ICE’s Carvalho. They would have different price engines that would overlap, but only get up to 40% of the coverage, he said. They would need to cobble together four different engines to get to 60% of the universe, he continued. “A distinct advantage today is that the consolidated data feeds provide more of a full universe concept,” said Carvalho. For example, ICE can price the full fixed income universe on a daily basis, said Carvalho. Users want to know they can rely on a vendor to always get a price in an unlisted security, he said.
Factors to Consider
Data quality and cost topped the list of factors for evaluating consolidated data providers. In a live poll, 66% of attendees chose data quality, 59% picked cost, while 54% cited coverage of required data sources. Factors such as customer service, local support, full tick or depth-of-book data, and latency were in the low single digits.
“The market for latency-sensitive feeds has contracted,” said Carley reacting to the poll. “Those trades have become too crowded and the same is true of the low-latency game,” he said, pointing to the recent merger of Virtu and ITG as signs of that trend.
One of the reasons for rising costs is that consolidated feed providers supply normalization services and build components such as feed handlers. Building 200 exchange feed handlers from 350 trading venues plus OTC instruments, which adds up to 600, is no easy feat, said ICE’s Carvalho. “Clients are thinking about total cost of ownership, such as connectivity, telecom fees, switch fees and access fees to rationalize their in-plant costs,” he said.
Bloomberg’s McManus said dealers are looking for linkages between other data sets, such as reference data and news, alternative data sets, and unstructured data that consolidators provide. “There is a desire for normalization and this is coming back to the growth of machine learning and the desire to automate more workflows, hence the need for a broader data set and it all has to work together,” said McManus.
Even though some use cases, such as algorithmic trading, consume consolidated market data rather than direct feeds, that doesn’t mean costs are falling.
In a live poll, three-quarters of attendees said they expect to spend more money on consolidated data feeds, 19% expect spending to remain the same, and 8% anticipate a decrease in spending.
Given the cost pressures, vendors are expanding their managed service offerings, such as moving to a data-as-a-service offering, so that users don’t have to manage their market data infrastructure. They are also providing easier access to tools and APIs so that users can integrate what they receive.
The three big vendors are also investing in cloud technology as a way to reduce costs. But there is debate on when customers will be ready to move to the cloud, whether it’s 12 months or over the next five years.
Carley said Refinitiv is adopting cloud technology to enable new use cases, such as data science, AI and machine learning, so that firms don’t have to spend the time and investment to set up a server in their own data center.
Ninety percent of the customers he’s talked with are going to move their market data to the cloud within four years, said Carley.
“Certainly, things like historical data, tick history, end-of-day pricing, reference data are a natural for cloud and where we see a lot of adoption,” said Carley. Applications which require the user to continuously go back to the data source for real-time updates probably won’t use the cloud.
But there are cases where people are taking streaming feeds in clouds such as Amazon’s (AWS) and Google’s to power real applications today, said Carley. “It is not the majority, but it is happening,” he said.
Data Service Integrations Available with FlexTrade’s Execution Management System Technology
- Kensho Actionable Intelligence
- Dataminr Global News Alert Feed
- Symphony Communication Services
- OTAS Trading Intelligence And Analytic Solutions
For further information, please contact us at email@example.com.
Past blog posts related to Data Issues
BIG Data: Getting Granular with ESG Factors
Data Science Platforms Help the Buy-Side Integrate Alternative Data
Algo Development 2.0 Looks to Open Source, Cloud & Big Data