Spreading the Wealth

With more information becoming increasingly accessible to a wider set of FX market participants, are we witnessing the democratisation of data? Galen Stops takes a look.

The starting point for claiming that data is being democratised in FX, and in the financial markets more broadly, is to point out how much more accessible data has become to a wider range of market participants. 

At the retail level, people can use smartphones to find out a currency exchange rate at any time in just seconds. At the professional level, trading firms can now access high-speed market data from numerous sources at affordable prices, while aggregators allow them to rapidly compare the data coming in from these sources.

On top of the increased accessibility of data, it has also become easier and more affordable for firms to process and analyse this data, for a number of reasons. Firstly, the cost of computing has collapsed over the past 20 years: figures cited in a Deloitte research report show that the price of computing power fell from $222 per million transistors in 1992 to $0.06 per million transistors in 2012.

Secondly, the same report notes that the cost of data storage fell from $569 per gigabyte in 1992 to $0.03 per gigabyte in 2012. This collapsing cost of digital storage enables the creation of more, and richer, digital information. As the cost of accessing, storing and analysing data has come down dramatically, the bar has been lowered for firms wishing to compete in financial markets.
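Taken together, these figures imply a strikingly steady rate of decline. As a rough illustration (the annualised rates below are derived from the Deloitte figures above; the arithmetic is ours, not the report’s):

```python
# Back-of-the-envelope annualised cost declines implied by the Deloitte
# figures cited above. The rates are derived here, not taken from the report.

def annual_decline(start_cost: float, end_cost: float, years: int) -> float:
    """The constant yearly rate at which a cost must fall to go from
    start_cost to end_cost over the given number of years."""
    return 1 - (end_cost / start_cost) ** (1 / years)

# $222 -> $0.06 per million transistors, 1992-2012
print(f"Computing: {annual_decline(222.0, 0.06, 20):.1%} per year")  # ~33.7%

# $569 -> $0.03 per gigabyte, 1992-2012
print(f"Storage:   {annual_decline(569.0, 0.03, 20):.1%} per year")  # ~38.9%
```

In other words, both costs fell by roughly a third or more every year for two decades.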

Thirdly, the number of individuals that know how to analyse and then deploy this data in a useful manner has increased. Even though there’s still growing demand for individuals who know how to code and manipulate data sets, there are more people trained in these skills than there used to be.

A fourth factor to consider with regard to the democratisation of data is that many of the valuable data sets that exist today are not that old. The fact that new data sets are emerging, along with new ways to capture existing data, inherently levels the playing field for new firms entering the market.

This is particularly relevant when considering alternative data, which Tammer Kamel, CEO of Quandl, defines as “data that Wall Street has never seen before that also provides information that they can leverage to some advantage”.

“Hedge funds have always reached for something to give them an alpha edge. Most recently it was high-frequency trading, before that it was quant pricing strategies for complex derivatives and before that, it was just using computers when the other people were still using calculators. They’re looking for something that gives them an advantage and I think right now we’re in a world that’s exploding with data and this is the next golden opportunity for people seeking alpha. If you can find data that no one has found before, then you can equip yourself with an informational advantage that you can turn directly into alpha or trading profit,” he says.

Kamel adds: “The irony is that this is the original alpha source; informational advantages have always been the most direct path to beating the market. Now we’re coming full circle and alternative data is allowing people to really know things that no one else knows and that’s the appeal.”

Barriers still in place 

Yet despite these changes in the market, Kamel warns that the barriers to entry are still high for buy side firms wishing to challenge incumbents in financial markets.

As a senior figure at one New York-based fund explains: “The ‘haves’ and ‘have nots’ do have equal access to data to a point, but sustaining competition requires that the ‘have nots’ get something from the data that proves fruitful. I don’t see this as the golden age of competition in the space, not because the technology is at fault, but because of the structure of the asset manager business and the way it has developed. It was a lot easier to set up a CTA 30 years ago than today.”

Some market participants argue that the disparity between these “haves” and “have nots” in the FX market is being exacerbated by some of the market data offerings from the major trading platforms. 

In particular, EBS Live Ultra has caused grumbling amongst some trading firms because it requires market participants to create 40% of their volume on the central limit order book (CLOB) as a market maker before they can qualify for the fastest feed, which updates every five milliseconds.

“I think it’s problematic, because if you have a market position off the bat and you’re a market maker, it gives you an advantage. But it is also a barrier to entry for others, so I think it’s anti-competitive by its nature because the firms who want to come and play in the market have to start at a disadvantage which makes it more difficult for them to actually compete,” says one market source.

They add: “On the other hand, if I’m one of their four or five main liquidity providers, I think it’s a great idea.” 

But Tim Cartledge, global head of FX and head of product at Nex Markets, rejects the idea that EBS Live Ultra is creating an uneven playing field for FX market participants.

“Anyone competing with a market maker is by definition another market maker and would therefore qualify for the faster data! What it does mean is that the very high frequency takers can only get access to the fastest data by market making themselves,” he says.

A two-tier market?

A senior figure at one non-bank market maker still claims that this model threatens to create a two-tier market, and says that the economic rationale behind the 40% threshold remains unclear to them.

“Forty percent seems to be a completely arbitrary number to receive that data. Why 40 and not 35 or 50? It is like they pulled that number out of a magician’s hat. If you really want to create a different environment, it should be based on economics.”

Agreeing that market makers should be rewarded for providing a service on the platform, this source instead advocates the system used by equities exchanges such as NYSE, where market makers are offered a rebate for the pricing that they provide. 

“This is a more intelligent approach in my opinion, because you give the players an economic incentive to behave but you don’t create an un-level playing field for new entrants,” the source says.

However, Cartledge once again disputes this viewpoint and offers clarity regarding how Nex Markets decided on the threshold for the faster market data feed. 

“We debated numbers between 30% and 40% originally. Every trade has a maker and a taker so the system average maker ratio is obviously 50%. Forty percent was chosen initially as a sufficient margin away from 50% that clients could meet the criteria regularly without being affected by noise. We now have the benefit of real-world data of course and can now adjust the criteria based on where we believe we can attract the absolute maximum amount of market making support. To this end we recently changed the criteria to 35% for maker ADV of $500m and 30% for maker ADV of $1bio and reduced the minimum qualifying maker ADV to $100m,” he says.
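To make the revised criteria concrete, the tiers Cartledge describes can be sketched as a simple qualification check. The sketch below is a hypothetical reading of how the thresholds combine; the function name, and the assumption that the original 40% ratio still applies between the $100m minimum and the $500m tier, are ours rather than anything Nex Markets has published:

```python
# A hypothetical sketch of the revised EBS Live Ultra qualification criteria
# as described by Cartledge. How the tiers interact is our assumption; this
# is illustrative, not Nex Markets' actual implementation.

def qualifies_for_fast_feed(maker_adv_usd_m: float, maker_ratio: float) -> bool:
    """maker_adv_usd_m: average daily volume made, in USD millions.
    maker_ratio: share of the firm's CLOB volume done as a maker (0.0-1.0).
    """
    if maker_adv_usd_m < 100:      # minimum qualifying maker ADV: $100m
        return False
    if maker_adv_usd_m >= 1000:    # maker ADV of $1bn: 30% ratio required
        return maker_ratio >= 0.30
    if maker_adv_usd_m >= 500:     # maker ADV of $500m: 35% ratio required
        return maker_ratio >= 0.35
    return maker_ratio >= 0.40     # otherwise, the original 40% threshold

# For example, a firm making $600m ADV with a 36% maker ratio would qualify:
assert qualifies_for_fast_feed(600, 0.36)
```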

But in many cases it seems as though the complaints about EBS Live Ultra are actually symptomatic of a broader trend amongst platform providers that are, in some cases, trying to shift their revenue models away from volume-dependent brokerage fees and towards fixed-cost market data fees.

“Data is becoming more and more expensive across the board, partly because all these platforms are converting themselves from brokerage houses to selling data,” says the non-bank market maker. 

The non-bank market maker points out that charging a flat fee, rather than a brokerage fee proportional to how much a firm trades, can stifle competition in the FX market by making it harder for new firms to enter and gradually ramp up their trading activity.

In a similar vein, one source at a US trading firm comments: “My old saw is that these firms are trading platforms, they’re trying to put as many buyers and sellers together and if there’s only three people or five firms trading on your platform, you’ve made yourself irrelevant. What I mean by this is that if you’re charging a high fee to see the prices on your platform instead of shouting them from the rooftops to everyone who wants to see your bids and offers, then you make this privileged information and your pool will get smaller.”

A managing director at an investment bank makes a similar observation, pointing to the equities market as an example of where this data-dependent revenue model can become problematic. The MD explains that, in some cases, rising market data fees have squeezed small- and mid-size firms out of the market. That leaves a smaller universe of large firms picking up the tab for the data feed, even as the exit of these smaller players makes the liquidity pool itself less valuable to them.

“There’s a balance that has to be achieved, and ultimately I think it’s dangerous to completely move away from charging for the fundamental service that you’re providing,” they say.

This is why the source at the US trading firm warns: “The platforms get this revenue and it’s their crack, they can’t give it up, but it atrophies the platform and, if you ask me, if the winds blow in a different direction one year, it could expose them to some regulatory pressure at some point.”

Conflicting views

Expanding on this last comment, the source says that there is very little transparency about why these data feeds are priced the way that they are, and they point out that when platforms offer different market data tiers, they are not offering a new feed to put into a firm’s pricing engine, but rather “it’s just a huge fixed cost for a minority of market participants to have privileged information early”.

Discussing the potential for future regulatory scrutiny around FX market data distribution, the non-bank market maker is adamant that “there is definitely going to be a scandal about this”. 

“The next scandal will be around market data,” the non-bank source says. “What was the WMR Fix scandal at its essence? It was people sharing information in a chat. The next scandal is going to be around someone getting access to data that they shouldn’t be to the detriment of other people.”

Although this consternation about some firms being able to access market data more quickly than others is genuine, it’s worth noting that it is a significant concern only for a relatively small proportion of the FX marketplace. With regard to the democratisation of data, one source points out that the difference in data available to different market participants is in many cases only marginal.

“Now, being marginally ahead can really matter, depending on what you do for a living, but this isn’t really a large universe of participants. Does it really matter to a large asset manager if they don’t get an update from EBS every five milliseconds? Probably not as much as it matters to the market maker,” the non-bank source says.

The non-bank market maker points out that when the FX market was still voice traded, traders could gain an informational advantage by using a broad network of brokers spread across a wide range of geographies. When a market-moving event occurred in one of these geographies, the trader would be in a position to get that information first and trade on it.

“So, in terms of the democratisation of data, that aspect still does exist. I might inherently have access to more data than a number of other folks in the marketplace. So no, it’s not completely democratised. But the other side of that is that as you get more granular in terms of the data you’re looking at, the less that data matters to the vast majority of the marketplace.”

The conflicting views about the extent to which data is becoming democratised in FX were reflected in a recent survey conducted by Profit & Loss. Asked whether FX is becoming more of a level playing field in terms of firms being able to access market data, 45% responded “Yes” and 45% responded “No”. Meanwhile, 10% said that they were unsure.

This is perhaps reflected in Kamel’s comment: “The democratisation of data is an ideal end point that we might achieve at some point in the future; we’re a long way away from that right now.”

Galen Stops
