Colin Lambert believes data will become a commodity and will generate a divide between the “haves” and the “have-nots”.

There is little doubt that data drives most things in foreign exchange. Pricing is the obvious area, but client business is now also analysed to great depth as service providers seek to more clearly define the value they extract from their franchises. Throw in operating metrics as well as reporting, and data permeates just about every part of the business.

So it’s a good thing that it’s so easily available to all, right? Well, perhaps not, because 2018 could well be the year when we finally understand the great divide between the “haves” and the “have-nots” when it comes to data.

It is not only about the speed of data, although that does and will continue to play a big role; it is more about the storage required. It’s one thing for a trading business to keep data from the past couple of years for one stream, for example FX, but it’s another if a seven-year storage period is required across all financial markets.

There are those (typically on the consumer side rather than the data provision side!) who believe it’s a shame that data is becoming a point of differentiation and a product in its own right. This school of thought argues that in such an environment, the data world fragments along as many lines as the trading environment, which means double the connectivity, double the bandwidth and lots more.

Contrary to this, platform providers in particular are looking to their data businesses to help bolster fragile revenues from trading activity. It may sound counterintuitive, but some platform providers argue that even though trading is without doubt more sporadic in nature and volumes are generally lower than they have been in recent years, when the markets do move, they move quickly, and those with the rich data sources are better protected.

So, on the one hand we have a group of customers concerned about their ability to handle huge data (big data is so 2015) and seeking to simplify the delivery and storage mechanism, but on the other we have a group of businesses that see data as preserving shareholder value and revenue streams. This could be one occasion where the real wishes of the client may be conveniently ignored.

The data-driven divide is unlikely to be along traditional segment lines either, for while banks will undoubtedly face much higher resource demands, they also have the money to pay for them. Smaller, non-bank firms, on the other hand, do not have as much data to store and have fewer client compliance requirements. The interesting layer of the market could be the smaller banks and larger non-bank market makers, for it is here that we could see the biggest data-inspired shift.

Regional banks with specialities may see fit to pay up for better data in their core currencies (although it can be argued that their domestic franchises are often worth 10 times more than public market data), but are they going to pay for super-fast G7 data? This is where questions are being asked, according to Profit & Loss sources, who observe that more regional or specialist banks are seeking liquidity deals with market makers in the major currency pairs.

Often this is via simple aggregation, which is used to formulate any crosses required, but as one or two non-bank market makers are keen to remind everyone (on background), a lot of their “client base” is made up of banks that simply don’t want to pay for the fast data.
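
As a rough illustration of the mechanics mentioned above, the sketch below derives an EUR/JPY cross from aggregated EUR/USD and USD/JPY streams. The maker labels, quote values and the Python helper are purely illustrative assumptions, not drawn from any source quoted in this piece.

```python
# Minimal sketch: deriving a cross rate from aggregated major-pair streams.
# All names and numbers below are hypothetical examples.

def best_bid_ask(quotes):
    """Aggregate several makers' quotes into a single top of book."""
    best_bid = max(q["bid"] for q in quotes)
    best_ask = min(q["ask"] for q in quotes)
    return best_bid, best_ask

# Hypothetical streams from two market makers
eurusd = [
    {"maker": "MM1", "bid": 1.2345, "ask": 1.2347},
    {"maker": "MM2", "bid": 1.2344, "ask": 1.2346},
]
usdjpy = [
    {"maker": "MM1", "bid": 109.40, "ask": 109.42},
    {"maker": "MM2", "bid": 109.41, "ask": 109.43},
]

eu_bid, eu_ask = best_bid_ask(eurusd)
uj_bid, uj_ask = best_bid_ask(usdjpy)

# EUR/JPY = EUR/USD x USD/JPY: multiply the two legs on the same side
eurjpy_bid = eu_bid * uj_bid
eurjpy_ask = eu_ask * uj_ask
print(f"EUR/JPY {eurjpy_bid:.3f} / {eurjpy_ask:.3f}")
```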

As things stand, this works well because the market maker is after volume (market share becomes a big thing again thanks to the relative lack of volatility) and the regional player doesn’t have a huge bill for data delivery, retention and storage.

It is not all roses, however, because while barriers are clearly emerging thanks to the cost of data, the relationship with market makers that are paying for the fast data also enables regional players to stream a better price in the majors. Another trend this year, therefore, will be major players with super-fast market data doing more to prevent their prices being recycled by their “customers”. One player describes it as “minimising data leakage” and believes a key element in this is reducing the amount of trading over ECNs and increasing the amount of direct connectivity…and thus the cycle of platforms needing to rely more upon their data is repeated.

It’s not all good news for the ECNs and platforms either, for as the market drifts towards firmer pricing and shorter last look hold times, the number of quotes they have to process increases commensurately, as we discuss elsewhere in this feature.

So, 2018 will be all about the data. It will empower those willing to pay the price, and just as easily place those who are not at a significant disadvantage – especially in an era in which the guidelines around last look are being tightened. If the FX Global Code does provide relief for the ‘quote and cover’ model, this impact may be reduced. But again, if the market maker believes its price is being recycled, why would it continue with the relationship?

Galen Stops
