As data in the FX market becomes increasingly democratised, panellists at Profit & Loss Forex Network London debated what this means for the industry.
Although the sources of data in the FX market have remained fairly static, Andrew Ralich, CEO of oneZero, said that he’s seeing a broad democratisation of the tools and systems necessary to process this data.
“Looking back five years or more, the infrastructure and software that it took to react to time series and very granular micro market structure data in real time in order to make assumptions was very expensive and there was very niche expertise surrounding it,” he said.
But Ralich highlighted that in recent years, firms providing cloud platforms have begun offering capabilities to marshal all this data and analyse it at orders of magnitude lower cost, and with far less specialist expertise.
“Although the data set being looked at might not change, the breadth of participants who have the horsepower to react to it is, from our perspective, getting wider,” he said.
However, Ralich went on to make an important distinction between the democratisation of tools to analyse data and actually being able to access the data itself.
“I believe that from a technology perspective, democratisation has occurred that lets firms analyse bigger, better, newer data sets. If you’re a firm participating in this space and you’re not evolving and bringing on talent that’s capable of using these new mechanisms efficiently, then you’re going to fall behind. That doesn’t necessarily mean that all the data that you can and should be analysing is accessible or has been democratised,” he explained.
Indeed, John Ashworth, CEO of Caplin Systems, made the point that even if firms in the FX market could access all the data contained within it, there would often be a defined limit to the benefits of analysing it all. Even so, he pointed out that Caplin’s clients often do have huge amounts of relevant data at their fingertips that they’re not using, and that deriving value from it doesn’t even require sophisticated analytics tools.
“I’m talking about customer data,” he said. “We have clients who, on the anniversary of a roll of a three-month forward, have a paper system to remind them that the customer did the trade or didn’t do the trade with them 89 days ago. Similarly, they have stacks of data about what their customers are doing which they never look at and they never use to reprogramme spreading algorithms or tier allocations or anything like that. This is a different side to this data issue but it’s just as important in terms of the efficiency of the bank and the value that their customers can accrue.”
Ashworth continued: “Somewhat controversially, I would say that the efficiencies that can be achieved by the democratisation of technology around data are dwarfed by the inefficiencies associated with the management processes to actually getting around to doing anything about it.”
No substitute for common sense
Hasan Amjad, head of algorithmic trading at GAM Systematic Cantab, agreed with the other panellists that the operational barriers around data management and usage have come down in FX, adding that they have pretty much vanished with regards to research.
“But I think the problem is that what you can’t democratise is common sense,” he subsequently commented.
Amjad cited the example of one trading firm which, following a ramp-up of its machine learning team, produced a new strategy that performed incredibly well when it was back-tested. When this was shown to the traders at the firm, however, they refused to put the strategy into production, insisting that it was too good to be true. They pressed the machine learning team to keep analysing the strategy to figure out what was wrong with it and, sure enough, a crucial input error was discovered.
“It turns out that they had made one of the oldest mistakes in the book, which was they hadn’t accounted for daylight savings time in the exchange hours. And that’s because all these clever young people that the firm was hiring have no idea how the industry works, no idea about the business model and no idea about the specific domain knowledge that you need to look at the numbers that your model spits out and be able to tell if they look right,” said Amjad.
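The mistake Amjad describes can be sketched in a few lines of Python. This is a hypothetical illustration, not the firm’s actual code: building an exchange’s session times with a hard-coded UTC offset, instead of the exchange’s own timezone, silently shifts every summer session in a back-test by an hour.

```python
# Hypothetical sketch of the daylight-saving-time pitfall described above.
# Hard-coding a fixed UTC offset for exchange hours is wrong for half the
# year; building the time in the exchange's local zone handles DST.
from datetime import datetime, timezone, timedelta
from zoneinfo import ZoneInfo  # Python 3.9+

NY = ZoneInfo("America/New_York")  # example exchange timezone


def session_open_buggy(date):
    """Buggy: assumes New York is always UTC-5, ignoring daylight saving."""
    return datetime(date.year, date.month, date.day, 9, 30,
                    tzinfo=timezone(timedelta(hours=-5)))


def session_open_correct(date):
    """Correct: build the local exchange time, then convert to UTC."""
    return datetime(date.year, date.month, date.day, 9, 30,
                    tzinfo=NY).astimezone(timezone.utc)


winter = datetime(2023, 1, 16)  # EST in force: both versions agree
summer = datetime(2023, 7, 17)  # EDT in force: buggy version is an hour late

print(session_open_buggy(summer) - session_open_correct(summer))  # 1:00:00
```

A model fed timestamps from the buggy version would see the summer session open an hour after it really did, which is exactly the kind of error that only domain knowledge, not more modelling horsepower, tends to catch.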
The point, he said, is that firms cannot simply hire data scientists to extract value from the increasingly democratised data in the market: without domain knowledge, those hires tend to waste both time and money, which weakens the case for in-depth data analysis at many firms.
The panellists then went on to discuss who the winners and losers will be in an FX market where data is increasingly democratised. When Amjad was questioned about whether this democratisation truly levels the playing field for firms competing in FX, he responded that there will continue to be tiers of different capabilities amongst market participants, but that they might start to find their edge in different places rather than their data.
“Part of Thomas Kuhn’s paradigm shift theory states that you have this build-up of momentum to a point where suddenly there’s an explosion and a paradigm shift whereby the established tiers at that point in time collapse. In this case, the operational barriers and the barriers to accessing data collapse, but then that becomes the new normal and people automatically start finding ways to differentiate themselves and the tiers just appear in a new form until the next paradigm shift,” said Amjad.
Following on from this, Ashworth made the case that in the emerging paradigm in FX, power is shifting to the firms that own the data. In particular, he pointed to the recent M&A activity that has seen large exchange groups buying OTC FX platforms, claiming that the exchanges weren’t doing this because of the current values of the brokerage revenues associated with these acquisitions, but because of the data and market footprint associated with them. However, he was also quick to emphasise the power that the banks retain simply by virtue of the credit structure of the FX market.
“The bank’s position in the credit pyramid, daisy chaining from the very tiny customer at the bottom to large, sophisticated customers at the top, is intrinsically valuable to the bank. And while disruptors in the credit space will have a different view on this, I think that it will take ages for this structure to change. So yes, the power will be with the data aggregators and yet the banks will still have incredible power, not just because of their balance sheet and their brand, but just because of the position they occupy in the credit structure,” Ashworth said.
Amjad commented that, from his perspective, there are both positives and negatives to the consolidation of data amongst these trading venues. On the one hand, for trading firms like his, there is more uniformity, easier negotiation and onboarding, and simpler handling of the data from these venues. On the other hand, he said that no one wants to live in a seller’s market: the fewer firms there are offering any given service, the more fragile and brittle the system becomes.
Ralich noted that it will be interesting to see if, as the exchanges evolve their consolidation models and absorb these OTC FX platforms, they are going to be able to derive unique and valuable insights from the data that they acquire along with them.
“These exchanges previously controlled a monopoly and their businesses evolved into selling the data that came from this, which by its nature is going to have all the information available for that specific exchange or equity class. In FX, the inputs into the system, the firms that are providing liquidity into these platforms, are very consistent between them but the constituents on the taker sides of these platforms are very different,” he said.
Ralich continued: “So buying 360T gets you a data set that’s very interesting about the corporate side of this world, but it’s not all-telling and that data might not be interesting to somebody else, which is unlike in the equities world. So it’ll be interesting to see how these different fragmented data sets that they’ve acquired are either combined or whether they realise that maybe there is a limit to that consolidation because fragmentation exists in this space for a reason.”