From origination through trading to settlement, data has revolutionised the foreign exchange industry more than any other single factor, argues Colin Lambert.
“Even 10 years ago I probably heard the word ‘data’ once a month,” says Kate Lowe, head of trade services at State Street. “Now? It’s 40 times a day. As an industry we are obsessed with the information and data – it’s the lifeblood of our business.”
For Lowe, the focus on data is driven largely by necessity – “how do you comply with trade reporting requirements if you don’t have the data?” she asks – whereas for others it offers protection, opportunity and validation. “Data allows us to identify weaknesses in our business model, improve our pricing and show our clients that we have done a good job,” says the head of FX at a bank in London. “It also allows us to point out, empirically, how the client’s trading style is influencing how we service them. Data has empowered a new era of open dialogue – you can’t argue with the numbers.”
Hand in hand with data is a more recent phenomenon in FX markets: the use of artificial intelligence (AI). It is hard to get through any industry presentation these days without multiple references to AI, machine learning or deep learning. There is no doubt, however, that AI is empowering the next generation of service in a way that it couldn’t previously. “We have technology-empowered staff now, backed up by AI processes that can improve productivity for the bank and client as well as help those people manage broader client portfolios,” says George Athanasopoulos, global head of FX, rates and credit at UBS. “The last three-to-four years have been pretty steady in terms of the human resources we have, but we are giving them better technology to work with. There is always the thought that if your automation works well you can cut human resources, but we find clients still like to talk to someone, especially if things go wrong, so we want to use data and AI techniques to support that service.”
On the trading side, data has long been the fuel in the tank for so many firms. When EBS launched EBS Live in 2004, there was a spate of “we have seen the future” comments from trading businesses, although it was really only the stat-arb-type firms that fully embraced it to start with. The intervening years have seen data pulses drop from one per second to one per five milliseconds; they have also witnessed the growth of data providers.
“As the market became more electronified, as more data got captured, that data not only became more valuable but it became more expensive because you had these firms that were running systematically and they needed to gobble up as much data as they could possibly get their hands on,” says Jason Woerz, president of 24 Exchange. “Other firms wanted every data source possible and the exchanges started to figure out that data was the next thing to monetise (previously it had been low latency) and they started charging higher fees on that, which led to people’s operating costs going up. Even today, when the cost of an individual data service is probably lower, the sheer scale of what you need means the overall data cost continues to go up.”
While exchanges and other platforms battle to get their data into clients’ systems – at a price – several of those clients are also looking much harder at the possibilities afforded by data. “Clients are consuming more data than ever, both in their day-to-day business as well as to evaluate their service providers,” says Athanasopoulos. “That offers an interesting opportunity for a bank like UBS because it is a de facto data vault – it is one of the reasons we have embedded a data solutions group within our business.
“Clients are concerned about service slippage, especially as the number of pricing and execution venues grows,” he continues. “They have to manage so much more – especially now they have assumed more control of their execution – and it can be a cumbersome process, but by using our data and technology we can help them optimise their own pricing, execution and risk management. Navigating the platform world, where everyone has a different protocol and rulebook, is a very complex issue; it requires skill and innovative thinking as well as great technology, and that is where we can help them.”
A senior FX source also believes data-as-a-service is a coming trend. “AI is already embedded in a lot of the things that are going on today in terms of pricing and order execution methods,” the source observes. “Do I think it’s going to continue? Absolutely. As we begin to understand and be able to deliver data to clients via this data-as-a-service, where clients can reach up into the cloud and pull back a lot of this information, the power of their execution increases tremendously.
“Obviously data’s a central theme here because any kind of AI tool is underpinned by data, that’s what they run on,” the source adds. “We are likely to see a shift in the technology arms race away from speed of execution, which is already quick enough for most clients, towards faster data consumption and processing.”
A further increase in data speeds would not be universally welcome, however. “Faster data encourages latency traders who don’t have to spend valuable money to maintain a control framework across a global organisation,” says the head of e-FX trading at a bank in London. “It would see market quality decline further.”
Chris Purves, head of UBS’s FRC Strategic Development Lab, is also wary of a race to make data faster. Noting that more equities markets are looking at establishing speed bumps in their platforms, he says, “If there was one genie I would like to put back in the bottle it would be high frequency data. I am not sure what value it adds. In fact, I don’t think it helps anyone, it’s just an unnecessary cost, a point of friction that makes little or no difference to a business. The higher cost of this data just moves around the system and comes out in the form of higher costs to clients.”
Woerz, meanwhile, believes the cost of data is going to become, along with collateral and margining costs, one of the biggest, if not the biggest, costs in a trading operation. “It is definitely something the modern day COO of trading needs to be looking at, ways to reduce that overhead,” he suggests. “Having said that, people will pay for what it’s worth. They will not blindly pay more year after year though, which is how I feel the exchanges see it.
“Not only are the exchanges raising market data fees, they’re creating structures by which you need to almost play ball with the biggest players in the world,” he continues. “It’s similar to your cable television provider telling you that ‘oh you want to get rid of this obscure channel that the Teletubbies are on, ok well there goes your HBO along with that because that happens to be packaged with it’. That’s very similar to what these exchanges are doing right now, they’re forcing you into large, enterprise-wide models even though you’re not using a lot of the data and then they’re coming in and auditing you and charging even more.
“Also, the service hasn’t gotten any better, so it doesn’t make any sense to me at all. If the value of data stays static, the price cannot continue to go up,” says Woerz.
There seems little doubt that data will continue to dominate business managers’ thinking for some time, but there is also the sense that consuming data by the zettabyte will not, by itself, deliver results – it needs to be used carefully and innovatively, especially if markets generally embrace the move to put ceilings on the speed of trading. The good news is that the technology required to support a trading business is more accessible than ever, as Purves notes, “While the biggest change in the focus of technology over the past three or four years is the embracing of machine learning, I don’t think it has been driven by competition or the banks waking up to the potential; it’s just that this has been the timeframe in which the technology became available. We all know you can’t do a good job of pricing without the support of good AI and technology, but if you use these assets well you can differentiate yourself.
“The advent of AI and big data has led to a skill set change in the Markets business. Maybe four years ago we weren’t even looking for people with the skills to work in these fields – maybe the high frequency players were, but the banks certainly weren’t,” he adds. “The technology has changed everything because it used to be that you had to code everything by hand; now the technology is available to use and almost anyone can code using machine learning.”
Twenty years ago, data was largely delivered by a heavy manual process – either it was the voice desks pricing trades or it was indicative quotes flashed on a screen. Now data is the ubiquitous oil that drives the engines of markets and it has, by general consent, made markets more efficient and more transparent. Perhaps the big question facing the industry over data is a familiar one to trading businesses that faced the same issue a few years ago – how much is too much? Is there a ceiling above which the value of the data not only starts to diminish, but where it also becomes detrimental?