The Challenges of Managing Risk in Machine-Driven Markets

The increasing automation of the FX markets is changing how firms manage their risk, said speakers at the Profit & Loss Scandinavia conference in Stockholm in October 2018.

Marian Micu, director of quantitative research at Qube Research and Technologies, highlighted the emphasis on new technology and automation in trading by pointing out that five years ago he had no machine learning specialists on his team, whereas now more than half of his team are machine learning traders or researchers.

Micu went on to explain that in the past, market-moving news mainly came from scheduled announcements by sources such as the European Central Bank (ECB) or the Federal Open Market Committee (FOMC), and that such announcements could be closely monitored and understood by a team of human traders and then integrated into trading strategies.

“What we observe now is that many announcements come from social networks, from tweets and venues that we were not aware of in the past, and the speed of this news has significantly increased. So we have had to adapt to this, we’ve created machine learning methods that use natural language processing that can read the news in microseconds and then transform that information into a trading signal that will allow us to avoid getting caught market making on the wrong side of the trade,” he said.
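As a purely illustrative sketch of the kind of pipeline Micu describes, the snippet below scores a headline with a hand-built keyword list standing in for a real natural language processing model, and maps the score to a market-making action. The keywords, threshold and actions are all hypothetical.

```python
# Illustrative sketch only: a toy news-to-signal filter, with a crude
# keyword score standing in for a trained NLP model. Keywords, threshold
# and the action names are assumptions, not anything Micu specified.

HAWKISH = {"hike", "tighten", "inflation", "taper"}
DOVISH = {"cut", "ease", "stimulus", "recession"}

def headline_score(headline: str) -> int:
    """Crude directional score: +1 per hawkish token, -1 per dovish token."""
    tokens = headline.lower().split()
    return sum(t in HAWKISH for t in tokens) - sum(t in DOVISH for t in tokens)

def quote_action(headline: str, threshold: int = 2) -> str:
    """Map the score to a market-making action, so the desk is not caught
    quoting the wrong side when impactful news breaks."""
    score = headline_score(headline)
    if abs(score) >= threshold:
        return "pull_quotes"      # news is strong enough to step aside
    elif score != 0:
        return "skew_quotes"      # lean prices in the signal's direction
    return "quote_normally"

print(quote_action("ECB signals surprise hike to tighten policy"))  # pull_quotes
```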

Micu continued: “Another aspect to consider is that as everything becomes automated, there is always a risk of something going wrong. There is always a trade-off between speed and risk, because the more controls you put in place, the more it slows down the computer and the algorithm. So how do you solve this trade-off? We decided to diversify the sources of risk by pulling back a little bit from the market making and diversifying into other systematic proprietary trading like increasing the size of statistical arbitrage, increasing the size of other trading strategies or diversifying risk through other venues rather than racing for speed.”

Tyler Moeller, CEO of Broadway Technology, added that in addition to the speed of transactions increasing, the speed of innovation and change in the FX market has also increased. He said that innovation and change bring their own risks, and that there is fundamentally a cost of error to doing things faster; therefore, when introducing new technology, firms need to evaluate and quantify the risk associated with it.

“Systems will have errors at some point, and if we want to have accelerated innovation and accelerated transaction speed and the other things we’re looking for, it does mean that we have more risk of errors,” said Moeller. “There’s some places where those errors will cost a lot more. If you’re providing market making capabilities and you are required by a regulatory body or your own business to always be in the market, being out of the market might have a very high cost. But there’s other instances where a firm might be algo trading and the cost of being out of the market or making an error is significantly less. So firms need to evaluate that cost versus the risk associated with the change they’re making.”

There is also a liquidity risk associated with increased automation, according to Pär Hellström, senior quant trader, FX trading, at Swedbank Markets.

“If you look at the models we’re building, we’re all fairly bright people and so we’re building the same models. And these models are looking at the same data: we all have access to EBS, Reuters and the other platforms, so we’re generally seeing the same quotes. This means that we’re roughly building the same models that are reacting to the same things, so when the shit hits the fan and everyone runs for the door, everyone is making the same decision and that squeezes liquidity. And we’ve seen this happen with the SNB event and the flash crash, all these models were doing the same thing. That’s a little bit scary to my mind,” he said.

Models at risk

When pressed on his biggest concern in today’s machine driven FX market, Micu pointed to alpha erosion, emphasising that this can happen much more quickly now than in the past. To combat this, he explained that his team now tries to identify alpha through new factors and deep learning algorithms that are harder to predict and have a very low correlation to the more traditional factors being looked at by trading firms.

“There’s a model risk that’s changing all the time,” he concluded.
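One way to read Micu’s point is as a correlation screen on candidate factors. The sketch below, with randomly generated data standing in for real factor returns and an assumed 0.2 correlation cap, accepts a new factor only if it is weakly correlated with the traditional factors already being traded.

```python
# Hypothetical sketch of a factor-diversification screen. The data is
# random noise purely for illustration; the 0.2 cap is an assumption.
import numpy as np

rng = np.random.default_rng(0)
traditional = rng.standard_normal((1000, 3))   # e.g. carry, momentum, value
candidate = rng.standard_normal(1000)          # returns of the new factor

def is_diversifying(candidate, factors, max_corr=0.2):
    """Reject the candidate if it co-moves too strongly with any known factor."""
    corrs = [abs(np.corrcoef(candidate, factors[:, i])[0, 1])
             for i in range(factors.shape[1])]
    return max(corrs) < max_corr

print(is_diversifying(candidate, traditional))  # True for independent noise
```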

Moeller then observed that while big market risk events tend to grab the headlines and therefore sometimes dominate people’s concerns, there can actually be a huge amount of cost to firms in their day-to-day operations if they’re not thinking about how well their systems are executing.

“If you have a system that’s pricing out to your customers, maybe you have an older style of relationship pricing based off managing fixed margins over the current market rate. In the automated world, is that margin, is that strategy, is the way that you’re actually looking at the market and your customer base really making you money, or is it actually just slowly losing a bunch over a long period of time?” he said.

He continued: “Something that we’re absolutely seeing across our customer base is people looking at data in a much more holistic way and realising that the risk of automation is not just a sensationalised risk. It’s about making sure that they’re making good decisions on the many trades that they’re doing day-to-day in average market conditions.”
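A rough, hypothetical illustration of the day-to-day check Moeller is pointing at: even when every trade captures a fixed margin over mid, short-horizon markouts can show the flow systematically moving against the desk. The numbers below are invented.

```python
# Toy markout analysis: fixed margin captured at trade time versus the
# market's subsequent move against the position. All figures are made up.
trades = [
    # (margin captured at trade time, adverse market move over next 5s)
    (0.8, 0.3), (0.8, 1.2), (0.8, 0.9), (0.8, 1.5), (0.8, 0.2),
]

capture = sum(margin for margin, _ in trades)
markout = sum(move for _, move in trades)
net = capture - markout
print(f"gross margin {capture:.1f}, adverse markout {markout:.1f}, net {net:.1f}")
# net < 0 means the fixed-margin strategy is slowly bleeding money
```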

The panellists also touched on the challenges associated with trading on news in the “#fakenews” era, with Micu arguing that although the potential to occasionally receive incorrect information from data sources is a concern, it is outweighed by the benefits of getting this information faster.

“When we started analysing Twitter data, much of it wasn’t useful for us and there was a lot of noise. So we started filtering it more based on who is issuing the most important tweets and emphasised the information coming from central bankers and presidents, etc, and then we traded on those tweets. They have proven to be extremely relevant and an extremely good source of alpha for our algorithms. We also have access to new data that we didn’t have in the past, for instance we have satellite data to analyse different commodity markets and get information in advance of it being issued via traditional methods and agencies. I think that this is an unavoidable development and that in the next couple of years, most firms will move in this direction,” he said.
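A toy version of the filtering Micu describes might look like the snippet below, which keeps only tweets from a whitelist of high-impact accounts before any signal generation. The account names and tweet structure are made up for illustration.

```python
# Hypothetical source filter: only tweets from accounts deemed market-moving
# are passed on to the NLP models. Account names and fields are invented.
HIGH_IMPACT_ACCOUNTS = {"ecb", "federalreserve", "potus"}

tweets = [
    {"author": "ecb", "text": "Rate decision at 13:45 CET"},
    {"author": "random_user", "text": "I think rates will go up!!"},
    {"author": "potus", "text": "Tariffs on imports effective Monday"},
]

tradeable = [t for t in tweets if t["author"] in HIGH_IMPACT_ACCOUNTS]
for t in tradeable:
    print(t["author"], "->", t["text"])   # only these feed signal generation
```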

Moeller broadly agreed with Micu’s assertion, noting that “more information is generally going to be better”.

He then added: “I think there’s a trend that’s actually winding back a little bit, which was to immediately act on the data as much as possible. What we saw in the equities market was a race to be so microsecond, so nanosecond in executing on every piece of information, but what we’re now seeing with the customer base that we interact with is that they want to consume a lot of data but not necessarily act on all of it immediately.”

Giving an example of this, Moeller said that while internalising flow is obviously desirable for most firms, it can be beneficial for them to adjust when they exit some of that internalisation based on signals from other data sources.

“That’s a way you can actually use really fast signals to make, potentially, slightly slower-moving decisions that might actually be smarter and end up making you more money. I think there is a problem with some of the immediacy of decision making, for example, the immediate solution that people often reach for is simply to pull their orders and kill everything, which is why we’ve seen some of these flash crashes. So I think that the best approach is to consume all that data, but then be really, really smart about the decision making that happens off the tail end of it,” he said.
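One hypothetical way to implement this “fast signals, slower decisions” pattern is sketched below: tick-speed toxicity estimates are ingested continuously, but the internalisation stance only changes when a smoothed average crosses a threshold. The class name, window size and threshold are assumptions, not anything the panellists specified.

```python
# Sketch only: consume fast per-tick signals, but act on a smoothed view.
from collections import deque

class InternalisationGovernor:
    def __init__(self, window=100, exit_threshold=0.7):
        self.signals = deque(maxlen=window)   # rolling buffer of fast signals
        self.exit_threshold = exit_threshold

    def on_fast_signal(self, toxicity: float) -> str:
        """toxicity in [0, 1]: per-tick estimate that the flow is adverse."""
        self.signals.append(toxicity)
        avg = sum(self.signals) / len(self.signals)
        # Decide slowly: act on the smoothed average, not the raw tick.
        if avg > self.exit_threshold:
            return "exit_internalisation"
        return "keep_internalising"

gov = InternalisationGovernor(window=5)
for tox in [0.2, 0.9, 0.8, 0.9, 0.95]:
    print(gov.on_fast_signal(tox))   # flips to exit only once the average climbs
```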

Trusting the machines

Knight Capital, which in 2012 lost $440 million in a roughly 45-minute period due to new trading software that it introduced, is often held up as the poster child of what can go wrong when trading in a highly automated environment. But interestingly, the panellists said that this example perhaps suggests more automation, rather than less, is the key to preventing similar occurrences in the future.

“If Knight Capital literally just had a system that was watching order flow over a five-minute window versus the past 30-minute window, it would have suddenly seen an anomalous change and they would have detected way earlier that there was a potential problem. The machine didn’t necessarily have to resolve that problem, it didn’t necessarily even have to shut the system off, but it could have at least alerted someone,” said Moeller. “This actually goes back to my earlier point about recognising where you are trying to put your risk. Knight Capital building more innovative models was probably a good thing and a good business model for them, it’s how they were going to make more money. But when they were putting software changes into their system, they should have recognised that it was a riskier portion of their system and therefore put in a more tested, less frequently released control system that was watching it, and had a human who knew how to interact with the system in case it told them there’s a problem.”
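A minimal sketch of the monitor Moeller describes, comparing order flow in a short recent window against a longer baseline and alerting a human rather than killing the system, might look like this. The window sizes and alert threshold are illustrative assumptions.

```python
# Dual-window order-flow anomaly check: alert when the recent run rate far
# exceeds the longer-term baseline. Threshold and windows are illustrative.
def flow_anomaly(orders_last_5min: int, orders_last_30min: int,
                 ratio_threshold: float = 3.0) -> bool:
    """Alert if recent flow far exceeds the longer-term average run rate."""
    if orders_last_30min == 0:
        return orders_last_5min > 0
    baseline_per_5min = orders_last_30min / 6   # 30 minutes = six 5-min slots
    return orders_last_5min > ratio_threshold * baseline_per_5min

# Normal conditions: recent flow matches the baseline run rate.
print(flow_anomaly(1_000, 6_000))     # False
# Runaway system: an order of magnitude more flow in the last five minutes.
print(flow_anomaly(60_000, 90_000))   # True -> page a human, don't just kill
```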

Likewise, Hellström said that, for all the new concerns about today’s automated environment, history suggests that the risks facing firms might have been greater when there was more human involvement in trading.

“If you look at a list of the largest trading losses on Wikipedia you find out that Knight Capital, which has been brought up a few times on this panel, just barely makes it into the top 30. Topping the list are the credit default swaps, where the losses were about $9 billion, compared to half a yard in the case of Knight Capital. So, can you trust the human? I would rather trust the machine,” he said.

Galen Stops
