Regulators Must Keep Up with Fast Markets: BoE’s Salmon

In a speech on Friday, Chris Salmon, executive director, Markets, at the Bank of England, discussed changing market microstructure, in particular the advent of “fast markets”, and stressed that it is “incumbent” upon the authorities to keep up.

Salmon highlighted three recent flash events in financial markets: the equities flash crash of 2010, the US Treasuries flash rally in 2014 and last year’s Cable flash crash. While he observed that sharp moves in asset prices are nothing new, he argued that “the speed, and the typical near-total reversal” of these episodes is.

“What is clear is that flash crashes are likely just one symptom of material changes in the structure of certain markets and the nature of their participants,” he said. “Although these changes are ongoing, we need to understand them and their drivers, if we are to succeed in correctly identifying any implications for financial stability.”

In line with general thinking on the issue, Salmon noted that the three main drivers of the shift to fast markets were e-trading, better data and regulation; however, he also observed that one result of this change was less need to warehouse risk. A greater number of participants meant it was easier to find a “near instant match”, Salmon observed, and he identified the rise of non-bank market makers as a consequence. Another impact was that some banks, unable to compete with the smaller, nimbler firms, had shifted to an agency model – acting as something of a credit intermediary between end-user clients and the non-bank firms.

Although Salmon highlighted the benefits of the presence of high frequency traders – noting that empirical research has found that, on average, the presence of these firms has been associated with improved headline measures of liquidity, at least for trading in small size – he also noted concerns over the increased transparency of orders and the resulting slippage.

Discussing the benefits from the microstructure changes, he said, “These are potentially important benefits. The ability to undertake transactions at – or close to – prices that reflect economic fundamentals facilitates the proper allocation of capital, as well as the management and transfer of risk. It also gives borrowers the confidence to plan, and savers to finance, productive investment. Efficient markets also allow for the transmission of monetary policy by allowing changes in policy interest rates to be reflected quickly across financial markets and assets.

“The rise in automated high-frequency trading has, however, also increased the incentive for market participants to protect information that could signal their trading intentions,” Salmon continued. “This is to reduce the risk of being disadvantaged by trading with other – faster-moving – market participants, and thus receiving a worse price (a phenomenon sometimes referred to as ‘slippage’).”
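Salmon did not put numbers on slippage, but the mechanics are easy to sketch. The snippet below is a minimal illustration – the helper function and the prices in it are invented for this article, not drawn from the speech – measuring slippage as the signed gap between the price a trader commits at and the price actually received:

```python
# Minimal sketch (hypothetical numbers): slippage as the gap between the
# price observed when an order is sent and the price at which it fills.

def slippage_bps(decision_price: float, fill_price: float, side: str) -> float:
    """Slippage in basis points relative to the decision price.

    Positive values mean the fill was worse than the price the trader
    saw when committing to trade.
    """
    sign = 1.0 if side == "buy" else -1.0
    return sign * (fill_price - decision_price) / decision_price * 1e4

# A buyer sees 1.2500 on screen, but a faster participant trades ahead
# and the order fills at 1.2503: three pips of slippage, about 2.4 bps.
print(slippage_bps(decision_price=1.2500, fill_price=1.2503, side="buy"))
```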

This interaction between speed and the protection of trading information has, Salmon noted, manifested itself in two observable trends. First, there has been a partial reversal in the trend toward greater price transparency as market participants embrace other forms of trading, which differ from traditional exchanges by offering less price transparency, a narrower range of counterparties, or both.

Second, market participants’ desire to avoid revealing information on their trading intentions while seeking the best price has led them to split up large orders, including via the use of algorithms. “This can be seen in a reduction in trade sizes, as well as an increase in the rate at which orders in some markets are updated or cancelled,” he added. “Here at the Bank we want end-users to both benefit from and be confident in the effectiveness of financial markets. So whilst these behaviours are presumably rational – and cost-effective – for individual market participants, we are mindful that an aggregate reduction in transparency has the potential to hamper efficient price discovery.

“Moreover, the steps some participants are taking to conceal information raises questions about how the aggregate efficiency benefits from faster markets are shared,” he warned. “Finally, the occurrence of flash crashes indicates that, even if fast market prices are more efficient in general, they are not always so.”
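The order slicing Salmon describes is simple to picture in code. The sketch below is purely illustrative – a bare-bones, time-weighted (TWAP-style) slicer with invented parameters; real execution algorithms randomise slice sizes and timing precisely to avoid the information leakage discussed above:

```python
# Hedged sketch of the order-splitting behaviour described in the speech:
# a simple TWAP-style slicer that breaks a parent order into child orders
# spread evenly over an interval. Deliberately bare; not any firm's logic.

from dataclasses import dataclass

@dataclass
class ChildOrder:
    offset_seconds: float  # when to release the slice
    quantity: int          # size of the slice

def twap_slices(parent_qty: int, horizon_seconds: float, n_slices: int) -> list[ChildOrder]:
    """Split parent_qty evenly across n_slices released over horizon_seconds."""
    base, remainder = divmod(parent_qty, n_slices)
    interval = horizon_seconds / n_slices
    return [
        ChildOrder(offset_seconds=i * interval,
                   quantity=base + (1 if i < remainder else 0))
        for i in range(n_slices)
    ]

# A 10m parent order sliced into 20 children over one hour: each child
# is small enough to sit inside typical top-of-book size.
for child in twap_slices(parent_qty=10_000_000, horizon_seconds=3600, n_slices=20):
    print(child)
```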

Salmon summarised by suggesting that the headline conclusion that fast markets are more efficient should be tempered, because, “as is often the case, the story here is somewhat nuanced”.

Salmon noted that reduced reliance on risk warehousing meant that, during the 2008-09 financial crisis, stresses were concentrated in those markets that had relied on dealer balance sheets. The question of intervention by the authorities in equity or FX markets simply never arose, he said.

And while he accepted that flash crashes are headline-grabbing, and occasionally sleep-denying, he noted that they have so far had limited systemic impact, adding, “I doubt we can fully understand what conditions may trigger similar future events or completely anticipate how they might unfold.”

Interestingly, Salmon pointed out that future flash episodes “may interact with aspects of financial market infrastructure in a way that gives rise to longer-lasting disruption. Suppose, for example, that a future flash episode happened to coincide with benchmark fixings in foreign exchange markets, or a margin call related to equity or derivative markets. The resulting impact on the recorded values of a range of assets might risk mechanically prompting further sales and price falls.”
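That feedback loop – a flash fall triggers a margin call, forced sales deepen the fall – can be made concrete with a toy model. Everything in the sketch below (the margin rate, the price-impact assumption, the starting numbers) is hypothetical, chosen only to show the mechanism Salmon describes:

```python
# Toy model of a margin-call spiral: a flash price fall creates a
# variation-margin shortfall, forced sales cover it, and the sales
# themselves push the price down further. All numbers are hypothetical.

def margin_spiral(price: float, position: float, equity: float,
                  margin_rate: float, impact_per_unit: float, rounds: int):
    """Run a few rounds of margin-call-driven forced selling."""
    for r in range(rounds):
        required = margin_rate * position * price  # margin the position must carry
        shortfall = required - equity
        if shortfall <= 0:
            break  # no call: the position is fully margined
        # Sell just enough to bring the required margin back to equity...
        sold = shortfall / (margin_rate * price)
        position -= sold
        # ...but the forced sale itself pushes the price lower...
        price_drop = sold * impact_per_unit
        price -= price_drop
        # ...which marks down the remaining position and erodes equity again.
        equity -= position * price_drop
        print(f"round {r}: sold {sold:,.0f} units, price {price:.3f}, equity {equity:,.0f}")
    return price, position, equity

# A flash move has already knocked the price from 100 to 95, leaving a
# 1m-unit position margined at 10% with 9m of equity against it:
margin_spiral(price=95.0, position=1_000_000, equity=9_000_000,
              margin_rate=0.10, impact_per_unit=2e-6, rounds=5)
```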

As well as calling for regulators and market authorities to better understand the risks in modern markets, Salmon also argued that fast markets alter the pattern of risks that individual market participants are exposed to. “There are good grounds for concluding that more needs to be done to ensure market participants take account of this consistently,” he stressed.

“Use of algorithms does not change the fundamental risks associated with trading in financial markets (e.g. market, counterparty and operational risks), but use of high-frequency trading algorithms does change their relative intensity and can materially increase the potential to build up significant intraday positions,” he said. “This latter point is most obviously true for those bank and non-bank intermediaries that specialise in high-frequency trading. And though these firms should be expert in managing such risks, history shows this is not a given.”
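Managing that intraday build-up is, at its simplest, a bookkeeping problem. The sketch below is deliberately minimal – the class, the limit and the fills are all invented here – showing a running monitor that nets fills in real time and flags a breach of an intraday position limit:

```python
# Minimal sketch of intraday position monitoring: net each fill into a
# running position and flag breaches of a per-symbol limit. The limits
# and the trades are invented for illustration.

from collections import defaultdict

class IntradayPositionMonitor:
    def __init__(self, limits: dict[str, float]):
        self.limits = limits
        self.positions: defaultdict[str, float] = defaultdict(float)

    def on_fill(self, symbol: str, signed_qty: float) -> None:
        """Net the fill into the running position and check the limit."""
        self.positions[symbol] += signed_qty
        limit = self.limits.get(symbol, float("inf"))
        if abs(self.positions[symbol]) > limit:
            # A real system would alert, throttle, or hit a kill switch here.
            print(f"LIMIT BREACH {symbol}: {self.positions[symbol]:+,.0f} vs limit {limit:,.0f}")

monitor = IntradayPositionMonitor(limits={"GBPUSD": 25_000_000})
for qty in (10_000_000, 12_000_000, 8_000_000):  # rapid same-way fills
    monitor.on_fill("GBPUSD", qty)  # the third fill takes the net long past the limit
```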

Salmon continued by noting that end-users – corporates and asset managers – are increasingly splitting up larger trades into smaller pieces and trading them over a longer period. “Doing so exposes them to more execution risk; that is, the risk that prices move against them before they have finished transacting – which traditionally was more the preserve of intermediaries,” he warned. “Some end-users are also now automating this trading process through the use of algorithms [and] in so doing they swap one set of risks for another, and the examples of losses by specialist firms in periods of fast market turbulence caution against assuming that all end-users will effectively manage the new risks associated with the use of algorithms.”
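The execution risk Salmon flags has a standard yardstick: implementation shortfall, the gap between the volume-weighted price actually achieved and the price prevailing when the order arrived. The fills in the sketch below are invented for illustration:

```python
# Sketch of measuring execution risk on a sliced order: implementation
# shortfall as the volume-weighted fill price versus the arrival price.
# The fills (quantity, price) are hypothetical.

def implementation_shortfall_bps(arrival_price: float,
                                 fills: list[tuple[float, float]],
                                 side: str) -> float:
    """Volume-weighted fill price versus arrival price, in basis points."""
    qty = sum(q for q, _ in fills)
    vwap = sum(q * p for q, p in fills) / qty
    sign = 1.0 if side == "buy" else -1.0
    return sign * (vwap - arrival_price) / arrival_price * 1e4

# A buy order worked in three slices while the price drifts higher:
fills = [(3_000_000, 1.2501), (3_000_000, 1.2504), (4_000_000, 1.2509)]
print(implementation_shortfall_bps(arrival_price=1.2500, fills=fills, side="buy"))
# ~4.1 bps of shortfall: the cost of trading slowly rather than all at once.
```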

In addition to the users of modern technology, Salmon also highlighted the need for banks’ prime brokerage and clearing businesses to maintain substantial investment in technology and infrastructure, including that to facilitate the real-time monitoring of exposures and risk management. “The costs associated with this are high, and serve as an effective barrier to entry,” he observed. “As few banks provide these services, this leads to a concentration of nodes of market access for short-term liquidity providers.

“The changing role of banks/dealers in fast markets prompts the question whether the nature of potentially disruptive risks is also changing. One obvious concern would be if a prime broker or clearing bank was paralysed, including for reasons unconnected to its activity in fast markets – say because of a cyber attack. I think it is fair to say that our understanding of how market functioning would respond in such a scenario – i.e. one in which a number of, in particular high-frequency, liquidity providers were denied market access – remains relatively limited.”

Salmon concluded by noting that if there is a common theme across all of his examples, “it is the limits of our understanding”. But while he reiterated the need for the authorities to better understand fast markets, he also argued that it is better for them to “remain vigilant and deepen our understanding, so that we can take appropriate action in the future if required – be that from a macro-prudential or supervisory perspective”.

He added, “For the authorities, we need to dig deep to understand what this means for the financial system as a whole: both to appreciate the benefits and to remain vigilant as to the risks. It is important for us to ensure that regulation – both of individual participants, market infrastructure and the financial system as a whole – keeps pace with the changes in the type and distribution of risk. Only then can it provide an adequate guard against risks to prudentially regulated institutions and their counterparties.”

Colin_lambert@profit-loss.com

Twitter @lamboPnL

Twitter @Profit_and_Loss

Colin Lambert
