Have we finally hit the wall in terms of our willingness, as a broad financial markets industry, to accept the continued race to cut a few microseconds off the speed of trading? I suspect we may have, for as P&L editor Galen Stops’ articles on speed bumps in the listed derivatives world have highlighted, this is a complex subject, and one that was previously anathema to a market that has embraced speed from day one.
So why is a world that has so often been at the cutting edge of technology when it comes to trading now looking to slow things down – and into the bargain take a solution from FX markets? More specifically, is it right to do so?
First up, and as someone who likes to call out spin, with my sceptic’s hat on (is there any other type to wear?) I would argue that it is nonsense to say that imposing speed bumps doesn’t target particular customers – it does. There are firms out there who rely upon speed for their modus operandi and they will inevitably be negatively impacted by anything that curbs that advantage – it is up to the individual to decide whether this is good or bad for the market structure. Personally, I think it comes down to why a trader is, to use an old expression, sniping the LP. If they are trading on latency just to make a quick turn, then I don’t see how it adds to market quality; if, however, they are just hitting best price to enter or exit risk, then it’s a consenting adults’ world and there’s nothing wrong with it. The speed bump, without doubt, targets the former strategy.
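The mechanics can be illustrated with a toy model. The sketch below is purely hypothetical – the class name, the single-quote book and the 3,000-microsecond delay are my own assumptions, not any exchange’s actual design – but it captures the asymmetry at the heart of the idea: taker orders are delayed on the way in, while the LP’s quotes and cancels apply immediately, so a stale price can be pulled before a latency sniper’s order lands.

```python
class SpeedBumpBook:
    """Toy single-quote book. Taker orders are delayed by bump_us
    microseconds before reaching the book; LP quotes and cancels apply
    immediately. Names and numbers are illustrative assumptions only."""

    def __init__(self, bump_us=3000):
        self.bump_us = bump_us
        self.events = []   # (effective_time_us, seq, kind, price)
        self.seq = 0

    def _push(self, t, kind, price=None):
        self.events.append((t, self.seq, kind, price))
        self.seq += 1

    def lp_quote(self, t, price):
        self._push(t, "quote", price)        # quoting is not delayed

    def lp_cancel(self, t):
        self._push(t, "cancel")              # cancels are not bumped

    def take(self, t, price):
        self._push(t + self.bump_us, "take", price)  # the speed bump

    def run(self):
        """Replay events in effective-time order; return the fills."""
        quote, fills = None, []
        for t, _, kind, price in sorted(self.events):
            if kind == "quote":
                quote = price
            elif kind == "cancel":
                quote = None
            elif kind == "take" and quote is not None and price == quote:
                fills.append((t, price))
                quote = None
        return fills
```

In this toy world, an LP cancel at 2,000 microseconds beats a sniper order sent at 500 microseconds, because the bump pushes the order’s effective arrival out to 3,500 microseconds; set the bump to zero and the sniper fills. A genuine hedger who would have traded that price anyway is barely inconvenienced by the delay – which is precisely the targeting argument.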
The fierce debate around speed bumps in these markets means that we can now add listed markets to the growing list of areas in life that are subject to polarised and entrenched views, none of which will ever be changed. Nowhere is this better demonstrated than by how proponents of the current model continue to argue that there is nothing wrong with the market structure even as firms at the centre of that same market feel the need to make such a potentially large change. I use the word “potentially” deliberately, by the way, because with the sceptic’s hat back in place I am unconvinced by the arguments that this is just an effort to boost liquidity in low-level contracts. If it works well, it will be imposed in bigger markets.
The (hundreds of) million dollar question is, however, will it actually improve market quality?
The argument appears to be that by allowing fast traders to operate as they have been, the exchanges are scaring off other liquidity providers, who are afraid of getting sniped on a price. I suppose the first thing to note is that this is not a new phenomenon in markets – it’s just a different advantage. Back in the days of the Telex (look it up, kids), one party often had an information advantage or was hitting multiple counterparties, which meant the outcome was the same – the LP was given at 20 when it was 19 offered (often much further away!). Now it is the speed of technology that provides the edge.
I was a little surprised to read CFTC Commissioner Brian Quintenz’ statement, when discussing the ICE speed bump proposal, that, “The goal of financial markets is not to protect or shelter the less informed”, because I naively thought that was the job of a regulator – to create a fair market. Quintenz added, “Those that invent, and invest in, faster information transmission technologies to capitalise on market dislocations reap the profits of their advantage. That process enhances market efficiency…”
Putting aside the inference in Quintenz’ statement that there is no place in these markets for retail or less sophisticated punters, I suppose the question in response should be, “exactly how efficient do we want our markets to be?” For if this technological excellence did indeed improve market efficiency, why are markets still inefficient? If they weren’t, then surely these strategies would be obsolete? I would argue that there should be a finite point at which we can agree markets are “efficient” – and it’s not at the speed of light! Nor should it be, I would argue, in microseconds – I fail to see how buying, building or renting microwave towers improves an end user’s life in financial markets. For a trader looking to make a massive number of short-term gains, yes – for an investor or corporate hedger, no.
It could also be argued that the definition of “efficiency” should have nothing to do with speed – to me an efficient market is one in which liquidity is robust enough to enable everyone to execute their interest in a reasonable manner; whether it takes five microseconds or 200 milliseconds is largely irrelevant to the end users of financial markets. In what appears to be Commissioner Quintenz’ ideal world, we should be keener to reward those willing to spend a fortune on getting from five microseconds to one than to protect investors actually trying to do a trade for the purpose of holding risk for more than half a second.
So, to answer the question: will it work? Yes, as long as it enables slower traders, such as asset managers and other hedgers, to actually get a look at the price. If these clients can see a deeper top of book because more LPs are happy to price, then market quality would improve. It is not that easy, of course, because as the FX market has shown, the HFTs will reinvent themselves as market makers and use their speed to ensure they can cancel a quote before being hit. The ideal place for these firms would be a short way down the queue at top of book – someone else takes the pain of being hit by a larger trade and the fast market maker gets the information without the trading pain.
There is also, sadly, the question of some LPs’ attitude to the markets, namely that they should make money on every trade. A deeper top of book could also attract more “sweeping”, especially in a fragmented market, and again a lot of sweep protection is likely to involve speed (and last look, but don’t get me started!); however, too many LPs detest this practice. I think it depends upon the relationship to a degree (which is a lot harder to discern on anonymous venues, of course), but there are undoubtedly participants out there who are predatory and sweep books at will to make a quick turn. How is this different to a latency arber – apart from the size and frequency of the trades involved? The likelihood is that both trading styles end up costing the LPs the same amount of money over time, so it’s a question of how you like your pain – spread out over a longer period, or in short, sharp bursts? The upshot is that the nirvana of more LPs entering the arena because they are not going to be picked off by speed merchants may be a fantasy, if for no other reason than there just isn’t that much profit to go round.
Of course, the easy way to protect against being swept (and into the bargain encourage more LPs into the market) is to limit where you price and only put at risk the total (not last look-adjusted) amount you are actually willing to trade, but don’t get me started on that one either…
So on balance, having read Galen’s articles and spoken to people over the years about the issue, I find that while I remain uncertain about the real benefits of a speed bump, I do like an offshoot of the idea – a latency floor. I believe that Commissioner Quintenz and other supporters of speed trading are misguided in their fervent belief in getting quicker and quicker. The economics of reaching for the speed of light may work for a small number of firms, and that’s fine I suppose, but markets should be for the majority, not the few – and when it comes to the majority there is a widespread acknowledgement that while it’s not OK to have creaking technology that moves about as fast as I do in the morning, there is an acceptable level at which everyone has the same opportunity.
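For illustration, the essence of a latency floor can be sketched as batch-and-shuffle: all orders arriving within the same floor window are treated as simultaneous and processed in random order, so any speed advantage smaller than the window buys nothing. This is a minimal sketch under my own assumptions (the function name, the windowing logic); real venue mechanisms differ in their details.

```python
import random

def apply_latency_floor(orders, floor_us, rng=None):
    """Batch orders into windows of floor_us microseconds and shuffle
    within each window, so arrival order inside a window carries no
    advantage. orders: list of (arrival_time_us, order_id) tuples.
    Purely illustrative, not any venue's actual mechanism."""
    rng = rng or random.Random()
    windows = {}  # window index -> order ids, in arrival order
    for t, oid in sorted(orders):
        windows.setdefault(t // floor_us, []).append(oid)
    processed = []
    for w in sorted(windows):
        batch = windows[w]
        rng.shuffle(batch)  # being a few microseconds faster buys nothing
        processed.extend(batch)
    return processed
```

With, say, a 100-microsecond floor, an order arriving at 1 microsecond and one arriving at 90 microseconds land in the same window and are sequenced at random – which is exactly the “fast enough is good enough” principle: invest in being inside the window, not in shaving the last microsecond.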
Fast enough should be good enough – I suspect what the proponents of speed bumps are telling the world is that they think we have gone beyond that threshold.