At face value, the report that the US government is seeking to change how it releases economic data to markets seems to be yet another attempt to put the high-speed traders in the box seat, but in reality it amounts to rearranging the deckchairs on the Titanic.

Currently, as I have discussed before, journalists are locked in a room cut off from the outside world and provided with the data so they can ensure the eventual release is accurate, add details and, perhaps, provide instant analysis. At the designated hour, the Internet connection is switched on and the accredited news organisations send their stories. A story on Bloomberg News last week suggested that this process is likely to be ended, with the government merely loading the data onto various government department websites – this way, it would end the “unfair advantage” given to the news organisations.

First up, I am trying to work out why accredited news organisations having an “advantage” is a problem. These are not market participants; they do not (and indeed cannot while they are locked in a room without a phone) take market positions on the data. All they do is distribute it to the widest possible audience at the same time. What could be fairer than that?

Well, it could be argued that this system is not, in fact, fair, because not everybody can afford the price of these news services. I have always thought, however, that this is why we pay fees to professional money managers: so that they can pay for the technology to access this kind of information. Retail traders have to accept they are at a disadvantage and stop trying to do something about it. An (admittedly loose) analogy is the lottery; if we choose to play, it is going to cost money, and the more money we invest the higher our chances of winning – so it is with technology and markets at the moment.

The other reason it could be argued the current system isn’t fair is the issue I raised in my aforementioned column in 2013 – news-reading algos mean that high-speed traders, by tagging and mapping to the relevant parts of a release, can trade on the information more quickly.

The fact is, though, technology is now a huge part of financial markets and that particular genie is not going back into the bottle. That said, I struggle to see how shifting data releases from professional news services, with their hard-wired connections to market participants, to a website is going to dilute that speed advantage. Unless, of course, the government is banking on its technology not being able to stand up to the surge in traffic, resulting in the crash of the website and no advantage for anyone!

This seems to be change for change’s sake. Yes, we need innovation and thinkers willing to challenge the status quo, but it doesn’t mean every idea has to be taken up – some things work as well as they can and should be left alone, and this is one such instance in my view.

If the intention is to make access to the data fairer, this idea is doomed to failure – unless the government is willing to get involved in some randomisation. I am not talking about randomising the timing of the releases – the news-reading algos would still gain an advantage – rather, I am wondering whether the releases could be published to the website in a varying format. That way, a news-reading algo would not be able to map to the site as easily and might take time to work out the news (it would probably still be quicker than a human). This would mean, of course, establishing a process aimed at deliberately confusing the algorithms – I am not sure how well that would go down in the very litigious US, but I have a good idea!
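To make that a little more concrete, here is a minimal sketch of what I mean by a varying format – the figures, field names and alternative wordings are entirely my own invented examples, not anything the government has proposed – in which the order of the figures and the phrasing change from one release to the next, so a scraper mapped to last month’s page layout cannot simply reuse it:

    import random

    # Hypothetical release figures -- illustrative numbers only, not real data.
    release = {"non_farm_payrolls": 187000, "unemployment_rate": 3.9}

    # Several equivalent wordings for each figure, so the page a news-reading
    # algo scrapes does not look identical from one month to the next.
    TEMPLATES = {
        "non_farm_payrolls": [
            "Non-farm payrolls rose by {v:,}",
            "Employers added {v:,} jobs over the month",
            "Payroll employment increased by {v:,}",
        ],
        "unemployment_rate": [
            "The unemployment rate was {v}%",
            "Unemployment stood at {v} percent",
        ],
    }

    def render_release(data):
        """Render the release with randomised field order and wording."""
        fields = list(data.items())
        random.shuffle(fields)  # vary the order in which the figures appear
        return "\n".join(random.choice(TEMPLATES[k]).format(v=v) for k, v in fields)

    print(render_release(release))

A human reader would barely notice the difference; an algo tuned to a fixed page structure would have to start parsing the language instead.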

If we are really looking to make markets fairer, there is little point looking at how we release the data. Unless one is of the CFTC Commissioner Brian Quintenz school of thinking and believes that nothing should be done to protect participants from faster traders, one could take a lesson from the FX markets and make it standard on exchanges and the like to have an order randomiser, or speed bump. This would ensure that latency traders at least would be taken out of the game.
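For readers unfamiliar with the mechanism, the idea is simple enough to sketch in a few lines. Every order is held for a small random pause before it reaches the matching engine, so arriving a few microseconds earlier no longer guarantees priority – the delay window below is an assumption for illustration, not any particular venue’s parameter:

    import random

    # Toy model of an order randomiser ("speed bump"): every incoming order is
    # held for a small random pause before it reaches the matching engine, so
    # being a few microseconds quicker on arrival no longer guarantees priority.
    # The 1-3 millisecond window is an assumption for illustration only.
    MIN_DELAY_MS, MAX_DELAY_MS = 1.0, 3.0

    def apply_speed_bump(orders):
        """orders: list of (arrival_ms, order_id); returns ids in processing order."""
        released = [
            (arrival + random.uniform(MIN_DELAY_MS, MAX_DELAY_MS), order_id)
            for arrival, order_id in orders
        ]
        return [order_id for _, order_id in sorted(released)]

    # The latency trader arrives 0.1ms earlier, but after the randomised pause
    # it may well be matched after the slower participant's order.
    print(apply_speed_bump([(0.0, "latency_trader"), (0.1, "slower_participant")]))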

I am, however, confident that my beloved West Ham will win the English Premier League before that happens (yes, for my non-football-aware readers, that means never – and before the rest of you get in touch), so what is there to do? In all honesty, I think the answer is nothing. There has always been an advantage in getting the information first, not just in financial markets, so why do we think we can change it now?

I am firmly of the opinion that speed bumps and the like help build more genuine liquidity pools, but it is equally clear to me that some markets will never be open to this – the HFTs just offer certain venues too much, whether it be an illusion of liquidity or trading volumes and brokerage.

Unless we are willing to deal with the crux of the problem, the race to the speed of light in trading, everything else is just window dressing – and the idea of changing how data is released, to the extent that it will probably shift to much less robust channels, tells me that no-one really wants to tackle the real issue. All they are really talking about here is crippling a news industry business model.

Colin_lambert@profit-loss.com

Twitter @lamboPnL

Colin Lambert
