I played golf the other week, you’ll all be pleased to hear, and it didn’t go too badly. The only real hiccup was when the pro shop directed us to start on the 7th hole; we arrived there to find we were behind three other groups of players – at which point my friend said, “It’s OK, we’ll go over to the 9th and start there”, which indeed we did.
Having hit a sumptuous drive (well, it was reasonably straight and got past the ladies’ tee) I found myself contemplating an approach shot to the green. I looked at my GPS watch, which told me I had 170 yards to go to the middle of the 9th green, selected the club…and hit the ball about 50 yards past the hole into the face of a rock outcrop (which no doubt, this being Australia, was full of things looking to bite me). My friend looked at me and made a comment about club selection, at which point I showed him my watch, which, although we had walked at least 40 yards towards the green, was now telling me I was 210 yards away from the 9th green.
It was at this point that my ‘friend’ (I think he did it deliberately) revealed that we had actually started on the 10th hole, so in reality I was only 120 yards from the hole when I hit my second – no wonder the ball went into snake territory.
I relate this story not only because I escaped with a net par off a bandido handicap, but because it came to mind during a discussion with one of my guests on this week’s podcast, Rob Loft. I don’t think it’s a revelation to anyone in the markets that the data has to be right, but since the podcast was published I have had a few messages and chats about data, including the snippet from one major player that it is looking at cutting the number of data providers it uses, much as some banks have cut, or are looking to cut, the number of platforms they support.
Again, I am not exactly surprised by this, the feeling being, apparently, that the firm concerned “rushed into” snapping up every piece of data available over the past couple of years, only to find, once its analytical capabilities were suitably enhanced, that plenty of it had limited or no value. One or two people I spoke with thought this could mean that the downward pressure on data prices in FX will continue, although there was a sense that those that rise to the top when it comes to data provision could hold their prices in place.
I am less sure of that, because as the FX market continues what looks to me like a journey towards “fast enough is good enough” in terms of trading speeds, we will hit a natural floor that everyone is comfortable with. Data services can then be throttled to that speed, which may – and I stress the word may – still leave time for the recyclers, who take their pricing from the primary venues and publish it elsewhere, thus creating good enough data for other users.
With so much liquidity recycling going on, it must also be hard to discern where the really valuable data is. Some venues are faster than the primaries, for example, but their data is a direct derivative of those primary venues’ – so which one offers the best value? One thing is for sure, most firms will not require both sources, but that doesn’t mean they will get rid of one. As things stand, people tell me it is all about the pricing engines and where the data they prioritise comes from, and while some platforms have a higher profile than they did a few years ago, the primaries remain what is considered the “real” market.
Of course, it may not stay this way forever – as I have noted in this column before, the primary venues’ volume numbers make for pretty grim reading if you’re into trends, and you have to wonder at what stage their pre-eminence is threatened. Certainly a continued decline in ADV, if set against a backdrop of steady turnover on other platforms, would not be a good sign, although it would matter where that turnover comes from. Just as many customers prefer to deal on no last look liquidity, so many LPs like their data from firm liquidity streams – which is why the primaries have held their position of strength.
If the primary venues continue to decline, and other platforms offering firm liquidity can get to the 20 yards a day area (on those firm streams only, no blending!), then the gap becomes much less meaningful and the decision to pay some pretty serious money for a data stream less easy to justify. The interesting period, however, will be the switchover from one to the other – and that is why I suspect that, aside from a few trivial sources, most data streams will not be cut off. If there is a period of indecision over who offers the most robust data stream, then the obvious answer is to take both, because when all is said and done, it must be easier to re-prioritise an existing data stream than re-write the whole infrastructure to take one off and put another on.
It’s a bit like my golf game: I know I could play (a lot) better, but to 99% of people how the ball gets to the hole is of no consequence, it’s the score that matters. So it is with pricing. At the moment everyone wants to hit one down the middle, and the primary venues are the best way to do that, but who can argue it isn’t fun every now and again to go the scenic route?