As more institutions seek to offer algo execution, specifically involving multi-ticket orders, I sense that the approach they will have to take must differ from that of the first movers some years ago. As the next generation comes through, we are likely to see an uptick in volumes executed by these means, which will be good news for everyone involved in the agency algo business.

Proponents of this service would have us believe that algo execution is constantly on the rise; however, finding hard data to back that up is a little tricky. I have recent, personal experience of how difficult it can be to get banks in particular to talk about some of the details behind their algos, and that apparently extends to the kind of volumes being seen.

On background, people tell me that algo execution volumes rose decently in 2017 and early 2018 and then flattened out – they didn’t fall back, which is important – but moving into 2019 the story seems to be, with one or two exceptions, one of continued steady activity rather than growth. My personal theory for the growth in 2017-18 was a surge in banks using algos to execute client business at the 4pm Fix, and if that is correct, the question has to be asked: how successful have algos been away from that specific instance?

Having studied the algos on offer from the banks over several years, and spoken to people about them, it is clear that the strategies themselves – fancy names aside – actually don’t differ that much. The differentiator is in the liquidity available to the algos before hitting the street and in some of the analytics around them – pre- and post-trade, but especially the former. More recently the quality of “in flight” TCA has helped push one or two offerings ahead, but the old saw about past performance not being an indicator of future results still stands firm.
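To make the TCA point concrete, here is a minimal, purely illustrative sketch of the kind of arrival-price slippage number an “in flight” view might recompute as child fills come back. The fill sizes, rates and field names are hypothetical, not any bank’s actual analytics.

```python
# Illustrative sketch only: a simple arrival-price ("implementation shortfall")
# slippage calculation, recomputed as child fills arrive.
# All numbers and field names below are hypothetical.

from dataclasses import dataclass

@dataclass
class Fill:
    qty: float    # base-currency amount filled
    price: float  # fill rate

def slippage_bps(fills: list[Fill], arrival_mid: float, side: str) -> float:
    """Volume-weighted slippage of the fills versus the arrival mid, in basis points.

    Positive means the order paid more (buy) or received less (sell)
    than the arrival mid, i.e. a cost.
    """
    total_qty = sum(f.qty for f in fills)
    if total_qty == 0:
        return 0.0
    vwap = sum(f.qty * f.price for f in fills) / total_qty
    sign = 1.0 if side == "buy" else -1.0
    return sign * (vwap - arrival_mid) / arrival_mid * 10_000

# Hypothetical EUR/USD parent order: buy 50m, arrival mid 1.1000.
fills = [Fill(10_000_000, 1.10005), Fill(15_000_000, 1.10012), Fill(25_000_000, 1.10020)]
running = [slippage_bps(fills[:i + 1], arrival_mid=1.1000, side="buy")
           for i in range(len(fills))]
print([f"{x:.2f} bps" for x in running])  # the "in flight" view: cost so far after each fill
```

The point of surfacing the running number, rather than a single post-trade report, is that it gives the client something to act on while the order is still working.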

Two or three years ago we had a rush to market that saw a surge in the number of firms offering algo execution, and I suspect we are on the cusp of a second wave amongst regional and lower-tier players as they seek to retain key clients – a development that could help push the market away from its current plateau. If that is the case, however, the newcomers will have to take a different approach, for we exist in a different world to that of 2016-17, when what is probably the second generation of algos was being pushed to clients.

To me, the next generation of service has to “hero” the analytics piece. We have moved beyond the stage where a (non-corporate) client wants a “point and click” algo that they simply set going and walk away from until the TCA report comes in – the growth in algos is likely to come from firms more concerned about execution quality, and that means giving those firms the support tools with which to make decisions. Of course the algos have to be well thought out, they have to have robust controls and, importantly, they have to be able to mask the business to a certain degree from prying virtual eyes, but to me the important aspect of the service takes place before anyone pushes a button. Firms don’t like to use phrases like “execution advisory” in this litigious age, but in reality that is what is on offer – the alternative is probably to offer them no data at all, which takes us back to the trading-blind situation in which they press a button and walk away. For all the excellence of the algo technology out there, the number of clients that actually want to cede control of the order seems to be diminishing, which means they have to be provided with the data.

Someone building algos for a regional player recently confided to me that the biggest challenge facing them was the ability to mask the order during execution. For a regional player, or indeed any firm outside the top five or six, there is always signalling risk around the use of algos – these firms are simply not active enough in the public market to avoid standing out when their algo hits the market. As I have noted before, I don’t buy the internalisation rates touted by even the major players, but move away from the top five or six and they are significantly lower, however you package them. Of course, one way round this is to build one’s own liquidity pool with bilateral relationships only, but even there, as I have discussed before when talking about “full amount” orders, there is information leakage and signalling risk.

I did wonder whether the advent of regional players using algos would negatively impact the relationship with the major LPs, meaning creating these pools would be much harder, but I don’t think that will be the case because – and this is the height of cynicism, I know – by having these firms use their own algos, the LPs can, as one put it to me last year when discussing analytics, “see the business coming from a mile away”. If the customer firm is using the LPs’ algos, then there is a different set of responsibilities.

So I do believe the proponents of algos when they say volumes will rise; I am just not sure it is going to be a smooth evolution. Whichever way we look at it, LPs and many of their customers are in this business to make money, and the looser the relationship, the more opportunity there is to take advantage of the wealth of information that is out there – whether the customer wants it out there or not.

Aside from that digression, however, if I were a regional player looking to offer an algo execution service, I would want to have my own analytics available to the client, because only that way can I show them the type of experience they will get. Inevitably there will be more public market data in the package, and this needs to be reflected in the expectations piece – the pre-trade analytics. For if there is one thing that can easily define the algo execution experience, it is that of reality versus expectations.
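As a purely hypothetical illustration of that expectations-versus-reality framing, the sketch below compares a crude pre-trade cost estimate – half the spread plus a square-root participation term, with made-up parameters – against a realised post-trade number. Nothing here reflects any actual provider’s model; it simply shows the kind of before-and-after comparison the pre-trade analytics have to set up.

```python
# Illustrative sketch only: a crude pre-trade cost expectation versus the
# realised outcome. The impact model and all parameters are hypothetical.

import math

def expected_cost_bps(order_size: float, adv: float, half_spread_bps: float,
                      impact_coeff_bps: float) -> float:
    """Half the quoted spread plus a simple square-root participation impact term."""
    return half_spread_bps + impact_coeff_bps * math.sqrt(order_size / adv)

# Hypothetical order: 50m against 5bn of accessible daily volume.
expected = expected_cost_bps(order_size=50e6, adv=5e9,
                             half_spread_bps=0.3, impact_coeff_bps=8.0)
realised = 0.95  # bps, as measured by post-trade TCA against the arrival mid
print(f"expected {expected:.2f} bps vs realised {realised:.2f} bps")
```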

Colin_lambert@profit-loss.com

Twitter @lamboPnL

Colin Lambert
