If there was one takeaway from the demonstration sessions that form part of the judging process for these awards, it was the air of optimism amongst the bankers showing their wares. Probably for the first time in a decade, just about every institution seen had budget for the coming year and, apparently, a pipeline of projects that were already funded.

While this may make for miserable reading for those who have not secured those invaluable resources to further develop their platform, for the industry as a whole this would appear to be good news – and a remarkable turnaround considering that just a few years ago the prevailing mood was that the single-dealer platform (SDP) was dead. That it is not is thanks to the recent obsession with data and analytics – put simply, third party vendors have not been able to create analytics packages to match those available (or coming) on the SDPs.

In keeping with this trend, the vast majority of new work – as opposed to work intended to fill gaps in the existing service – has been around the execution piece, with a string of algo offerings being enhanced or rolled out for the first time. As always, the words “unique” and “innovative” were casually thrown around – and as ever, only rarely were they justified. That inevitable cynicism aside, however, it is clear that banks have responded to their customers’ needs for more control, validation and involvement in their FX execution.

The first generation of algo offerings, for example, as noted previously in these pages, had the problem of disenfranchising, or being seen to disenfranchise, those customers responsible for execution. Why would an execution trader simply hand over a trade to a bank-sponsored algo and then just watch it go? Perhaps it is the better data and analytics available, or perhaps it is merely evolution, but the last year has seen those responsible for producing the algo strategies deliver greater flexibility – to a point.

The outlier in this trend has been the roll out of what could be termed “on behalf of” algo strategies (the banks will of course have much sexier names!) that effectively hand over the bank’s execution technology, including last look, to the customer. It is hard to see a drawback with this approach, and indeed it does fill a gap in the algo strategies on offer; however, there are some who fear this merely takes the industry back to disenfranchisement. Time will tell.

Something that will become a bigger issue as these new algos are rolled out, and as the engagement between bank and customer over execution tools intensifies, is an old favourite – internalisation.

Historically, this has been a scale game, with the banks boasting the deepest pockets and highest volumes winning the argument, but this is the year during which people will start to ask questions – starting with “what is true internalisation?”

Some pretty high numbers are touted around internalisation, and most of them are, depending upon your definition, nonsense. Internalisation should be about no market impact and no signalling risk – the risk is merely soaked up without ever hitting the street. Currently, we suspect, there are plenty of organisations claiming internalisation when they skew a price to their entire client base and get a hit. There is, grudgingly, something in this, but it conveniently ignores the signalling risk associated with such an action. The last year has seen some serious strides made in this area, with multiple banks following the example set by Morgan Stanley two years ago in seeking to build bespoke liquidity pools.
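To make the definitional point concrete, the minimal sketch below computes two internalisation ratios over a set of fills: a loose ratio that counts anything not hedged in the street, and a stricter one that also excludes risk cleared by skewing a price out to the whole client base. The field names and flags are hypothetical, chosen purely for illustration rather than taken from any bank’s actual trade schema.

```python
# Illustrative sketch only: the fields and the "broadcast skew" flag are
# assumptions made for this example, not any bank's real data model.
from dataclasses import dataclass

@dataclass
class Fill:
    notional: float                  # trade size in base currency
    hedged_externally: bool          # risk was offset in the street
    offset_via_broadcast_skew: bool  # risk was offset by skewing a price to the full client base

def internalisation_ratios(fills: list[Fill]) -> tuple[float, float]:
    """Return (loose, strict) internalisation ratios by notional.

    Loose: anything not hedged in the street counts as internalised.
    Strict: flow offset by skewing prices to the whole client base is also
    excluded, on the grounds that it still carries signalling risk.
    """
    total = sum(f.notional for f in fills)
    loose = sum(f.notional for f in fills if not f.hedged_externally)
    strict = sum(
        f.notional for f in fills
        if not f.hedged_externally and not f.offset_via_broadcast_skew
    )
    return loose / total, strict / total

# Example: 100m of flow, 30m hedged in the street, 40m cleared via a broad skew.
fills = [Fill(30e6, True, False), Fill(40e6, False, True), Fill(30e6, False, False)]
print(internalisation_ratios(fills))  # -> (0.7, 0.3)
```

The gap between the two numbers in the example (70% versus 30%) is exactly the definitional gap the paragraph above describes.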

This development could be a serious challenge for the multi-dealer industry, which often makes the right noises about the quality of a liquidity pool, but does not always back it up with action if a big account messes with the karma by externalising, rejecting a high volume of trades, or reading the skews from other accounts. There is no incentive for a multi-dealer platform to sanction a large customer or to try to remove them from a pool in which they want to operate; for the banks, however, there is – and this seems to be the approach taken over the past 12 months or so.

If the banks are to get these quasi-internal ECNs up and running, they need to monitor conditions and activity in these rooms closely – it is vital if these liquidity pools are to succeed. Luckily the data and analytics – those words again – are now available to assist them. Just as we have been speaking about how the data takes the emotion out of the service provider-customer discussion, so too can it when it comes to who is allowed into a liquidity pool. There are too many unhealthy analogies that spring to mind here, but in the spirit of keeping it clean: if you pollute the pool, you will be thrown out.
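What that monitoring might look like in practice is sketched below: a participant is flagged for review if its reject rate, short-horizon mark-outs or post-trade impact breach a threshold. The metrics, field names and thresholds are illustrative assumptions, not any bank’s actual pool policy.

```python
# A minimal sketch of the kind of pool-hygiene check described above.
# Metrics and thresholds are illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class PoolParticipant:
    name: str
    reject_rate: float            # share of the participant's trades rejected
    markout_bps_1s: float         # average 1-second mark-out against the pool, in bps
    post_trade_impact_bps: float  # average market impact attributed to the account, in bps

def flag_for_review(p: PoolParticipant,
                    max_reject_rate: float = 0.05,
                    max_markout_bps: float = 0.3,
                    max_impact_bps: float = 0.5) -> list[str]:
    """Return the reasons (if any) a participant looks to be polluting the pool."""
    reasons = []
    if p.reject_rate > max_reject_rate:
        reasons.append("reject rate above threshold")
    if p.markout_bps_1s > max_markout_bps:
        reasons.append("adverse short-term mark-outs")
    if p.post_trade_impact_bps > max_impact_bps:
        reasons.append("excess post-trade market impact")
    return reasons

accounts = [
    PoolParticipant("fund_a", 0.02, 0.1, 0.2),
    PoolParticipant("fund_b", 0.11, 0.6, 0.9),
]
for a in accounts:
    reasons = flag_for_review(a)
    if reasons:
        print(f"{a.name}: review ({', '.join(reasons)})")
```

The point of such a check is the one made above: the data, not the relationship manager’s emotion, decides who stays in the pool.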

If these pools are successful, then this is likely to be the new internalisation (some will argue, rightly of course, that it has only ever been the correct definition) and it will allow customers to interrogate their banks much harder. There has long been a challenge around building a platform that meets the needs of all client groups and, generally speaking, with a few more pivots to HTML5 to come, most of the top-tier banks have achieved this in functionality terms. What they now need to do is focus on delivering liquidity pools for those top-tier clients (and this is not necessarily measured in terms of volume of business) that allow the bank to maintain that stickiness.

We have heard a lot about integrating into the client’s workflow over the past two or three years, and the fact that we keep hearing it suggests that clients have been reluctant to effectively sign their lives over to one or two institutions (that, or the integration work is taking an eternity). Here, though, in better, monitored execution, is a space in which clients would likely be happier to tie themselves in. Improved execution performance can be measured (independently if required), and if the bank’s principal business is not dominating the liquidity pool, then the client can have confidence in their overall execution quality. If that confidence is lacking, then in the future componentised world they can easily switch to their alternate provider or to a new one entirely.

Probably the one aspect that will help drive this push for greater (true) internalisation is the banks’ collective willingness to be more open and transparent about how they handle flow. Several have engaged third party data providers to deliver independent rates for comparison, and all make every aspect of the trade and process available to clients. This delivers the confidence that clients require – especially when they themselves are faced with increased scrutiny of their execution processes.

Another big theme this year is a hangover from last year – the aforementioned pivot to HTML5. Call it great decision making, foresight, or pure luck, but those banks that decided a couple of years ago to build on HTML5 are offering a more complete platform and have a small, but not insignificant, advantage. Feedback from users, and indeed our own observations, indicates a small, but again not insignificant, drawback in having platforms operating on two technologies.

It’s not just the look and feel, however; one can sense the impatience and anticipation on the part of several bankers to exploit the flexibility of HTML5, thanks to another name we heard a lot this year – OpenFin. More than one representative of a bank highlighted how deploying this technology – and it’s not often in these awards we reference a third party vendor – had enabled them to create apps more quickly, ensure those apps operate across technologies and platforms, and keep them easy to update.

This development has meant that an increasing number of banks are looking in a direction that only a few had previously – delivering parts of their platform rather than the whole package. There is a growing (and perhaps grudging) acceptance that some legacy platforms are just too bulky (hence the shift to HTML5), but also that clients may, in the future, like to build their own platforms. They already aggregate pricing from multiple providers, so why not look at aggregating platforms into one or two screens?

This represents opportunities and challenges for the banks because, as noted, there is not a lot that is unique out there – and if something is unique (and good) it doesn’t take long before word spreads and imitations emerge. It means that SDP functionality will be competing at a micro level as well as a macro one in the years ahead, and in execution terms that may mean a more formal arrangement than has been the case previously, where, for example, a customer did the analysis on one platform and then executed on another!

The delivery mechanism in such a componentised world remains up in the air, for while most banks still stand behind Symphony – or rather a future iteration of it – feedback from users not on the network is that the barriers to getting on board are simply too high. Whether this represents an opportunity for a competitor or a roadblock to the FX industry’s vision of the future of the SDP remains unclear.

One incumbent that does seem to be tapping into the open source world is Bloomberg, with several banks discussing plans to have the platform host their apps for execution. This would be an important step forward for the platform (it remains relevant in an increasingly competitive and cost-constrained world); the customers (they have choice through one connection); and the banks (they can lose the cumbersome “view only” service and grow distribution).

Whichever vendor wins the battle to join the dots in a cost-effective manner, there seems little doubt that the future revolves around choice and the ability to downsize, rather than scale, a service. In turn, this will shape the skill set of the next generation of e-FX sales and relationship people. Of course there will still be plenty of opportunity to discuss the client’s execution record and strategies, but it will also be about being a real product specialist, about understanding the very subtle edge that one particular part of the platform can deliver.

There has, then, been a broad step forward by the banks in terms of their SDPs, as well as targeted pushes in certain product areas – and as such, we feel this is the appropriate time to introduce a new look to the Digital FX Awards.

In previous years we have repeatedly noted that our awards are not judged on scale – “the biggest is not always the best” is a mantra of the Digital FX Awards. We have also observed that the margins between winners and losers are always razor thin – and quite frankly, in recent years it has become increasingly difficult to separate contenders in most fields. This means that some really good product developments are relegated to a single line in the Report Card, which does not really do them justice, and the problem is exacerbated when – in years like this – developments and innovations have been on such a narrow front.

There was also the sense that some of the awards were getting a little stale; while feedback from readers remained positive, it was noted that some Report Cards could be a little repetitive.

Hopefully we have solved the issue with our new format. The process remains subjective, it is important to stress; however, we have not restricted ourselves to the narrow confines of previous years’ categories. Instead of many of those categories, we have created Awards for Excellence in e-FX, something that we believe will deliver a more accurate picture of where the innovation has taken place over the previous year. It may not be music to the ears of some readers in the bank e-FX space, but filling product gaps alone is not additive to the industry beyond a little bump in competition levels. What we hope to achieve with this new structure is a clearer view of where genuinely new development is taking place.

Two things to note on that: firstly, a product that fills a gap can still be a winner if it raises the standards bar sufficiently; secondly, these awards will steadfastly continue to avoid being an ‘every child wins a prize’ event. This year the write-up is extensive because of the level of investment and product roll outs that have occurred; in quieter years this will be a much quicker read!

Before the traditionalists among you go running into the streets to take up arms and attack P&L Central, we need to point out that we have not changed everything – there is still a high level of competition in these awards, it’s just spread across fewer categories. We will continue to pull everything together in Report Cards for five or six categories, and we will name a winner in these, but rather than replicate this approach for the other 18 awards presented last year, we will apply our new approach.

Going back to the awards, we want to close this introduction by noting one development that we find impossible to monitor accurately, but that plays a crucial role in enhancing the FX market experience for all participants. A number of banks that were struggling with their pricing finally rolled out their new technology frameworks last year. Although impossible to judge, this is likely to mean enhanced pricing for clients through increased competition. With the banks able to price more confidently and smartly, the result for the client should be a better overall experience, unless they are of the more aggressive variety, in which case they will probably not find the banks the easy touch they once were.

It is this confidence in the pricing that we think has supported and underpinned the surge in optimism amongst FX bankers. They may not win a great deal more business, but they will be able to handle what they do win much better – and that infuses confidence through the whole business. Salespeople can go after client types that were perhaps off limits previously, and managers can support pricing in greater quantities and with tighter spreads than they previously felt comfortable with.

With this background of confidence and optimism, then, we hereby present the new-look 2019 Digital FX Awards.

Galen Stops
