There is a changing dynamic afoot when it comes to relationships between service providers and clients in the foreign exchange industry, one driven partly by liquidity providers (LPs) developing a better understanding of the value of their clients’ flow and partly by clients seeking to optimise their execution – specifically by reducing market impact. Colin Lambert talks to Roel Oomen, managing director, e-FX spot trading at Deutsche Bank, about his latest research paper that advances the study of optimal liquidity aggregation via a data-driven analysis of price signatures.
There are some great numbers bandied about by some LPs regarding their internalisation rates – long seen as a baseline indicator of lower market impact – but in reality it is often difficult to prove or disprove such claims. Fortunately, the availability and quality of FX market data have improved to the degree that deeper, more meaningful statistical analysis can be conducted to help clients better understand, and engage with, the LPs that best suit their execution style and objectives.
The new paper, Price Signatures, authored by Roel Oomen, managing director, e-FX spot trading at Deutsche Bank, develops a data-driven approach to optimal liquidity aggregation. It builds on the theoretical principles outlined in his earlier work, which looked at the game-theoretic aspects of aggregation, the impact of last look on execution quality and, most recently, the broader issue of internalisation by LPs. The new paper presents a statistical model founded upon Functional Data Analysis (FDA) that can, Oomen suggests, “meaningfully help in aggregator design and monitoring of execution costs”.
“There is great diversity across the industry in terms of how aggregation is done, what factors are deemed important, and what benefits it brings to the execution process,” he explains. “This research provides a framework and a process that can be deployed by clients and LPs across the spectrum to help them build sustainable liquidity relationships for the benefit of all those involved.”
“The case studies are based on actual trading data provided by Deutsche Bank and its clients, but the methodology is general and widely applicable,” he adds. “I think this model can help the broader industry navigate the diversity of business models and objectives supported by FX markets.”
Price signatures are statistical measurements that aim to detect systematic patterns in price dynamics localised around the point of trade execution. They are particularly useful in electronic trading because they uncover market dynamics, strategy characteristics, implicit execution costs, or counterparty trading behaviours that are often hard to identify by simply eye-balling the data – in part due to the vast amounts of data involved, and in part due to the typically low signal-to-noise ratio.
A price signature is computed as the volume-weighted, trade-direction-adjusted average price movement over a pre-determined interval centred around the point of trading. Whilst straightforward to calculate, judging its statistical significance can be much more challenging. This is because the signature is a curve rather than a point estimate, and the data feeding into it may be overlapping when, for instance, a trade is broken up and executed in quick succession. The statistics are non-trivial, but Oomen’s research makes significant progress by leveraging recent advances in FDA and combining them with bootstrap resampling methods to account for any data dependence.
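The computation described above – averaging direction-adjusted price moves across trades, weighted by volume – can be sketched in a few lines. The data layout and function name below are illustrative assumptions, not the paper’s implementation:

```python
import numpy as np

def price_signature(trades, offsets_ms):
    """Volume-weighted, trade-direction-adjusted average price move.

    trades: list of dicts with keys
      'side'  : +1 for a buy, -1 for a sell
      'volume': traded amount
      'path'  : mid-price sampled at each offset in offsets_ms,
                relative to the execution time (offset 0 = at trade)
    Returns the signature curve in basis points, one value per offset.
    """
    total_volume = sum(t['volume'] for t in trades)
    curve = np.zeros(len(offsets_ms))
    origin = list(offsets_ms).index(0)  # index of the execution point
    for t in trades:
        path = np.asarray(t['path'], dtype=float)
        p0 = path[origin]
        # direction-adjusted move: a positive post-trade value means
        # the price moved in the direction of the trade (adverse impact)
        move_bps = t['side'] * (path - p0) / p0 * 1e4
        curve += t['volume'] * move_bps
    return curve / total_volume
```

A single curve like this is easy to produce; as the text notes, the hard part is judging whether such a curve – or the difference between two of them – is statistically significant.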
A clear illustration of the value of the approach can be found in figure 2 in the paper where thousands of “noisy” post-deal price paths, that visually lack any apparent structure, are aggregated into meaningful and interpretable signature curves that measure market impact, the speed at which it gets realised, and how it varies by currency pair. The statistical methodology can then be used to test for significant differences between these curves, Oomen says, and this can be done over any time horizon of interest, from milliseconds to hours or days. This allows one to ask and answer important questions, e.g. does the market impact of my traders differ by their execution style? Do I incur less impact when I trade with LP A or B?
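A test for “significant differences between these curves” can be illustrated with a sup-norm bootstrap comparison of two groups of trades. This is a much-simplified stand-in for the paper’s FDA-based methodology: it resamples trades i.i.d. and so ignores the overlap and dependence the paper accounts for, and all function names here are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

def signature_curve(moves, volumes):
    # moves: (n_trades, n_offsets) direction-adjusted price moves in bps
    w = volumes / volumes.sum()
    return w @ moves

def bootstrap_difference_test(moves_a, vol_a, moves_b, vol_b,
                              n_boot=2000):
    """Test whether two signature curves differ, via a sup-norm statistic.

    Under the null, both groups share one signature, so we pool the
    trades and resample random splits to build the null distribution
    of the maximum pointwise curve difference.
    """
    observed = np.abs(signature_curve(moves_a, vol_a)
                      - signature_curve(moves_b, vol_b)).max()
    moves = np.vstack([moves_a, moves_b])
    vols = np.concatenate([vol_a, vol_b])
    n_a = len(vol_a)
    stats = np.empty(n_boot)
    for i in range(n_boot):
        idx = rng.integers(0, len(vols), size=len(vols))
        m, v = moves[idx], vols[idx]
        stats[i] = np.abs(signature_curve(m[:n_a], v[:n_a])
                          - signature_curve(m[n_a:], v[n_a:])).max()
    p_value = (stats >= observed).mean()
    return observed, p_value
```

Applied to, say, trades routed to LP A versus LP B, a small p-value would indicate a statistically significant difference in market impact between the two – the kind of question the paragraph above poses.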
The paper uses three case studies to illustrate the power of the approach. “We demonstrate that, for instance, it is possible to distinguish between internalisers and externalisers in a quantitative data driven manner,” says Oomen. “The case studies – representative of many we’ve conducted in partnership with our clients – show how mutually beneficial outcomes for both parties can be achieved via this approach and can help to better align the objectives of the trader with those of the LPs in their aggregator. It removes friction and inefficiency from the trading process.”
Oomen also points out that this approach can help to eliminate the often-encountered “prisoner’s dilemma” within the aggregator, and therefore lead to a significant reduction in execution costs, by making sure the client’s objectives are matched by the LPs’ business models and risk management methodologies.
Oomen provides three real world examples where clients have benefitted from such analysis.
The first highlights a case where the client makes an extreme change to their aggregation model. The trader shifts from using an aggregator with what had been an ever-increasing number of LPs over a period of time, to an exclusive relationship. “This trader is active but generally trades in smaller sizes and in quick succession,” Oomen explains. “While growing the number of LPs is often hard to resist for traders, in this particular example it has led to higher reject rates and wider spreads quoted by the LPs, which in turn means more work managing the LP relationships, and quite possibly a deterioration in liquidity access.”
The post-deal signature suggests – on the face of it – that the trader’s flow is latency sensitive and directional up to a time horizon of one minute, meaning the LP that wins the trade has less opportunity to internalise the flow. “However, an important consideration in this case study is to question whether the trader’s flow is actually benign at source – if it is not then there is little that can be done, and rejects and defensive pricing will become a characteristic of their trading day,” says Oomen. “It could, however, be a problem with aggregator design, and that is much easier to rectify.”
To measure the impact of aggregator design, the trader in question moved all flow to a single LP, who in turn offered a competitive spread and passively internalised the risk at a 100% fill rate. The results indicated that it was indeed the aggregator design and the risk management style of some of the LPs, rather than the trader’s intrinsic flow characteristics, that were causing the short-term adverse impact.
“The end-result in this case study is a meaningful reduction of transaction costs and elimination of execution uncertainty for the trader giving them a competitive advantage, a simplification of workflows, and a sustainable liquidity relationship with their LP,” Oomen explains.
The second case study looks at a trader with a smaller number of LPs in their aggregator – seven – and seeks to solve the problem of identifying how their respective risk management styles impact the trader’s execution quality. “This trader noticed that certain trades were having greater market impact than others but couldn’t quite identify a clear underlying pattern,” Oomen explains. “A rigorous and systematic data-driven approach was needed to make progress, and it showed there were very significant differences amongst their seemingly homogeneous set of LPs.”
The study finds that among the seven LPs, three distinct risk management styles emerge: passive internalisers, impatient internalisers, and aggressive internalisers/externalisers. While the approach within each group was broadly similar, there were highly significant differences between the groups.
“A trader’s ability to distinguish between competing liquidity offerings – not just in terms of spread or amount shown, but also by the LPs’ footprint in the market when they risk manage the trader’s flow – is a pre-requisite to efficient execution,” Oomen states in the paper. “This case study shows that the techniques presented in this paper allow the trader to classify LPs by their risk management style in a data-driven manner, and then continue to monitor for any statistically significant changes to this classification over time. This, in turn, enables the trader to selectively engage only with those LPs whose risk management styles are compatible with their execution objectives.”
Perhaps more significantly when looking at the relationship dynamic, the findings of the second case study could lead to a change in the internalisation sales pitch. There is little doubt that some internalisation statistics promoted by certain LPs are selective in nature. By using the suggested data driven approach, the different methodologies and even definitions of internalisation will be highlighted.
The third and final case study involves a trader that takes the initiative to conduct an experiment. In the first seven weeks the trader leaves the composition of their aggregator unchanged: it is composed of multiple largely internalising LPs. Then, for the subsequent two weeks the trader adds a candidate externalising LP.
The trader’s execution style and trading activity is unaltered over this period, and their objective is to understand whether the addition of the externalising LP has any measurable impact on execution performance.
“Our approach was very quick to pick up on the impact that the externaliser had on the trader’s execution quality,” Oomen says. “For the first seven weeks the price signatures were statistically indistinguishable, but when the externaliser was added we were able to pinpoint the associated change in impact within 25 minutes, or just 83 trades. Considering we analysed the flow over nine weeks and it involved 15,000 trades, that is quite remarkable.” (See figure 13d in the paper.)
“This approach provides the trader with a very powerful tool to monitor LP behaviour, for it is not only about adding a new LP, an existing provider could change their risk management style,” he adds.
Looking ahead, Oomen points out that price signatures are only one specific example within a broader class of signatures for which the FDA and bootstrap methodology is applicable. For instance, he notes that signatures of spread, liquidity, or market activity dynamics can be constructed and studied. Similarly, signatures for rejected trades, unexecuted quotes, or trading events around macroeconomic announcements may be considered. As such, the results have wide applicability well beyond aggregator design, including transaction cost analysis, strategy performance measurement, and the study of market integrity, stability, and liquidity conditions.
“There can also be a use from a MiFID II perspective,” he observes. “The signatures can provide a quantitative characterisation of dealers classified under the systematic internaliser regime and feed into best execution considerations.”
For now, however, the paper represents a significant step forward in creating a data-driven approach to building an efficient aggregator. The analysis, and the process developed, can be used to help traders identify those LPs with higher internalisation or externalisation rates. By doing this, as well as making the LP management process more efficient, it can deliver significant bottom line benefits – something close to the heart of all traders – by offering measurable improvements in trading performance.