A Closer Look At: Real-time Transaction Cost Estimation

Morgan Stanley's Matt Thomas, global head of MSET macro sales, and Jian Chen, head of quantitative solutions and innovations (QSI), explain how transaction cost analysis (TCA) is evolving into a real-time cost estimation framework.

Profit & Loss: Talk to me about how you see transaction cost analysis (TCA) evolving in the FX space.

Jian Chen: If you look back at the history of TCA in QSI, we started with a static tool: a TCA calculator where people could enter inputs and, based on theoretical model assumptions, look at the costs of different strategies. This was the first phase of TCA.

For phase two, we added more real-time order updates and market colour, for example, adding more information on liquidity conditions. It is this phase that people generally refer to as “real-time TCA”.

What we're now doing is going beyond those first two phases and actually estimating market conditions over your trading horizon, then using those estimates to gauge the cost of your transaction in real time. I think this is a more in-depth form of TCA, and it's the most meaningful for clients because we're not just showing information, we're estimating specific numbers associated with the transaction and putting science behind those numbers.
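
To make the idea concrete, the sketch below shows the general shape such a pre-trade estimate can take. It assumes a textbook square-root market impact model with invented coefficients; it illustrates the generic technique, not Morgan Stanley's actual methodology.

```python
# Minimal sketch of a pre-trade cost estimate. NOT Morgan Stanley's model:
# it uses a textbook square-root impact formula, and every parameter value
# here is chosen purely for illustration.

import math

def estimate_cost_bps(order_size: float,
                      adv: float,
                      daily_vol_bps: float,
                      half_spread_bps: float,
                      impact_coeff: float = 0.7) -> float:
    """Estimated all-in cost (bps) = spread cost + square-root impact.

    order_size      -- order quantity, in the same units as adv
    adv             -- average daily volume for the pair
    daily_vol_bps   -- daily volatility in basis points
    half_spread_bps -- half the quoted spread in basis points
    impact_coeff    -- hypothetical calibration constant
    """
    impact_bps = impact_coeff * daily_vol_bps * math.sqrt(order_size / adv)
    return half_spread_bps + impact_bps

# Example: a 50m order in a pair trading 5bn a day,
# 60bps daily vol, 0.5bp half-spread.
print(round(estimate_cost_bps(50e6, 5e9, 60.0, 0.5), 1))  # ~4.7 bps
```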

Matt Thomas: I would add that we’ve always had a great pre- and post-trade analytical system in the QSI framework, but I think that the market has become quite saturated in this area with everyone having different tools to show what transaction costs look like. So this year we decided to focus more on the actual trade.

What I mean by this is that we're offering more real-time comparisons of the outcomes of the various decisions you might make as a trader, and putting that into the context of understanding your in-flight costs and your opportunity costs. So our system shows the amount of time that might be needed to execute a trade, gives clients a sample selection of execution styles to choose from, and then summarises what we think have historically been the costs and risks associated with each of those styles, so that the client can make more informed decisions while trading.
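
As a rough illustration of that cost/risk summary, the sketch below compares three invented execution styles: the faster the style, the higher the estimated impact cost but the lower the exposure to market moves while the order works. The style names, horizons and coefficients are assumptions, not the firm's actual offering.

```python
# Illustrative cost/risk trade-off behind a menu of execution styles.
# All numbers are hypothetical; the point is the shape of the trade-off.

import math

DAILY_VOL_BPS = 60.0    # assumed daily volatility, bps
ORDER_FRACTION = 0.02   # order size as a fraction of daily volume
IMPACT_COEFF = 0.7      # hypothetical impact calibration

styles = {              # style -> assumed horizon as a fraction of a day
    "aggressive": 0.05,
    "neutral":    0.25,
    "passive":    1.00,
}

for style, horizon in styles.items():
    participation = ORDER_FRACTION / horizon       # implied participation rate
    impact_bps = IMPACT_COEFF * DAILY_VOL_BPS * math.sqrt(participation)
    risk_bps = DAILY_VOL_BPS * math.sqrt(horizon)  # 1-sigma price risk over horizon
    print(f"{style:10s} est. cost {impact_bps:5.1f} bps, "
          f"1-sigma risk {risk_bps:5.1f} bps")
```

Run on these toy inputs, the aggressive style shows the highest estimated cost and the lowest risk, and the passive style the reverse, which is exactly the comparison a trader is being asked to weigh.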

I think that lately clients, regardless of how sophisticated they are, have been looking for more decision validation tools. So whereas in the past all the analysis was about realised transaction costs, what we've developed focuses more on the estimated costs that firms might incur, taking into account the different objectives of the client.

P&L: Does this then play into the desire for increased auditability? So that people can explain why a decision was made not just before or after a trade, but during it?

MT: I think that's a good question, because investors like to see and receive a lot of data, and many will hire scientists to examine that data. There is an important skill in taking data and adding a clean visualisation layer so that people can understand and examine the data set. That visualisation layer can help develop the narrative around what your objective was and what you actually achieved. Some people will take that data, print it off and stick it in a drawer as a piece of audit evidence to prove they did the right thing; others will treat the data very scientifically and use it to refine their trading and look for areas to improve in future trades.

So you have to be able to accommodate different types of people without confusing them. Empower, don’t intimidate – that’s the feedback we get from a lot of clients.

P&L: From a technical standpoint, what are the challenges associated with this transaction cost estimation?

JC: There are actually a number of challenges. First of all, an accurate cost estimate depends on having the right data. For certain currency pairs in certain market conditions we might have a large quantity of data, while for other currency pairs, where algos are used less frequently, we might have much less. So the key is being able to combine all the information we have, to ensure that our predictions where we have a small sample of data are as consistently accurate as those where we have a very large sample.
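
One standard way to handle that imbalance, sketched below, is to shrink each currency pair's estimate toward a pooled, market-wide estimate, leaning on the pool when local data is thin. This is a generic statistical technique shown for illustration, not a description of the QSI model.

```python
# Sketch of shrinkage toward a pooled estimate: the weight on a pair's own
# data grows with its sample size. Generic technique, not the QSI model.

def shrunk_estimate(local_mean: float, n_local: int,
                    pooled_mean: float, prior_strength: float = 50.0) -> float:
    """Weighted blend of a pair-specific mean and a pooled mean.

    prior_strength -- hypothetical 'pseudo-count'; higher values lean
                      more on the pooled estimate when n_local is small.
    """
    w = n_local / (n_local + prior_strength)
    return w * local_mean + (1.0 - w) * pooled_mean

# A liquid pair with thousands of algo executions keeps its own estimate;
# an illiquid pair with a handful of observations is pulled toward the pool.
print(shrunk_estimate(local_mean=3.0, n_local=5000, pooled_mean=5.0))  # ~3.0
print(shrunk_estimate(local_mean=3.0, n_local=10,   pooled_mean=5.0))  # ~4.7
```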

A second challenge lies in estimating transaction costs for particular market conditions: how do we define the market conditions, and how do we model the markets? That's difficult because markets are extremely complicated and dynamic. We don't want to overfit, so we want to use as few parameters as possible to define the market, yet still capture the essence of how liquidity and volatility interact with each other so that we estimate the market as accurately as possible. That by itself is challenging.
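
To illustrate what a deliberately parsimonious market definition might look like, the sketch below classifies the market state from just two inputs: realised volatility and a spread-based liquidity proxy. The thresholds and regime labels are invented for illustration only.

```python
# Sketch of a two-parameter market-state definition. The labels and
# thresholds are hypothetical; the point is that a low-dimensional state
# resists overfitting while still capturing the volatility/liquidity mix.

def market_state(realised_vol_bps: float, spread_bps: float) -> str:
    high_vol = realised_vol_bps > 80.0   # hypothetical threshold
    thin_liquidity = spread_bps > 1.0    # hypothetical threshold
    if high_vol and thin_liquidity:
        return "stressed"     # volatile and thin: costs well above average
    if high_vol:
        return "fast"         # volatile but liquid: risk dominates
    if thin_liquidity:
        return "quiet-thin"   # calm but thin: patience is cheap
    return "normal"

print(market_state(realised_vol_bps=120.0, spread_bps=1.5))  # stressed
```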

MT: If you look at equities, pre-trade analytics isn't as important anymore, because people trust the process and focus more on anomalies in the post-trade data. But FX is still in a phase where people are trying to figure out how markets are normalised and what the structure is, and you're still in a hybrid trading framework where humans are using data to pick trades.

So as we deliver this tool, which shows clear costs and risks associated with trading in a particular way, there will be a tendency to try to change and switch things, to put limit orders in and poke at the trade and alter it over time. With human intervention, your TCA is going to end up comparing apples and oranges. So it might not be a technical constraint, but we have to work with clients to help them trust the process so that they can refine it better over time.

P&L: So is there a big educational element associated with this technology?

MT: It's a huge education process. Every client is different, so we tailor the message to make sure we're providing them with the exact details they need.

JC: TCA in FX is still evolving and dynamic because liquidity is bespoke and therefore a differentiator for providers, unlike equities, where the data is transparent, standardised and available on the exchanges. So there is still an education process around how you can access unique liquidity while also having a standardised approach to measuring performance.

P&L: Is this Transaction Cost Estimation mainly needed for the more complex trades that firms might be executing?

JC: No, not at all. Even something as simple as deciding whether to trade risk transfer or use an algo can benefit from this, because you're no longer basing that decision solely on your experience; you're supplementing it with data on market conditions that will tell you whether an algo is really adding value relative to the risk transfer price.
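
That risk-transfer-versus-algo decision can be framed as a simple comparison: pay the full risk transfer spread now, or run an algo with a lower expected cost but some market risk while it works. The toy numbers below are hypothetical and only illustrate the shape of the comparison.

```python
# Toy comparison of the two choices described above: cross the risk-transfer
# spread immediately, or run an algo whose expected cost is lower but which
# leaves the order exposed to market moves. All inputs are hypothetical.

import math

def risk_transfer_cost_bps(full_spread_bps: float) -> float:
    return full_spread_bps / 2.0   # pay the half-spread immediately

def algo_cost_bps(expected_impact_bps: float,
                  daily_vol_bps: float,
                  horizon_days: float,
                  risk_aversion: float = 0.1) -> float:
    """Expected algo cost plus a penalty for market risk over the horizon."""
    risk_bps = daily_vol_bps * math.sqrt(horizon_days)
    return expected_impact_bps + risk_aversion * risk_bps

rt = risk_transfer_cost_bps(full_spread_bps=4.0)
algo = algo_cost_bps(expected_impact_bps=1.0, daily_vol_bps=60.0,
                     horizon_days=0.02)
print(f"risk transfer ~{rt:.1f} bps vs algo ~{algo:.1f} bps")
```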

MT: You also have to bear in mind that the role of traders on the buy side is definitely expanding. They're becoming quantitative experts, relationship managers and system integrators; they're building or buying things all the time to work into their systems; and they're trying to understand market structure. Execution is a big piece of their job, but it's not the only thing they do. So it's important to give them tools that help with their daily processes and take away some of the homework on the execution side, so that they can achieve the most favourable outcome for their execution goals.

P&L: So what are the next steps from here for the development of Transaction Cost Estimation?

JC: Collaboration with our clients is an important and continuing part of our approach. Going forward, we will work closely with clients around a single tactical framework, giving them the tools to decide which factors matter most for their trading decisions. We would also like to keep evolving our products so that we can more accurately gauge how market conditions may impact their trading activity, and therefore the estimated transaction cost.

MT: Creating less friction in our clients' workflow is a key driver of our product. It's worth pointing out that some of the requests that come through are constrained by workflows on the client's end: regardless of how intelligent our front-end tools are, clients are asking for more integrated services and raw data. They would like to make decisions using our data, but within the workflow they're used to, rather than having to pop in and out of screens.

That's a trend we'll keep working towards. I also believe clients want a firm understanding of how an order progresses from passive to aggressive, where their orders are placed, and how they can work with us to fine-tune that performance. Visualising the thought process of the algo, along with the actual execution and fills, is something we are focused on. Clients don't want to use an algo that they can't explain to their management.

Galen Stops
