Galen Stops examines why buy side firms
are increasingly looking at new ways of viewing and analysing market data and
how this is helping them create new trading strategies.
Technology is increasingly allowing firms in a wide range of business sectors to collect,
store and use data to enhance efficiencies, and financial services is no
exception. In FX it seems that for many buy side firms the new battle ground
for using technology to deliver profits is no longer latency, but data.
However, the unique structure of the FX market and “Big Data” challenges make
capturing and using this data more difficult for these firms.
“When people think about technology they think ‘can I trade faster and cheaper?’ and
there’s no doubt that technology has had a transformational impact on the industry
in this regard. But nowadays, even if you’re an HFT, the impact of this kind of
technology is starting to get very limited,” says Philippe Burke, portfolio
manager at Apache Capital Management.
“What I find
much more interesting and much more rewarding is data analysis, and if you can
come up with actionable conclusions from that data analysis, then it can be
quite valuable,” he adds.
A key challenge for firms looking to utilise market data in order to form actionable
strategies is simply finding the right data amidst the vast array of
information that is available to them.
“One of the
challenges that anyone will have once you start down this path is going from
just a sea of data that’s completely incomprehensible to a subset of that data
that’s comprehensible and then ultimately actionable,” says the global head of
operations at one major investment bank.
Problematic data sources
Looking at the FX market, Stuart Farr, president of Deltix, a firm that provides software and services for
quantitative research, algorithmic and automated systematic trading, highlights two factors in particular
that make extracting useful market data very difficult for buy side firms.
The first point
that he makes is the multiplicity of data sources in FX. Unlike exchange-based
equities or futures markets, relevant FX market data can be found within the
global and regional banks, ECNs, trading technology platforms and buy side firms themselves.
The second factor
that Farr says makes capturing data in FX more problematic is that the quotes that
banks provide to their clients in this market are often different depending on
who that client is.
Factors such as credit quality, assets, trading volume and style could all help to determine
the price that a bank offers to a buy side firm. What this means for firms
trying to analyse this data is that they’re looking at quotes that the banks might
never offer to them, and this could therefore skew any simulation or
back-testing that they might perform based on the data.
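One common response to this skew is for a firm to capture only the quotes actually streamed to it, so that back-tests reflect prices it could genuinely have dealt on. The sketch below is purely illustrative (the `Quote` schema, provider name and file layout are all assumptions, not any vendor's actual format):

```python
import csv
import time
from dataclasses import dataclass, asdict

@dataclass
class Quote:
    ts: float        # capture timestamp (epoch seconds)
    provider: str    # bank or ECN that streamed this quote to us
    pair: str        # currency pair, e.g. "USD/JPY"
    bid: float
    ask: float

def record(quotes, path="quotes.csv"):
    """Append captured quotes to a CSV store for later back-testing."""
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["ts", "provider", "pair", "bid", "ask"])
        if f.tell() == 0:          # write the header on first use only
            writer.writeheader()
        for q in quotes:
            writer.writerow(asdict(q))

record([Quote(time.time(), "BankA", "USD/JPY", 109.981, 109.995)])
```

Over time this kind of capture builds the firm-specific history that, as discussed below, is otherwise hard to obtain.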
Arguably, the best way for firms to overcome this problem of accessing historical
data is to collect it themselves. But of course, this raises another issue.
“The challenge it then creates is: where do you get the history to develop those
models unless you just do trial and error?” asks Alex Dunegan, founder and CEO
of Lumint Currency Management. “All this work around Big Data relies on the
“Big” part and if you don’t have a lot of this data then you can’t really act
on it in an informed way.”
Lumint Currency Management plans to launch as a hedging service for asset managers
looking to outsource their currency risk, and Dunegan hopes that his firm’s position
as an outsource operation will enable it to build up a strong database quickly.
“One of the
things that I’m hoping as we build our client base is that we’ll be an
aggregation point for our clients and that as an execution agent across
multiple firms we’ll be able to utilise a much broader set of information
than someone who maybe just has a long equity desk or a currency hedge fund.
They have a very limited data set,” he says.
Finding the information
Even for firms that do already have much of the data that they need within their own
systems, it is still challenging to find the right data given the heterogeneous
nature of their business operations and structure.
“A bank is actually dozens of businesses, each with its own data model, its own set of
stakeholders, its own set of contingencies, its own regulatory burden,” says
the investment banker.
On the buy
side, the head of quant model implementation at one of the world’s largest
asset managers reports a similar problem.
“One of the
biggest challenges that I see is that there’s so much data that we have within
our company, but it’s locating that data. We know it exists, but we can’t find
it because we don’t know where it’s at, we don’t know who to talk to,” he says.
For Farr though, the development of technology has reached a point at which it can
solve many of the problems that firms face in terms of accessing and normalising
the data, it’s just that not everyone is aware of it.
“A lot of people spend their time putting in plumbing to get data, building
software to store it, to normalise it and to analyse it when all of that is out
there today and you can get it, buy it, pay for it or rent it. What isn’t out
there is the important thing – the trading idea and pipeline of trading ideas
that you have to develop and test and put out into the market,” he says.
Isaac Lieberman, founder and CEO of Aston Capital Management, reports that when
it comes to data analytics “the technology is the easy part”.
The brilliance of today’s technology for processing data, he says, is that it enables you to
identify real-time trading strategies and validate and deploy them with
leverage and speed that would not historically have been available.
“When you’re talking about artificial
intelligence and machine learning strategies you need to have the experience
and the know-how to sift through and tag the data, creating a consistent
process where you isolate the strengths and weaknesses of your strategy, producing
something that’s representative and risk adjusted. Then you have to be able to
deploy it in the market and have the real time production results that match
your simulated results,” he says.
At the most
basic level, using the data to create strategies involves doing time series
analysis. A trading firm might look at movements in the yen over a set period
of time, looking for the moving average, the highs, lows, intraday volatility
and other factors. From this time series analysis they can develop some basic
trading rules based on how the yen has historically behaved.
The next level of this analysis is looking at what variables impact the yen. Whether
it’s how the Nikkei is performing or what’s happening with US interest rates,
they find variables to analyse in order to estimate whether there are correlations
with the yen’s behaviour. This enables a trading firm to produce a richer model
to start back-testing.
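Both levels of analysis can be sketched in a few lines of Python. The series below are synthetic stand-ins for USD/JPY and the Nikkei, and the 20-day window and moving-average rule are illustrative assumptions, not a recommended strategy:

```python
import numpy as np
import pandas as pd

# Synthetic daily series standing in for USD/JPY and the Nikkei
# (in practice these would come from captured market data).
rng = np.random.default_rng(0)
days = pd.date_range("2015-01-01", periods=250, freq="B")
usdjpy = pd.Series(120 + rng.normal(0, 0.5, 250).cumsum(), index=days)
nikkei = pd.Series(18000 + rng.normal(0, 80, 250).cumsum(), index=days)

# First level: basic time series statistics.
ma20 = usdjpy.rolling(20).mean()            # 20-day moving average
high, low = usdjpy.max(), usdjpy.min()      # period high and low
returns = usdjpy.pct_change()
ann_vol = returns.std() * np.sqrt(252)      # annualised volatility

# Second level: does an external variable co-move with the yen?
corr = returns.corr(nikkei.pct_change())

# A basic rule from the first level: long when price is above its MA.
signal = (usdjpy > ma20).astype(int)
```

The resulting `signal` series is exactly the kind of richer model input a firm would then feed into back-testing.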
Where this starts to get interesting is that these trading firms are now looking to use
unstructured data to further enhance their models. To be clear: structured data
is numbers, ones and zeros in a column that can be easily absorbed and
manipulated. Unstructured data is pretty much everything else. It can be an
image, it can come from social media, from analyst reports or quarterly
reports. Extracting information from all this unstructured data is a fundamentally trickier
prospect. For example, the image of a glass that is empty might be assigned a
value of zero while the image of a full one is assigned a value of one. But how
do computer systems assign a value to a glass that is two-thirds full? Or one
in which the water level is rising?
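One common answer is to map such images to continuous features rather than binary labels. The function below is a hypothetical sketch which assumes an upstream image model has already estimated the fill level at two points in time:

```python
def encode_glass(level_now: float, level_before: float) -> dict:
    """Turn estimated fill levels into model features: a fraction in
    [0, 1] instead of a binary empty/full label, plus a trend flag."""
    fill = max(0.0, min(1.0, level_now))   # clamp to the unit interval
    return {"fill": fill, "rising": level_now > level_before}

encode_glass(2 / 3, 0.5)   # a two-thirds-full glass whose level is rising
```

The hard problem, of course, is the upstream model that produces `level_now` in the first place; the encoding itself is the easy part.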
In a similar
vein, analysts can’t always be correct in their predictions, and so to
distinguish themselves from the other voices out there many of them use
increasingly colourful language, which can make it difficult for machines to
distinguish whether they’re bullish or bearish.
As for social
media, the nuances of human language – especially when it is compressed to fit
the 140-character limit of platforms such as Twitter – complicate using data from these sources.
How can machines account, for example, for sarcasm?
Getting to grips with Big Data
All of this
is part of the challenge facing buy side firms trying to find actionable data
from the vast array of information that is out there. But as Farr points out,
the technology needed to extract and use this data is becoming more available
to these firms.
“The technology challenges are tractable, there are good solutions out there for
storing, normalising and analysing data. It used to be the case that you had to
rent a cage in a data centre and put in physical hardware and connectivity. But
now it’s much more economical to go and use a specialist cloud provider where
someone else is providing the physical infrastructure and the connectivity,
typically in NY4, LD4 and TY4,” he says.
Indeed, the economics of accessing this technology have changed dramatically.
“In the last
10 or 15 years, one of the fascinating things that’s happened is that the cost
of information and the technology required to analyse that information has come
down to almost zero. You’re more limited by your own ability to ‘process’ it
and analyse it,” he notes.
With buy side firms increasingly able to access and map structured and
unstructured data, the next challenge for them is found in Big Data. The first step
was to collect, tag, store, and query more and more structured data in simple,
hierarchical databases; the next steps have been to broaden the scope of information
to include both structured data (e.g. 0’s & 1’s) and unstructured data
(e.g. images), to employ more flexible, relational databases, and to be able to
process larger and larger data sets at higher and higher speeds, also known as
“Big Data”.
“So you now
have a massive amount of information – and the amount itself is growing
exponentially – and it’s coming at you at very high speed, so you need to find
rules to eliminate what’s not accurate, then to manipulate it into something
that is actionable. Another development is that instead of being pure
‘consumers’ of information, some users are now ‘generating’ their own
proprietary data, using satellite images and query programs. All in a quest to
find new relationships between different bits of data, trying to find causal
relationships rather than simple co-movements,” he says.
One result that could come from firms using enhanced data analysis to find new causal
relationships between different market variables is that there could be more
firms looking to conduct lots of smaller trades rather than a few highly
concentrated ones.
If a buy side
firm finds a trade that has a 90% historical chance of performing well for them,
then the chances are that lots of other firms are already looking at that trade
and the opportunity has already been arbed out. In contrast, if a firm can use
new data tools and techniques to find 100 trades that each have a 57% chance of
doing well, then providing they risk manage properly, they might be able to
make fairly consistent money from trades that others are less likely to have spotted.
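The arithmetic behind this preference can be checked with a simple binomial calculation. The assumption that wins and losses are equal-sized and independent is mine, purely for illustration:

```python
from math import comb

def p_net_winner(n: int, p: float) -> float:
    """Probability that more than half of n independent trades win,
    i.e. the book is profitable if wins and losses are equal-sized."""
    return sum(comb(n, k) * p**k * (1 - p) ** (n - k)
               for k in range(n // 2 + 1, n + 1))

one_big = 0.90                        # the single 90% trade
many_small = p_net_winner(100, 0.57)  # roughly 0.9, with far more diversification
```

Under these stylised assumptions, 100 independent 57% trades give about the same chance of a profitable book as the single 90% trade, while spreading the risk across many small positions.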
“That way of
thinking is a better and richer approach than looking for the 90% trade,” says
Burke. “It’s been a major enhancement for our firm.”
Creativity still key
Despite the importance of technology and the systematic nature of this approach, it still
relies on human ingenuity in order to be successful.
“When we talk
about big data, you always hear about the technology and the processing power,
but the most important thing is creativity. If you have creative people with
access to this data then the sky is the limit,” says the head of quant model implementation.
The quant adds
that although the costs of accessing and manipulating all this data have come
down significantly for buy side firms, one of the difficulties in advocating
the use of more sophisticated data techniques is that it’s hard to point to a
tangible return on investment (ROI) for some firms.
“Can we put
an ROI on it? No. That would be like me walking into the head of quant research’s
office and asking, ‘What’s the ROI on next year’s research?’ It’s just another
tool to add to your research,” the quant says.
The investment banker agrees that it is hard to show an ROI from investing in Big
Data tools, but says that his firm was able to use the data analysis technology
to automate large amounts of its reporting requirements, making what used to be
a very complex operation much faster and easier, and thus was able to
demonstrate an ROI early on.
“There is a way to measure the ROI versus the time you’d spend doing something yourself.
You can say this service is going to cost me ‘x’ amount of dollars per month and
start now or I’m going to have to use two people over 12 months just to get to
the starting point,” says Farr.
As the technology for accessing and normalising data continues to become cheaper and more mature,
and as buy side firms become more proficient in handling that data, they will
be able to manipulate it in order to create increasingly effective trading
strategies. But the key to this development will be people and human creativity. Big Data
remains a relatively new concept in the financial services mainstream, but with a little bit
of this creativity, it will definitely play a huge role in how firms develop
and execute trading strategies in the future.