Galen Stops looks at why demand for cryptoassets has skyrocketed in 2017 and assesses whether they have any future in mainstream financial markets.
The first working implementation of a blockchain that the world had ever seen was in the Bitcoin software released in 2009. Bitcoin the cryptocurrency then rose to prominence in 2013 when, driven in part by a flurry of media attention, its value rose past $1,000 for the first time.
Following that, 2014 represented a long and painful year of price decline for Bitcoin as an asset, but it continued to garner a lot of attention, not always for good reasons. Then in 2015 the narrative began to change as people really started talking about the potential applications of blockchain technology distinct from any digital assets.
In February, Profit & Loss reported that GTX had partnered with Ideal Prediction, an independent trading analytics and data science company, to offer its clients analytics aimed at optimising their FX trading.
GTX first hired Ideal Prediction to optimise client liquidity pools and trade execution performance in March 2016. The perceived success of this project, combined with the management team's strong working relationship with Ideal Prediction CEO John Crouch from his time at Credit Suisse, prompted the two firms to look for more ways to utilise the data at GTX's disposal to help its clients.
The end product of this was the analytics tool that GTX began offering to firms in February.
Increased attention on market impact has prompted non-bank market making firm XTX to release a new analysis tool, XTX-ray. Colin Lambert takes a look.
Market impact has grown steadily as a topic of conversation in the FX industry, thanks in part to the events of October 7, 2016 in Cable, but also due to the increasing instances of "mini" flash moves in markets. As risk warehousing activities have been scaled back across the banking industry, a crucial buffer is being thinned out, meaning orders that previously had minimal or no impact on market levels now do.
After two years of endless hype, Galen Stops looks at whether 2017 will be the year that distributed ledger technology broadly starts getting put into production within mainstream financial services.
Last year saw numerous firms producing proofs of concept (POCs) regarding the potential application of distributed ledger technology (DLT), issuing whitepapers about the technology and hosting "hackathons" and other events to discuss and promote its use within financial services.
Profit & Loss covered the major developments around DLT last year, but towards the end of the year the editorial team began expressing frustration at the disparity between the PR and press coverage surrounding DLT and the number of tangible projects actually being put into production using this technology.
As FX execution becomes increasingly fragmented with more and more trading taking place in dark environments, price discovery is rapidly becoming one of the industry’s key challenges. But can the recent proliferation of new market data offerings from the leading ECNs really help tackle this problem as claimed? Nicola Tavendale writes.
The past year’s run of unprecedented market events has only served to highlight the growing demand for timely and reliable FX market data, yet innovation in this area has notably lagged behind the levels seen in other areas of the financial markets.
As more financial services firms look for ways to utilise blockchain technology within their infrastructures, Galen Stops examines whether the technology is really as safe as advocates claim, following two high-profile hacks earlier this year.
“Cyber and system security is one of the most important issues facing markets today in terms of integrity and financial stability,” said Commissioner Christopher Giancarlo of the Commodity Futures Trading Commission (CFTC) on September 8, when approving system safeguard requirements for derivatives clearing organisations.
Giancarlo is hardly alone in his concerns.
As leverage requirements make FX exposures a bigger pain point for the banks, many are looking towards compression services to address this. Galen Stops looks at how these services work and what they could mean for the industry.
One of the responses by global regulatory bodies to the 2008 financial crisis was to require banks to hold more capital against their financial exposures, creating a bigger buffer to protect them against adverse market conditions.
Capital constraints have widely been cited as a reason for declining activity in some markets and liquidity events in others; therefore, it is not surprising that compression services, whereby offsetting trades are netted off against one another to reduce the notional amount on banks' balance sheets, have found favour amongst banks and major dealers.
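The netting mechanism described above can be illustrated with a minimal sketch. This is a simplified bilateral example, not any vendor's actual algorithm (real compression runs are typically multilateral and far more sophisticated); the bank names, currency pairs, and notionals are hypothetical:

```python
# Illustrative sketch of trade compression: offsetting trades in the same
# currency pair between the same two counterparties are netted into a single
# trade, shrinking gross notional while leaving net exposures unchanged.
from collections import defaultdict

def compress(trades):
    """trades: list of (party_a, party_b, pair, signed_notional).

    Sign convention: positive notional means party_a buys the base
    currency from party_b. Returns the netted trade list.
    """
    net = defaultdict(float)
    for a, b, pair, notional in trades:
        # Normalise direction so (A, B) and (B, A) trades offset each other.
        if a <= b:
            net[(a, b, pair)] += notional
        else:
            net[(b, a, pair)] -= notional
    return [(a, b, pair, n) for (a, b, pair), n in net.items() if n != 0]

# Hypothetical trade blotter
trades = [
    ("BankX", "BankY", "EUR/USD", 100_000_000),
    ("BankY", "BankX", "EUR/USD",  60_000_000),  # offsets 60m of the first
    ("BankX", "BankY", "GBP/USD",  25_000_000),
]
compressed = compress(trades)
gross_before = sum(abs(t[3]) for t in trades)      # 185,000,000
gross_after = sum(abs(t[3]) for t in compressed)   #  65,000,000
```

In this toy example, gross notional falls from 185 million to 65 million, while each bank's net position in every pair is unchanged, which is the property that reduces the capital held against the exposures.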
Humans are curious creatures. We study our environment, consider ourselves in relation to our surroundings, and, uniquely among living things, even ponder our ability to think. Whether we realise it or not, we regard sentience as the crowning jewel of ...
An overall increase in the number of firms using algorithmic-based trading to execute FX orders was seen in both Europe and the US in 2015, says a new report from consultancy firm Greenwich Associates. The company adds in its 2015 Global Foreign ...
As more firms continue coming to market with distributed ledger-based technology solutions, Galen Stops looks at the competitive landscape of this market.
It is perhaps unsurprising, given the amount of hype that has followed distributed ledger technology, that there has ...