Mosaic Smart Data Unveils Data Normalisation Service

Data analytics firm Mosaic Smart Data is launching its data normalisation process as a new stand-alone service. The firm will employ its technology and data model to cleanse, normalise and process clients’ transaction data in all major FICC asset classes, including cash and derivatives, allowing institutions to analyse their activity in a given asset class at both the micro and macro levels in real time.

The service can be provided in the cloud or deployed on premises behind the client’s firewall. The resulting data is stored and made available via an API, allowing it to be accessed remotely and making digital and distributed working feasible.

Data normalisation provides the foundation upon which effective analytics tools are built, the firm says, adding that its service allows institutions to aggregate data from across all their trading channels and streams into a single unified messaging language. This provides a trusted data set that can be used internally or shared with trusted third-party service providers.
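
As a rough illustration of what mapping venue-specific records into one unified representation might look like, the Python sketch below translates hypothetical trade messages from two venues into a single common schema. The field names, venue formats and schema are invented for illustration and are not Mosaic Smart Data’s actual data model.

    from dataclasses import dataclass
    from datetime import datetime, timezone

    # Hypothetical unified trade record; fields are illustrative only.
    @dataclass
    class NormalisedTrade:
        trade_id: str
        venue: str
        asset_class: str      # e.g. "FX", "RATES", "CREDIT"
        instrument: str
        side: str             # "BUY" or "SELL"
        quantity: float
        price: float
        timestamp: datetime   # always UTC

    # Each venue reports trades in its own format; these mappers translate
    # them into the single unified representation defined above.
    def from_venue_a(raw: dict) -> NormalisedTrade:
        return NormalisedTrade(
            trade_id=raw["id"],
            venue="VENUE_A",
            asset_class=raw["assetClass"].upper(),
            instrument=raw["symbol"],
            side="BUY" if raw["direction"] == 1 else "SELL",
            quantity=float(raw["qty"]),
            price=float(raw["px"]),
            timestamp=datetime.fromtimestamp(raw["epoch_ms"] / 1000, tz=timezone.utc),
        )

    def from_venue_b(raw: dict) -> NormalisedTrade:
        return NormalisedTrade(
            trade_id=raw["TradeRef"],
            venue="VENUE_B",
            asset_class=raw["Product"].upper(),
            instrument=raw["Ticker"],
            side=raw["BuySell"].upper(),
            quantity=float(raw["Nominal"]),
            price=float(raw["Price"]),
            timestamp=datetime.fromisoformat(raw["ExecutionTime"]),
        )

Once every channel’s output has been mapped into the same shape, activity can be aggregated and compared across venues rather than analysed one feed at a time.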

The data is hygiene-checked, errors and inconsistencies are rectified, and enrichment consolidates the data sets. Without this crucial step, the firm says, accurate real-time data analytics, machine learning or artificial intelligence cannot be applied across an institution’s entire transaction activity, and answering even simple questions becomes impossible at any level broader than a single venue. It adds that the new stand-alone service opens the door to firms that wish to take the next step of adopting its MSX analytics platform, which provides real-time analysis across an institution’s capital markets activities.
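
The sketch below illustrates, in Python, what hygiene-checking, rectification and enrichment of a normalised record could involve. The specific checks, reference data and field names are assumptions made for illustration, not a description of Mosaic Smart Data’s actual pipeline.

    from datetime import datetime, timezone

    # Hypothetical reference data used for enrichment; purely illustrative.
    INSTRUMENT_REFERENCE = {
        "EURUSD": {"asset_class": "FX", "region": "G10"},
        "USDJPY": {"asset_class": "FX", "region": "G10"},
    }

    def hygiene_check(trade: dict) -> list[str]:
        """Return a list of data-quality issues found on one normalised record."""
        issues = []
        if trade.get("quantity", 0) <= 0:
            issues.append("non-positive quantity")
        if trade.get("price", 0) <= 0:
            issues.append("non-positive price")
        if trade.get("side") not in ("BUY", "SELL"):
            issues.append("unrecognised side")
        if "timestamp" not in trade:
            issues.append("missing timestamp")
        return issues

    def rectify(trade: dict) -> dict:
        """Correct mechanical inconsistencies without changing the trade's meaning."""
        fixed = dict(trade)
        fixed["instrument"] = fixed.get("instrument", "").strip().upper()
        # Some sources report naive timestamps; pin everything to UTC.
        ts = fixed.get("timestamp")
        if isinstance(ts, datetime) and ts.tzinfo is None:
            fixed["timestamp"] = ts.replace(tzinfo=timezone.utc)
        return fixed

    def enrich(trade: dict) -> dict:
        """Attach reference data so records from different venues aggregate consistently."""
        ref = INSTRUMENT_REFERENCE.get(trade.get("instrument", ""), {})
        return {**trade, **ref}

    def process(trade: dict) -> dict | None:
        """Rectify, hygiene-check and enrich one record; drop it if it is unusable."""
        trade = rectify(trade)
        if hygiene_check(trade):
            return None  # in practice such records would be quarantined for review
        return enrich(trade)

Only once records have passed this kind of cleansing and enrichment can analytics, machine learning or AI be applied consistently across the whole transaction data set.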

“Inconsistent data inputs and storage are a huge challenge for institutions and hinder performance and profitability,” says Matthew Hodgson, CEO and founder of Mosaic Smart Data. “Right now, it is impossible for many institutions to understand even quite simple information about their activity at the macro level because every venue records information differently. Normalisation solves this problem. It is the foundation of understanding data and an absolute prerequisite to comprehensive, accurate and forward-looking analytics.

“With many firms now forced to look at new ways of working, more work is being done digitally,” he continues. “This makes data normalisation even more important. Digital tools can only be effective with the right data, and if firms build tools which only analyse part of their activity, they risk creating new silos and new inefficiencies. Normalisation allows firms to see their data – and their business – as a whole, unlocking new insights and helping teams work cohesively and in symphony.”

Colin Lambert
