Riding out the perfect data storm
I recently came across an article in the Financial Times – “Smaller deals between asset managers are running at the hottest pace in almost 15 years” – stating there had been more deals under $1bn during the first nine months of the year than at any point since 2007. Interestingly, many of these deals were designed to boost operating performance, rather than reshape companies.
While this is not at all surprising, optimising cost through M&A is something of a fallacy, given that many asset managers are in the same boat: facing regulatory burden, fee pressure and rising operating costs.
Over the decades, asset managers have accumulated a raft of legacy systems and interfaces, creating a ‘spaghetti’ landscape. And because of the considerable costs of keeping all these systems – and the workarounds to circumvent their inefficiency – many asset managers are finding themselves operating at break-even levels.
Added to this are more esoteric private assets and the rapid growth of alternative and ESG data sets, creating another level of complexity that sticking plasters and manual workarounds simply cannot sustain.
In a world moving towards more heavily regulated markets and fee structures, and an investment process that is failing to meet the needs of the end-investor, asset managers are heading into a perfect storm. And at the centre of it is the inability to access, understand and control data.
Tackling data paralysis
Much of the data paralysis asset managers face is down to an accumulation of duplicate, legacy or outdated systems, yet the decision to decommission them is proving one of the most challenging undertakings. Our recently commissioned whitepaper – The Buy Side and System Consolidation – Grasping the Nettle – found more than half (57.7%) of buy-side respondents use a mix of proprietary and/or legacy and third-party-developed platforms to manage their mission-critical functions. Many are experiencing duplication issues as a result, with 73% of respondents relying on multiple systems to manage the same or similar functions within the business.
Given the immense downstream risk, sunsetting systems that have been around longer than the CIO is no easy task. Consequently, many of the challenges noted by the 52 asset managers in the whitepaper came down to two common pain points: timeliness, and uncertainty over the data they hold. An overwhelming 90% of respondents also reported suffering to some extent from poor-quality market and reference data, which is having a considerable impact on middle-office functions such as performance attribution, risk and compliance.
There are also significant difficulties in sharing investment data across the business, with some 76.9% of respondents experiencing multiple silos across the company, multiple formats across the various systems, or having to manage multiple vendors and interoperability challenges. The same proportion of respondents also reported that the bulk of their data is not appropriately protected or permissioned, using, for instance, Role-Based Access Control (RBAC). The fact that these two findings are linked is no surprise: unpicking organisational silos and opening up access is almost impossible without putting the right entitlements in place.
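The RBAC idea referred to here is conceptually simple: roles are granted read access to named data scopes, and every request is checked against those entitlements. A minimal sketch (all role names and scopes are invented for illustration):

```python
# Minimal role-based access control (RBAC) sketch: each role is
# entitled to read a set of named data scopes.
ROLE_SCOPES = {
    "portfolio_manager": {"portfolios/emea", "marketdata/eod"},
    "risk": {"portfolios/emea", "marketdata/eod", "marketdata/intraday"},
    "operations": {"reference/instruments"},
}

def can_read(role: str, scope: str) -> bool:
    """Return True if the given role is entitled to read the scope."""
    return scope in ROLE_SCOPES.get(role, set())

# A portfolio manager can see their book, but not intraday market data:
can_read("portfolio_manager", "portfolios/emea")      # True
can_read("portfolio_manager", "marketdata/intraday")  # False
```

Real entitlement systems layer on hierarchy, write permissions and audit, but the core check — is this role entitled to this scope? — is the piece the survey suggests most data estates are missing.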
In a bid to overcome these obstacles, many asset managers have turned to front-to-back systems that claim to offer that elusive ‘golden source’ of data. But to me, this is an impossible task. Having worked in this environment for over a decade, I’ve learnt that no firm operates in a vacuum, nor in a clean, linear fashion that enables one source of data to be truer than the rest.
The only way to harmonise disparate data sets and create a holistic view across the organisation is to ingest, aggregate and translate data from across all the systems in the organisation. While front-to-back systems, and even EDM solutions for that matter, have attempted to consolidate investment, market and reference data (though often mandating their own data model and creating yet more operational effort), they have not tackled the critical translation element. What’s more, these systems often create more organisational silos, requiring further integration costs or, worse, more workarounds.
It’s only going to get more complex. More than half (55.8%) of respondents surveyed plan to use alternative data sets in the near future, while 73% of asset managers already use ESG data to some extent. The rise of ESG and the growing reliance on alternative data sets, which are largely unstructured in form, means translating and understanding data will not only become tougher but also more vital, particularly in those critical middle-office functions.
A Rosetta Stone for asset management operations
What is needed is open interoperability in the form of APIs that can pull in data from across the investment landscape and translate multiple formats, templates and identifiers into one unified language in real time.
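By way of illustration only (the mapping table and record shapes are invented, not any vendor's actual schema), the translation step amounts to resolving each source system's identifier scheme onto one canonical key before positions can be aggregated:

```python
# Hypothetical translation layer: different systems refer to the same
# instrument by ticker, ISIN or an internal code; all resolve to one
# canonical instrument ID before aggregation.
ID_MAP = {
    ("ticker", "VOD.L"): "INST-0001",
    ("isin", "GB00BH4HKS39"): "INST-0001",
    ("internal", "OMS-77812"): "INST-0001",
}

def to_canonical(scheme: str, value: str) -> str:
    """Resolve a (scheme, identifier) pair to the canonical instrument ID."""
    try:
        return ID_MAP[(scheme, value)]
    except KeyError:
        raise KeyError(f"no canonical mapping for {scheme}:{value}")

# Records from two systems, different formats, same instrument:
records = [
    {"scheme": "ticker", "id": "VOD.L", "qty": 1000},
    {"scheme": "isin", "id": "GB00BH4HKS39", "qty": 250},
]

positions = {}
for record in records:
    key = to_canonical(record["scheme"], record["id"])
    positions[key] = positions.get(key, 0) + record["qty"]
# positions now holds a single aggregated quantity for INST-0001
```

The hard part in practice is not the lookup but maintaining the mapping as instruments, corporate actions and vendor feeds change — which is why translation belongs in shared infrastructure rather than in each system's workarounds.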
We’ve created this new standard for data management to break down data complexity, reduce the cost of investing and increase transparency across the investment chain. And we’ve done this because what is out there today hasn’t delivered the end state asset managers desire: to know exactly what they own, where, and how much it is worth.
Built to sit at the heart of the world’s investment industry, we like to think of LUSID as the Rosetta Stone for asset management operations, enabling firms to see their investment, market and reference data, with a complete audit trail, in one accessible data store. By resolving the pain points of timeliness and uncertainty, firms can easily permission their people with the data they need to do their jobs successfully.
Time to grasp the nettle
We know risk appetite for change will always be low: that is human nature, after all. And while we are technology specialists, we are human too. That’s why we advocate starting small, with the specific use cases that cause the greatest distress, such as mission-critical functions like reconciliation, performance, risk and reporting.
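Reconciliation is a good example of a small, high-distress starting point because, once data speaks one language, it reduces to a keyed diff between systems. A toy sketch (system names and figures invented):

```python
def reconcile(system_a: dict, system_b: dict) -> dict:
    """Return instruments whose quantities differ between two systems,
    as {instrument: (qty_in_a, qty_in_b)}."""
    breaks = {}
    for instrument in set(system_a) | set(system_b):
        qty_a = system_a.get(instrument, 0)
        qty_b = system_b.get(instrument, 0)
        if qty_a != qty_b:
            breaks[instrument] = (qty_a, qty_b)
    return breaks

# Positions as seen by an order management system vs. the accounting book:
oms = {"INST-0001": 1250, "INST-0002": 40}
accounting = {"INST-0001": 1250, "INST-0002": 35, "INST-0003": 10}

breaks = reconcile(oms, accounting)
# breaks flags INST-0002 (40 vs 35) and INST-0003 (0 vs 10)
```

The diff itself is trivial; the operational pain lies in getting both sides onto comparable identifiers and formats in the first place — which is exactly the translation problem described above.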
Having your data speak a single language across the firm avoids slippage, lowering your operational risk. And by avoiding a big-bang implementation, you demonstrate value to the market sooner. It also makes the decision to switch between multiple systems easier and safer, creating iterative steps towards a holistic investment data management foundation, whether you are a start-up, a global asset manager or a market infrastructure provider.
The time to grasp the nettle has come. The opportunity cost of not doing so is simply too high, given a generation of clients demanding a new way of investing and a flood of regulations calling for operational change. But by embracing a new standard for data management in your organisation, you create a broader opportunity: delivering meaningful change to an otherwise opaque market, efficiency to a far from perfect investment process and transparency to your clients.