Data overload: The human cost of doing nothing
I was at TSAM London recently, where it was only a matter of time before ‘big data’ came up in conversation. In the session I was moderating, my fellow panellists noted that, since the big data explosion, the buy-side has found itself in a race to consume and store as much data as possible. But more data isn’t necessarily better data. And while some consider it the ‘new’ oil, hoarding data doesn’t deliver a fixed value or purpose in the way barrels of fuel do.
Data itself is not a commodity; it is a means to an end, and its value lies in the outcome it delivers. What we are not doing today is asking ourselves what outcomes we expect from the data, how it will add value to the business, or how it will help people work more effectively.
As a result, we’ve overloaded an already burdened operational landscape with systems that are simply not geared to store such volumes or to process complex, unstructured data, such as ESG and sentiment data sets. And we’re not fully leveraging that data to help us find value, whether it’s to help us serve our clients better, differentiate from the competition, or explore new business opportunities.
Interestingly, while digitalisation and technological advances are being adopted in the asset management industry, the one area that remains an inherent source of pain is data management. But just as technology can help fuel business processes, it can also address the data overload permeating the industry and holding firms to ransom.
Given the pressures facing asset managers today, the industry needs a swifter alternative to multi-year data transformation programmes, which carry additional operational burden and delivery risk. It requires technology that can work with existing systems to improve outcome-based use of data and remove silos, ensuring that the right people have access to the right data.
More than anything, we need to overcome the inertia that is restraining operations and change the way buy-side firms work, moving away from the burnout caused by data fragmentation and overload, towards data empowerment. Understanding why this inertia has endured, and solving the underlying reasons, is the key to helping the industry move forward – before it’s too late.
The human cost of inertia
While bad data exists for a number of technology-based reasons, there is, curiously, a more human factor sustaining the status quo. In our recently commissioned whitepaper, The Buy Side and System Consolidation – Grasping the Nettle, management inertia was rated by the 52 asset managers surveyed globally as the highest-scoring factor impeding the enhancement of investment management technology.
Furthermore, 45 of the 52 respondents highlighted it as the most acute challenge they face, underlining just how important it is for management to appreciate a firm’s technology needs and allocate the necessary resources. Without the right technology, it can become difficult to respond to market changes, including incoming regulation, or to regain control over spiralling operational costs. What’s worrying is that management inertia doesn’t just impact cost savings; there’s also a human cost.
Every day, valuable time is spent manually correcting errors, creating workarounds to compensate for inadequate, broken data, and attempting to figure out a firm’s positions and exposure. We know that engaged and empowered staff who believe in the technology and the roadmap are essential for long-term success. But this is far from the reality lived today, with demotivated employees and inefficient processes stagnating productivity and growth.
Worse still is the operational risk firms face when these burnt-out people walk out the door, taking valuable knowledge with them. This is a particular danger given firms’ reliance on legacy systems, whose workings are now understood by only a finite number of specialists.
Inaction is no longer an option
The whitepaper found that only 9.6% of respondents believe their data is clean, transparent and reliable. That leaves the vast majority of firms impacted by inaccurate and incomplete data. For the rest, while the business functions in which this happens vary, the root of the pain often comes down to timeliness and uncertainty.
And it’s become a much more serious issue, due to the inexorable growth of data required for ESG strategies and private assets. Frankly, the data that forms a central asset to investment management operations is simply not fit for purpose, and that has put immense pressure on both people and processes.
Alongside management inertia, the whitepaper also found that reliance on legacy or proprietary technology was the second biggest limitation to modernisation. These often ‘closed’ systems have been built in-house or inherited through M&A activity over the years, and cannot easily be ripped out and replaced.
But this is directly at odds with the need for high-quality, accessible and secure investment data, and explains why at least half of respondents experience onerous data silos in their organisation. Over the years, organisational silos have seen firms unknowingly pay for the same data more than once and create a host of workarounds to repurpose what data they can access. This has created a very messy audit trail, eroding trust in the data. Solving this challenge can help employees move away from such laborious tasks, freeing them up for what they do best: generating strong returns.
Front-to-back solutions and even EDMs have provided some relief by consolidating data, but what hasn’t been tackled to date is making the data usable and interoperable across a firm’s operational landscape. Here, technology that can integrate into a firm’s operations, including existing service providers, rather than overhaul them, can make all the difference.
What is needed is a solution that can ingest and consolidate data, but also make usable sense of the variety of formats, templates, languages, and identifiers that have mushroomed across the investment ecosystem. A platform that can take investment, market and reference data of all forms and provide a real-time translation service, so that a firm’s data can speak the language that best fits the operational need.
Delivering this Rosetta Stone for investment operations crucially brings transparency to the process and trust in the data, while the use of Role-Based Access Control (RBAC) eliminates silos by opening up controlled access to investment data. What we need is to make sure the right people can access the right data, in the language they need for the task at hand.
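To make the two ideas concrete, here is a minimal, purely illustrative sketch in Python of an identifier ‘translation service’ combined with an RBAC check. All names here (ID_MAP, ROLE_PERMISSIONS, the roles and dataset labels) are hypothetical assumptions for the example; they do not reflect any real platform’s API or data model.

```python
from typing import Optional

# Hypothetical identifier map: one instrument known under several schemes.
ID_MAP = {
    "US0378331005": {"isin": "US0378331005", "cusip": "037833100", "ticker": "AAPL"},
}

# Hypothetical role-to-dataset permissions for the RBAC check.
ROLE_PERMISSIONS = {
    "portfolio_manager": {"positions", "market_data"},
    "operations": {"positions"},
}

def translate(identifier: str, target_scheme: str) -> Optional[str]:
    """Return the instrument's identifier in the requested scheme, if known."""
    record = ID_MAP.get(identifier)
    return record.get(target_scheme) if record else None

def can_access(role: str, dataset: str) -> bool:
    """RBAC check: does this role have permission to read the dataset?"""
    return dataset in ROLE_PERMISSIONS.get(role, set())

print(translate("US0378331005", "ticker"))      # AAPL
print(can_access("operations", "market_data"))  # False
```

The point of the sketch is the separation of concerns: the translation layer lets each team ask for data in the identifier scheme its workflow expects, while the permission check gates who can see which data sets, without either concern leaking into the other.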
From the whitepaper findings and conversations we’ve had with asset managers, it’s clear that we’ve reached a tipping point and need to address data overload once and for all. What’s more, new data sources, such as ESG and private assets, and regulatory reporting requirements, like SFDR and MiFID II, mean additional processes are continually required.
With the limitations of legacy systems comes the very real potential for regulatory risk. It’s unlikely regulators will look kindly on asset managers unaware of where their data comes from and what data sets are used in calculations and reporting. With operating margins already squeezed, the cost of inaction is considerable, not only to the people on the ground, but also to a firm’s reputation and bottom line.
De-risking operational change
But where there is adversity, there is also opportunity. We know risk tolerance is the main controlling function behind data migration. That’s why we believe de-risking operational change is the key to tackling both inertia and the inefficiencies of legacy systems.
It is therefore vital that we look to technology that integrates with a firm’s existing operational stack and wider ecosystem, delivering a low-risk alternative to big-bang transformation. That’s what we have endeavoured to deliver with the LUSID platform: simple, easy steps to improving efficiency and supporting growth, giving you access to, understanding of, and control over the data you hold, starting with where you need it most.
This has a significant impact, not only improving data management but also the processes that people work with day in, day out. It means you not only see the impact and value sooner but can once again leverage the talent and skills of your workforce in high-value tasks that make a difference, helping you to compete more effectively and serve your clients with new services and solutions.
Winning back control
It’s clear that the status quo is no longer sustainable. Market pressures and new regulations have highlighted how crucial effective data management is for asset managers on a day-to-day basis. Yet management inertia is still holding many buy-side firms back, preventing them from implementing the processes and systems needed to handle vast amounts of data today.
The data dilemma cannot be solved without tackling inaction, but this is only one half of the challenge. Understanding the impact of living with data overload, particularly on your people, and de-risking the process of change is equally significant to winning back operational control and empowering employees.
As industry practitioners having lived with this pain ourselves, we’ve built the technology needed to address the data challenge once and for all, and lower the risk of operational change.
The question is: can you really afford to live with the data overload any longer?