Co-authored by Neil Ryan, Regulatory Lead, and Chris Brook, Co-Founder and Head of Platform Architecture, at FINBOURNE Technology.
In the first part of this two-part blog, we explored the role of a fit-for-purpose CT and an interoperable API framework, and how the two combine to deliver value to firms and the market. In this second part, we look at the different approaches firms will take, based on their existing technology and systems, and why a holistic approach to CT data provides the best foundation for assessing liquidity and capturing market opportunities.
The introduction of this market mechanism, together with the interoperability that APIs can offer, will have significant implications for the systems landscape firms operate in. Depending on the organisation, the level of technological maturity and architectural complexity will vary.
For example, a large sell-side bank may more readily have the technology, scale and resources to consume vast volumes of CT data, even if a fair amount of manual processing is involved. Such consumers have already indicated to us that they would prefer simply to have a feed of ALL the data (i.e. a “push” rather than a “pull” API interface), and to analyse and manipulate it in-house.
Conversely, investment managers with less sophisticated or legacy architectures might struggle with the volume of data, if they were to consume it all. These firms may prefer “pull” APIs, whereby they ask a question of the data (potentially via a SQL query, or similar), and the CT provider does all the ‘heavy lifting’ on their behalf – meaning they only handle the raw transaction-level data in small, specific amounts.
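To make the distinction concrete, here is a minimal sketch of what a “pull”-style interaction could look like, assuming a hypothetical CT endpoint that accepts a SQL-like question and returns only the aggregated answer; the URL, authentication and query dialect are illustrative assumptions, not any provider’s actual interface.

```python
# Hypothetical sketch of a "pull"-style CT interaction: the consumer sends a
# narrow, SQL-like question and receives only the aggregated answer, leaving
# the heavy lifting to the CT provider. Endpoint, auth and query dialect are
# illustrative assumptions, not a real provider's API.
import requests

CT_QUERY_URL = "https://ct-provider.example.com/api/query"  # hypothetical endpoint

query = """
    SELECT isin, COUNT(*) AS trade_count, SUM(quantity * price) AS turnover
    FROM trades
    WHERE trade_date = '2024-03-28' AND isin = 'GB00B03MLX29'
    GROUP BY isin
"""

response = requests.post(
    CT_QUERY_URL,
    json={"sql": query},
    headers={"Authorization": "Bearer <api-token>"},  # placeholder credential
    timeout=30,
)
response.raise_for_status()

# The consumer only ever handles the small, aggregated result set,
# never the full firehose of transaction-level records.
for row in response.json()["rows"]:
    print(row["isin"], row["trade_count"], row["turnover"])
```

In a “push” model, by contrast, the same firm would receive the full transaction-level feed and run this kind of aggregation in-house.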
In either case, the benefit of industry-standard, open APIs is that they should be straightforward for asset management technology teams to consume, enabling business users to access and explore CT data and join it up with other data sets in the organisation to deliver the insights and opportunities that will drive growth.
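As a simple illustration of that “joining up”, the sketch below combines CT-derived prices with an internal holdings extract to produce a basic exposure view; the column names and sample values are assumptions made for the example, not a prescribed schema.

```python
# Illustrative sketch of joining CT-derived prices with an internal holdings
# extract to produce a quick mark-to-market view. Column names and values
# are assumptions for the example only.
import pandas as pd

# CT data pulled via the open API (e.g. the query sketch above), flattened
# into a table of latest traded prices per instrument.
ct_prices = pd.DataFrame(
    {"isin": ["GB00B03MLX29", "US0378331005"], "last_price": [28.41, 171.48]}
)

# Internal position data exported from the firm's portfolio system.
holdings = pd.DataFrame(
    {"isin": ["GB00B03MLX29", "US0378331005"], "quantity": [10_000, 2_500]}
)

# Joining the two data sets gives business users a simple exposure view
# without ever touching raw transaction-level CT records.
exposure = holdings.merge(ct_prices, on="isin")
exposure["market_value"] = exposure["quantity"] * exposure["last_price"]
print(exposure)
```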
Taking a holistic approach to CT data
Taking an interoperable, API-first approach to CT data is a first step towards joining up data with complete accuracy and confidence. But it is important to understand that APIs are not a complete strategy in themselves; they are only a starting point.
While connectivity is a key element in the consumption of CT data, it is effectively meaningless unless firms can interpret the data and derive its full value for investment decision-making.
Firms will need to consider whether they have the right control environment, entitlements protocol and domain knowledge to translate across these data sets, and the varying formats each uses, in order to garner the granular insights that will drive growth.
One obstacle we see in the way of this is the legacy infrastructure capital markets firms currently operate on. In fact, according to a new survey by the DTCC and Celent, as many as 60% of buy- and sell-side firms are still reliant on a mainframe, despite the industry push towards the cloud.
While the mainframe has supported capital markets firms for decades, it has failed to keep pace with changing markets and client needs. As a result, many firms have since accumulated a medley of niche solutions and manual workarounds, forming the legacy technology and data inadequacies that exist today.
The advent of the Consolidated Tape offers an opportune moment to review the investment technology and data processes in place, and to ask whether these will enable firms to efficiently extract and derive value from CT data (not in isolation, but as part of all the critical data living in the firm’s ecosystem), without additional cost and resourcing.
This is particularly timely in a market marked by heightened volatility and geopolitical disruption. Being able to harness real-time data across critical data sets, including CT data, to gain a live view of positions, portfolios, risk and exposure is essential.
To meet the industry’s need for cost optimisation, transparency and low tolerance for risk (when it comes to operational change), firms must look to a solution that can bridge the gap between existing architecture and a future-state data stack.
FINBOURNE’s Modern Financial Data Stack delivers exactly this, in the form of a cloud-native, interoperable data store that facilitates safer, better and faster provision of data.
- The Modern Financial Data Stack is a trusted data fabric that helps firms to understand and gain value from investment, market and reference data residing across existing systems, teams and functions.
- Leveraging a host of SaaS-enabled capabilities and open APIs, the data stack empowers different functions across the investment ecosystem to interpret and view disparate data sets in the format they need.
- With data that can now be implicitly trusted and understood, investment managers can confidently join data together, to create meaningful analytics for decision-making.
- It can also facilitate access to external innovations and emerging technologies, such as AI, to provide richer insights.
For these reasons and more, we believe the combination of a fit-for-purpose CT, an open API strategy and a holistic approach to organisational data, delivered by the right data stack, will create the most consistent and valuable understanding of market prices.
Crucially, pairing insights from CT data with other data sets, and making them available across the organisation, not only promotes greater access to liquidity across functions but ultimately delivers greater transparency of market opportunities, for growth and prosperity.