FINBOURNE interviews TP ICAP’s Chris Dearie and Roland Anderson
This week, we sit down with Chris Dearie, Deputy CEO, and Roland Anderson, CTO, of TP ICAP’s Data & Analytics business. In a wide-ranging chat, we discuss how TP ICAP is providing insights as well as data to buyside clients, the potentially game-changing impact Google Cloud could have on the buyside, and what the investment industry can learn from Apple.
Dermot: TP ICAP is the largest interdealer broker in the world. What’s your Data & Analytics business focused on?
Chris: Historically, we have monetised the raw content that emanated from our broking desks. Now we are focused on providing insight and further value to contextualise pre- and post-trade data. Our buyside clients are incredibly focused on liquidity: where it resides and how they access it. We operate in an area of fairly esoteric data sets, so providing subject matter expertise, an understanding of the data, and insights about its ingestion and utility is incredibly valuable to our buyside clients.
Becca: What are the most common issues impacting your clients today?
Chris: Cost management is always high on our clients’ agenda from an operational perspective, particularly for those who are consumers of services and solutions. Data quality is becoming more important in the context of increased automation, and issues around the quality and reliability of that data are hugely important. That’s largely because if you’re seeking to ingest larger amounts of data, the way you understand and interact with that data becomes more relevant. At a wider level, risk remains high on our clients’ agenda. I mean that in its broadest sense, but also in how it translates into capital management, risk management, collateralisation and the efficient use of capital – however that is impacted by the operational activities of the institution.
Becca: Has this changed since Covid-19?
Chris: Not Covid specifically, but over the last 6–12 months we have seen a lot of firms interested in participating in new markets, or markets they wouldn’t normally participate in. Our clients want access to large volumes of historical data from a particular market to get a sense of whether they want to be involved. They’re looking at it in terms of market opportunity.
Roland: On the operational side, whilst the move to remote working has been pretty successful across the industry, the challenge now is how to manage the return to work. Managing that transition operationally – from a technology, building and safety perspective – is, I think, a bigger task than the one we had getting people working remotely in the first place.
Dermot: Do you see a difference between how the buy-side consumes data versus investment banks?
Roland: There isn’t a one-size-fits-all data consumption approach for a hedge fund or asset manager. But they all want access to the data directly, whether that’s via FTP, FIX or another means. In terms of how they handle the data, we’re seeing the buyside use a lot of cloud technologies. But the true efficiency of cloud technology is only realised when you keep, store and analyse data in the cloud, as opposed to simply moving data from on-prem solutions to the cloud.
Dermot: You’ve traditionally sold your data through aggregators but now also sell directly to clients. What’s driving that approach?
Chris: We want to respond to our customers and be where they want us to be. If a client wants to take our data directly in the cloud, we have to respond to that need. Our recent bond evaluated pricing tool signalled a real change in direction for us, moving away from providing raw content alone to adding value as well. That came about because the buyside expressed an interest in understanding liquidity in the bond market.
The opportunities for data are growing immensely. The difficulty is finding meaning, value and intelligence in data sets. When building a data product, you have to navigate a use case that drives utility, but also make the construction of the underlying data set as reusable as possible so that you can do 7–8 other things down the line. We are always thinking about data in terms of its structure, its referential capabilities and its metadata, which gives us the ability to change quickly and be interoperable with other systems.
Dermot: Do clients ask you to help with interoperability between your data sets and other data sets they have? For example, overlaying ESG data on bond pricing?
Chris: One of the difficulties with OTC and other non-securitised data is how you reference the instrument in the first place. At a fundamental level, we’ve described the data model very clearly so it becomes as straightforward and obvious as possible. We’ve worked with partners to create feed handlers which effectively translate our data into a generic data view, allowing it to be simply ingested into a downstream system and a downstream operation. On the ESG point, we’ve seen people asking us about bonds in a categorisation that is more consistent with an ESG investment strategy or idea.
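To make the feed-handler idea concrete, here is a minimal, hypothetical sketch of the pattern Chris describes: translating a proprietary OTC record into a generic view a downstream system can ingest, with the instrument-referencing problem handled by falling back through whatever identifiers the source record carries. All field and class names here are invented for illustration and are not TP ICAP’s or any partner’s actual schema.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class GenericQuote:
    """A generic downstream view of one quote (illustrative fields only)."""
    instrument_id: str    # normalised reference, e.g. an ISIN where one exists
    instrument_type: str
    price: float
    currency: str

def handle_feed_record(raw: dict) -> GenericQuote:
    """Translate one proprietary feed record into the generic view."""
    # Referencing the instrument is the hard part for OTC data: fall back
    # through the identifiers the source record happens to carry.
    instrument_id = raw.get("isin") or raw.get("internal_ref") or "UNKNOWN"
    return GenericQuote(
        instrument_id=instrument_id,
        instrument_type=raw.get("asset_class", "otc"),
        price=float(raw["px"]),
        currency=raw.get("ccy", "USD"),
    )

# Example: a raw broker record with no ISIN, only an internal reference.
quote = handle_feed_record({"internal_ref": "TPID-123", "px": "101.25", "ccy": "EUR"})
```

The value of the pattern is that each downstream system only ever sees `GenericQuote`, so a new proprietary source needs one new handler rather than changes to every consumer.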
Becca: What lessons or techniques can the investment industry learn from direct-to-consumer technologies?
Roland: Apple has a laser focus on what the consumer wants, and they always focus on solving a problem or challenge with a tangible outcome. AWS releases approximately 1,500 products a year and only releases products it would use itself. Sometimes the financial services industry can be reactionary, and we can definitely learn from this: only building solutions that add value and deliver a genuine business outcome.
There is also a lot the industry can take from better understanding our own data, for example looking at what people across the buyside are doing with their data, what is actually happening versus what they say they are doing. There is a wealth of information and insights to be gathered from this and we’re certainly focused on listening and working with our own data as much as possible.
Dermot: Looking 3–5 years out, what areas will have the biggest impact on your buyside clients?
Roland: It’s a bit of a cliché, but more workloads will continue to move into the cloud. We see attitudes and cloud adoption vary by institution, and even within organisations themselves. From a market data perspective, there is a belief the cloud cannot do everything required because of low-latency demands, but I think the likes of Google will crack low latency in the next two to five years. If that happens, the concerns around whether the cloud can process sub-second and millisecond updates from matching engines will evaporate. In that case, the buyside will be transformed and we’ll see a lot of cloud adoption for front-to-back processes.
Chris: All of that will engender a spike in data volumes and increased interaction with the data. There will be more interest in discussing ML, AI and data science within the buyside. Operationally, the explosion of data will require greater data management, understanding and optimisation. We will also likely see an increased focus on risk management as it becomes more real-time, which will then start to drive pre-trade decisions. We’ll also see the continued move from active to passive investing, which is why I think indices and ETFs – with ESG as an important aspect of both – will continue to be very interesting.