Strategies for Mastering Benchmark Data Management

Benchmark data is fundamental to the functioning of capital markets. Yet, despite decades of use, it remains costly to source, complex to manage and difficult to integrate with existing systems, leaving many firms unable to derive full value from the data. Many capital markets participants also perceive it as the preserve of incumbent providers that can dictate terms, although disruptors are waiting in the wings. The FCA added weight to these concerns following publication of its wholesale data market study, which examined competition in the markets for credit ratings data, benchmarks and market data vendor services, and noted that ‘across all three markets, the FCA has identified areas where competition does not work well. Users may be paying higher prices for the data they buy than if competition was working more effectively’.

The challenges and solutions of benchmark data management were discussed during Unlocking Value – Strategies for Mastering Benchmark Data Management, a recent roundtable lunch hosted by A-Team Group and sponsored by Rimes. Participants included asset managers, market data specialists, data stewards, data management leads, heads of indices and research, experts in data governance, and senior data services managers.

Challenges of managing benchmark data

Kicking off the conversation, host Andrew Delaney, president and chief content officer at A-Team Group, asked the roundtable participants about the challenges of managing benchmark data. The responses were many and varied.

An early respondent categorised the challenges into three areas – governance and oversight, quality of data, and data integration. On governance and oversight, the participant commented: “Revenue from benchmark data is dominated by the top three or four index incumbents out there. They have created layers of commercial policies that require good governance controls and reporting. Index consumers are finding this a real challenge.”

Governance and oversight

The key issues here include data usage and tracking, with firms finding it difficult to track how much data is used, the number of users across locations, and which department or location is reporting the data externally. “From an asset management perspective, there are layers and layers of licenses, and you are paying for the data several times,” said a participant. Another commented that high vendor data costs are difficult to pass on to users of the data. A third said: “As an asset manager, you have to demonstrate value for money, but the vendors are not part of that. If you look at performance, they’re just a component on a conveyor belt, they have no responsibility to demonstrate value. If they put up fees by 10%, 15%, 20% a year, they don’t have to justify why, yet our costs are going up all the time.”

The roundtable participants discussed the potential of disruptive vendors to change the status quo, although their path to prominence will not be easy. “Disruptors are very important because they give us leverage when we are negotiating. But acceptance of those disruptors is an education process, and I think we should all be helping the disruptors by trying to educate portfolio managers that there are alternatives out there,” said a participant. Another added: “It’s not just disruptors, hopefully there is light at the end of the tunnel in that there seems to be a bit more competition, which just hasn’t been there before in the index space.”

The intervention of the FCA was also highlighted. “The FCA indicates there are competition issues in the space, and it’s going to expand on that. So, a tipping point might come, and I think we’d all welcome the day.”

Rimes’ approach is to improve access to, and consumption of, benchmark data. It has provided benchmark data services for many years and continues to invest in solutions based on conversations with clients and evolving needs in the benchmark and index space. “There are a lot of fragmented processes in consuming and managing the data. If you look at individual firms, large asset managers sometimes have over a thousand different systems consuming the data. Sometimes, that means they have 15 different sets of feeds coming in, which is a problem. Performance and attribution checks are going to be different from your risk checks, and that can cause problems too.”

To solve these problems, the company has developed permission modeling and is working to extend this to expose index variants that customers can receive. “We want to be able to say we have all the data in-house, but no, we cannot expose all of it to all our clients because of license restrictions. For customers, the challenge is whether one index is similar to another. You have to investigate the data to see that, but we can see that as long as we have the data on board. We’ve been looking at vector analysis, where we can pick an index and find similar indices programmatically. This is a powerful solution and a good entry point for people to say, okay, this is at least a similar universe, the index is similar. We’re going to have all the data we need available, although we may need an extra license.”
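Rimes did not describe the mechanics of its vector analysis, but a minimal sketch of the general idea might look like the Python below: each index is encoded as a vector of constituent weights over a shared security universe, and candidate indices are ranked by cosine similarity. All tickers, weights and index names are hypothetical, and the use of NumPy is an assumption for illustration, not a statement about the Rimes stack.

```python
import numpy as np

def index_vector(weights: dict[str, float], universe: list[str]) -> np.ndarray:
    """Represent an index as its constituent-weight vector over a shared universe."""
    return np.array([weights.get(security, 0.0) for security in universe])

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """1.0 means identical exposure profiles, 0.0 means no overlap at all."""
    denom = float(np.linalg.norm(a) * np.linalg.norm(b))
    return float(a @ b) / denom if denom else 0.0

# Hypothetical constituent weights for three illustrative indices.
universe = ["AAPL", "MSFT", "NVDA", "AMZN", "META"]
index_a = {"AAPL": 0.30, "MSFT": 0.28, "NVDA": 0.22, "AMZN": 0.20}
index_b = {"AAPL": 0.29, "MSFT": 0.27, "NVDA": 0.24, "AMZN": 0.20}  # near-variant of index_a
index_c = {"NVDA": 0.50, "META": 0.50}                              # very different exposure

vec_a, vec_b, vec_c = (index_vector(w, universe) for w in (index_a, index_b, index_c))
print(f"a vs b: {cosine_similarity(vec_a, vec_b):.3f}")  # high score: a plausible substitute
print(f"a vs c: {cosine_similarity(vec_a, vec_c):.3f}")  # low score: not a similar universe
```

A production system would presumably compare richer features as well, such as sector, region and factor exposures, but constituent overlap is a natural first cut for flagging index variants a firm may already hold under an existing license.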

The Rimes solution speeds up processes and helps firms identify whether they hold multiple licenses covering the same need. The data is validated, quality-checked and made as granular as possible, and a dynamic UI then powers the use case. The company is still working out how much data is appropriate to expose to clients, but has a working proof of concept and hopes to release a commercial solution later this year.

Quality of data

The Rimes platform also addresses data quality issues that can have significant implications for investment decisions and overall portfolio performance. The breadth of its data and its rules on exposure could also help customers select vendor solutions. As one participant put it: “The breadth and depth and quality of data is key for us when selecting an index provider.”

Similarly, the look-up tool should go some way to solving problems around the timing of onboarding indices. One participant explained: “We are in business, we have a mandate, or an existing client wants to change its benchmark, and all five asset managers that are running that money are expected to change the benchmark on a certain date, maybe in four weeks, maybe five weeks, most likely in two weeks. Then all hell breaks loose because one vendor will take eight to twelve weeks to deliver a new index. That’s not acceptable from an operational standpoint.”

Data integration

Moving on, Delaney asked how firms can achieve benchmark data integration with existing technology infrastructures and ensure compatibility with other data sources and analytical tools. The conversation moved quickly to the cloud, with participants interested in scale, flexibility and the ability to consume data in different ways for different use cases.

“Asset managers today use several internal and external systems, and they’re trying to break down siloed databases, some of which are legacy systems. So, is there an option to migrate the database into the cloud? Can we remove the ETL process and go to a cloud-native data warehouse solution?” Most of the participants agreed that this would be a good move, although controls and reporting would still be required, and inherent benchmark data problems such as onboarding new indices cannot be avoided.
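Neither the participants nor Rimes specified an implementation, but the pattern under discussion, querying a governed warehouse table directly rather than maintaining local ETL copies, can be sketched roughly as below. The `my_warehouse_driver` module, table name and credentials are hypothetical placeholders for whatever DB-API connector and schema a given cloud warehouse vendor provides.

```python
from contextlib import closing

# Hypothetical stand-in for a vendor-supplied DB-API connector
# (think Snowflake, BigQuery or Redshift clients); not a real module.
from my_warehouse_driver import connect

def load_constituents(index_code: str, as_of_date: str) -> list[tuple]:
    """Read benchmark constituents straight from the shared warehouse table,
    skipping any local extract-transform-load step."""
    query = """
        SELECT security_id, weight
        FROM benchmark_constituents      -- hypothetical shared, governed table
        WHERE index_code = %s AND as_of_date = %s
        ORDER BY weight DESC
    """
    with closing(connect(account="...", role="read_only")) as conn:  # placeholder credentials
        with closing(conn.cursor()) as cur:
            cur.execute(query, (index_code, as_of_date))
            return cur.fetchall()

# Every downstream system runs the same governed query, so there is one copy of
# the data and entitlement controls are enforced at the warehouse, not per silo.
rows = load_constituents("HYPO_GLOBAL_EQ", "2024-06-28")
```

The participants’ caveat still applies: entitlement controls and usage reporting do not disappear with the ETL jobs, they simply have to be enforced at the warehouse layer instead.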

The importance of a cloud-native marketplace and reliable means of accessing and storing benchmark data was stressed, along with the ability to provide customers with a standardised solution. That way, structured data is held in the database and teams can access it straightaway and start working with it. The challenge here, however, is whether index originators would want to go down this route.

Rimes acknowledged the potential of cloud platforms to provide a marketplace for benchmark data and noted its intention to deliver data that can work with any infrastructure. Concluding, it said: “A bigger topic that typically comes up with customers is how they can ensure that when they consume benchmark data, they don’t have to treat it as an outlier – it should be another data set they are receiving.”
