Fed’s Tarullo Pushes For Mandatory Reference Data Reporting and Standardisation

This week, US Federal Reserve governor Daniel Tarullo brought the issue of data standardisation to the attention of the US Senate during his testimony before the Subcommittee on Security and International Trade and Finance, following up on comments he made last year about the data challenges posed by living wills reform. He proposes establishing a new centralised system of data collection and monitoring, and encouraging greater data standardisation across the reference data space, especially in the areas of instrument and entity identification.

The regulatory community has become increasingly aware of the data management challenge within financial institutions, as it struggles with its own challenge of better tracking systemic risk across financial markets. As noted by Tarullo this week: “The recent financial crisis revealed important gaps in data collection and systematic analysis of institutions and markets.”

To rectify these inadequacies, the US regulator is seemingly keen to kick off a standardisation process: “The Federal Reserve believes that the goals of agency action and legislative change should be: to ensure that supervisory agencies have access to high quality and timely data that are organised and standardised so as to enhance their regulatory missions; and to make such data available in appropriately usable form to other government agencies and private analysts so that they can conduct their own analyses and raise their own concerns about financial trends and developments.”

Tarullo also wants the regulatory community to begin collecting additional data in order to better supervise systemically important large financial institutions. During his testimony, he discussed the investments the Fed has made thus far in evaluating existing data sources and adding new ones so as to better monitor the markets. This investment should be extended, he suggested, to the entire data arena by establishing a new standalone, independent data collection and analysis agency to serve the regulatory community.

He is also keen for legislation to mandate the publication of data by the private sector, such as trade data from OTC derivatives trade repositories. The provision of this data would also need to be more timely, said Tarullo: “This kind of approach will require data that are produced more frequently than the often quarterly data gathered in regulatory reports, although not necessarily real-time or intraday, and reported soon after the fact, without the current, often long, reporting lags. These efforts will need to actively seek international cooperation as financial firms increasingly operate globally.”

He is a strong advocate of a new and improved system of data collection and aggregation at the regulatory level, which he believes will also improve risk management practices within firms by requiring “standardised and efficient collection of relevant financial information”. However, Tarullo does seem to appreciate that this level of standardisation and the introduction of new data collection tools will not be free. “Data collection entails costs in collection, organisation, and utilisation for government agencies, reporting market participants, and other interested parties. Tradeoffs may need to be faced where, for example, a particular type of information would be very costly to collect and would have only limited benefits,” he told the Senate committee.

This endeavour must therefore take into account the fact that not all data is suitable for collection in this manner, he acknowledged, and that it does not necessarily need to be provided in real time, although timeliness is important. “What is considered to be ‘timely’ will depend on its purpose, and decisions about how timely the data should be should not ignore the costs of collecting and making the data usable,” he said.

The data collected should also be user-driven and reported to the particular regulatory bodies in charge of the markets concerned, added Tarullo. Standardisation is key to this endeavour, and he referred directly to the need for a standardised unique identifier for institutions and instruments to make “surveillance and reporting substantially more efficient”.
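
To see what instrument-level standardisation already looks like in practice, consider the ISIN (ISO 6166), an existing standardised instrument identifier whose final character is a Luhn check digit. The short Python sketch below is illustrative only, not drawn from Tarullo’s testimony; it validates that check digit, the kind of basic cross-system consistency check that a common identifier makes possible:

    def isin_check_digit_valid(isin: str) -> bool:
        """Validate the Luhn check digit of an ISIN (ISO 6166).

        An ISIN is 12 characters: a 2-letter country code, a 9-character
        national security identifier and a final check digit.
        """
        if len(isin) != 12 or not isin.isalnum() or not isin[:2].isalpha():
            return False
        # Expand letters to numbers (A=10 ... Z=35); digits pass through unchanged.
        expanded = "".join(str(int(c, 36)) for c in isin.upper())
        # Luhn: from the rightmost digit, double every second digit and sum the
        # digits of each result; a well-formed string sums to 0 modulo 10.
        total = 0
        for i, ch in enumerate(reversed(expanded)):
            d = int(ch)
            if i % 2 == 1:
                d *= 2
            total += d // 10 + d % 10
        return total % 10 == 0

    print(isin_check_digit_valid("US0378331005"))  # True: a valid US equity ISIN
    print(isin_check_digit_valid("US0378331006"))  # False: corrupted check digit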

Tarullo also referred directly to a practical barrier that currently exists in the market: vendor data provision. The reference is timely, given the ongoing investigations into a number of data vendors’ pricing practices around proprietary codes. The Fed is a customer of these vendors but is concerned about the “strong limitations” that may be placed on the sharing of such data and on the manner in which it may be used; a concern that is seemingly shared by the private sector (see the recent customer lobbying of Bloomberg for proof).

“They also create systems with private identifiers for securities and firms or proprietary formats that do not make it easy to link with other systems. Surely it is important that voluntary contributors of data be able to protect their interests, and that the investments and intellectual property of firms be protected. But the net effect has been a non-compatible web of data that is much less useful, and much more expensive, to both the private and the public sector, than it might otherwise be,” he said. Vendors be warned.
