
A-Team Group Webinar Offers Insight into Improving Data Contributions

Data contributions to financial benchmarks have been a cause for concern since the Libor scandal emerged in 2012, yet despite heavy fines imposed on banks that manipulated the rate, only now are efforts to improve the situation being made in earnest.

Addressing the issues of data contributions to the market, an A-Team Group webinar entitled ‘Bracing for the Wave – or Sailing Ahead of It? – Reducing Risk Through Benchmark Data Controls’, considered the knock-on effects of the Libor scandal and how financial firms can guard against the kind of deviant behaviour that caused it. Andrew Delaney, A-Team Group editor-in-chief, moderated the webinar and set the scene, questioning the extent of the Libor scandal and how firms can ready themselves for increasing regulatory scrutiny of data contributions made to benchmarks and indices.

Professor Michael Mainelli, executive chairman of think tank Z/Yen Group, described the run-up to the Libor scandal, noting the establishment of the benchmark in 1986 and recognition that it was being manipulated as early as 2005. He said: “A number of banks were colluding to fix the Libor rate. Authorities were informed, but from 2005 to 2009 they did nothing. In 2009, the SEC said it would investigate the issue, but from 2009 to 2012, UK authorities continued to do nothing. In 2012, the situation became political and something had to be done. By this time, over $3 trillion worth of financial products were tied to Libor. Banks that had manipulated Libor started to be fined and fines continue to be imposed as we are still unpicking the scandal.”

Noting the fundamental role of Libor in financial markets, and a supine regulatory approach to Libor and other benchmarks that is only now beginning to change, Mainelli outlined one way in which banks can examine trades and discover any that might be suspicious. He promoted automated surveillance based on statistical learning theory as a means of identifying deviant behaviour and suggested compliance should run statistical tests continuously to see what is happening on the trading floor.
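As an illustration of the kind of statistical test compliance might run, the sketch below flags panel submissions that deviate sharply from the consensus using the median absolute deviation, which a single manipulator cannot skew as easily as the mean. The panel names, rates and threshold are hypothetical, and real surveillance would combine many such tests; this is a minimal sketch, not a description of any vendor's product.

```python
from statistics import median

def flag_suspicious(submissions, threshold=3.5):
    """Flag contributors whose submission deviates sharply from the panel.

    Uses the median absolute deviation (MAD), which, unlike the mean and
    standard deviation, is not dragged towards an outlying submission.
    submissions: dict mapping contributor name -> submitted rate.
    """
    rates = list(submissions.values())
    med = median(rates)
    mad = median(abs(r - med) for r in rates)
    if mad == 0:
        return []
    # 0.6745 scales the MAD to match the standard deviation for normal data
    return sorted(name for name, rate in submissions.items()
                  if 0.6745 * abs(rate - med) / mad > threshold)

panel = {"Bank A": 2.41, "Bank B": 2.43, "Bank C": 2.40,
         "Bank D": 2.42, "Bank E": 3.10, "Bank F": 2.44}
print(flag_suspicious(panel))  # Bank E's outlying submission is flagged
```

Run daily across all contributed benchmarks, a test like this gives compliance a continuous, automated first pass before any human reviews a recording.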

On the Libor scandal he concluded: “Banks want to move on from the scandal, but they can’t as litigation is only just warming up. This one will run and run.”

With the scale of the Libor scandal and its aftermath set out, Delaney turned to the practicalities of avoiding deviant behaviour on the trading floor. Representing solution suppliers, Robert Simpson, vice president of Verint’s global Financial Compliance Practice, and Tim Furmidge, head of product management for BT’s Financial Technology Services, proposed a number of options that can support the capture, processing and analysis of data to discover deviance, particularly unstructured data such as voice and chat, which can be difficult to manage.

Simpson described a surveillance platform including speech analytics that can be built using existing technologies and provide data capture, processing, analysis and decision making functionality for both structured and unstructured data. He suggested this type of platform can improve on the typical practice of compliance officers listening to voice recordings of data contributors, but cautioned that for a platform to be effective, methodologies need to be reviewed every quarter and measures put in place to prevent people from exercising inappropriate influence over benchmark submissions. He also noted the requirement to retain records of benchmark submissions and the information used to make them for five years, and the need to provide daily and quarterly reports covering methodologies and how any quantitative and qualitative criteria were used.
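To illustrate the record-keeping requirement Simpson described, the sketch below models a benchmark submission record that keeps the submitted rate, the information used to make it and the methodology, together with a five-year retention deadline. The `SubmissionRecord` type and its field names are illustrative assumptions, not part of any actual platform.

```python
from dataclasses import dataclass
from datetime import date

RETENTION_YEARS = 5  # benchmark submissions must be retained for five years

@dataclass
class SubmissionRecord:
    benchmark: str
    submitter: str
    rate: float
    submitted_on: date
    inputs: list        # the information used to make the submission
    methodology: str    # how quantitative/qualitative criteria were applied

    def retention_deadline(self) -> date:
        """Earliest date the record may be discarded."""
        try:
            return self.submitted_on.replace(
                year=self.submitted_on.year + RETENTION_YEARS)
        except ValueError:  # submitted on 29 February
            return self.submitted_on.replace(
                year=self.submitted_on.year + RETENTION_YEARS, day=28)

rec = SubmissionRecord(
    benchmark="LIBOR USD 3M", submitter="Bank A", rate=2.41,
    submitted_on=date(2014, 6, 2),
    inputs=["interbank quote 2.40", "internal funding cost 2.42"],
    methodology="expert judgement over observed quotes")
print(rec.retention_deadline())  # 2019-06-02
```

Storing the inputs and methodology alongside the rate is what makes the daily and quarterly reports Simpson mentioned reproducible after the fact.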

The outcomes of configuring a voice recording and speech analytics platform in this way include the ability to demonstrate that all communications are recorded, a reduction in the time needed to find relevant data, a smaller headcount engaged in surveillance, and more time spent analysing rather than searching data. Simpson commented: “By using technology that is available now, market practitioners can reduce operational and reputational risk. By investing in technology, they will see added value from call recording.”

Concurring with Simpson’s view of the benefits of automated surveillance of behaviour, BT’s Furmidge described how the underlying technology is evolving. Firms are moving on from a tactical approach that manages silos of data, such as instant messages and email, fixed voice, mobile voice and trades, to comply with specific regulations, towards a more coherent approach that captures, archives, retrieves and analyses all data to achieve compliance with multiple regulations.

He said: “The trend is towards a more coherent and common approach in which all channels of data are dealt with in a similar way. This makes it easier, for example, to recreate a trade as required by regulation. We are also seeing the need for a coherent approach to monitoring across countries, for example to comply with Dodd-Frank rules covering swaps trading. As we move forward with multiple capture engines at the point of entry, a more coherent archive and a common retrieve and analysis environment across all channels, it will be possible not only to capture data for regulatory purposes, but also to mine it to spot market opportunities.”

If this is the end game, Furmidge proposed that firms start the route to complete and coherent surveillance with a practical trial of, perhaps, voice analytics. This would include everything from a discovery session to identify spoken words a firm wants to find, through iterative improvements that drive up the accuracy of word recognition, to a final review that considers whether the trial has bettered manual methods of surveillance and met expectations.
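A first iteration of the keyword-spotting step in such a trial might look like the sketch below, which assumes calls have already been transcribed by a speech-to-text engine and simply escalates any call whose transcript contains a watch-list phrase. The `WATCH_LIST` phrases and the `escalate` function are hypothetical examples; a production system would also handle spelling variants, timings and confidence scores from the recogniser.

```python
# Phrases identified in the discovery session (illustrative examples only)
WATCH_LIST = {"fix the rate", "low submission", "do me a favour"}

def escalate(transcripts, watch_list=WATCH_LIST):
    """Return {call_id: matched phrases} for calls needing human review.

    transcripts: dict mapping call id -> transcribed text.
    Matching is case-insensitive substring search, the simplest
    possible baseline to beat in later iterations of the trial.
    """
    hits = {}
    for call_id, text in transcripts.items():
        found = sorted(p for p in watch_list if p in text.lower())
        if found:
            hits[call_id] = found
    return hits

calls = {
    "call-001": "Can you do me a favour and keep it low today?",
    "call-002": "Usual market update, nothing unusual.",
}
print(escalate(calls))  # only call-001 is escalated for review
```

Comparing the escalations from each iteration against a manually reviewed sample of calls is what drives the accuracy improvements, and the final review Furmidge described.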

While technology can provide solutions to problems such as the manipulation of Libor, ownership of the problems remains a critical yet outstanding issue. Mainelli said: “No-one seems to own the problems. At the moment, they are mostly a legal issue, but firms as a whole need to buck up. Some have started to withdraw from providing pricing to the market, but that is not good. The need is for firms to push on with doing better; they need to get a grip on how to manage indices or risk losing them.”

Similarly, the webinar participants agreed that shutting down communication channels such as chat is no more than a knee-jerk reaction to the problem of poor data contributions being made to the market. As Furmidge concluded: “We need to understand the art of the possible and we need more cooperation among regulators, companies, IT teams and suppliers to deliver complete and effective surveillance solutions to the market.”
