Asset Managers Can Learn Lessons from New Government’s Cost Pressures – 21 August 2024

By Thomas McHugh, CEO and Co-founder of FINBOURNE Technology.

Any new government brings the inevitable “change” message, but one thing never changes regardless of who has the keys to the treasury – seeking out departmental cost savings wherever humanly possible. With unprotected departments facing cuts of up to 2.9% according to the Institute for Fiscal Studies, Rachel Reeves faces the unenviable task of making eye-watering efficiency savings while also boosting growth.

This fiscal pressure provides a timely parallel for asset managers, who are also grappling with their own rising costs, particularly in the thorny area of operations. Just as any new government must prioritise where to make cuts without harming essential public services, asset managers need to navigate the longstanding challenge of reducing costs while protecting key business functions.

Moves as drastic as ditching trading terminals to save millions, as some firms have considered, underscore the immense pressure to streamline operations amid years of shrinking margins and heightened competition. Streamlining data management makes more sense as a cost-cutting approach, as it brings the added benefits of more effective decision-making and the opportunities to innovate that arise from better-managed data.

Much like our political system, data in asset management firms is often a tangled mess. Years of patching together disparate solutions with siloed data sets have resulted in a Frankenstein-like tech stack. Attempting to change this through one big, all-encompassing programme that promises a new way can lead to disappointment and a change effort divorced from reality – more like a Liz Truss-style fiscal horror show than an efficient machine. Transforming data into a core operational asset, at a manageable cost, can be a real game changer for asset management firms.

With all this in mind, like politicians addressing their electorate, asset managers must prioritise the needs of their investors. Extensive change to data management strategies may be needed, but it doesn’t follow that changes should all take the form of ‘megaprojects’. Technology should simplify, not complicate. The right software should work seamlessly, providing accurate and timely information that transforms how firms use data to deliver enhanced services to their clients.

The lesson from the government’s fiscal challenges is clear: prioritise, streamline, and modernise. Asset managers should invest in integrated data management solutions that will ultimately result in a leaner, more efficient operation capable of thriving in a competitive landscape.

Unlocking Private Market ESG Data through AI – 8 August 2024

By Yann Bloch, VP Product Management at NeoXam.

In today’s investment world, the importance of integrating environmental, social, and governance factors into investment strategies is no longer up for debate. Asset managers globally recognise that sustainable business practices are not only vital for ethical considerations but are also critical for long-term financial performance. Despite this recognition, a significant challenge persists: accessing reliable and comparable ESG data, particularly from private companies that often lack standardised reporting practices. The solution to this problem lies in the innovative use of artificial intelligence (AI) technologies.

Private companies are increasingly producing sustainability reports that provide valuable insights into their ESG performance. However, these reports come in various formats, use different terminologies and offer varying levels of detail, creating a complex, unstructured data landscape. This lack of standardisation makes it difficult for asset managers to efficiently extract and utilise the data, hindering their ability to make informed investment decisions that align with ESG criteria.

The emergence of AI is poised to revolutionise how asset managers handle private market ESG data. AI, particularly machine learning models, can be trained to recognise and interpret the diverse formats and terminologies used in these sustainability reports. Take natural language processing (NLP) as a prime case in point. A subfield of AI focused on the interaction between computers and human language, NLP can automatically extract key data points from unstructured texts. This transformation of unstructured data into structured, actionable information is a major step forward for the industry.
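
To make the idea concrete, here is a minimal, hypothetical sketch of the extraction step. It uses simple pattern matching rather than a trained NLP model, and the report snippet, field names and patterns are invented for illustration; a production pipeline would rely on proper language models and far broader coverage.

```python
import re

# Hypothetical report text and patterns, purely for illustration.
REPORT = """
In FY2023 our Scope 1 emissions were 12,400 tCO2e, down from 13,900 tCO2e.
Women represented 38% of senior management at year end.
"""

PATTERNS = {
    "scope_1_emissions_tco2e": r"Scope 1 emissions (?:were|of) ([\d,]+)\s*tCO2e",
    "women_in_senior_mgmt_pct": r"Women represented (\d+)%",
}

def extract_esg_fields(text: str) -> dict:
    """Return whichever ESG data points the patterns can find in the text."""
    fields = {}
    for name, pattern in PATTERNS.items():
        match = re.search(pattern, text)
        if match:
            fields[name] = float(match.group(1).replace(",", ""))
    return fields

print(extract_esg_fields(REPORT))
# {'scope_1_emissions_tco2e': 12400.0, 'women_in_senior_mgmt_pct': 38.0}
```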

One of the primary benefits of using AI in this context is the ability to automate the data extraction process. Traditionally, asset managers had to manually sift through reports, a time-consuming and error-prone process. AI tools can scan thousands of documents in a fraction of the time it would take a human, ensuring that no critical information is overlooked. This not only increases efficiency but also allows asset managers to process larger volumes of data, providing a more comprehensive view of a company’s ESG performance.

AI is great for extracting data and even better when combined with robust data management technology. At the receiving end of AI-driven extraction, robust data management systems ensure the quality of the extracted data, including its consistency and completeness, and combine it with data from other sources. This integrated approach amplifies the value of AI by providing a holistic view of ESG metrics, essential for informed decision-making.

In addition, AI can enhance the comparability of ESG data from private companies. By standardising the extracted information, these technologies enable asset managers to compare ESG metrics across different firms, even if the original reports were vastly different in format and detail. This level of comparability is crucial for making informed investment decisions and for accurately assessing the ESG performance of potential investment targets.
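
Following on from the extraction sketch above, the fragment below illustrates one way standardisation might work: mapping differently named and differently scaled fields onto a single canonical schema so figures from two firms can be compared. The aliases, unit factors and sample values are invented for the example.

```python
# Hypothetical aliases and unit conversions, for illustration only.
FIELD_ALIASES = {
    "ghg_scope1_t": "scope_1_emissions_tco2e",
    "scope 1 (ktCO2e)": "scope_1_emissions_tco2e",
}
UNIT_FACTORS = {"scope 1 (ktCO2e)": 1000.0}  # kilotonnes -> tonnes

def standardise(raw: dict) -> dict:
    """Rename fields to a canonical schema and convert them to common units."""
    canonical = {}
    for field, value in raw.items():
        name = FIELD_ALIASES.get(field, field)
        canonical[name] = value * UNIT_FACTORS.get(field, 1.0)
    return canonical

firm_a = {"ghg_scope1_t": 12400.0}
firm_b = {"scope 1 (ktCO2e)": 12.0}
print(standardise(firm_a))  # {'scope_1_emissions_tco2e': 12400.0}
print(standardise(firm_b))  # {'scope_1_emissions_tco2e': 12000.0}
```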

Another significant advantage is the ability to keep pace with the evolving ESG reporting landscape. As regulatory requirements and industry standards for ESG reporting continue to develop, AI models can be updated to incorporate new criteria and metrics. This ensures that asset managers are always working with the most current and relevant data, maintaining the accuracy and reliability of their ESG assessments.

The integration of AI into ESG data management also supports transparency and accountability. By providing clear, structured data, these technologies enable asset managers to present their ESG findings to stakeholders with greater confidence and clarity. This transparency is not only beneficial for investor relations but also for meeting regulatory requirements and for maintaining the trust of clients who are increasingly demanding sustainable investment options.

The application of AI technologies in extracting private market ESG data represents a significant advancement for asset managers. These tools address the critical challenge of unstructured data, providing a streamlined, efficient, and reliable means of accessing the information necessary to drive sustainable investment strategies. As the industry continues to evolve, embracing these technological innovations will be essential for asset managers looking to stay ahead of the curve and deliver on their commitments to sustainable investing.

Latest UK SDR Measure Highlights Data Challenge – 6 August 2024

The UK has implemented the latest stage of its sustainability disclosure requirement (SDR), which is designed to encourage manufacturers of investment products to adopt measures that will prevent greenwashing.

Before the measure was even introduced by the Financial Conduct Authority (FCA), however, it was apparent that fund managers’ likelihood of adopting the guidance would be limited by their data setups. Experts have told Data Management Insight that solving this challenge would be critical to meeting the goals that underpin the SDR.

Since July 31, managers have been asked to voluntarily label their products according to the degree to which they can be considered sustainable.

Those labelled “Sustainability Improvers” denote assets that have the potential to become sustainable but may not be now. “Sustainability Impact” products are those that invest in solutions that bring beneficial ESG impacts. “Sustainability Mixed Goals” labels indicate investment vehicles that combine the other two. A fourth, “Sustainability Focus”, is reserved for products that have at least 70% of allocations to sustainable assets.
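
By way of illustration, the only quantitative test described above is the 70% allocation threshold for the “Sustainability Focus” label. The sketch below checks just that test against a toy portfolio; the holdings and the set of assets treated as sustainable are invented, and the other three labels rest on qualitative criteria, so this is nothing like a complete labelling engine.

```python
# Illustrative only: the 70% allocation test for "Sustainability Focus".
def qualifies_for_sustainability_focus(allocations: dict[str, float],
                                        sustainable_assets: set[str]) -> bool:
    """True if at least 70% of allocations (by weight) sit in sustainable assets."""
    total = sum(allocations.values())
    if total == 0:
        return False
    sustainable = sum(weight for asset, weight in allocations.items()
                      if asset in sustainable_assets)
    return sustainable / total >= 0.70

portfolio = {"GreenCo": 50.0, "WindFund": 25.0, "LegacyOil": 25.0}
print(qualifies_for_sustainability_focus(portfolio, {"GreenCo", "WindFund"}))
# True: 75% of the portfolio is in assets treated as sustainable here
```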

Those seeking to adopt the labels must show they meet the requirements by the beginning of December.

Clarity Needed

Critics have predicted a slow uptake of the labels by fund houses, with some arguing that more clarity is needed about how the labels can be properly applied. At the heart of that challenge is likely to be firms’ ability to gather and use the data necessary to make those decisions.

The FCA said last year that asset managers and manufacturers must have robust data, governance and technology setups to adopt its measures. A poll during a webinar by consultancy firm Bovill Newgate, however, found that 90% of financial services respondents said they were not equipped with the correct ESG reporting data.

Emil Stigsgaard Fuglsang, co-founder at ESG data and consultancy firm Matter, said data would be a potential pain point for firms operating in the UK.

“While many global investment managers already have these competencies in place thanks to the requirements of other regulations, the majority of smaller British firms do not,” Fuglsang said.

“This means they face the challenge of accurately defining sustainability in their investments and implementing data and analytics solutions to track and document their performance against these definitions at the fund-level. This will be no easy task, but those who take action now will be best prepared by the December 2 deadline.”

Investor Protections

The labelling guidance follows the publication of anti-greenwashing advice by the FCA earlier this year, which seeks to protect investors from abuse by encouraging asset managers and manufacturers to be clear and precise in the descriptions of their products.

The FCA is keen to safeguard investors against being lured by false claims of an asset or product’s sustainability. The threat of greenwashing has been wielded as a weapon in an ESG backlash, most notably in the US, that has seen billions of dollars pulled from sustainability-linked funds.

While the measure is designed primarily to protect retail investors, it is expected also to have an impact on institutional capital allocators. One of the first funds to adopt an SDR label, AEW’s impact fund, has taken the Sustainable Impact categorisation and is offered only to institutions.

The SDR is also widely predicted to set transparency standards that institutions are likely to follow.

ESMA Guidance

The UK’s latest SDR implementation came as Europe’s regulators sought changes to some of the European Union’s disclosure rules. The European Securities and Markets Authority (ESMA) last week suggested changes that would affect the bloc’s lynchpin Sustainable Finance Disclosure Regulation (SFDR) and other measures.

In an opinion, it set out proposals urging tweaks to the EU’s wider sustainable finance framework, arguing that there needs to be greater “interconnectedness between its different components”.

Among ESMA’s proposals are a phasing out of the phrase “sustainable investments” within the SFDR and a recommendation that market participants should instead make reference only to the green Taxonomy that underpins European market rules. Further, it suggested an acceleration of the Taxonomy’s completion, incorporating a social taxonomy.

It also urged that ESG data products be brought under regulatory scrutiny to improve their quality.

Clash of Standards

Other recommendations on how sustainability products should be described could conflict with the new measures introduced by the FCA.

ESMA suggests that all products provide basic information on their sustainability, with greatest detail given to institutional investors. It also urges the introduction of a “best in class” product categorisation system. That would include at least a “Sustainability” classification, denoting products that are already green, and a “Transition” grouping of funds that aim to be sustainable.

Martina MacPherson, head of ESG product strategy and management at SIX Financial Information, said institutions would need to familiarise themselves with each code.

“Challenges for asset managers remain to categorise funds in line with the UK’s labelling regime, and to align them with the EU’s fund labelling rules introduced by ESMA,” MacPherson said. “Overall, ESG fund labels represent a significant next step to address transparency and greenwashing concerns. Meanwhile, the mounting public and regulatory attention surrounding sustainable investment demands firms to use the most reliable, legitimate, and timely data to inform their decisions.”

Citigroup Fine Shows Importance of Having Robust Data Setup – 30 July 2024

The US$136 million fine meted out to Citigroup for data irregularities dating back to 2020 should serve as a warning to all financial institutions that robust data management is essential to avoid sanctions amid tougher regulatory regimes.

The Federal Reserve and Office of the Comptroller of the Currency (OCC) jointly imposed the penalty on the international banking group after it was found to have put in place insufficient data management risk controls. Further, the group was told to hold quarterly checks to ensure it has safeguards in place.

The action has been seen as a warning that regulators will take a tough stance against data management failings that could have a detrimental impact on banks’ clients and their business. Charlie Browne, head of market data, quant and risk solutions at enterprise data management services provider GoldenSource, said the fine shows that there can be no hiding bad practices.

Greater Scrutiny

“Citigroup’s fine should be a warning to other banks and institutions who may have believed their insufficient data and risk controls could fly under the radar,” Browne told Data Management Insight. “It’s time to adapt, or be forced to pay up.”

Financial institutions’ data management structures are likely to come under greater regulatory scrutiny to protect customers as more of their activities are digitalised, as artificial intelligence is incorporated into tech systems and amid growing acceptance of crypto finance.

As well as data privacy protection measures, organisations will be expected to tighten controls on many other data domains including trading information and ESG reporting. The fallout from the collapse of Silicon Valley Bank last year will also put pressure on lenders’ solvency requirements and crisis management, processes that are heavily data-dependent.

Data Care

Browne said the penalty imposed on Citigroup showed that institutions had to take greater care with their data and controls models because regulators are very aware of how important digital information is to the efficient running of all parts of an enterprise’s operations.

“This fining of Citigroup demonstrates the very real costs associated with banks not being on top of their risk controls and data management,” he said.

“It’s a bold statement from the US rule makers that banks showing complacency about their data issues will be met with regulatory action. Regulators globally are now coming to the understanding that it’s fundamental that financial institutions have effective data management strategies.”

While breaches of Europe’s General Data Protection Regulation (GDPR) and anti-money laundering rules have already been at the root of fines imposed on banks and financial services firms, penalties related to operational use of data are expected to grow.

For example, institutions interviewed by A-Team Group have regularly said they are closely examining the data privacy and IP implications of using outputs from generative AI applications. Their concern is that the content generated will be in breach of copyright on the material on which the model has been trained.

Non-Negotiable

Browne’s comments were echoed by the founder and chief executive of Monte Carlo Data, Barr Moses, who said that as data needs become central to firms’ operations, “data quality becomes non-negotiable”.

“In 2024 data quality isn’t open for discussion — it’s a clear and present risk and it needs our attention,” Moses wrote on LinkedIn.

Browne said that ensuring compliance will require strenuous efforts by organisations to go deep into their data capabilities and processes.

“Data quality and accessibility are, rightly, front of mind; however, it’s also vital that banks consider concepts like data governance and data lineage when assessing the efficiency of their systems and adequately managing their risk. Being able to track data back to source is an important tool that rule makers are increasingly looking to demand of banks, visible in regulations like the ECB’s Risk Data Aggregation and Risk Reporting (RDARR) measures.”
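
To give a flavour of what tracking data back to source can mean in practice, here is a minimal, hypothetical lineage sketch in which each derived dataset keeps references to its inputs, so a figure in a risk report can be walked back to the feeds it came from. Real lineage tooling, and what RDARR-style rules actually require, is far richer than this; all names below are invented.

```python
from dataclasses import dataclass, field

@dataclass
class Dataset:
    """A dataset plus references to the datasets it was derived from."""
    name: str
    source_system: str
    derived_from: list["Dataset"] = field(default_factory=list)

    def lineage(self, depth: int = 0) -> str:
        """Render the chain of sources behind this dataset as an indented tree."""
        lines = ["  " * depth + f"{self.name} ({self.source_system})"]
        for parent in self.derived_from:
            lines.append(parent.lineage(depth + 1))
        return "\n".join(lines)

prices = Dataset("eod_prices", "vendor_feed_a")
positions = Dataset("positions", "books_and_records")
var_report = Dataset("var_report", "risk_engine", derived_from=[prices, positions])

print(var_report.lineage())
# var_report (risk_engine)
#   eod_prices (vendor_feed_a)
#   positions (books_and_records)
```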

Data Warning After UK Signals New Law Covering AI Use – 26 July 2024

Financial institutions operating in the UK must begin ensuring the integrity of their data estates after the newly elected government signalled plans to forge a potentially far-reaching AI bill.

Leaders of two large data management companies said that any new technology law could usher powers of intervention if AI models and processes are seen as likely to cause danger to individuals or companies. Only with robust data management setups would organisations be able to ensure they don’t breach any new law.

Greg Hanson, group vice president and head of EMEA North sales at Informatica, and Arun Kumar, UK regional director at ManageEngine, offered their thoughts after the government of new Prime Minister Keir Starmer revealed its legislative programme for the next parliament.

While the announcement made no mention of a full AI bill, the plans revealed in the King’s Speech, delivered by King Charles at the opening of parliament last week, said the UK will seek to “establish the appropriate legislation to place requirements on those working to develop the most powerful artificial intelligence models”.

Bad Data, Bad Outcomes

“Businesses must now brace for greater intervention and be prepared to demonstrate how they are protecting the integrity of AI systems and large language models,” said Hanson. “Developing robust foundations and controls for AI tools is a good starting point.”

Hanson echoed a view common among technologists and users that AI can only be useful if it is fed good data. Without that, downstream processes will be erroneous and potentially catastrophic to workflows and operations.

“Bad data could ultimately risk bad outcomes, so organisations need to have full transparency of the data used to train AI models,” he added. “And just as importantly, businesses need to understand the decisions AI models are making and why.”

With more institutions putting critical parts of their data processes in the hands of AI technologies, policy makers are worried that miscalculations will lead to a snowball effect of failures and negative impacts on people and businesses. Recent AI malfunctions have led to companies paying damages to affected parties. In February, for instance, Air Canada was forced to offer reparations to a passenger who was given inaccurate information by an AI-powered chatbot.

Hanson said that organisations should begin by ensuring that machines don’t make decisions without human input.

“It’s critical that AI is designed, guided and interpreted from a human perspective,” he said, offering as an example careful consideration about whether large language models have been trained on “bias-free, inclusive data or whether AI systems can account for a diverse range of emotional responses”.

“These are important considerations that will help manage the wider social risks and implications it brings, allowing businesses to tackle some of the spikier challenges that generative AI poses so its transformative powers can be realised,” he said.

Driving Improvements

Arun at ManageEngine, the enterprise IT management division of Zoho Corporation, said a well-crafted bill would do more than simply list what organisations should not do.

“This bill promises to go a long way in helping to tackle the risks that come from a lack of specialised knowledge around this relatively new technology,” he said. Such a bill “could give businesses guidance on how to prioritise trust and safety, introducing essential guard rails to ensure the safe development and usage of AI”.

Pointing to recent ManageEngine research showing that 45% of IT professionals have only a basic understanding of generative AI technologies and that most have no governance frameworks for AI implementation, he said that a bill would provide the confidence businesses need as AI systems improve.

“Introducing legislation on safety and control mechanisms, such as a requirement to protect the integrity of testing data, will help guide the use of AI so businesses can confidently use it to drive business growth,” he said.

Webinar Review: Harnessing the Wider Benefits of Data Identifiers – 23 July 2024

Almost three-quarters of capital markets participants are utilising data standards and identifiers beyond their immediate regulatory use cases, realising the huge benefits that ordered and consistent datasets can bring to an enterprise’s entire operations.

The findings of an A-Team Group Data Management Insight poll showed that 40% of respondents said they are using the resources to a “great extent”, while another 33% are using them to a “good extent”. Just 13% reported they aren’t utilising them at all.

The poll illustrates how financial institutions are seizing on the consistency that identifiers bring to data to turbo-boost use cases such as know-your-customer (KYC) processes and risk management, as well as bring broad operational efficiencies, according to speakers at DMI’s most recent webinar, during which the poll was held.

The webinar, entitled “How to maximise the use of data standards and identifiers beyond compliance and in the interest of the business”, gathered leading participants in the data management and identifiers space. To confine the use of identifiers to satisfying regulatory obligations would be a waste, Emma Kalliomaki, managing director of the Derivatives Service Bureau (DSB) told the webinar.

Broad Strategy

While they are critical to bringing “efficiency and harmonisation”, their broader deployment has become part of data management best practices, Kalliomaki said. Having a data strategy that recognises the applications of such resources to data uses throughout the entire business is critical, she said, adding that this necessitated robust governance models.

Among the speakers was Alexandre Kech, chief executive of the Global Legal Entity Identifier Foundation (GLEIF), which oversees the Legal Entity Identifier (LEI) standard used by companies and financial organisations around the world. Its latest iteration, the verifiable LEI, or vLEI – a cryptographically secure digital representation of the LEI – has been adopted by a large number of companies, especially within global supply chains, Kech said.

The consistency that standards and identifiers bring is also crucial to enabling organisations to “stitch” together datasets across an enterprise, enabling them to identify patterns and outliers in those pools of information, offered Robert Muller, senior group manager and technology product owner at BNY. This, he added, can create the foundations on which AI can be applied and on which the accuracy of analytical models can be improved.
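
The sort of “stitching” Muller describes can be pictured as a join on a shared identifier. The hypothetical fragment below merges a KYC table and a trading table on an LEI column; the identifier values and field names are placeholders invented for the example, and it assumes the pandas library is available.

```python
import pandas as pd

# Two internal datasets with no common naming convention, joined on an LEI column.
kyc = pd.DataFrame({
    "lei": ["LEIEXAMPLE0000000001", "LEIEXAMPLE0000000002"],
    "counterparty_name": ["Acme Holdings PLC", "Bravo Capital Ltd"],
    "risk_rating": ["low", "medium"],
})
trades = pd.DataFrame({
    "lei": ["LEIEXAMPLE0000000001", "LEIEXAMPLE0000000002"],
    "notional_usd": [25_000_000, 4_000_000],
})

combined = kyc.merge(trades, on="lei", how="inner")
print(combined)  # one row per entity, ready for pattern or outlier analysis
```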

Despite recognising the wider benefits of identifiers, many companies are encountering challenges in realising them. Chief among them, according to another poll during the webinar, is their integration with existing applications and systems. Two-thirds of respondents cited this as their chief impediment to broader utilisation.

Integration Challenges

Laura Stanley, director of entity data and symbology at LSEG, said she was unsurprised by the polling. The multiplicity of systems and software deployed by modern financial institutions makes integrating their technology difficult and poses an obstacle to the sort of joined-up thinking that identification standards enable.

Another key challenge facing organisations, according to the poll, was the variety, and regular creation, of identification standards. As well as LEIs, other standards include Unique Product Identifiers (UPIs), the International Securities Identification Number (ISIN) and ISO 20022. These join proprietary naming codes, which companies use internally.

Kalliomaki said that companies should not be deterred by the apparent complexity of these different codes because they are largely complementary. When making a business case for their wider application, they also have the benefit of being low-cost resources, she said.

Further, she added, their wider use also provides organisations the opportunity to help national standards committees play a part in the evolution of identifiers, making them even more useful and bringing greater benefits to financial institutions.

Stanley agreed, echoing a point made by Muller, that the application of AI, and in particular generative AI, was likely to simplify the currently complex process of switching between standards. This, the panel agreed, would require a programme of educating market participants on the benefits of using identifiers more broadly.

Meeting New Capital Markets Challenges: Gresham and Alveo Leaders Discuss Merger and Future Plans – 23 July 2024

The merger of Gresham Technologies and Alveo, which was announced last week, was born of a desire by each company to scale their capabilities to meet growing international demand from financial institutions at a time of increased focus on data management.

The venture saw Gresham Technologies delist from the public market to create the new company, which will be known as Gresham. The deal has resulted in a company that combines Gresham Technologies’ transaction control and reconciliations, data aggregation, connectivity solutions and regulatory reporting capabilities with Alveo’s enterprise data management for market, reference and ESG data.

Backed by Alveo’s majority investor STG, a technology-focused private equity firm, the combined business has got to work promoting what it calls its enterprise data automation offering.

Data Management Insight spoke to chief executive Ian Manocha, formerly head of Gresham Technologies, and chair Mark Hepsworth, who held the leadership role at Alveo, about the genesis of the merger and their plans for the future.

“We think it’s a big thing, and I hope the industry recognises that too,” says Hepsworth.

Data Management Insight: What was the rationale behind this merger?

Ian Manocha: Mark and I have known each other for quite a few years and have always seen the strategic value of working together.

Mark Hepsworth: We’re complementary businesses. We at Alveo focus on enterprise data management, market data, reference data and ESG data and Gresham has built a business around reconciliation, investment management data and connectivity services through to regulatory reporting. The common thread is that we’re both solving data management problems for customers in financial services.

DMI: Where do you see complementarity?

MH: There’s a lot of overlap in terms of some of our customers but also the type of customers that we both sell to, the parts of those customers that we sell to both on the sell side and the buy side, and in areas like exchanges. Also, often at a senior level the same person is responsible for what their firm is doing around market data, as well as reconciliations data, for example, and data management.

DMI: What triggered the eventual decision to merge?

IM: A number of things really came together at the right time. There was STG’s interest in us and the board’s view that our shareholders would be open to an exit at the right price. And from a Gresham perspective, we had a sense that, at this stage of the company’s development, we were going to be better served coming off the public markets and having the backing of a large firm like STG to accelerate our journey to take on the opportunities that we were seeing in the market. Mark and I started having the ‘we are finally going to make this happen’ conversation.

DMI: What are those opportunities?

IM: Between us we’ve got the landscape well covered. So the question now – having got all that data, the capability to manage it and ensure its quality, and of course the reconciliation capabilities – is: what more can we do with it, and how can we convert that into a business opportunity for our clients? That’s the exciting area. So we see an opportunity now to invest more in areas like AI and to invest more in other players in the market.

DMI: What are your plans for growth?

IM: Gresham built a business organically and with some M&A work – we’ve acquired four firms in my nine years at the company. But that’s become more difficult for us on the public markets. It’s well known that there are challenges around liquidity particularly for small caps. We now have the financial backing of STG to look at those opportunities, whether we go after other firms or through organic investment, to fill out that vision of being the leading player in the data automation space for capital markets.

DMI: What will the new company offer its customers?

MH: What we’re really looking to do is create a significant new player in data management for financial services. We now have a broader range of capabilities and data management solutions that stretches further across the enterprise than they did before so we can solve more problems for clients.

Clients have a real focus on data both operationally and in terms of efficiently processing that data and delivering to business users, and doing that with the right level of governance control and transparency. All our customers are regulated and ensuring that they’re using high-quality data in their downstream processes is very important.

IM: Our customers are looking for a real heavyweight player in the data automation and data management space. They want a single heavyweight, well-funded, global company with strong technology capabilities and deep domain expertise to be their partner in their digital transformation because they’re fed up working with people that don’t understand the detail of capital markets data, and they’re fed up with having too many parties to work with.

DMI: What factors are driving the demand you want to meet?

MH: What I’ve seen over the years is that clients effectively feel data management could be easier than it is, that there’s more manual process than they’d like. Both our companies have really focused their roadmaps in recent years on how we make that easier for customers. We moved to the cloud and both adopted open-source technologies that facilitate easy data management, as well as focusing on improving business-user self-service. We really want to make data management easier for customers and that’s really where we’re going with the automation piece in our new tagline.

IM: I’ve long felt that customers are looking to simplify their operating models. It’s not just about having the technical software, it’s also the skills and the capacity to deliver the change that’s needed. That’s particularly true in the mid-sized and smaller firms. There’s no way they can possibly build all that capability in house, so we want to be the partner they seek to deliver that end-to-end capability as a service.

DMI: Are there any practical technical issues you have had to overcome in your integration?

IM: Both firms have got modern development shops, cloud-native tech stacks and we use modern tools, so the kind of legacy stuff that’s harder to move forward is not an issue for us. And at the product level, things like APIs and cloud solutions, you don’t necessarily need to have the deep level of integration you did in the past. So for customers that won’t be visible.

DMI: What products and services will you be offering initially and what do you have in the pipeline?

MH: We will continue to offer those solutions we’re famous for: data automation and control, reconciliations and exceptions management, market data EDM, investment management data aggregation and regulatory reporting. But we’re also excited to get going on new initiatives.

IM: First out of the gates will be offerings for investment managers leveraging the greater richness of data that we now manage on their behalf. Let me give you a practical example: in Alveo market data pricing projects we are readily able to source pricing data for liquid assets but often struggle to obtain pricing for illiquid assets, whereas in many Gresham NAV reconciliation projects we are pulling the latest available pricing for some illiquid assets. So together we can fill a price visibility gap for our customers.

There are many other examples where we can now inject valuable insights into core processes without firms having to invest in costly, risky data lake projects. And thinking more strategically, leveraging the Alveo data management technology will help business users with self-service and distribution of these combined data sets. It’s super exciting for us, and the customers we’ve spoken to are also enthusiastic, which is the acid test.

DMI Webinar Preview: How to Maximise the use of Data Standards and Identifiers Beyond Compliance and in the Interests of the Business – 9 July 2024

Data must be consistent, accurate and interoperable to ensure financial institutions can use it in their investment, risk, regulatory compliance and other processes. Without those attributes, they won’t achieve the efficiencies, surface the insights, action decisions or realise the many other benefits of digitalisation.

Identifiers and standards ensure those attributes can be met. The challenge facing institutions, however, is that such rules often conflict or don’t exist. At the most fundamental level, for instance, company names may not be identically represented across datasets, meaning any analytics or other process that includes that data could be skewed.
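
A toy, hypothetical example of that skew: the same counterparty recorded under three name variants splits its exposure across three rows, while mapping every variant to one identifier restores the true total. The names, amounts and the LEI value are all invented.

```python
# The same counterparty appears under three name variants.
exposures = [
    ("Acme Holdings PLC", 10.0),
    ("ACME Holdings plc", 5.0),
    ("Acme Hldgs", 2.5),
]

# Assumed mapping from each variant to one shared identifier.
name_to_lei = {name: "LEIEXAMPLE0000000001" for name, _ in exposures}

by_name, by_lei = {}, {}
for name, amount in exposures:
    by_name[name] = by_name.get(name, 0.0) + amount
    lei = name_to_lei[name]
    by_lei[lei] = by_lei.get(lei, 0.0) + amount

print(by_name)  # three misleadingly small exposures
print(by_lei)   # {'LEIEXAMPLE0000000001': 17.5} - one consolidated view
```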

When identifiers and standards do align, however, they offer value beyond the advantages that come with clear categorisation. These benefits will form an important part of the conversation in A-Team Group Data Management Insight’s next webinar, entitled “How to Maximise the use of Data Standards and Identifiers Beyond Compliance and in the Interests of the Business”.

Industry Leaders

The webinar will see leading figures from the sector delve into the importance of identifiers and standards as well as provide context about their uses and benefits. On the panel will be: Alexandre Kech, chief executive of the Global Legal Entity Identifier Foundation (GLEIF); Robert Muller, director and senior group manager, technology product owner, at BNY; Emma Kalliomaki, managing director at Derivatives Service Bureau (DSB); and, Laura Stanley, director of entity data and symbology at LSEG.

“Identifiers and standards play a critical role in data management,” GLEIF’s Kech tells DMI. “They facilitate clear identification and categorisation of data, enabling efficient data integration, sharing, and analysis.”

Without them, financial institutions, corporates and other legal entities would struggle with several challenges, he said.

Among those pain points are data inconsistency resulting from different systems using different naming conventions, which would lead to difficulties in data reconciliation and integration, and operational inefficiencies, with manual processes being used to verify and match data increasing the risk of errors and operational costs.

Additionally, Kech said, compliance risks stemming from fragmented and inconsistent data would prevent regulatory requirements from being met effectively, and limited transparency would make it difficult to trace transactions and entities accurately, potentially hindering risk management and auditing processes.

In essence, this would erode trust and reliability in the data, said DSB’s Kalliomaki.

“That is fundamental for firms to fulfil a lot of functions, but regulatory reporting is one that comes with great consequences if not undertaken properly,” she tells DMI.

“When it comes to having data standards, everyone is very aware that to better manage your data, to better assure the quality of your data, to ensure consistency, alignment and harmonisation with your counterparties, and to mitigate the number of omissions and errors you may have, having standards is much more effective from a data management standpoint.”

Growing Need

“The amount of data that financial services firms are engaging with in their financial instrument processes is growing exponentially. Therefore, the need for data standards and identifiers is growing alongside this,” said Stanley at LSEG, which supports a number of identifiers, enabling delivery of a firm’s existing and evolving use cases.

LSEG issues proprietary identifiers such as SEDOL and RIC, acts as a National Numbering Agency for UK ISIN codes, is a globally accredited Local Operating Unit for LEI codes, and recognises the importance of standards across the ecosystem and beyond regulation.

“At LSEG we acknowledge the potential of data when shared; the PermID is fully open and acts as the connective tissue that enables us to identify different objects of information and stitch data sets together.”

More Than Compliance

With robust identifiers and standards in place, the full value of data can be extracted. Among the benefits expected to be discussed in the webinar are:

  • Improved decision-making and analysis
  • Lower costs from reducing the need for manual data processing and reconciliation and from accelerating transaction processing
  • Innovation driven by seamless data exchange between different systems and organisations
  • Enhanced business agility and competitiveness that comes from providing reliable data for strategic planning and risk management.

“I see financial institutions using data standards and identifiers – beyond compliance – to a great extent,” says BNY’s Muller. “There are a number of best practices firms can employ, for instance strategy, design and education, to ensure standards and identifiers deliver value through associated business cases.”

With regulatory demands likely to increase over time, the need for common identifiers and standards is expected to grow in importance and lead to harmonisation across borders.

“As a broader community, we all have to be willing to look at the greater good rather than commercialisation or IP-related aspects,” says Kalliomaki. “That harmonisation of us working together collaboratively is key.”

  • A-Team Group’s How to Maximise the use of Data Standards and Identifiers Beyond Compliance and in the Interests of the Business webinar will be held on July 18 at 10am ET / 3pm BST / 4pm CET. Click here to join the discussion.

Duco Unveils AI-Powered Reconciliation Product for Unstructured Data – 9 July 2024

Duco, a data management automation specialist and recent A-Team Group RegTech Insight Awards winner, has launched an artificial intelligence-powered end-to-end reconciliation capability for unstructured data.

The Adaptive Intelligent Document Processing product will enable financial institutions to automate the extraction of unstructured data for ingestion into their systems. The London-based company said this will let market participants automate a choke-point that is often solved through error-prone manual processes.

Duco’s AI can be trained on clients’ specific documents, learning how to interpret layout and text in order to replicate data gathering procedures with ever-greater accuracy. It will work within Duco’s SaaS-based, no-code platform.

The company won the award for Best Transaction Reporting Solution in A-Team Group’s RegTech Insight Awards Europe 2024 in May.

Managing unstructured data has become a key goal of capital markets participants as they take on new use cases, such as private market access and sustainability reporting. These domains are largely built on datasets that lack the order of the reference, pricing and other data formats with which they must be amalgamated in firms’ systems.

“Our integrated platform strategy will unlock significant value for our clients,” said Duco chief executive Michael Chin. “We’re solving a huge problem for the industry, one that clients have repeatedly told us lacks a robust and efficient solution on the market. They can now ingest, transform, normalise, enrich and reconcile structured and unstructured data in Duco, automating data processing throughout its lifecycle.”

Building Future Growth Around a Foundational Data Core: SIX’s Marion Leslie – 3 July 2024

There’s a neat symmetry in speaking to Marion Leslie, head of financial information at SIX, after one of the busiest six months in the company’s recent history.

SIX, a global data aggregator and operator of exchanges in its native Switzerland, as well as in Spain, has released a flurry of new data products since January, including a suite of ESG tools and two global equities index families that herald a plan to become a one-stop-shop for ETFs.

According to Leslie, the frenetic pace of partnerships, product releases and enhancements this year is just the tip of the iceberg. The Zurich-based, bank-owned organisation has more to come, all built around a trove of data and data capabilities it has built up over more than 90 years of operations.

At heart, it remains a global pricing reference data provider – that’s the “base data” that SIX “is built on”, says Leslie. But the company is putting in place ambitious plans to leverage that core data competency to meet the increasingly complex demands and use cases of financial institutions.

“I believe that the fundamental data set – having really good-quality reference data and pricing data – allows us to create new value-added services and insights to our clients, and that remains the same whether we’re talking about GenAI or good old fashioned master reference,” Leslie tells Data Management Insight from SIX’s offices in London. “Unless you’ve got those basics you can’t really make sensible decisions, let alone produce reliable analytics.”

Expansion Plans

Leslie says SIX sees its USP as the ability to leverage that core data product to create applications for a multiplicity of use cases. Already it is using its fundamental datasets as the backbone of regulatory, corporate actions, tax, sanctions and ESG products for its banking clients.

A slew of recent acquisitions, investments and partnerships have been similarly guided by SIX’s programme of creating services that can tap into its core offering. The purchase of ULTUMUS in 2021 and the deepening of a long-standing association with BITA earlier this year were part of a plan to forge the company’s ETF-servicing business, each deal enhancing SIX’s indexing capabilities.

In ESG too, it has been aggressively striking deals to help burnish a slate of new sustainability offerings. Products unveiled in the past year by ESG product strategy and management head Martina MacPherson all benefit from supply deals struck with vendors including Sustainalytics, MSCI, Inrate and the CDP, as well as new partnerships with companies including Greenomy. Among the ESG products launched recently is an SME assessment tool, which MacPherson said will bring thousands of smaller companies into the ESG data ecosystem, into which banks and investors might otherwise have had no visibility.

Working Data

SIX’s ESG provisions illustrate what Leslie describes as the company’s dedication to making data work for companies.

“Organisations need to figure out how they’re going to incorporate data and how they’re going to make it relevant,” she says. “Well, the only way you can make it relevant is if it’s got something to hook on to, and that’s where you get back to those fundamental data sets.”

Leslie explains that one of the driving forces behind the company’s vigorous expansion plans is the changing demands for data among banks. No longer can any part of the industry rely on end-of-day pricing data, or monthly and quarterly reports. Ditto for risk managers and compliance teams.

The consequence has been a shift in the workloads of the front, middle and back offices. No longer is research the preserve of middle-office teams, Leslie offers as an example; the front office needs those insights quicker, and so it has made sense for banks to embed data access and functionality within asset managers’ own analytical workflows.

“Asset managers see that the speed of data is increasing all the time and so the buy side, which was perhaps in the past much more built around end-of-day or less immediate requirements, is moving much more into real-time and intraday needs,” she says. “That requires, therefore, real-time market data, and that is expected by regulators, it’s expected by customers, and it’s therefore expected by market participants.”

AI Challenge

Jokingly, Leslie likens data operations to raising a child: it needs constant attention and feeding to grow and thrive. The comparison is just as true of banks’ data management needs, which are constantly changing and growing, influenced by internal needs and external innovations. That’s exemplified by the race to integrate artificial intelligence (AI) into processes and workflows.

Recent SIX research found that more than nine out of 10 asset managers expect to be using AI within the next three years and that half already do. Driven by its own clients’ need to understand what AI will mean to them, SIX has begun looking at how it can enhance its products with the various forms of AI available.

It has taken a structured approach to the programme and is looking at where AI can help clients improve efficiency and productivity; examining how it can improve customer experience and support; and, testing how it can be incorporated into products. For the latter, SIX is experimenting with off-the-shelf GenAI technology to identify aberrations in trading patterns within a market abuse solution.
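
This is not SIX’s implementation, but to illustrate the general shape of flagging aberrations in trading patterns, the sketch below scores new order sizes against a baseline of normal activity and flags extreme outliers. A GenAI-based surveillance tool would reason over far richer context; all figures here are invented.

```python
import statistics

def flag_aberrations(baseline: list[float], new_orders: list[float],
                     threshold: float = 3.0) -> list[int]:
    """Return indices of new orders that sit far outside the baseline distribution."""
    mean = statistics.mean(baseline)
    stdev = statistics.pstdev(baseline)
    if stdev == 0:
        return []
    return [i for i, size in enumerate(new_orders)
            if abs(size - mean) / stdev > threshold]

normal_order_sizes = [100, 120, 95, 110, 105, 98, 102, 115]
print(flag_aberrations(normal_order_sizes, [103, 99, 5000]))  # [2] - the 5,000 lot stands out
```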

On this subject, too, Leslie stresses that SIX can only think about such an evolution because it is confident that it has a solid foundational data offering.

“Our role is to make sure that we’re providing data that is fit for purpose and enables our clients to do business in a competitive way,” she says. “So that will include, as it always has, providing trusted, reliable data that the client knows is fit for purpose and on which they can make decisions. And that’s as true if it’s going to an AI model as if it’s going into a client digital wealth platform or portfolio reporting or risk solution.”

Values Align

Leslie took up her latest role at SIX in 2020 and also is a member of the board for the SIX-owned Grupo BME, Spain’s stock exchange, previously holding roles at LSEG and Thomson Reuters.

She is proud to be part of an organisation whose stakeholders are banks – about 120 of them – and not shareholders “trying to race to hit a quarter result”. She feels a very strong alignment with its values, too.

“It’s an organisation whose purpose is to enable the smooth functioning of the economy and has consistency and trust at the very core,” she says. “When half the world is voting this year, this stuff’s important, and when we’re talking about AI, or we’re talking about market failures then the thing that brings trust and progress is the data that sits behind it. To be a trusted provider in this day-and-age is a critical service.”
