Standards – A-Team
https://a-teaminsight.com/category/standards/

Latest UK SDR Measure Highlights Data Challenge
https://a-teaminsight.com/blog/latest-uk-sdr-measure-highlights-data-challenge/?brand=dmi
6 August 2024

The UK has implemented the latest stage of its sustainability disclosure requirement (SDR), which is designed to encourage manufacturers of investment products to adopt measures that will prevent greenwashing.

Before the measure was even introduced by the Financial Conduct Authority (FCA), however, it was apparent that fund managers’ likelihood of adopting the guidance would be limited by their data setups. Experts have told Data Management Insight that solving this challenge would be critical to meeting the goals that underpin the SDR.

Since July 31, managers have been asked to voluntarily label their products according to the degree to which they can be considered sustainable.

Funds labelled “Sustainability Improvers” hold assets that have the potential to become sustainable but may not be yet. “Sustainability Impact” products are those that invest in solutions bringing beneficial ESG impacts. “Sustainability Mixed Goals” labels indicate investment vehicles that combine the approaches of the other labels. A fourth, “Sustainability Focus”, is reserved for products that allocate at least 70% of their assets to sustainable investments.

Those seeking to adopt the labels must show they meet the requirements by the beginning of December.

Clarity Needed

Critics have predicted a slow uptake of the labels by fund houses, with some arguing that more clarity is needed about how the labels can be properly applied. At the heart of that challenge is likely to be firms’ ability to gather and use the data necessary to make those decisions.

The FCA said last year that asset managers and manufacturers must have robust data, governance and technology setups to adopt its measures. A poll during a webinar by consultancy firm Bovill Newgate, however, found that 90% of financial services respondents said they were not equipped with the correct ESG reporting data.

Emil Stigsgaard Fuglsang, co-founder at ESG data and consultancy firm Matter, said data would be a potential pain point for firms operating in the UK.

“While many global investment managers already have these competencies in place thanks to the requirements of other regulations, the majority of smaller British firms do not,” Fuglsang said.

“This means they face the challenge of accurately defining sustainability in their investments and implementing data and analytics solutions to track and document their performance against these definitions at the fund-level. This will be no easy task, but those who take action now will be best prepared by the December 2 deadline.”

Investor Protections

The labelling guidance follows the publication of anti-greenwashing advice by the FCA earlier this year, which seeks to protect investors from abuse by encouraging asset managers and manufacturers to be clear and precise in the descriptions of their products.

The FCA is keen to safeguard investors against being lured by false claims of an asset or product’s sustainability. The threat of greenwashing has been wielded as a weapon in an ESG backlash, most notably in the US, that has seen billions of dollars pulled from sustainability-linked funds.

While the measure is designed primarily to protect retail investors, it is expected also to have an impact on institutional capital allocators. One of the first funds to adopt an SDR label, AEW’s impact fund, has taken the Sustainability Impact categorisation and is offered only to institutions.

The SDR is also widely predicted to set transparency standards that institutions are likely to follow.

ESMA Guidance

The UK’s latest SDR implementation came as Europe’s regulators sought changes to some of the European Union’s disclosure rules. The European Securities and Markets Authority (ESMA) last week suggested changes that would affect the bloc’s lynchpin Sustainable Finance Disclosure Regulation (SFDR) and other measures.

In a formal opinion, ESMA set out proposals urging tweaks to the EU’s wider sustainable finance framework, arguing that there needs to be greater “interconnectedness between its different components”.

Among ESMA’s proposals are a phasing out of the phrase “sustainable investments” within the SFDR and a recommendation that market participants should instead make reference only to the green Taxonomy that underpins European market rules. Further, it suggested an acceleration of the Taxonomy’s completion, incorporating a social taxonomy.

It also urged that ESG data products be brought under regulatory scrutiny to improve their quality.

Clash of Standards

Other recommendations on how sustainability products should be described could conflict with the new measures introduced by the FCA.

ESMA suggests that all products provide basic information on their sustainability, with greatest detail given to institutional investors. It also urges the introduction of a “best in class” product categorisation system. That would include at least a “Sustainability” classification, denoting products that are already green, and a “Transition” grouping of funds that aim to be sustainable.

Martina Macpherson, head of ESG product strategy and management at SIX Financial Information, said institutions would need to familiarise themselves with each code.

“Challenges for asset managers remain to categorise funds in line with the UK’s labelling regime, and to align them with the EU’s fund labelling rules introduced by ESMA,” Macpherson said. “Overall, ESG fund labels represent a significant next step to address transparency and greenwashing concerns. Meanwhile, the mounting public and regulatory attention surrounding sustainable investment demands that firms use the most reliable, legitimate, and timely data to inform their decisions.”

Data Warning After UK Signals New Law Covering AI Use
https://a-teaminsight.com/blog/data-warning-after-uk-signals-new-law-covering-ai-use/?brand=dmi
26 July 2024

Financial institutions operating in the UK must begin ensuring the integrity of their data estates after the newly elected government signalled plans to forge a potentially far-reaching AI bill.

Leaders of two large data management companies said that any new technology law could usher powers of intervention if AI models and processes are seen as likely to cause danger to individuals or companies. Only with robust data management setups would organisations be able to ensure they don’t breach any new law.

Greg Hanson, group vice president and head of EMEA North sales at Informatica, and Arun Kumar, UK regional director at ManageEngine, offered their thoughts after the government of new Prime Minister Keir Starmer revealed its legislative programme for the next parliament.

While the announcement made no mention of a full AI bill, the King’s Speech, delivered by King Charles at the opening of parliament last week, said the UK would seek to “establish the appropriate legislation to place requirements on those working to develop the most powerful artificial intelligence models”.

Bad Data, Bad Outcomes

“Businesses must now brace for greater intervention and be prepared to demonstrate how they are protecting the integrity of AI systems and large language models,” said Hanson. “Developing robust foundations and controls for AI tools is a good starting point.”

Hanson echoed a view common among technologists and users that AI can only be useful if it is fed good data. Without that, downstream processes will be erroneous and potentially catastrophic to workflows and operations.

“Bad data could ultimately risk bad outcomes, so organisations need to have full transparency of the data used to train AI models,” he added. “And just as importantly, businesses need to understand the decisions AI models are making and why.”

With more institutions putting critical parts of their data processes in the hands of AI technologies, policy makers are worried that miscalculations will lead to a snowball effect of failures and negative impacts on people and businesses. Recent AI malfunctions have led to companies paying damages to affected parties. In February, for instance, Air Canada was forced to offer reparations to a passenger who was given inaccurate information by an AI-powered chatbot.

Hanson said that organisations should begin by ensuring that machines don’t make decisions without human input.

“It’s critical that AI is designed, guided and interpreted from a human perspective,” he said, offering as an example careful consideration about whether large language models have been trained on “bias-free, inclusive data or whether AI systems can account for a diverse range of emotional responses”.

“These are important considerations that will help manage the wider social risks and implications it brings, allowing businesses to tackle some of the spikier challenges that generative AI poses so its transformative powers can be realised,” he said.

Driving Improvements

Kumar at ManageEngine, the enterprise IT management division of Zoho Corporation, said a well-crafted bill would do more than simply list what organisations should not do.

“This bill promises to go a long way in helping to tackle the risks that come from a lack of specialised knowledge around this relatively new technology,” he said. Such a bill “could give businesses guidance on how to prioritise trust and safety, introducing essential guard rails to ensure the safe development and usage of AI”.

Pointing to recent ManageEngine research showing that 45% of IT professionals have only a basic understanding of generative AI technologies and that most have no governance frameworks for AI implementation, he said a bill would provide the confidence needed for AI systems to improve.

“Introducing legislation on safety and control mechanisms, such as a requirement to protect the integrity of testing data, will help guide the use of AI so businesses can confidently use it to drive business growth,” he said.

Webinar Review: Harnessing the Wider Benefits of Data Identifiers
https://a-teaminsight.com/blog/webinar-review-harnessing-the-wider-benefits-of-data-identifiers/?brand=dmi
23 July 2024

Almost three-quarters of capital markets participants are utilising data standards and identifiers beyond their immediate regulatory use cases, realising the huge benefits that ordered and consistent datasets can bring to an enterprise’s entire operations.

The findings of an A-Team Group Data Management Insight poll showed that 40% of respondents said they are using the resources to a “great extent”, while another 33% are using them to a “good extent”. Just 13% reported they aren’t utilising them at all.

The poll illustrates how financial institutions are seizing on the consistency that identifiers bring to data to turbo-boost use cases such as know-your-customer (KYC) processes and risk management, as well as bring broad operational efficiencies, according to speakers at DMI’s most recent webinar, during which the poll was held.

The webinar, entitled “How to maximise the use of data standards and identifiers beyond compliance and in the interest of the business”, gathered leading participants in the data management and identifiers space. To confine the use of identifiers to satisfying regulatory obligations would be a waste, Emma Kalliomaki, managing director of the Derivatives Service Bureau (DSB) told the webinar.

Broad Strategy

While they are critical to bringing “efficiency and harmonisation”, their broader deployment has become part of data management best practices, Kalliomaki said. Having a data strategy that recognises the applications of such resources to data uses throughout the entire business is critical, she said, adding that this necessitated robust governance models.

Among the speakers was Alexandre Kech, chief executive of the Global Legal Entity Identifier Foundation (GLEIF), which oversees the Legal Entity Identifier (LEI) standard used by companies and financial organisations around the world. Its latest iteration, the virtual LEI, or vLEI – a cryptographically secure digital representation of the LEI – has been adopted by a large number of companies, especially within global supply chains, Kech said.

The consistency that standards and identifiers bring is also crucial to enabling organisations to “stitch” together datasets across an enterprise, enabling them to identify patterns and outliers in those pools of information, offered Robert Muller, senior group manager and technology product owner at BNY. This, he added, can create the foundations on which AI can be applied and on which the accuracy of analytical models can be improved.

Despite recognising the wider benefits of identifiers, many companies are encountering challenges in realising them. Chief among them, according to another poll during the webinar, is integration with existing applications and systems. Two-thirds of respondents cited this as their chief impediment to broader utilisation.

Integration Challenges

Laura Stanley, director of entity data and symbology at LSEG, said she was unsurprised by the polling. The multiplicity of systems and software deployed by modern financial institutions makes integrating their technology difficult and is an obstacle to the sort of joined-up thinking that identification standards enable.

Another key challenge facing organisations, according to the poll, was the variety, and regular creation, of identification standards. As well as LEIs, other standards include Unique Product Identifiers (UPIs), the International Securities Identification Number (ISIN) and ISO 20022. These join proprietary naming codes, which companies use internally.

Kalliomaki said that companies should not be deterred by the apparent complexity of these different codes because they are largely complementary. When making a business case for their wider application, they also have the benefit of being low-cost resources, she said.

Further, she added, their wider use also provides organisations the opportunity to help national standards committees play a part in the evolution of identifiers, making them even more useful and bringing greater benefits to financial institutions.

Stanley agreed, echoing a point made by Muller, that the application of AI, and in particular generative AI, was likely to simplify the currently complex process of switching between standards. This, the panel agreed, would require a programme of educating market participants on the benefits of using identifiers more broadly.

DMI Webinar Preview: How to Maximise the use of Data Standards and Identifiers Beyond Compliance and in the Interests of the Business
https://a-teaminsight.com/blog/dmi-webinar-preview-how-to-maximise-the-use-of-data-standards-and-identifiers-beyond-compliance-and-in-the-interests-of-the-business/?brand=dmi
9 July 2024

Data must be consistent, accurate and interoperable to ensure financial institutions can use it in their investment, risk, regulatory compliance and other processes. Without those attributes, they won’t achieve the efficiencies, surface the insights, action decisions or realise the many other benefits of digitalisation.

Identifiers and standards ensure those attributes can be met. The challenge facing institutions, however, is that such rules often conflict or don’t exist. At the most fundamental level, for instance, company names may not be identically represented across datasets, meaning any analytics or other process that includes that data could be skewed.

When identifiers and standards do align, however, they offer value beyond the advantages that come with clear categorisation. These benefits will form an important part of the conversation in A-Team Group Data Management Insight’s next webinar, entitled “How to Maximise the use of Data Standards and Identifiers Beyond Compliance and in the Interests of the Business”.

Industry Leaders

The webinar will see leading figures from the sector delve into the importance of identifiers and standards as well as provide context about their uses and benefits. On the panel will be: Alexandre Kech, chief executive of the Global Legal Entity Identifier Foundation (GLEIF); Robert Muller, director and senior group manager, technology product owner, at BNY; Emma Kalliomaki, managing director at Derivatives Service Bureau (DSB); and, Laura Stanley, director of entity data and symbology at LSEG.

“Identifiers and standards play a critical role in data management,” GLEIF’s Kech tells DMI. “They facilitate clear identification and categorisation of data, enabling efficient data integration, sharing, and analysis.”

Without them, financial institutions, corporates and other legal entities would struggle with several challenges, he said.

Among those pain points are data inconsistency, with different systems using different naming conventions leading to difficulties in data reconciliation and integration; and operational inefficiency, with manual processes used to verify and match data increasing the risk of errors and operational costs.

Additionally, Kech said, compliance risks stemming from fragmented and inconsistent data would prevent regulatory requirements from being met effectively, and limited transparency would make tracing transactions and entities accurately difficult, potentially hindering risk management and auditing processes.

In essence, this would erode trust and reliability in the data, said DSB’s Kalliomaki.

“That is fundamental for firms to fulfil a lot of functions, but regulatory reporting is one that comes with great consequences if not undertaken properly,” she tells DMI.

“When it comes to having data standards, everyone is very aware that to better manage your data, to better assure the quality of your data, to ensure consistency, alignment and harmonisation with your counterparties, and to mitigate the number of omissions and errors you may have, having standards is much more effective from a data management standpoint.”

Growing Need

“The amount of data that financial services firms are engaging with in their financial instrument processes is growing exponentially. Therefore, the need for data standards and identifiers is growing alongside this,” said Stanley at LSEG, which supports a number of identifiers, enabling delivery of a firm’s existing and evolving use cases.

LSEG issues proprietary identifiers such as SEDOL and RIC, acts as a National Numbering Agency for UK ISIN codes and is a globally accredited Local Operating Unit for LEI codes, recognising the importance of standards across the ecosystem and beyond regulation.

“At LSEG we acknowledge the potential of data when shared. The PermID is fully open and acts as the connective tissue that enables us to identify different objects of information and stitch data sets together.”

More Than Compliance

With robust identifiers and standards in place, the full value of data can be extracted. Among the benefits expected to be discussed in the webinar are:

  • Improved decision-making and analysis
  • Lower costs from reducing the need for manual data processing and reconciliation and from accelerating transaction processing
  • Innovation driven by seamless data exchange between different systems and organisations
  • Enhanced business agility and competitiveness that comes from providing reliable data for strategic planning and risk management

“I see financial institutions using data standards and identifiers – beyond compliance – to a great extent,” says BNY’s Muller. “There are a number of best practices firms can employ, for instance strategy, design and education, to ensure standards and identifiers deliver value through associated business cases.”

With regulatory demands likely to increase over time the need for common identifiers and standards is expected to grow in importance and lead to harmonisation across borders.

“As a broader community, we all have to be willing to look at the greater good rather than commercialisation or IP-related aspects,” says Kalliomaki. “That harmonisation of us working together collaboratively is key.”

  • A-Team Group’s How to Maximise the use of Data Standards and Identifiers Beyond Compliance and in the Interests of the Business webinar will be held on July 18 at 10am ET / 3pm BST / 4pm CET. Click here to join the discussion.

GLEIF Creates vLEI Advisory Board to Support Digital Verification Technology Use
https://a-teaminsight.com/blog/gleif-creates-vlei-advisory-board-to-support-digital-verification-technology-use/?brand=dmi
3 July 2024

The Global Legal Entity Identifier Foundation (GLEIF) has formed an international, cross-industry advisory board to provide support for users of its vLEI digital verification technology.

The vLEI Technical Advisory Board will help stakeholders seeking technical, governance and developmental support. Its members are drawn from companies including Accelerate, Esatus and Prosapien, and the board will be chaired by GLEIF IT head Christoph Schneider.

The vLEI is a cryptographically secure digital representation of the LEI, the internationally recognised 20-character code associated with companies around the world. A vLEI enables automatic verification without the need for human checks. GLEIF created the vLEI to help financial and other institutions organise entity-specific data within their systems.

“We have assembled the vLEI Technical Advisory Board to provide this new ecosystem with the best possible chance of succeeding,” said GLEIF chief executive Alexandre Kech. “By connecting the experts and creating the strategic partnerships that will evolve the vLEI’s supporting infrastructure, we aim to establish this system as the fundamental enabler of digital trust across the many value chains that underpin our global economy.”

System Growth

GLEIF said the new board would help accelerate the growth of the vLEI ecosystem by promoting its scalability and technical interoperability. It would also help to promote new use cases and build partnerships with the Open Source community. The board will meet once a month.

The vLEI is built on a chain of trust rooted back to GLEIF to provide verified proof of an entity’s identity. The vLEI infrastructure supports blockchain, self-sovereign identity and other decentralised key management systems.

“The strong, global and scalable governance framework of GLEIF combined with the vLEI technology finally offers a once-in-a-lifetime opportunity to establish a digital trust layer for a variety of enterprise use-cases across multiple industries,” said Vasily Suvorov, a member of the advisory board and chief technical officer of Accelerate.

“This will enable a new era of IT innovation targeting solutions that can automate any intercompany business process that requires regulatory, risk, and policy compliance certainty.”

  • GLEIF CEO Alexandre Kech will be among panellists at Data Management Insight’s next webinar, which will discuss how financial institutions can maximise the use of data standards and identifiers beyond compliance and in the interests of the business. Click here to register for the event, which will be held on 18 July.

Better Data, Better Business: Combat Identity-Related Fraud with the LEI
https://a-teaminsight.com/blog/better-data-better-business-combat-identity-related-fraud-with-the-lei/?brand=dmi
3 June 2024

By Clare Rowley, Head of Business Operations at the Global Legal Entity Identifier Foundation (GLEIF).

The global economy is wrestling with never-before-seen levels of identity-related fraud. Cybercrime costs in the US reached an estimated $320 billion as of 2023, according to Statista, an increase of more than $300 billion since 2017. The latest estimates suggest this trend will continue in the coming years, with cybercrime costs reaching approximately $1.82 trillion by 2028. This rise in digital crime is causing substantial financial damage globally and destroying vital trust between counterparty organisations, particularly those operating across borders and legal jurisdictions.

In a world facing unprecedented digital crime, secure, reliable, and globally recognised organisational identities are a vital prerequisite and a foundation for prospering global trade. Data quality is the bedrock of trust and compliance in the international business sphere. Yet, for this data to deliver on its potential, it must be trustworthy, easily accessible, and accurate.

The Legal Entity Identifier (LEI)

This is where the LEI comes in. The LEI is the only global solution providing organisations with reliable data to unambiguously identify companies and corporate structures worldwide. As a universal ISO identification standard and a code that connects entities to crucial reference information, including ownership structure, the LEI tackles data reconciliation problems across borders and promotes an interoperable identity standard. With more than 2.5 million entities and 400,000 relationships, the openly available LEI dataset provides crucial information about the names, locations, and legal forms of subsidiaries, parents, and company holdings. LEI data helps businesses understand who they’re dealing with.
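As a point of technical illustration (not part of the original article): the LEI is a 20-character ISO 17442 code whose last two characters are check digits computed with the ISO 7064 MOD 97-10 scheme, the same arithmetic used for IBANs. A minimal Python sketch, with function names of our own choosing:

```python
def _to_number(s: str) -> int:
    # ISO 7064 preparation: digits stay as-is, letters A-Z map to 10-35,
    # and the results are concatenated into one large integer.
    return int("".join(str(int(ch, 36)) for ch in s))

def lei_check_digits(prefix18: str) -> str:
    # Append "00" to the 18-character base, then the check digits are
    # 98 minus the remainder mod 97, zero-padded to two digits.
    return f"{98 - _to_number(prefix18 + '00') % 97:02d}"

def is_valid_lei(lei: str) -> bool:
    # A well-formed LEI is 20 alphanumeric characters whose
    # ISO 7064 MOD 97-10 value leaves a remainder of 1.
    return len(lei) == 20 and lei.isalnum() and _to_number(lei) % 97 == 1
```

Appending the computed check digits to any 18-character base yields a string that passes `is_valid_lei`, while altering any single character breaks the mod-97 remainder, which is how the scheme catches transcription errors.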

The Global LEI System creates a never-before-seen level of transparency in party identification. It lays the groundwork for more informed business decisions, fosters growth, encourages collaborations, and deters financial crimes. By addressing inconsistencies in identifying entities, connecting a greater range of datasets, and capturing entity relationships and ownership structures, the LEI can support improved risk management and enable enhanced monitoring, reporting, and analytics.

A deep understanding of corporate legal complexity to identify any challenges

The power of the LEI is underpinned by the quality and precision of its data. A new initiative, the Policy Conformity Flag, designed to encourage all LEI-holding entities to declare and maintain all requested data in the LEI record, is poised to further drive transparency on the completeness and accuracy of reference data within the LEI record. LEIs help businesses identify entities across borders and jurisdictions quickly and easily, making global trade safer and more transparent.

The Policy Conformity Flag offers a compelling opportunity to promote enhanced transparency in transactions: by reporting current and complete legal entity reference data that is open, standardised and of high quality, entities maximise transparency and trust among market participants.

Entities with current and complete information demonstrate their unwavering commitment to transparency by enabling closer monitoring of transaction data and supporting greater clarity in their ownership structures. Their conforming status also signals to partners and other organisations that their LEI can reliably streamline due diligence checks, onboarding, and other counterparty processes, making them easier to do business with.

An up-to-date LEI helps ensure compliance with international regulations

A legal entity with an LEI also benefits from being fast-tracked to regulatory compliance. A diverse and growing number of regulatory frameworks have already mandated its use globally. Offering a clear-cut and accessible profile of each entity can shortcut myriad due diligence processes, many of which are routinely hampered by basic questions like ‘Who is who?’ and ‘Who owns whom?’

For businesses, simplifying and streamlining risk management, compliance, Know Your Customer and Know Your Business processes and client relationship management will result in a faster and smoother path to growth.

LEI to facilitate faster and simpler transactions and partnerships

The effectiveness of LEI data is dependent on its timeliness and accuracy, making the LEI renewal process crucial. Delayed renewals can lead to many issues, including data reliability concerns and the potential for non-compliance penalties.
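Renewal status can be monitored programmatically: GLEIF's public LEI API exposes each record's registration status and next renewal date. A hedged sketch of such a check, assuming the general shape of a GLEIF lei-records response (the field access mirrors the published API, but the thresholds, sample data, and function name are illustrative):

```python
from datetime import datetime, timezone

def renewal_state(record: dict, warn_days: int = 30) -> str:
    """Classify an LEI record as lapsed, due soon, or current."""
    reg = record["data"]["attributes"]["registration"]
    if reg["status"] == "LAPSED":
        return "lapsed"
    # GLEIF timestamps are ISO 8601; normalise a trailing 'Z' for fromisoformat.
    next_renewal = datetime.fromisoformat(reg["nextRenewalDate"].replace("Z", "+00:00"))
    days_left = (next_renewal - datetime.now(timezone.utc)).days
    return "due soon" if days_left < warn_days else "current"
```

Running a check like this over a counterparty master file turns renewal lapses from a surprise into a routine exception queue.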

The quality of data is the bedrock of trust and compliance in the international business sphere. This call for exemplary quality in organisational data is not merely a play for greater compliance. It is a gateway to untapped global growth. It invites legal entities everywhere to unite for an open, transparent, and trustworthy business environment. Because better data means better business.

The post Better Data, Better Business: Combat Identity-Related Fraud with the LEI appeared first on A-Team.

]]>
Webinar Preview: ESG Data Management Challenge of New Sourcing Landscape https://a-teaminsight.com/blog/webinar-preview-esg-data-management-challenge-of-new-sourcing-landscape/?brand=dmi Wed, 29 May 2024 09:27:33 +0000 https://a-teaminsight.com/?p=68644 Financial institutions face a new set of ESG data management challenges even though data sourcing has become easier as the sustainability sector has matured. While more data is available to firms, thanks to a combination of new reporting regulations and standardisation of disclosure frameworks, the increasing variety of information needed and the volumes in which...

The post Webinar Preview: ESG Data Management Challenge of New Sourcing Landscape appeared first on A-Team.

]]>
Financial institutions face a new set of ESG data management challenges even though data sourcing has become easier as the sustainability sector has matured.

While more data is available to firms, thanks to a combination of new reporting regulations and standardisation of disclosure frameworks, the increasing variety of information needed and the volumes in which it will be delivered means that the pressure on data managers to get this information into their systems is unlikely to abate any time soon.

In the next ESG-themed webinar from A-Team Group's Data Management Insight, we will examine the state of play for institutions as they grapple with the implications of this new data sourcing landscape. Among the speakers is Ángel Agudo, board director and SVP of product at Clarity AI, who explained that while obtaining ESG data has become somewhat easier, there are still areas in which vendors can add value.

“We are still in the early stages, and there are still limitations in how companies report their data,” Agudo told Data Management Insight. “So there remains a need to put all that unstructured data together to make it comparable and to complement what’s missing. That means there will be a need to emulate that data through estimates and leverage other sources of information, which could include reports of other organisations, NGO information, news, asset-level data – and more. Ultimately, investors need to make sure the data sourced is fit for purpose.”

Transformation

Agudo will be among a panel of three experts on the “ESG Data Sourcing and Management to Meet your ESG Strategy, Objectives and Timeline” webinar, which will be held on June 11. The other speakers are Aria Goudarzi, SVP and head of ESG data at Neuberger Berman, and Neil Sandle, chief product officer at Alveo.

The sourcing of ESG data has undergone a transformation in the past few years. The space was initially provisioned by established financial data providers. Their one-stop-shop approach was eventually supplemented by the arrival of innovative providers of ESG-specific datasets and analytics. Clarity AI is among those, offering clients tech-based end-to-end solutions for more sophisticated use cases and a higher degree of flexibility to adapt swiftly to changing market needs and requirements.

Other relative newcomers offer customised datasets that are focused on specific ESG themes that are becoming more central to institutions’ investment and risk processes, such as nature and biodiversity, human rights and diversity.

The chain of processes required to integrate ESG data into firms' wider data estate is something that Agudo said needs to be addressed at the sourcing stage, rather than left until the data has been ingested into data management systems. But, he said, the challenge lies in achieving this while also adhering to the firm's overriding needs-based data management methodology.

“You can manage data in the most standardised way possible and there are many platforms that already offer those capabilities. However, embedding the methodology into the data management process is the challenging part,” he said. “Making sure that you process all that information and can integrate it in a way that aligns with the methodology, providing you the right insights, is difficult.”

Easier Process

Nevertheless, institutions are benefiting from a greater convergence of elements required to widen the pipeline of ESG data and increase its availability.

The creation of reporting guidelines by the IFRS Foundation's International Sustainability Standards Board has helped to dovetail several often-competing disclosure codes into one set of guidelines, which regulators around the world are now integrating into their rulebooks.

And on the regulatory front, the European Union’s Corporate Sustainability Reporting Directive, which compels 50,000 companies to begin disclosing their ESG performance data, is expected to provide a template for other jurisdictions to encourage greater data submissions.

“Now that we can start measuring and understanding companies’ ESG performance better, investors need to grow their knowledge of the dependencies of all those metrics and the implications for their own investment decisions,” he said.

“It will be interesting to see the new dynamics with companies and how service providers can support answer those questions that are coming to the table now.”

  • The “ESG Data Sourcing and Management to Meet Your ESG Strategy, Objectives and Timeline” webinar will be held on June 11, 2024, at 10:00am ET / 3:00pm London / 4:00pm CET. There's still time to register by clicking here.

The post Webinar Preview: ESG Data Management Challenge of New Sourcing Landscape appeared first on A-Team.

]]>
SmartStream Adds Exchange Notification Services to Reference Data Utility https://a-teaminsight.com/blog/smartstream-adds-exchange-notification-services-to-reference-data-utility/?brand=dmi Tue, 21 May 2024 11:45:45 +0000 https://a-teaminsight.com/?p=68529 SmartStream Technologies, provider of the Reference Data Utility (RDU), has released an Exchange Notification Service (ENS) designed to track, consolidate and normalise reference data notifications published by exchanges. The service was developed in partnership with clients and extends the services of the RDU, which offers a managed service for vendor-sourced reference data. With more than...

The post SmartStream Adds Exchange Notification Services to Reference Data Utility appeared first on A-Team.

]]>
SmartStream Technologies, provider of the Reference Data Utility (RDU), has released an Exchange Notification Service (ENS) designed to track, consolidate and normalise reference data notifications published by exchanges. The service was developed in partnership with clients and extends the services of the RDU, which offers a managed service for vendor-sourced reference data.

With more than 100 exchanges trading derivatives and publishing in multiple formats, managing the subscriptions from each exchange is a tough task. SmartStream ENS recognises that missed exchange notifications in the reference data space can be costly. It removes firms' overhead of manually monitoring these notices and delivers a consolidated list of normalised notifications, providing a cost-effective, timely, complete and accurate service.

Linda Coffman, executive vice president at the SmartStream RDU, says: “Each exchange publishes notifications at various time intervals throughout the day and does not follow a standard template or delivery method to publish the notifications used across the industry. Following conversations with our clients, we decided to build the ENS. The service overcomes these issues with real-time intelligence and encompasses all the information that a financial firm needs to manage its reference data from the exchanges.”
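The core of any such service is mapping each exchange's bespoke notice format onto one common schema, so that downstream systems consume a single consolidated feed. A simplified sketch of the idea (the schema and the per-exchange field names are invented for illustration; they are not SmartStream's actual formats):

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass(frozen=True)
class Notification:
    """A normalised exchange notification (illustrative schema)."""
    exchange: str
    effective: datetime
    category: str   # e.g. "corporate_action", "feed_change"
    summary: str

def normalise(raw: dict, exchange: str) -> Notification:
    # Each exchange uses its own field names and conventions; map a couple
    # of hypothetical formats onto the common schema.
    if exchange == "EUREX":
        return Notification(exchange, datetime.fromisoformat(raw["effDate"]),
                            raw["type"].lower(), raw["headline"])
    if exchange == "CME":
        return Notification(exchange, datetime.fromisoformat(raw["effective_date"]),
                            raw["category"], raw["subject"])
    raise ValueError(f"no mapping for {exchange}")
```

The value of the managed-service model is that someone else maintains the per-exchange mappings as templates and delivery methods change.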

The ENS service supports intraday notifications, publishing them immediately, and helps to drive quality improvements across areas such as trading, risk mitigation, and the validation of vendor notifications for corporate actions and feed changes.

The post SmartStream Adds Exchange Notification Services to Reference Data Utility appeared first on A-Team.

]]>
The Hidden Cost of Bad Data – Why Accuracy Pays Off https://a-teaminsight.com/blog/the-hidden-cost-of-bad-data-why-accuracy-pays-off/?brand=dmi Tue, 21 May 2024 11:38:46 +0000 https://a-teaminsight.com/?p=68524 By Ariel Junqueira-DeGarcia, Strategy and Technology Leader at Broadridge. In the financial world, data is king. But many argue that firms are dangerously reliant on data they inadequately understand, unknowingly wielding a double-edged sword that can just as easily enrich as it can erode. This article pulls back the curtain on the murky world of...

The post The Hidden Cost of Bad Data – Why Accuracy Pays Off appeared first on A-Team.

]]>
By Ariel Junqueira-DeGarcia, Strategy and Technology Leader at Broadridge.

In the financial world, data is king. But many argue that firms are dangerously reliant on data they inadequately understand, unknowingly wielding a double-edged sword that can just as easily enrich as it can erode. This article pulls back the curtain on the murky world of data usage, examining the hidden costs of bad decisions driven by flawed information.

At a glance

  • Think bad data is just typos and misplaced commas? Think again. It’s a silent saboteur, shattering profits, sowing operational chaos, and leaving reputational damage in its wake.
  • Bad data hides in plain sight, silently emerging through flawed data capture, messy storage, and faulty processing. By zeroing in on these three battlegrounds, we can strategically deploy resources to better address these challenges and transform data into fuel for future growth.
  • The impact on the bottom line is typically seen in four areas: financial, operational, regulatory and reputational. High-profile financial losses are just as impactful as smaller, frequent incidents that can quickly avalanche into millions of dollars in losses.
  • The largest unseen cost of poor data quality is hindering technological advancement. Without clean data, AI tools are a waste of resources; with clean data they are the future of the industry.

Reliance on data

Broker dealers, asset managers and companies across industries are becoming more data driven and more reliant on data to support operations across the firm. Poor data quality costs organizations an average of $15 million per year, according to Gartner's 2017 Data Quality Market Survey.

Data plays a crucial role in supporting the everyday critical decisions for your organization (figure 1). Beyond a collection of numbers and facts, it is the lifeblood of your business operations, influencing every aspect of your relationship with management, staff, customers and regulators.

Data guides decision-making to drive financial performance – when to buy or sell a stock or assess the risk of a credit agreement. Data delivered via management reporting provides the visibility to assess the strength of your operational performance – reliability of revenue forecasts, vendor spend. You also source data to demonstrate compliance across regulatory regimes, including incorporating new data to keep up with regulatory change. The credibility and reputation of your firm are at stake, as accurate data enables timely resolution of exceptions, control over money movements, and protection of customers from financial crimes.

Bad data is insidious

For data users, it can be frustrating that data is not consistently delivered in a complete, accurate and timely manner. Why is it so hard? The root causes of bad data are pervasive, driven by a legacy of siloed solutions designed to solve short-term problems. In addition, it is easy to overlook just how complex it is to identify the data we need, when we need it, and how it will be managed. Bad data often boils down to three things: data capture, data storage, and data processing.

Data Capture – Poor data quality often manifests through problems with the accuracy or completeness of the data records captured, such as:

  • Incomplete or non-conforming data: empty fields, spelling mistakes, substitutions, non-standard data values excluded from analysis or storage, data entered in the wrong fields
  • Duplicate records: data brought in from multiple sources that results in duplicates that are not addressed
  • Data decay: mismanagement of out-of-date or irrelevant data that continues to be used for reporting.
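These capture problems are mechanical enough to detect automatically. A small sketch of the idea (the field names and the deduplication key are illustrative):

```python
from datetime import date, timedelta

def capture_issues(records: list[dict], max_age_days: int = 365) -> dict[str, list[int]]:
    """Flag the three capture problems above: incomplete, duplicate and decayed rows."""
    issues = {"incomplete": [], "duplicate": [], "decayed": []}
    seen = {}
    for i, rec in enumerate(records):
        # Incomplete or non-conforming: any empty field.
        if any(v in (None, "") for v in rec.values()):
            issues["incomplete"].append(i)
        # Duplicates: same record key seen from another source.
        key = (rec.get("name"), rec.get("isin"))   # illustrative dedupe key
        if key in seen:
            issues["duplicate"].append(i)
        seen[key] = i
        # Decay: rows not refreshed within the allowed window.
        updated = rec.get("updated")
        if updated and (date.today() - updated).days > max_age_days:
            issues["decayed"].append(i)
    return issues
```

In practice these rules would run at the point of capture, so bad rows are quarantined before they reach storage or reporting.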

Data Storage – Bad data from improper data governance often manifests when there are gaps in data practices, such as:

  • Data silos: data sources are often stored and analyzed separately, allowing for incomplete or inaccurate data to be consumed
  • Data swamps: data needs to be updated and cleaned frequently; without orchestration processes to update and organize it, old data will be used for analysis
  • Poor data versioning: gaps in approach to slowly changing dimensions

Data Processing – Under the hood, the data you use to inform critical decisions has gone through many complex processes to land on your screen. Every data pipeline is different, but often includes a set of common processes (figure 2): collection, ingestion, transformation, loading, and consumption.

If there are any gaps in workflow, tooling, or data engineering talent (or capacity), then it is likely that your data pipelines will be producing bad data. This is often seen with new data integrations and data migrations.
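The risk is easiest to see in miniature. A toy pipeline following the same stages, with validation between steps (all names and the row shape are illustrative):

```python
def run_pipeline(raw_rows: list[dict]) -> list[dict]:
    """Toy pipeline mirroring the common stages above, validating between steps."""
    # Ingestion: keep only rows that have the expected shape.
    ingested = [r for r in raw_rows if "price" in r and "symbol" in r]
    # Transformation: normalise types; reject rows that fail to parse.
    transformed = []
    for r in ingested:
        try:
            transformed.append({"symbol": r["symbol"].upper(), "price": float(r["price"])})
        except (TypeError, ValueError):
            continue  # a real pipeline would quarantine and alert, not drop silently
    # Loading/consumption: here, simply return the clean rows.
    return transformed
```

Each gap that is skipped – no shape check, no type validation, no quarantine – is a point where bad data flows through undetected.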

Additionally, the data scientists you hire get stuck dealing with these problems, spending much of their time verifying, cleaning, correcting and wrangling data. Not only are they unable to fix the underlying data issues, but they are also prevented from generating the valuable insights and predictions they were hired for in the first place.

Costs of bad data: impact to your bottom line

Bad data can result in tangible costs to your firm’s bottom line. Where can you see these costs manifest? Here are some real-world examples that highlight the importance of data accuracy and control in financial institutions:

  • Financial performance – a single spreadsheet error cost JPMorgan $6.2 billion due to inaccurate risk models built on faulty data.
  • Operational performance – Citigroup in 2020 accidentally wired $900 million to a group of lenders for the cosmetics company Revlon.
  • Regulatory compliance – GDPR fines exceeding €4 billion serve as a stark reminder: bad data doesn't just violate regulations, it's a ticking time bomb that can explode into hefty financial penalties.
  • Credibility / reputation – data leaks aren't just breaches; they damage trust. Can you afford to lose key customers over preventable data errors? Poor data maintenance can lead to sensitive data leaks or financial loss for a customer whose investments your firm manages. Both scenarios can damage the credibility and reputation of the firms responsible and lead to customer dissatisfaction and attrition.

Conclusion

Clean, accurate and timely data is critical to the future of every organization. It can be both the fuel for your journey and the iceberg that sinks your ship. The cost of bad data manifests across organizations. While identifying and measuring its impact remains critical, delaying action on the root causes is no longer a risk firms can afford to take.

The post The Hidden Cost of Bad Data – Why Accuracy Pays Off appeared first on A-Team.

]]>
BNP Paribas Becomes First EU G-SIB to Join GLEIF Validation Agent Programme https://a-teaminsight.com/blog/bnp-paribas-becomes-first-eu-g-sib-to-join-gleif-validation-agent-programme/?brand=dmi Tue, 07 May 2024 13:40:25 +0000 https://a-teaminsight.com/?p=68372 The Global Legal Entity Identifier Foundation (GLEIF) continues to build out the Global LEI System (GLEIS) with the addition of BNP Paribas as a Validation Agent. The addition of BNP Paribas marks the first global systemically important bank (G-SIB) headquartered in the EU to join the Validation Agent programme. Most recently, the GLEIF added Nord...

The post BNP Paribas Becomes First EU G-SIB to Join GLEIF Validation Agent Programme appeared first on A-Team.

]]>
The Global Legal Entity Identifier Foundation (GLEIF) continues to build out the Global LEI System (GLEIS) with the addition of BNP Paribas as a Validation Agent. The addition of BNP Paribas marks the first global systemically important bank (G-SIB) headquartered in the EU to join the Validation Agent programme. Most recently, the GLEIF added Nord vLEI as the first European GLEIF Qualified vLEI Issuer, and a second Validation Agent in both China and India.

The Validation Agent programme has been live for over two years and there are now more than 15 Validation Agents operating around the world. The framework was developed to enable banks and other regulated institutions to use their know-your-customer (KYC) and client onboarding procedures to help clients obtain LEIs.

Goulven Charlès, CDO at BNP Paribas Corporate & Institutional Banking (CIB), says: “Assuming the role of Validation Agent marks our commitment to promoting greater trust and transparency between global businesses. As part of our continuous data efforts, we are deploying the LEI, thereby delivering value for our clients and across BNP Paribas CIB.”

Stephan Wolf, CEO at GLEIF, adds: “The Validation Agent role was designed to offer a multitude of benefits to any financial institution that engages corporate clients. Both BNP Paribas and its clients will benefit from enhanced client onboarding and lifecycle management processes. In addition to the value derived from the role, BNP Paribas has a clear opportunity to evolve how LEIs are used, both in response to compliance mandates, and more broadly to facilitate greater transparency between organisations around the world to support the fight against financial crime.”

The post BNP Paribas Becomes First EU G-SIB to Join GLEIF Validation Agent Programme appeared first on A-Team.

]]>