Data Governance & Lineage - A-Team
https://a-teaminsight.com/category/data-governance-lineage/

Unlocking Private Market ESG Data through AI
https://a-teaminsight.com/blog/unlocking-private-market-esg-data-through-ai/?brand=dmi
Thu, 08 Aug 2024 08:27:43 +0000

By Yann Bloch, VP Product Management at NeoXam.

In today’s investment world, the importance of integrating environmental, social, and governance factors into investment strategies is no longer up for debate. Asset managers globally recognise that sustainable business practices are not only vital for ethical considerations but are also critical for long-term financial performance. Despite this recognition, a significant challenge persists: accessing reliable and comparable ESG data, particularly from private companies that often lack standardised reporting practices. The solution to this problem lies in the innovative use of artificial intelligence (AI) technologies.

Private companies are increasingly producing sustainability reports that provide valuable insights into their ESG performance. However, these reports come in various formats, use different terminologies and offer varying levels of detail, creating a complex, unstructured data landscape. This lack of standardisation makes it difficult for asset managers to efficiently extract and utilise the data, hindering their ability to make informed investment decisions that align with ESG criteria.

The emergence of AI is poised to revolutionise how asset managers handle private market ESG data. AI, particularly machine learning models, can be trained to recognise and interpret the diverse formats and terminologies used in these sustainability reports. Take natural language processing (NLP) as a prime case in point. A subfield of AI focused on the interaction between computers and human language, NLP can automatically extract key data points from unstructured texts. This transformation of unstructured data into structured, actionable information is a major step forward for the industry.

One of the primary benefits of using AI in this context is the ability to automate the data extraction process. Traditionally, asset managers had to manually sift through reports, a time-consuming and error-prone process. AI tools can scan thousands of documents in a fraction of the time it would take a human, ensuring that no critical information is overlooked. This not only increases efficiency but also allows asset managers to process larger volumes of data, providing a more comprehensive view of a company’s ESG performance.
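To make the extraction step above concrete, a minimal Python sketch is shown below. The metric names and regex patterns are entirely hypothetical; a production pipeline would use trained NLP models (named-entity recognition, layout parsing) rather than hand-written patterns, but the principle is the same: map free text onto named, typed fields.

```python
import re

# Hypothetical patterns for two ESG metrics; purely illustrative.
PATTERNS = {
    "scope1_emissions_tco2e": re.compile(
        r"scope\s*1\s*emissions[^0-9]*([\d,]+(?:\.\d+)?)", re.IGNORECASE),
    "women_in_workforce_pct": re.compile(
        r"(\d{1,3}(?:\.\d+)?)\s*%\s*of\s+(?:our\s+)?workforce\s+(?:are|is)\s+women",
        re.IGNORECASE),
}

def extract_metrics(report_text: str) -> dict:
    """Turn unstructured report prose into a structured metric record."""
    found = {}
    for name, pattern in PATTERNS.items():
        match = pattern.search(report_text)
        if match:
            found[name] = float(match.group(1).replace(",", ""))
    return found

sample = ("In 2023 our Scope 1 emissions were 12,400 tCO2e, "
          "and 43% of our workforce are women.")
print(extract_metrics(sample))
```

The output is a uniform dictionary of typed values regardless of how each report phrased the underlying figures, which is what makes downstream aggregation possible.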

AI is great for the extraction of data and even better when combined with robust data management technology. At the receiving end of AI-driven data extraction, robust data management systems ensure data quality, including consistency and completeness, and combine it with data from other sources. This integrated approach amplifies the value of AI by providing a holistic view of ESG metrics, essential for informed decision-making.

In addition, AI can enhance the comparability of ESG data from private companies. By standardising the extracted information, these technologies enable asset managers to compare ESG metrics across different firms, even if the original reports were vastly different in format and detail. This level of comparability is crucial for making informed investment decisions and for accurately assessing the ESG performance of potential investment targets.

Another significant advantage is the ability to keep pace with the evolving ESG reporting landscape. As regulatory requirements and industry standards for ESG reporting continue to develop, AI models can be updated to incorporate new criteria and metrics. This ensures that asset managers are always working with the most current and relevant data, maintaining the accuracy and reliability of their ESG assessments.

The integration of AI into ESG data management also supports transparency and accountability. By providing clear, structured data, these technologies enable asset managers to present their ESG findings to stakeholders with greater confidence and clarity. This transparency is not only beneficial for investor relations but also for meeting regulatory requirements and for maintaining the trust of clients who are increasingly demanding sustainable investment options.

The application of AI technologies in extracting private market ESG data represents a significant advancement for asset managers. These tools address the critical challenge of unstructured data, providing a streamlined, efficient, and reliable means of accessing the information necessary to drive sustainable investment strategies. As the industry continues to evolve, embracing these technological innovations will be essential for asset managers looking to stay ahead of the curve and deliver on their commitments to sustainable investing.

Insurance Stress Test Success Hangs on Data Quality and Management
https://a-teaminsight.com/blog/insurance-stress-test-success-hangs-on-data-quality-and-management/?brand=dmi
Tue, 06 Aug 2024 14:04:15 +0000

Recently revealed tests to explore the resilience of insurers to external shocks are likely to succeed – or fail – on the data that the under-scrutiny firms possess.

Data will be a central ingredient in the tests detailed last month by the Prudential Regulation Authority (PRA), which oversees the industry. In its most recent communique, the PRA detailed the design and timing of its Dynamic General Insurance Stress Test (DyGIST), which has been created as risks associated with cyber-attacks, climate change and market volatility are expected to rise.

The tests, to be held in May next year, will comprise live exercises during which firms will be presented with a set of hypothetical adverse events over three weeks to which insurers must respond as if they were real. The PRA will require detailed analyses of responses. Results will be announced as the tests progress and will go on to inform the regulator’s supervisory plans.

The exercises, which will be held alongside a similar test for life insurers, will expect firms to have their data in order if they are to respond adequately, a stipulation that could be an opportunity for the insurance industry to boost its IT capabilities, said Wenzhe Sheng, senior product manager for EMEA prudential regulation at Clearwater Analytics.

“The Prudential Regulation Authority’s design of the DyGIST framework provides a strong foundation for ensuring the resilience of the insurance sector,” Sheng told Data Management Insight. “Moreover, it provides insurers with an incentive to fortify their data infrastructures and implement data driven risk management practices.”

Banking Assessments

The stress tests were announced last year and follow similar exercises focused largely on the UK’s banking industry. They will be carried out to gain a deep understanding of the insurance industry’s solvency and liquidity buffers and to examine the effectiveness of their management response to adverse scenarios.

The PRA held workshops with the Association of British Insurers, the Lloyd’s Market Association and the International Underwriting Association to devise the design and timing of the tests. Professional services giant Deloitte said last week that the bank tests had shown that insurers should prepare for their own assessment by ensuring their data is in order.

“General insurers need to enhance their stress and scenario testing processes to be able to perform the exercise live – including ensuring they can aggregate relevant data and identify potential management actions to deploy for any given scenario,” it said in a report.

Earlier Experience

The importance of having good data in responding to new risks was highlighted in similar stress tests held three years ago by the Bank of England to assess the resilience of insurers and lenders to climate risks. Initial exercises conducted by organisations including AXA, Allianz and AIG revealed concerning failures in data preparedness, which left some struggling to complete the tests and surfaced gaps in critical datasets.

Clearwater’s Sheng said it is imperative that insurers have their data estates ready.

“In order to pass the first phase of the test – a live exercise that tests a firm’s preparedness for adverse market-stressed events – it’s imperative that insurers have the capability to quickly pull up a very clear and transparent view of all of their holdings under these scenarios,” he said. “This is not as common as you would expect, as insurers are increasingly investing in a wide range of assets, which means they are often dealing with very different types of data in their internal systems.”
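A “clear and transparent view of all of their holdings” amounts, at its simplest, to aggregating positions held across disparate internal systems onto one basis. The toy sketch below uses an entirely assumed record shape and stress factors to illustrate the idea:

```python
from collections import defaultdict

# Hypothetical position records drawn from several internal systems;
# the record shape and stress factors are assumptions for illustration.
holdings = [
    {"asset_class": "corporate_bonds", "market_value": 500.0, "stress_factor": 0.92},
    {"asset_class": "equities",        "market_value": 300.0, "stress_factor": 0.80},
    {"asset_class": "corporate_bonds", "market_value": 200.0, "stress_factor": 0.92},
]

def stressed_exposure_by_class(positions):
    """Aggregate current and stressed exposure per asset class."""
    totals = defaultdict(lambda: {"current": 0.0, "stressed": 0.0})
    for p in positions:
        bucket = totals[p["asset_class"]]
        bucket["current"] += p["market_value"]
        bucket["stressed"] += p["market_value"] * p["stress_factor"]
    return dict(totals)

view = stressed_exposure_by_class(holdings)
print(view)
```

The hard part in practice is not the aggregation itself but normalising the very different data formats the source systems hold, which is exactly the gap Sheng describes.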

Sheng warned, however, that the DyGIST could also highlight shortcomings in firms’ data setups.

“When it comes to risk management you need to have a real-time understanding of your risk exposures in order to respond and manage that risk,” he said.

Reason for Hope

He is optimistic that insurance firms will be able to overcome the challenges, thanks to the availability of new innovations that “can provide daily, validated, and reconciled investment data on their entire portfolio, so that they can properly understand their market exposure across asset classes”.

“Those who choose to invest in such modern data infrastructures will be best prepared to demonstrate their solvency and liquidity resilience and their effectiveness in risk management practice under the DyGIST regulatory exercise,” Sheng said.

Latest UK SDR Measure Highlights Data Challenge
https://a-teaminsight.com/blog/latest-uk-sdr-measure-highlights-data-challenge/?brand=dmi
Tue, 06 Aug 2024 14:00:20 +0000

The UK has implemented the latest stage of its sustainability disclosure requirement (SDR), which is designed to encourage manufacturers of investment products to adopt measures that will prevent greenwashing.

Before the measure was even introduced by the Financial Conduct Authority (FCA), however, it was apparent that fund managers’ likelihood of adopting the guidance would be limited by their data setups. Experts have told Data Management Insight that solving this challenge would be critical to meeting the goals that underpin the SDR.

Since July 31, managers have been asked to voluntarily label their products according to the degree to which they can be considered sustainable.

Those labelled “Sustainability Improvers” denote assets that have the potential to become sustainable but may not be now. “Sustainability Impact” products are those that invest in solutions that bring beneficial ESG impacts. “Sustainability Mixed Goals” labels indicate investment vehicles that combine the other two. A fourth, “Sustainability Focus”, is reserved for products that have at least 70% of allocations to sustainable assets.

Those seeking to adopt the labels must show they meet the requirements by the beginning of December.
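The 70% allocation threshold behind the “Sustainability Focus” label described above lends itself to a simple check. In the sketch below the fund data shape is hypothetical, and the FCA’s full qualifying criteria go well beyond this single threshold:

```python
# Minimal sketch of the 70% allocation test; illustrative only.
def qualifies_for_sustainability_focus(allocations: dict) -> bool:
    """allocations maps holding id -> (portfolio weight, is_sustainable)."""
    total = sum(weight for weight, _ in allocations.values())
    sustainable = sum(weight for weight, ok in allocations.values() if ok)
    return total > 0 and sustainable / total >= 0.70

fund = {
    "green_bond_a": (0.50, True),
    "solar_equity": (0.25, True),
    "cash":         (0.25, False),
}
print(qualifies_for_sustainability_focus(fund))  # 75% sustainable
```

Even this toy version makes the data dependency obvious: the check is only as good as the classification of each holding as sustainable or not, which is where firms’ data challenges bite.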

Clarity Needed

Critics have predicted a slow uptake of the labels by fund houses, with some arguing that more clarity is needed about how the labels can be properly applied. At the heart of that challenge is likely to be firms’ ability to gather and use the data necessary to make those decisions.

The FCA said last year that asset managers and manufacturers must have robust data, governance and technology setups to adopt its measures. A poll during a webinar by consultancy firm Bovill Newgate, however, found that 90% of financial services respondents said they were not equipped with the correct ESG reporting data.

Emil Stigsgaard Fuglsang, co-founder at ESG data and consultancy firm Matter, said data would be a potential pain point for firms operating in the UK.

“While many global investment managers already have these competencies in place thanks to the requirements of other regulations, the majority of smaller British firms do not,” Fuglsang said.

“This means they face the challenge of accurately defining sustainability in their investments and implementing data and analytics solutions to track and document their performance against these definitions at the fund-level. This will be no easy task, but those who take action now will be best prepared by the December 2 deadline.”

Investor Protections

The labelling guidance follows the publication of anti-greenwashing advice by the FCA earlier this year, which seeks to protect investors from abuse by encouraging asset managers and manufacturers to be clear and precise in the descriptions of their products.

The FCA is keen to safeguard investors against being lured by false claims of an asset or product’s sustainability. The threat of greenwashing has been wielded as a weapon in an ESG backlash, most notably in the US, that has seen billions of dollars pulled from sustainability-linked funds.

While the measure is designed primarily to protect retail investors, it is expected also to have an impact on institutional capital allocators. One of the first funds to adopt an SDR label, AEW’s impact fund, has taken the Sustainability Impact categorisation and is offered only to institutions.

The SDR is also widely predicted to set transparency standards that institutions are likely to follow.

ESMA Guidance

The UK’s latest SDR implementation came as Europe’s regulators sought changes to some of the European Union’s disclosure rules. The European Securities and Markets Authority (ESMA) last week suggested changes that would affect the bloc’s lynchpin Sustainable Finance Disclosure Regulation (SFDR) and other measures.

In an opinion, it set out proposals urging tweaks to the EU’s wider sustainable finance framework, arguing that there needs to be greater “interconnectedness between its different components”.

Among ESMA’s proposals are a phasing out of the phrase “sustainable investments” within the SFDR and a recommendation that market participants should instead make reference only to the green Taxonomy that underpins European market rules. Further, it suggested an acceleration of the Taxonomy’s completion, incorporating a social taxonomy.

It also urged that ESG data products be brought under regulatory scrutiny to improve their quality.

Clash of Standards

Other recommendations on how sustainability products should be described could conflict with the new measures introduced by the FCA.

ESMA suggests that all products provide basic information on their sustainability, with greatest detail given to institutional investors. It also urges the introduction of a “best in class” product categorisation system. That would include at least a “Sustainability” classification, denoting products that are already green, and a “Transition” grouping of funds that aim to be sustainable.

Martina Macpherson, head of ESG product strategy and management at SIX Financial Information, said institutions would need to familiarise themselves with each code.

“Challenges for asset managers remain to categorise funds in line with the UK’s labelling regime, and to align them with the EU’s fund labelling rules introduced by ESMA,” Macpherson said. “Overall, ESG fund labels represent a significant next step to address transparency and greenwashing concerns. Meanwhile, the mounting public and regulatory attention surrounding sustainable investment demands that firms use the most reliable, legitimate, and timely data to inform their decisions.”

Citigroup Fine Shows Importance of Having Robust Data Setup
https://a-teaminsight.com/blog/citigroup-fine-shows-importance-of-having-robust-data-setup/?brand=dmi
Tue, 30 Jul 2024 09:23:21 +0000

The US$136 million fine meted out to Citigroup for data irregularities dating back to 2020 should serve as a warning to all financial institutions that robust data management is essential to avoid sanctions amid tougher regulatory regimes.

The Federal Reserve and Office of the Comptroller of the Currency (OCC) jointly imposed the penalty on the international banking group after it was found to have put in place insufficient data management risk controls. Further, the group was told to hold quarterly checks to ensure it has safeguards in place.

The action has been seen as a warning that regulators will take a tough stance against data management failings that could have a detrimental impact on banks’ clients and their business. Charlie Browne, head of market data, quant and risk solutions at enterprise data management services provider GoldenSource, said the fine shows that there can be no hiding bad practices.

Greater Scrutiny

“Citigroup’s fine should be a warning to other banks and institutions who may have believed their insufficient data and risk controls could fly under the radar,” Browne told Data Management Insight. “It’s time to adapt, or be forced to pay up.”

Financial institutions’ data management structures are likely to come under greater regulatory scrutiny to protect customers as more of their activities are digitalised, as artificial intelligence is incorporated into tech systems and amid growing acceptance of crypto finance.

As well as data privacy protection measures, organisations will be expected to tighten controls on many other data domains including trading information and ESG reporting. The fallout from the collapse of Silicon Valley Bank last year will also put pressure on lenders’ solvency requirements and crisis management, processes that are heavily data-dependent.

Data Care

Browne said the penalty imposed on Citigroup showed that institutions had to take greater care with their data and controls models because regulators are very aware of how important digital information is to the efficient running of all parts of an enterprise’s operations.

“This fining of Citigroup demonstrates the very real costs associated with banks not being on top of their risk controls and data management,” he said.

“It’s a bold statement from the US rule makers that banks showing complacency about their data issues will be met with regulatory action. Regulators globally are now coming to the understanding that it’s fundamental that financial institutions have effective data management strategies.”

While breaches of Europe’s General Data Protection Regulation (GDPR) and anti-money laundering rules have already been at the root of fines imposed on banks and financial services firms, penalties related to operational use of data are expected to grow.

For example, institutions interviewed by A-Team Group have regularly said they are closely examining the data privacy and IP implications of using outputs from generative AI applications. The concern is that the content generated will be in breach of copyrighted material on which the model has been trained.

Non-Negotiable

Browne’s comments were echoed by the founder and chief executive of Monte Carlo Data, Barr Moses, who said that as data needs become central to firms’ operations, “data quality becomes non-negotiable”.

“In 2024 data quality isn’t open for discussion — it’s a clear and present risk and it needs our attention,” Moses wrote on LinkedIn.

Browne said that ensuring compliance will require strenuous efforts by organisations to go deep into their data capabilities and processes.

“Data quality and accessibility are, rightly, front of mind, however, it’s also vital that banks consider concepts like data governance and data lineage when assessing the efficiency of their systems and adequately managing their risk. Being able to track data back to source is an important tool that rule makers are increasingly looking to demand of banks, visible in regulations like the ECB’s Risk Data Aggregation and Risk Reporting (RDARR) measures.”
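The ability to “track data back to source” that Browne describes is, at its simplest, a lineage graph in which each derived figure keeps references to its inputs. The sketch below is a minimal, hypothetical illustration; production lineage tools record far richer metadata (transformations, timestamps, owners):

```python
# Each derived value keeps references to the records it was computed
# from, so any figure can be traced back to its sources.
class Datum:
    def __init__(self, name, value, sources=()):
        self.name, self.value, self.sources = name, value, tuple(sources)

def trace(datum):
    """Return every upstream datum name, depth-first back to the sources."""
    lineage = []
    for src in datum.sources:
        lineage.append(src.name)
        lineage.extend(trace(src))
    return lineage

price = Datum("vendor_feed_price", 101.5)
fx = Datum("ecb_fx_rate", 1.08)
position_value = Datum("position_value_eur", 101.5 / 1.08, sources=[price, fx])
print(trace(position_value))
```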

Crafting an Effective Data Strategy to Unlock Innovation
https://a-teaminsight.com/blog/crafting-an-effective-data-strategy-to-unlock-innovation/?brand=dmi
Mon, 29 Jul 2024 08:29:01 +0000

By Kelly Attrill, Head of Advisory & Consulting APAC at Lab49.

Data can be both an asset and a liability. Used correctly, it can transform an organisation’s ability to unlock value and enable innovation. However, if data is mismanaged it can have catastrophic consequences. In financial services, firms recognise that the ever-increasing volume of data they handle constitutes an asset that, with the right tooling, can deliver value far exceeding the initial investment. Yet in some cases its applicability to client outcomes may be unclear, and there may be a disconnect between how a business seeks to use data and how it’s currently being managed and distributed. To avoid this and make sure that data fulfils its potential, it’s crucial to develop and implement a robust data strategy.

Strong foundations

An effective data strategy starts with identifying the business goals that will be achieved with data and defining clear operational principles for data management and usage. This includes defining what a firm can and cannot do with data and identifying the areas where data can add value for clients and employees. Across the front and back offices, firms must be willing to invest not only in the technology but also in the necessary training to ensure these principles are embedded in client journeys and in the day-to-day work of the team.

A strategy that establishes a foundational set of goals and principles lays the groundwork for the development of frameworks, policies and plans across the firm’s divisions. For example, defining data usage boundaries in the data strategy enables the development of a well-defined data governance framework, ensuring the safe, ethical and compliant handling of data across an organisation.

It is crucial that the data strategy is linked directly to business goals and to clear time horizons for achieving them. This will drive prioritisation and planning decisions and allow the organisation to monitor progress through the implementation of the strategy. Defining the right goals is important; focusing on only one dimension of the data strategy will limit potential value. With a focus on enabling AI use cases, many firms invest in uplifting data quality, ensuring it is correct and can be trusted across the whole landscape. On the whole this is a good thing, but it is just as important for firms to continually invest in skills and technology to unlock value. This includes training employees to understand, access, and use data assets effectively and ensuring that data management practices are integrated into their workflows.
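The data-quality uplift described above typically rests on mechanical completeness and validity checks applied before data is trusted downstream. The field names and rules in this sketch are invented for illustration:

```python
# Sketch of a basic data-quality gate: completeness and validity checks
# run before a record is trusted downstream. Rules are illustrative.
RULES = {
    "isin":  lambda v: isinstance(v, str) and len(v) == 12,
    "price": lambda v: isinstance(v, (int, float)) and v > 0,
}

def validate(record: dict) -> list:
    """Return the names of failed fields; an empty list means a clean record."""
    return [field for field, check in RULES.items()
            if field not in record or not check(record[field])]

good = {"isin": "GB0002374006", "price": 101.5}
bad  = {"isin": "GB00", "price": -3}
print(validate(good), validate(bad))
```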

Moreover, a data-driven strategy must be agile, supporting the entire data lifecycle and allowing firms to adopt new tools and techniques as they emerge. This agility is vital for balancing mid-term investments in technology and people with the ability to quickly implement proven or experimental technologies that enhance data management and use.

Enhancing services

To address challenges in securing stakeholder buy-in, it is essential to clearly demonstrate how a data strategy aligns with and supports direct business outcomes and client needs. By showcasing tangible benefits, such as improved product offerings and risk management, firms can build a compelling case for investment in data initiatives.

Effectively harnessing data offers significant promise for firms looking to enhance their service offering. OECD research has found that large firms’ investments in intangible assets like data and software, which can scale without linear cost increases, can help grow their market share.

Increasingly, data is being integrated with AI to unlock advanced capabilities. For instance, AI models can streamline risk management by quickly digesting large volumes of changing regulations, and digital lending services have sped up the time to lending approvals by using machine learning and automation to improve credit decisions.

Personalised products tailored to individual clients’ needs are another significant benefit of a data-driven strategy. For example, upgrading Customer Relationship Management (CRM) systems so that client information is accessible through consistent channels in an intuitive way allows front-line staff to build an understanding of client needs, enabling the delivery of powerful insights that may unlock more targeted propositions and spur business growth. These can improve satisfaction and loyalty among existing customers and also open new business opportunities by improving the productivity and efficiency of sales teams, supporting a more competitive commercial proposition. A data strategy that prioritises feedback loops, collecting information on the value of each round of insights and propositions and feeding it into the next, will enable firms to become data-driven across multiple dimensions: data-driven product, data-driven marketing, data-driven people, and so on.

Given increased attention from regulators globally to appropriately manage and protect data, developing a mature data strategy is not only desirable in terms of compliance but can help firms stay competitive by protecting against financial loss and reputational harm.

Future-proofing

As technological change continues to accelerate, firms adopting a data-driven strategy are better placed to leverage that data in new ways across business lines, the product suite and the operating environment. When the focus of the strategy is on disconnecting tightly bound links between technology and vendor platforms and enabling access that is simple, secure and intuitive, the value of the firm’s data assets becomes clearer and more closely tied to business outcomes.

Fostering a culture of data literacy where the value of data-driven decision-making is promoted across the organisation can go a long way to ensuring that all stakeholders, from top management to front-line employees, understand the benefits of a data-driven approach and are equipped to adapt to new ways of working.

Investment in experimentation with AI and embedding trusted decision and insight models into the firm’s decision-making processes becomes much easier once data is more available and protected through the right governance environment. Feedback from the success of this will help drive a data-driven organisation and feed the next generation of data-driven strategy.

Data Warning After UK Signals New Law Covering AI Use
https://a-teaminsight.com/blog/data-warning-after-uk-signals-new-law-covering-ai-use/?brand=dmi
Fri, 26 Jul 2024 14:09:57 +0000

Financial institutions operating in the UK must begin ensuring the integrity of their data estates after the newly elected government signalled plans to forge a potentially far-reaching AI bill.

Leaders of two large data management companies said that any new technology law could usher in powers of intervention if AI models and processes are seen as likely to cause danger to individuals or companies. Only with robust data management setups would organisations be able to ensure they don’t breach any new law.

Greg Hanson, group vice president and head of EMEA North sales at Informatica, and Arun Kumar, UK regional director at ManageEngine, offered their thoughts after the government of new Prime Minister Keir Starmer revealed its legislative programme for the next parliament.

While the announcement made no mention of a full AI bill, the plans revealed in the King’s Speech, delivered by King Charles at the opening of parliament last week, stated that the UK will seek to “establish the appropriate legislation to place requirements on those working to develop the most powerful artificial intelligence models”.

Bad Data, Bad Outcomes

“Businesses must now brace for greater intervention and be prepared to demonstrate how they are protecting the integrity of AI systems and large language models,” said Hanson. “Developing robust foundations and controls for AI tools is a good starting point.”

Hanson echoed a view common among technologists and users that AI can only be useful if it is fed good data. Without that, downstream processes will be erroneous and potentially catastrophic to workflows and operations.

“Bad data could ultimately risk bad outcomes, so organisations need to have full transparency of the data used to train AI models,” he added. “And just as importantly, businesses need to understand the decisions AI models are making and why.”

With more institutions putting critical parts of their data processes in the hands of AI technologies, policy makers are worried that miscalculations will lead to a snowball effect of failures and negative impacts on people and businesses. Recent AI malfunctions have led to companies paying damages to affected parties. In February, for instance, Air Canada was forced to offer reparations to a passenger who was given inaccurate information by an AI-powered chatbot.

Hanson said that organisations should begin by ensuring that machines don’t make decisions without human input.

“It’s critical that AI is designed, guided and interpreted from a human perspective,” he said, offering as an example careful consideration about whether large language models have been trained on “bias-free, inclusive data or whether AI systems can account for a diverse range of emotional responses”.

“These are important considerations that will help manage the wider social risks and implications it brings, allowing businesses to tackle some of the spikier challenges that generative AI poses so its transformative powers can be realised,” he said.

Driving Improvements

Kumar at ManageEngine, the enterprise IT management division of Zoho Corporation, said a well-crafted bill would do more than simply list what organisations should not do.

“This bill promises to go a long way in helping to tackle the risks that come from a lack of specialised knowledge around this relatively new technology,” he said. Such a bill “could give businesses guidance on how to prioritise trust and safety, introducing essential guard rails to ensure the safe development and usage of AI”.

Pointing to recent ManageEngine research showing that 45% of IT professionals have only a basic understanding of generative AI technologies and that most have no governance frameworks for AI implementation, he said that a bill would give businesses the confidence they need as AI systems improve.

“Introducing legislation on safety and control mechanisms, such as a requirement to protect the integrity of testing data, will help guide the use of AI so businesses can confidently use it to drive business growth,” he said.

The post Data Warning After UK Signals New Law Covering AI Use appeared first on A-Team.

Webinar Review: Harnessing the Wider Benefits of Data Identifiers https://a-teaminsight.com/blog/webinar-review-harnessing-the-wider-benefits-of-data-identifiers/?brand=dmi Tue, 23 Jul 2024 13:49:22 +0000 https://a-teaminsight.com/?p=69442

The post Webinar Review: Harnessing the Wider Benefits of Data Identifiers appeared first on A-Team.

Almost three-quarters of capital markets participants are utilising data standards and identifiers beyond their immediate regulatory use cases, realising the huge benefits that ordered and consistent datasets can bring to an enterprise’s entire operations.

The findings of an A-Team Group Data Management Insight poll showed that 40% of respondents said they are using the resources to a “great extent”, while another 33% are using them to a “good extent”. Just 13% reported they aren’t utilising them at all.

The poll illustrates how financial institutions are seizing on the consistency that identifiers bring to data to turbo-charge use cases such as know-your-customer (KYC) processes and risk management, as well as to deliver broad operational efficiencies, according to speakers at DMI's most recent webinar, during which the poll was held.

The webinar, entitled "How to maximise the use of data standards and identifiers beyond compliance and in the interest of the business", gathered leading participants in the data management and identifiers space. To confine the use of identifiers to satisfying regulatory obligations would be a waste, Emma Kalliomaki, managing director of the Derivatives Service Bureau (DSB), told the webinar.

Broad Strategy

While they are critical to bringing “efficiency and harmonisation”, their broader deployment has become part of data management best practices, Kalliomaki said. Having a data strategy that recognises the applications of such resources to data uses throughout the entire business is critical, she said, adding that this necessitated robust governance models.

Among the speakers was Alexandre Kech, chief executive of the Global Legal Entity Identifier Foundation (GLEIF), which oversees the Legal Entity Identifier (LEI) standard used by companies and financial organisations around the world. Its latest iteration, the verifiable LEI, or vLEI – a cryptographically secure digital representation of the LEI – has been adopted by a large number of companies, especially within global supply chains, Kech said.

The consistency that standards and identifiers bring is also crucial to enabling organisations to “stitch” together datasets across an enterprise, enabling them to identify patterns and outliers in those pools of information, offered Robert Muller, senior group manager and technology product owner at BNY. This, he added, can create the foundations on which AI can be applied and on which the accuracy of analytical models can be improved.

Despite recognising the wider benefits of identifiers, many companies are encountering challenges in realising them. Chief among these, according to another poll held during the webinar, is integration with existing applications and systems: two-thirds of respondents cited this as the main impediment to broader utilisation.

Integration Challenges

Laura Stanley, director of entity data and symbology at LSEG, said she was unsurprised by the poll results. The multiplicity of systems and software deployed by modern financial institutions makes integration of their technology difficult and obstructs the sort of joined-up thinking that identification standards enable.

Another key challenge facing organisations, according to the poll, is the variety, and regular creation, of identification standards. As well as the LEI, these include the Unique Product Identifier (UPI), the International Securities Identification Number (ISIN) and ISO 20022. They sit alongside the proprietary naming codes that companies use internally.

Kalliomaki said that companies should not be deterred by the apparent complexity of these different codes because they are largely complementary. When making a business case for their wider application, they also have the benefit of being low-cost resources, she said.

Further, she added, their wider use also provides organisations the opportunity to help national standards committees play a part in the evolution of identifiers, making them even more useful and bringing greater benefits to financial institutions.

Stanley agreed, echoing a point made by Muller, that the application of AI, and in particular generative AI, was likely to simplify the currently complex process of switching between standards. This, the panel agreed, would require a programme of educating market participants on the benefits of using identifiers more broadly.

Informatica Sees a Future of AI-Focused Innovation Releases https://a-teaminsight.com/blog/informatica-sees-a-future-of-ai-focused-innovation-releases/?brand=dmi Mon, 15 Jul 2024 13:52:15 +0000 https://a-teaminsight.com/?p=69280

The post Informatica Sees a Future of AI-Focused Innovation Releases appeared first on A-Team.

Informatica has had a busy 2024, announcing major new innovations and partnerships as it brings artificial intelligence to the fore of its cloud-based data management offering.

Last month the California-based company deepened its association with Databricks, providing the full range of its AI-powered Intelligent Data Management Cloud capabilities within Databricks' Data Intelligence Platform. The expanded partnership will enable joint customers to deploy enterprise-grade GenAI applications at scale, based on a foundation of high-quality, trusted data and metadata. That followed the unveiling of a similar association with Snowflake. Informatica was also selected by Microsoft as the independent software vendor (ISV) design partner for the software behemoth's new data fabric product.

The frequency of the rollouts in recent months has been dictated by the rapidity with which Informatica’s financial institution clients are seizing on the potential of AI. Many are struggling to bring the technology into their legacy systems, while others have a vision of what they want to do with it but not the capability to implement it.

With the market also heavily weighted towards capitalising on the growing generative AI space, Informatica group vice president and head of EMEA North sales Greg Hanson said new developments and enhancements are on the cards for the near future.

“The critical foundational layer for companies is to get their data management right and if you look at the current state of most large organisations, their integration and their data management looks a bit like spaghetti,” Hanson tells Data Management Insight.

“They realise, though, that they have to pay attention to this strategic data management capability because it’s almost as fundamental as the machinery that manufacturers use to make cars.”

Rapid Change

Hanson says that the pace of innovation at Informatica is the fastest he’s seen in his two decades at the company because its clients understand the operational benefits to be gained from implementing AI-based data management processes. This “unstoppable trend towards AI” is being driven by board-level demand, especially within financial services, a sector he describes as being at the “bleeding edge” of technological adoption.

Many have had their appetites whetted by AI's ability to tackle low-hanging-fruit challenges, such as creating unique customer experiences and engagements. To embed and extend those AI-powered capabilities across their entire organisation, however, will take more effort, says Hanson.

“Their ability to harness data and exploit AI’s potential is going to be the difference between the winners and losers in the market,” he says. But the drive to get results quickly may lure firms towards rash decisions that could create more problems later.

“They need to think strategically about data management, but they can start small and focus on a small use case and an outcome that they can deliver quickly, then grow from there.”

Make it Simple

Among Informatica's clients across 100 countries are banks such as Santander and Banco ABC Brasil, US mortgage underwriting giant Freddie Mac, insurer AXA XL and online payments provider PayPal. The services it provides such institutions include broad cost reduction through the optimisation of reference data operations and the simplification of their wider data processes.

This latter point is key to helping clients better use their data, says Hanson. Arguing that without good data inputs, AI’s outputs will be “garbage out at an accelerated pace”, he says that many companies have overcomplicated data setups that are hampering their adoption of the technology. By having separate tools to manage each element of their data management setup – including data access, quality, governance and mastering capabilities – large firms are strangling their ability to make AI work for them.

“But now complexity is out and simplicity is in,” Hanson says. “As companies modernise to take advantage of AI, they need to simplify their stacks.”

Enter GenAI

Informatica is helping that simplification through a variety of solutions including its own GenAI-powered technology for data management, CLAIRE GPT – the name being a contraction of “cloud AI for real-time execution”. The technology began life simply as CLAIRE seven years ago. Last year, however, it was boosted with the inclusion of GenAI technology, enabling clients to better control their data management processes through conversational prompts and deep-data interrogation.

Comparing the new iteration to Microsoft's Copilot, Hanson says CLAIRE GPT now offers clients greater capabilities to simplify and accelerate how they consume, process, manage and analyse data. Adding to its firepower is CLAIRE GPT's ability to let individual clients call on the combined metadata of Informatica's 5,000-plus clients to provide them with smarter outputs.

While almost all of Informatica’s offerings are embedded with its new GenAI technology, the next step will be to ensure the company’s entire range of products benefits from it.

“Data management is complex and costly for many companies and it massively impacts the ability of the company to release new products, deliver new services and create more pleasing customer experiences,” he says.

“Our job with GenAI as the fundamental platform foundation is to offer more comprehensive services around that foundational layer of data management, and more automation and productivity around the end-to-end data management journey.”

Financial Firms Have Widest Data Security Perception Gap: Survey https://a-teaminsight.com/blog/financial-firms-have-widest-data-security-perception-gap-survey/?brand=dmi Mon, 15 Jul 2024 13:46:42 +0000 https://a-teaminsight.com/?p=69277

The post Financial Firms Have Widest Data Security Perception Gap: Survey appeared first on A-Team.

The financial services sector has the widest gap between perceptions about its data security and its vulnerability to data attacks.

A survey by data security provider Dasera found that 73% of institutions questioned said they had high levels of confidence in their ability to fend off ransomware attacks, data breaches and other unauthorised uses of data. Nevertheless, records of attacks showed that those firms were among the worst affected in 2023.

“The significant number of breaches contradicts high confidence in their security strategy, suggesting overconfidence in their security posture,” the report, entitled The State of Data Risk Management 2024, stated. “The sector remains a prime target for cyberattacks due to valuable data, indicating a gap between perceived effectiveness and actual vulnerability.”

The report compared the perceptions of companies in a range of high-profile data-focused sectors, including healthcare and government, with statistics on data breaches compiled by a variety of organisations and studies. These include the Verizon Data Breach Security Report, Kroll’s Data Breach Outlook Report and the Identity Theft Resource Centre.

Record Year

The Dasera survey said the combined conclusions of those studies showed that 2023 was a “record-breaking year” for breaches.

According to Verizon, the financial services industry suffered 477 data security incidents in 2023, compared with 380 for IT firms and 433 in the healthcare sector. Only government bodies suffered more, at 582. Kroll found that financial firms accounted for the largest proportion of attacks, at 27%.

Two-thirds of breaches originated externally, with the balance coming from internal "threat actors"; financial services firms were among the least protected against attacks from within their own systems.

The report found that 77% of breaches within the sector came from basic web application attacks, miscellaneous errors and system intrusions.

“The survey underscores the importance of adopting integrated and automated data security strategies to address these challenges,” the Dasera report stated. “Reliance on outdated, manual processes and slow adoption of automated systems contribute to current vulnerabilities. Organisations must prioritise modern, proactive approaches, including regular audits, strategic use of technology, and external consulting, to effectively navigate the evolving landscape of data risk.”

French Election Reminds Asset Managers to Expect the Unexpected https://a-teaminsight.com/blog/french-election-reminds-asset-managers-to-expect-the-unexpected/?brand=dmi Mon, 15 Jul 2024 13:41:17 +0000 https://a-teaminsight.com/?p=69274

The post French Election Reminds Asset Managers to Expect the Unexpected appeared first on A-Team.

By Sam Idle, Solutions Consultant at Clearwater Analytics.

The latest results of the surprise snap French election are a timely reminder for asset managers to always expect the unexpected. The knock-on effects on their investments can create a metaphorical line at the door of anxious investors with a million questions about how their portfolios have been impacted.

Going into the run-off, the Rassemblement National (RN) was widely expected to have a reasonable chance of gaining a majority. Instead, the leftist Nouveau Front Populaire bloc won the most seats in this strangest of elections, with Le Pen's RN coming in third place. While the reaction from markets wasn't as significant as it could have been, it still had an impact on the spread between the French sovereign long-term borrowing rate and its German equivalent – a barometer of market sentiment towards French fiscal fortunes.

A major market event is often when an investment manager’s reporting and client servicing capabilities are tested to the limit. In the wake of election results, investors are desperate to get clarity on their portfolios, and how they are impacted. They are often on the phone, calling up their asset managers, requesting information on risk exposures, price impacts, and a plethora of other bespoke inquiries.

Outdated Data Systems

This is not something that happens in isolation when market-affecting events occur, though; it is part of a general trend for investors to become more demanding of the people managing their money. Perhaps this reflects the easier access they have to news in 2024 than they had 20 years ago.

As the news cycle intensifies, asset managers are struggling with outdated systems. Reliance on legacy infrastructure, combined with the piecemeal addition of new products, has made managing the growing volume and variety of data increasingly difficult. This information often isn't centralised, and with thousands of clients with varying servicing requirements, there is always a tendency to focus on repeatable client reports. There is also an assumption that requests all come into the same place, but this often isn't the case. Without a central repository, the data provided to clients will often differ, depending on what information a particular team has access to. When bespoke inquiries come in, they are incredibly difficult to deal with effectively. This is felt even more starkly if the incumbent data architectures do not interact with a modern reporting solution.

Modern Architectures

While the fallout from the French election does not seem to have been too severe for markets, it is a timely reminder nonetheless. Client engagement is a key differentiator in an age where performance is increasingly squeezed by passive investing through exchange traded funds (ETFs) – it is important that clients remember who was able to put their minds at ease rapidly in the aftermath of surprise elections or other market-shaking events.

When you consider that those accounts that generally require the most bespoke treatment are the largest accounts, the ones that drive the majority of a firm’s revenue, it becomes clear why asset managers need to expect the unexpected, and prepare themselves with modern, interactive data architectures and reporting solutions.
