TurinTech Deploys GenAI to Accelerate Financial Software
https://a-teaminsight.com/blog/turintech-deploys-genai-to-accelerate-financial-software/?brand=dmi
Wed, 21 Aug 2024 09:12:48 +0000

The costs associated with poor quality software coding are startling. In dollar terms alone, companies in the US incurred a US$2.4 trillion hit from the direct impacts and cost of correcting poor coding, according to a 2022 survey by the Consortium for Information and Software Quality (CISQ).

That’s before indirect costs such as reputational and legal damages are considered.

With financial institutions’ greater dependence on technology and speed of execution, the costs of software failure and slow runtimes are potentially higher. The same survey found that the dollar-value impact of operational failure was as much as 72 times higher for large financial brokerages than it was for other industries.

Since its creation in 2018, TurinTech AI has been on a mission to help firms reduce costs. The UK-based company leverages GenAI to pinpoint areas in mission-critical software systems where optimisation is needed and then it generates better code to enhance performance and efficiency.

TurinTech AI’s technology offers a range of services that can streamline billions of lines of code to reduce applications’ pressure on CPU processing power and cloud use. By doing so, companies also reduce the energy needed to carry out their everyday processes, a benefit that decreases their carbon footprint, said chief executive Leslie Kanthan.

“Financial institutions, banks and hedge funds have huge amounts of code – hundreds of millions of lines of code – and it would take 40 guys 10 years just to review a couple of million lines of code; it’s an intractable problem,” Kanthan told Data Management Insight.

Efficiency Gains

TurinTech AI was formed by three PhD candidates who met at University College London and had gained experience of – and become frustrated by – the code optimisation tasks they’d been required to carry out at the financial companies where they subsequently worked. Their first product was evoML, a code generator powered by machine learning; Artemis AI, its GenAI code optimiser, followed.

TurinTech AI says that its Artemis AI innovation has allowed clients to improve the efficiency of their coding by as much as a third. For example, it was able to improve by 32.7 per cent the runtime of pull requests on QuantLib – an open-source code library favoured by financial institutions for quantitative finance tasks, including modelling and trading.

“It ensures that you are using less of your service resources,” said Kanthan. “So if your code was taking up 30 per cent of your Amazon budget, it might now be taking 20 per cent of your Amazon budget and at the same time improving your footprint for ESG.”

Firms can be expected to dedicate about half of their overall software development budgets to debugging and fixing errors over the 25-year life expectancy of a large software system.

That low-skill work will most likely be carried out by highly trained technology professionals. Kanthan points to the experience of a globally known technology brand client that employed hundreds of developers to manually go over code to find inefficiencies.

“They’re all PhDs and professors who should be building new applications not going back through existing code,” he said. “We saved them the time, we saved them the labour resource, we gave them cost efficiency and we allow them to get more output from what they already had.”

Speed Bumps

Financial institutions also face greater exposure to coding quality challenges because they need to develop and deploy new applications at speed. Under such time-to-market constraints, developers will build applications as quickly as possible, which might mean using tried-and-trusted constructs that are not the most efficient.

A common example, Kanthan said, is the use of for-loops, which are quick to write and reliable but not as efficient as other structures.

“It’s so hard as a developer to do things the most efficient way because of the time constraints; if they’re given an objective and told it is needed by the end of the week, they do it as quickly as they possibly can – their priority is to get the result they want,” said Kanthan. “So, they might do it in a messy way that duplicates many functions.”
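Kanthan’s for-loop point is easy to sketch. The snippet below is purely illustrative (it is not TurinTech code, and the function names and figures are invented): it computes the same sum of squares with a hand-written Python for-loop and with a generator fed to the built-in sum, which pushes the iteration into the interpreter’s C runtime and is typically faster on large inputs.

```python
# Illustrative only - not TurinTech's tooling. Two functionally identical
# routines: the "quick to write" for-loop versus a generator expression
# passed to the built-in sum(), which iterates at C speed.

def sum_squares_loop(values):
    total = 0.0
    for v in values:           # explicit loop: reliable, but pure-Python overhead per element
        total += v * v
    return total

def sum_squares_builtin(values):
    return sum(v * v for v in values)  # same result, iteration handled by the C runtime

prices = [100.5, 101.2, 99.8, 102.0]
assert abs(sum_squares_loop(prices) - sum_squares_builtin(prices)) < 1e-9
```

The optimisers described in the article work at a vastly larger scale, but the underlying trade-off – convenience of the familiar idiom versus runtime cost – is the same.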

The pressure to default to a reliable solution is also seen in the continued use by some organisations of Fortran. It’s an old, CPU-hogging language but it is dependable, and replacing it would incur a huge transitional cost. TurinTech’s Artemis AI can be deployed to translate that old-style code into modern, more efficient C++.

“Fortran is a very old and obsolete language but because it works, it doesn’t break and no one wants to touch it,” said Kanthan. “Fortran developers are hard to find and very expensive. So, you’re talking about spending thousands of pounds per day per person to work on millions of lines of code; using our product will bring enormous savings.”

The post TurinTech Deploys GenAI to Accelerate Financial Software appeared first on A-Team.

Asset Managers Can Learn Lessons from New Government’s Cost Pressures
https://a-teaminsight.com/blog/asset-managers-can-learn-lessons-from-new-governments-cost-pressures/?brand=dmi
Wed, 21 Aug 2024 08:26:40 +0000

By Thomas McHugh, CEO and Co-founder of FINBOURNE Technology.

Any new government brings the inevitable “change” message, but one thing never changes regardless of who has the keys to the treasury – seeking out departmental cost savings wherever humanly possible. With unprotected departments facing cuts of up to 2.9% according to the Institute for Fiscal Studies, Rachel Reeves faces the unenviable task of making eye-watering efficiency savings while also boosting growth.

This fiscal pressure provides a timely parallel for asset managers, who are also grappling with their own rising costs, particularly in the thorny area of operations. Just as any new government must prioritise where to make cuts without harming essential public services, asset managers need to navigate the longstanding challenge of reducing costs while protecting key business functions.

Moves as drastic as ditching trading terminals to save millions, as some have considered, underscore the immense pressure to streamline operations amid years of shrinking margins and heightened competition. Streamlining data management makes more sense as a cost-cutting approach, as it brings the added benefits of more effective decision making and the opportunities to innovate that arise from better managed data.

Much like our political system, data in asset management firms is often a tangled mess. Years of patching together disparate solutions with siloed data sets have resulted in a Frankenstein-like tech stack. Changing this in a big, all-encompassing programme that promises a new way can lead to disappointment and a change programme divorced from reality – more like a Liz Truss-style fiscal horror show than an efficient machine. Transforming data into a core operational asset, at a manageable cost, can be a real game changer for asset management firms.

With all this in mind, like politicians addressing their electorate, asset managers must prioritise the needs of their investors. Extensive change to data management strategies may be needed, but it doesn’t follow that changes should all take the form of ‘megaprojects.’ Technology should simplify, not complicate. The right software should work seamlessly, providing accurate and timely information that transforms how firms use data to deliver enhanced services to their clients.

The lesson from the government’s fiscal challenges is clear: prioritise, streamline, and modernise. Asset managers should invest in integrated data management solutions that will ultimately result in a leaner, more efficient operation capable of thriving in a competitive landscape.

The post Asset Managers Can Learn Lessons from New Government’s Cost Pressures appeared first on A-Team.

Unlocking Private Market ESG Data through AI
https://a-teaminsight.com/blog/unlocking-private-market-esg-data-through-ai/?brand=dmi
Thu, 08 Aug 2024 08:27:43 +0000

By Yann Bloch, VP Product Management at NeoXam.

In today’s investment world, the importance of integrating environmental, social, and governance (ESG) factors into investment strategies is no longer up for debate. Asset managers globally recognise that sustainable business practices are not only vital for ethical considerations but are also critical for long-term financial performance. Despite this recognition, a significant challenge persists: accessing reliable and comparable ESG data, particularly from private companies that often lack standardised reporting practices. The solution to this problem lies in the innovative use of artificial intelligence (AI) technologies.

Private companies are increasingly producing sustainability reports that provide valuable insights into their ESG performance. However, these reports come in various formats, use different terminologies and offer varying levels of detail, creating a complex, unstructured data landscape. This lack of standardisation makes it difficult for asset managers to efficiently extract and utilise the data, hindering their ability to make informed investment decisions that align with ESG criteria.

The emergence of AI is poised to revolutionise how asset managers handle private market ESG data. AI, particularly machine learning models, can be trained to recognise and interpret the diverse formats and terminologies used in these sustainability reports. Take natural language processing (NLP) as a prime case in point. A subfield of AI focused on the interaction between computers and human language, NLP can automatically extract key data points from unstructured texts. This transformation of unstructured data into structured, actionable information is a major step forward for the industry.

One of the primary benefits of using AI in this context is the ability to automate the data extraction process. Traditionally, asset managers had to manually sift through reports, a time-consuming and error-prone process. AI tools can scan thousands of documents in a fraction of the time it would take a human, ensuring that no critical information is overlooked. This not only increases efficiency but also allows asset managers to process larger volumes of data, providing a more comprehensive view of a company’s ESG performance.
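As a toy illustration of that extraction step (a minimal sketch, not any vendor’s pipeline – the metric name, pattern and report snippets are all invented for the example), a few lines of Python with a regular expression can already turn free-form report sentences into structured records. Production NLP systems are far more sophisticated, but the input/output shape is the same: unstructured prose in, structured data out.

```python
import re

# Hypothetical report snippets in inconsistent formats.
reports = [
    "In FY2023 our Scope 1 emissions totalled 12,500 tCO2e across all sites.",
    "Scope 1 GHG output: 8,300 tCO2e (2023).",
]

# Rough pattern: a number with optional thousands separators, followed by "tCO2e".
pattern = re.compile(r"([\d,]+)\s*tCO2e")

structured = []
for text in reports:
    match = pattern.search(text)
    if match:
        value = int(match.group(1).replace(",", ""))  # normalise "12,500" -> 12500
        structured.append({"metric": "scope1_tco2e", "value": value})

# structured now holds comparable records extracted from unstructured prose.
```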

AI is great for the extraction of data and even better when combined with robust data management technology. At the receiving end of AI-driven data extraction, robust data management systems ensure data quality, including consistency and completeness, and combine it with data from other sources. This integrated approach amplifies the value of AI by providing a holistic view of ESG metrics, essential for informed decision-making.

In addition, AI can enhance the comparability of ESG data from private companies. By standardising the extracted information, these technologies enable asset managers to compare ESG metrics across different firms, even if the original reports were vastly different in format and detail. This level of comparability is crucial for making informed investment decisions and for accurately assessing the ESG performance of potential investment targets.
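The comparability problem often comes down to something as mundane as units. In this hypothetical sketch (the conversion table and company figures are invented for illustration), normalising extracted values onto a common unit makes two very differently formatted disclosures directly comparable.

```python
# Hypothetical unit normalisation: map mixed-unit emissions disclosures
# onto a single comparable unit (tonnes of CO2 equivalent).
TO_TONNES = {"tCO2e": 1.0, "ktCO2e": 1_000.0, "kgCO2e": 0.001}

def normalise(value: float, unit: str) -> float:
    """Convert a disclosed figure to tonnes CO2e."""
    return value * TO_TONNES[unit]

company_a = normalise(12.5, "ktCO2e")    # reported in kilotonnes
company_b = normalise(9_800.0, "tCO2e")  # reported in tonnes

assert company_a == 12_500.0             # now on the same scale
assert company_a > company_b             # and directly comparable
```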

Another significant advantage is the ability to keep pace with the evolving ESG reporting landscape. As regulatory requirements and industry standards for ESG reporting continue to develop, AI models can be updated to incorporate new criteria and metrics. This ensures that asset managers are always working with the most current and relevant data, maintaining the accuracy and reliability of their ESG assessments.

The integration of AI into ESG data management also supports transparency and accountability. By providing clear, structured data, these technologies enable asset managers to present their ESG findings to stakeholders with greater confidence and clarity. This transparency is not only beneficial for investor relations but also for meeting regulatory requirements and for maintaining the trust of clients who are increasingly demanding sustainable investment options.

The application of AI technologies in extracting private market ESG data represents a significant advancement for asset managers. These tools address the critical challenge of unstructured data, providing a streamlined, efficient, and reliable means of accessing the information necessary to drive sustainable investment strategies. As the industry continues to evolve, embracing these technological innovations will be essential for asset managers looking to stay ahead of the curve and deliver on their commitments to sustainable investing.

The post Unlocking Private Market ESG Data through AI appeared first on A-Team.

Insurance Stress Test Success Hangs on Data Quality and Management
https://a-teaminsight.com/blog/insurance-stress-test-success-hangs-on-data-quality-and-management/?brand=dmi
Tue, 06 Aug 2024 14:04:15 +0000

Recently revealed tests to explore the resilience of insurers to external shocks are likely to succeed – or fail – on the data that the under-scrutiny firms possess.

Data will be a central ingredient in the tests detailed last month by the Prudential Regulation Authority (PRA), which oversees the industry. In its most recent communique, the PRA detailed the design and timing of its Dynamic General Insurance Stress Test (DyGIST), which has been created as risks associated with cyber-attacks, climate change and market volatility are expected to rise.

The tests, to be held in May next year, will comprise live exercises in which firms will be presented, over three weeks, with a set of hypothetical adverse events to which they must respond as if they were real. The PRA will require detailed analyses of responses. Results will be announced as the tests progress and will go on to inform the regulator’s supervisory plans.

The exercises, which will be held alongside a similar test for life insurers, will expect firms to have their data in order if they are to respond adequately, a stipulation that could be an opportunity for the insurance industry to boost its IT capabilities, said Wenzhe Sheng, senior product manager for EMEA prudential regulation at Clearwater Analytics.

“The Prudential Regulation Authority’s design of the DyGIST framework provides a strong foundation for ensuring the resilience of the insurance sector,” Sheng told Data Management Insight. “Moreover, it provides insurers with an incentive to fortify their data infrastructures and implement data-driven risk management practices.”

Banking Assessments

The stress tests were announced last year and follow similar exercises focused largely on the UK’s banking industry. They will be carried out to gain a deep understanding of the insurance industry’s solvency and liquidity buffers and to examine the effectiveness of firms’ management responses to adverse scenarios.

The PRA held workshops with the Association of British Insurers, the Lloyd’s Market Association and the International Underwriting Association to devise the design and timing of the tests. Professional services giant Deloitte said last week that the bank tests had shown that insurers should prepare for their own assessment by ensuring their data is in order.

“General insurers need to enhance their stress and scenario testing processes to be able to perform the exercise live – including ensuring they can aggregate relevant data and identify potential management actions to deploy for any given scenario,” it said in a report.

Earlier Experience

The importance of having good data in responding to new risks was highlighted in similar stress tests held three years ago by the Bank of England to assess the resilience of insurers and lenders to climate risks. Initial exercises conducted by organisations including AXA, Allianz and AIG revealed concerning failures in data preparedness, which left some struggling to complete the tests and surfaced gaps in critical datasets.

Clearwater’s Sheng said it is imperative that insurers have their data estates ready.

“In order to pass the first phase of the test – a live exercise that tests a firm’s preparedness for adverse market-stressed events – it’s imperative that insurers have the capability to quickly pull up a very clear and transparent view of all of their holdings under these scenarios,” he said. “This is not as common as you would expect, as insurers are increasingly investing in a wide range of assets, which means they are often dealing with very different types of data in their internal systems.”

Sheng warned, however, that the DyGIST could also highlight shortcomings in firms’ data setups.

“When it comes to risk management you need to have a real-time understanding of your risk exposures in order to respond and manage that risk,” he said.

Reason for Hope

He is optimistic that insurance firms will be able to overcome the challenges, thanks to the availability of new innovations that “can provide daily, validated, and reconciled investment data on their entire portfolio, so that they can properly understand their market exposure across asset classes”.

“Those who choose to invest in such modern data infrastructures will be best prepared to demonstrate their solvency and liquidity resilience and their effectiveness in risk management practice under the DyGIST regulatory exercise,” Sheng said.

The post Insurance Stress Test Success Hangs on Data Quality and Management appeared first on A-Team.

Latest UK SDR Measure Highlights Data Challenge
https://a-teaminsight.com/blog/latest-uk-sdr-measure-highlights-data-challenge/?brand=dmi
Tue, 06 Aug 2024 14:00:20 +0000

The UK has implemented the latest stage of its sustainability disclosure requirement (SDR), which is designed to encourage manufacturers of investment products to adopt measures that will prevent greenwashing.

Before the measure was even introduced by the Financial Conduct Authority (FCA), however, it was apparent that fund managers’ likelihood of adopting the guidance would be limited by their data setups. Experts have told Data Management Insight that solving this challenge would be critical to meeting the goals that underpin the SDR.

Since July 31, managers have been asked to voluntarily label their products according to the degree to which they can be considered sustainable.

Those labelled “Sustainability Improvers” denote assets that have the potential to become sustainable but may not be now. “Sustainability Impact” products are those that invest in solutions that bring beneficial ESG impacts. “Sustainability Mixed Goals” labels indicate investment vehicles that combine the other two. A fourth, “Sustainability Focus”, is reserved for products that have at least 70% of allocations to sustainable assets.

Those seeking to adopt the labels must show they meet the requirements by the beginning of December.
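Of the four labels, only “Sustainability Focus” comes with a numeric criterion in the description above: at least 70% of allocations to sustainable assets. That single check can be sketched in a few lines of Python (the portfolio names and weights are invented, and the FCA’s real criteria are qualitative as well as quantitative, so this is illustration, not compliance logic):

```python
def sustainable_share(holdings: dict, sustainable: set) -> float:
    """Percentage of total allocation held in assets flagged as sustainable."""
    total = sum(holdings.values())
    green = sum(weight for name, weight in holdings.items() if name in sustainable)
    return 100.0 * green / total if total else 0.0

# Hypothetical fund: weights in percent, two holdings flagged sustainable.
portfolio = {"Green Bond A": 40.0, "Solar Fund B": 35.0, "Legacy Equity C": 25.0}
flagged = {"Green Bond A", "Solar Fund B"}

share = sustainable_share(portfolio, flagged)
assert share >= 70.0   # clears the threshold quoted for "Sustainability Focus"
```

The hard part for fund houses, as the article goes on to argue, is not this arithmetic but sourcing and governing the data that decides which holdings belong in the flagged set.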

Clarity Needed

Critics have predicted a slow uptake of the labels by fund houses, with some arguing that more clarity is needed about how the labels can be properly applied. At the heart of that challenge is likely to be firms’ ability to gather and use the data necessary to make those decisions.

The FCA said last year that asset managers and manufacturers must have robust data, governance and technology setups to adopt its measures. A poll during a webinar by consultancy firm Bovill Newgate, however, found that 90% of financial services respondents said they were not equipped with the correct ESG reporting data.

Emil Stigsgaard Fuglsang, co-founder at ESG data and consultancy firm Matter, said data would be a potential pain point for firms operating in the UK.

“While many global investment managers already have these competencies in place thanks to the requirements of other regulations, the majority of smaller British firms do not,” Fuglsang said.

“This means they face the challenge of accurately defining sustainability in their investments and implementing data and analytics solutions to track and document their performance against these definitions at the fund-level. This will be no easy task, but those who take action now will be best prepared by the December 2 deadline.”

Investor Protections

The labelling guidance follows the publication of anti-greenwashing advice by the FCA earlier this year, which seeks to protect investors from abuse by encouraging asset managers and manufacturers to be clear and precise in the descriptions of their products.

The FCA is keen to safeguard investors against being lured by false claims of an asset or product’s sustainability. The threat of greenwashing has been wielded as a weapon in an ESG backlash, most notably in the US, that has seen billions of dollars pulled from sustainability-linked funds.

While the measure is designed primarily to protect retail investors, it is expected also to have an impact on institutional capital allocators. One of the first funds to adopt an SDR label, AEW’s impact fund, has taken the Sustainability Impact categorisation and is offered only to institutions.

The SDR is also widely predicted to set transparency standards that institutions are likely to follow.

ESMA Guidance

The UK’s latest SDR implementation came as Europe’s regulators sought changes to some of the European Union’s disclosure rules. The European Securities and Markets Authority (ESMA) last week suggested changes that would affect the bloc’s lynchpin Sustainable Finance Disclosure Regulation (SFDR) and other measures.

In an opinion, it set out proposals urging tweaks to the EU’s wider sustainable finance framework, arguing that there needs to be greater “interconnectedness between its different components”.

Among ESMA’s proposals are a phasing out of the phrase “sustainable investments” within the SFDR and a recommendation that market participants should instead make reference only to the green Taxonomy that underpins European market rules. Further, it suggested an acceleration of the Taxonomy’s completion, incorporating a social taxonomy.

It also urged that ESG data products be brought under regulatory scrutiny to improve their quality.

Clash of Standards

Other recommendations on how sustainability products should be described could conflict with the new measures introduced by the FCA.

ESMA suggests that all products provide basic information on their sustainability, with greatest detail given to institutional investors. It also urges the introduction of a “best in class” product categorisation system. That would include at least a “Sustainability” classification, denoting products that are already green, and a “Transition” grouping of funds that aim to be sustainable.

Martina Macpherson, head of ESG product strategy and management at SIX Financial Information, said institutions would need to familiarise themselves with each code.

“Challenges for asset managers remain to categorise funds in line with the UK’s labelling regime, and to align them with the EU’s fund labelling rules introduced by ESMA,” Macpherson said. “Overall, ESG fund labels represent a significant next step to address transparency and greenwashing concerns. Meanwhile, the mounting public and regulatory attention surrounding sustainable investment demands that firms use the most reliable, legitimate, and timely data to inform their decisions.”

The post Latest UK SDR Measure Highlights Data Challenge appeared first on A-Team.

Crafting an Effective Data Strategy to Unlock Innovation
https://a-teaminsight.com/blog/crafting-an-effective-data-strategy-to-unlock-innovation/?brand=dmi
Mon, 29 Jul 2024 08:29:01 +0000

By Kelly Attrill, Head of Advisory & Consulting APAC at Lab49.

Data can be both an asset and a liability. Used correctly, it can transform an organisation’s ability to unlock value and enable innovation. However, if data is mismanaged it can have catastrophic consequences. In financial services, firms recognise that the ever-increasing volume of data they handle constitutes an asset that, with the right tooling, can deliver value far outweighing the initial investment. However, in some cases, its applicability to client outcomes may be unclear, and there may be a disconnect between how a business seeks to use data and how it’s currently being managed and distributed. To avoid this and make sure that data fulfils its potential, it’s crucial to develop and implement a robust data strategy.

Strong foundations

An effective data strategy starts with identifying business goals that will be achieved with data and defining clear operational principles for data management and usage. This includes defining what a firm can and cannot do with data and identifying the areas in which data can add value for clients and employees. Across the front and back offices, firms must be willing to invest not only in the technology but also in the necessary training to ensure these principles are embedded in client journeys and in the day-to-day work of the team.

A strategy that establishes a foundational set of goals and principles lays the groundwork for the development of frameworks, policies and plans across the firm’s divisions. For example, defining data usage boundaries in the data strategy enables the development of a well-defined data governance framework, ensuring the safe, ethical and compliant handling of data across an organisation.

It is crucial that the data strategy is linked directly to business goals and clear time horizons to achieve these goals. This will drive prioritisation and planning decisions and allow the organisation to monitor progress through the implementation of the strategy. Defining the right goals is important; focusing only on one dimension of the data strategy will limit potential value. With a focus on enabling AI use cases, many firms invest in uplifting data quality, ensuring it is correct and can be trusted across the whole landscape. On the whole, this is a good thing, but it is just as important for firms to continually invest in skills and technology to unlock value. This includes training employees to understand, access, and use data assets effectively and ensuring that data management practices are integrated into their workflows.

Moreover, a data-driven strategy must be agile, supporting the entire data lifecycle and allowing firms to adopt new tools and techniques as they emerge. This agility is vital for balancing mid-term investments in technology and people with the ability to quickly implement proven or experimental technologies that enhance data management and use.

Enhancing services

To address challenges in securing stakeholder buy-in, it is essential to clearly demonstrate how a data strategy aligns with and supports direct business outcomes and client needs. By showcasing tangible benefits, such as improved product offerings and risk management, firms can build a compelling case for investment in data initiatives.

Effectively harnessing data offers significant promise for firms looking to enhance their service offering. OECD research has found that large firms’ investments in intangible assets like data and software – which can scale without linear cost increases – can help grow their market share.

Increasingly, data is being integrated with AI to unlock advanced capabilities. For instance, AI models can streamline risk management by quickly digesting large volumes of changing regulations, and digital lending services have sped up the time to lending approvals by using machine learning and automation to improve credit decisions.

Personalised products tailored to individual clients’ needs are another significant benefit of a data-driven strategy. For example, upgrading Customer Relationship Management (CRM) systems so client information is accessible through consistent channels in an intuitive way allows front-line staff to build an understanding of client needs and enables the delivery of powerful insights that may unlock more targeted propositions and spur business growth. These can improve satisfaction and loyalty not only for existing customers but also boost new business opportunities by improving the productivity and efficiency of sales teams, supporting a more competitive commercial proposition. A data strategy that prioritises feedback loops – collecting information based on the insights and proposition value and feeding that into the next set of insights and propositions – will enable firms to shift to a data-driven strategy across multiple dimensions: data-driven product, data-driven marketing, data-driven people, and so on.

Given increased attention from regulators globally to appropriately manage and protect data, developing a mature data strategy is not only desirable in terms of compliance but can help firms stay competitive by protecting against financial loss and reputational harm.

Future-proofing

As technological change continues to accelerate, firms adopting a data-driven strategy are better placed to leverage that data in new ways across business lines, the product suite and the operating environment. When the focus of the strategy is on disconnecting tightly bound links between technology and vendor platforms and enabling access that is simple, secure and intuitive, the value of the firm’s data assets becomes clearer and more closely tied to business outcomes.

Fostering a culture of data literacy where the value of data-driven decision-making is promoted across the organisation can go a long way to ensuring that all stakeholders, from top management to front-line employees, understand the benefits of a data-driven approach and are equipped to adapt to new ways of working.

Investment in experimentation with AI and embedding trusted decision and insight models into the firm’s decision-making processes becomes much easier once data is more available and protected through the right governance environment. Feedback from the success of this will help drive a data-driven organisation and feed the next generation of data-driven strategy.

The post Crafting an Effective Data Strategy to Unlock Innovation appeared first on A-Team.

Data Warning After UK Signals New Law Covering AI Use https://a-teaminsight.com/blog/data-warning-after-uk-signals-new-law-covering-ai-use/?brand=dmi Fri, 26 Jul 2024 14:09:57 +0000

The post Data Warning After UK Signals New Law Covering AI Use appeared first on A-Team.

Financial institutions operating in the UK must begin ensuring the integrity of their data estates after the newly elected government signalled plans to forge a potentially far-reaching AI bill.

Leaders of two large data management companies said that any new technology law could usher powers of intervention if AI models and processes are seen as likely to cause danger to individuals or companies. Only with robust data management setups would organisations be able to ensure they don’t breach any new law.

Greg Hanson, group vice president and head of EMEA North sales at Informatica, and Arun Kumar, UK regional director at ManageEngine, offered their thoughts after the government of new Prime Minister Keir Starmer revealed its legislative programme for the next parliament.

While the announcement made no mention of a full AI bill, the plans set out in the King’s Speech, delivered by King Charles at the opening of parliament last week, said the UK will seek to “establish the appropriate legislation to place requirements on those working to develop the most powerful artificial intelligence models”.

Bad Data, Bad Outcomes

“Businesses must now brace for greater intervention and be prepared to demonstrate how they are protecting the integrity of AI systems and large language models,” said Hanson. “Developing robust foundations and controls for AI tools is a good starting point.”

Hanson echoed a view common among technologists and users that AI can only be useful if it is fed good data. Without that, downstream processes will be erroneous and potentially catastrophic to workflows and operations.

“Bad data could ultimately risk bad outcomes, so organisations need to have full transparency of the data used to train AI models,” he added. “And just as importantly, businesses need to understand the decisions AI models are making and why.”
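Hanson’s point can be enforced mechanically before any training run. A minimal, hypothetical sketch of a pre-training data-quality gate – the record fields and rules below are illustrative assumptions, not any vendor’s API:

```python
def quality_gate(records, required=("trade_id", "amount")):
    """Split records into clean vs quarantined before they reach a model.

    Illustrative checks only: required fields present and the amount plausible.
    """
    clean, quarantined = [], []
    for rec in records:
        ok = (all(rec.get(f) is not None for f in required)
              and isinstance(rec.get("amount"), (int, float))
              and rec["amount"] >= 0)
        (clean if ok else quarantined).append(rec)
    return clean, quarantined
```

Keeping the quarantined set, rather than silently dropping bad rows, is what supports the “full transparency” Hanson describes: every record excluded from training remains auditable.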

With more institutions putting critical parts of their data processes in the hands of AI technologies, policy makers are worried that miscalculations will lead to a snowball effect of failures and negative impacts on people and businesses. Recent AI malfunctions have led to companies paying damages to affected parties. In February, for instance, Air Canada was forced to offer reparations to a passenger who was given inaccurate information by an AI-powered chatbot.

Hanson said that organisations should begin by ensuring that machines don’t make decisions without human input.

“It’s critical that AI is designed, guided and interpreted from a human perspective,” he said, offering as an example careful consideration about whether large language models have been trained on “bias-free, inclusive data or whether AI systems can account for a diverse range of emotional responses”.

“These are important considerations that will help manage the wider social risks and implications it brings, allowing businesses to tackle some of the spikier challenges that generative AI poses so its transformative powers can be realised,” he said.

Driving Improvements

Arun at ManageEngine, the enterprise IT management division of Zoho Corporation, said a well-crafted bill would do more than simply list what organisations should not do.

“This bill promises to go a long way in helping to tackle the risks that come from a lack of specialised knowledge around this relatively new technology,” he said. Such a bill “could give businesses guidance on how to prioritise trust and safety, introducing essential guard rails to ensure the safe development and usage of AI”.

Pointing to recent ManageEngine research showing that 45% of IT professionals have only a basic understanding of generative AI technologies and that most have no governance frameworks for AI implementation, he said that a bill would provide the confidence needed to help AI systems improve.

“Introducing legislation on safety and control mechanisms, such as a requirement to protect the integrity of testing data, will help guide the use of AI so businesses can confidently use it to drive business growth,” he said.

Webinar Review: Harnessing the Wider Benefits of Data Identifiers https://a-teaminsight.com/blog/webinar-review-harnessing-the-wider-benefits-of-data-identifiers/?brand=dmi Tue, 23 Jul 2024 13:49:22 +0000

The post Webinar Review: Harnessing the Wider Benefits of Data Identifiers appeared first on A-Team.

Almost three-quarters of capital markets participants are utilising data standards and identifiers beyond their immediate regulatory use cases, realising the huge benefits that ordered and consistent datasets can bring to an enterprise’s entire operations.

The findings of an A-Team Group Data Management Insight poll showed that 40% of respondents said they are using the resources to a “great extent”, while another 33% are using them to a “good extent”. Just 13% reported they aren’t utilising them at all.

The poll illustrates how financial institutions are seizing on the consistency that identifiers bring to data to turbo-boost use cases such as know-your-customer (KYC) processes and risk management, as well as bring broad operational efficiencies, according to speakers at DMI’s most recent webinar, during which the poll was held.

The webinar, entitled “How to maximise the use of data standards and identifiers beyond compliance and in the interest of the business”, gathered leading participants in the data management and identifiers space. To confine the use of identifiers to satisfying regulatory obligations would be a waste, Emma Kalliomaki, managing director of the Derivatives Service Bureau (DSB), told the webinar.

Broad Strategy

While they are critical to bringing “efficiency and harmonisation”, their broader deployment has become part of data management best practices, Kalliomaki said. Having a data strategy that recognises the applications of such resources to data uses throughout the entire business is critical, she said, adding that this necessitated robust governance models.

Among the speakers was Alexandre Kech, chief executive of the Global Legal Entity Identifier Foundation (GLEIF), which oversees the Legal Entity Identifier (LEI) standard used by companies and financial organisations around the world. Its latest iteration, the verifiable LEI, or vLEI – a cryptographically secure digital representation of the LEI – has been adopted by a large number of companies, especially within global supply chains, Kech said.
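The LEI itself is designed to be verifiable offline: under ISO 17442 it is a 20-character alphanumeric code whose last two characters are ISO 7064 MOD 97-10 check digits. A minimal validation sketch (an illustration of the checksum, not a GLEIF tool):

```python
import string

_LEI_CHARS = set(string.digits + string.ascii_uppercase)

def lei_checksum_valid(lei: str) -> bool:
    """ISO 17442 check: letters map to two digits (A=10 .. Z=35) and the
    resulting integer, including the two check digits, must equal 1 mod 97."""
    lei = lei.upper()
    if len(lei) != 20 or not set(lei) <= _LEI_CHARS:
        return False
    return int("".join(str(int(c, 36)) for c in lei)) % 97 == 1
```

The check digits are chosen as 98 minus the remainder of the 18-character base (with “00” appended) modulo 97 – the same ISO 7064 scheme used for IBANs.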

The consistency that standards and identifiers bring is also crucial to enabling organisations to “stitch” together datasets across an enterprise, enabling them to identify patterns and outliers in those pools of information, offered Robert Muller, senior group manager and technology product owner at BNY. This, he added, can create the foundations on which AI can be applied and on which the accuracy of analytical models can be improved.
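In practice, the “stitching” Muller describes is a join on the shared identifier. A toy sketch with entirely hypothetical LEIs and records:

```python
# Reference data and transactions keyed on the same (made-up) LEIs.
entities = {"529900AAAAAAAAAAAA11": {"name": "Example Bank AG", "country": "DE"}}
trades = [
    {"lei": "529900AAAAAAAAAAAA11", "notional": 5_000_000},
    {"lei": "549300BBBBBBBBBBBB22", "notional": 1_250_000},  # no reference match
]

def stitch(trades, entities):
    """Left-join trades to entity reference data on the LEI."""
    return [
        {**t, "entity": (entities.get(t["lei"]) or {}).get("name")}
        for t in trades
    ]
```

Unmatched rows surface with `entity` set to `None` – exactly the outliers the joined view is meant to expose.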

Despite recognising the wider benefits of identifiers, many companies are encountering challenges in realising them. Chief among these, according to another poll during the webinar, is integration with existing applications and systems: two-thirds of respondents cited this as the main impediment to broader utilisation.

Integration Challenges

Laura Stanley, director of entity data and symbology at LSEG, said she was unsurprised by the polling. The multiplicity of systems and software deployed by modern financial institutions makes integrating their technology difficult and creates an obstacle to the sort of joined-up thinking that identification standards enable.

Another key challenge facing organisations, according to the poll, is the variety, and regular creation, of identification standards. As well as LEIs, these include Unique Product Identifiers (UPIs), the International Securities Identification Number (ISIN) and ISO 20022. They sit alongside the proprietary naming codes that companies use internally.
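Most of these codes carry their own integrity check. An ISIN, for instance, ends in a Luhn check digit computed over the digit expansion of its characters; a sketch of that validation, using Apple’s published ISIN (US0378331005) as the test value:

```python
def isin_valid(isin: str) -> bool:
    """Validate an ISIN's Luhn check digit (letters expand to two digits, A=10)."""
    isin = isin.upper()
    if len(isin) != 12 or not all(c.isdigit() or "A" <= c <= "Z" for c in isin):
        return False
    digits = "".join(str(int(c, 36)) for c in isin)
    total = 0
    for i, d in enumerate(reversed(digits)):
        n = int(d)
        if i % 2 == 1:              # double every second digit from the right
            n = n * 2 - 9 if n * 2 > 9 else n * 2
        total += n
    return total % 10 == 0
```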

Kalliomaki said that companies should not be deterred by the apparent complexity of these different codes because they are largely complementary. When making a business case for their wider application, they also have the benefit of being low-cost resources, she said.

Further, she added, their wider use also provides organisations the opportunity to help national standards committees play a part in the evolution of identifiers, making them even more useful and bringing greater benefits to financial institutions.

Stanley agreed, echoing a point made by Muller, that the application of AI, and in particular generative AI, was likely to simplify the currently complex process of switching between standards. This, the panel agreed, would require a programme of educating market participants on the benefits of using identifiers more broadly.

Meeting New Capital Markets Challenges: Gresham and Alveo Leaders Discuss Merger and Future Plans https://a-teaminsight.com/blog/meeting-new-capital-markets-challenges-gresham-and-alveo-leaders-discuss-merger-and-future-plans/?brand=dmi Tue, 23 Jul 2024 09:28:28 +0000

The post Meeting New Capital Markets Challenges: Gresham and Alveo Leaders Discuss Merger and Future Plans appeared first on A-Team.

The merger of Gresham Technologies and Alveo, which was announced last week, was born of a desire by each company to scale their capabilities to meet growing international demand from financial institutions at a time of increased focus on data management.

The venture saw Gresham Technologies delist from the public market to create the new company, which will be known as Gresham. The deal has resulted in a company that combines Gresham Technologies’ transaction control and reconciliations, data aggregation, connectivity solutions and regulatory reporting capabilities with Alveo’s enterprise data management for market, reference and ESG data.

Backed by Alveo’s majority investor STG, a technology-focused private equity firm, the combined business has got to work promoting what it calls its enterprise data automation offering.

Data Management Insight spoke to chief executive Ian Manocha, formerly head of Gresham Technologies, and chair Mark Hepsworth, who held the leadership role at Alveo, about the genesis of the merger and their plans for the future.

“We think it’s a big thing, and I hope the industry recognises that too,” says Hepsworth.

Data Management Insight: What was the rationale behind this merger?

Ian Manocha: Mark and I have known each other for quite a few years and have always seen the strategic value of working together.

Mark Hepsworth: We’re complementary businesses. We at Alveo focus on enterprise data management, market data, reference data and ESG data and Gresham has built a business around reconciliation, investment management data and connectivity services through to regulatory reporting. The common thread is that we’re both solving data management problems for customers in financial services.

DMI: Where do you see complementarity?

MH: There’s a lot of overlap in terms of some of our customers, but also the type of customers that we both sell to, the parts of those customers that we sell to on both the sell side and the buy side, and in areas like exchanges. Also, at a senior level the same person is often responsible for what their firm is doing around market data as well as, for example, reconciliations data and data management.

DMI: What triggered the eventual decision to merge?

IM: A number of things really came together at the right time. There was STG’s interest in us and the board’s view that our shareholders would be open to an exit at the right price. And from a Gresham perspective, we had a sense that, at this stage of the company’s development, we were going to be better served coming off the public markets and having the backing of a large firm like STG to accelerate our journey to take on the opportunities that we were seeing in the market. Mark and I started having the ‘we are finally going to make this happen’ conversation.

DMI: What are those opportunities?

IM: Between us we’ve got the landscape well covered. So now, having got all that data, the capability to manage it and ensure its quality and, of course, the reconciliation capabilities, part of the question is: ‘what more can we do with it – how can we convert that into a business opportunity for our clients?’ That’s the exciting area. So we see an opportunity now to invest more in areas like AI and to invest more in other players in the market.

DMI: What are your plans for growth?

IM: Gresham built a business organically and with some M&A work – we’ve acquired four firms in my nine years at the company. But that’s become more difficult for us on the public markets. It’s well known that there are challenges around liquidity particularly for small caps. We now have the financial backing of STG to look at those opportunities, whether we go after other firms or through organic investment, to fill out that vision of being the leading player in the data automation space for capital markets.

DMI: What will the new company offer its customers?

MH: What we’re really looking to do is create a significant new player in data management for financial services. We now have a broader range of capabilities and data management solutions that stretch further across the enterprise than before, so we can solve more problems for clients.

Clients have a real focus on data both operationally and in terms of efficiently processing that data and delivering to business users, and doing that with the right level of governance control and transparency. All our customers are regulated and ensuring that they’re using high-quality data in their downstream processes is very important.

IM: Our customers are looking for a real heavyweight player in the data automation and data management space. They want a single heavyweight, well-funded, global company with strong technology capabilities and deep domain expertise to be their partner in their digital transformation because they’re fed up working with people that don’t understand the detail of capital markets data, and they’re fed up with having too many parties to work with.

DMI: What factors are driving the demand you want to meet?

MH: What I’ve seen over the years is that clients feel data management could be easier than it is – that there’s more manual process than they’d like. Both our companies have really focused their roadmaps in recent years on how we make that easier for customers. We moved to the cloud and both adopted open-source technologies that facilitate easy data management, as well as focusing on improving business-user self-service. We really want to make data management easier for customers, and that’s where we’re going with the automation piece in our new tagline.

IM: I’ve long felt that customers are looking to simplify their operating models. It’s not just about having the technical software; it’s also the skills and the capacity to deliver the change that’s needed. That’s particularly true in mid-sized and smaller firms. There’s no way they can possibly build all that capability in house, so we want to be the partner they seek to deliver that end-to-end capability as a service.

DMI: Are there any practical technical issues you have had to overcome in your integration?

IM: Both firms have got modern development shops, cloud-native tech stacks and we use modern tools, so the kind of legacy stuff that’s harder to move forward is not an issue for us. And at the product level, things like APIs and cloud solutions, you don’t necessarily need to have the deep level of integration you did in the past. So for customers that won’t be visible.

DMI: What products and services will you be offering initially and what do you have in the pipeline?

MH: We will continue to offer those solutions we’re famous for: data automation and control, reconciliations and exceptions management, market data EDM, investment management data aggregation and regulatory reporting. But we’re also excited to get going on new initiatives.

IM: First out of the gates will be offerings for investment managers leveraging the greater richness of data that we now manage on their behalf. Let me give you a practical example: in Alveo market data pricing projects we are readily able to source pricing data for liquid assets but often struggle to obtain pricing for illiquid assets, whereas in many Gresham NAV reconciliation projects we are pulling the latest available pricing for some illiquid assets. So together we can fill a price visibility gap for our customers.

There are many other examples where we can now inject valuable insights into core processes without firms having to invest in costly, risky data lake projects. And thinking more strategically, leveraging the Alveo data management technology will help business users with self-service and distribution of these combined data sets. It’s super exciting for us, and the customers we’ve spoken to are also enthusiastic, which is the acid test.
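The gap-fill Manocha describes reduces, at its simplest, to a prioritised lookup across the two firms’ data sets. A hypothetical sketch (the identifiers and prices below are invented for illustration):

```python
# Primary source: the market data feed (liquid assets).
market_prices = {"EXAMPLE_EQUITY_1": 2712.50}
# Fallback: latest prices observed in NAV reconciliation flows (illiquid assets).
recon_prices = {"EXAMPLE_BOND_1": 98.75}

def fill_price(asset_id):
    """Prefer the market data feed; fall back to reconciliation-sourced prices."""
    price = market_prices.get(asset_id)
    return price if price is not None else recon_prices.get(asset_id)
```

A real implementation would also carry the price’s provenance and timestamp so downstream users can see which source filled each gap.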

Alveo and Gresham Merge to Offer Data Services at ‘Significant’ Scale https://a-teaminsight.com/blog/alveo-and-gresham-merge-to-offer-data-services-at-significant-scale/?brand=dmi Wed, 17 Jul 2024 10:45:20 +0000

The post Alveo and Gresham Merge to Offer Data Services at ‘Significant’ Scale appeared first on A-Team.

Data management software and services providers Alveo and Gresham Technologies have merged in a deal that the newly augmented company says will offer clients data automation and optimisation at “significant” scale.

The new business, which will be known as Gresham, will be based in London with former Gresham Technologies chief executive Ian Manocha continuing the role at the company and Mark Hepsworth, who headed Alveo, taking the chair’s position.

The combined companies marry Gresham Technologies’ transaction control and reconciliations, data aggregation, connectivity solutions and regulatory reporting capabilities with Alveo’s enterprise data management for market, reference and ESG data.

The range of data automation and process solutions it can offer will reduce the total cost of ownership of clients’ data, Gresham said.

“The combination of the two firms accelerates our journey to bring digital integrity, agility, operational efficiency and data confidence to financial markets globally,” said Manocha. “It creates a comprehensive set of solutions for data automation, operational efficiency, data management, analytics and risk mitigation for financial and corporate clients globally.”

The terms of the deal were not disclosed but Alveo’s majority owner, technology-focused private equity firm STG, backed the merger.

London-based Alveo was founded in 1991 as Asset Control, one of the first third-party enterprise data management service providers. It changed its name in 2020 after becoming a cloud-native, managed-service provider.

Gresham Technologies began life as Gresham Computing offering real-time transaction control and enterprise data integrity solutions.

Hepsworth said the newly enlarged company will be able to meet the increasing data demands of clients.

“We can now offer clients greater scale and a wider range of solutions that will simplify their operations and enable them to manage data more effectively,” he said.
