TurinTech Deploys GenAI to Accelerate Financial Software

The costs associated with poor quality software coding are startling. In dollar terms alone, companies in the US incurred a US$2.4 trillion hit from the direct impacts and cost of correcting poor coding, according to a 2022 survey by the Consortium for Information and Software Quality (CISQ).

That’s before indirect costs such as reputational and legal damages are considered.

With financial institutions’ greater dependence on technology and speed of execution, the costs of software failure and slow runtimes are potentially higher. The same survey found that the dollar-value impact of operational failure was as much as 72 times higher for large financial brokerages than it was for other industries.

Since its creation in 2018, TurinTech AI has been on a mission to help firms reduce costs. The UK-based company leverages GenAI to pinpoint areas in mission-critical software systems where optimisation is needed, then generates better code to enhance performance and efficiency.

TurinTech AI’s technology offers a range of services that can streamline billions of lines of code to reduce applications’ pressure on CPU processing power and cloud use. By doing so companies also reduce the energy needed to carry out their everyday processes, a benefit that decreases their carbon footprint, said chief executive Leslie Kanthan.

“Financial institutions, banks and hedge funds have huge amounts of code – hundreds of millions of lines of code – and it would take 40 guys 10 years just to review a couple of million lines of code; it’s an intractable problem,” Kanthan told Data Management Insight.

Efficiency Gains

TurinTech AI was formed by three PhD candidates who met at University College London and had gained experience of – and become frustrated by – the code optimisation tasks they were required to carry out at the financial companies where they subsequently worked. They founded TurinTech AI with its first product, a machine learning-powered code generator called evoML. Its GenAI-based Artemis AI code optimiser followed.

TurinTech AI says that its Artemis AI innovation has allowed clients to improve the efficiency of their code by as much as a third. For example, it improved the runtime of pull requests on QuantLib – an open-source code library favoured by financial institutions for quantitative finance tasks, including modelling and trading – by 32.7 per cent.

“It ensures that you are using less of your service resources,” said Kanthan. “So if your code was taking up 30 per cent of your Amazon budget, it might now be taking 20 per cent of your Amazon budget and at the same time improving your footprint for ESG.”

Firms can expect to dedicate about half of their overall software development budgets to debugging and fixing errors over the 25-year life expectancy of a large software system.

That low-skill work will most likely be carried out by highly trained technology professionals. Kanthan points to the experience of a globally known technology brand client that employed hundreds of developers to manually go over code to find inefficiencies.

“They’re all PhDs and professors who should be building new applications not going back through existing code,” he said. “We saved them the time, we saved them the labour resource, we gave them cost efficiency and we allow them to get more output from what they already had.”

Speed Bumps

Financial institutions also face greater exposure to code quality challenges because they need to develop and deploy new applications at speed. Under such time-to-market constraints, developers will build applications as quickly as possible, which might mean using tried-and-trusted constructs that are not necessarily the most efficient.

A common example, Kanthan said, is the use of for-loops, which are quick to write and reliable but not as efficient as other structures.
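To make the point concrete, the snippet below is a minimal, illustrative Python sketch – not TurinTech output, and not the Artemis approach – of the kind of rewrite an optimiser might suggest: an explicit for-loop replaced by vectorised array operations that produce the same result in a fraction of the time.

```python
import time

import numpy as np

prices = np.random.default_rng(seed=1).random(1_000_000) + 0.5

def total_return_loop(prices):
    # Quick to write and reliable, but slow: a plain Python for-loop.
    total = 0.0
    for i in range(1, len(prices)):
        total += (prices[i] - prices[i - 1]) / prices[i - 1]
    return total

def total_return_vectorised(prices):
    # The same calculation expressed as vectorised array operations.
    return (np.diff(prices) / prices[:-1]).sum()

start = time.perf_counter()
slow = total_return_loop(prices)
loop_secs = time.perf_counter() - start

start = time.perf_counter()
fast = total_return_vectorised(prices)
vec_secs = time.perf_counter() - start

assert np.isclose(slow, fast)  # same result, very different runtime
print(f"loop: {loop_secs:.3f}s, vectorised: {vec_secs:.3f}s")
```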

“It’s so hard as a developer to do things the most efficient way because of the time constraints; if they’re given an objective and told it is needed by the end of the week, they do it as quickly as they possibly can – their priority is to get the result they want,” said Kanthan. “So, they might do it in a messy way that duplicates many functions.”

The pressure to default to a reliable solution is also seen in the continued use by some organisations of Fortran. It is an old, CPU-hogging language, but it is dependable, and replacing it would incur a huge transitional cost. TurinTech’s Artemis AI can be deployed to translate that legacy code into modern, more efficient C++.

“Fortran is a very old and obsolete language, but because it works, it doesn’t break and no one wants to touch it,” said Kanthan. “Fortran developers are hard to find and very expensive. So, you’re talking about spending thousands of pounds per day per person to work on millions of lines of code; using our product will bring enormous savings.”

Insurance Stress Test Success Hangs on Data Quality and Management

Recently revealed tests to explore the resilience of insurers to external shocks are likely to succeed – or fail – on the quality of the data that the firms under scrutiny possess.

Data will be a central ingredient in the tests detailed last month by the Prudential Regulation Authority (PRA), which oversees the industry. In its most recent communique, the PRA detailed the design and timing of its Dynamic General Insurance Stress Test (DyGIST), which has been created as risks associated with cyber-attacks, climate change and market volatility are expected to rise.

The tests, to be held in May next year, will comprise live exercises during which firms will be presented, over three weeks, with a set of hypothetical adverse events to which they must respond as if they were real. The PRA will require detailed analyses of responses. Results will be announced as the tests progress and will go on to inform the regulator’s supervisory plans.

The exercises, which will be held alongside a similar test for life insurers, will require firms to have their data in order if they are to respond adequately – a stipulation that could be an opportunity for the insurance industry to boost its IT capabilities, said Wenzhe Sheng, senior product manager for EMEA prudential regulation at Clearwater Analytics.

“The Prudential Regulation Authority’s design of the DyGIST framework provides a strong foundation for ensuring the resilience of the insurance sector,” Sheng told Data Management Insight. “Moreover, it provides insurers with an incentive to fortify their data infrastructures and implement data-driven risk management practices.”

Banking Assessments

The stress tests were announced last year and follow similar exercises focused largely on the UK’s banking industry. They will be carried out to gain a deep understanding of the insurance industry’s solvency and liquidity buffers and to examine the effectiveness of firms’ management responses to adverse scenarios.

The PRA held workshops with the Association of British Insurers, the Lloyd’s Market Association and the International Underwriting Association to devise the design and timing of the tests. Professional services giant Deloitte said last week that the bank tests had shown that insurers should prepare for their own assessment by ensuring their data is in order.

“General insurers need to enhance their stress and scenario testing processes to be able to perform the exercise live – including ensuring they can aggregate relevant data and identify potential management actions to deploy for any given scenario,” it said in a report.

Earlier Experience

The importance of having good data in responding to new risks was highlighted in similar stress tests held three years ago by the Bank of England to assess the resilience of insurers and lenders to climate risks. Initial exercises conducted by organisations including AXA, Allianz and AIG revealed concerning failures in data preparedness, which left some struggling to complete the tests and surfaced gaps in critical datasets.

Clearwater’s Sheng said it is imperative that insurers have their data estates ready.

“In order to pass the first phase of the test – a live exercise that tests a firm’s preparedness for adverse market-stressed events – it’s imperative that insurers have the capability to quickly pull up a very clear and transparent view of all of their holdings under these scenarios,” he said. “This is not as common as you would expect, as insurers are increasingly investing in a wide range of assets, which means they are often dealing with very different types of data in their internal systems.”

Sheng warned, however, that the DyGIST could also highlight shortcomings in firms’ data setups.

“When it comes to risk management you need to have a real-time understanding of your risk exposures in order to respond and manage that risk,” he said.

Reason for Hope

He is optimistic that insurance firms will be able to overcome the challenges, thanks to the availability of new innovations that “can provide daily, validated, and reconciled investment data on their entire portfolio, so that they can properly understand their market exposure across asset classes”.

“Those who choose to invest in such modern data infrastructures will be best prepared to demonstrate their solvency and liquidity resilience and their effectiveness in risk management practice under the DyGIST regulatory exercise,” Sheng said.

Crafting an Effective Data Strategy to Unlock Innovation

By Kelly Attrill, Head of Advisory & Consulting APAC at Lab49.

Data can be both an asset and a liability. Used correctly, it can transform an organisation’s ability to unlock value and enable innovation. However, if data is mismanaged it can have catastrophic consequences. In financial services, firms recognise that the ever-increasing volume of data they handle constitutes an asset that, with the right tooling, can deliver value far outweighing the initial investment. However, in some cases its applicability to client outcomes may be unclear, and there may be a disconnect between how a business seeks to use data and how it is currently being managed and distributed. To avoid this and make sure that data fulfils its potential, it’s crucial to develop and implement a robust data strategy.

Strong foundations

An effective data strategy starts with identifying the business goals that will be achieved with data and defining clear operational principles for data management and usage. This includes defining what a firm can and cannot do with data and identifying the areas in which data can add value for clients and employees. Across the front and back offices, firms must be willing to invest not only in the technology but also in the necessary training to ensure these principles are embedded in client journeys and in the day-to-day work of the team.

A strategy that establishes a foundational set of goals and principles lays the groundwork for the development of frameworks, policies and plans across the firm’s divisions. For example, defining data usage boundaries in the data strategy enables the development of a well-defined data governance framework, ensuring the safe, ethical and compliant handling of data across an organisation.

It is crucial that the data strategy is linked directly to business goals and to clear time horizons for achieving them. This will drive prioritisation and planning decisions and allow the organisation to monitor progress through the implementation of the strategy. Defining the right goals is important; focusing on only one dimension of the data strategy will limit potential value. With a focus on enabling AI use cases, many firms invest in uplifting data quality, ensuring it is correct and can be trusted across the whole landscape. On the whole this is a good thing, but it is just as important for firms to continually invest in the skills and technology needed to unlock value. This includes training employees to understand, access and use data assets effectively and ensuring that data management practices are integrated into their workflows.

Moreover, a data-driven strategy must be agile, supporting the entire data lifecycle and allowing firms to adopt new tools and techniques as they emerge. This agility is vital for balancing mid-term investments in technology and people with the ability to quickly implement proven or experimental technologies that enhance data management and use.

Enhancing services

To address challenges in securing stakeholder buy-in, it is essential to clearly demonstrate how a data strategy aligns with and supports direct business outcomes and client needs. By showcasing tangible benefits, such as improved product offerings and risk management, firms can build a compelling case for investment in data initiatives.

Effectively harnessing data offers significant promise for firms looking to enhance their service offering. OECD research has found that large firms’ investments in intangible assets like data and software, which can scale without linear cost increases, can help grow their market share.

Increasingly, data is being integrated with AI to unlock advanced capabilities. For instance, AI models can streamline risk management by quickly digesting large volumes of changing regulations, and digital lending services have sped up the time to lending approvals by using machine learning and automation to improve credit decisions.

Personalised products tailored to individual clients’ needs are another significant benefit of a data-driven strategy. For example, upgrading Customer Relationship Management (CRM) systems so client information is accessible through consistent channels in an intuitive way allows front-line staff to build an understanding of client needs and enables the delivery of powerful insights that may unlock more targeted propositions and spur business growth. These can improve satisfaction and loyalty among existing customers and also boost new business opportunities by improving the productivity and efficiency of sales teams, supporting a more competitive commercial proposition. A data strategy that prioritises feedback loops – collecting information on the value of insights and propositions and feeding it into the next set – will enable firms to become data-driven across multiple dimensions: data-driven product, data-driven marketing, data-driven people, and so on.

Given increased attention from regulators globally to appropriately manage and protect data, developing a mature data strategy is not only desirable in terms of compliance but can help firms stay competitive by protecting against financial loss and reputational harm.

Future-proofing

As technological change continues to accelerate, firms adopting a data-driven strategy are better placed to leverage that data in new ways across business lines, the product suite and the operating environment. When the focus of the strategy is on disconnecting tightly bound links between technology and vendor platforms and enabling access that is simple, secure and intuitive, the value of the firm’s data assets becomes clearer and more closely tied to business outcomes.

Fostering a culture of data literacy where the value of data-driven decision-making is promoted across the organisation can go a long way to ensuring that all stakeholders, from top management to front-line employees, understand the benefits of a data-driven approach and are equipped to adapt to new ways of working.

Investment in experimentation with AI and embedding trusted decision and insight models into the firm’s decision-making processes becomes much easier once data is more available and protected through the right governance environment. Feedback from the success of this will help drive a data-driven organisation and feed the next generation of data-driven strategy.

Duco Unveils AI-Powered Reconciliation Product for Unstructured Data

Duco, a data management automation specialist and recent A-Team Group RegTech Insight Awards winner, has launched an artificial intelligence-powered end-to-end reconciliation capability for unstructured data.

The Adaptive Intelligent Document Processing product will enable financial institutions to automate the extraction of unstructured data for ingestion into their systems. The London-based company said this will let market participants automate a choke-point that is often addressed through error-prone manual processes.

Duco’s AI can be trained on clients’ specific documents, learning how to interpret layout and text in order to replicate data gathering procedures with ever-greater accuracy. It will work within Duco’s SaaS-based, no-code platform.
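Duco has not published the internals of the product, but the end-to-end pattern described – extract structured fields from free-form text, then reconcile them against booked records – can be sketched generically in Python. In the sketch below a regular expression stands in for the trained extraction model, and every name and figure is invented for illustration.

```python
import re

import pandas as pd

# Hypothetical unstructured trade confirmation text.
document = """
Trade Confirmation
Trade ID: T-1001  Instrument: XS1234567890  Notional: 5,000,000 USD
Trade ID: T-1002  Instrument: US9128285M81  Notional: 2,500,000 USD
"""

# Step 1: extract structured records from the free-form text.
pattern = re.compile(
    r"Trade ID:\s*(?P<trade_id>\S+)\s+Instrument:\s*(?P<isin>\S+)"
    r"\s+Notional:\s*(?P<notional>[\d,]+)"
)
extracted = pd.DataFrame(
    [
        {**m.groupdict(), "notional": float(m.group("notional").replace(",", ""))}
        for m in pattern.finditer(document)
    ]
)

# Step 2: reconcile the extracted records against structured booking data.
booked = pd.DataFrame([
    {"trade_id": "T-1001", "notional": 5_000_000.0},
    {"trade_id": "T-1002", "notional": 2_000_000.0},
])
recon = extracted.merge(booked, on="trade_id", suffixes=("_doc", "_booked"))
recon["break"] = recon["notional_doc"] != recon["notional_booked"]
print(recon[["trade_id", "notional_doc", "notional_booked", "break"]])
```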

The company won the award for Best Transaction Reporting Solution in A-Team Group’s RegTech Insight Awards Europe 2024 in May.

Managing unstructured data has become a key goal of capital markets participants as they take on new use cases, such as private market access and sustainability reporting. These domains are largely built on datasets that lack the structure of the reference, pricing and other data with which they must be amalgamated in firms’ systems.

“Our integrated platform strategy will unlock significant value for our clients,” said Duco chief executive Michael Chin. “We’re solving a huge problem for the industry, one that clients have repeatedly told us lacks a robust and efficient solution on the market. They can now ingest, transform, normalise, enrich and reconcile structured and unstructured data in Duco, automating data processing throughout its lifecycle.”

Practicalities of Implementing GenAI in Capital Markets

Following the opening keynote of A-Team Group’s AI in Capital Markets Summit (AICMS), a panel of expert speakers focused on the practicalities of implementing GenAI. The panel agreed that industry hype is waning and there is enthusiasm for GenAI with firms beginning to develop use cases, although one speaker noted: “People understand the risks and costs involved, but they were initially underestimated, I would say dramatically in some cases.”

The panel was moderated by Nicola Poole, formerly at Citi, and joined by Dara Sosulski, head of AI and model management markets and securities services at HSBC; Dr. Paul Dongha, group head of data and AI ethics at Lloyds Banking Group; Fatima Abukar, data, algorithms and AI ethics lead at the Financial Conduct Authority (FCA); Nathan Marlor, head of data and AI at Version 1; and Vahe Andonians, founder, chief product officer and chief technology officer at Cognaize.

Considering the use of GenAI, an early audience poll question asked to what extent organisations are committed to GenAI applications. Some 46% said they are testing GenAI apps, 24% are using one or two apps, and 20% are using a number of apps. Nine percent are researching GenAI and 2% say there is nothing in the technology for them.

Value of GenAI applications

A second poll questioned which GenAI applications would be of most value to a delegate’s organisation. In this case, 53% of respondents cited predictive analytics, 39% risk assessment, 39% KYC automation, 28% fraud detection and 19% portfolio management.

The panel shared their own use cases, with one member experimenting with GenAI to produce programming code and creating an internal chatbot for data migration, as well as scanning data to surface information that can be categorised, sorted, filtered and summarised to create ‘kind of conversational extracts that can be used.’

All agreed that GenAI offers some low-hanging fruit, particularly in operational activities such as KYC automation, but that the technology is too young for many applications, leading firms to build capability internally before unleashing GenAI apps for customers as there is still work to do around issues such as risk integration and ensuring copyright and data protection are not compromised. One speaker said: “There is a lot of experimentation and some research to do before we’re confident that we can use this at scale.” Another added: “There are just not enough skilled people to allow us to push hard, even if we wanted to. There’s a real pinch point in terms of skills here.”

Risks of adopting GenAI

Turning to risk, a third audience poll asked the audience what it considered to be the biggest risk around adopting GenAI. Here data quality was a clear leader, followed by lack of explainability, hallucinations, data privacy and potential misuse. Considering these results, a speaker commented: “We’ve already got existing policies and governance frameworks to manage traditional AI. We should be using those to better effect, perhaps in response to people identifying data quality as one of the key risks.”

The benefits of AI and GenAI include personalisation that can deliver better products to consumers and improve the way in which they interact with technology. From a regulatory perspective, the technologies are focused on reducing financial crime and money laundering, and on strengthening enforcement against fraudulent activity.

On the downside, the challenges that come with AI technologies are many and include ethical risk and bias, which need to be addressed and mitigated. One speaker explained: “We have a data science lifecycle. At the beginning of this we have a piece around the ethical risk of problem conception. Throughout the lifecycle stages our data scientists, machine learning engineers and future engineers have access to python libraries so that when they test models, things like bias and fairness are surfaced. We can then see and remediate any issues during the development phase so that by the time models come to validation and risk management we can demonstrate all the good stuff we’ve done.” This leads to the need, at least in the short term, for a human element to verify and quality-assure GenAI models in their infancy.
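As a rough illustration of the kind of check such libraries can surface during model testing, the Python sketch below computes two widely used fairness measures on hypothetical model decisions; the metrics, the 0.8 threshold and the data are assumptions for illustration, not the speaker’s actual pipeline.

```python
import numpy as np

# Simulated model decisions for two groups of a protected attribute.
rng = np.random.default_rng(seed=7)
group = rng.integers(0, 2, size=10_000)
approved = rng.random(10_000) < np.where(group == 1, 0.55, 0.45)

rate_0 = approved[group == 0].mean()
rate_1 = approved[group == 1].mean()

demographic_parity_diff = abs(rate_1 - rate_0)
disparate_impact_ratio = min(rate_0, rate_1) / max(rate_0, rate_1)

print(f"approval rates: {rate_0:.3f} vs {rate_1:.3f}")
print(f"demographic parity difference: {demographic_parity_diff:.3f}")
print(f"disparate impact ratio: {disparate_impact_ratio:.3f}")

# An assumed guardrail: flag the model for remediation if the ratio
# falls below the 0.8 'four-fifths' rule of thumb.
if disparate_impact_ratio < 0.8:
    print("FLAG: potential bias - remediate before validation")
```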

Getting skills right

Skills were also discussed, with one panel member saying: “We are living in a constantly more complex world, no organisation can claim that all its workforce has the skill set necessary for AI and GenAI, but ultimately I am hopeful that we are going to create more jobs than we are going to destroy, although the shift is not going to be easy.” Another said: “In compliance, we will be able to move people away from being data and document gatherers and assessors of data in a manual way to understand risk models, have a better capability and play a more interesting part.”

Taking a step back and a final look at the potential of GenAI, a speaker concluded: “Figuring out how to make safe products that we can offer to our customers is the only way we have a chance of reaching any sort of utopian conclusion. We must chart the right course for society and for people at work, because we’re all going to be affected by generative AI.”

AI in Capital Markets Summit Tracks Evolution of GenAI and Value Creation

Generative AI (GenAI) took the world by storm in November 2022 when OpenAI introduced ChatGPT. It has since become a talking point across capital markets as financial institutions review its potential to deliver value, consider the challenges it raises, and question whether they have the data foundation in place to deliver meaningful, unbiased and ethical results from GenAI applications. While applications have yet to be implemented to any significant extent in the market, financial institutions are running internal proofs of concept.

The potential and problems of AI and GenAI were the subject of lively discussion at A-Team Group’s inaugural AI in Capital Markets Summit (AICMS) in London last week, with speakers exploring current and emerging trends in AI, the potential of GenAI and large language models (LLMs), and how AI can be applied to achieve efficiencies and business value across the organisation. With a note of caution, the conversation also covered the risks and challenges of adopting AI and the foundational technologies and data management capabilities that underpin successful deployment.

Opening the summit and introduced by A-Team president and chief content officer Andrew Delaney, Edward J. Achter from the office of applied AI at HSBC set the scene for the day, noting the need to build AI and GenAI products that are responsible and ethical and can be scaled, and describing the importance of educating and engaging the workforce to ensure solutions are used effectively and ethically.

In more detail, the keynote speaker explained the explosion of interest in AI and GenAI following the release of ChatGPT and a change in conversation at financial institutions. He also warned of risks inherent to the technology, including fairness and bias, data privacy, and the deliberate spread of false information. To mitigate risk and create value, Achter emphasised the need for firms to get their data house in order and, a request perhaps a long time in the asking, to pay attention to data leaders, as data is the lifeblood of AI and GenAI applications.

Also important to consider are regulatory requirements around AI and GenAI, addressing the carbon emission costs of using LLMs, and perhaps most importantly, writing a clear company policy that can be shared with all stakeholders. Demonstrating the benefits of AI and GenAI products can turn scepticism into an understanding of benefits, including productivity gains that can be measured, and change negative perspectives into positive approaches to doing more with the technology.

Ultimately, a skilled workforce, educated customers, technology used in the right context of conduct, and confidence across the organisation will result in value creation.

Datactics Enhances Augmented Data Quality Solution with Magic Wand and Rule Wizard

Datactics has enhanced the Augmented Data Quality Solution (ADQ) it brought to market in November 2023 with the addition of an AI magic wand, Snowflake connectivity and an SQL rule wizard in ADQ v1.4. The company is also working towards the release of ADQ v1.5 that will include generative AI (GenAI) rules and predictive remediation suggestions based on machine learning (ML).

ADQ was born out of market interest in a data quality solution designed not only for dedicated data quality experts and data stewards, but also for non-tech users who could write their own data quality rules. A user-friendly experience, automation and a reduction in manual processes were also top of mind.

Kieran Seaward, head of sales and business development at Datactics, explains: “Customers said their challenges with data quality were the time it took to stand up solutions and enable users to manage data quality across various use cases. There were also motivational challenges around tasks associated with data ownership and data quality. We took all this on board and built ADQ.”

ADQ v1.4

ADQ made a strong start, and v1.4 is also a response to customer interests, this time in automation, reduced manual intervention, improved data profiling and exception management, increased connectivity, predictive data quality analytics, and more.

Accelerating automation, ADQ v1.4 offers enhanced out-of-the-box data quality rules that ease the burden for non-tech users. The AI magic wand includes reworked AI and ML features and an icon showing where users can benefit from Datactics ML in ADQ. Data quality process automation also accelerates the assignment of issues to nominated data users.

Increased connectivity features the ability to configure a Snowflake connection straight through the ADQ user interface, eliminating the need to set this up in the backend. The company is working on additional integrations as it moves towards v1.5.

Predictive data quality analytics monitor data quality and alert data stewards to breaks and other issues. Stewards can then view the problems and ADQ v1.4 will suggest solutions. Based on a breakage table of historical data from data quality rules, ADQ v1.4 can also predict why data quality will fail in the future. Seaward comments: “Data quality is usually reactive but now we can put preventative processes in place. Predictive data quality is very safe to use as the ML does not change the data, instead providing helpful suggestions based on pattern recognition.”
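Datactics has not published ADQ’s features or model, but the general technique – train a classifier on a breakage table of historical rule outcomes, then score incoming batches before the rules run – might look something like the following sketch, with every column name and figure invented for illustration.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Hypothetical breakage table: past batches with simple profile features
# and whether a data quality rule subsequently failed.
history = pd.DataFrame({
    "rows_loaded":      [10_000, 9_500, 12_000, 400, 11_000, 300, 9_800, 250],
    "null_rate":        [0.01, 0.02, 0.01, 0.30, 0.02, 0.25, 0.01, 0.40],
    "hours_since_feed": [1, 2, 1, 26, 3, 30, 2, 28],
    "rule_failed":      [0, 0, 0, 1, 0, 1, 0, 1],
})

features = ["rows_loaded", "null_rate", "hours_since_feed"]
model = LogisticRegression(max_iter=1000).fit(
    history[features], history["rule_failed"]
)

# Score tomorrow's incoming batch before any rule has run.
next_batch = pd.DataFrame(
    [{"rows_loaded": 350, "null_rate": 0.28, "hours_since_feed": 27}]
)
p_fail = model.predict_proba(next_batch[features])[0, 1]
print(f"predicted probability of rule failure: {p_fail:.2f}")
```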

The SQL rule wizard allows data quality authors to build SQL rules in ADQ, performing data quality checks in-situ to optimise processing time.
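As a generic illustration of an in-situ check – not Datactics’ implementation – the sketch below expresses a data quality rule as SQL and executes it where the data lives, so only the exception count travels back to the caller. SQLite is used purely to keep the example self-contained.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (trade_id TEXT, notional REAL, currency TEXT)")
conn.executemany(
    "INSERT INTO trades VALUES (?, ?, ?)",
    [("T-1", 1_000_000.0, "USD"), ("T-2", None, "EUR"), ("T-3", -50.0, "GBP")],
)

# Data quality rule: notional must be present and positive. The check runs
# inside the database engine; only the breach count is returned.
rule = """
    SELECT COUNT(*) FROM trades
    WHERE notional IS NULL OR notional <= 0
"""
breaches = conn.execute(rule).fetchone()[0]
print(f"rows breaching the notional rule: {breaches}")  # prints 2
```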

ADQ v1.5

Moving on to ADQ v1.5 and the integration of GenAI, users will be able to query the model, write a rule for business logic specific to their domain and test the rule to see if it produces desired results. Datactics is currently using OpenAI ChatGPT to look at the potential of GenAI, but acknowledges that financial institutions are likely to have their own take on LLMs and will point its solution to these internal models.

Other developments include a data readiness solution with preconfigured rules that can check data quality and allow remedial action before regulatory submissions are made under regulations including EMIR Refit, MiFID III and MiFIR II, the US Data Transparency Act, and SEC rule 10c-1.

Criticality rules that will help data stewards prioritise data problems and solutions are also being prototyped, along with improved dashboards and permissioning. As when the product started, next-stage development will continue to make ADQ friendlier for business users.

Global Indices Launch Marks SIX’s Latest Expansion of ETF Business

SIX, the data aggregator and operator of Swiss and Spanish stock exchanges, has expanded its indexing business with the creation of two families of global equities indices that can be used by the company’s retail, private banking and asset management clients.

The SIX World Indices provide a broad view of global markets through a diversified and variable roster of stocks traded across major markets. Meanwhile, the SIX Broad & Blue-Chip Indices present views into a fixed array of equities that constitute the most representative companies within geographic markets and regions.

The indices are aimed at helping clients streamline their data operations by providing direct access to information on the stocks in which they are invested without having to subscribe to costly third-party services and products.

Further Expansion

The new indices represent the latest step in SIX’s plan to become a one-stop-shop for exchange-traded funds (ETFs), providing fund manufacturers with the tools to create and list their products and take advantage of SIX’s trading, custody, index and data services.

It already publishes indices around its Swiss and Spanish stock exchange operations, and has also launched Nordic and ESG gauges. The company signalled its intention to build out its index business earlier this year when it made a strategic investment in BITA, a provider of indexing technology and services used by exchanges, delta one desks and asset management firms. The cash injection cemented a relationship begun two years ago that has aided SIX’s move into the cryptocurrency sector.

In an interview with Data Management Insight, which will be published in full next week, SIX head of financial information Marion Leslie explained that the BITA investment provides the opportunity “to do so much more”.

“The ability to start running global indices, custom indices, thematic indices – which we are investing in – is a great asset to have,” Leslie said.

SIX head of index services, financial information Christian Bahr said that the company’s index business was responding to rising demand for benchmarks, especially from passive funds, and that he expected more to be created soon.

“Establishing a strong presence across the banking sector is paramount for recognition as a key player in global indices and global market data,” Bahr said. “Many financial institutions are in the process of renewing their online banking products, including their own websites and apps,” he added.

Fast Connections

The indices also provide an API connection to data on each component, enabling clients to access real-time pricing and granular performance data, as well as other datasets. This, said Bahr, provides a more “sophisticated overview of market performance for their investment-savvy customers, and our combined proposition around API delivery will make accessing this data faster, simpler, and more cost-effective for financial institutions”.

All indices are priced in dollars, euros and Swiss francs, and components are weighted by free-float market capitalisation. The SIX World Indices will be reviewed each June and December, while those in the SIX Broad and Blue-Chip family will be updated individually.
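Free-float weighting scales each company’s market capitalisation by the proportion of its shares actually available to trade, so closely held stock does not inflate a constituent’s weight. A short worked example in Python, with invented figures:

```python
# name: (share price, shares outstanding, free-float factor)
constituents = {
    "Alpha AG":  (100.0, 10_000_000, 0.90),
    "Beta SA":   (50.0, 40_000_000, 0.60),
    "Gamma PLC": (20.0, 25_000_000, 1.00),
}

# Free-float market cap = price x shares x proportion freely tradable.
ff_caps = {
    name: price * shares * ff
    for name, (price, shares, ff) in constituents.items()
}
total = sum(ff_caps.values())

for name, cap in ff_caps.items():
    print(f"{name}: weight {cap / total:.1%}")
# Alpha AG: 34.6%, Beta SA: 46.2%, Gamma PLC: 19.2%
```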

TurinTech innovates with Artemis code optimisation

TurinTech, a London-based technology vendor, plans to revolutionise code optimisation with its GenAI Artemis solution. Artemis is based on a proprietary large language model (LLM) – although it can be used with other LLMs – that is trained to help financial firms optimise software code, speed up execution, reduce cloud costs and lower carbon emissions. To date, Artemis has been implemented by investment banks in the UK, France and US.

The company was set up in 2019 by co-founders who met at University College London while doing PhD research. They went on to work in financial institutions, where they experienced the problems of getting code into production at any speed, internal bottlenecks holding up developers, and the pain points of code reviews.

“There had to be a better way of doing things and a way to resolve these problems,” says Leslie Kanthan, CEO and co-founder of TurinTech. He notes that while financial institutions tend not to have code optimisation teams, Artemis can help them improve code quality and make developers more efficient, and can give firms that spend vast amounts of money on cloud savings of about 10% by optimising code – a potentially huge saving.

As well as optimising code and reducing costs, Artemis plays well into financial institutions’ sustainability goals by running better code faster, decreasing compute usage and providing energy savings.

Artemis scans software code on-premises or in the cloud. It uses TurinTech’s LLM, which has been trained on millions of lines of code and informed by the team’s proprietary knowledge, although it can also be used with other LLMs, perhaps less effectively. It also takes hardware into consideration to allow legacy systems to perform to the best of their ability.

Use cases of the solution include identifying weaknesses in code and providing recommendations for optimal changes that enhance performance, noting code that could be sped up or improved by modifying particular lines, and analysing code bases to predict their efficiency – all with a human in the loop but reducing resource requirements overall.

Kanthan concludes: “Everyone wants to use AI, but will it add value to the business? LLMs are just another form of data, so you need apps for use cases. TurinTech has an app for code optimisation and is, at the moment, leading the market.”

GoldenSource Partners Snowflake to Deliver Omni Data Management App for the Buy-Side

GoldenSource has partnered Snowflake to deliver GoldenSource Omni, a native application that deploys the company’s data model on the Snowflake Data Cloud and combines data from multiple sources to centralise all processing and provide analytics and reporting of investment data within the cloud.

As well as combining operational and analytical investment data in a unified data model, GoldenSource Omni allows investment managers to view datasets including securities, prices, listed and private portfolios, transactions and ESG data within a single, easy-to-understand format on the Snowflake platform. They can then analyse data more effectively and accelerate the application of generative AI, including training AI and machine learning models. They can also analyse portfolio holdings and exposures in a timely manner, drill into specific attributes of a portfolio and automate attribution reporting.
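GoldenSource has not published Omni’s schema, but the kind of drill-down a unified holdings model enables can be sketched with a generic pandas example; the column names and figures below are assumptions for illustration only.

```python
import pandas as pd

# Hypothetical unified view of holdings across portfolios.
holdings = pd.DataFrame([
    {"portfolio": "Fund A", "asset_class": "Equity", "sector": "Tech",      "market_value": 4_000_000},
    {"portfolio": "Fund A", "asset_class": "Equity", "sector": "Energy",    "market_value": 1_000_000},
    {"portfolio": "Fund A", "asset_class": "Bond",   "sector": "Sovereign", "market_value": 5_000_000},
    {"portfolio": "Fund B", "asset_class": "Equity", "sector": "Tech",      "market_value": 2_000_000},
])

# Exposure by asset class per portfolio, then a drill-down into one
# attribute (sector) across all portfolios.
by_class = holdings.groupby(["portfolio", "asset_class"])["market_value"].sum()
tech_exposure = holdings.loc[holdings["sector"] == "Tech", "market_value"].sum()

print(by_class)
print(f"Total tech exposure across funds: {tech_exposure:,}")
```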

“Historically, buy-side participants have struggled with the management of disparate datasets. This was exacerbated by the absence of a comprehensive data model,” says Jeremy Katzeff, head of buy-side solutions at GoldenSource. “With the release of GoldenSource Omni, datasets for different asset classes and functions can be integrated and analysed within a modern cloud-native environment. Firms can replace outdated legacy systems with centralised, cloud-based enterprise data management that is far more efficient and cost effective.”

Rinesh Patel, global head of industry, financial services at Snowflake, adds: “GoldenSource is an ideal partner for us with its market experience and data model that links data across domains, providing a more efficient way for joint customers to run analytics, derive insights and train AI models within the Snowflake platform.”
