Data Delivery Platforms, Cloud & Managed Services - A-Team

TurinTech Deploys GenAI to Accelerate Financial Software (21 August 2024)

The costs associated with poor quality software coding are startling. In dollar terms alone, companies in the US incurred a US$2.4 trillion hit from the direct impacts and cost of correcting poor coding, according to a 2022 survey by the Consortium for Information and Software Quality (CISQ).

That’s before indirect costs such as reputational and legal damages are considered.

With financial institutions’ greater dependence on technology and speed of execution, the costs of software failure and slow runtimes are potentially higher. The same survey found that the dollar-value impact of operational failure was as much as 72 times higher for large financial brokerages than it was for other industries.

Since its creation in 2018, TurinTech AI has been on a mission to help firms reduce costs. The UK-based company leverages GenAI to pinpoint areas in mission-critical software systems where optimisation is needed and then it generates better code to enhance performance and efficiency.

TurinTech AI’s technology offers a range of services that can streamline billions of lines of code to reduce applications’ demands on CPU processing power and cloud usage. By doing so, companies also reduce the energy needed to carry out their everyday processes, a benefit that shrinks their carbon footprint, said chief executive Leslie Kanthan.

“Financial institutions, banks and hedge funds have huge amounts of code – hundreds of millions of lines of code – and it would take 40 guys 10 years just to review a couple of million lines of code; it’s an intractable problem,” Kanthan told Data Management Insight.

Efficiency Gains

TurinTech AI was formed by three PhD candidates who met at University College London and had gained experience of – and become frustrated by – the code optimisation tasks they’d been required to carry out at the financial companies where they subsequently worked. The company’s first product was evoML, a code generator powered by machine learning; its GenAI-powered Artemis AI code optimiser followed.

TurinTech AI says that its Artemis AI innovation has allowed clients to improve the efficiency of their coding by as much as a third. For example, it was able to improve by 32.7 per cent the runtime of pull requests on QuantLib – an open-source code library favoured by financial institutions for quantitative finance tasks, including modelling and trading.

“It ensures that you are using less of your service resources,” said Kanthan. “So if your code was taking up 30 per cent of your Amazon budget, it might now be taking 20 per cent of your Amazon budget and at the same time improving your footprint for ESG.”

Firms can be expected to dedicate about half of their overall software development budgets to debugging and fixing errors over the 25-year life expectancy of a large software system.

That low-skill work will most likely be carried out by highly trained technology professionals. Kanthan points to the experience of a globally known technology brand client that employed hundreds of developers to manually go over code to find inefficiencies.

“They’re all PhDs and professors who should be building new applications not going back through existing code,” he said. “We saved them the time, we saved them the labour resource, we gave them cost efficiency and we allow them to get more output from what they already had.”

Speed Bumps

Financial institutions also face greater exposure to code-quality challenges because they need to develop and deploy new applications at speed. Under such time-to-market constraints, developers build applications as quickly as possible, which can mean reaching for tried-and-trusted constructs that are not always the most efficient.

A common example, Kanthan said, is the use of for-loops, which are quick to write and reliable but not as efficient as other structures.
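To illustrate the general point – this is a minimal, hypothetical sketch in Python rather than anything drawn from TurinTech’s tooling, with made-up data and function names – an explicit loop and a vectorised alternative compute the same result with very different runtimes:

    import timeit
    import numpy as np

    rng = np.random.default_rng(42)
    prices = rng.random(1_000_000)   # illustrative price series
    weights = rng.random(1_000_000)  # illustrative portfolio weights

    def weighted_sum_loop(prices, weights):
        # Quick to write and dependable, but iterates element by element in pure Python
        total = 0.0
        for p, w in zip(prices, weights):
            total += p * w
        return total

    def weighted_sum_vectorised(prices, weights):
        # The same calculation expressed as a single array operation
        return float(np.dot(prices, weights))

    print("loop:      ", timeit.timeit(lambda: weighted_sum_loop(prices, weights), number=10))
    print("vectorised:", timeit.timeit(lambda: weighted_sum_vectorised(prices, weights), number=10))

Both functions return the same value; the difference lies in how much work the interpreter does per element, which is the kind of inefficiency an optimiser looks for.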

“It’s so hard as a developer to do things the most efficient way because of the time constraints; if they’re given an objective and told it is needed by the end of the week, they do it as quickly as they possibly can – their priority is to get the result they want,” said Kanthan. “So, they might do it in a messy way that duplicates many functions.”

The pressure to default to a reliable solution is also seen in some organisations’ continued use of Fortran. It is an old, CPU-hogging language, but it is dependable, and replacing it would incur a huge transitional cost. TurinTech’s Artemis AI can be deployed to translate that legacy code into modern, more efficient C++.

“Fortran is a very old and obsolete language but because it works, it doesn’t break and no one wants to touch it,” said Kanthan. “It’s too expensive to get Fortran developers because they are hard to find and very expensive. So, you’re talking about spending thousands of pounds per day per person to work on millions of lines of code; using our product will bring enormous savings.”

Rich Talent Pool Lures Arcesium to Portugal (21 August 2024)

Portugal’s combination of high-quality expertise in technology and finance was the key driver behind Arcesium’s decision to locate its first continental European office in Lisbon, the data management provider has revealed.

The Atlantic coastal city of 3 million people came out as the top pick from a list of 50 European cities considered by the New York-headquartered company, managing director David Nable told Data Management Insight.

“What we were particularly attracted to in Portugal, besides, of course, the climate, was the talent pool across both the financial and technology domains,” Nable said. “The talent pool is great, the business climate is attractive, the time zone support is there – it is really a good match for our business and for our clients’ business.”

New Hires

The office, located on the city’s prestigious Avenida da Liberdade, will eventually house around 50 staff, focused on building data solutions for the company’s financial clients. Arcesium hopes to have half that number in place by the end of this year and be fully staffed by 2025.

The Portuguese office will be managed by Arcesium managing director Ranvijay Lamba and will serve customers within the European Union. Among the services the company will deliver from Portugal is its newly launched Aquata platform, which has been designed to help financial institutions integrate data and data tools into the core of their business operations. Locating within the 27-nation bloc eliminates the additional legal and regulatory requirements that would come with serving EU customers from the company’s established office in London.

Arcesium was formed in 2015 and initially served a handful of clients that were located within its Midtown, Manhattan, neighbourhood. It now has around 2,000 software engineering staff around the world serving customers that oversee more than US$4.3 trillion of assets.

Economic Opportunity

The company enters Portugal as the southern European nation rides a robust economic wave. GDP rose 2.3 per cent in 2023, compared with a 0.6 per cent average across the 20 euro-currency nations. Inflation is forecast by the Bank of Portugal to drop to 2.9 per cent this year and unemployment is near a record low at 6.5 per cent. The firm is in good company too; its American compatriots are the largest group of foreign direct investors in the country, committing €2.3 billion in 2023, according to US State Department data.

Nable said Arcesium had weighed the relative merits of cities across continental Europe on multiple factors, including real estate values. It was Lisbon’s broad-based skill sets that won over the American company after it had spoken to recruiters and talent consultants.

“Oftentimes you’ll find a centre that’s really good for technological talent, but they don’t really have the financial domain – they don’t know the specifics of supporting or working with asset management banks,” he said. “Alternatively, you find the opposite, where you’ll have deep accounting and operational talent pools, but not the technology talent pool.

“Portugal, and Lisbon in particular, was a pretty clear winner for our ambitions.”

Expansion Plans

Staff have already been hired from within the local labour market, filling urgently needed customer-facing, engineering and customer engineering roles.

“The intention was to hire locally – we have a long queue of people who would love to relocate,” Nable said. “But the foundational business concept is to hire local talent. That’s why we chose to open in Portugal.”

The Portuguese move is part of a longer-term expansion programme at Arcesium. Recently, the company moved its New York headquarters and its London office into larger, more modern premises. A widening of its footprint in India – Arcesium’s technology and operations hub is located in Hyderabad – is in the pipeline, as are plans to open more offices in North and South America.

“Opening new offices really relates to the nature of where we’re growing, how we’re growing, where we want to source talent and how we want to support our clients,” Nable said.

  • Arcesium head of institutional asset management Mahesh Narayan will be among speakers at A-Team Group’s Data Management Summit New York on September 26. Book your place at the event here.

Asset Managers Can Learn Lessons from New Government’s Cost Pressures (21 August 2024)

By Thomas McHugh, CEO and Co-founder of FINBOURNE Technology.

Any new government brings the inevitable “change” message, but one thing never changes regardless of who has the keys to the treasury – seeking out departmental cost savings wherever humanly possible. With unprotected departments facing cuts of up to 2.9% according to the Institute for Fiscal Studies, Rachel Reeves faces the unenviable task of making eye-watering efficiency savings while also boosting growth.

This fiscal pressure provides a timely parallel for asset managers, who are also grappling with their own rising costs, particularly in the thorny area of operations. Just as any new government must prioritise where to make cuts without harming essential public services, asset managers need to navigate the longstanding challenge of reducing costs while protecting key business functions.

Moves as drastic as ditching trading terminals to save millions, as some have considered, underscore the immense pressure to streamline operations amid years of shrinking margins and heightened competition. Streamlining data management makes more sense as a cost-cutting approach, as it brings the added benefits of more effective decision-making and the opportunities to innovate that arise from better-managed data.

Much like our political system, data in asset management firms is often a tangled mess. Years of patching together disparate solutions with siloed data sets have resulted in a Frankenstein-like tech stack. Trying to change this in one big, all-encompassing programme that promises a new way can lead to disappointment and a change programme divorced from reality – more like a Liz Truss-style fiscal horror show than an efficient machine. Transforming data into a core operational asset, at a manageable cost, can be a real game changer for asset management firms.

With all this in mind, like politicians addressing their electorate, asset managers must prioritise the needs of their investors. Extensive change to data management strategies may be needed, but it doesn’t follow that changes should all take the form of ‘megaprojects.’ Technology should simplify, not complicate. The right software should work seamlessly, providing accurate and timely information that transforms how firms use data to deliver enhanced services to their clients.

The lesson from the government’s fiscal challenges is clear: prioritise, streamline, and modernise. Asset managers should invest in integrated data management solutions that will ultimately result in a leaner, more efficient operation capable of thriving in a competitive landscape.

Crafting an Effective Data Strategy to Unlock Innovation (29 July 2024)

By Kelly Attrill, Head of Advisory & Consulting APAC at Lab49.

Data can be both an asset and a liability. Used correctly, it can transform an organisation’s ability to unlock value and enable innovation. However, if data is mismanaged it can have catastrophic consequences. In financial services, firms recognise that the ever-increasing volume of data they handle constitutes an asset that, with the right tooling, can deliver value far outweighing the initial investment. However, in some cases its applicability to client outcomes may be unclear, and there may be a disconnect between how a business seeks to use data and how it is currently being managed and distributed. To avoid this, and to make sure that data fulfils its potential, it is crucial to develop and implement a robust data strategy.

Strong foundations

An effective data strategy starts with identifying the business goals that will be achieved with data and defining clear operational principles for data management and usage. This includes defining what a firm can and cannot do with data and identifying the areas in which data can add value for clients and employees. Across the front and back offices, firms must be willing to invest not only in the technology but also in the necessary training to ensure these principles are embedded in client journeys and in the day-to-day work of the team.

A strategy that establishes a foundational set of goals and principles lays the groundwork for the development of frameworks, policies and plans across the firm’s divisions. For example, defining data usage boundaries in the data strategy enables the development of a well-defined data governance framework, ensuring the safe, ethical and compliant handling of data across an organisation.

It is crucial that the data strategy is linked directly to business goals and to clear time horizons for achieving them. This will drive prioritisation and planning decisions and allow the organisation to monitor progress through the implementation of the strategy. Defining the right goals is important; focusing on only one dimension of the data strategy will limit potential value. With a focus on enabling AI use cases, many firms invest in uplifting data quality and ensuring it can be trusted across the whole data landscape. On the whole this is a good thing, but it is just as important for firms to continually invest in skills and technology to unlock value. This includes training employees to understand, access and use data assets effectively, and ensuring that data management practices are integrated into their workflows.

Moreover, a data-driven strategy must be agile, supporting the entire data lifecycle and allowing firms to adopt new tools and techniques as they emerge. This agility is vital for balancing mid-term investments in technology and people with the ability to quickly implement proven or experimental technologies that enhance data management and use.

Enhancing services

To address challenges in securing stakeholder buy-in, it is essential to clearly demonstrate how a data strategy aligns with and supports direct business outcomes and client needs. By showcasing tangible benefits, such as improved product offerings and risk management, firms can build a compelling case for investment in data initiatives.

Effectively harnessing data offers significant promise for firms looking to enhance their service offering. OECD research has found that large firms’ investments in intangible assets like data and software, which can scale without linear cost increases, can help grow their market share.

Increasingly, data is being integrated with AI to unlock advanced capabilities. For instance, AI models can streamline risk management by quickly digesting large volumes of changing regulations, and digital lending services have sped up the time to lending approvals by using machine learning and automation to improve credit decisions.
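As a purely illustrative sketch of the second point – not any particular lender’s system, and using synthetic data with scikit-learn rather than a named vendor stack – an automated credit decision can be reduced to scoring an application against a model trained on historical outcomes:

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)

    # Synthetic application features: income (thousands), debt-to-income ratio, years of credit history
    X = np.column_stack([
        rng.normal(60, 15, 2000),
        rng.uniform(0.05, 0.6, 2000),
        rng.integers(0, 30, 2000),
    ])
    # Synthetic repayment outcomes loosely tied to the features
    y = (0.03 * X[:, 0] - 6.0 * X[:, 1] + 0.1 * X[:, 2] + rng.normal(0, 1, 2000) > 0).astype(int)

    model = LogisticRegression(max_iter=1000).fit(X, y)

    applicant = np.array([[55.0, 0.35, 4]])  # a new application
    approval_probability = model.predict_proba(applicant)[0, 1]
    print(f"Estimated probability of repayment: {approval_probability:.2f}")
    print("Decision:", "approve" if approval_probability > 0.7 else "refer to manual review")

The point is the workflow rather than the model: once the decision logic is data-driven, an approval can be produced in seconds, with humans reviewing only the marginal cases.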

Personalised products tailored to individual clients’ needs are another significant benefit of a data-driven strategy. For example, upgrading Customer Relationship Management (CRM) systems so that client information is accessible through consistent channels in an intuitive way allows front-line staff to build an understanding of client needs and enables the delivery of powerful insights that may unlock more targeted propositions and spur business growth. These can improve satisfaction and loyalty among existing customers and also boost new business opportunities by improving the productivity and efficiency of sales teams, supporting a more competitive commercial proposition. A data strategy that prioritises feedback loops – collecting information on the value of each round of insights and propositions and feeding it into the next – will enable firms to shift to a data-driven approach across multiple dimensions: data-driven product, data-driven marketing, data-driven people, and so on.

Given increased attention from regulators globally to appropriately manage and protect data, developing a mature data strategy is not only desirable in terms of compliance but can help firms stay competitive by protecting against financial loss and reputational harm.

Future-proofing

As technological change continues to accelerate, firms adopting a data-driven strategy are better placed to leverage that data in new ways across business lines, the product suite and the operating environment. When the focus of the strategy is on disconnecting tightly bound links between technology and vendor platforms and enabling access that is simple, secure and intuitive, the value of the firm’s data assets becomes clearer and more closely tied to business outcomes.

Fostering a culture of data literacy where the value of data-driven decision-making is promoted across the organisation can go a long way to ensuring that all stakeholders, from top management to front-line employees, understand the benefits of a data-driven approach and are equipped to adapt to new ways of working.

Investment in experimentation with AI, and in embedding trusted decision and insight models into the firm’s decision-making processes, becomes much easier once data is more available and protected through the right governance environment. Feedback from these successes will help drive a data-driven organisation and feed the next generation of data-driven strategy.

Meeting New Capital Markets Challenges: Gresham and Alveo Leaders Discuss Merger and Future Plans (23 July 2024)

The merger of Gresham Technologies and Alveo, which was announced last week, was born of a desire by each company to scale their capabilities to meet growing international demand from financial institutions at a time of increased focus on data management.

The venture saw Gresham Technologies delist from the public market to create the new company, which will be known as Gresham. The deal has resulted in a company that combines Gresham Technologies’ transaction control and reconciliations, data aggregation, connectivity solutions and regulatory reporting capabilities with Alveo’s enterprise data management for market, reference and ESG data.

Backed by Alveo’s majority investor STG, a technology-focused private equity firm, the combined business has set to work promoting what it calls its enterprise data automation offering.

Data Management Insight spoke to chief executive Ian Manocha, formerly head of Gresham Technologies, and chair Mark Hepsworth, who held the leadership role at Alveo, about the genesis of the merger and their plans for the future.

“We think it’s a big thing, and I hope the industry recognises that too,” says Hepsworth.

Data Management Insight: What was the rationale behind this merger?

Ian Manocha: Mark and I have known each other for quite a few years and have always seen the strategic value of working together.

Mark Hepsworth: We’re complementary businesses. We at Alveo focus on enterprise data management, market data, reference data and ESG data and Gresham has built a business around reconciliation, investment management data and connectivity services through to regulatory reporting. The common thread is that we’re both solving data management problems for customers in financial services.

DMI: Where do you see complementarity?

MH: There’s a lot of overlap in terms of some of our customers, but also in the type of customers that we both sell to, the parts of those customers that we sell to on both the sell side and the buy side, and in areas like exchanges. Also, at a senior level it is often the same person who is responsible for what their firm is doing around market data, reconciliations data and data management.

DMI: What triggered the eventual decision to merge?

IM: A number of things really came together at the right time. There was STG’s interest in us and the board’s view that our shareholders would be open to an exit at the right price. And from a Gresham perspective, we had a sense that, at this stage of the company’s development, we were going to be better served coming off the public markets and having the backing of a large firm like STG to accelerate our journey to take on the opportunities that we were seeing in the market. Mark and I started having the ‘we are finally going to make this happen’ conversation.

DMI: What are those opportunities?

IM: Between us we’ve got the landscape well covered. So, having got all that data, the capability to manage it and ensure its quality and, of course, the reconciliation capabilities, the question now is: what more can we do with it – how can we convert that into a business opportunity for our clients? That’s the exciting area. So we see an opportunity now to invest more in areas like AI and to invest more in other players in the market.

DMI: What are your plans for growth?

IM: Gresham built a business organically and with some M&A work – we’ve acquired four firms in my nine years at the company. But that’s become more difficult for us on the public markets. It’s well known that there are challenges around liquidity particularly for small caps. We now have the financial backing of STG to look at those opportunities, whether we go after other firms or through organic investment, to fill out that vision of being the leading player in the data automation space for capital markets.

DMI: What will the new company offer its customers?

MH: What we’re really looking to do is create a significant new player in data management for financial services. We now have a broader range of capabilities and data management solutions that stretches further across the enterprise than before, so we can solve more problems for clients.

Clients have a real focus on data, both operationally and in terms of efficiently processing that data and delivering it to business users, and doing that with the right level of governance, control and transparency. All our customers are regulated, and ensuring that they’re using high-quality data in their downstream processes is very important.

IM: Our customers are looking for a real heavyweight player in the data automation and data management space. They want a single heavyweight, well-funded, global company with strong technology capabilities and deep domain expertise to be their partner in their digital transformation because they’re fed up working with people that don’t understand the detail of capital markets data, and they’re fed up with having too many parties to work with.

DMI: What factors are driving the demand you want to meet?

MH: What I’ve seen over the years is that clients feel data management could be easier than it is, that there’s more manual process than they’d like. Both our companies have really focused their roadmaps in recent years on how we make that easier for customers. We both moved to the cloud and adopted open-source technologies that facilitate easy data management, as well as focusing on improving business-user self-service. We really want to make data management easier for customers, and that’s where we’re going with the automation piece in our new tag line.

IM: I’ve long felt that customers are looking to simplify their operating models. It’s not just about having the technical software, it’s also the skills and the capacity to deliver the change that’s needed. That’s particularly true in the mid-sized and smaller firms. There’s no way they can possibly build all that capability in house, so we want to be the partner they seek to deliver that end-to-end capability as a service.

DMI: Are there any practical technical issues you have had to overcome in your integration?

IM: Both firms have got modern development shops, cloud-native tech stacks and we use modern tools, so the kind of legacy stuff that’s harder to move forward is not an issue for us. And at the product level, things like APIs and cloud solutions, you don’t necessarily need to have the deep level of integration you did in the past. So for customers that won’t be visible.

DMI: What products and services will you be offering initially and what do you have in the pipeline?

MH: We will continue to offer those solutions we’re famous for: data automation and control, reconciliations and exceptions management, market data EDM, investment management data aggregation and regulatory reporting. But we’re also excited to get going on new initiatives.

IM: First out of the gate will be offerings for investment managers, leveraging the greater richness of data that we now manage on their behalf. Let me give you a practical example: in Alveo market data pricing projects we are readily able to source pricing data for liquid assets but often struggle to obtain pricing for illiquid assets, whereas in many Gresham NAV reconciliation projects we are pulling the latest available pricing for some illiquid assets. So together we can fill a price visibility gap for our customers.
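As a rough illustration of that kind of gap-fill – a hypothetical Python/pandas sketch with made-up instrument identifiers and column names, not a description of either firm’s actual pipeline – prices observed in a reconciliation process can be used wherever the reference pricing feed has no value:

    import pandas as pd

    # Reference pricing feed: good coverage of liquid assets, gaps for illiquid ones
    reference_prices = pd.DataFrame({
        "instrument_id": ["BOND_A", "BOND_B", "LOAN_C", "ABS_D"],
        "price": [101.2, 99.8, None, None],
    })

    # Latest prices observed in NAV reconciliations, including some illiquid assets
    reconciliation_prices = pd.DataFrame({
        "instrument_id": ["LOAN_C", "ABS_D"],
        "price": [97.5, 88.1],
    })

    merged = reference_prices.merge(
        reconciliation_prices, on="instrument_id", how="left", suffixes=("_ref", "_recon")
    )
    # Prefer the reference price; fall back to the reconciliation-derived price where it is missing
    merged["price"] = merged["price_ref"].fillna(merged["price_recon"])
    merged["source"] = merged["price_ref"].notna().map({True: "reference", False: "reconciliation"})

    print(merged[["instrument_id", "price", "source"]])

A production pipeline would also track the lineage and staleness of each fallback price, but the join above captures the basic idea of filling a price visibility gap.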

There are many other examples where we can now inject valuable insights into core processes without firms having to invest in costly, risky data lake projects. And thinking more strategically, leveraging the Alveo data management technology will help business users with self-service and distribution of these combined data sets. It’s super exciting for us, and the customers we’ve spoken to are also enthusiastic, which is the acid test.

Alveo and Gresham Merge to Offer Data Services at ‘Significant’ Scale (17 July 2024)

Data management software and services providers Alveo and Gresham Technologies have merged in a deal that the newly augmented company says will offer clients data automation and optimisation at “significant” scale.

The new business, which will be known as Gresham, will be based in London with former Gresham Technologies chief executive Ian Manocha continuing the role at the company and Mark Hepsworth, who headed Alveo, taking the chair’s position.

The combined company marries Gresham Technologies’ transaction control and reconciliations, data aggregation, connectivity solutions and regulatory reporting capabilities with Alveo’s enterprise data management for market, reference and ESG data.

The range of data automation and process solutions it can offer will reduce the total cost of ownership of clients’ data, Gresham said.

“The combination of the two firms accelerates our journey to bring digital integrity, agility, operational efficiency and data confidence to financial markets globally,” said Manocha. “It creates a comprehensive set of solutions for data automation, operational efficiency, data management, analytics and risk mitigation for financial and corporate clients globally.”

The terms of the deal were not disclosed but Alveo’s majority owner, technology-focused private equity firm STG, backed the merger.

London-based Alveo was founded in 1991 as Asset Control, one of the first third-party enterprise data management service providers. It changed its name in 2020 after becoming a cloud-native, managed-service provider.

Gresham Technologies began life as Gresham Computing, offering real-time transaction control and enterprise data integrity solutions.

Hepsworth said the newly enlarged company will be able to meet the increasing data demands of clients.

“We can now offer clients greater scale and a wider range of solutions that will simplify their operations and enable them to manage data more effectively,” he said.

Informatica Sees a Future of AI-Focused Innovation Releases (15 July 2024)

Informatica has had a busy 2024, announcing major new innovations and partnerships as it brings artificial intelligence to the fore of its cloud-based data management offering.

Last month the California-based company deepened its association with Databricks, providing the full range of its AI-powered Intelligent Data Management Cloud capabilities within Databricks’ Data Intelligence Platform. The expanded partnership will enable joint customers to deploy enterprise-grade GenAI applications at scale, based on a foundation of high-quality, trusted data and metadata. That followed the unveiling of a similar association with Snowflake. Informatica was also selected by Microsoft as the independent software vendor (ISV) design partner for the software behemoth’s new data fabric product.

The frequency of the rollouts in recent months has been dictated by the rapidity with which Informatica’s financial institution clients are seizing on the potential of AI. Many are struggling to bring the technology into their legacy systems, while others have a vision of what they want to do with it but not the capability to implement it.

With the market also heavily weighted towards capitalising on the growing generative AI space, Informatica group vice president and head of EMEA North sales Greg Hanson said new developments and enhancements are on the cards for the near future.

“The critical foundational layer for companies is to get their data management right and if you look at the current state of most large organisations, their integration and their data management looks a bit like spaghetti,” Hanson tells Data Management Insight.

“They realise, though, that they have to pay attention to this strategic data management capability because it’s almost as fundamental as the machinery that manufacturers use to make cars.”

Rapid Change

Hanson says that the pace of innovation at Informatica is the fastest he’s seen in his two decades at the company because its clients understand the operational benefits to be gained from implementing AI-based data management processes. This “unstoppable trend towards AI” is being driven by board-level demand, especially within financial services, a sector he describes as being at the “bleeding edge” of technological adoption.

Many have had their appetites whetted by AI’s ability to streamline and improve low-hanging-fruit tasks, such as creating unique customer experiences and engagements. Embedding and extending those AI-powered capabilities across their entire organisations, however, will take more effort, says Hanson.

“Their ability to harness data and exploit AI’s potential is going to be the difference between the winners and losers in the market,” he says. But the drive to get results quickly may lure firms towards rash decisions that could create more problems later.

“They need to think strategically about data management, but they can start small and focus on a small use case and an outcome that they can deliver quickly, then grow from there.”

Make it Simple

Informatica’s clients across 100 countries include banks such as Santander and Banco ABC Brasil, US mortgage underwriting giant Freddie Mac, insurer AXA XL and online payments provider PayPal. Among the services it provides to such institutions are broad cost reduction through the optimisation of reference data operations and the simplification of their wider data processes.

This latter point is key to helping clients better use their data, says Hanson. Arguing that without good data inputs, AI’s outputs will be “garbage out at an accelerated pace”, he says that many companies have overcomplicated data setups that are hampering their adoption of the technology. By having separate tools to manage each element of their data management setup – including data access, quality, governance and mastering capabilities – large firms are strangling their ability to make AI work for them.

“But now complexity is out and simplicity is in,” Hanson says. “As companies modernise to take advantage of AI, they need to simplify their stacks.”

Enter GenAI

Informatica is helping that simplification through a variety of solutions including its own GenAI-powered technology for data management, CLAIRE GPT – the name being a contraction of “cloud AI for real-time execution”. The technology began life simply as CLAIRE seven years ago. Last year, however, it was boosted with the inclusion of GenAI technology, enabling clients to better control their data management processes through conversational prompts and deep-data interrogation.

Comparing the new iteration to Microsoft’s Copilot, Hanson says CLAIRE GPT now offers clients greater capabilities to simplify and accelerate how they consume, process, manage and analyse data. Adding to its firepower is CLAIRE GPT’s ability to let individual clients call on the combined metadata of Informatica’s 5,000-plus clients to provide them with smarter outputs.

While almost all of Informatica’s offerings are embedded with its new GenAI technology, the next step will be to ensure the company’s entire range of products benefits from it.

“Data management is complex and costly for many companies and it massively impacts the ability of the company to release new products, deliver new services and create more pleasing customer experiences,” he says.

“Our job with GenAI as the fundamental platform foundation is to offer more comprehensive services around that foundational layer of data management, and more automation and productivity around the end-to-end data management journey.”

Financial Firms Have Widest Data Security Perception Gap: Survey (15 July 2024)

The financial services sector has the widest gap between perceptions about its data security and its vulnerability to data attacks.

A survey by data security provider Dasera found that 73% of institutions questioned said they had high levels of confidence in their ability to fend off ransomware attacks, data breaches and other unauthorised uses of data. Nevertheless, records of attacks showed that those firms were among the worst affected in 2023.

“The significant number of breaches contradicts high confidence in their security strategy, suggesting overconfidence in their security posture,” the report, entitled The State of Data Risk Management 2024, stated. “The sector remains a prime target for cyberattacks due to valuable data, indicating a gap between perceived effectiveness and actual vulnerability.”

The report compared the perceptions of companies in a range of high-profile data-focused sectors, including healthcare and government, with statistics on data breaches compiled by a variety of organisations and studies. These include the Verizon Data Breach Security Report, Kroll’s Data Breach Outlook Report and the Identity Theft Resource Centre.

Record Year

The Dasera survey said the combined conclusions of those studies showed that 2023 was a “record-breaking year” for breaches.

According to Verizon, the financial services industry suffered 477 data security incidents in 2023, compared with 380 for IT firms and 433 in the healthcare sector. Only government bodies suffered more, at 582. Kroll found that financial firms accounted for the largest proportion of attacks, at 27%.

Two-thirds of breaches originated externally. With the balance coming from internal “threat actors”, the financial services firms were among the least protected against attacks from within their own systems.

The report found that 77% of breaches within the sector came from basic web application attacks, miscellaneous errors and system intrusions.

“The survey underscores the importance of adopting integrated and automated data security strategies to address these challenges,” the Dasera report stated. “Reliance on outdated, manual processes and slow adoption of automated systems contribute to current vulnerabilities. Organisations must prioritise modern, proactive approaches, including regular audits, strategic use of technology, and external consulting, to effectively navigate the evolving landscape of data risk.”

S&P Global Market Intelligence Updates Capital IQ Pro with Fixed Income Data and GenAI Summarisation (26 June 2024)

S&P Global Market Intelligence continues to update its Capital IQ Pro data and analytics platform with the addition of more than 19.4 million fixed income securities with full reference data, pricing and analytics. The company has also added GenAI-powered earnings transcript summarisation capabilities and enhanced private markets and segment data.

The fixed income data includes reference data and pricing across government, sovereign, agency and corporate securities, while the transcript summarisation capabilities use GenAI and machine learning models to offer a comprehensive overview of an earnings call by providing a summary of the transcript organised by topics and sentiment.
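In outline, this kind of topic-and-sentiment summarisation can be approached by splitting a transcript into chunks and prompting a language model for a structured summary of each. The sketch below is a generic, hypothetical illustration in Python – it is not S&P Global Market Intelligence’s implementation, and the call_llm function is a stand-in for whichever model endpoint a firm actually uses:

    from dataclasses import dataclass

    @dataclass
    class TopicSummary:
        topic: str
        sentiment: str  # e.g. "positive", "neutral", "negative"
        summary: str

    def call_llm(prompt: str) -> str:
        # Placeholder for a real model endpoint; returns a canned answer here
        # so the sketch runs without external dependencies.
        return "topic: margins | sentiment: positive | summary: Management expects margin expansion."

    def summarise_transcript(transcript: str, chunk_size: int = 4000) -> list:
        chunks = [transcript[i:i + chunk_size] for i in range(0, len(transcript), chunk_size)]
        summaries = []
        for chunk in chunks:
            prompt = (
                "Summarise the following earnings call excerpt. "
                "Reply as 'topic: ... | sentiment: ... | summary: ...'\n\n" + chunk
            )
            topic, sentiment, summary = [part.split(":", 1)[1].strip()
                                         for part in call_llm(prompt).split("|")]
            summaries.append(TopicSummary(topic, sentiment, summary))
        return summaries

    print(summarise_transcript("Revenue grew nine per cent and operating margins improved..."))

A production system would then group and deduplicate the per-chunk results by topic, which is broadly the organisation by topic and sentiment described above.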

Other updates in this round include the ongoing expansion of deep industry-level and segment-specific data across banking, energy, insurance, real estate, and technology, media and telecommunications, as well as seamless notifications of event updates through an enhanced ability to automatically sync corporate and industry events in the platform’s events calendar with users’ personal calendars.

“This latest release underscores our commitment to continually innovate with new technologies and capitalise on synergies from our merger with IHS Markit,” says Warren Breakstone, head of Capital IQ solutions at S&P Global Market Intelligence. “The addition of robust fixed income content further expands the value proposition of S&P Capital IQ Pro and the introduction of earnings call transcript summarisations, along with further advancements in AI-powered platform search capabilities, creates new efficiencies for our users.”

Recent additions to the platform include AI-enabled search and the integration of IHS Markit content including loan pricing and analytics, and Purchasing Managers’ Index indicators, country risk scores and economic data. The company’s acquisition of Visible Alpha created a premium offering of fundamental investment research capabilities to be offered as an add-on to S&P Capital IQ Pro.

Practicalities of Implementing GenAI in Capital Markets (26 June 2024)

Following the opening keynote of A-Team Group’s AI in Capital Markets Summit (AICMS), a panel of expert speakers focused on the practicalities of implementing GenAI. The panel agreed that industry hype is waning and there is enthusiasm for GenAI with firms beginning to develop use cases, although one speaker noted: “People understand the risks and costs involved, but they were initially underestimated, I would say dramatically in some cases.”

The panel was moderated by Nicola Poole, formerly at Citi, and joined by Dara Sosulski, head of AI and model management markets and securities services at HSBC; Dr. Paul Dongha, group head of data and AI ethics at Lloyds Banking Group; Fatima Abukar, data, algorithms and AI ethics lead at the Financial Conduct Authority (FCA); Nathan Marlor, head of data and AI at Version 1; and Vahe Andonians, founder, chief product officer and chief technology officer at Cognaize.

Considering the use of GenAI, an early audience poll question asked to what extent organisations are committed to GenAI applications. Some 46% said they are testing GenAI apps, 24% are using one or two apps, and 20% are using a number of apps. Nine percent are researching GenAI and 2% say there is nothing in the technology for them.

Value of GenAI applications

A second poll questioned which GenAI applications would be of most value to a delegate’s organisation. In this case, 53% of respondents cited predictive analytics, 39% risk assessment, 39% KYC automation, 28% fraud detection and 19% portfolio management.

The panel shared their own use cases, with one member experimenting with GenAI to produce programming code and creating an internal chatbot for data migration, as well as scanning data to surface information that can be categorised, sorted, filtered and summarised to create ‘kind of conversational extracts that can be used.’

All agreed that GenAI offers some low-hanging fruit, particularly in operational activities such as KYC automation, but that the technology is too young for many applications, leading firms to build capability internally before unleashing GenAI apps for customers, as there is still work to do around issues such as risk integration and ensuring copyright and data protection are not compromised. One speaker said: “There is a lot of experimentation and some research to do before we’re confident that we can use this at scale.” Another added: “There are just not enough skilled people to allow us to push hard, even if we wanted to. There’s a real pinch point in terms of skills here.”

Risks of adopting GenAI

Turning to risk, a third audience poll asked the audience what it considered to be the biggest risk around adopting GenAI. Here data quality was a clear leader, followed by lack of explainability, hallucinations, data privacy and potential misuse. Considering these results, a speaker commented: “We’ve already got existing policies and governance frameworks to manage traditional AI. We should be using those to better effect, perhaps in response to people identifying data quality as one of the key risks.”

The benefits of AI and GenAI include personalisation that can deliver better products to consumers and improve the way in which they interact with technology. From a regulatory perspective, the technologies are being focused on reducing financial crime and money laundering, and on enforcement against fraudulent activity.

On the downside, the challenges that come with AI technologies are many and include ethical risk and bias, which need to be addressed and mitigated. One speaker explained: “We have a data science lifecycle. At the beginning of this we have a piece around the ethical risk of problem conception. Throughout the lifecycle stages our data scientists, machine learning engineers and future engineers have access to Python libraries so that when they test models, things like bias and fairness are surfaced. We can then see and remediate any issues during the development phase so that by the time models come to validation and risk management we can demonstrate all the good stuff we’ve done.” This points to the need, at least in the short term, for a human element to verify and quality-assure GenAI models in their infancy.
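As a minimal illustration of the kind of check such libraries surface – a generic sketch using synthetic data and plain NumPy, not the panellist’s actual tooling – a demographic parity gap can be computed by comparing approval rates across groups:

    import numpy as np

    rng = np.random.default_rng(1)

    # Synthetic model outputs (1 = approved, 0 = declined) and a protected attribute per applicant
    predictions = rng.integers(0, 2, size=5000)
    group = rng.choice(["A", "B"], size=5000, p=[0.7, 0.3])

    def demographic_parity_gap(predictions, group):
        # Difference in approval rates between the two groups; 0 means parity
        rate_a = predictions[group == "A"].mean()
        rate_b = predictions[group == "B"].mean()
        return abs(rate_a - rate_b), rate_a, rate_b

    gap, rate_a, rate_b = demographic_parity_gap(predictions, group)
    print(f"Approval rate A: {rate_a:.3f}, approval rate B: {rate_b:.3f}, gap: {gap:.3f}")
    if gap > 0.05:  # illustrative threshold only
        print("Potential fairness issue - investigate before the model reaches validation")

Real lifecycle tooling would compute several such metrics (equalised odds, calibration by group and so on) at each stage, but the principle is the same: surface the numbers early so issues can be remediated before models reach validation and risk management.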

Getting skills right

Skills were also discussed, with one panel member saying: “We are living in a constantly more complex world, no organisation can claim that all its workforce has the skill set necessary for AI and GenAI, but ultimately I am hopeful that we are going to create more jobs than we are going to destroy, although the shift is not going to be easy.” Another said: “In compliance, we will be able to move people away from being data and document gatherers and assessors of data in a manual way to understand risk models, have a better capability and play a more interesting part.”

Taking a step back and a final look at the potential of GenAI, a speaker concluded: “Figuring out how to make safe products that we can offer to our customers is the only way we have a chance of reaching any sort of utopian conclusion. We must chart the right course for society and for people at work, because we’re all going to be affected by generative AI.”
