
GFT Blue Paper Presents the Potential of Big Data Technologies

As the technology infrastructure of financial services firms begins to buckle under the strain of huge and increasing volumes of data, many firms are investing in big data projects to improve data management for the purposes of regulatory compliance, risk reduction, cost efficiency and business benefit. The extent and intent of investment vary, however, with some firms taking an evolutionary approach that imposes big data technologies on existing processes and others taking a revolutionary approach that rethinks processes using big data technologies.

A Blue Paper by GFT, a specialist in designing and implementing IT solutions for the financial services industry, looks at the potential of big data solutions for investment banks, retail banks and insurance companies. Entitled Big Data – Uncovering the Hidden Business Value in the Financial Services Industry, the paper also considers use cases and benefits of big data, and provides recommendations to help firms successfully implement big data technologies.

Focussing on the investment banking sector, Karl Rieder, executive consultant at GFT and co-author of the Blue Paper, notes the problems of data silos in investment banks and the difficulty of capturing a complete view of activities and generating reports, but says banks are beginning to look at big data technologies that have the capacity to store huge volumes of data and the power to process it.

He explains: “Investment banks taking a revolutionary approach are restructuring IT systems and centralising data storage. This takes some doing and is often driven by regulations that require, for example, a complete view of the activity of clients or comprehensive risk calculations.”

The report cites MiFID, Basel III, FATCA and Dodd-Frank as some of the regulations driving change, as they require banks to report across asset classes and necessitate enterprise-wide views of operations and activities. Rieder adds BCBS 239, the Basel Committee’s principles for risk data aggregation and reporting, which force change in how data is managed, controlled and governed. He also notes the Volcker Rule, part of the Dodd-Frank Act, which restricts banks’ proprietary trading and requires them to report on the inventory aging of all their positions. To achieve this, explains Rieder, a bank needs to hold a view of its positions for today and the past 365 days, apply a complex algorithm and calculate how long each position has been held. Without big data technologies that can store, access and process both current and historic data, the calculation is difficult and can take days rather than the minutes achieved by big data solutions.
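
To make the aging calculation concrete, here is a minimal sketch in Python using pandas. The position data, field names and the simple “days since first appearance” rule are all hypothetical; the actual Volcker metric is considerably more involved and runs over a full year of daily snapshots.

```python
# A minimal sketch of an inventory-aging calculation of the kind the
# Volcker Rule requires. Position data, field names and the aging rule
# are hypothetical illustrations, not the regulatory algorithm itself.
from datetime import date

import pandas as pd

# Hypothetical daily position snapshots drawn from the past 365 days.
positions = pd.DataFrame({
    "position_id": ["P1", "P1", "P2"],
    "snapshot_date": [date(2015, 1, 2), date(2015, 6, 30), date(2015, 6, 30)],
    "quantity": [100, 100, 250],
})

as_of = pd.Timestamp(2015, 6, 30)

# Age each position as the days since it first appeared in inventory.
first_seen = pd.to_datetime(positions.groupby("position_id")["snapshot_date"].min())
age_days = (as_of - first_seen).dt.days

# Bucket the ages for an aging report.
buckets = pd.cut(age_days, bins=[-1, 30, 90, 365],
                 labels=["<=30d", "31-90d", "91-365d"])
print(buckets)
```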

Rieder warns that technology alone cannot solve banks’ big data problems. Firms may need to upskill staff to implement the technology successfully, he says, and it is essential to address data quality, policies and procedures before any technology is deployed.

Once these issues are resolved, the technologies GFT proposes to support big data are distributed storage and distributed processing, which together can hold and process larger volumes of data than could previously be managed. When selecting specific solutions, GFT names Hadoop, an open source framework that includes both distributed storage and processing, as its platform of choice; NoSQL databases as a means of storing and searching both structured and unstructured data; and event management platforms to support real-time processing of big data. Focussing on NoSQL, Rieder says the open source MongoDB and Cassandra databases are favourites at investment banks, but also notes the popularity of MarkLogic, a commercial, enterprise-ready NoSQL platform that includes access control and security policies.
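
As an illustration of why document databases appeal in this context, a minimal sketch using MongoDB’s Python driver, pymongo, might store trade records whose shape varies by asset class in a single collection. The connection details, collection and field names here are hypothetical.

```python
# A minimal sketch of schema-flexible trade storage in MongoDB via the
# pymongo driver. Connection details, collection and field names are
# hypothetical.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
trades = client["bank"]["trades"]

# Documents in one collection can carry different fields per asset
# class, so structured and semi-structured records sit side by side.
trades.insert_many([
    {"trade_id": "T1", "asset_class": "equity", "ticker": "XYZ", "qty": 100},
    {"trade_id": "T2", "asset_class": "irs", "notional": 5_000_000,
     "legs": [{"pay": "fixed"}, {"receive": "float"}]},
])

# One query can still reach across the differing document shapes.
query = {"$or": [{"qty": {"$gte": 100}}, {"notional": {"$gte": 1_000_000}}]}
for doc in trades.find(query):
    print(doc["trade_id"], doc["asset_class"])
```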

With big data technologies in place, GFT says investment banks should be able to support a consolidated view of trades, trade analytics, market and credit risk calculations, rogue trade detection, counterparty risk monitoring and regulatory reporting. Improved IT efficiency should drive better operations, reduced IT costs and, ultimately, increased business profitability.
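
To make one of these use cases concrete, the following is a minimal sketch of counterparty exposure aggregation using Apache Spark’s Python API, in the spirit of the distributed processing GFT describes. The file path and column names are hypothetical, and a production system would add netting, collateral and limit logic.

```python
# A minimal sketch of counterparty exposure aggregation with Apache
# Spark's Python API. The input path and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("counterparty-exposure").getOrCreate()

# Trades landed from several front-office silos into one distributed store.
trades = spark.read.parquet("hdfs:///trades/consolidated/")

# Net mark-to-market exposure per counterparty across desks and asset classes.
exposure = (
    trades.groupBy("counterparty_id")
          .agg(F.sum("mark_to_market").alias("net_exposure"))
          .orderBy(F.desc("net_exposure"))
)
exposure.show(10)
```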
