The Hidden Cost of Bad Data – Why Accuracy Pays Off


By Ariel Junqueira-DeGarcia, Strategy and Technology Leader at Broadridge.

In the financial world, data is king. But many argue that firms are dangerously reliant on data they inadequately understand, unknowingly wielding a double-edged sword that can just as easily enrich as it can erode. This article pulls back the curtain on the murky world of data usage, examining the hidden costs of bad decisions driven by flawed information.

At a glance

  • Think bad data is just typos and misplaced commas? Think again. It’s a silent saboteur, shattering profits, sowing operational chaos, and leaving reputational damage in its wake.
  • Bad data hides in plain sight, silently emerging through flawed data capture, messy storage, and faulty processing. By zeroing in on these three battlegrounds, we can strategically deploy resources to better address these challenges and transform data into fuel for future growth.
  • The impact on the bottom line is typically seen in four areas: financial, operational, regulatory and reputational. While high-profile financial losses make the headlines, smaller, frequent incidents are just as damaging and can quickly avalanche into millions of dollars in losses.
  • The largest unseen cost of poor data quality is hindering technological advancement. Without clean data, AI tools are a waste of resources; with clean data they are the future of the industry.

Reliance on data
Broker-dealers, asset managers, and companies across industries are becoming increasingly reliant on data to support operations across the firm. Poor data quality costs organizations an average of $15 million USD per year, according to Gartner’s 2017 Data Quality Market Survey.

Data plays a crucial role in supporting the everyday critical decisions for your organization (figure 1). Beyond a collection of numbers and facts, it is the lifeblood of your business operations, influencing every aspect of your relationship with management, staff, customers and regulators.

Data guides decision-making to drive financial performance – when to buy or sell a stock, or how to assess the risk of a credit agreement. Data delivered via management reporting provides the visibility to assess the strength of your operational performance – the reliability of revenue forecasts, vendor spend. You are also sourcing data to demonstrate compliance across regulatory regimes, incorporating new data to keep pace with regulatory change. The credibility and reputation of your firm is at stake: accurate data enables timely resolution of exceptions, control over money movements, and protection of customers from financial crimes.

Bad data is insidious

For data users, it can be frustrating that data is not consistently delivered in a complete, accurate, and timely manner. Why is it so hard? The root causes of bad data can be pervasive, driven by a legacy of siloed solutions designed to solve short-term problems. In addition, we can easily overlook just how complex it is to identify the data we need, when we need it, and how it will be managed. Bad data often boils down to three things: data capture, data storage, and data processing.

Data Capture – Poor data quality often manifests through problems with the accuracy or completeness of the data records captured, such as:

  • Incomplete or non-conforming data: empty fields, spelling mistakes, substitutions, non-standard data values excluded from analysis or storage, data entered in the wrong fields
  • Duplicate records: data brought in from multiple sources that results in duplicates that are not addressed
  • Data decay: mismanagement of out-of-date or irrelevant data that continues to be used for reporting.
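The three capture-stage problems above lend themselves to simple automated checks. A minimal sketch in plain Python – the record fields and thresholds are illustrative assumptions, not a prescribed schema:

```python
from datetime import date

# Hypothetical trade records as captured; field names are illustrative.
records = [
    {"id": "T1", "symbol": "AAPL", "price": 189.5, "as_of": date(2024, 6, 3)},
    {"id": "T2", "symbol": "", "price": 101.2, "as_of": date(2024, 6, 3)},      # empty field
    {"id": "T1", "symbol": "AAPL", "price": 189.5, "as_of": date(2024, 6, 3)},  # duplicate
    {"id": "T3", "symbol": "MSFT", "price": 420.0, "as_of": date(2019, 1, 2)},  # stale
]

def quality_issues(rows, today=date(2024, 6, 4), max_age_days=365):
    """Flag the three capture problems described above:
    incompleteness, duplicates, and data decay."""
    issues = []
    seen = set()
    for r in rows:
        # Incomplete or non-conforming data: any empty field
        if not all(str(v).strip() for v in r.values()):
            issues.append((r["id"], "incomplete"))
        # Duplicate records: the same record ingested more than once
        key = tuple(sorted(r.items()))
        if key in seen:
            issues.append((r["id"], "duplicate"))
        seen.add(key)
        # Data decay: records older than the allowed age still in use
        if (today - r["as_of"]).days > max_age_days:
            issues.append((r["id"], "decayed"))
    return issues
```

In practice these rules would live in a validation layer at the point of capture, so that bad records are quarantined before they reach downstream consumers.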

Data Storage – Bad data from improper data governance often manifests when there are gaps in data practices, such as:

  • Data silos: data sources are often stored and analyzed separately, allowing for incomplete or inaccurate data to be consumed
  • Data swamps: data needs to be updated and cleaned frequently; without orchestration processes to update and organize it, old data will be used for analysis
  • Poor data versioning: gaps in approach to slowly changing dimensions
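The versioning gap above is commonly closed with Type 2 slowly changing dimensions: rather than overwriting a record in place (which loses history), each change closes the current version and opens a new one with effective dates. A toy sketch under those assumptions – the field names are hypothetical:

```python
from datetime import date

def apply_scd2(history, key, new_attrs, effective):
    """Type 2 slowly changing dimension: close out the current
    version of a record and append the new version, preserving
    the full history instead of overwriting it."""
    for row in history:
        if row["key"] == key and row["valid_to"] is None:
            if row["attrs"] == new_attrs:
                return history  # no change, nothing to version
            row["valid_to"] = effective  # close the old version
    history.append({"key": key, "attrs": new_attrs,
                    "valid_from": effective, "valid_to": None})
    return history

history = []
apply_scd2(history, "ACME", {"rating": "AA"}, date(2023, 1, 1))
apply_scd2(history, "ACME", {"rating": "A"}, date(2024, 3, 1))
# history now holds two versions: "AA" (closed 2024-03-01) and "A" (current)
```

The point of the pattern is that any report can be reproduced as of a given date by filtering on the validity window, instead of silently using whichever value happened to be current.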

Data Processing – Under the hood, the data you use to inform critical decisions has gone through many complex processes to land on your screen. Every data pipeline is different, but often includes a set of common processes (figure 2): collection, ingestion, transformation, loading, and consumption.

If there are any gaps in workflow, tooling, or data engineering talent (or capacity), then it is likely that your data pipelines will be producing bad data. This is often seen with new data integrations and data migrations.
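The common pipeline stages named above can be sketched as small, composable steps. This is a toy illustration of where gaps let bad data through, not a production design – the source data and rejection rule are invented:

```python
def collect():
    # Collection: pull raw records from a source (stubbed here).
    return ["AAPL,189.50", "MSFT,420.00", "BAD ROW"]

def ingest(raw):
    # Ingestion: parse into structured records, tracking rejects
    # explicitly rather than dropping them silently.
    rows, rejects = [], []
    for line in raw:
        parts = line.split(",")
        if len(parts) == 2:
            rows.append({"symbol": parts[0], "price": parts[1]})
        else:
            rejects.append(line)
    return rows, rejects

def transform(rows):
    # Transformation: cast types and normalize values.
    return [{"symbol": r["symbol"].upper(), "price": float(r["price"])}
            for r in rows]

def load(rows, store):
    # Loading: write into the serving store (a dict here).
    for r in rows:
        store[r["symbol"]] = r["price"]
    return store

store = {}
rows, rejects = ingest(collect())
load(transform(rows), store)
# A gap at any stage (e.g. ignoring `rejects`) is where bad data enters.
```

Each stage boundary is a place where workflow, tooling, or engineering-capacity gaps show up: the `rejects` list above, for example, only protects you if someone actually monitors it.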

Additionally, the data scientists you hire get stuck dealing with these problems, spending much of their time verifying, cleaning, correcting and wrangling data. Not only are they unable to fix the underlying data issues, they are also prevented from generating the valuable insights and predictions they were hired for in the first place.

Costs of bad data: impact to your bottom line

Bad data can result in tangible costs to your firm’s bottom line. Where can you see these costs manifest? Here are some real-world examples that highlight the importance of data accuracy and control in financial institutions:

  • Financial performance – a single spreadsheet error cost JPMorgan $6.2 billion due to inaccurate risk models built on faulty data.
  • Operational performance – Citigroup in 2020 accidentally wired $900 million to a group of lenders for the cosmetics company Revlon.
  • Regulatory Compliance – GDPR fines exceeding €4 billion serve as a stark reminder: bad data doesn’t just violate regulations, it’s a ticking time bomb that can explode into hefty financial penalties.

  • Credibility and reputation – data leaks aren’t just breaches; they damage trust. Can you afford to lose key customers over preventable data errors? Poor data maintenance can lead to sensitive data leaks or financial loss for a customer whose investments your firm is managing. Both scenarios can dent the credibility and reputation of the firms responsible, as well as lead to customer dissatisfaction and attrition.

Conclusion

Clean, accurate and timely data is critical to the future of every organization. It can be both the fuel for your journey and the iceberg that sinks your ship. We can see the cost of bad data manifest across organizations. While identifying and measuring the impact of bad data remains critical, delaying taking steps to address the root cause is no longer a risk firms can afford to take.
