Data Matters

Jun 9, 2025 | Risk Report

In our business—measuring financial risk in all its glorious variations—data is so absolutely foundational that it does not occur to us to imagine a world without it.

But if we were to entertain that thought experiment, a world without data would be like a mind without memory, a rhyme without reason, a sprinkle without ice cream. Our brains are (literally) wired to recognize patterns—data—and use them to predict what comes next. Data is an integral part of our survival. If we have no knowledge of tigers eating people, how will we know not to keep them as pets?

Or perhaps more relevant, if we don’t know that global temperatures keep rising, how will we realize that we should probably do something to curb it?

This week, the FT ran an opinion piece about the effort to rescue data that is at risk of disappearing. The article discusses the Data Rescue Project, which serves as a clearinghouse for data preservation efforts and access points for at-risk public US data.

As government agencies are made more efficient and budget spending is cut, people who collect and maintain data are no longer employed, and the data itself might be lost.

And this is not just any old data but data from the Centers for Disease Control and Prevention, the Consumer Financial Protection Bureau, the National Hurricane Center, and the like.

While imperative, data in itself is not enough. It also has to be complete and trustworthy. An ongoing concern in the UK is that the very data used to gauge the economic climate, and on which policy is based, might not be accurate. This week, the Office for National Statistics (ONS) had to issue a correction to the official inflation number for April 2025, reducing it from 3.5% to 3.4%. You only have to read recent Risk Reports to see that inflation is pretty much the Queen of Economic Indicators, rivaled only by the unemployment rate, with which the ONS has coincidentally also had issues this year.

As anyone who has taken an introductory course in econometrics will know, statistics are…hmm…widely interpretable, and an error of 0.1 percentage points can seem insignificant. However, much of our trust rests on the consistent methodology with which the numbers are calculated. Repeated mishaps do not inspire confidence in the reliability of the process.

Nowhere is that more apparent than with AI. Remember when we were outraged that AI was perfectly fine with us eating rocks and putting glue on pizza? These hallucinations were (hopefully) grave enough for us to know not to trust them, but what if the answer is not so glaringly wrong? I was at the #SASInnovateOnTour conference this week, and the message there was, yes, you have to have excellent (AI) tools, but they are worth very little if the data you apply them to is not sound.

Data matters!


Regitze Ladekarl, FRM, is FRG’s Director of Company Intelligence. She has 25-plus years of experience where finance meets technology.

This article is part of the FRG Risk Report, published weekly on the FRG blog. To read other entries of the Risk Report, visit frgrisk.com/category/risk-report/.