The study—by climatologist John R. Christy—does something refreshingly simple: it looks at actual observed temperature extremes across the United States from 1899 to 2025.
No modelling.
No sweeping global averages.
Just raw, station-based data.
And the results? They contradict what we’re constantly told.
What the Study Actually Did
The paper (titled “Declines in hot and cold daily temperature extremes in the conterminous US”) analysed:
Daily maximum temperatures in summer
Daily minimum temperatures in winter
Covering over a century of observations (1899–2025)
Using real station data, not heavily adjusted or homogenised datasets (newswise.com)
In other words, this is about temperature extremes—the events people actually feel—not abstract averages.
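To make that concrete, here is a minimal sketch of how annual extremes can be pulled from daily station records. Everything here is illustrative: the file name and the column names (date, tmax, tmin) are hypothetical placeholders, not the paper's actual data or pipeline.

    import pandas as pd

    # Hypothetical daily station file with columns: date, tmax, tmin (deg F).
    df = pd.read_csv("station_daily.csv", parse_dates=["date"])
    df["year"] = df["date"].dt.year
    df["month"] = df["date"].dt.month

    # Hottest summer day each year: annual max of daily maximums (Jun-Aug).
    summer = df[df["month"].isin([6, 7, 8])]
    annual_hot = summer.groupby("year")["tmax"].max()

    # Coldest winter night each year: annual min of daily minimums (Dec-Feb).
    # Simplification: December is grouped with the same calendar year; a real
    # analysis would assign it to the following winter season.
    winter = df[df["month"].isin([12, 1, 2])]
    annual_cold = winter.groupby("year")["tmin"].min()

    print(annual_hot.tail())
    print(annual_cold.tail())

Series like these, one value per station per year, are the raw material a study of extremes works with.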
The Key Finding: Extremes Were Worse in the Past
The headline result is striking:
The most extreme heat events in the US occurred in the early 20th century, particularly the 1930s
Both hot and cold extremes have generally declined over time
The overall pattern shows a moderation, not escalation, of temperature extremes
Yes, you read that correctly.
According to this dataset, the United States experienced more intense temperature swings decades ago than it does today.
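What does "declined over time" mean in practice? Usually nothing more exotic than a negative slope fitted to the annual extreme series. A toy sketch with synthetic data (none of these numbers come from the study) shows the idea:

    import numpy as np

    # Synthetic annual hottest-day series (deg F), one value per year.
    rng = np.random.default_rng(1)
    years = np.arange(1899, 2026)
    hottest = 100 + rng.normal(0, 2, years.size)
    hottest[(years >= 1930) & (years < 1940)] += 4  # mimic a 1930s-style spike

    # A "decline in extremes" is a negative fitted slope, here in deg F/century.
    slope = np.polyfit(years, hottest, 1)[0]
    print(f"Trend: {slope * 100:+.2f} deg F per century")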
The 1930s: America’s Real Heat Crisis
If you want a period that truly tested the limits of heat in the United States, look no further than the Dust Bowl era.
That decade saw:
Record-breaking heatwaves
Widespread agricultural collapse
Extreme drought conditions
And—crucially—these events still dominate many all-time temperature records today.
So Why Does the Narrative Feel So Different?
Here’s where things get interesting—and controversial.
The paper deliberately avoids heavy data “adjustments” and instead relies on observed station data. That matters because:
Many global datasets use homogenisation techniques to adjust historical records (a toy illustration of the idea appears at the end of this section)
Urbanisation can introduce heat biases over time (more concrete, less vegetation)
Modern reporting focuses heavily on averages, not extremes
This study flips that focus and asks a simple question:
What do the raw extremes actually show?
And the answer is: less volatility, not more.
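On the homogenisation point raised above: the standard idea is to compare a station against its neighbours and correct persistent jumps that have no climatic cause (a station move, a new instrument). Here is a deliberately crude toy version with synthetic data; operational algorithms, such as NOAA's pairwise homogenisation, are far more sophisticated:

    import numpy as np

    rng = np.random.default_rng(0)
    years = np.arange(1900, 2000)

    # Shared regional climate signal plus station-level noise.
    climate = rng.normal(0.0, 0.5, years.size)
    neighbours = climate + rng.normal(0.0, 0.2, years.size)
    candidate = climate + rng.normal(0.0, 0.2, years.size)
    candidate[years >= 1950] += 1.0  # artificial non-climatic break in 1950

    # A persistent step in the candidate-minus-neighbours difference series
    # flags the break; find the split that maximises the mean shift.
    diff = candidate - neighbours
    shifts = [abs(diff[:i].mean() - diff[i:].mean())
              for i in range(10, diff.size - 10)]
    print("Estimated break year:", years[10 + int(np.argmax(shifts))])  # ~1950

Whether, and how aggressively, such corrections should be applied is exactly where this paper parts ways with the mainstream datasets.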
But Let’s Be Clear… This Isn’t the Whole Story
Before anyone jumps to conclusions, it’s important to keep perspective.
This paper:
Focuses on the United States only, not global temperatures
Examines extremes, not long-term average warming trends
Uses a specific methodological approach that differs from many mainstream datasets
So while it challenges claims of rising temperature volatility in the US, it doesn't overturn the broader case for climate change. What it does suggest is that sweeping claims about climate extremes deserve careful scrutiny.
Why This Matters
What this paper really exposes is something deeper:
A growing disconnect between what people are told and what specific datasets actually show.
Climate science is complex. But public messaging often isn’t.
And when a peer-reviewed paper suggests that the worst heat extremes occurred nearly a century ago, it raises a legitimate question:
Are we getting the full picture—or just the most convenient version of it?
The Bottom Line
The new study doesn’t deny climate change.
But it does challenge a commonly repeated claim:
That recent years represent an unprecedented explosion in extreme heat—at least in the United States.
According to this research, the truth is more nuanced:
The past—especially the 1930s—was more extreme than many realise
And today’s climate is more stable in terms of extremes than the headlines suggest
Which leaves us with a simple takeaway:
Before accepting sweeping claims about “unprecedented” conditions, it might be worth asking—unprecedented compared to what?