The Valencia flood1 provides a striking example of how difficult it remains to pin down the role of climate change in the most complex types of extreme weather. The authors present a high-resolution, process-based attribution analysis of this extreme rainfall event, one of the most challenging types of event in attribution science owing to its strongly convective nature. Their work not only demonstrates a clear climate-change signal in the event’s rainfall intensities, but also shows that such signals can be uncovered even in extremes that operate at the very edge of what models can resolve. Understanding how climate change influenced such an event, however, depends critically on the tools used to analyse it.

Why trust a numerical weather or climate model?

It’s a fair question, and one that modellers hear often. Change a few parameter settings and the model may produce different results—so why trust any of them? A useful starting point is to recognise that all models, no matter how sophisticated, are simplifications of the real world. As George Box famously put it: “All models are wrong, but some are useful.” The task is to ensure they are useful in scientifically meaningful ways.

The most straightforward way to assess a model’s usefulness is to compare its output with observations or observation-based datasets such as reanalyses. If a model can reproduce the key features of an event—its intensity, extent and timing—then we gain confidence that it represents the underlying processes realistically.
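As a toy illustration of this kind of evaluation (not taken from the study), the sketch below compares a hypothetical simulated precipitation field with a gridded observational analysis using a few simple summary metrics; all arrays are synthetic placeholders.

```python
import numpy as np

# Minimal sketch: compare a simulated 24-h precipitation field with an
# observational analysis on a common grid (both arrays are placeholders).
rng = np.random.default_rng(0)
obs = rng.gamma(shape=0.4, scale=30.0, size=(120, 120))     # stand-in for gridded observations (mm)
sim = np.clip(obs * 1.1 + rng.normal(0.0, 5.0, size=obs.shape), 0.0, None)  # stand-in for a model field (mm)

bias = sim.mean() - obs.mean()                              # mean bias (mm)
rmse = np.sqrt(np.mean((sim - obs) ** 2))                   # root-mean-square error (mm)
pattern_corr = np.corrcoef(sim.ravel(), obs.ravel())[0, 1]  # spatial pattern correlation

print(f"bias={bias:.1f} mm, rmse={rmse:.1f} mm, pattern r={pattern_corr:.2f}")
```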

This is precisely what Calvo-Sancho et al. demonstrate1. Their high-resolution regional model simulations faithfully reproduce the observed characteristics of the Valencia flash flood: where it rained, how intensely and across what area. This close match is essential. Without it, any attribution result would rest on uncertain ground.

It is important to keep in mind that not all weather extremes are created equal. Large-scale events such as heatwaves are comparatively straightforward to analyse, but convective precipitation extremes, like the Valencia storm, unfold on spatial and temporal scales where small shifts in moisture, instability or topography can dramatically alter outcomes. This makes them notoriously difficult for climate models to capture2, and it raises a central question: how can we robustly attribute climate-change influence in events defined by their microscale physics? These challenges have shaped how extreme-weather attribution has developed as a field.

Extreme weather attribution: from statistical scaling to event-specific insights

Attribution science emerged more than two decades ago with the aim of saying something meaningful about individual extreme events in a changing climate. Early attribution studies approached this challenge within a statistical framework, treating a given event as part of a class of similar events. This made it possible to quantify how human-driven climate change had altered the likelihood of such events occurring3—a major conceptual breakthrough that established event attribution as a rigorous scientific endeavour.

As attribution science matured, it became increasingly clear that probability-based approaches alone are not sufficient for addressing all questions related to extreme weather events. While these methods are powerful for quantifying changes in likelihood, individual events can deviate strongly from average behaviour, and their impacts often depend on local, nonlinear processes that are not captured when events are grouped into classes. Much of the remaining uncertainty in climate projections arises from how the atmosphere moves and circulates—the dynamics of weather systems—rather than from simple thermodynamic effects alone4. This highlights why physical understanding of atmospheric processes in individual events, rather than event classes, is essential alongside statistical methods, and has driven the development of event-specific approaches known as conditional attribution.

Conditional versus unconditional attribution

A widely used distinction in extreme weather attribution separates methods into unconditional and conditional approaches5. Most long-term climate simulations are free-running: they simulate plausible climates under known boundary conditions but generate their own weather. These models are typically used for unconditional attribution, which examines how climate change has shifted the probability of event classes6. This method answers questions such as: “How has the likelihood of extreme precipitation events in this region changed due to climate change?”
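As a minimal sketch of this probability-based logic (illustrative only, using synthetic ensembles rather than real model output), one can estimate the probability of exceeding an event-defining rainfall threshold in a "factual" and a "counterfactual" ensemble and report their ratio.

```python
import numpy as np

# Illustrative sketch (not any specific study's method): how much more likely is
# it to exceed a fixed rainfall threshold in a "factual" (present-day) ensemble
# than in a "counterfactual" (no anthropogenic warming) ensemble?
rng = np.random.default_rng(1)
counterfactual = rng.gumbel(loc=80.0, scale=25.0, size=10_000)  # synthetic daily rainfall maxima (mm)
factual = rng.gumbel(loc=90.0, scale=28.0, size=10_000)         # shifted and widened under warming

threshold = 180.0  # event-defining daily rainfall (mm), chosen purely for illustration
p0 = np.mean(counterfactual >= threshold)   # exceedance probability without warming
p1 = np.mean(factual >= threshold)          # exceedance probability with warming

print(f"P0={p0:.4f}, P1={p1:.4f}, probability ratio PR={p1 / p0:.1f}")
```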

Conditional approaches, also known as storyline methods, address a different question7. They keep the large-scale meteorological situation constant and ask how the event would behave in a different climate state—effectively replaying the same extreme weather event in a cooler world (without human-induced climate change) or an even warmer future world. This makes it possible to isolate how climate change influenced the storm’s internal processes and rainfall intensities, insights nearly inaccessible through unconditional approaches.
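One common way to construct such counterfactual boundary conditions is a pseudo-global-warming-style perturbation of the driving fields. The sketch below is a conceptual illustration under that assumption, with placeholder profiles and an assumed warming signal; it is not the specific configuration used by the authors.

```python
import numpy as np

# Conceptual sketch of a pseudo-global-warming-style counterfactual: keep the
# large-scale weather pattern, remove an assumed mean climate-change signal from
# the driving temperature field, and let moisture follow at fixed relative humidity.
t_factual = np.array([302.0, 296.0, 288.0, 275.0])   # placeholder boundary-condition temperatures (K)
rh_factual = np.array([0.80, 0.70, 0.55, 0.40])      # placeholder relative humidity per level
delta_t = np.array([1.6, 1.4, 1.2, 1.0])             # assumed warming attributable to climate change (K)

def saturation_vapour_pressure(t_kelvin):
    """Approximate saturation vapour pressure (hPa) from a Magnus-type formula."""
    t_c = t_kelvin - 273.15
    return 6.112 * np.exp(17.62 * t_c / (243.12 + t_c))

# Counterfactual: cool the column, keep relative humidity fixed, so that the
# available moisture scales back roughly with Clausius-Clapeyron.
t_counterfactual = t_factual - delta_t
e_factual = rh_factual * saturation_vapour_pressure(t_factual)
e_counterfactual = rh_factual * saturation_vapour_pressure(t_counterfactual)

print("moisture reduction per level (%):",
      np.round(100 * (1 - e_counterfactual / e_factual), 1))
```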

What the Valencia storyline reveals

In the Valencia flood example1, the research question concerns a specific event rather than an event class. The storyline framework asks: “If this exact storm had occurred in a world without climate change, would the outcome have been the same?”

Using a conditional attribution approach, the authors reproduce the exact event and compare its behaviour under present-day conditions and under a climate representative of the period before substantial human-induced warming, which is usually referred to as pre-industrial climate conditions. This allows them to isolate how warming influenced the event itself, rather than its likelihood.

The strength of the storyline method lies in its ability to go beyond rainfall totals and probe the physical mechanisms behind the impacts. The authors trace how warming altered evaporation, moisture transport, diabatic heating and updraft strength, showing that, owing to nonlinear storm dynamics, the resulting precipitation increase exceeded the ~7% per degree of warming expected from Clausius–Clapeyron scaling1.
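For reference, the quoted ~7% per degree of warming follows from the Clausius–Clapeyron relation for saturation vapour pressure, evaluated here with standard values of the latent heat of vaporisation and the water-vapour gas constant near typical surface temperatures:

```latex
\frac{1}{e_s}\frac{\mathrm{d}e_s}{\mathrm{d}T} = \frac{L_v}{R_v T^2}
\approx \frac{2.5\times10^{6}\ \mathrm{J\,kg^{-1}}}
            {461\ \mathrm{J\,kg^{-1}\,K^{-1}} \times (288\ \mathrm{K})^2}
\approx 0.065\ \mathrm{K^{-1}} \approx 7\%\ \text{per degree of warming}.
```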

Toward a more integrated attribution science

The Valencia study also highlights a broader point about the future of attribution science. Extreme-weather attribution has long been shaped by the two aforementioned approaches: statistical approaches that quantify changes in the probability of events (unconditional attribution), and physically based, storyline approaches that examine how climate change influences the processes within a specific event (conditional attribution). These methods have sometimes been viewed as competing, yet they answer fundamentally different questions—and both are essential.

For highly complex extremes such as convective storms resulting in flash floods, conditional attribution provides insights that cannot be gained from statistical methods alone. It reveals how warming alters the storm-scale dynamics, moisture pathways and local feedbacks that ultimately determine the severity of impacts. But statistical approaches offer something equally valuable: a measure of how often such hazardous conditions are expected to occur in the future.

The next step for the field is not to choose between these methods, but to integrate them8. Impact assessments—whether for urban flooding, agriculture, health or infrastructure—require both the physical understanding provided by storylines and the probability-based perspective offered by traditional attribution, in combination with impact-focused research. Studies have begun moving in this direction: work linking extreme weather attribution to soybean harvest losses in South America9, or process-based analyses of flooding in New York following Hurricane Sandy10, demonstrate how such connections can be made.

As climate change accelerates, the most useful attribution science will be that which bridges approaches—translating both physical understanding and probability shifts into information that matters for society. The Valencia study1 makes clear that projected intensification of extreme rainfall is no longer a distant scenario; it is already becoming visible in present-day events across the Western Mediterranean (Fig. 1). To turn this knowledge into resilience, attribution studies must increasingly connect with impact analyses, providing the evidence base needed for designing and prioritising adaptation strategies.

Fig. 1: Satellite imagery showing before and after the extreme rainfall and flood events in Valencia, Spain.

Source: NASA.