How much does climate change actually cost? It’s one of the more important questions, and also one of the more difficult ones. The difficulty often hinges on one word: attribution.

It’s one thing for science to conclude that global warming generally leads to more intense hurricanes, droughts, wildfires, and all sorts of other extreme weather events. There’s little doubt climate change will fuel the upcoming wildfire season in California and a busy summer for Atlantic hurricanes. But it’s another thing to link any single storm to climate change and calculate the cost as a direct consequence.

Attribution science had its coming-out moment with the 2003 European heatwave, which killed over 70,000 people. It’s relatively easy to read temperatures off a thermometer and conclude that the summer was probably Europe’s hottest since at least 1500. But how much of it was caused by climate change? A 2004 Nature study concluded with over 90% confidence that human influence had at least doubled the risk of a heatwave like 2003’s, one that crossed a temperature threshold not met in any other year since 1851. That paper, co-authored by Peter Stott, who heads the Climate Monitoring and Attribution team at the U.K. Met Office, is now one of the most cited attribution studies.

Stott’s team inside the British forecasting agency has been churning out rapid-response analyses for any number of extreme climatic events, placing percentages on the likelihood that any one drought, flood, or wildfire was caused by climate change. The ever-growing human influence on the climate, coupled with more data and more sophisticated statistical techniques, now allows for crisper conclusions. The 2010 Moscow heatwave? Climate change increased the probability of such record heat fivefold, which translates into an 80% probability that the heatwave would not have occurred without climate change.
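That translation rests on a standard quantity in attribution studies, the fraction of attributable risk (FAR). As a back-of-the-envelope sketch: if climate change multiplies an event’s probability by a risk ratio RR, the share of that risk attributable to human influence is

\[
\mathrm{FAR} = 1 - \frac{1}{RR}, \qquad RR = 5 \;\Rightarrow\; \mathrm{FAR} = 1 - \frac{1}{5} = 0.8,
\]

or 80%. The same arithmetic applied to the 2003 heatwave’s doubled risk (RR = 2) would attribute half the risk to warming.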