A University of Wisconsin professor believes she’s come up with a way to remove some of the uncertainty in modeling, and she thinks her findings will help scientists and modelers more accurately take into account the impacts of climate change.
Galen McKinley, a professor in the University of Wisconsin-Madison’s department of atmospheric and oceanic sciences, is tackling the problem of variability – variations in climate measured over a period of decades – in long-range modeling.
She conducted an extensive study and published her findings in a paper called “Timescales for detection of trends in the ocean carbon sink.” The paper appeared in Nature late last month.
The paper examines the relationship between carbon dioxide emissions and climate change and how well the “ocean-carbon sink” soaks up carbon emissions.
To get to the bottom of this she needed to separate variability from climate forcing – defined as any influence on climate that originates from outside the climate system.
It’s part of the still-ongoing debate over how much climate change is influencing severe weather. There’s never been a clear way of separating the effects of variability from the effects of climate change – was Superstorm Sandy due to climate change, variability, or both?
The research produced by McKinley, who is also an affiliate of the Center for Climatic Research at UW–Madison’s Nelson Institute for Environmental Studies, may give scientists an idea of how much variability there is, as well as a better, more scientific way of looking at it.
“The goal is to highlight how very large the natural variability is and to encourage all scientists and analysts to find ways to account for it,” McKinley said.
Tom Larsen, chief product architect at catastrophe modeling firm CoreLogic, believes McKinley’s research may eventually lead to a better way of dealing with the uncertainty that makes long-range severe weather modeling a risky business.
“Currently, the science that we use to explain (variability) away is that it was ‘random,’” said Larsen, who has been calling for more predictive modeling for years. Larsen last year said there has been an increasing number of requests to offer clients climate-related data in models.
A better understanding of variability may have, for example, told us more about the current El Niño, which so far has turned out to be a bit more lackluster than anticipated, he said.
“We have an El Niño this year that (indicated) that this was going to be a torrential year,” Larsen said. “The season’s not over, but it’s looking to be kind of average.”
McKinley’s research veered away from the traditional method of running several different climate models once each to make predictions.
Instead, she ran a single model, called the Community Earth System Model–Large Ensemble, 30 times with minuscule changes in initial ocean surface temperatures.
Common sense would dictate that using several models and looking at them as a whole would yield better results.
But it turns out there’s a big advantage in using a single model. It offers a good look at the impact of variability on climate.
With multiple models, the challenge is to separate differences in model structure from climate variability. A single model removes the question of differing structures and leaves only the effects of “anthropogenic forcing and variability,” McKinley said.
In the single-model approach, there is “a clarity of separation,” she added.
The approach also requires enormous computing resources.
In McKinley’s research she started her model on Jan. 1, 1920, and worked in very small perturbations of 10⁻¹⁴ (0.00000000000001) degrees Celsius in ocean surface temperatures.
The perturbations (deviations) did not change the mean global temperature; the ocean simply got a tiny amount warmer in some places and cooler in others. The perturbations were applied randomly in space across the globe, and each perturbed run produced a member of what McKinley called the “ensembles.”
The size of each perturbation is far smaller than measurement error, meaning that the perturbations are smaller than the precision with which we even know the temperature of the global atmosphere today, according to McKinley.
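The setup described above can be sketched in a few lines of code. This is a toy illustration, not McKinley's actual model configuration: the grid size is hypothetical, and a real climate model would use area-weighted means. It just shows what a spatially random, zero-global-mean perturbation field of order 10⁻¹⁴ degrees looks like.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def perturbation_field(n_lat=96, n_lon=144, scale=1e-14):
    """Random sea-surface-temperature perturbations (degrees C) on a
    hypothetical lat-lon grid, with the field's mean removed so the
    global-average temperature is unchanged."""
    field = rng.standard_normal((n_lat, n_lon)) * scale
    return field - field.mean()  # warmer in some places, cooler in others

p = perturbation_field()
# Global mean change is essentially zero, and every individual
# perturbation is far below the precision of real measurements.
print(abs(p.mean()), np.abs(p).max())
```

Subtracting the field mean is one simple way to honor the "did not change the mean global temperature" constraint; the spatial randomness is what lets each run diverge.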
Looking at the differences between each ensemble she got a clearer picture of climate variability.
“That spread across ensemble members is an indicator of the internal variability,” McKinley said.
In layman’s terms she described it “like the butterfly effect,” the concept that small causes can have large effects on weather – the term comes from the metaphor of a hurricane having been influenced by the flapping of the wings of a distant butterfly at an earlier time.
Using another metaphor, McKinley explained the climate system like a bell.
If you hit the bell once it starts to ring. If the bell is already ringing and you hit it, the bell will slightly change its ringing. If the bell is already ringing at a different rate when you hit it, there will be yet another different ringing response.
“You can think of the climate system as all these different bells ringing,” she said.
Those different rings are the variability.
“I think it helps us understand from a science perspective really how variable the system can be and how sensitive the system can be to small perturbations,” she said of her research.
Those 10⁻¹⁴-degree changes gave “surprising” results. By 1930 McKinley was already seeing noticeable changes in the climate.
The slight changes in each ensemble had profound effects on the decadal oscillation – the periodically recurring pattern of ocean and atmospheric climate variability – as well as El Niño and La Niña weather patterns.
“So if the temperature in 1920 had been a little warmer, we might have had El Niño three years ago, and if it were a little cooler we might have had El Niño last year,” McKinley said.
Her research tells her that variability is a bigger part of what we’re observing, and that trends due to climate forcing are out there, but they are being masked by the variability.
“Our goal is to really understand the large-scale effects of anthropogenic warming and separate it from the internal variability,” she said.
She believes her research could lead the modeling community to using their models “in a more intelligent way.”
Larsen believes McKinley’s research will help in figuring out how the effects of climate change should be factored into models.
Currently, most models that attempt to factor in climate change are using temperature changes suggested by the Intergovernmental Panel on Climate Change, which Larsen said are “very coarse.”
“They’re so coarse that there’s not a lot you can do,” he added.
In other words, it’s clear that climate change may someday create weather patterns we should be scared about, “but I can’t really tell you what you should be scared about,” Larsen said.
Sea levels are expected to rise, but not everywhere. And just where that water will go, and how a changing climate will impact severe weather, is far from clear, he said.
“It’s this type of work and the follow-up that will hopefully ensue that will start giving us better answers,” Larsen said.