From The Durango Telegraph (Jonathan Romeo):
Before the holidays, snowpack in Southwest Colorado was hovering well below the historic average for this time of year, heightening concerns the winter was not off to a strong start.
Fortunately, a weeklong series of snowstorms between Christmas and New Year’s dumped several feet on the San Juan Mountains, causing snowpack averages to jump from around 80% to more than 140%, according to data from the Natural Resources Conservation Service.
Unfortunately, these numbers don’t tell the whole story of Colorado’s snowpack. This year, Colorado, as well as the entire West, is basing snowpack averages on updated numbers that reflect the drier years the West has been experiencing because of climate change-driven drought.
True, the updated snowpack averages are important for researchers and hydrologists in their work to better understand current climate conditions. The problem for some, however, is that by continually calling degraded conditions from climate change “normal,” both scientists and the public adjust their sense of normal to a situation that is anything but.
“It mutes the effects of climate change because we’re constantly shifting the baseline to reflect the new normal,” Michael Remke, a lecturer of biology at Fort Lewis College, said. “If we become normalized to it being dry, and then we have a dry year reported as 120% of normal, then people are like, ‘Great, a wet year.’ But the reality is we’re trending in a dry direction.”
The Natural Resources Conservation Service calculates “historic averages” of snowpack based on a 30-year period of record (mostly through SNOTEL stations in the high country). These averages are updated every 10 years to reflect the most current conditions.
For the past 10 years, these averages were based on snowfall recorded from 1981-2010. But as of this October, the data set was updated to cover 1991-2020. Essentially, the data set swapped out the 1980s (considered a wet period) for the 2010s (a very dry period).
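The effect of that swap can be sketched in a few lines. A minimal illustration, using hypothetical snow-water equivalent (SWE) numbers chosen only to show how the same measurement reads as “more normal” against a drier 30-year baseline:

```python
# Sketch of "percent of normal" under two different 30-year baselines.
# All numbers are hypothetical illustrations, not NRCS data.

def percent_of_normal(measured_swe: float, baseline_mean_swe: float) -> float:
    """Measured snow-water equivalent as a percent of a 30-year mean."""
    return 100.0 * measured_swe / baseline_mean_swe

measured = 18.0          # inches of SWE measured today (hypothetical)
normal_1981_2010 = 20.0  # old baseline mean, includes the wetter 1980s (hypothetical)
normal_1991_2020 = 18.0  # new baseline mean, includes the drier 2010s (hypothetical)

print(percent_of_normal(measured, normal_1981_2010))  # 90.0  -> "below normal"
print(percent_of_normal(measured, normal_1991_2020))  # 100.0 -> "normal"
```

The measurement hasn’t changed; only the yardstick has. This is the mechanism behind the concern that a dry year can now be reported as a normal, or even above-normal, one.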
The updated numbers make sense for snow researchers and scientists working in related fields, said Joel Atwood, a hydrologist for NRCS’s Colorado Snow Survey. “It’s important to capture changes as we move forward. When you have 30-year intervals, you can capture some of those changes, and the last chunk of 30 years are more comparable to present conditions.”
One of the most important uses for the data is to monitor and predict runoff in the spring – a critical piece of information to gauge how much water may be available for municipalities, agriculture and other uses…
The problem, some believe, is that the new snowpack averages are shared through social media or the nightly news, and the full context and nuances of the data are not well understood by the public.
For Southwest Colorado, the new baselines from 1991-2020 sit below the previous standards from 1981-2010. This early in the winter, there’s no great disparity between the two. But that could change come peak season in April, when the normal snowpack on Red Mountain Pass is an inch of snow-water equivalent below the previous standard. In other areas, the changes are more drastic…
Snowpack isn’t the only data set updated every 10 years. Various agencies, like the USGS and NOAA, also update temperature and precipitation normals every decade. But the impacts of climate change are increasingly complicating how to interpret and use all this information…
A shifting baseline
This issue is no new phenomenon – in fact, it has a name: shifting baseline syndrome.
Also known as SBS, the term was coined in 1995 by a fisheries scientist who, while studying sustainable catch levels for commercial fishing, found that each generation of fishery scientists used current conditions as its baseline, failing to account for the degradation caused by past over-fishing.
“The shifting baseline syndrome is the situation in which, over time, knowledge is lost about the state of the natural world, because people don’t perceive changes that are actually taking place,” Dr. E.J. Milner-Gulland, who authored a paper on SBS, said in 2009. “In this way, people’s perceptions of change are out of kilter with the actual changes taking place in the environment.”
The new snowpack averages are a relatively small piece of the puzzle, but they do warp our reference point, Remke said, because the newer data reflects drier years that are now the new standard. “This is an important issue to be aware of,” he said.
The situation grows ever more complicated when taking into account the tools and technology for measuring snowpack (there’s a lot of variability in numbers, methods, etc.). And with only 700 or so SNOTEL sites across the West, mostly installed in the 1980s, data is limited.
But, according to a report in Forbes, some climate scientists are urging agencies to stick with a fixed 30-year baseline, rather than update the standards every decade and reinforce shifting baselines. And a report on SBS said it could be combated by environmental restoration, increased data collection, education and, quite simply, having more people interact with nature.
All this is important to think about, Remke said: Basing snowpack data on the past 30 years will likely yield more accurate predictions of next year’s snowpack, and may make us feel better about this year’s numbers. But it will also obscure the seriousness of the drought in which we currently find ourselves.