Since 1970, Americans, and later people across the globe, have celebrated Earth Day on April 22. It’s a day dedicated annually to civic action, volunteerism and other activities that support and promote environmental protection and green living.
This year, Fresh Water News is using Earth Day as an opportunity to highlight a handful of Colorado projects and businesses that are moving the needle on water conservation and sustainability. Here are their stories.
Booze that doesn’t “destroy the planet”
In 2010, Connie Baker attended distilling school somewhat on a whim — she’d always loved vodka and thought learning more about how it’s made would be a fun week-long vacation.
In the end, though, Baker fell in love with distilling and, along with her husband, Carey Shanks, began planning to open a new distillery not far from their home in Carbondale, Colo.
But after touring distilleries around the country for inspiration, they began to fully understand just how resource-intensive — and wasteful — distilling as an industry often was. Traditional distilleries send tens of thousands of gallons of clean water down the drain during the production process — water that could easily be reused, if only they had the right setup.
“I love vodka, but I don’t want to destroy the planet to make it,” said Baker.
Instead of accepting the status quo, Baker and Shanks decided to design and build their own sustainable distillery from the ground up. Their crown jewel? A custom water energy thermal system, WETS for short, that recaptures 100 percent of the water and energy used during the distillation process.
They officially opened Marble Distilling in 2015. Since then, their WETS system has saved more than four million gallons of water and 1.8 billion BTUs of energy every year. The recaptured energy is enough to heat and cool the distillery, which includes a five-room boutique hotel on the second floor, and to power much of the distilling process.
The distillery’s water bill is regularly less than $100 a month. While most distilleries use the equivalent of 100 bottles of water to produce one bottle of vodka, Marble uses the equivalent of just one bottle of water per bottle of vodka. (They also make bourbon, whiskey and liqueurs.)
“The only water we’re using for the spirit is what’s in the bottle,” Baker said.
Baker and Shanks also freely share information about their WETS system and other sustainable elements with anyone and everyone who’s curious, including and especially other distilleries.
“We don’t want to own this information,” Baker said. “We want to be leaders in the industry for change. We have proven over the course of six years that it absolutely can be done. It makes sense not only from a sustainability standpoint but from an economic standpoint. There’s no reason not to do it. It’s not any harder, so why wouldn’t you do it?”
Sustainability at 14,000 feet
The infrastructure atop the iconic 14,115-foot Pikes Peak is getting a refresh — and one that’s particularly friendly to water.
Construction crews are finishing up work on the new Pikes Peak Summit Complex, which includes a visitor center, a high-altitude research laboratory, and a municipal utility facility.
Visitors to the summit number upwards of 750,000 annually, and the previous facilities that welcomed them at the top were deteriorating. Replacing them created an opportunity to do things differently. The 38,000-square-foot complex, which is set to open around Memorial Day, aims to be net-zero for energy, waste and water consumption; it also hopes to become the first Living Building Challenge-certified project in Colorado, a rigorous green building standard created by the International Living Future Institute.
The project, which is expected to cost $60 million to $65 million when complete, incorporates a number of water-saving and conservation features, including a pioneering on-site wastewater treatment plant, a vacuum toilet system, low-flow fixtures, and a rainwater harvest system for potential future use.
Even with increased visitor numbers, the new complex is expected to use 40 to 50 percent less water than the 1960s-era Summit House it will replace. That water has to be hauled up the mountain, a 40-mile round trip.
In 2018, crews hauled 600,000 gallons of fresh water to the summit, according to Jack Glavan, manager of Pikes Peak – America’s Mountain, a self-supporting enterprise of the City of Colorado Springs. (Colorado Springs operates the Pikes Peak Recreation Corridor, which includes the Pikes Peak Highway and related facilities, through a special use permit granted by the U.S. Forest Service, which owns the land.) The new facility should cut that down to between 300,000 and 350,000 gallons a year, Glavan said.
“In the past, we used roughly a gallon to 1.2 gallons per person, and with this water system, we’re figuring we’re going to cut that down to 0.4 to 0.5 gallons per person,” said Glavan.
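As a rough back-of-the-envelope check (our arithmetic, not the project’s), the per-person figures line up with the hauling estimates above: multiplying them by roughly 750,000 annual visitors lands close to the projected 300,000 to 350,000 gallons a year.

```python
# Rough consistency check: projected per-person water use times annual
# visitation should land near the 300,000-350,000 gallons expected to be
# hauled to the summit each year. (Illustrative arithmetic only.)
visitors = 750_000  # approximate annual visitors to the summit

for gallons_per_person in (0.4, 0.5):
    total = visitors * gallons_per_person
    print(f"{gallons_per_person} gal/person -> {total:,.0f} gallons/year")
```

The upper end comes out a bit above the stated range, consistent with the 750,000 figure being described as an upper bound on visitation.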
Similarly, the water-savvy upgrades will allow the facility to halve the amount of wastewater it hauls down to the Las Vegas Street Wastewater Treatment Plant, which requires an 80-mile round trip.
On top of the water efficiencies, the upgrades will also reduce vehicle trips and associated emissions. Freshwater trips are expected to drop from 127 to 72 per year, and wastewater trips from 174 to 69.
The building also aims to be one of the first in Colorado to reuse water that’s been treated on-site. But for final approval from the state, complex managers must first prove that the wastewater system works, a process that will likely involve about a year of sampling, Glavan said. Assuming all goes according to plan, the facility will use reclaimed water for toilets and urinals.
All told, the facility’s leaders hope that these and many other sustainable design features — undertaken as part of the highest-altitude construction project in the United States, on top of the mountain that inspired the lyrics of “America the Beautiful” — encourage others to reduce their impact on the environment in whatever way possible.
“We’re proud to be doing it,” Glavan said. “It does cost a little bit more incrementally, but we are America’s mountain and we’re hoping we’re setting an example for everyone. If we can do it up here at 14,000 feet, people should be able to do it at lower altitudes.”
While working as a hotel engineer at the ART Hotel in Denver several years ago, Mac Marsh noticed that whenever he responded to a maintenance request in the kitchen, the faucet was almost always running. But why?
After some investigating, he found out that running cold water over frozen food was the industry standard when it wasn’t possible to defrost it in the refrigerator. These food-safety defrosting guidelines, set by the U.S. Department of Agriculture’s Food Safety and Inspection Service and followed by local health officials, are intended to keep restaurants’ guests safe and healthy, since keeping food cool as it defrosts helps prevent the growth of harmful bacteria and pathogens.
But it takes one hour to defrost one pound of meat under cold water, which equates to about 150 gallons of water per pound. When he began to think about all the restaurants and all the food they defrosted on a daily basis, Marsh realized he had to act.
He invented a novel solution to the problem: a device that can recirculate cold water in a sink or basin. His Boss Defrost device, which plugs into a power outlet, is also equipped with a thermometer, which helps users ensure the water stays below the recommended 71 degrees Fahrenheit. The Denver company began manufacturing the devices, now used in more than 25 states, in January 2020.
The company’s leaders say Boss Defrost can reduce a restaurant’s defrosting water use to about 450 gallons per month on average, a sharp decline from the approximately 32,000 gallons that an average commercial kitchen uses to defrost food each month.
“This water waste is food service’s skeleton in the closet,” said Diana López Starkus, who’s a partner in the business along with her husband, Chris Starkus, an award-winning Denver chef and farmer. “It happens all along the food chain, from fast food to fine dining, K-12 schools, college campuses, hospitals, hospice and state and federal buildings.”
Though the pandemic — and ensuing restaurant shutdowns and capacity limits — slowed down the company’s growth, it also gave the company an opportunity to expand into grocery meat and seafood departments.
Sales picked up again when restaurants began to reopen, since their owners were looking for every possible way to save money as they recovered from the pandemic. Starkus said the device generally pays for itself in water bill savings in one to three months.
“We like to say it’s a win-win-win,” Starkus said. “Good for the earth, good for your wallet and the easiest sustainability measure to initiate in 2021. We’re passionate about empowering ourselves and others to create positive change toward a better future. That’s why we call it Boss Defrost, because every prep cook in the nation can become an environmental boss, someone that’s working optimally, respecting the resources at their fingertips and staying financially sound.”
Sarah Kuta is a freelance writer based in Longmont, Colorado. She can be reached at email@example.com.
Since 2017, River Network has worked to increase the number and quality of Stream Management Plans in Colorado. Stream Management Plans, or SMPs, were developed as a result of Colorado’s Water Plan, adopted in 2015, which set goals and measurable objectives to map out the future of water management in the state. One of these objectives is that 80% of locally prioritized streams have an SMP by 2030. River Network is helping watershed coalitions meet this objective by developing guidance on best practices, facilitating a peer learning network, and providing direct support to local coalitions throughout Colorado.
SMPs are data-driven assessments of river health that help communities determine how to protect or enhance environmental and recreational assets in their watershed. To create one, stakeholders convene to evaluate the health of their local river through an assessment of biological, hydrological, geomorphological and other data. This site-specific information is used to assess the flows, water quality, habitat, and other physical conditions needed to support collaboratively identified environmental and/or recreational values. To date, 26 SMPs have been completed or are underway. SMPs are as much about people and communities as they are about the functional health of the river, and community and stakeholder buy-in is seen as a critical aspect of a successful SMP.
As the second-largest economic sector and the largest consumer of water in Colorado, agriculture is a key stakeholder in SMPs. In the San Luis Valley, the Rio Grande Headwaters Restoration Project has done an incredible job of engaging local farmers and ranchers, many of whom have been farming and ranching there for generations, in its SMP and related projects. On a recent trip, River Network staff member Mikhaela Mullins had the opportunity to hear directly from these ranchers about the deep connection they have with the land and the Rio Grande River.
Local ranchers Greg Higel, Rick Davie, Thad Elliott and Kyler Brown shared that stewardship of the land and water has always been important to them and their families. In recent years they had wanted to make improvements to their ditches, diversion structures and headgates but lacked the resources to do so. When the Rio Grande Headwaters Restoration Project approached them about partnering on infrastructure improvement projects, they were eager to work together. “The river needed help, and we needed to make sure we did that right,” says Greg Higel, Centennial Ditch superintendent. Through these partnerships, a number of ditches and related structures were updated. Over time, the ranchers have spent less time maintaining these structures and have seen water quality improve, wildlife return to their land, riparian plant diversity increase, and water quantity grow, resulting in a longer season of water access. The ranchers spoke about how working with the Rio Grande Headwaters Restoration Project and other conservation organizations has been a win-win-win for everyone involved in these multi-benefit projects.
In the future, River Network will continue to support watershed coalitions as they tackle important river planning and identify how that planning can benefit farmers and ranchers. River Network looks forward to continuing to shift the conversation between conservation and agricultural stakeholders by expanding the role of agricultural organizations, such as conservation districts, into more of a leadership role. Learn more about River Network’s work in Colorado in this video.
The water level of Lake Mead, the country’s largest reservoir, has dropped more than 130 feet since the beginning of 2000, when the lake’s surface lapped at the spillway gates on Hoover Dam.
Twenty-one years later, with the Colorado River consistently yielding less water as the climate has grown warmer and drier, the reservoir near Las Vegas sits at just 39% of capacity. And it’s approaching the threshold of a shortage for the first time since it was filled in the 1930s.
The latest projections from the federal government show the reservoir will soon fall 7 more feet to cross the trigger point for a shortage in 2022, forcing the largest mandatory water cutbacks yet in Arizona, Nevada and Mexico.
The river’s reservoirs are shrinking as the Southwest endures an especially severe bout of dryness within a two-decade drought intensified by climate change, one of the driest periods in centuries that shows no sign of letting up.
With a meager snowpack in the Rocky Mountains and the watershed extremely parched, this month’s estimates from the federal Bureau of Reclamation show Lake Mead could continue to decline through next year and into 2023, putting the Southwest on the brink of more severe shortages and larger water cuts.
“What really is starting to emerge is this really long pattern, that we’re in a megadrought in a lot of the western U.S.,” said Laura Condon, an assistant professor of hydrology and atmospheric sciences at the University of Arizona. “It’s kind of like a cumulative impact, that we’ve just been getting hotter and drier and hotter and drier.”
Many scientists describe the past two decades in the Colorado River Basin as a megadrought that’s being worsened by higher temperatures with climate change. While the Southwest has always cycled through wet and dry periods, some scientists suggest the word “drought” is no longer entirely adequate and that the Colorado River watershed is undergoing “aridification” driven by human-caused warming — a long-term trend of more intense dry spells that’s here for good and will complicate water management for generations to come.
Both Lake Mead and the upstream reservoir Lake Powell are dropping. Taken together, the country’s two largest reservoirs now hold the smallest quantity of water since 1965, when Powell was still filling behind the newly built Glen Canyon Dam.
The Colorado River has long been overallocated to supply farmlands and growing cities from Denver to Phoenix to Los Angeles. And the growing strains on the river suggest that Lake Mead, its sides coated with a whitish “bathtub ring” of minerals along its retreating shorelines, will continue to present challenges as the Southwest adapts to a shrinking source of water.
“There will still be ups and downs and we will have wetter and drier years going forward but overall warmer temperatures mean we should expect a drier basin with less water,” Condon said. “Warmer temperatures increase the amount of water plants use and decrease snowpack. Even if we get exactly the same quantity of precipitation, a warmer basin will produce less streamflow from that precipitation.”
Representatives of the seven states that depend on the river met at Hoover Dam in 2019 and signed a set of agreements, called the Drought Contingency Plan, laying out steps to reduce the risks of a damaging crash. Arizona and Nevada agreed to take the first cuts to help prop up Lake Mead, while California agreed to participate at lower shortage levels if the reservoir continues to drop.
The states’ water officials described the deal as a “bridge” agreement to temporarily lessen the risks and buy some time through 2026, by which time new rules for sharing shortages must be negotiated and adopted.
Under the deal, Arizona and Nevada have left some water in Lake Mead in 2020 and 2021. Those reductions are set to increase next year under the “Tier 1” shortage, which the federal government is expected to declare in August.
Arizona is in line for the largest cuts, which will reduce the Central Arizona Project’s water supply by nearly a third and shrink the amount flowing through the CAP Canal to farmlands in Pinal County. Nevada is also taking less water, and Mexico is contributing under a separate deal by leaving some of its supplies in Lake Mead.
“We have a plan to deal with these shortages,” said Tom Buschatzke, director of the Arizona Department of Water Resources. “We’ve known this was possible for a long time and have planned for it.”
He and other officials say the Drought Contingency Plan never guaranteed the region would escape a shortage, but that it has reduced the odds of Mead falling to critical lows and has pushed back the possibility of more severe shortages and larger cuts. Buschatzke said voluntary conservation measures by the states and Mexico since 2014, plus the initial mandatory cuts over the past two years, have left about 40 feet of conserved water in Lake Mead.
“We would already be in a Tier 2 shortage had that water not stayed in the lake,” Buschatzke said during a panel discussion hosted by the Arizona Capitol Times. “It’s what we can do to slow the reduction in Lake Mead and minimize the depth and length of the shortages.”
A warmer watershed, a shrinking river
Scientists have found that the Colorado River is sensitive to rising temperatures as the planet heats up with the burning of fossil fuels. In one study, scientists determined that about half the trend of decreasing runoff in the river’s Upper Basin since 2000 was the result of unprecedented warming.
In other research, scientists estimated the river could lose roughly one-fourth of its flow by 2050 as temperatures continue to rise. They projected that for each additional 1 degree C (1.8 degrees F) of warming, the river’s average flow is likely to drop by about 9%.
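That sensitivity figure works as a simple rule of thumb. The sketch below is our illustration, not the researchers’ model: it applies the roughly 9%-per-degree loss as a linear approximation to show how much warming corresponds to losing about a quarter of the river’s flow.

```python
# Illustrative only: treat the researchers' estimate (about 9% less average
# flow per 1 degree C of warming) as a simple linear rule of thumb.
def flow_remaining(warming_c, loss_per_degree=0.09):
    """Fraction of average flow remaining after `warming_c` degrees of warming."""
    return max(0.0, 1.0 - loss_per_degree * warming_c)

for warming_c in (1.0, 2.0, 2.8):
    lost_pct = (1.0 - flow_remaining(warming_c)) * 100
    print(f"{warming_c:.1f} C warmer -> roughly {lost_pct:.0f}% less flow")
```

Under this linear reading, a bit under 3°C of warming reproduces the roughly one-fourth loss projected for 2050.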
The past year has been especially harsh. Ultradry conditions intensified across much of the West, with extreme heat adding to the dryness throughout the Colorado River watershed. According to the National Weather Service, the past 12 months were the driest on record in Utah, Nevada, Arizona and New Mexico, and the fourth-driest in Colorado, where much of the river’s flow originates.
Lake Powell now stands just 36% full.
The reservoir typically gets a boost in the spring and summer as the river swells with runoff from melting snow. But this winter, the snowpack peaked at 88% of the long-term median and has since dropped to 71% of the median. The dry soils in the watershed are soaking up some of the melting snow like a sponge, leaving less water running into the Colorado and its tributaries.
The amount of water that will flow into Powell from April through July is now estimated at just 38% of average.
Water researchers Eric Kuhn and John Fleck said their analysis of the latest federal numbers points to some alarming possibilities. The two — who coauthored the book “Science Be Dammed: How Ignoring Inconvenient Science Drained the Colorado River” — wrote in separate blog posts that a careful reading of the data in the 24-month study, which only goes out to March 2023, shows the projections point to bigger troubles at Mead and Powell later that year.
Fleck wrote that the “most likely” scenario would put the level of Mead at an elevation around 1,035 feet at the end of September 2023, which would trigger larger cuts for Arizona, Nevada and Mexico, as well as California’s participation in reductions.
“I’m talking about the midpoint in a range of possible outcomes,” Fleck wrote. “A run of wet weather could make things substantially better. But a run of dry weather could make them worse.”
Kuhn wrote that the assumptions in the government study “do not fully capture the climate-change driven aridification of the Colorado River Basin.” He said the projections suggest Lake Powell could drop in 2023 to “a level that is troublingly close to the elevation at which Glen Canyon Dam could no longer generate hydropower.”
Across the West, snow has traditionally stored a vital portion of the water, gradually melting and releasing runoff in the spring and summer. But that’s changing with higher temperatures. Researchers from the University of California, Irvine, found in a study last year that the western U.S. has experienced longer and more intense “snow droughts” in the second half of the period from 1980 to 2018.
“The main issue is the snow drought everywhere in the entire West, including Arizona, Utah, California, Colorado,” said Amir AghaKouchak, a professor in UC Irvine’s Department of Earth System Science. “When the snow is below average, it means low-flow situations in summer, drier soil moisture. And drier soil moisture increases the chance of heat waves.”
The upshot, he said, is that “we have to prepare for a different hydrologic cycle, basically.”
Warm and dry in the headwaters
With higher temperatures, more snow has been melting earlier in the year. Scientists recently examined 40 years of data from snow monitoring sites across the western U.S. and Canada and found increasing winter snowmelt at a third of the sites.
Other researchers have discovered that the dry periods between rainstorms have grown longer on average across the western United States during the past 45 years. Scientists with the U.S. Department of Agriculture and the University of Arizona found this trend throughout the West in their study, but they saw the most extreme changes in the desert Southwest, where rainstorms have been happening much less frequently.
The average dry period between storms in the desert Southwest has gone from 31 days to 48 days, an increase of about 50 percent since the 1970s, the scientists found. Annual precipitation declined by about 3.2 inches in the region over that period, a much larger decline than in the West as a whole.
“In the desert Southwest, we were averaging around 10 inches and now we’re averaging around 7 inches,” said Joel Biederman, a hydrologist at USDA’s Southwest Watershed Research Center in Tucson. “That’s much more impactful when you consider that the amount in our region is smaller to begin with.”
Biederman and his colleagues focused on changes that have been measured and didn’t attempt to parse the influences of natural variations and climate change.
A separate analysis of climate data over the past 30 years by the National Oceanic and Atmospheric Administration shows the nation’s “normals,” or averages, have shifted dramatically in a decade, growing wetter in the central and eastern U.S. and drier in the Southwest while climate change has pushed temperatures higher.
Another group of scientists at Los Alamos National Laboratory recently looked at how interconnected extremes influenced by climate change — from floods to droughts and heatwaves — are expected to intensify in the future in the Colorado River Basin. They found these sorts of concurrent extreme climatic events “are projected to increase in the future and intensify” in key regions of the watershed.
Lights Out Colorado is a voluntary program to help migratory birds.
The National Audubon Society, the International Dark-Sky Association, and Denver Audubon are partnering to promote the new program.
Every year in North America, more than 3.5 billion birds move north in the spring and 4 billion birds fly south in the fall. More than 80 percent of them travel at night, navigating by the night sky. However, as they pass over big cities on their way, they can become disoriented by bright artificial lights and skyglow, which often causes them to collide with buildings or windows.
While lights can throw birds off their migration paths, the more direct cause of death is the energy birds waste flying around and calling out in confusion. That exhaustion can leave them vulnerable to other urban threats and deplete the energy they need to survive migration and produce chicks in subsequent breeding seasons.
Fortunately, the simple action of turning off lights can help birds navigate urban environments and protect them from unnecessary harm. The National Audubon Society, the International Dark-Sky Association, and Denver Audubon have partnered to launch Lights Out Colorado, a new program that aims to help Coloradans save millions of birds as they take part in spring and fall migrations.
Lights Out Colorado provides two simple steps communities can take to have a big impact on birds:
Shield outdoor lights to prevent light from being emitted upwards.
Turn off lights by midnight during bird migration seasons (April-May and August-September).
It is particularly important to take these measures as early in the evening as possible during the spring and fall migration periods, since migrating birds begin their nocturnal flights at dusk. In addition to helping birds, these efforts have the added benefits of reducing energy use and saving money.
There are several common-sense exceptions to these guidelines. First, lighting activated by motion sensors can stay powered on. Second, businesses open late can keep their lights on until the business closes. Third, lighting needed for safety should stay on. Finally, local governments may choose from a variety of options for public lighting. Only lighting that is not needed should be shut off.
Portions of Colorado saw slight improvements in drought conditions, with a small part of the state reaching drought-free status for the first time since mid-2020, according to the most recent report from the National Drought Mitigation Center.
Western Colorado continues to suffer under extreme and exceptional drought, with some additional area in severe conditions. Extreme drought is also impacting southern Las Animas, southwest Baca and central Kiowa counties.
North central and a small part of northeast Colorado improved for the week. Central Larimer County, along with northeast Boulder and a sliver of southwest Weld counties moved to drought-free conditions – the first time any part of the state has been free from drought or abnormally dry conditions since July 2020.
Much of western Logan County moved from moderate drought to being abnormally dry. A similar improvement was seen in western Weld County, along with portions of Boulder, Broomfield, Gilpin, Clear Creek, Jefferson, Adams, Arapahoe and Denver counties.
Further south, severe drought shifted to moderate conditions for northeast Park, southern Jefferson and most of Douglas counties.
Conditions elsewhere in the state were unchanged this week.
Overall, one percent of Colorado is drought-free, while an additional 10 percent is abnormally dry, up from eight percent last week. Moderate drought covers 29 percent of the state, down from 31 percent, while severe conditions account for 28 percent, down from 30 percent. Extreme drought is present in 17 percent of Colorado, with 15 percent in exceptional conditions – both unchanged from the prior week.
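As a quick sanity check on those figures (our arithmetic, not the Drought Monitor’s), the six category shares should account for the entire state:

```python
# Sanity check: this week's drought-category shares should sum to 100%
# of Colorado's area. (Figures as reported; arithmetic is illustrative.)
shares = {
    "drought-free": 1,
    "abnormally dry": 10,
    "moderate drought": 29,
    "severe drought": 28,
    "extreme drought": 17,
    "exceptional drought": 15,
}
print(sum(shares.values()))  # percent of state covered
```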
Sometimes realisation comes in a blinding flash. Blurred outlines snap into shape and suddenly it all makes sense. Underneath such revelations is typically a much slower-dawning process. Doubts at the back of the mind grow. The sense of confusion that things cannot be made to fit together increases until something clicks. Or perhaps snaps.
Collectively we three authors of this article must have spent more than 80 years thinking about climate change. Why has it taken us so long to speak out about the obvious dangers of the concept of net zero? In our defence, the premise of net zero is deceptively simple – and we admit that it deceived us.
The threats of climate change are the direct result of there being too much carbon dioxide in the atmosphere. So it follows that we must stop emitting more and even remove some of it. This idea is central to the world’s current plan to avoid catastrophe. In fact, there are many suggestions as to how to actually do this, from mass tree planting, to high tech direct air capture devices that suck out carbon dioxide from the air.
The current consensus is that if we deploy these and other so-called “carbon dioxide removal” techniques at the same time as reducing our burning of fossil fuels, we can more rapidly halt global warming. Hopefully around the middle of this century we will achieve “net zero”. This is the point at which any residual emissions of greenhouse gases are balanced by technologies removing them from the atmosphere.
This is a great idea, in principle. Unfortunately, in practice it helps perpetuate a belief in technological salvation and diminishes the sense of urgency surrounding the need to curb emissions now.
We have arrived at the painful realisation that the idea of net zero has licensed a recklessly cavalier “burn now, pay later” approach which has seen carbon emissions continue to soar. It has also hastened the destruction of the natural world by increasing deforestation today, and greatly increases the risk of further devastation in the future.
To understand how this has happened, how humanity has gambled its civilisation on no more than promises of future solutions, we must return to the late 1980s, when climate change broke out onto the international stage.
Steps towards net zero
On June 22 1988, James Hansen was the director of Nasa’s Goddard Institute for Space Studies, a prestigious appointment, but he was largely unknown outside academia.
By the afternoon of the 23rd he was well on the way to becoming the world’s most famous climate scientist. This was as a direct result of his testimony to the US congress, when he forensically presented the evidence that the Earth’s climate was warming and that humans were the primary cause: “The greenhouse effect has been detected, and it is changing our climate now.”
If we had acted on Hansen’s testimony at the time, we would have been able to decarbonise our societies at a rate of around 2% a year in order to give us about a two-in-three chance of limiting warming to no more than 1.5°C. It would have been a huge challenge, but the main task at that time would have been simply to stop the accelerating use of fossil fuels while fairly sharing out future emissions.
Four years later, there were glimmers of hope that this would be possible. During the 1992 Earth Summit in Rio, all nations agreed to stabilise concentrations of greenhouse gases to ensure that they did not produce dangerous interference with the climate. The 1997 Kyoto Summit attempted to start to put that goal into practice. But as the years passed, the initial task of keeping us safe became increasingly harder given the continual increase in fossil fuel use.
It was around that time that the first computer models linking greenhouse gas emissions to impacts on different sectors of the economy were developed. These hybrid climate-economic models are known as Integrated Assessment Models. They allowed modellers to link economic activity to the climate by, for example, exploring how changes in investments and technology could lead to changes in greenhouse gas emissions.
They seemed like a miracle: you could try out policies on a computer screen before implementing them, saving humanity costly experimentation. They rapidly became key guidance for climate policy, a primacy they maintain to this day.
Unfortunately, they also removed the need for deep critical thinking. Such models represent society as a web of idealised, emotionless buyers and sellers and thus ignore complex social and political realities, or even the impacts of climate change itself. Their implicit promise is that market-based approaches will always work. This meant that discussions about policies were limited to those most convenient to politicians: incremental changes to legislation and taxes.
This story is a collaboration between Conversation Insights and Apple News editors
Around the time these models were first developed, efforts were being made to secure US action on the climate by allowing it to count the carbon sinks of the country’s forests. The US argued that if it managed its forests well, it would be able to store a large amount of carbon in trees and soil, which should be subtracted from its obligations to limit the burning of coal, oil and gas. In the end, the US largely got its way. Ironically, the concessions were all in vain, since the US Senate never ratified the Kyoto Protocol.
A future with more trees could, in effect, offset the burning of coal, oil and gas now. As models could easily churn out numbers that saw atmospheric carbon dioxide go as low as one wanted, ever more sophisticated scenarios could be explored which reduced the perceived urgency of cutting fossil fuel use. By including carbon sinks in climate-economic models, a Pandora’s box had been opened.
It’s here we find the genesis of today’s net zero policies.
That said, most attention in the mid-1990s was focused on increasing energy efficiency and energy switching (such as the UK’s move from coal to gas) and the potential of nuclear energy to deliver large amounts of carbon-free electricity. The hope was that such innovations would quickly reverse increases in fossil fuel emissions.
But by around the turn of the new millennium it was clear that such hopes were unfounded. Given their core assumption of incremental change, it was becoming more and more difficult for climate-economic models to find viable pathways to avoid dangerous climate change. In response, the models began to include more and more examples of carbon capture and storage, a technology that could remove the carbon dioxide from coal-fired power stations and then store the captured carbon deep underground indefinitely.
This had been shown to be possible in principle: compressed carbon dioxide had been separated from fossil gas and then injected underground in a number of projects since the 1970s. These Enhanced Oil Recovery schemes were designed to force gases into oil wells in order to push oil towards drilling rigs and so allow more to be recovered – oil that would later be burnt, releasing even more carbon dioxide into the atmosphere.
Carbon capture and storage offered the twist that instead of using the carbon dioxide to extract more oil, the gas would instead be left underground and removed from the atmosphere. This promised breakthrough technology would allow climate-friendly coal and so the continued use of this fossil fuel. But long before the world would witness any such schemes, the hypothetical process had been included in climate-economic models. In the end, the mere prospect of carbon capture and storage gave policy makers a way out of making the much-needed cuts to greenhouse gas emissions.
The rise of net zero
When the international climate change community convened in Copenhagen in 2009 it was clear that carbon capture and storage was not going to be sufficient for two reasons.
First, it still did not exist. There were no carbon capture and storage facilities in operation on any coal-fired power station and no prospect the technology was going to have any impact on rising emissions from increased coal use in the foreseeable future.
The biggest barrier to implementation was essentially cost. The motivation to burn vast amounts of coal is to generate relatively cheap electricity. Retrofitting carbon scrubbers on existing power stations, building the infrastructure to pipe captured carbon, and developing suitable geological storage sites required huge sums of money. Consequently, the only application of carbon capture in actual operation then – and now – is to use the trapped gas in enhanced oil recovery schemes. Beyond a single demonstrator, there has never been any capture of carbon dioxide from a coal-fired power station chimney with that captured carbon then being stored underground.
Just as important, by 2009 it was becoming increasingly clear that it would not be possible to make even the gradual reductions that policy makers demanded. That was the case even if carbon capture and storage was up and running. The amount of carbon dioxide that was being pumped into the air each year meant humanity was rapidly running out of time.
With hopes for a solution to the climate crisis fading again, another magic bullet was required. A technology was needed not only to slow down the increasing concentrations of carbon dioxide in the atmosphere, but actually reverse it. In response, the climate-economic modelling community – already able to include plant-based carbon sinks and geological carbon storage in their models – increasingly adopted the “solution” of combining the two.
So it was that Bioenergy Carbon Capture and Storage, or BECCS, rapidly emerged as the new saviour technology. By burning “replaceable” biomass such as wood, crops, and agricultural waste instead of coal in power stations, and then capturing the carbon dioxide from the power station chimney and storing it underground, BECCS could produce electricity at the same time as removing carbon dioxide from the atmosphere. That’s because as biomass such as trees grows, it sucks in carbon dioxide from the atmosphere. By planting trees and other bioenergy crops and storing the carbon dioxide released when they are burnt, more carbon could be removed from the atmosphere.
With this new solution in hand the international community regrouped from repeated failures to mount another attempt at reining in our dangerous interference with the climate. The scene was set for the crucial 2015 climate conference in Paris.
A Parisian false dawn
As its general secretary brought the 21st United Nations conference on climate change to an end, a great roar issued from the crowd. People leaped to their feet, strangers embraced, tears welled up in eyes bloodshot from lack of sleep.
The emotions on display on December 13, 2015 were not just for the cameras. After weeks of gruelling high-level negotiations in Paris a breakthrough had finally been achieved. Against all expectations, after decades of false starts and failures, the international community had finally agreed to do what it took to limit global warming to well below 2°C, preferably to 1.5°C, compared to pre-industrial levels.
The Paris Agreement was a stunning victory for those most at risk from climate change. Rich industrialised nations will be increasingly impacted as global temperatures rise. But it’s the low-lying island states such as the Maldives and the Marshall Islands that are at imminent existential risk. As a later UN special report made clear, if the Paris Agreement was unable to limit global warming to 1.5°C, the number of lives lost to more intense storms, fires, heatwaves, famines and floods would significantly increase.
But dig a little deeper and you could find another emotion lurking within delegates on December 13. Doubt. We struggle to name any climate scientist who at that time thought the Paris Agreement was feasible. We have since been told by some scientists that the Paris Agreement was “of course important for climate justice but unworkable” and “a complete shock, no one thought limiting to 1.5°C was possible”. Rather than being able to limit warming to 1.5°C, a senior academic involved in the IPCC concluded we were heading beyond 3°C by the end of this century.
Instead of confronting our doubts, we scientists decided to construct ever more elaborate fantasy worlds in which we would be safe. The price of our cowardice: having to keep our mouths shut about the ever-growing absurdity of the required planetary-scale carbon dioxide removal.
Taking centre stage was BECCS because at the time this was the only way climate-economic models could find scenarios that would be consistent with the Paris Agreement. Rather than stabilise, global emissions of carbon dioxide had increased some 60% since 1992.
Alas, BECCS, just like all the previous solutions, was too good to be true.
Across the scenarios produced by the Intergovernmental Panel on Climate Change (IPCC) with a 66% or better chance of limiting temperature increase to 1.5°C, BECCS would need to remove 12 billion tonnes of carbon dioxide each year. BECCS at this scale would require massive planting schemes for trees and bioenergy crops.
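A rough scale check puts that 12 billion tonne figure in context, set against annual global emissions of roughly 40 billion tonnes (the figure the essay cites for current output):

```python
# Scale check using figures from the text: ~12 billion tonnes of CO2
# removed per year by BECCS, against current global emissions of roughly
# 40 billion tonnes a year.
beccs_removal_gt = 12     # Gt CO2 removed per year (IPCC 1.5°C scenarios)
global_emissions_gt = 40  # Gt CO2 emitted per year (approximate)

share = beccs_removal_gt / global_emissions_gt
print(f"BECCS would need to remove {share:.0%} of today's annual emissions")
```

In other words, BECCS alone is being asked to cancel out nearly a third of everything humanity currently emits, every year, indefinitely.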
The Earth certainly needs more trees. Humanity has cut down some three trillion trees since we first started farming some 13,000 years ago. But rather than allow ecosystems to recover from human impacts and forests to regrow, BECCS generally refers to dedicated industrial-scale plantations regularly harvested for bioenergy rather than carbon stored away in forest trunks, roots and soils.
Currently, the two most efficient biofuels are sugarcane for bioethanol and palm oil for biodiesel – both grown in the tropics. Endless rows of such fast-growing monoculture trees or other bioenergy crops harvested at frequent intervals devastate biodiversity.
It has been estimated that BECCS would demand between 0.4 and 1.2 billion hectares of land. That’s 25% to 80% of all the land currently under cultivation. How will that be achieved at the same time as feeding 8-10 billion people around the middle of the century or without destroying native vegetation and biodiversity?
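The percentages follow from simple division. A quick check, assuming total cultivated land of roughly 1.5 billion hectares (an approximate figure, not stated in the text), reproduces the essay’s 25% to 80% range to within rounding:

```python
# Land-take check: 0.4 to 1.2 billion hectares for BECCS, set against
# total global cropland, assumed here to be ~1.5 billion hectares.
cropland_bha = 1.5  # billion hectares under cultivation (assumption)

for beccs_bha in (0.4, 1.2):
    share = beccs_bha / cropland_bha
    print(f"{beccs_bha} billion ha is about {share:.0%} of global cropland")
```

Even the low end of the range would set aside more than a quarter of today’s farmland for energy plantations.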
Growing billions of trees would consume vast amounts of water – in some places where people are already thirsty. Increasing forest cover in higher latitudes can have an overall warming effect because replacing grassland or fields with forests means the land surface becomes darker. This darker land absorbs more energy from the Sun and so temperatures rise. Focusing on developing vast plantations in poorer tropical nations comes with real risks of people being driven off their lands.
And it is often forgotten that trees and the land in general already soak up and store away vast amounts of carbon through what is called the natural terrestrial carbon sink. Interfering with it could both disrupt the sink and lead to double counting.
As these impacts are becoming better understood, the sense of optimism around BECCS has diminished.
Given the dawning realisation of how difficult Paris would be in the light of ever rising emissions and the limited potential of BECCS, a new buzzword emerged in policy circles: the “overshoot scenario”. Temperatures would be allowed to go beyond 1.5°C in the near term, but then be brought down with a range of carbon dioxide removal technologies by the end of the century. This means that net zero actually means carbon negative. Within a few decades, we will need to transform our civilisation from one that currently pumps out 40 billion tonnes of carbon dioxide into the atmosphere each year to one that produces a net removal of tens of billions.
Mass tree planting, for bioenergy or as an attempt at offsetting, had been the latest attempt to stall cuts in fossil fuel use. But the ever-increasing need for carbon removal was calling for more. This is why the idea of direct air capture, now being touted by some as the most promising technology out there, has taken hold. It is generally more benign to ecosystems because it requires significantly less land than BECCS, including the land needed to power it with wind turbines or solar panels.
Unfortunately, because of its exorbitant costs and energy demands, it is widely believed that direct air capture, if it ever becomes feasible to deploy at scale, will not be able to compete with BECCS and its voracious appetite for prime agricultural land.
It should now be clear where this journey is heading. As the mirage of each magical technical solution disappears, another equally unworkable alternative pops up to take its place. The next is already on the horizon – and it’s even more ghastly. Once we realise net zero will not happen in time or even at all, geoengineering – the deliberate and large-scale intervention in the Earth’s climate system – will probably be invoked as the solution to limit temperature increases.
One of the most researched geoengineering ideas is solar radiation management – the injection of millions of tons of sulphuric acid into the stratosphere that will reflect some of the Sun’s energy away from the Earth. It is a wild idea, but some academics and politicians are deadly serious, despite significant risks. The US National Academies of Sciences, for example, has recommended allocating up to US$200 million over the next five years to explore how geoengineering could be deployed and regulated. Funding and research in this area is sure to significantly increase.
In principle there is nothing wrong or dangerous about carbon dioxide removal proposals. In fact developing ways of reducing concentrations of carbon dioxide can feel tremendously exciting. You are using science and engineering to save humanity from disaster. What you are doing is important. There is also the realisation that carbon removal will be needed to mop up some of the emissions from sectors such as aviation and cement production. So there will be some small role for a number of different carbon dioxide removal approaches.
The problems come when it is assumed that these can be deployed at vast scale. This effectively serves as a blank cheque for the continued burning of fossil fuels and the acceleration of habitat destruction.
Carbon reduction technologies and geoengineering should be seen as a sort of ejector seat that could propel humanity away from rapid and catastrophic environmental change. Just like an ejector seat in a jet aircraft, it should only be used as the very last resort. However, policymakers and businesses appear to be entirely serious about deploying highly speculative technologies as a way to land our civilisation at a sustainable destination. In fact, these are no more than fairy tales.
The only way to keep humanity safe is immediate and sustained radical cuts to greenhouse gas emissions, made in a socially just way.
Academics typically see themselves as servants to society. Indeed, many are employed as civil servants. Those working at the climate science and policy interface desperately wrestle with an increasingly difficult problem. Similarly, those that champion net zero as a way of breaking through barriers holding back effective action on the climate also work with the very best of intentions.
The tragedy is that their collective efforts were never able to mount an effective challenge to a climate policy process that would only allow a narrow range of scenarios to be explored.
Most academics feel distinctly uncomfortable stepping over the invisible line that separates their day job from wider social and political concerns. There are genuine fears that being seen as advocates for or against particular issues could threaten their perceived independence. Scientists are one of the most trusted professions. Trust is very hard to build and easy to destroy.
But there is another invisible line, the one that separates maintaining academic integrity and self-censorship. As scientists, we are taught to be sceptical, to subject hypotheses to rigorous tests and interrogation. But when it comes to perhaps the greatest challenge humanity faces, we often show a dangerous lack of critical analysis.
In private, scientists express significant scepticism about the Paris Agreement, BECCS, offsetting, geoengineering and net zero. Apart from some notable exceptions, in public we quietly go about our work, apply for funding, publish papers and teach. The path to disastrous climate change is paved with feasibility studies and impact assessments.
Rather than acknowledge the seriousness of our situation, we instead continue to participate in the fantasy of net zero. What will we do when reality bites? What will we say to our friends and loved ones about our failure to speak out now?
The time has come to voice our fears and be honest with wider society. Current net zero policies will not keep warming to within 1.5°C because they were never intended to. They were and still are driven by a need to protect business as usual, not the climate. If we want to keep people safe then large and sustained cuts to carbon emissions need to happen now. That is the very simple acid test that must be applied to all climate policies. The time for wishful thinking is over.