Making sense of a chaotic planet: How understanding weather and climate risks depends on supercomputers like NCAR’s

Antonios Mamalakis, University of Virginia

Have you ever stopped to wonder how forecasters can predict the weather days in advance, or how scientists figure out how the climate might evolve under different policies?

The Earth system is a vast web of intertwined processes, from microscopic chemical reactions to towering storms. Ocean currents circulating deep in the Atlantic, forests exchanging carbon with the atmosphere, and humans altering the composition of the air all have effects that ripple through the system. These processes are governed by physical laws, such as conservation of mass, energy and momentum.

All of this plays out on such a large scale that no single human mind can truly grasp it in full. And yet, the system is so sensitive that a small perturbation, given enough time, can steer its trajectory in a dramatically different direction. This sensitivity is called “chaos,” also known as the “butterfly effect.” The planet is, at once, immense and delicate.

Despite this complexity and scale, scientists are able to simulate and anticipate how the climate will change.

How is this even possible? Behind the long-term climate projections that affect our lives sits one of the most remarkable scientific achievements of the modern era: climate models that run on supercomputers.

I am a climate data scientist. My colleagues and I try to understand extreme weather and long-term climate risks by using virtual versions of Earth inside these machines.

What a climate model really is

Here is the simplest way to picture a climate model:

Imagine dividing the entire planet into 3D boxes. At the surface, each box might represent an area 50 to 100 kilometers across. Then we stack boxes upward into the atmosphere and downward into the oceans to create a 3D grid wrapping around the globe.

Each box contains numbers: temperature, wind speed, humidity, sea ice thickness, soil moisture and hundreds of other variables. The model contains mathematical expressions that describe how these variables influence one another: how heat moves, how air rises and sinks, how moisture condenses into clouds, how the ocean absorbs and redistributes energy.

Climate models are systems of differential equations based on the basic laws of physics, fluid motion and chemistry. They divide the planet into a 3D grid, apply the equations and evaluate the results. Within these models, the atmosphere component, for example, calculates winds, heat transfer, radiation, relative humidity and surface hydrology. NOAA

We then let the model march forward in time, solving the math and updating every variable in every box. Then again. And again.

Now scale that up. Millions of grid boxes. Hundreds of variables per box. Calculations carried out millions of times to simulate decades or even centuries.
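To make the idea concrete, here is a deliberately toy sketch of that grid-and-time-step loop in Python. Everything here is an illustrative assumption: real models track hundreds of variables per box and solve the full equations of fluid motion, radiation and chemistry, not the simple heat diffusion used below.

```python
import numpy as np

# Toy "climate model": temperature on a tiny 3D grid of boxes,
# marched forward in time by simple diffusion (heat spreading
# between neighboring boxes). Illustrative only.

nx, ny, nz = 10, 10, 5                # grid boxes: lon x lat x height
temp = np.full((nx, ny, nz), 15.0)    # start at 15 C everywhere
temp[5, 5, 0] = 40.0                  # a warm perturbation at the surface

def step(t, alpha=0.1):
    """One time step: each box relaxes toward the mean of its neighbors."""
    neighbors = (
        np.roll(t, 1, 0) + np.roll(t, -1, 0) +
        np.roll(t, 1, 1) + np.roll(t, -1, 1) +
        np.roll(t, 1, 2) + np.roll(t, -1, 2)
    ) / 6.0
    return t + alpha * (neighbors - t)

for _ in range(100):                  # "then again. And again."
    temp = step(temp)

# The hot spot spreads out, but total heat is conserved.
print(round(float(temp.mean()), 2), round(float(temp.max()), 2))
```

A real model replaces the one `step` function with physics for winds, moisture, radiation, ocean currents and more, applied across millions of boxes instead of 500.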

And because the system is chaotic, we do not run the model just once. We run it many times with slightly different initial conditions – what scientists call an ensemble – to make sure the result reflects a true system response to the scenario being studied, such as warming temperatures due to increased emissions, rather than an artifact of chaos.
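The ensemble idea can be sketched with any chaotic system. In this illustration the logistic map stands in for a climate model (an assumption for brevity; real ensembles perturb full 3D model states): tiny differences in the starting value lead to wildly different individual runs, but the ensemble as a whole maps out the range of plausible outcomes.

```python
import random

def run(x0, steps=50, r=3.9):
    """Iterate the logistic map, which is chaotic for r = 3.9."""
    x = x0
    for _ in range(steps):
        x = r * x * (1 - x)
    return x

# An "ensemble": the same system, started from nearly identical states.
random.seed(0)
ensemble = [run(0.5 + 1e-9 * random.random()) for _ in range(100)]

# Individual members diverge unpredictably (the butterfly effect),
# but the spread of the ensemble characterizes the system's behavior.
print(min(ensemble), max(ensemble))
```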

The result is an astronomical number of calculations. Performing them requires computers capable of executing quadrillions of operations per second – what are known as petaflop-scale supercomputers. A petaflop equals 1 quadrillion – 1,000,000,000,000,000 – calculations per second!
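A rough back-of-envelope calculation shows why petaflop machines are needed. Every number below is an illustrative assumption, not a real model specification, but the orders of magnitude are the point:

```python
# Back-of-envelope: operations needed for one climate experiment.
# All numbers are illustrative assumptions.
boxes = 5_000_000           # grid boxes spanning atmosphere and ocean
variables = 200             # variables tracked per box
flops_per_update = 500      # assumed arithmetic per variable per step
steps_per_day = 96          # e.g., a 15-minute model time step
years = 100                 # a century-scale simulation
ensemble_members = 50       # runs with perturbed initial conditions

total_flops = (boxes * variables * flops_per_update
               * steps_per_day * 365 * years * ensemble_members)

petaflop = 1e15             # 1 quadrillion operations per second
seconds = total_flops / petaflop
print(f"{total_flops:.2e} operations, ~{seconds / 3600:.0f} hours at 1 petaflop/s")
```

Even under these modest assumptions, the experiment demands on the order of 10^19 operations – roughly a day of nonstop computing at one petaflop per second, and real models are considerably more expensive per box.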

From simulation to real-world decisions

These simulations inform decisions that affect everyday life: how high to elevate homes in flood-prone areas, how to design power grids resilient to prolonged heat waves, how to manage water resources during drought.

Urban planners, engineers, emergency managers and policymakers all rely on information derived from these models.

Dozens of major climate models have been developed around the world by universities, national laboratories and government agencies. Each modeling center builds its own code, makes its own physical assumptions, chooses its own grid resolution and operates its own supercomputing systems. Through international efforts such as the Coupled Model Intercomparison Project, modeling centers agree on common experiments: the same greenhouse gas scenarios and the same volcanic eruptions, for example.

When you hear that extreme rainfall is projected to intensify in a warmer world, or that the Arctic Ocean could become seasonally ice-free within decades, those conclusions are not the result of calculations carried out by a single scientist, a single team of scientists, or even a single model run. They emerge from dozens of independently developed models, run on room-sized supercomputers, under pre-agreed and carefully coordinated experiments.

In this example of the use of multiple models, areas in color and without hashmarks indicate regions with high agreement among models, where more than 80% of the models agree on the signs of change. The projections for annual maximum daily precipitation change were made using the Multi-model Coupled Model Intercomparison Project Phase 5 (CMIP5). IPCC

This global collaboration is one of the reasons scientists know so much about climate change. These shared simulations allow scientists around the world to test hypotheses and explore future risks based on models’ consensus.

It is no surprise that the 2021 Nobel Prize in physics recognized pioneers of climate modeling. These models fundamentally transformed humanity’s ability to understand a complex planet.

There is no alternative way to answer “what if” questions about the future climate system. What happens if carbon dioxide doubles? What if emissions decline rapidly? What if a major volcanic eruption injects aerosols into the stratosphere? Because the climate system is so complex, and forces can push it outside the range of historical experience, the past is no longer a reliable guide to the future. So statistical models aren’t enough.

Artificial intelligence cannot replace this foundation either. AI has made impressive progress in short-term weather prediction, learning patterns from vast historical datasets, and producing forecasts with remarkable speed.

But climate projections require extrapolating to conditions the planet has not experienced in modern history – such as higher greenhouse gas concentrations. AI can accelerate simulations and analyze massive amounts of data today, but it cannot replace solving the physical equations that govern the system.

National supercomputing centers are essential

In the United States, major climate modeling efforts have been supported by national laboratories and federal centers, including NASA and the National Center for Atmospheric Research, or NCAR, along with a few research universities.

At NCAR, scientists developed the Community Earth System Model, a comprehensive climate model that’s arguably one of the best models to date and is used by researchers across the country and around the world to study climate change, severe weather, climate effects on wildfires, and atmospheric patterns. It has helped position the United States at the forefront of climate science and enabled the global research community to tackle some of the most pressing challenges of our time.

Running large ensembles with this model requires powerful hardware, data storage systems capable of handling petabytes of output, and engineers who keep these systems operational. This is not a matter of downloading and running a program on a laptop. It is a national-scale scientific enterprise that makes NCAR and its supercomputer essential.

In a warming climate, the stakes are high. The ability to simulate the Earth system at scale is one of the most powerful tools humanity has to prepare for the risks ahead.

Antonios Mamalakis, Assistant Professor of Data Science and Environmental Science, University of Virginia

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Utility-scale battery energy storage facility planned in Hayden — Steamboat Pilot & Today

The estimated 30- to 40-acre utility-scale battery energy storage facility planned by Jupiter Power to be built in 2027 southwest of the Hayden Power Station would look similar to this facility completed in 2024 in Houston. Jupiter Power/Courtesy photo

Click the link to read the article on the Steamboat Pilot & Today website (Suzie Romig). Here’s an excerpt:

February 26, 2026

A utility-scale battery energy storage facility was approved last week by the Colorado Public Utilities Commission with the project planned for construction southwest of the Hayden Power Station within the town limits of Hayden. Megan Castle, communications director for the Public Utilities Commission, said the 400 MW project was approved as part of 10 total bid projects accepted by the state commission. The project in Hayden will be led by Jupiter Power, a developer and operator of utility-scale battery energy storage systems with corporate offices in Austin, Houston and Chicago. The proposed energy storage facility would be located on a 72-acre site on Routt County Road 51 near the Hayden power plant and Yampa Valley Regional Airport and would connect by a new transmission line to the Mount Harris substation southeast of the main Hayden station. The stored energy would be sold to Xcel Energy through a Power Purchase Agreement, said Michelle Aguayo, senior media relations representative at Xcel.

“This project is a Power Purchase Agreement, where another company builds and maintains the project, and we purchase the power produced,” Aguayo said Tuesday.

Hayden Town Manager Mathew Mendisco said municipal leaders are excited about the forthcoming facility, and that Jupiter Power staff already has worked through the preliminary application process with the town and is expected to submit a formal application later this year. Mendisco said the town leadership is happy to have more light industrial development to continue to boost the local tax base.

“Development like this is always good,” Mendisco said. “The energy sector has been part of our economy for many years, and this fits right in. We are excited to have land near the airport developed, and we think this seems to be a good fit.”

Winter plagued by ‘freakishly bad’ atmospheric pattern, meteorologists say — The #Durango Herald #snowpack #drought

Click the link to read the article on The Durango Herald website (Scout Edmondson). Here’s an excerpt:

February 26, 2026

Last week, Southwest Colorado saw enough snow to cancel school and snarl travel. A week later, winter appears to have vaporized – replaced by clear skies, dry roads and temperatures warm enough for sandals and T-shirts. According to the National Weather Service, high temperatures in Durango reached 63 degrees Thursday. It is the latest episode of weird winter weather this year. Jonathan Harvey, an associate professor of geosciences at Fort Lewis College, said in an email to The Durango Herald that winter’s absence is because of stubborn high-pressure ridges steering the jet stream – a belt of fast-moving winds that separates warm tropical air to the south from cold arctic air to the north – toward the northern United States.

“We have spent most of the winter under a ‘ridge’ in the jet stream, which has prevented cold air and storms from hitting our region,” he said.

That, according to National Weather Service forecaster Lucas Boyer, is because of warm seawater temperatures in the Eastern Pacific Ocean.

“We’ve had ample warm water in the Eastern Pacific for a lot of the winter,” he said. “We’ve really seen the jet stream get pushed north, which means warm air to the south. It’s been really devastating for any kind of snowfall production.”

Boyer said any storms that have managed to break through that high-pressure ridge were followed by periods of temperatures 10 to 15 degrees above the historical average…The warm, dry winter has hurt snowpack in the San Juan Mountains. Boyer said the water equivalent in the San Juan River Basin is at 17% of the historical median, while the snowpack in the San Juan Mountains is 40% to 50% of average. Unfortunately, Harvey said there is not much time left for mid-elevation snowpack. But, there is still time for high-elevation snowpack.

Colorado Drought Monitor map February 24, 2026.

Wolf Creek Ski Area receives nearly 5 feet of snow, #snowpack jumps — The #PagosaSprings Sun

Westwide SNOTEL basin-filled map February 28, 2026.

Click the link to read the article on the Pagosa Springs Sun website (Clayton Chaney):

February 26, 2026

A “real” winter storm rolled through Pagosa Country last week, leaving behind 1-2 feet, or more, of fresh snow in spots around the county. Meanwhile, Wolf Creek Ski Area reported nearly 5 feet of new snow, with a total of 59 inches falling from Feb. 17 through Feb. 21. According to reports entered into the Community Collaborative Rain, Hail and Snow (CoCoRaHS) network, snowfall totals for the five-day period ranged from 15.8 inches downtown up to 18.6 inches uptown. The highest totals were seen in the northern part of the county, ranging from 24 inches near Lake Hatcher up to 31.5 inches along the Mineral County border. As of Feb. 25, the ski area reported a year-to-date snowfall of 164 inches with a midway depth of 82 inches and a summit depth of 89 inches…

According to the Natural Resources Conservation Service (NRCS), as of Feb. 24, the San Miguel, Dolores, Animas and San Juan River basins were measured to be at 64 percent of their 30-year median snowpack. The statewide snowpack as of Feb. 25 was at 63 percent. A post by Shawn Prochazka from Pagosa Weather notes that the snowpack in the basins jumped from 42 percent to 60 percent during last week’s storm. “That’s an impressive jump this late in the season,” Prochazka writes. As of 11 a.m. on Wednesday, Feb. 25, Wolf Creek Pass, at 10,930 feet, had a snow water equivalent of 14.7 inches. The current amount is 64 percent of that date’s median snow water equivalent…

Colorado Drought Monitor map February 24, 2026.

According to the U.S. Drought Monitor’s most recent map released on Feb. 19, which was valid as of Feb. 17, 100 percent of Archuleta County is in an “abnormally dry” drought stage, with 47.89 percent of the county (the northern half) in a “moderate drought” stage. The far northwestern portion of Archuleta County — 2.17 percent of the county — is in a “severe drought” stage.

Bringing the crowds back to Arches National Park, other national parks: And other public lands briefs — Jonathan P. Thompson (LandDesk.org)

Click the link to read the article on The Land Desk website (Jonathan P. Thompson):

February 24, 2026

🌵 Public Lands 🌲

Just when you thought the GOP’s assaults on public lands couldn’t get any worse, the Trump administration launched a new blitzkrieg on environmental protections. That includes eviscerating the National Environmental Policy Act, the federal law requiring agencies to analyze, mitigate, and avoid impacts of major federal projects and projects on public land.

Interior Secretary Doug Burgum this week announced the “rescission of more than 80% of Interior’s prior NEPA regulations.” The changes, which include limiting public comment, are aimed at streamlining permitting across the board, much as the department did with its “emergency permitting procedures” for oil and gas, uranium, coal, and critical minerals projects on public lands.

Associate Deputy Secretary Karen Budd-Falen, who is in hot water over potential ethics violations, lauded the changes, saying in a statement: “These reforms will help unleash American energy, strengthen rural communities, and deliver real results faster for the American people.” As long as they are fossil fuels, that is, since Interior has put a de facto blockade on solar and wind developments on public lands.

Burgum finalized the NEPA rules a few days after opening 2.1 million acres of previously protected public lands in Alaska’s Dalton Corridor to new mining claims and oil and gas drilling.

***

An oil and gas drilling and hydraulic fracturing operation in the Greater Chaco Region near where the BLM plans to sell more leases this August. Jonathan P. Thompson photo.

The Bureau of Land Management is, thankfully, still taking comments on proposed oil and gas leases, though it’s not clear that they will pay them any heed. You have until March 23 to give your two cents on the Farmington Field Office’s plan to auction 12 parcels covering about 16,856 acres this August. The parcels are on the checkerboard, with the biggest block of them about 20 miles east of Chaco Culture National Historical Park.

Find more information and comment at the agency’s project page.

***

This week confirmation hearings begin for Steve Pearce, Trump’s pick to lead the BLM and oversee some 245 million acres of public land.

Pearce is a hard-right Republican, a former congressman from New Mexico, and no friend of public lands or environmental protections. His political career was infused with hostility toward the agency he has been nominated to oversee. Pearce has opposed new national monument designations, is a fan of drilling public lands, has tried to weaken or eliminate the Endangered Species Act, lied about wolves in an effort to defund the Mexican wolf recovery program, and received a 4% score from the League of Conservation Voters.

A few months ago we would have considered his confirmation a slam-dunk, since at the time most Republicans were still willing to debase themselves to any degree to curry favor with Trump. But with Trump’s approval rating plummeting as he suffers from more frequent cognitive mishaps and more revelations of his involvement with Jeffrey Epstein, the Senate may not be so friendly to Pearce.

***

The News: The National Park Service “expands access,” a.k.a. limits or eliminates timed-entry reservation systems, at Arches, Yosemite, Glacier, and Rocky Mountain National Parks, sparking fears that unmanageable crowds will once again overwhelm the popular parks.

The Context: In the wake of the first wave of the COVID-19 pandemic, when Zoom boomers flooded Western communities and the masses descended on the surrounding public lands, people became increasingly concerned about the resulting crowds at national parks and at popular non-park trails and sites. Not only did the crowds risk damaging the parks’ resources, but they also potentially screwed up the visitors’ experiences.

In Arches National Park, for example, cars backed up at the entry gate for close to a mile, parking lots were crammed with vehicles and trail-jams weren’t uncommon, and on especially busy days park officials had to actually shut the gates and turn folks away — even those who may have traveled from abroad to see Delicate Arch.

To ease the pressure, the National Park Service in 2022 instituted a timed-entry reservation system during the busiest months of the year. This limited the number of people entering the park, but it also assured those who made a reservation that they wouldn’t be turned away. The system led to a sharp drop in visitation during its first year, though the number of people entering the park still averaged around 4,000 per day. Visitation has climbed every year since, including in 2025, when other Canyon Country parks saw declines.

Still, some locals, presumably those of the quantity over quality variety, pushed back, saying the new system was diminishing visitation and hurting the local tourism industry. Last fall, Grand County Commissioner Brian Martinez asked the park service to revoke the timed-entry system and to build up the park’s infrastructure to enable it to maximize visitor numbers. The Trump administration’s park service apparently listened, and now timed-entry is no more.


Neither fire, smoke, nor searing heat can stop the public land swarms — Jonathan P. Thompson

You wanna know how old I am? I’m old enough to remember, way, way back to the days of yore, when federal officials and gateway-town chambers of commerce were wringing their hands in concern over a nationwide decline in visitation to national parks. Over a 13-year period, visitor numbers to 58 “nature-based” national parks—Arches, Yosemite, Yellowstone, …

🐓 Regulatory Capture Chronicles 🦊

Trump’s apparent disdain for clean air (and a healthy public) was manifested in recent weeks as the administration rolled back not only the EPA’s “endangerment finding,” which authorizes it to regulate greenhouse gas emissions, but also the Biden-era mercury and air toxics standards.

Mercury emissions are an environmental and public health hazard. The Four Corners-area coal plants once kicked out more than four thousand pounds of mercury each year, along with thousands of pounds of selenium and copper and hundreds more pounds of lead, arsenic, and cadmium, not to mention sulfur dioxide, nitrogen oxide, and other pollutants. 

Aquatic Mercury Cycle. Graphic credit: USGS

Those emissions have decreased considerably over the years as federal regulations kicked in and as coal plants were shuttered altogether. Still, the Four Corners plant puts out about 150 pounds of mercury each year, along with varying quantities of other toxic metals. Most of these pollutants are then deposited in the surrounding water, on the land, and on homes. For years, rain and snow falling on Mesa Verde National Park have contained some of the highest levels of mercury in the nation, and elevated levels have even been found on Molas Pass, just south of Silverton. The mercury is then taken up by bacteria in lakes and rivers, which convert it to highly toxic methylmercury, which then enters the food chain. Mercury messes with fishes’ brains, and even at relatively low concentrations can impair bird and fish reproduction and health. It’s not so good for the people who live near the plant, drink the water, or eat those fish, either.

Because most existing coal plants in the West already complied with the Biden regulations, Trump’s rollback isn’t expected to have a significant effect in most cases (unless power plants dismantle existing pollution-control equipment). However, it is expected to allow the Colstrip coal plant in Montana — one of the nation’s worst polluters — to continue to operate (the operators complained that compliance with the Biden rule would have forced it out of business).

The San Juan Generating Station back when all four units were still operating, and spewing mercury and other nastiness on the area and its residents. The plant was shuttered in 2022 and has mostly been demolished. Jonathan P. Thompson photo.

🐐 Things that get my Goat 🐐

I probably shouldn’t put this here, but geez, really? Aren’t we over the whole “The West is a big empty space that we can clutter up with our myths and technology and nuclear waste” complex? I guess not. I’m sure this guy, who is clearly from somewhere that is not the Western U.S., means well. But he needs to figure out that the West’s “empty” spaces are actually full of life and beauty and, well, space, which most of us value quite highly.

Granted, the spaces shown in this guy’s pictures do look like they may have been extensively grazed, but that does not mean they are appropriate places for a bunch of damned power- and water-guzzling data centers and their associated energy facilities.


Data Centers: The Big Buildup of the Digital Age — Jonathan P. Thompson


🤖 Data Center Watch 👾

🌞 Good News! 😎

A new study has shown that it is possible, in some cases, to build large-scale solar systems without destroying the desert on which they sit.

The Gemini Solar Project in southern Nevada is one of the nation’s largest such facilities, covering about 5,000 acres of desert land. During its construction in 2022, the developers refrained from the full “blade-and-grade” site preparation that is typical, and instead worked to minimize disturbance and leave some areas of vegetation and soils completely intact.

A group of researchers from the Desert Research Institute and the U.S. Geological Survey surveyed the plant population — with a focus on the rare and sensitive threecorner milkvetch — before and two years after construction. Their hypothesis was that the facility would detrimentally affect the plant, and that the areas nearer the panels would see the biggest impacts.

What they found is that not only did the milkvetch survive, but it actually thrived “within the novel environment created at Gemini.” The plants found after construction were larger and more fecund than those found off-site. “Our results suggest that the altered environment created by panel arrays did not alter threecorner milkvetch survivorship at Gemini.”

It’s just one study focused on one solar installation and one plant. But it does suggest that, if done correctly, utility-scale solar development does not have to be a desert’s death knell.


In related, but less sunny news: Lawmakers from a handful of states have proposed bills that would make it easier for residents and businesses to install plug-in or balcony solar panels. While these panels don’t generate a ton of electricity, they are relatively inexpensive and, as the name indicates, are pretty simple to hook up. They are common in parts of Europe, especially Germany, and are gaining popularity in the U.S. since the Trump administration has killed most federal rooftop solar subsidies. The legislation is mostly aimed at allowing folks to plug these things in without a permit or go-ahead from the utility. 

Last year, Utah, of all places, actually passed one of these bills. But so far this year plug-in solar legislation has died in Wyoming and in Arizona, after utilities expressed concerns. Come on! The California bill seems to still be alive. 

Parting Note

I’ll be leading a couple of workshops and giving a talk at this year’s Entrada Institute “Writing from the Land” on May 14-16 in Torrey, Utah. Check it out:

#Durango City Council approves feasibility study for #AnimasRiver surf wave — The Durango Herald

Animas River

Click the link to read the article on The Durango Herald website (Christian Burney). Here’s an excerpt:

February 28, 2026

City Council approved a $44,000 feasibility study last week that will explore where a new surf wave could be optimally built along the Animas River in Durango. The nonprofit Animas River Surfers proposed the feasibility study and a partnership with the city to get it done. It raised $13,000 to contribute to the study. The city budgeted $40,000 plus a 10% contingency from the 2015 sales tax fund for the study, which City Council approved last week along with budget appropriations for a wide scope of other projects. Parks and Recreation Director Scott McClain said it has received proposals for the study, and the chosen consultant will identify possible locations for a surf wave and narrow them down to one. The consultant will engage with commercial organizations and Animas River users during the study…City spokesman Tom Sluis told The Durango Herald the city hasn’t yet hired a consultant to conduct the feasibility study and it will be another week or so before a consultant is selected.