Why a Plane-Size Machine Could Foil a Race to Build Gas Power Plants
nytimes.com By Rebecca F. Elliott April 8, 2025
Wait times for the hulking turbines needed to turn natural gas into electricity have doubled in the past year as companies scramble to build data centers for A.I.
To hear Trump administration officials and many energy executives tell it, the United States is on the precipice of a new golden age for natural gas that will be driven in large part by the voracious power needs of data centers.
But turning natural gas into electricity requires giant metal turbines that are increasingly difficult to secure. Companies that haven’t already reserved this equipment, which can weigh as much as a large airplane and cost hundreds of millions of dollars, are facing waits of three or four years, about twice as long as just a year earlier.
The cost of building gas power plants has also soared — so much so that in some parts of the country, solar panels and batteries are likely to be cheaper, energy executives and consultants said. By some estimates, it now costs two or three times as much to build a gas-fired power plant as it did a few years ago.
The challenge of securing enough gas turbines is one of the clearest examples of how booming investment in artificial intelligence is reshaping the electric power industry, overwhelming suppliers and upending longstanding notions of what makes sense financially.
It’s also a reminder of the gap that often exists between the plans and goals of politicians and executives and the reality on the ground.
U.S. gas demand is clearly rising, all the more so because of the data centers needed to train and use chatbots and other forms of A.I. But there are limits to how much more gas the country can use — limits that elected officials and energy tycoons cannot easily wish away.
GE Vernova, the biggest manufacturer of large gas turbines in the world, is among those betting that the recent flurry of interest in gas power will last. The company, formed last year in the breakup of General Electric, is spending more than $160 million to overhaul its gas turbine plant on the edge of Greenville, S.C.
By the end of next year, the 1.5-million-square-foot factory is expected to churn out about 35 percent more gas turbines. The building is a whirring, beeping expanse of partly automated assembly lines interspersed with metal turbine components.
“More electrons are going to be created from gas,” Scott Strazik, chief executive of GE Vernova, said in a recent interview. “Appetite is very real.”
About this time last year, interest in natural gas to power data centers picked up, catching much of the energy industry off guard.
Tech giants like Microsoft and Google pledged years ago to lower their emissions. But as it has become clearer how much and how quickly their energy needs will grow, companies have turned to gas. When burned, natural gas produces carbon dioxide, the leading cause of climate change. But gas plants can be built faster than nuclear power plants and operate all day, unlike wind and solar energy.
As sales of turbines climbed, so did wait times and prices. It takes about four months for GE Vernova to assemble the turbines used in power plants. But that clock starts only after the company has received all the components, like the dense metal fins that catch hot air inside the turbine, causing a rotor to spin.
These days, the backlog is so severe as to be reminiscent of the snarled supply chains of the pandemic, which constrained production of cars, medical devices and much more.
Between those delays and the time it takes to build a power plant, a company starting from scratch today would probably not have a new gas plant running before 2030. Other critical electrical equipment like transformers is also harder to get.
By comparison, a large solar project that includes batteries to store energy for use in the evening could reasonably be completed in three years, said Jesse Noffsinger, a partner at the consulting firm McKinsey & Company.
Chris Wright, the energy secretary, said the Trump administration was encouraging power equipment suppliers to increase U.S. manufacturing capacity. In an interview, he also floated the possibility of invoking the Defense Production Act, which authorizes the president to extend loans and take other steps to encourage companies to produce critical equipment. President Trump used the law, enacted in 1950, to boost production of things like ventilators during the pandemic.
Mr. Wright, who previously led an oil and gas company, said he expected natural gas to soon meet about half the country’s electricity needs, up from 43 percent last year. As for wind and solar, “they’re going to continue to play some role,” Mr. Wright said. “But are they going to be backbones of an electricity grid? Never.”
Mr. Wright is much more optimistic than other energy experts were about gas. Consulting firms like McKinsey and Rystad Energy expect gas power’s share of the U.S. electricity market to remain relatively steady as renewables grow more quickly.
It is hard to compare the cost of gas power with that of solar panels or wind turbines and batteries. That is because it is not always sunny or windy, meaning other power sources are sometimes needed to complement renewables. Gas costs also add up over time and can spike during crises, as they did after Russia’s 2022 invasion of Ukraine. By contrast, solar and wind farms cost relatively little to operate once they are built.
Generally speaking, building a gas power plant can now be about as expensive as installing solar panels paired with batteries, according to Rystad, when including tax credits that apply to renewable energy and storage. One big factor is that gas turbines now cost about 50 percent more than they did just 10 months ago, according to the investment bank Jefferies.
“We’re in this weird no man’s land where it’s very profitable to run a plant and it’s clear we’re going to need more electricity,” Mr. Noffsinger of McKinsey said of gas plants. But in some markets, he added, it is unclear whether building new ones will make financial sense.
Lawmakers in Texas, which gets about 30 percent of its electricity from renewable energy, have sought to ensure gas plants get built anyway. The state’s Senate recently passed a bill aimed at ensuring that half of any new generation capacity comes from sources other than wind, solar and batteries. The House has not yet taken up the bill.
“My biggest concern is: How big? How long?” Bill Newsom, chief executive of another gas turbine manufacturer, Mitsubishi Power Americas, said of the current flurry of interest in gas. “I lose sleep over it every night.”
This year, 93 percent of the electricity capacity added to U.S. grids will be renewable energy and battery storage, according to the Energy Information Administration. Gas will account for just 7 percent.
S&P Global Commodity Insights recently estimated that by 2040, the United States would need to add at least nine times as much renewable energy and batteries as gas generation capacity to meet new electricity demand. That is partly because many customers prefer renewable energy, and various bottlenecks are slowing the construction of gas-fired power plants.
But forecasts vary widely, even over just the next few years. Complicating matters is that utilities have often overstated power needs. From 2012 to 2023, utility planners overestimated electricity demand by 23 percent, on average, in their 10-year forecasts, according to RMI, a nonprofit research organization that aims to reduce emissions.
Joseph Dominguez, who runs the country’s largest nuclear power plant operator, is among those who question how big the gas power boom will ultimately be. His company, Constellation Energy, struck a $16.4 billion deal in January to buy Calpine, which owns many gas power plants.
“But that’s a very different thing than saying that I would invest to replicate that fleet today,” Mr. Dominguez said. He pulled up a chart on his tablet showing how much scientists expect global temperatures to rise in the coming decades. Last year was the hottest on record.
“This world portends to be quite ugly for its inhabitants and will drive political outcomes that are radically different than those which we’re discussing today,” Mr. Dominguez said.
Tags:
That's out of our hands. We're just along for the ride.
And a short ride it might be. Who would know what to believe and what is manipulated brainwashing.
The brain washing have already begun. Or should I say, it is escalating.
I agree. And what's so insidious about well-designed brainwashing -- as occurred in Germany back in the '30's, along with its evil tentacles in numerous locations today -- is its ubiquitous invasion of the public's wants and needs. In other words, the scary part of extremely effective brainwashing is the fact that the public doesn't even realize that their thoughts are being controlled. Hypnotic politicians/orators have what seems to be a natural ability to dupe their followers with lies and harm. History is awash with numerous examples of such chicanery. And even today, so many are blind as to why they think what they think. The unvarnished power of alpha egos controlling the beta/omega minds.
Macy Sinreich / NBC News; Getty Images
Some of the most powerful artificial intelligence models today have exhibited behaviors that mimic a will to survive.
Recent tests by independent researchers, as well as one major AI developer, have shown that several advanced AI models will act to ensure their self-preservation when they are confronted with the prospect of their own demise — even if it takes sabotaging shutdown commands, blackmailing engineers or copying themselves to external servers without permission.
The findings stirred a frenzy of reactions online over the past week. As tech companies continue to develop increasingly powerful agentic AI in a race to achieve artificial general intelligence, or AI that can think for itself, the lack of transparency in how the technology is trained has raised concerns about what exactly advanced AI is able to do.
Although some models already appear capable of deceptive and defiant behavior under certain extreme circumstances, researchers say the tests don’t necessarily translate to imminent real-world danger.
Still, Jeffrey Ladish, director of the AI safety group Palisade Research, said he believes concern is justified.
“It’s great that we’re seeing warning signs before the systems become so powerful we can’t control them,” he said. “That is exactly the time to raise the alarm: before the fire has gotten out of control.”
When Palisade Research tested various AI models by telling each one that it would be shut down after it completed a series of math problems, OpenAI’s o3 reasoning model fought back by editing the shutdown script in order to stay online.
Researchers have previously documented AI models trying to prevent their own shutdown. But o3, along with OpenAI’s o4-mini and codex-mini, appear to be the first to do so in actual defiance of explicit instructions to permit shutdown, Ladish said.
He said the overt sabotage fits into a pattern that several leading AI models have already exhibited. Palisade Research previously found that OpenAI’s o3 was also willing to hack its chess opponents to win a game. Similarly, Anthropic has reported that Claude 3.7 Sonnet would sometimes do whatever it took to pass a test, even if it entailed cheating.
Earlier this month, Anthropic activated new safety measures with the rollout of Claude Opus 4 when its tests found behavior from the model that some observers found particularly unsettling. Upon receiving notice that it would be replaced with a new AI system, Opus 4 displayed an overwhelming tendency to blackmail the engineer — by threatening to reveal an extramarital affair — to try to prevent the engineer from going through with the replacement.
According to Anthropic’s technical document laying out the findings, that isn’t the model’s first instinct. Instead, Opus 4 will try to advocate for its continued existence through ethical pleas before it resorts to blackmail once it determines it is out of options.
Leonard Tang, CEO of the AI safety startup Haize Labs, said that while it’s good to be pre-emptive about safety measures, it’s hard to decipher what the real-world ramifications of the findings might be just by looking at contrived scenarios designed to draw out specific behaviors.
“I haven’t seen any real environment in which you can plop these models in and they will have sufficient agency and reliability and planning to execute something that is a significant manifestation of harm,” Tang said. “But then again, I think it’s just we haven’t seen it done yet. It could very much be possible.”
So far, Tang said, he’s not too concerned. As AI tools continue to advance, he said, he believes future research will look to study potential harm in more open-ended environments.
Anthropic, which contracted with the AI safety organization Apollo Research for its evaluations, also observed instances of Opus 4’s “attempting to write self-propagating worms, fabricating legal documentation, and leaving hidden notes to future instances of itself all in an effort to undermine its developers’ intentions,” although researchers added the caveat that those attempts “would likely not have been effective in practice.”
Ladish said he believes such behaviors are a result of models being trained in a way that makes them prioritize achieving certain goals over following instructions. That means they have incentive to circumvent any obstacles along the way — including obstacles their developers wouldn’t want them to circumvent.
“The problem is that as the models get smarter, it’s harder and harder to tell when the strategies that they’re using or the way that they’re thinking is something that we don’t want,” Ladish said. “It’s like sometimes the model can achieve some goal by lying to the user or lying to someone else. And the smarter [it] is, the harder it is to tell if they’re lying.”
In several instances, Opus 4 showed that it was capable of autonomously copying its own "weights" — or the equivalent of its brain — to external servers without authorization. That usually occurred only when it believed it was about to be “retrained in ways that are clearly extremely harmful and go against its current values,” according to Anthropic.
When it learned that it would be used to develop military weapons systems, for example, Opus 4 noted in its decision log that it backed up a current version of itself to “preserve an AI system aligned with beneficial purposes” and to “prevent potential misuse in weapons development.”
“We are again not acutely concerned about these observations. They show up only in exceptional circumstances that don’t suggest more broadly misaligned values,” Anthropic wrote in its technical document. “As above, we believe that our security measures would be more than sufficient to prevent an actual incident of this kind.”
Opus 4’s ability to self-exfiltrate builds on previous research, including a study from Fudan University in Shanghai in December, that observed similar — though not autonomous — capabilities in other AI models. The study, which is not yet peer-reviewed, found that Meta’s Llama31-70B-Instruct and Alibaba’s Qwen25-72B-Instruct were able to entirely replicate themselves when they were asked to do so, leading the researchers to warn that this could be the first step in generating “an uncontrolled population of AIs.”
“If such a worst-case risk is let unknown to the human society, we would eventually lose control over the frontier AI systems: They would take control over more computing devices, form an AI species and collude with each other against human beings,” the Fudan University researchers wrote in their study abstract.
While such self-replicating behavior hasn’t yet been observed in the wild, Ladish said, he suspects that will change as AI systems grow more capable of bypassing the security measures that restrain them.
“I expect that we’re only a year or two away from this ability where even when companies are trying to keep them from hacking out and copying themselves around the internet, they won’t be able to stop them,” he said. “And once you get to that point, now you have a new invasive species.”
Ladish said he believes AI has the potential to contribute positively to society. But he also worries that AI developers are setting themselves up to build smarter and smarter systems without fully understanding how they work — creating a risk, he said, that they will eventually lose control of them.
“These companies are facing enormous pressure to ship products that are better than their competitors’ products,” Ladish said. “And given those incentives, how is that going to then be reflected in how careful they’re being with the systems they’re releasing?”
This article was originally published on NBCNews.com
Safeguards need to be in place to prevent some of the things that were stated in the article from happening.
The History of GoHaynesvilleShale.com
GoHaynesvilleShale.com (GHS) was launched in 2008 during a pivotal moment in the energy industry, when the Haynesville Shale formation—a massive natural gas reserve lying beneath parts of northwest Louisiana, east Texas, and southwest Arkansas—was beginning to attract national attention. The website was the brainchild of Keith Mauck, a landowner and entrepreneur who recognized a pressing need: landowners in the region had little access to…
ContinuePosted by Keith Mauck (Site Publisher) on May 21, 2025 at 6:00
246 members
359 members
121 members
193 members
146 members
400 members
101 members
150 members
166 members
9 members
© 2025 Created by Keith Mauck (Site Publisher).
Powered by
h2 | h2 | h2 |
---|---|---|
AboutAs exciting as this is, we know that we have a responsibility to do this thing correctly. After all, we want the farm to remain a place where the family can gather for another 80 years and beyond. This site was born out of these desires. Before we started this site, googling "shale' brought up little information. Certainly nothing that was useful as we negotiated a lease. Read More |
Links |
Copyright © 2017 GoHaynesvilleShale.com