Microsoft’s plan to restart Three Mile Island points to the way forward.
Hollywood thrillers rarely change the course of history, but if they had their own category at the Academy Awards, a good candidate might be The China Syndrome. Released 45 years ago, the hit film depicts a disaster at a nuclear plant in California, sparking fears that the reactor’s core would melt down through the containment vessel—all the way to China (hence the name). Less than two weeks after its release, life imitated art as a partial nuclear meltdown at the Three Mile Island nuclear power plant in Pennsylvania quickly turned public opinion against nuclear energy and effectively halted the expansion of nuclear power in the United States.
Yet after almost a half-century, the advent of artificial intelligence, described by Google CEO Sundar Pichai as “more profound than electricity or fire,” has achieved what many thought impossible. Last month, Microsoft and Constellation Energy announced they would spend $1.6 billion to restart the remaining functional reactor at Three Mile Island to fuel the tech firm’s plunge into AI, a technology whose enormous computational demands translate into vast energy requirements. With a global race underway to capitalize on AI’s economic and military potential, and with China quickly catching up to the United States, fears of another kind of China shock are trumping yesterday’s nuclear angst.
To win the AI race, the next administration should set itself a grand challenge: building a terawatt of new clean electricity generation capacity in the next decade, particularly clean firm power that runs 24/7, which would nearly double the country’s current capacity. Achieving such a goal was already necessary for the United States to meet the challenge of climate change; it is now also urgently needed to address the national security and economic imperatives raised by AI.
The thirst for more power to fuel the AI boom is just the latest stressor on an already aging electricity grid. A single query to ChatGPT or another large language model can use up to 10 times more electricity than a typical Google search. The Electric Power Research Institute expects data centers to account for around 9 percent of the United States’ total electrical load by 2030, up from approximately 4 percent today. Goldman Sachs predicts that the AI revolution will require adding as much electricity to the grid globally by 2030 as all of Japan produces today.
This power demand growth comes on top of the already significant increases required for electrification as cars, heating systems, and industrial processes continue to shift away from the combustion of fossil fuels. According to researchers at Princeton University, if the United States were to achieve the goal of net-zero emissions by 2050, it would need to consume two to three times as much electricity as it does today.
Building out this vast additional electricity infrastructure is all the more difficult because the United States has lost the muscle memory to do so. After rising steadily for decades, U.S. electricity demand has been essentially flat for the past two decades. It takes more than a decade on average to build a new high-voltage transmission line in the United States, and the capacity of renewable energy projects waiting to be connected to the grid is roughly twice the installed capacity of the existing electricity system. In states with large tech sectors, such as California and Virginia, utilities are refusing requests to connect more data centers to the grid because they cannot overcome transmission and generation constraints. And local communities are increasingly opposing new data center projects over concerns about noise, environmental impacts, and stress on existing power infrastructure.
Moreover, even before the projected surge in demand, grid operators and regulators warned that the antiquated U.S. electricity system, which is already adjusting to handle rising levels of intermittent solar and wind energy, is not prepared for growing electricity usage. These reliability concerns were evident this summer, when the nation’s largest grid operator held an auction in the so-called capacity market to ensure that power would be available when needed—a sort of insurance policy for the grid—and the price was nine times higher than at a similar auction in 2023.
Despite these challenges, big tech firms such as Amazon, Google, Meta, and Microsoft have made it clear that failure to get the electricity needed for AI is not an option. Inability to source adequate power for the AI boom is an existential risk to these companies’ futures, whereas overinvesting in computing power is a relatively small price to pay given the perceived cost of falling short. As a result, these firms will do whatever it takes to find more power.
For Amazon, that meant buying a data center campus connected directly to an existing nuclear power plant in Pennsylvania just over six months ago. But by tapping an existing plant, Amazon merely redirected that plant’s zero-carbon electricity from other customers, forcing power companies to find other ways to pick up the slack. For Microsoft, securing power meant reactivating a dormant nuclear plant, but there are not many nuclear plants sitting idle and ready to be restarted in the United States. That option makes more sense in Japan and Germany, which both shut down their nuclear fleets after the tsunami-induced Fukushima nuclear disaster. Indeed, Germany’s opposition Christian Democrats just last week called for halting the dismantling of the country’s shuttered nuclear plants as it copes with high electricity prices.
Building new solar and wind power in the United States first requires overcoming siting and permitting obstacles, both for the projects themselves and for the transmission lines needed to connect them. Intermittent, unpredictable electricity generation from renewables is also ill-suited to many AI needs. Advanced nuclear technology shows promise, but such projects will take a decade or more to come online at any sort of scale, whereas tech firms need massive amounts of power in the next couple of years.
In the near term, that risks leaving natural gas as one of the few options to meet rising power demand quickly in the United States. While tech firms would prefer to burnish their green credentials, there is already evidence that climate commitments may take a back seat to the critical necessity of remaining competitive in the race for AI dominance. Indeed, Google recently dropped its claim of being carbon-neutral, and electricity demand from AI was a major driver of that decision.
These dynamics also affect where U.S. tech firms build their data centers. Although they would prefer to set them up domestically, the risk of falling short on power will win out in the end. If it proves too difficult to meet their rising electricity needs at home, they will look to energy- and capital-rich regions overseas, such as the Middle East. This in turn raises national security concerns, including the transfer of technology to adversaries and the risk of losing access to AI systems during periods of conflict or tension. If the United States does not fix its power infrastructure problems, it risks losing critical AI investment and technology to other regions.
The United States has a head start on AI infrastructure and houses far more data centers today than any other country in the world. But not only is China catching up quickly; it also has the ability to expand domestic power production, whether from renewables, nuclear power, coal, or any other source, with far fewer environmental or other constraints. The United States’ inability to substantially increase domestic power generation risks undermining its current leadership role.
The next administration should take five steps to ensure that the United States has the energy foundation necessary to power AI leadership in both the short and the long term.
First, the fastest way to meet AI’s energy needs is to reduce them in the first place. The reality is that many, if not most, queries to generative AI systems could be handled by models requiring substantially less computing power than the most advanced systems available. Routing queries more efficiently to smaller models could significantly reduce projected energy use. Moreover, advances in AI chip and data center architectures in recent years have allowed for substantial improvements in energy efficiency. For instance, cutting-edge chips from Google and Nvidia have improved performance relative to power use by factors of two to five. The administration should support research and large-scale government purchase programs that send a market signal for energy-efficient AI software and hardware systems.
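To make the routing idea concrete, the following is a minimal Python sketch. The model names, the keyword list, and the length threshold are illustrative assumptions rather than any provider’s actual system; a production router would rely on a learned classifier, real model endpoints, and measured energy data.

```python
# Minimal sketch of routing queries to a smaller model to save energy.
# Model names, keywords, and thresholds are illustrative assumptions,
# not measurements or any provider's real API.

SMALL_MODEL = "small-efficient-model"   # hypothetical lightweight model
LARGE_MODEL = "large-frontier-model"    # hypothetical frontier model

# Words that suggest a query needs deeper reasoning (assumed, not empirical).
COMPLEX_HINTS = ("prove", "derive", "analyze", "multi-step", "write code")

def route_query(query: str) -> str:
    """Pick a model for a query using a crude complexity heuristic."""
    looks_complex = len(query.split()) > 40 or any(
        hint in query.lower() for hint in COMPLEX_HINTS
    )
    return LARGE_MODEL if looks_complex else SMALL_MODEL

if __name__ == "__main__":
    queries = [
        "What year did the Three Mile Island accident happen?",
        "Analyze the grid reliability trade-offs of adding 50 gigawatts of "
        "data center load and derive a rough estimate of the firm capacity "
        "needed to back it up.",
    ]
    for q in queries:
        print(f"{route_query(q)} <- {q[:60]}...")
```

In this sketch, the short factual question goes to the smaller model while the longer analytical request goes to the frontier model; the energy savings come from serving the bulk of simple traffic on far less compute.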
Second, the administration should make it easier to build new energy infrastructure. The permitting reform bill recently negotiated by Sens. Joe Manchin and John Barrasso is a good place to start, but much more needs to be done to reform the permitting system while still respecting the need for sound environmental reviews and the rights of tribal communities. Biden’s decision last month, over the objections of environmentalists, to support a bill exempting the construction of some chip-making facilities from lengthy environmental reviews highlights the tension among these competing objectives.
Third, the current utility business model in the United States often rewards utilities financially for making capital investments in new infrastructure. Reforming it could give power companies stronger incentives not only to build new infrastructure but also to use existing infrastructure more efficiently, including by deploying batteries to store renewable energy and rewiring old transmission lines with advanced conductors that can double the amount of power they move.
Regulators should also protect existing ratepayers from bearing the costs of the additional power plants and transmission lines needed for AI. Typically, the cost of capital investments that utilities make is spread among all ratepayers, but utilities could charge firms building new data centers a higher price for their additional power requirements in order to reduce the need to recoup capital investments from the existing base of ratepayers. Tech firms could also build their own power plants disconnected from the grid.
Fourth, meeting the rising needs of AI domestically while protecting grid reliability will also require more electricity from sources that are available at all times, known as firm power, rather than from intermittent sources like wind and solar, absent significant breakthroughs in technology to store electricity. The next administration should support research, demonstration, and deployment of advanced nuclear technology, which promises to reduce costs, waste, and safety concerns, and should make it easier to permit such projects and build them at viable price points. First and foremost, this will require reforming the Nuclear Regulatory Commission. Advanced geothermal energy is another highly promising sector where research should be supported and permitting eased. Recent innovations, some of which draw on U.S. expertise in drilling for shale oil and gas, hold enormous promise to make it easier and cheaper to tap the intense heat trapped deep below the Earth’s surface for power generation and other uses.
Given the long lead times to site and permit new renewable or nuclear projects, firms may turn in the near term to building new natural gas power generation to meet AI’s need for firm power. If they do, the next administration should work with Congress to require that those plants comply with the recently finalized rules from the Environmental Protection Agency, which require that 90 percent of carbon dioxide emissions be captured by 2035. The cost of doing so would be less than the cost to society of the rise in data center carbon dioxide emissions, and far less than the enormous economic benefits tech firms expect to reap from the AI revolution. Such a requirement would put other key technologies needed for a secure and clean domestic energy future on a level playing field with gas.
Fifth, the next administration should prioritize diplomacy and engagement with partner nations to create opportunities abroad for U.S. firms in AI, accelerate the pace of AI innovation in allied countries, reduce Chinese influence on the global AI ecosystem, and protect against cybersecurity threats and illicit technology transfers. A large energy importer with limited domestic resources, such as Japan, may find it cheaper to build data centers in the United States or Canada, where affordable energy is abundant, and transmit data via subsea cables rather than ship expensive liquefied natural gas home to expand its own electricity supply. Similarly, if U.S. firms are not allowed to invest in AI capabilities in Gulf states, such as Saudi Arabia or the United Arab Emirates, those countries are likely to turn to Chinese firms for technology and investment instead.
Today, the race for AI leadership remains America’s to lose, thanks to its depth of talent, capital, academic research, and innovative companies. Yet preserving this leadership position requires making it much easier to generate and transport electricity in the United States. To maintain public support and not undermine other policy goals, the next administration should adopt a policy framework that pursues multiple objectives simultaneously: holding existing ratepayers harmless, requiring new power generation to be clean, and allowing tech firms to move quickly to meet rapidly growing power needs. Streamlining permitting, deploying next-generation clean energy systems, and strengthening cooperation with allies are necessary not only to curb the threat of climate change, but also to maintain America’s economic prosperity and national security in the coming era of AI.