⚛️ ⚡✨ AI’s Need for Nuclear: A Quick Q&A with … energy writer and historian Emmet Penney
'As far as we can tell, China operates its nuclear fleet safely and efficiently. And, it bears repeating, they’re just better at building nuclear than we are right now.'
My fellow pro-growth/progress/abundance Up Wingers,
It’s well known that AI is an extremely energy-hungry technology. From Emmet Penney’s article, “To Win A.I., Go Nuclear”:
Data centers require a lot of electricity, so we will need to build many new power plants to maintain our competitiveness.
This is why, even before the challenge posed by DeepSeek, an AI boom in the United States was far from a given. Misguided energy policy has reduced supply and left the grid increasingly unreliable over the past several decades. The new administration has the opportunity for a course correction, but it will need to do more than slash wind and solar subsidies and other green initiatives: It will have to hybridize the Republican Party’s traditional penchant for slashing red tape with a muscular industrial policy dedicated to strengthening America’s energy system.
I asked Penney a few quick questions about what it will take to power an American victory in today’s AI race.
Penney is a senior fellow at the Foundation for American Innovation. He is also a contributing editor at Compact Magazine and is the author of his own Substack, Nuclear Barbarians.
1/ What makes a ChatGPT search so energy intensive?
This is a great question. Let’s start with how energy intensive a typical ChatGPT query is. Compared to a Google search (0.3 watt-hours of electricity), running a single ChatGPT query requires roughly an order of magnitude more energy (2.9 watt-hours).
The reason for this difference is the complexity of the task being performed. What we ask of ChatGPT is fundamentally more complex than what we ask of a Google search. A Google search is like a vastly upgraded card-catalog search through the digital library of the world; querying ChatGPT, on the other hand, means synthesizing all of that material and producing bespoke answers or images. Thus, ChatGPT recruits more computing power to execute its tasks, which demands more chips, more cooling, more power. There’s actually a great in-depth look at this from a professional who’s been around the data center industry since the 1990s, for anyone who wants to drill further down.
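To make that order-of-magnitude gap concrete, here is a quick back-of-the-envelope sketch in Python. The per-query figures are the ones cited above; the daily query volume is a purely hypothetical number chosen for illustration, not a real estimate.

```python
# Back-of-the-envelope scaling of the per-query figures above.
# The 0.3 Wh and 2.9 Wh numbers come from the answer above; the daily
# query volume is an illustrative assumption, not a real estimate.

GOOGLE_WH_PER_QUERY = 0.3        # watt-hours per Google search
CHATGPT_WH_PER_QUERY = 2.9       # watt-hours per ChatGPT query
QUERIES_PER_DAY = 1_000_000_000  # hypothetical: one billion queries a day

def daily_energy_mwh(wh_per_query: float, queries: int) -> float:
    """Convert per-query watt-hours into total megawatt-hours per day."""
    return wh_per_query * queries / 1_000_000  # 1 MWh = 1,000,000 Wh

google_mwh = daily_energy_mwh(GOOGLE_WH_PER_QUERY, QUERIES_PER_DAY)
chatgpt_mwh = daily_energy_mwh(CHATGPT_WH_PER_QUERY, QUERIES_PER_DAY)

print(f"Google-style search: {google_mwh:,.0f} MWh/day")   # ~300 MWh/day
print(f"ChatGPT-style query: {chatgpt_mwh:,.0f} MWh/day")  # ~2,900 MWh/day
print(f"Ratio: {chatgpt_mwh / google_mwh:.1f}x")            # ~10x
```

At the same query volume, the AI workload draws roughly ten times the energy, which is the gap driving the new demand for power plants.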
2/ How do we see China looking to power their AI ambitions? Are they going nuclear, and are they doing it responsibly?
China has become an energy powerhouse. They’re building everything. Coal, solar, wind, nuclear, you name it! They want to be as energy self-sufficient as possible and they want to grow. So, they aren’t betting on a single technology to power their AI needs, but they, like the Russians, are absolutely eating our lunch on new nuclear builds.
Unlike the Russians, however, China is building mostly domestically. They currently have roughly 30 nuclear plants under construction, more than the rest of the world combined. In addition to those 30, China recently commissioned 10 more plants for around $27 billion. For reference, our most recent nuclear reactors, two AP-1000 units in Georgia, cost $35 billion and took 15 years to complete.
More shameful still, China has been cranking out reactors based on our own AP-1000 design faster and cheaper than we can. With a goal of 200 gigawatts of nuclear capacity by 2040, China aims to build a fleet roughly double the size of ours, which is currently the largest in the world.
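Using just the headline numbers quoted above, the per-reactor cost gap is stark. A rough sketch (the totals are approximate and not like-for-like lifecycle costs, so treat this as order-of-magnitude only):

```python
# Rough per-reactor cost comparison using the figures quoted above.
# Both totals are approximate headline numbers, not like-for-like
# lifecycle costs; this is an order-of-magnitude sketch only.

china_total_usd = 27e9   # ~$27 billion for 10 newly commissioned plants
china_reactors = 10

vogtle_total_usd = 35e9  # ~$35 billion for the two AP-1000 units in Georgia
vogtle_reactors = 2

china_per_reactor = china_total_usd / china_reactors     # ~$2.7 billion each
vogtle_per_reactor = vogtle_total_usd / vogtle_reactors  # ~$17.5 billion each

print(f"China:   ${china_per_reactor / 1e9:.1f}B per reactor")
print(f"Georgia: ${vogtle_per_reactor / 1e9:.1f}B per reactor")
print(f"Rough cost gap: {vogtle_per_reactor / china_per_reactor:.1f}x")  # ~6.5x
```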
As far as we can tell, China operates its nuclear fleet safely and efficiently. And, it bears repeating, they’re just better at building nuclear than we are right now.
3/ Can you explain concretely why greater chip efficiency is ironically more likely to increase consumption?
So, it’s not necessarily chip efficiency, but the efficiency of AI’s power consumption, that’s increasing its power demand. The 19th-century economist William Stanley Jevons identified this paradox (thus it bears his name) as Britain’s coal usage began to explode. Jevons noticed that as coal-fired steam engines became more efficient, coal consumption increased apace. In short, instead of efficiency curtailing consumption, it expanded it: efficiency drove down costs, which then expanded use cases.
But Jevons’s paradox can only bear out if AI’s use cases and applications are expansive enough. In other words, insofar as AI’s usefulness is less than infinite, there will be a plateau in its deployment and thus a plateau in its energy consumption. We are stalking this plateau with no clear picture of when we will find it, but we will find it eventually.
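To see how the paradox, and the eventual plateau, might play out, here is a small toy model in Python. Every number in it (the demand elasticity, the saturation ceiling, the baseline query volume) is an invented assumption for illustration; only the 2.9 Wh per-query figure comes from earlier in this piece.

```python
# Toy illustration of the Jevons paradox described above, with a
# demand plateau. All numbers (efficiency gains, demand response,
# saturation point) are made up for illustration; this is not a
# forecast of actual AI energy use.

def total_energy_wh(energy_per_query_wh: float, queries: float) -> float:
    return energy_per_query_wh * queries

def demand(queries_base: float, cost_ratio: float,
           elasticity: float, saturation: float) -> float:
    """Queries grow as per-query cost falls, but cannot exceed saturation."""
    return min(queries_base * cost_ratio ** (-elasticity), saturation)

base_energy = 2.9    # Wh per query today (figure cited earlier)
base_queries = 1e9   # hypothetical baseline queries per day
elasticity = 1.5     # assumed: demand grows faster than cost falls
saturation = 50e9    # assumed ceiling on useful queries per day

for efficiency_gain in (1, 2, 4, 8, 16, 32):
    energy_per_query = base_energy / efficiency_gain
    cost_ratio = 1 / efficiency_gain  # cheaper queries as hardware improves
    q = demand(base_queries, cost_ratio, elasticity, saturation)
    print(f"{efficiency_gain:>2}x more efficient: "
          f"{total_energy_wh(energy_per_query, q) / 1e6:,.0f} MWh/day")
```

In this toy run, total energy use keeps rising with each efficiency gain as long as demand is elastic; only once demand saturates do further efficiency gains actually cut consumption, which is the plateau described above.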
4/ What role will SMRs play relative to large-scale reactors in the age of AI?
Here’s the theory of the case: small modular reactors (SMRs) will be cheaper and faster to deploy than large nuclear units for a few reasons:
1. They’re smaller.
2. They’re modular, which means their component parts will be cranked out on an assembly line in a factory, then trucked to the construction site and snapped together. This, the theory goes, will be a simpler, faster way to deploy nuclear.
3. Thanks to 1 and 2, SMR designs can be churned out rapidly, unlocking economies of repetition that drive down delivery costs.
Thus, if data centers require off-grid power, SMRs can be deployed on site and additional units added as a data center’s needs grow. So, SMRs seem like the perfect clean, firm energy provider for the AI boom.
Here’s the catch: we’re not sure if the theory holds. Certainly, we all hope it does. Technically, the AP-1000s in Georgia were modular, even though those reactors are far larger than SMRs. Westinghouse, the company building those units, quickly found that modularity could only be accomplished at great expense and by locating the forge for the modular components at the work site. The “modularity” aspect of SMRs offers the steepest challenge to their success.
And then there’s the price of power. Admiral Hyman Rickover, the father of both the Navy’s nuclear submarine fleet and our civilian fleet, picked the light-water reactor (LWR) design for several reasons. First, using water as the reactor coolant is (comparatively) safe and simple. Second, even though LWRs may cost more upfront, they last longer and deliver cheaper power. With more exotic SMR designs that use molten salt or some other coolant, the reactor may well be deployed more quickly, but the price of the power produced, the longevity of the plant, and the challenges of operating it may all compare unfavorably with traditional nuclear.
But maybe not! We’ll have a better idea over the next decade or so.
5/ How should we interpret the Trump administration’s pro-nuclear stance, given that Trump has been a vocal supporter of coal?
President Trump and Energy Secretary Chris Wright have been very clear that they support a wide variety of power sources so long as they’re dispatchable and reliable. The US is currently facing a power supply crisis, so working to keep coal plants online while stoking the nuclear renaissance is not in tension; it is simply prudent.
On sale everywhere The Conservative Futurist: How To Create the Sci-Fi World We Were Promised