
By EPN Staff

American energy demand is expected to rise sharply in coming years, driven in part by the new computation and storage needs of artificial intelligence and data centers. A recent study from Duke University’s Nicholas School of the Environment suggests that existing infrastructure can absorb much of this growth without massive new generation investments – if data centers curtail their electricity use when demand on the rest of the grid peaks.

“The findings highlight a significant opportunity: nearly 100 GW of large new loads could be integrated with minimal impact, supporting economic growth while maintaining grid reliability and affordability,” Duke researchers said in their report.

The average curtailment would last about two hours, the study found, a duration within the capability of current battery technology to keep a data center running.

“Our system is prepared to serve that highest peak,” Dalia Patiño-Echeverri, a report co-author, told North Carolina’s WUNC. “That means the rest of the year, the system is underutilized.”

Why it matters

Policymakers, utility companies and tech companies are grappling with how much demand growth the United States will see – following some two decades of nearly flat demand – and how to meet it. The Duke University study suggests the challenge is far more surmountable than feared, even if high-end demand predictions prove correct.

The potential impact is massive. Varun Sivaram, a senior fellow for energy and climate at the Council on Foreign Relations, who was not involved in the report, told Latitude Media that the findings were “seismic.”

“For this minimal amount of flexibility, existing U.S. regional power grid infrastructure and power plant capacity could support FOUR Project Stargates, or more data center capacity than the entire current U.S. nuclear fleet of 94 reactors can supply today,” Sivaram said.

The bigger picture

Data centers don’t have to go offline to make this work – they can switch to battery power or simply reduce consumption by prioritizing only the most important tasks.

“You might only need to curtail 10% or 20% or 50% [of a data center] because that’s all the flexibility you need right to stay below the existing system peak,” Tyler Norris, another report co-author, told Latitude Media.

This isn’t a new idea. Google has a demand response pilot program that “generates hour-by-hour instructions for specified data centers to limit non-urgent compute tasks for the duration” of severe weather events or other events that strain the power grid.

The Electric Power Research Institute’s DCFlex project is testing a similar “flexibility hubs” concept in a partnership with major companies including Google, Meta and Nvidia. That project announced in February that it was going international, adding European partners to the U.S.-based group.

Duke University’s study says the curtailment time needed is “comparable to existing demand response programs already in place” and that its 100-GW estimate “assumes that these new loads would be curtailed an average of 0.5 percent of their maximum uptime each year to help the grid meet peak demand, such as on extremely hot or cold days.”
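For context, that 0.5 percent figure works out to roughly 44 hours of curtailment per year. A minimal back-of-the-envelope sketch (the per-event breakdown is our own illustration, not from the report):

```python
# Rough arithmetic behind the study's 0.5% annual curtailment figure.
# Event counts below are illustrative assumptions, not from the report.
HOURS_PER_YEAR = 8760
curtailment_share = 0.005  # 0.5% of maximum annual uptime

curtailed_hours = HOURS_PER_YEAR * curtailment_share
print(f"Curtailed hours per year: {curtailed_hours:.1f}")  # 43.8

# The study found curtailment events average about two hours.
avg_event_hours = 2
events_per_year = curtailed_hours / avg_event_hours
print(f"Implied curtailment events per year: about {events_per_year:.0f}")  # about 22
```

In other words, the flexibility being asked of data centers amounts to a couple of dozen short interruptions a year, on the hottest and coldest days.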

Additional context

Data center projects have an incentive to adopt this sort of flexibility: It might mean they get connected to the grid faster – something that’s getting harder to do as demand spikes.

For example, in recent years Virginia’s Dominion Energy has slowed new data center connections in Northern Virginia, one of the country’s largest data center hubs, because of demand concerns.

Projects that promise some flexibility could jump the line because they make the overall grid more efficient.