US data-center power use could nearly triple by 2028, DOE-backed...

In three years, data centers could account for 6.7%-12% of US total electricity consumption

Between 2017 and 2023, data-center power demand more than doubled with rollout of more AI servers

AI requires increasingly powerful chips and intense cooling systems, driving energy demand growth

(Adds report details, research lead quotes starting in paragraph 6)

By Laila Kearney

NEW YORK, Dec 20 (Reuters) - U.S. data center power demand could nearly triple in the next three years, and consume as much as 12% of the country's electricity, as the industry undergoes an artificial-intelligence transformation, according to an unpublished Department of Energy-backed report seen by Reuters.

The Lawrence Berkeley National Laboratory report, which is expected to be released on Friday, comes as the U.S. power industry and government agencies attempt to understand how the sudden rise of Big Tech's data-center demand will affect electrical grids, power bills and the climate.

By 2028, data-center power demand could reach between 74 and 132 gigawatts, equivalent to between 6.7% and 12% of total U.S. electricity consumption, according to the Berkeley Lab report.

The industry standard-setting report included ranges that depend partly on the availability of, and demand for, a type of AI chip known as the GPU, or graphics processing unit. Currently, data centers account for a little more than 4% of the country's power load.

"This really signals to us where the frontier is in terms of growing energy demand in the U.S.," said Avi Shultz, director of the DOE's Industrial Efficiency and Decarbonization Office.

Data-center power demand was mostly flat from the early- to mid-2010s despite significant industry growth, after computational efficiencies and a shift away from small dispersed centers to large cloud-based sites slashed the need for energy-sapping information technology equipment and overhead.

Starting in 2017, the deployment of GPU-accelerated servers led to a more than doubling of the sector's power use over a six-year period, the report said.

AI, which requires increasingly powerful chips and intense cooling systems, is the primary driver for the projected data-center growth.

When the last report was released in 2016, AI servers in data centers accounted for about 2% of total server energy use.

"There's so much difference in the industry between when we looked at it in 2016 and when we look at it now," said the report's lead researcher, Arman Shehabi, a staff scientist at Berkeley Lab.

Shehabi and his team of researchers recommend publishing the report annually, or biannually, to more closely track data-center trends.

Estimates in the report are based on calculations of electricity use from installed GPUs and other data-center IT equipment, drawing on publicly available information, data from market-research firms and reviews by power-sector and data-center executives.
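
As a rough illustration of that kind of bottom-up approach, the sketch below estimates annual electricity use from counts of installed equipment. It is not the report's model; every input figure (unit counts, per-unit power draw, utilization, overhead factor) is an assumed placeholder rather than a value from the report.

    # Hypothetical bottom-up estimate of annual data-center electricity use,
    # in the spirit of an equipment-based methodology. All figures below are
    # illustrative placeholders, not values from the Berkeley Lab report.

    HOURS_PER_YEAR = 8760

    def annual_twh(installed_units, avg_power_kw, utilization, pue):
        """Annual site electricity use in terawatt-hours.

        installed_units: number of servers/accelerators of this class
        avg_power_kw:    average draw per unit at typical load, in kilowatts
        utilization:     fraction of the year the units are powered and loaded
        pue:             power usage effectiveness (site power / IT power),
                         capturing cooling and other overhead
        """
        it_kwh = installed_units * avg_power_kw * utilization * HOURS_PER_YEAR
        site_kwh = it_kwh * pue
        return site_kwh / 1e9  # kWh -> TWh

    # Illustrative inputs (assumed, not sourced from the report):
    gpu_servers = annual_twh(2_000_000, avg_power_kw=6.0, utilization=0.7, pue=1.3)
    other_it = annual_twh(15_000_000, avg_power_kw=0.5, utilization=0.5, pue=1.5)

    print(f"GPU-accelerated servers: ~{gpu_servers:.0f} TWh/yr")
    print(f"Other IT equipment:      ~{other_it:.0f} TWh/yr")
    print(f"Total (illustrative):    ~{gpu_servers + other_it:.0f} TWh/yr")

Run as written, the placeholder inputs yield a total on the order of 150 TWh a year; the point is the structure of the calculation, not the numbers.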

Researchers recommended strategies to increase transparency around the industry, including collecting more information from data centers, reported anonymously, about electricity use and about details such as compute capability and workload type that are now considered proprietary.

"By showing what the energy use is and, more importantly, what's causing the growth in energy use, it helps us think about what opportunities there are for efficiencies," Shehabi said.

The report also calls for further research into, and development of, energy-efficiency strategies for the country's booming AI data centers.