AI is “sucking up” the world's electricity, and worse is yet to come

In recent years, the rise of artificial intelligence (AI) has sparked widespread discussion and concern. Many people worry that AI will cause unemployment to soar, while some optimists joke that “as long as electricity is more expensive than steamed buns, AI will never completely replace humans.”

Although this is a joke, it points to a real problem: AI's energy consumption. More and more people worry that high energy consumption will become a bottleneck restricting AI's development. Not long ago, technology entrepreneur and former Google engineer Kyle Corbitt said on the social media platform X that Microsoft has already run into exactly this difficulty.


How much electricity does AI consume?

Corbitt said that Microsoft engineers training GPT-6 are busy building an InfiniBand (IB) network to connect GPUs distributed across different regions. The job is difficult, but they have no choice: if more than 100,000 H100 chips were deployed in a single region, the local power grid would collapse.

Why would the concentration of these chips lead to the collapse of the power grid? Let's do a simple calculation.

Data published on NVIDIA's website shows that each H100 chip has a peak power of 700 W, so 100,000 H100 chips have a combined peak power of 70 megawatts (70 million W). An energy-industry practitioner in the comments on X pointed out that this is roughly the entire output of a small solar or wind power plant. On top of that, the energy consumed by the supporting facilities for so many chips, including servers and cooling equipment, must also be taken into account. With so much power-hungry equipment concentrated in a small area, one can imagine the pressure it puts on the power grid.
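To make the arithmetic explicit, here is a minimal back-of-the-envelope sketch. It uses only the 700 W peak figure quoted above; the 1.5x overhead multiplier for servers, networking, and cooling is an illustrative assumption, not a measured value.

```python
# Back-of-the-envelope estimate of the peak electrical load of a 100,000-GPU H100 cluster.
# 700 W per chip is NVIDIA's published peak figure; the 1.5x overhead factor for servers,
# networking, and cooling is an illustrative assumption, not a measured value.

H100_PEAK_W = 700          # peak power of one H100 GPU, in watts
NUM_CHIPS = 100_000        # cluster size discussed in the article
OVERHEAD_FACTOR = 1.5      # assumed multiplier for supporting equipment (hypothetical)

gpu_peak_mw = H100_PEAK_W * NUM_CHIPS / 1e6        # watts -> megawatts
total_peak_mw = gpu_peak_mw * OVERHEAD_FACTOR

print(f"GPUs alone:    {gpu_peak_mw:.0f} MW")      # 70 MW
print(f"With overhead: {total_peak_mw:.0f} MW")    # ~105 MW under the assumed factor
```

Even the GPU-only figure of 70 MW is on the order of a small power plant's output, which is why concentrating it all in one region strains the local grid.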


AI's power consumption: just the tip of the iceberg

On the issue of AI energy consumption, a report in The New Yorker attracted widespread attention. The report estimated that ChatGPT may consume more than 500,000 kilowatt-hours of electricity per day.

In fact, although AI's current power consumption sounds astronomical, it is still far less than that of cryptocurrency and traditional data centers. The difficulties Microsoft's engineers ran into also show that what restricts AI's development is not just the energy consumption of the technology itself, but also the energy consumption of supporting infrastructure and the carrying capacity of the power grid.

A report released by the International Energy Agency (IEA) shows that in 2022, global data centers, artificial intelligence, and cryptocurrency together consumed 460 TWh of electricity, nearly 2% of global electricity demand. The IEA predicts that in the worst-case scenario, electricity consumption in these areas will reach 1,000 TWh by 2026, roughly equivalent to the electricity consumption of all of Japan.

However, the report also shows that the energy currently consumed directly by AI research and development is still far lower than that of data centers and cryptocurrency.

NVIDIA holds about 95% of the AI server market. It supplied about 100,000 chips in 2023, which consume about 7.3 TWh of electricity annually. By comparison, cryptocurrency consumed 110 TWh in 2022, equivalent to the electricity consumption of the entire Netherlands.
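Putting the figures quoted above side by side makes the gap concrete. The snippet below is plain arithmetic on the numbers reported in the text (7.3 TWh for NVIDIA's 2023 AI shipments, 110 TWh for cryptocurrency in 2022, and 460 TWh for data centers, AI, and cryptocurrency combined in 2022):

```python
# Compare the annual consumption figures quoted in the text (all in TWh).
figures_twh = {
    "AI hardware shipped by NVIDIA in 2023": 7.3,
    "Cryptocurrency (2022)": 110,
    "Data centers + AI + crypto combined (2022)": 460,
}

total = figures_twh["Data centers + AI + crypto combined (2022)"]
for name, twh in figures_twh.items():
    print(f"{name}: {twh} TWh (~{100 * twh / total:.1f}% of the 460 TWh total)")
```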

Legend: Estimated energy consumption of traditional data centers, cryptocurrency, and AI data centers in 2022 and 2026 (bars stacked from bottom to top). AI's current power consumption is much lower than that of traditional data centers and cryptocurrency. Image source: IEA

Cooling energy consumption cannot be ignored

A data center's energy efficiency is usually evaluated by its power usage effectiveness (PUE), the ratio of all the energy the facility consumes to the energy consumed by its IT load. The closer the PUE is to 1, the less energy the data center wastes.

According to a report released by the Uptime Institute, a data center standards organization, the average PUE of large data centers worldwide in 2020 was approximately 1.59. In other words, for every 1 kilowatt-hour of electricity consumed by a data center's IT equipment, its supporting equipment consumed another 0.59 kilowatt-hours.
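As a minimal illustration of the definition, the sketch below computes the overhead implied by that 1.59 average; the monthly IT-load figure is made up purely for the example.

```python
# PUE (power usage effectiveness) = total facility energy / IT equipment energy.
# The closer PUE is to 1.0, the less energy goes to cooling, power conversion, and other overhead.

def pue(total_kwh: float, it_kwh: float) -> float:
    """Return the power usage effectiveness of a facility."""
    return total_kwh / it_kwh

it_load_kwh = 1_000_000      # hypothetical IT load for one month, in kWh
average_pue = 1.59           # 2020 global average for large data centers (Uptime Institute)

total_kwh = it_load_kwh * average_pue
overhead_kwh = total_kwh - it_load_kwh

print(f"PUE check:       {pue(total_kwh, it_load_kwh):.2f}")   # 1.59
print(f"Overhead energy: {overhead_kwh:,.0f} kWh")             # 0.59 kWh per kWh of IT load
```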

Most of a data center's additional energy consumption goes to the cooling system. One survey study found that the cooling system can account for 40% of a data center's total energy consumption.

In recent years, as chips have been upgraded, the power of individual devices has increased and the power density of data centers (power consumption per unit area) has kept rising, placing higher demands on heat dissipation. At the same time, better data center design can significantly reduce this wasted energy.

Because of differences in cooling systems, structural design, and other factors, PUE varies widely between data centers. The Uptime Institute report shows that European countries have brought their average down to 1.46, while more than one tenth of data centers in the Asia-Pacific region still have a PUE above 2.19.

Countries around the world are taking measures to push data centers toward energy-saving and emission-reduction goals. The European Union requires large data centers to install waste-heat recovery equipment; the U.S. government is investing in the development of more energy-efficient semiconductors; and the Chinese government has introduced measures requiring data centers to keep their PUE no higher than 1.3 from 2025 onward, and to raise the share of renewable energy they use year by year, reaching 100% by 2032.

Legend: PUE of large data centers around the world in 2020. From left to right: Africa, Asia-Pacific, Europe, Latin America, the Middle East, Russia and CIS countries, the United States and Canada. Image source: Uptime Institute

Tech companies and electricity: cutting consumption is hard, finding new supply is harder

With the growth of cryptocurrency and AI, major technology companies' data centers keep expanding. According to the International Energy Agency (IEA), the United States had 2,700 data centers in 2022, consuming 4% of the country's electricity, and the IEA predicts this share will reach 6% by 2026. As land becomes increasingly scarce on the U.S. east and west coasts, data centers are gradually moving to central states such as Iowa and Ohio. But industry in these second-tier areas is less developed, and the local power supply may not be able to meet demand.

Some technology companies have tried to bypass the grid and buy electricity directly from small nuclear power plants, but both this way of purchasing power and the construction of new nuclear plants involve complex administrative approval processes. Microsoft is trying to use AI to help complete those applications, while Google is using AI to schedule computing tasks, improving grid efficiency and reducing the company's carbon emissions. As for when controllable nuclear fusion will be put into use, that remains unknown.
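The article does not describe how Google's scheduling works internally; the sketch below only illustrates the general idea of carbon-aware scheduling, in which a flexible batch job is routed to wherever the grid is currently cleanest. The region names and carbon-intensity values are entirely hypothetical.

```python
# Toy illustration of carbon-aware scheduling: run a flexible (non-urgent) batch job
# in whichever region currently reports the lowest grid carbon intensity.
# Region names and gCO2/kWh values below are hypothetical, for illustration only.

from typing import Dict

def pick_region(carbon_intensity: Dict[str, float]) -> str:
    """Return the region with the lowest reported carbon intensity (gCO2 per kWh)."""
    return min(carbon_intensity, key=carbon_intensity.get)

snapshot = {
    "region-a (windy night)": 120.0,
    "region-b (gas-heavy evening peak)": 450.0,
    "region-c (sunny afternoon, abundant solar)": 90.0,
}

print(f"Schedule the flexible batch job in: {pick_region(snapshot)}")
```

In practice such systems also weigh latency, data locality, and grid congestion, but the core idea is simply to shift flexible load toward times and places where electricity is cleaner and the grid is less stressed.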

Climate warming is making things worse

AI research and development requires stable and strong power grids, but as extreme weather becomes more frequent, grids in many areas are becoming more fragile. Climate warming leads to more frequent extreme weather events, which not only drive surges in electricity demand and add to the burden on the grid, but also directly damage grid facilities. The IEA report pointed out that because of drought, insufficient rainfall, and early snowmelt, the capacity factor of global hydropower fell below 40% in 2023, its lowest level in thirty years.

Natural gas is often seen as a bridge fuel in the transition to renewable energy, but it is not reliable in extreme winter weather. In 2021, a cold wave hit Texas and caused widespread power outages; some residents went without power for more than 70 hours. One of the main causes of the disaster was frozen natural gas pipelines, which shut down gas-fired power plants. The North American Electric Reliability Corporation (NERC) predicts that from 2024 to 2028, more than 3 million people in the United States and Canada will face a growing risk of power outages.

To ensure energy security while still saving energy and cutting emissions, many countries also regard nuclear power as a transitional measure. At the 28th United Nations Climate Change Conference (COP28) in December 2023, 22 countries signed a joint declaration committing to triple nuclear power generation capacity, relative to 2020 levels, by 2050. Meanwhile, as China, India, and other countries push ahead with nuclear construction, the IEA predicts that global nuclear power generation will reach a record high by 2025.

The IEA report pointed out: “In the face of changing climate patterns, it will become increasingly important to diversify energy sources, enhance the grid's cross-regional dispatch capability, and adopt more resilient methods of power generation.” Securing power grid infrastructure is not only essential to the development of AI technology; it also bears on the wider economy and people's daily lives.

References

[1] Kyle Corbitt.

[2] IEA (2024). Electricity 2024. IEA, Paris. https://www.iea.org/reports/electricity-2024. License: CC BY 4.0.

[3] Andy Lawrence. Which regions have the most energy efficient data centers? Uptime Institute. https://www.datacenterdynamics.com/en/opinions/which-regions-have-most-energy-efficient-data-centers/. <2020-08-04/2024-04-10>.

[4] Zhang, Xiaojing, Theresa Lindberg, Naixue Xiong, Valeriy Vyatkin, and Arash Mousavi. "Cooling energy consumption investigation of data center IT room with vertically placed server." Energy Procedia 105 (2017): 2047-2052.

[5] Evan Halper. Amid explosive demand, America is running out of power. Washington Post. https://www.washingtonpost.com/business/2024/03/07/ai-data-centers-power/. <2024-03-07/2024-04-09>.

[6] Jeremy Hsu. US grid vulnerable to power outages due to its reliance on gas. New Scientist. https://www.newscientist.com/article/2411905-us-grid-vulnerable-to-power-outages-due-to-its-reliance-on-gas/. <2024-01-11/2024-04-09>.

[7] Jeremy Hsu. Much of North America may face electricity shortages starting in 2024. New Scientist. https://www.newscientist.com/article/2409679-much-of-north-america-may-face-electricity-shortages-starting-in-2024. <2023-12-23/2024-04-09>.
