AI Energy Consumption: A New Challenge

The increasing energy requirements of AI pose major challenges for data centers. In this post, we look at how sustainable solutions and new technologies can help minimize the ecological footprint and increase efficiency.


In our last blog post, "Sustainability in Data Centers: A Must in the Age of AI", we highlighted the importance of sustainable practices in data centers. Today, we would like to take a closer look at a pressing issue in the world of artificial intelligence (AI): the rapid increase in energy consumption by AI systems.

Rapid Rise in Energy Consumption

With the exponential growth of AI technology, energy requirements are rising just as rapidly. Large technology companies are investing billions in AI accelerators quarter after quarter, driving a surge in data center power consumption. In particular, the rise of generative AI and the growing demand for graphics processing units (GPUs) have forced data centers to scale from tens of thousands to more than 100,000 accelerators.

Energy Requirements per Chip Are Growing

The latest generations of AI accelerators from Nvidia and AMD, with Intel soon to follow, have brought a significant increase in energy consumption per chip. Nvidia's A100, for example, has a maximum power consumption of 250W in its PCIe variant and 400W in its SXM variant. Its successor, the H100, draws up to 75 percent more, reaching a peak power of up to 700W. Each new generation is more powerful, but it also requires more energy.
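To put these figures into perspective, a quick back-of-the-envelope calculation helps. This is only a rough sketch: the TDP values are the ones quoted above, the cluster size of 100,000 accelerators picks up the scale mentioned earlier, and the totals cover the chips alone, ignoring cooling, networking and other overhead.

```python
# Rough comparison of per-chip power draw across GPU generations,
# based on the TDP figures quoted in the text above.

TDP_WATTS = {
    "A100 PCIe": 250,
    "A100 SXM": 400,
    "H100 SXM": 700,
}

ACCELERATORS = 100_000  # cluster scale mentioned earlier in the post

def percent_increase(old_w: float, new_w: float) -> float:
    """Relative increase in power draw between two chip generations."""
    return (new_w - old_w) / old_w * 100

# A100 SXM (400W) -> H100 SXM (700W): the "75 percent more" from the text
print(f"A100 SXM -> H100 SXM: "
      f"+{percent_increase(TDP_WATTS['A100 SXM'], TDP_WATTS['H100 SXM']):.0f}%")

# What the chips alone would draw at cluster scale (no cooling or network overhead)
for name, watts in TDP_WATTS.items():
    cluster_mw = watts * ACCELERATORS / 1e6  # watts -> megawatts
    print(f"{name}: {watts} W per chip -> ~{cluster_mw:.0f} MW "
          f"for {ACCELERATORS:,} chips")
```

At this scale, the step from A100 SXM to H100 SXM alone corresponds to roughly 30 megawatts of additional demand, for the GPUs by themselves.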

Challenges and Solutions

As energy consumption rises with each new generation of GPUs, data centers face the challenge of meeting this demand efficiently. This is where innovative cooling technologies such as liquid cooling come into play, dissipating heat effectively even at high power densities.

An important step in overcoming this challenge is the increased use of renewable energy sources. In addition, leading chip manufacturers such as Taiwan Semiconductor Manufacturing Company (TSMC) are working to improve the energy efficiency of their products. TSMC's latest manufacturing processes, such as 3nm and the upcoming 2nm, promise significantly lower energy consumption at higher performance.

Forecasts show that the energy requirements of AI will continue to climb in the coming years. Morgan Stanley estimates that the global power consumption of generative AI will reach around 46 TWh in 2024, roughly three times the 2023 level. Other forecasts suggest that data centers could account for up to 25 percent of total electricity consumption in the USA by 2030.
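The arithmetic behind these forecasts can be sketched in a few lines. The 46 TWh figure and the threefold growth come from the Morgan Stanley estimate above; the figure of roughly 4,000 TWh for annual US electricity consumption is our own ballpark assumption for illustration and does not come from the source.

```python
# Sketch of the arithmetic behind the forecasts quoted above.

GENAI_2024_TWH = 46   # Morgan Stanley estimate for 2024
GROWTH_FACTOR = 3     # roughly threefold increase versus 2023

implied_2023_twh = GENAI_2024_TWH / GROWTH_FACTOR
print(f"Implied 2023 baseline: ~{implied_2023_twh:.0f} TWh")  # ~15 TWh

# ASSUMPTION: ~4,000 TWh annual US electricity consumption (ballpark, not from the source)
US_TOTAL_TWH = 4_000
SHARE_2030 = 0.25  # upper-bound forecast quoted in the text

print(f"25% of US electricity would be "
      f"~{US_TOTAL_TWH * SHARE_2030:.0f} TWh per year by 2030")
```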

Conclusion

The rapid development of AI technology brings with it enormous challenges, particularly in terms of energy consumption. As a data center operator, we see it as our duty to promote and implement sustainable solutions. However, the gigantic challenges of the AI age can only be overcome together, by the IT industry as a whole: from AI developers to chip manufacturers to data center operators.

Source: Forbes