How AI is Impacting Data Center Efficiency

November 19, 2024

Industry insight

Data center growth is on the rise. In the U.S. alone, data center demand is expected to increase by roughly 10% annually through 2030 [1], and the global data center market is projected to grow at a similarly rapid pace over the next decade.
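As a quick back-of-the-envelope illustration (a hypothetical Python sketch, not a figure from the McKinsey report), compounding a 10% annual rate from an assumed 2024 baseline implies demand roughly 1.8 times today’s level by 2030:

    # Hypothetical illustration: compound the projected ~10% annual growth
    # rate from a 2024 baseline (assumed start year) out to 2030.
    BASELINE_YEAR = 2024
    TARGET_YEAR = 2030
    ANNUAL_GROWTH = 0.10  # 10% per year, per the projection cited above

    years = TARGET_YEAR - BASELINE_YEAR
    multiplier = (1 + ANNUAL_GROWTH) ** years
    print(f"Demand multiple by {TARGET_YEAR}: {multiplier:.2f}x")  # ~1.77x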

The increase in the number of data centers needed to handle the data generated and processed by artificial intelligence (AI) means the IT landscape is likely to look very different by 2030, and much of this shift will center on the amount of energy required to power AI computing.

This dramatic growth is driven by the data processing and storage capacity required to support global AI use, and it presents some unique challenges for data center operators, including:

  • The right IT racks, cooling systems, and power distribution systems to efficiently handle such a rapid increase in processing, particularly as AI data center deployments become more variable
  • The physical footprint required to build new data centers closer to the edge, where AI computing can be extremely valuable in reducing the latency of cloud applications
  • A sustainable model of energy usage that can adequately power AI data centers without an outsized environmental footprint

These challenges are already shifting what data centers look like, how they operate, and what network engineers need to know when specifying solutions to overcome them, so that companies can leverage AI computing to its fullest potential.

How AI is changing the data center landscape

AI is changing more than the amount and type of information data centers process; it is also changing how data centers are built and the IT solutions necessary to harness its power. For example, the average data center today ranges from 5,000 to 10,000 square feet, and the need for additional IT equipment to handle escalating data processing workloads is reshaping the data center landscape in a couple of key ways.

The emergence of hyperscale data centers, which can approach 100,000 square feet in size, provides the kind of flexibility and scalability required to process and store high volumes of data. Because hyperscale facilities are built for adaptability, they are well suited to handle AI’s increased data workloads.

In addition, the growing number of server racks inside a data center demands more effective and efficient cooling. The challenge is addressing thermal loads in a targeted way rather than simply installing additional cooling units, because space is limited inside the high-density data centers common in AI computing. As such, data center operators need to approach IT cooling strategically to reduce waste and increase operational efficiency.
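To make "targeted" concrete, here is a minimal Python sketch with illustrative values (the rack wattage and facility figures below are assumptions, not Rittal data). Because virtually all of the electrical power drawn by IT equipment is dissipated as heat, the required cooling capacity tracks rack power draw, and the industry’s PUE metric captures how much overhead that cooling adds:

    # Minimal cooling-load sketch; rack and facility figures are illustrative.
    # Nearly all power drawn by IT equipment ends up as heat, so a rack's
    # required cooling capacity roughly equals its electrical load.
    KW_TO_BTU_PER_HR = 3412.14  # 1 kW of heat is about 3,412 BTU/hr

    def rack_cooling_btu_per_hr(rack_power_kw: float) -> float:
        """Approximate cooling load for a rack drawing rack_power_kw."""
        return rack_power_kw * KW_TO_BTU_PER_HR

    def pue(total_facility_kw: float, it_load_kw: float) -> float:
        """Power Usage Effectiveness: total facility power / IT power.
        1.0 is the theoretical ideal; everything above it is overhead."""
        return total_facility_kw / it_load_kw

    # A dense AI training rack can draw tens of kW (40 kW assumed here).
    print(f"{rack_cooling_btu_per_hr(40):,.0f} BTU/hr to cool a 40 kW rack")
    print(f"PUE: {pue(total_facility_kw=1300, it_load_kw=1000):.2f}")  # 1.30

Targeted approaches such as in-row or rack-level cooling attack that overhead term directly, placing cooling capacity at the hottest racks instead of overcooling the whole room, which pushes PUE toward 1.0 without adding floor space.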

While these are two of the biggest ways AI is changing the look and size of today’s data centers, they are not the only ones, and there is much that data center operators need to know when creating IT infrastructure that efficiently supports AI computing.

Rittal’s new white paper on how AI is impacting data center efficiency addresses these challenges with actionable steps to build data centers that are ideal for AI computing.

Download the white paper to learn more
 

[1] Investing in the Rising Data Center Economy, McKinsey & Company, January 17, 2023.