Navigating industry challenges and opportunities in the age of generative AI


Mark Kidd takes a look at how Generative AI is reshaping the data center landscape — unveiling challenges in power consumption, e-waste and sustainability.

December 12, 2023 · 7 mins

This year has marked a profound shift with the emergence of Generative AI, changing the ways we live and do business. Large Language Models (LLMs) have captured the popular imagination, showcasing the potential to automate and enhance various aspects of society.

In the whirlwind of excitement surrounding Generative AI, businesses are feverishly adapting their models to accommodate this transformative technology. That includes the data center industry.

Generative AI is driving an industry-wide data center design revolution fueled by sustainability. Smart data center users undertaking AI investment should be aware of this and start planning for it now.

AI's voracious appetite

This revolutionary technology is ushering in a significant transformation in the requirements for data centers. Generative AI, known for its voracious power consumption, requires substantial computing, network, and storage infrastructure, often categorized under High-Performance Computing.

Addressing the insatiable power demands of AI requires substantial upgrades to data center infrastructure. The challenges of capacity and power loom large over the data center industry today.

Tailoring facilities to meet the specific needs of these configurations involves considerations such as high-density power, modular architecture, high-bandwidth connectivity for both training and inference, and advanced cooling systems.

At Iron Mountain Data Centers, we have seen this first-hand with our customers and have developed specialist facilities that meet their needs. Two notable examples from the healthcare and research sectors exemplify the transformative potential of generative AI.

One of our healthcare clients developed a supercomputer for AI-driven imaging applications. The data growth of this supercomputer has been exponential since its inception in 2018, showcasing the rapid evolution of AI in the medical field.

In research, Arizona State University's Computational Research Accelerator department faced challenges with their existing supercomputer, 'Agave,' so they built a new supercomputer called ‘Sol’ in 2022 in one of IMDC’s Phoenix data centers.

Sol is a Dell-built system spanning 178 nodes, using AMD Epyc 7713 CPUs (around 18,000 cores in total); most nodes carry 512GB of memory, with five large-memory nodes equipped with 2TB. It has 56 GPU nodes with four Nvidia A100 (80GB) GPUs each and four nodes with three Nvidia A30 (24GB) GPUs. The system is networked with Nvidia’s 200Gb/s HDR InfiniBand and supported by 4 petabytes of Dell BeeGFS scratch storage. Sol’s R&D potential is extremely exciting, and a steep physical growth curve is anticipated.
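To get a sense of Sol's accelerator footprint, the published node counts can be tallied directly. This is a simple back-of-the-envelope sketch using only the figures quoted above (node counts and per-GPU memory), not official ASU or Dell documentation:

```python
# Tally Sol's GPU inventory from the published node counts.
a100_nodes, a100_per_node, a100_mem_gb = 56, 4, 80  # 56 nodes, 4x A100 80GB each
a30_nodes, a30_per_node, a30_mem_gb = 4, 3, 24      # 4 nodes, 3x A30 24GB each

a100_total = a100_nodes * a100_per_node             # 224 A100 GPUs
a30_total = a30_nodes * a30_per_node                # 12 A30 GPUs
gpu_mem_tb = (a100_total * a100_mem_gb + a30_total * a30_mem_gb) / 1024

print(a100_total, a30_total, round(gpu_mem_tb, 1))  # 224 12 17.8
```

In other words, the cluster carries well over 200 accelerators and close to 18TB of aggregate GPU memory — the kind of density that drives the power and cooling requirements discussed below.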

More power demand

The primary challenge in accommodating generative AI lies in the substantial surge of power demands. Generative AI models rely heavily on graphics processing unit (GPU) chips, which consume 10–15 times the energy of traditional CPUs. These models often boast billions of parameters, requiring swift and efficient data pipelines during their months-long training phases.

As an illustrative example, GPT-3.5 possesses a staggering 175 billion parameters, trained on a corpus of over 500 billion words. The power demand to train a GPT-3.5-scale model ranges between 300 and 500 MW, significantly eclipsing the typical power requirement of a data center, which currently falls in the range of 30-50 MW.

At one of our larger campuses in Northern Virginia, designed to accommodate up to 10 data centers, the entire power load of the campus would be necessary to train a GPT-3.5-scale model. While Large Language Models (LLMs) represent the most power-hungry facet of the generative AI landscape, it is noteworthy that the processor and power needs of every generative model we encounter are escalating exponentially, doubling or tripling each year.
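The campus comparison holds up to back-of-the-envelope arithmetic: ten facilities at the typical 30-50 MW each land exactly on the 300-500 MW training range quoted above. A quick sketch, using only the figures from the text:

```python
# Rough check: a 10-facility campus vs. the quoted GPT-3.5-scale training power range.
typical_dc_mw = (30, 50)   # typical single data center power draw (MW)
facilities = 10            # data centers the campus is designed to accommodate

campus_mw = tuple(p * facilities for p in typical_dc_mw)
print(campus_mw)           # (300, 500) — the whole campus matches the training range
```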

Forecasting the future power requirements of generative AI is challenging, but most analysts agree on the substantial escalation of current requirements. Assuming a conservative compound growth rate of 15% for existing data centers, global capacity is anticipated to double in five years and quadruple in 10.
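The arithmetic behind that projection is straightforward compound growth. A quick sketch (assuming the stated 15% compound annual growth rate) confirms the doubling and quadrupling figures:

```python
# Compound growth of global data center capacity at an assumed 15% CAGR.
rate = 0.15
for years in (5, 10):
    multiple = (1 + rate) ** years
    print(f"After {years} years: {multiple:.2f}x current capacity")
# Capacity roughly doubles in 5 years (~2.01x) and quadruples in 10 (~4.05x).
```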

This is happening fast. Analyst TD Cowen reported a "tsunami of AI demand," with 2.1 GW of data center leases signed in the US in Q2 2023, which is a fifth of the current total supply.

An e-waste avalanche

Another AI-driven challenge has emerged at the back end — a surge in discarded equipment. AI is accelerating server innovation, especially in chip design, and has led to cutting-edge AI chips like the Nvidia H100. These chips, with billions invested in their manufacturing and facing scarcity, have become a form of debt collateral and are even available for rent. While this accelerated innovation cycle is vital for enhancing efficiency, it contributes to the escalation of e-waste, along with the concurrent surge in capacity.

E-waste is one of the fastest-growing waste streams globally. Projections show that by 2030, annual e-waste production could skyrocket to an astounding 75 million metric tons. This electronic waste is estimated to hold approximately $60 billion worth of raw materials, including gold, palladium, silver, and copper. Just 17 percent of the world's e-waste is currently collected and properly recycled each year.

Looming climate targets

Against the backdrop of these challenges, the data center industry faces unprecedented pressure to address environmental concerns. As the world shifts towards zero-emission targets, the environmental impact of generative AI, particularly in terms of power consumption and e-waste, will be under intense scrutiny.

Addressing the challenge

Addressing these challenges requires a multifaceted approach. Low-to-no-carbon power sources will be crucial in meeting the escalating power demands of generative AI. Innovations in microgrids and backup power sources will play a significant role.

Renewables are key. Most hyperscalers and a growing number of colocation providers have been working steadily to grow the green grid and eliminate carbon. Today, hyperscalers are the biggest buyers of renewables in the world. On the colocation side, Iron Mountain Data Centers is now one of the top 20 renewable buyers in the world.

Data center owners must prioritize sustainability, with a focus on 24/7 carbon-free energy. Following the lead of industry giants like Google, Iron Mountain Data Centers has committed to this approach, surpassing mere attribution of power to renewable credits.

You can see how we and our partners have gone about this in a recent documentary, ‘Transforming our Future’. We believe that this approach will in time replace the current year-by-year Virtual Power Purchase Agreement model.

The e-waste challenge necessitates a focus on circularity. The accelerated innovation in AI chips and GPUs should be met with efficient IT asset lifecycle optimization, recycling, remarketing, and secure disposal. IMDC's Asset Lifecycle Management (ALM) division sanitizes over 3 million drives annually and is investing in companies like Regency Technologies to enhance recycling capabilities.

The AI opportunity for the data center industry

Generative AI promises to revolutionize not only the industries it serves but also the infrastructure industry that supports it. While the economic value is immense, the power consumption is equally significant. Many generative AI applications can be hosted in a specialized shared facility. Different models have different infrastructure requirements, but all share the need for high-density power, advanced cooling, and modular design.

These challenges are not insurmountable, especially in an era where innovation and commitments to addressing the climate crisis are prevalent. AI may be a key to solving the problem, not just for our own industry but for other sectors.

For data center customers delving into generative AI applications, early planning and investment are essential to stay ahead of the steep upward curve in power and space uptake. They’ll need to pay close attention to infrastructure design, efficiency, energy sourcing, and e-waste management. This means scrutinizing the energy track record and targets of their cloud or data center provider, and asking that provider to share data on climate target progress and on day-to-day access to (preferably 24/7 carbon-free) renewables.

As the industry grapples with these challenges, it has the potential not only to transform itself but to offer solutions to broader sectors facing similar environmental concerns.

Mark Kidd is executive vice president and general manager of the Iron Mountain Data Centers and Asset Lifecycle Management (ALM) business units. Mark has led the data center organization since its inception in 2013 and additionally took over the ALM organization in early 2023. Mark is responsible for driving growth across the data centers and ALM platforms, including setting strategic direction, leading commercial efforts, and developing expansion opportunities.