
AI is “Thirsty”! How Big Tech is Straining Water and Power

Abstract:

The rapid expansion of artificial intelligence (AI) has led to a significant increase in energy and water consumption, raising concerns about its environmental sustainability. AI models, such as ChatGPT, require vast computing power, leading to immense heat generation in data centers. Traditionally, air-based cooling was the primary method, but as processing demands grew, liquid cooling became more prevalent due to its efficiency. However, liquid cooling significantly contributes to freshwater depletion, with tech giants like Microsoft and Google consuming billions of gallons annually. While companies are exploring sustainable cooling solutions, such as underwater data centres and alternative coolants, the effectiveness of these initiatives remains uncertain. The ethical and economic implications of AI’s resource consumption raise the question of whether its benefits outweigh its environmental costs. As AI continues to integrate into society, the industry must prioritise energy efficiency and water conservation to achieve long-term sustainability. This article explores AI’s growing resource demands, its impact on water and energy consumption and the innovative solutions being developed to mitigate its environmental footprint.

Introduction

Artificial intelligence has become an integral part of our modern world, driving innovation across industries, optimising business processes and transforming the way humans interact with technology. However, as AI advances, so does its environmental footprint, particularly in terms of water and energy consumption. AI models require vast computing power, which in turn demands substantial cooling efforts. This article explores the growing concerns around AI’s energy and water consumption, how major tech companies address these issues, and the future of sustainable cooling solutions.

Cooling the AI Boom: Cold Water, Cold Air and Heated Servers

Water scarcity is a looming crisis. While about 71% of the Earth’s surface is covered in water, only about 3% is freshwater and an even smaller fraction is readily accessible for human use. An adult male requires about 3 liters of water per day, while an adult female requires approximately 2.2 liters. However, by 2030, nearly half of the world’s population is expected to face severe water stress, according to a UN environment report.

Meanwhile, AI models are placing additional strain on these already limited resources. ChatGPT, for instance, consumes approximately 500 milliliters of water for every 5 to 50 prompts it processes. The high-performance computers that power AI systems generate enormous amounts of heat, necessitating sophisticated cooling mechanisms to keep them running efficiently and safely. Initially, most data centers relied on air-based cooling, but as rack power densities climbed past roughly 30 kW, liquid cooling technologies became more widely adopted. Microsoft’s Iowa data centers, which have been used to train OpenAI’s models, draw water from the Des Moines and Raccoon River watersheds, where water-intensive AI operations have raised environmental concerns.
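
To put those figures in rough perspective, here is a minimal back-of-the-envelope sketch in Python. The 500 milliliters per 5 to 50 prompts range comes from the paragraph above; the daily prompt volume is a purely illustrative assumption, not a reported statistic.

```python
# Back-of-the-envelope estimate of per-prompt water use for a ChatGPT-style service.
# The 500 mL per 5-50 prompts figure is cited above; the daily prompt volume below
# is a purely illustrative assumption, not a reported statistic.

WATER_PER_BATCH_ML = 500      # ~500 mL of cooling water...
PROMPTS_PER_BATCH = (5, 50)   # ...for every 5 to 50 prompts

def water_per_prompt_ml():
    """Return a (low, high) estimate of water used per prompt, in milliliters."""
    low = WATER_PER_BATCH_ML / PROMPTS_PER_BATCH[1]   # 50 prompts sharing 500 mL
    high = WATER_PER_BATCH_ML / PROMPTS_PER_BATCH[0]  # 5 prompts sharing 500 mL
    return low, high

def daily_water_liters(prompts_per_day):
    """Scale the per-prompt range to a hypothetical daily prompt volume."""
    low, high = water_per_prompt_ml()
    return prompts_per_day * low / 1000, prompts_per_day * high / 1000

if __name__ == "__main__":
    print("Per prompt: %.0f-%.0f mL" % water_per_prompt_ml())          # 10-100 mL
    low_l, high_l = daily_water_liters(1_000_000)                      # hypothetical volume
    print(f"Per 1M prompts/day: {low_l:,.0f}-{high_l:,.0f} liters")    # 10,000-100,000 L
```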

Big Tech’s reliance on water-heavy cooling strategies aligns with their large-scale investments in AI. Microsoft, for instance, reported global water consumption of nearly 1.7 billion gallons in 2022 alone. Google has similarly increased its water consumption, using billions of gallons in 2024. Water shortages have already led Google to shift to air-cooling methods in regions like Arizona, where water scarcity is a pressing issue. Although cooling technology itself has been around since the late 1800s, its modern-day application in AI infrastructure presents new sustainability challenges.

The Energy Demands of AI Processing

The recent shift towards liquid cooling reflects growing awareness of just how unsustainable the water and energy consumption of major tech companies has become. AI operations are particularly energy-intensive, with data centers accounting for about 2% of global electricity consumption, according to the International Energy Agency (IEA). This demand is only expected to increase as AI adoption expands.

Cooling is a significant contributor to data centers’ operational costs, making up approximately 40% of the total electricity bill. Power-hungry processing units continue to demand more energy as newer generations of silicon chips are developed. Consequently, data centers must improve their power usage effectiveness (PUE), which is a measure of how efficiently a data center uses its energy. The ideal PUE is 1.0, meaning that all energy is effectively used for computing, but the average PUE for most data centers is around 1.8. Sustainability-focused data centers strive to achieve a PUE of 1.2, but this remains a challenge given the exponential growth of AI-driven processing workloads.
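
To make the PUE figures concrete, the short Python sketch below computes the overhead energy implied by a PUE of 1.8 versus the 1.2 target; the annual facility and IT loads it uses are hypothetical, chosen only to illustrate the arithmetic.

```python
# Power Usage Effectiveness (PUE) = total facility energy / IT equipment energy.
# A PUE of 1.0 would mean every kWh entering the facility reaches the servers;
# anything above 1.0 is cooling, power conversion and other overhead.

def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    return total_facility_kwh / it_equipment_kwh

def overhead_kwh(it_equipment_kwh: float, pue_value: float) -> float:
    """Energy spent on cooling and other non-IT overhead at a given PUE."""
    return it_equipment_kwh * (pue_value - 1.0)

if __name__ == "__main__":
    it_load_kwh = 10_000_000    # hypothetical annual IT load of 10 GWh, illustrative only
    facility_kwh = 18_000_000   # hypothetical total facility draw of 18 GWh
    print(f"PUE = {pue(facility_kwh, it_load_kwh):.1f}")   # -> 1.8, the typical average
    for p in (1.8, 1.2):        # typical average vs. a sustainability-focused target
        print(f"At PUE {p}: {overhead_kwh(it_load_kwh, p):,.0f} kWh of non-IT overhead")
```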

Data centers also contribute significantly to greenhouse gas emissions, and are currently responsible for about 1% of global emissions. As the AI industry continues to scale, these numbers will only grow unless sustainable measures are implemented.

Liquid Cooling: The Future of Data Center Efficiency

Among the cooling solutions available, liquid cooling is emerging as the most effective. It outperforms air-based cooling by efficiently dissipating heat and reducing overall energy consumption. There are two main types of liquid cooling: immersion cooling and direct-to-chip cooling.

The coolants themselves fall broadly into hydrocarbon-based and fluorocarbon-based fluids. While fluorocarbon-based coolants transfer heat more effectively, they have been criticised for their environmental impact and are increasingly subject to government restrictions. Many tech giants are working toward alternative solutions. For example, Lenovo has adopted direct-to-chip cooling to enhance system-level efficiency, while Dell is investing in direct liquid cooling research in collaboration with standardisation bodies like ASHRAE and the Open Compute Project.

AI’s Expanding Environmental Footprint

As AI becomes an indispensable tool across industries, its environmental impact is growing. The demand for cooling has surged, with some data centers using up to 9 liters of water for every kWh of energy they consume. Global AI demand is projected to account for as much as 6.6 billion cubic meters of water withdrawal by 2027.

Additionally, the power used by AI carries significant carbon emissions as well as an indirect water cost. In the United States, thermoelectric power plants, known for their water-intensive cooling needs, contribute to AI’s overall water footprint, withdrawing an average of 43.8 liters of water per kWh of electricity generated.
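
Taken together, the on-site figure of up to 9 liters per kWh and the off-site figure of 43.8 liters per kWh allow a crude estimate of a workload’s total water footprint. The sketch below is a simplification under those two assumptions; the workload size is illustrative, and real footprints vary widely with location, season and cooling design.

```python
# Crude water-footprint estimate combining the article's two per-kWh figures:
#   * on-site cooling water: up to ~9 liters per kWh consumed in the data center
#   * off-site water tied to thermoelectric generation: ~43.8 liters per kWh
# Both are treated as simple per-kWh multipliers; real values vary by site and season.

ONSITE_L_PER_KWH = 9.0     # upper-bound on-site cooling water (liters/kWh)
OFFSITE_L_PER_KWH = 43.8   # average water withdrawn by U.S. thermoelectric plants (liters/kWh)

def water_footprint_liters(energy_kwh: float,
                           onsite: float = ONSITE_L_PER_KWH,
                           offsite: float = OFFSITE_L_PER_KWH):
    """Return (on-site, off-site, total) water in liters for a given energy use."""
    return energy_kwh * onsite, energy_kwh * offsite, energy_kwh * (onsite + offsite)

if __name__ == "__main__":
    # Hypothetical workload: 1 MWh (1,000 kWh) of data-center energy, illustrative only.
    onsite, offsite, total = water_footprint_liters(1_000)
    print(f"on-site {onsite:,.0f} L, off-site {offsite:,.0f} L, total {total:,.0f} L")
```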

Several leading tech companies, including Microsoft, Google and Meta, have pledged to mitigate their environmental impact, aiming to replenish more water than they consume by 2030. However, whether these commitments are feasible and will translate into meaningful sustainability efforts remains to be seen.

The AI Debate: Worth the Investment?

Beyond the environmental impact, AI raises broader ethical and economic concerns. As AI models become more ingrained in daily life, future generations will increasingly depend on these technologies. However, AI is not infallible—its information is not always reliable and its rapid expansion requires scrutiny.

With the AI industry expected to exceed $200 billion in market value, an important question arises: is the investment in AI truly worth it, given its environmental and ethical costs? The environmental costs of AI extend across multiple stages, including computing hardware production, data center operations, cloud infrastructure maintenance and AI model training. For perspective, training a single AI model like GPT-3 can emit up to 552 tonnes of carbon dioxide equivalent, comparable to the lifetime emissions of five cars.

Keeping it Cool: Innovative Solutions

Tech companies are actively exploring new cooling solutions to reduce their environmental footprint. Microsoft’s Datacenter Community Pledge includes a commitment to cut water consumption by more than 125 million liters per data center annually, through zero-water-evaporation cooling designs and improved water-use efficiency (WUE).
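
Water-use efficiency is typically reported as liters of water used per kWh of IT energy, analogous to PUE. The short sketch below shows the calculation; the annual figures plugged in are hypothetical, chosen only to make the arithmetic concrete.

```python
# Water Usage Effectiveness (WUE) = annual site water use (liters) / annual IT energy (kWh).
# Lower is better; a zero-water-evaporation design pushes the cooling component toward zero.

def wue(annual_water_liters: float, annual_it_energy_kwh: float) -> float:
    return annual_water_liters / annual_it_energy_kwh

if __name__ == "__main__":
    # Hypothetical facility, illustrative numbers only:
    # 100 million liters of water and 50 GWh of IT energy in a year.
    print(f"WUE = {wue(100_000_000, 50_000_000):.2f} L/kWh")  # -> 2.00 L/kWh
```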

Microsoft has also tested an ambitious underwater data center through Project Natick: a shipping-container-sized capsule of servers was submerged on the seafloor off Scotland from 2018 to 2020 to examine the feasibility and sustainability of underwater cooling. The results were promising; the submerged servers experienced one-eighth the failure rate of comparable land-based servers. The experiment provided insights into alternative cooling techniques that could reduce data centers’ reliance on freshwater resources.

Conclusion: A Sustainable Future for AI?

As AI continues to evolve, the environmental consequences of its growth must be addressed. The energy and water demands of AI-driven data centers present a significant challenge, but innovative cooling solutions may help mitigate these impacts. While liquid cooling offers efficiency advantages over traditional air-based cooling, it still raises concerns regarding water consumption and the use of environmentally harmful coolants.

Major tech companies have pledged to adopt more sustainable practices, but their ability to follow through remains to be seen. In the long run, achieving sustainability in AI infrastructure will require industry-wide efforts, policy interventions and continued research into eco-friendly cooling alternatives. Balancing AI’s rapid expansion with environmental responsibility is crucial for a sustainable technological future.

Author’s bio: Aditi Gupta is a graduate of B.A. (Hons.) Liberal Arts and Humanities with a major in Political Science and International Relations. She is interested in pursuing further studies in international relations, environmental studies and economics. 

Image Source: Microsoft finds underwater datacenters are reliable, practical and use energy sustainably.
