The Carbon Cost of Artificial Intelligence: Evaluating the Environmental Impacts of AI

  • Lily Alvino
  • Oct 6
  • 2 min read

Seemingly overnight, artificial intelligence has embedded itself into every corner of daily life. From email spam filters to the summaries shaping Google search results, its presence is pervasive and often difficult to recognize. Its rapid integration has come with promises of endless innovation and increased productivity, but the threats it poses to the environment hinder that potential. The sheer amount of resources artificial intelligence requires to operate strains environmental systems and raises ethical concerns.


Generative AI platforms, such as OpenAI’s ChatGPT, demand sizable energy and resources throughout their lifecycle, including hardware manufacturing, model training, and daily operation. The new generation of computer chips that powers GenAI requires more electricity and cooling than its predecessors. This is because model training involves adjusting billions of parameters through continuous computation to meet the complex demands of users. Each training run can last weeks to months, consuming a substantial amount of energy along the way. Furthermore, training relies on high-performance computing (HPC) infrastructure built around thousands of tensor processing units (TPUs), whose manufacturing, processing, and operation all draw heavily on fossil fuels.


Beyond the production of GenAI, environmental problems have also arisen within AI data centers. The International Energy Agency projects that data center electricity demand will more than double by 2030, rising from 415 terawatt-hours (TWh) to 945 TWh, driven largely by GenAI. As demand for GenAI continues to grow, so does the pressure placed on power grids. ChatGPT, an OpenAI tool used by millions daily, requires roughly 2.9 watt-hours of electricity per query, compared with about 0.3 watt-hours for a typical Google search. In the United States alone, the expansion of AI is projected to drive a 6% increase in total national electricity usage in 2026. This sustained increase in electricity production will contribute to environmental concerns such as rising air pollution, growing volumes of solid waste, and increased thermal pollution in bodies of water.
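
To put those per-query figures in perspective, here is a minimal back-of-the-envelope sketch in Python using the energy numbers cited above. The daily query volume is a hypothetical assumption chosen purely for illustration, not a figure from the article's sources.

```python
# Rough scale check using the per-query figures cited above.
# The daily query volume is an assumption for illustration only.

WH_PER_CHATGPT_QUERY = 2.9   # watt-hours per ChatGPT query (cited above)
WH_PER_GOOGLE_SEARCH = 0.3   # watt-hours per Google search (cited above)
ASSUMED_QUERIES_PER_DAY = 1_000_000_000  # hypothetical: one billion queries/day

def daily_energy_mwh(wh_per_query: float, queries: int) -> float:
    """Convert per-query watt-hours into total megawatt-hours per day."""
    return wh_per_query * queries / 1_000_000  # Wh -> MWh

chatgpt_mwh = daily_energy_mwh(WH_PER_CHATGPT_QUERY, ASSUMED_QUERIES_PER_DAY)
google_mwh = daily_energy_mwh(WH_PER_GOOGLE_SEARCH, ASSUMED_QUERIES_PER_DAY)

print(f"ChatGPT: {chatgpt_mwh:,.0f} MWh/day")   # ~2,900 MWh/day
print(f"Google:  {google_mwh:,.0f} MWh/day")    # ~300 MWh/day
print(f"Ratio:   {chatgpt_mwh / google_mwh:.1f}x energy per query")  # ~9.7x
```

Under that assumed volume, the per-query gap compounds into thousands of megawatt-hours of additional demand every day, which is why grid operators are paying attention.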


Freshwater resources are also being strained by the expansion of AI. Data centers require large amounts of chilled water to absorb heat from computing equipment, consuming an estimated two liters of water for every kilowatt-hour of energy used. A recent study by researchers at UC Riverside and UT Arlington estimated that the water used to build and operate these mega-data centers equates to about six times the water consumption of Denmark. Within the United States, expansive GenAI data centers are now sprouting across the western region, prolonging droughts in states like Arizona.
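
The two-liters-per-kilowatt-hour figure can likewise be turned into a quick worked example; the annual energy total below is a hypothetical assumption used only to illustrate the scale of the relationship.

```python
# Illustration of the ~2 liters of cooling water per kilowatt-hour cited above.
# The annual energy figure is an assumption, not a sourced statistic.

LITERS_PER_KWH = 2.0                  # cited above
ASSUMED_ANNUAL_ENERGY_TWH = 100.0     # hypothetical data-center energy use

kwh = ASSUMED_ANNUAL_ENERGY_TWH * 1e9          # 1 TWh = 1 billion kWh
water_liters = kwh * LITERS_PER_KWH

print(f"Estimated cooling water: {water_liters / 1e9:,.0f} billion liters/year")
# 100 TWh x 2 L/kWh = 200 billion liters/year
```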


As GenAI reaches new heights, the expansion of its physical infrastructure warrants careful consideration in the context of environmental sustainability and resource management. Its growing presence in water-stressed regions highlights the need to align technological advancement with ecological conservation. The unprecedented nature of GenAI poses challenges for regulation and policy; still, in a sign of environmental caution, more than 190 countries have adopted a series of non-binding recommendations on the ethical and sustainable use of GenAI. As generative artificial intelligence continues to define the future, its development must be grounded in a clear understanding of its environmental impact. The ultimate challenge for future generations will be ensuring that innovation remains aligned with sustainability.
