AI presents a striking climate paradox. The same technology promoted as a tool for fighting climate change consumes massive amounts of energy. Data centers are projected to use about 1,050 terawatt-hours of electricity by 2026, with AI accounting for nearly 19% of data center power demand by 2028. ChatGPT alone generates CO2 equivalent to roughly 260 transatlantic flights every month. Tech companies are investing in renewables and edge computing as potential solutions, but the balance between AI’s environmental costs and its climate-fighting potential remains a critical challenge.
As artificial intelligence (AI) continues to transform our world, a troubling paradox is emerging. The same technology that promises to help solve our climate crisis is rapidly becoming a major contributor to it. Data centers, the backbone of AI systems, are projected to consume about 1,050 terawatt-hours of electricity by 2026, with AI expected to account for nearly 19% of data center power demand by 2028.
The numbers paint a concerning picture. ChatGPT alone generates an estimated 260,930 kilograms of CO2 each month, equivalent to about 260 transatlantic flights. Industry analysts project that data center energy consumption could more than double by 2030, with AI development potentially tripling CO2 emissions from these facilities. Their emissions could reach 2.5 billion tonnes of greenhouse gases, roughly 40% of current annual U.S. emissions.
AI’s carbon footprint is skyrocketing—potentially reaching emissions equivalent to 40% of annual U.S. totals by 2030.
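As a rough sanity check on these equivalences, the sketch below works through the arithmetic. The per-flight figure (about 1 tonne of CO2 per passenger) and the U.S. annual total (about 6.3 billion tonnes CO2e) are assumptions of ours for illustration, not values taken from the cited reports.

```python
# Back-of-envelope check of the equivalences above. The per-flight and U.S.
# figures are illustrative assumptions, not values from the cited sources.

CHATGPT_MONTHLY_CO2_KG = 260_930   # reported monthly CO2 for ChatGPT (from the text)
CO2_PER_FLIGHT_KG = 1_000          # assumed: roughly 1 tonne CO2 per passenger per transatlantic flight

flights_equivalent = CHATGPT_MONTHLY_CO2_KG / CO2_PER_FLIGHT_KG
print(f"ChatGPT's monthly footprint is about {flights_equivalent:.0f} transatlantic flights")

PROJECTED_DC_EMISSIONS_T = 2.5e9   # the 2.5 billion tonne projection cited in the text
US_ANNUAL_EMISSIONS_T = 6.3e9      # assumed: ~6.3 billion tonnes CO2e per year for the U.S.

share = PROJECTED_DC_EMISSIONS_T / US_ANNUAL_EMISSIONS_T
print(f"That projection equals roughly {share:.0%} of current annual U.S. emissions")
```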
Manufacturing the hardware for AI systems adds another considerable environmental burden. CO2 emissions from producing GPU-based AI accelerators are expected to increase 16-fold between 2024 and 2030. Fabrication is extremely resource-intensive, requiring large quantities of silicon, energy-hungry processes such as lithography and etching, and substantial water for cooling. The toxic chemicals used during GPU manufacturing exacerbate the environmental impact further.
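To put that 16-fold figure in perspective, the short sketch below computes the year-over-year growth it implies, assuming steady compound growth over the six-year span (a simplifying assumption of ours, not the analysts' model).

```python
# Annual growth rate implied by a 16-fold rise in GPU-manufacturing emissions
# between 2024 and 2030, assuming steady compound growth (illustrative only).

growth_factor = 16
years = 2030 - 2024  # six years of growth

annual_growth = growth_factor ** (1 / years) - 1
print(f"A {growth_factor}x increase over {years} years implies ~{annual_growth:.0%} growth per year")
```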
Tech companies are beginning to acknowledge this growing problem. Some are investing in renewable energy to offset their impact. Edge computing offers some hope, potentially cutting the carbon footprint of certain AI workloads by a factor of up to 1,000. Geothermal energy, with an availability factor of about 95%, could provide the consistent, reliable power that AI infrastructure needs without the environmental drawbacks of fossil fuels. Even so, the current trajectory remains unsustainable.
The irony isn’t lost on environmental experts. Without the boom in AI development, data center emissions would be considerably lower. Yet many AI applications are being developed specifically to address climate challenges. Today’s AI systems also remain strikingly inefficient compared with biological intelligence, consuming enormous resources while still being prone to errors. Training a single large language model on the scale of GPT-3 has been estimated to produce 626,000 pounds of carbon dioxide.
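Since the rest of the article uses metric units, the quick conversion below restates that training figure in tonnes; it is plain unit arithmetic, with only the 626,000-pound value taken from the text.

```python
# Restating the 626,000-pound training-emissions figure in metric tonnes.

TRAINING_CO2_LBS = 626_000   # reported CO2 from training one GPT-3-scale model
LBS_PER_TONNE = 2_204.62     # pounds per metric tonne

print(f"{TRAINING_CO2_LBS:,} lbs of CO2 is roughly {TRAINING_CO2_LBS / LBS_PER_TONNE:.0f} metric tonnes")
```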
As leading AI research labs plan facilities requiring gigawatts of electricity—enough to power entire cities—the question becomes urgent: can we harness AI’s potential to fight climate change without making the problem worse in the process? The race is on to develop more sustainable AI practices before the technology’s climate impact grows beyond control.
References
- https://news.mit.edu/2025/explained-generative-ai-environmental-impact-0117
- https://thesustainableagency.com/blog/environmental-impact-of-generative-ai/
- https://sustainability-news.net/climate-nature/chatgpts-monthly-carbon-footprint-equivalent-to-260-transatlantic-flights/
- https://www.techinsights.com/blog/ai-gpu-growth-directly-impacts-carbon-emission-growth-through-2030
- https://www.indeed-innovation.com/the-mensch/ais-climate-crisis-are-we-burning-the-planet-to-feed-our-digital-brains/