
AI, Climate, and the Narrow Path to Abundance

  • Writer: Joy Dasgupta
  • Jun 10
  • 5 min read

Wrapping our heads around AI’s energy demand and its impact on climate is tougher than one might imagine. Too many moving parts. Too many points of view, and too many forecasting models that have collapsed in the face of the blistering pace of change, where decades have been compressed into months.

 

We’ve seen headlines like: AI is an environmental disaster. Training just one large model can emit 800,000 kg of CO₂, which would require 13,000 young trees growing and working non-stop for ten years to sequester, just to break even.
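As a rough sanity check on that tree figure, here is a back-of-envelope sketch. The per-tree sequestration rate is my own assumption (young trees are often estimated to absorb on the order of 60 kg of CO₂ over their first decade), not a number from the studies behind the headline.

```python
# Back-of-envelope check of the "13,000 trees for ten years" framing.
# Assumption (not from the original study): a young tree sequesters
# roughly 60 kg of CO2 over its first ten years of growth.

training_emissions_kg = 800_000        # reported CO2 from training one large model
co2_per_tree_over_10_years_kg = 60     # assumed sequestration per young tree

trees_needed = training_emissions_kg / co2_per_tree_over_10_years_kg
print(f"Trees needed to offset one training run: {trees_needed:,.0f}")
# ~13,300 trees, consistent with the ~13,000 figure in the headline.
```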

 

But such straight-line narratives rarely tell the whole story. The reality is messier. AI’s rising energy demand doesn’t exist in isolation. It is influenced by self-reinforcing feedback loops, global politics, our dependence on technology as if it were oxygen, the hubris of technologists, the resistance of anti-capitalist climate activists, and our collective will, or lack thereof, to safeguard our long-term viability. A useful predictive model will also need to encompass the following:


  • Rapidly increasing efficiency of AI chips, even as demand explodes.

  • AI models becoming more efficient, e.g., DeepSeek, Gyan.

  • The rise of low-compute architectures, e.g., Gyan.

  • The rapid scaling and increasing efficiency of renewable energy.

  • The rapid removal of dirty energy from the grid. 

  • Gen AI beginning the slide into the trough of disillusionment.

  • Unprecedented pressure on critical rare earth mineral supply.

 

In the absence of useful models, we can instead set a range that spans a spectrum of possible futures.


  1. In the grand scheme, in the long run, AI’s energy demand is not a cause for concern. The International Energy Agency (IEA) projects that by 2030, AI energy consumption will be less than 1% of global energy usage. Hardly a showstopper, especially given all the hopes pinned on this “greatest force since fire.”

  2. But AI’s need for more than 1,000 terawatt-hours of electricity by 2035 is a major cause for concern. Today, the only way to feed ravenous AI models is with dirty energy; we don’t have enough renewables on the grid. If in the process we blow our ever-shrinking carbon budget, we may trigger one or several irreversible planetary tipping points. And if we reach that point of ecological collapse, the long term becomes irrelevant. So, while on the other side of the energy transition lies unlimited, essentially free energy from solar, wind, geothermal, and nuclear fission (and fusion), we will have to pass through the proverbial eye of the needle to enter this paradise. The path is narrow, and time is very short.

 

Market forces are driving us to go all in on data center capacity buildout. Microsoft announced an $80B investment in data centers in 2025 alone! Multiple five-gigawatt data centers, each requiring about 44 TWh of energy a year (the equivalent of five nuclear reactors), are being planned in the US. To put this into physical perspective, it takes “only” 200 MW of electricity to power a Google campus comprising twenty football-field-sized data centers. Many of these giant facilities have been proposed in areas already struggling with water scarcity.
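The 44 TWh figure follows from simple arithmetic, assuming the site draws its full five gigawatts around the clock; a rough sketch:

```python
# Rough arithmetic behind the five-gigawatt data center figure,
# assuming the site draws its full rated power around the clock.

site_power_gw = 5               # planned data center capacity
hours_per_year = 24 * 365       # 8,760 hours

annual_energy_twh = site_power_gw * hours_per_year / 1_000   # GWh -> TWh
print(f"Annual energy at full load: {annual_energy_twh:.1f} TWh")   # ~43.8 TWh

# For scale: the 200 MW Google campus mentioned above is 1/25th of that draw.
google_campus_mw = 200
print(f"Five-GW site vs. 200 MW campus: {site_power_gw * 1000 / google_campus_mw:.0f}x")
```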

 

Alongside this surge in data center demand, or because of it, several of the biggest tech companies are rapidly walking back their net-zero pledges. Google’s 2024 environmental report shows that its carbon emissions have increased by 48% in five years! Several others have scaled back the ambitions of their net-zero plans, as their only option for meeting this burgeoning new demand is dirty energy. In an environment where electricity generation and transmission efficiencies have stayed flat over the last decade, this is not good news.

 

Disentangling data center demand from AI-driven data center demand is also not straightforward. A Goldman Sachs report says, “The combination of AI, ex-AI increases in data demand and a material slowdown in power efficiency gains is making data centers a critical driver of accelerating global and US electricity demand growth,” and projects that data centers’ share of electricity demand will rise to about 4% globally by 2030 (up from ~2% today), and closer to 8% in the US. AI will drive about 20% of that growth, with generative AI a major contributor, though we don’t know exactly how much. The remaining 80% is energy demand for other data center needs, such as cloud services and storing humanity’s digital exhaust: photographs, videos, and more.
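To make those percentages concrete, here is an illustrative calculation. The global electricity figures are my own round-number assumptions (roughly 30,000 TWh of demand today, growing to about 35,000 TWh by 2030), not numbers from the Goldman Sachs report.

```python
# Illustrative translation of the share projections into TWh.
# Assumptions (mine, not from the report): global electricity demand of
# ~30,000 TWh today, rising to ~35,000 TWh by 2030.

global_demand_today_twh = 30_000
global_demand_2030_twh = 35_000

dc_share_today = 0.02       # ~2% of global electricity today
dc_share_2030 = 0.04        # ~4% projected by 2030
ai_share_of_growth = 0.20   # AI drives ~20% of the data center growth

dc_today = global_demand_today_twh * dc_share_today    # ~600 TWh
dc_2030 = global_demand_2030_twh * dc_share_2030       # ~1,400 TWh
ai_driven_growth = (dc_2030 - dc_today) * ai_share_of_growth

print(f"Data centers today: ~{dc_today:,.0f} TWh")
print(f"Data centers 2030:  ~{dc_2030:,.0f} TWh")
print(f"AI-driven growth:   ~{ai_driven_growth:,.0f} TWh")  # ~160 TWh under these assumptions
```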

 

We have several things going for us. We already have the technology we need to keep warming to within, say, 2 or maybe 3 degrees.

 

Climate-friendly tech solutions are already the better and cheaper solutions. The US administration’s “drill, baby, drill” rallying cry may ring hollow for the subnational entities that actually drive everything, including climate action. Constellation Energy, for example, is reactivating the undamaged nuclear reactor at the Three Mile Island site and has signed a PPA to power Microsoft’s artificial intelligence operations. It will generate approximately 835 megawatts of carbon-free power, create 3,400 direct and indirect jobs, and deliver more than $3 billion in state and federal taxes. And the Susquehanna nuclear power plant has struck a deal with AWS to plug 40% of its capacity directly into an adjacent data center, completely bypassing the complexity of getting that power onto the grid. AI powered by clean energy, available 24x7, sounds like just what the doctor ordered.

 

Chip manufacturers have continued to accelerate the already frenetic pace of innovation of the past fifty years. From 2,300 transistors on an Intel chip in 1971 to 134 billion transistors on a 2023 Apple M2 Ultra chip, we are now etching circuitry into features some 25,000 times thinner than a human hair. The performance-per-watt of NVIDIA’s GPUs has improved 4,000-fold over the past few years. Google’s custom Trillium AI chips are 67% more energy-efficient than their predecessor, while delivering close to a 5x improvement in peak compute performance. We are fusing chiplets into chips, working on photonic chips, neuromorphic chips, even analog chips. And then there is the final frontier: quantum computing.
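Those two transistor counts imply the familiar Moore’s-law cadence; a quick sketch of the doubling time they suggest:

```python
import math

# Doubling time implied by the two transistor counts quoted above.
transistors_1971 = 2_300             # Intel 4004
transistors_2023 = 134_000_000_000   # Apple M2 Ultra
years = 2023 - 1971

growth_factor = transistors_2023 / transistors_1971
doublings = math.log2(growth_factor)
print(f"Growth factor: ~{growth_factor:,.0f}x over {years} years")
print(f"Doubling time: ~{years / doublings:.1f} years")  # ~2 years, i.e., Moore's law
```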

 

The Path to Abundance

Instead of weekly headlines about incremental, and often pointless, improvements to our frontier AI models, we need to start seeing headlines about step-function improvements in energy efficiency, driven by better AI architectures, AI models, AI chips, and data centers. Soon after that, we need to see headlines about the cancellation of mega data center projects, because state-of-the-art AI now requires a much smaller energy footprint.

 

Unfortunately, we can’t count on responsible tech leaders or visionary governments to nudge us there. They are all MIA. But we can count on the time-tested law of the silicon universe: accelerating technology improvement combined with market forces, encapsulated in Moore’s law.


Moore’s law runs itself. It operates all the time. It does not need regulatory oversight. It always delivers on schedule, often ahead of it. It bends exponential technology curves towards outcomes that exceed expectations. It is our best and perhaps only path to an on-time arrival in a clean energy future. Then the new AI, a NetZero technology far more powerful than anything we can imagine today, might just be able to chart us a path back from the brink of climate and environmental catastrophe, into a brave new world of abundance.


Joy Dasgupta

CEO, Gyan

 

Gyan is built on a highly energy-efficient architecture and can run on CPUs. Gyan’s language model trains on 200 times less energy than the major LLMs. Designed for the enterprise, Gyan is hallucination-free, keeps data secure and private, and does not infringe on IP. We invite you to join us in advancing AI without compromising our atmosphere and biosphere.

 
