In 2024, Google’s data centers consumed over 30.8 million megawatt-hours of electricity, more than double their 2020 usage. With massive GPU clusters humming 24/7, Google now uses as much electricity in a year as some entire countries. While the company has made strides in sourcing clean energy, demand is growing faster than its infrastructure buildout and sustainability goals can keep pace with. And Google is hardly alone.
As AI workloads intensify across hyperscalers like Meta, Microsoft, and OpenAI, the energy requirements of data centers are on a collision course with the physical limits of the electric grid. The International Energy Agency projects that data center electricity consumption could double globally by 2030, and the North American Electric Reliability Corporation has already warned of growing reliability risks as demand surges.
NVIDIA-Backed Emerald AI Is Solving AI’s Energy Crisis
By Mukundan Sivaraj
“AI factories can flex when the grid is tight, and sprint when users need them to.”
