In 2024, Google’s data centers consumed over 30.8 million megawatt-hours of electricity: more than double what they used in 2020. With massive GPU clusters humming 24/7, Google now uses as much electricity in a year as some countries. While the company has made strides in sourcing clean energy, the rate of demand growth is outpacing both infrastructure buildout and sustainability goals. And Google is hardly alone.
As AI workloads intensify across hyperscalers like Meta, Microsoft, and OpenAI, the energy requirements of data centers are on a collision course with the physical limits of the electric grid. The International Energy Agency projects that global data center electricity consumption could double by 2030, and the North American Electric Reliability Corporation has already warned of destabilization risks. In the United States, interconnection delays can stretch as long as ten years. But a new class of companies, and a new software approach, may offer an alternative path forward.
Emerald AI’s Grid-Aware Architecture
Launched with $24.5 million in seed funding this month, Emerald AI aims to enable what its founder, physicist and energy strategist Dr. Varun Sivaram, calls a paradigm shift: transforming data centers from inflexible power hogs into agile grid assets. The company’s Conductor platform uses AI to orchestrate data center energy consumption in real time, selectively throttling or redirecting compute tasks based on local grid conditions. That means deferring low-priority jobs or rerouting them to less-congested centers, while ensuring mission-critical inference tasks remain uninterrupted.
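In pseudocode terms, that kind of orchestration boils down to a priority-aware scheduling decision driven by a grid signal. The sketch below is purely illustrative: the job names, the `grid_stress` signal, and the threshold are assumptions for exposition, not Emerald AI's actual API or algorithm.

```python
from dataclasses import dataclass

@dataclass
class Job:
    name: str
    priority: str  # "critical" (e.g. user-facing inference) or "deferrable" (e.g. batch training)

def schedule(jobs, grid_stress: float, stress_threshold: float = 0.8):
    """Split jobs into those that run now and those deferred or rerouted.

    grid_stress: 0.0 (idle grid) to 1.0 (peak stress), e.g. derived from
    a utility demand-response signal. Values are illustrative.
    """
    run_now, deferred = [], []
    for job in jobs:
        # Mission-critical inference always runs, regardless of grid state.
        if job.priority == "critical" or grid_stress < stress_threshold:
            run_now.append(job)
        else:
            # Under grid stress, low-priority work is paused or shipped
            # to a less-congested data center.
            deferred.append(job)
    return run_now, deferred

jobs = [
    Job("chat-inference", "critical"),
    Job("model-checkpoint-training", "deferrable"),
    Job("batch-embedding", "deferrable"),
]

# During a peak-stress window, only the critical job keeps running.
running, paused = schedule(jobs, grid_stress=0.95)
```

The key design point, as described in the Phoenix test, is that curtailment is selective: throughput on deferrable work flexes with the grid, while latency-sensitive inference is untouched.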
The Phoenix field test that accompanied the company’s launch offers a proof point. In partnership with Oracle Cloud Infrastructure, NVIDIA, Salt River Project (SRP), and the Electric Power Research Institute (EPRI), Emerald demonstrated that it could cut power consumption of a 256-GPU cluster by 25% over a three-hour peak stress window without degrading AI performance. “This test was an opportunity to completely reimagine AI data centers as helpful resources to help us operate the power grid more effectively and reliably,” said SRP president David Rousseau.
Emerald’s seed investors include Radical Ventures, NVentures (NVIDIA’s venture arm), Amplo, CRV, and Neotribe, as well as notable individuals such as Google Chief Scientist Jeff Dean, AI pioneer Dr. Fei-Fei Li, Kleiner Perkins Chair John Doerr, and former U.S. Secretary of State John Kerry. Former U.S. Commerce Secretary Gina Raimondo also backs the company, and Boston University professor Ayse Coskun, a specialist in high-performance computing and grid systems, serves as its chief scientist.
“The workloads that drive AI factory energy use can now be flexible,” Sivaram said in an interview with NVIDIA’s Marc Spieler. “AI factories can flex when the grid is tight, and sprint when users need them to.”
Beyond Infrastructure: Rethinking the Energy Stack
Emerald’s approach stands in stark contrast to some of the largest AI infrastructure projects underway. OpenAI’s $500 billion Project Stargate, being built with Oracle, is turning to a mix of wind power and backup gas turbines to meet relentless power demands. Crusoe Energy, the developer behind the Stargate sites in Texas, initially made its name redirecting flare gas from oilfields to mine Bitcoin. Now it’s adapting those same strategies to meet AI’s energy appetite, deploying some 100,000 NVIDIA GPUs per building alongside simple-cycle gas turbines that are less efficient and more polluting than modern combined-cycle alternatives.
While these projects underscore the urgency of getting power online, they also reflect a shortfall in long-term sustainability. “There is a high level of urgency in the industry to get power fast,” said Crusoe COO Cully Cavness to Bloomberg. “We have tried to be creative about the energy component of data centers.” But reliance on gas turbines, sometimes emitting over 1,300 pounds of CO₂ per MWh, risks undercutting climate pledges made by many of these same firms.
Emerald AI offers a complementary solution: rather than building new generation, it uses existing grid headroom more intelligently. A Duke University study cited by Sivaram’s team estimates that if AI factories could flex their power use by just 25% for 200 hours a year, the grid could absorb 100 gigawatts of new capacity, enough to support over $2 trillion in new data center infrastructure.
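The arithmetic behind why that trade is attractive is worth spelling out: 200 hours is a small slice of the year, so the compute actually sacrificed is modest. The back-of-envelope calculation below is my own illustration of the scale, not a figure from the Duke study itself.

```python
# Back-of-envelope: how much annual compute does grid flexibility cost?
HOURS_PER_YEAR = 24 * 365  # 8760

flex_hours = 200   # hours per year a data center curtails (per the cited study)
flex_depth = 0.25  # fraction of power shed during those hours

# Share of the year spent curtailing: 200 / 8760 ≈ 2.3%
time_share = flex_hours / HOURS_PER_YEAR

# Share of annual energy (and, roughly, compute) given up:
# 2.3% of hours × 25% depth ≈ 0.6% of the year's output
energy_given_up = time_share * flex_depth

print(f"curtailing {time_share:.1%} of hours costs ~{energy_given_up:.2%} of annual energy")
```

Under these assumptions, trading well under one percent of annual output for access to otherwise-unavailable grid capacity is the core of the flexibility argument.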
This flexibility also makes integrating renewables more feasible. “Renewable energy, which is intermittent and variable, is easier to add to a grid if that grid has lots of shock absorbers that can shift with changes in power supply,” said Coskun.
A Broader Ecosystem of Solutions
Emerald is not alone in tackling the data center energy crunch. Amperon, an AI-powered energy forecasting firm, recently secured strategic investment from National Grid Partners and works with major energy players like Ørsted and AES to improve real-time energy predictions. Meanwhile, GridCARE, another AI startup, is mapping pockets of underutilized grid capacity to fast-track interconnections for new data centers.
Major hyperscalers are also locking in long-term power deals. Meta recently signed a 20-year agreement with Constellation Energy to support the continued operation of the Clinton Clean Energy Center in Illinois. Microsoft is backing the restart of the Three Mile Island nuclear facility. Google, despite leading investments in solar, is exploring nuclear and geothermal options to offset the plateauing of energy efficiency improvements in its data centers.
But many of these renewable projects take years to deploy. And reversible computing, a still-theoretical method for dramatically cutting energy use by avoiding data deletion, may be decades away from commercial viability. Vaire Computing, a startup spun out of academic research on reversible architectures, is among the few commercial players exploring the space, but it remains early.
In the meantime, companies like Emerald offer a bridge: a software layer that reduces strain on the grid without requiring entirely new hardware or generation. “We’re at a critical inflection point,” Sivaram said. “To unshackle AI progress from power constraints, we need to make infrastructure responsive.”
The outcome of this next phase of AI infrastructure will be felt well beyond tech. Sivaram’s experience as former Chief Strategy Officer at Ørsted and climate advisor to the U.S. Secretary of State gives him an unusual vantage point, one that bridges national policy, clean energy development, and hard-tech innovation.