Companies like OpenAI are consuming energy at a historic rate. One startup believes it has found a way to take pressure off the grid





The numbers are nothing short of astonishing. Take Sam Altman, CEO of OpenAI. He reportedly wants 250 gigawatts of new electricity – equivalent to about half of Europe’s all-time peak load – to power giant new data centers in the US and elsewhere around the world by 2033.

Building or expanding power plants to generate this much electricity on Altman’s schedule seems unthinkable. “What OpenAI is trying to do is quite historic,” says Varun Sivaram, a senior fellow at the Council on Foreign Relations. The problem, he says, is that “it is impossible today for our grids and our power plants to provide that energy to those projects, and it is not possible for that to happen in the time frame that the artificial intelligence industry is trying to achieve.”

However, Sivaram believes Altman may be able to achieve his goal of operating several new data centers in a different way. In addition to his position at the Council on Foreign Relations, Sivaram is the founder and CEO of Emerald AI, a startup that launched in July. “I built it directly to solve this problem,” he says, “not just Altman’s problem specifically, but the larger problem of powering the data centers that all AI companies need.” Many smart minds in technology like Emerald AI’s prospects. The company is backed by Radical Ventures, NVentures (the venture capital arm of Nvidia), other venture capital firms, and highly experienced individuals including Google Chief Scientist Jeff Dean and Kleiner Perkins Chairman John Doerr.

The premise of Emerald AI is that the electricity needed for AI data centers largely already exists. Even large new data centers will strain the available supply only occasionally. “The power grid is a bit like a highway that experiences peak rush hour only a few hours a month,” says Sivaram. Likewise, in most places today, the existing grid can easily handle a data center’s load except during a few periods of extreme demand.

Sivaram’s goal is to solve the problem of those rare high-demand moments that the grid can’t handle. He says it’s not that difficult, at least in theory. Some jobs can be paused or slowed down, he explains, such as training or fine-tuning a large language model for academic research. Other jobs, such as queries to an AI service used by millions of people, cannot be rescheduled, but they can be redirected to another data center where the local power grid is less stressed. Data centers would need to be flexible in this way less than 2% of the time, he says; Emerald AI aims to help them do just that by turning theory into practice. The outcome, Sivaram says, would be profound: “If all AI data centers worked this way, we could achieve Sam Altman’s global goal today.”
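To make the pause-or-reroute idea concrete, here is a minimal Python sketch of the decision the paragraph describes. It is an illustration only, not Emerald AI’s actual software; the job names, site names, and stress scores are all hypothetical.

```python
# Illustrative sketch (not Emerald AI's software): during a local grid
# stress event, pause deferrable jobs and reroute the rest to the site
# whose grid is currently least stressed.

from dataclasses import dataclass


@dataclass
class Job:
    name: str
    deferrable: bool  # e.g., model training (True) vs. live user queries (False)


def respond_to_grid_event(jobs, site_stress):
    """Return (paused, rerouted) for a stress event at the local site.

    site_stress maps site name -> current grid stress score (hypothetical).
    """
    fallback = min(site_stress, key=site_stress.get)  # least-stressed site
    paused, rerouted = [], []
    for job in jobs:
        if job.deferrable:
            paused.append(job.name)  # resume after the peak passes
        else:
            rerouted.append((job.name, fallback))  # keep serving users
    return paused, rerouted


jobs = [Job("llm-fine-tune", True), Job("chat-inference", False)]
stress = {"phoenix": 0.9, "ohio": 0.3, "texas": 0.5}
paused, rerouted = respond_to_grid_event(jobs, stress)
# paused == ["llm-fine-tune"]; rerouted == [("chat-inference", "ohio")]
```

The key design choice mirrors the article’s distinction: deferrable work waits out the peak, while latency-sensitive work moves to wherever the grid has headroom.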

Last February, researchers at Duke University published a paper testing the concept and found that it worked. Separately, Emerald AI and Oracle tried out the concept on a hot day in Phoenix and found they could reduce energy consumption in a way that didn’t degrade the AI’s computations. “It’s like having your cake and eating it too,” Sivaram says. That paper is under peer review.

No one knows whether Altman’s 250-gigawatt plan will prove successful or foolish. As promising as it sounds, Emerald AI is in its early days, and its future is unpredictable. What we know for sure is that grand challenges produce unimaginable innovations – and in the age of artificial intelligence, we should be prepared for a lot of them.

The Fortune Global Forum returns October 26–27, 2025, in Riyadh. CEOs and global leaders will come together at this dynamic, invitation-only event to shape the future of business. Apply for an invitation.




