With Great Power

Can AI help save the grid?

Episode Summary

Varun Sivaram says AI has the power to ensure grid stability.

Episode Notes

Growing up in Silicon Valley, Varun Sivaram didn’t look to Elon Musk or Sergey Brin to learn about success stories. He looked to his dad, a material scientist who immigrated from India. 

But Varun’s own dreams of pursuing a career in technology took a circuitous path. His physics lab at Oxford discovered a promising new solar material, but when he emerged from graduate school in 2012, it was no time to launch a renewables startup. After a successful early career pursuing his other love, foreign relations, he pivoted to tech. In 2024, Varun founded Emerald AI, which helps data centers adjust their workloads to use energy more efficiently.

This week on With Great Power, Varun explains why he thinks AI can help save the grid. Varun and Brad talk about the demonstration pilots Emerald AI has completed and Varun’s vision for a massive AI factory the company is helping to build in Virginia.

Credits: Hosted by Brad Langley. Produced by Mary Catherine O’Connor. Edited by Anne Bailey. Original music and engineering by Sean Marquand. Stephen Lacey is executive editor. The GridX production team includes Jenni Barber, Samantha McCabe, and Brad Langley.

Episode Transcription

Brad Langley: It's not hard to find immigrant success stories in Silicon Valley. Varun Sivaram grew up with one.

Varun Sivaram: My dad's been my role model my whole life. He came from India, immigrated with very little to his name, became a PhD scientist in his own right, in material science. His first job was just around the corner from Gordon Moore, one of the titans of Silicon Valley.

Brad Langley: Siva Sivaram went on to become a VP at SanDisk and eventually president of the company that acquired it, Western Digital.

Varun Sivaram: A Fortune 500 company that's responsible for most of the memory in your cell phones and in computers and data centers around the world.

Brad Langley: Growing up, Varun wanted to emulate his dad's success, but he was torn.

Varun Sivaram: I've always had these twin loves. One is for science and technology, and one is for international relations and foreign policy.

Brad Langley: At Stanford, Varun double majored in engineering and international relations. Four years later, he leaned into science and technology. He went to Oxford to study condensed matter physics, and one day in the lab, something amazing happened.

Varun Sivaram: I was next to my fellow graduate student, Mike Lee, who by accident discovered a miracle material.

Brad Langley: It was late at night, and Varun and Mike were studying how to use a mineral called perovskite in solar cells. After mixing two ingredients in the wrong proportion, Mike happened to stumble upon a winning recipe for solar efficiency.

Varun Sivaram: When he showed the graphs, our jaws just dropped. And this was an unbelievable efficiency from a material we had no hopes of. I was the luckiest graduate student in the world getting to witness history in the making in the solar field.

Brad Langley: But it just wasn't the right time and place to commercialize the discovery. This was back in 2012, a year after the Solyndra bankruptcy, and a bad time for solar startups.

Varun Sivaram: It was a very tough period. I observed that and realized that next generation solar technologies would face real hurdles on their route to commercialization.

Brad Langley: So after earning his PhD, Varun decided to focus on his other love, international relations. And over the course of three years, his career really took off.

Varun Sivaram: I joined the Council on Foreign Relations, and I led their energy and climate program. I taught at Georgetown. I taught at Columbia, and I wrote a book on the future of solar energy.

Brad Langley: Later, he would spend two years working in India as chief technology officer for the clean energy provider ReNew Power.

Varun Sivaram: It was an unbeatable opportunity. India was showing the world how an emerging economy could transition toward clean energy at very low cost.

Brad Langley: But policy work would call him back in 2021. He joined the Biden administration, serving as an energy and innovation advisor to John Kerry. And then he headed back to tech again. In 2024, after a short time at the wind energy company Ørsted, Varun founded Emerald AI. It was a switch not from policy to tech, but from energy supply to energy demand.

Varun Sivaram: Instead of building supply projects, I'm focused on demand. I think that an energy transformation will require the biggest new source of demand, AI, to also become the grid's greatest ally.

Brad Langley: Last year, Emerald AI came out of stealth mode with a demonstration project designed to show the value of using AI for better grid management.

Varun Sivaram: AI is emerging as the biggest new user of energy. And if we can't find a way to make AI an intelligent user of that energy, then unfortunately we're not going to meet any of our goals. But if we can harness AI as the most intelligent user of energy, I actually think that AI could save the grid, far from undermining it.

Brad Langley: This is With Great Power, a show about the people building the future grid today. I'm Brad Langley. Some people say utilities are slow to change, that they don't innovate fast enough. And while it might not always seem like the most cutting-edge industry, there are lots of really smart people working really hard to make the grid cleaner, more reliable, and customer-centric. Today, my guest is Varun Sivaram, the CEO of Emerald AI. We talk about Emerald AI's demonstration project at a data center in Arizona. We also discuss a collaboration the company is now doing with Nvidia and Digital Realty, a data center developer. And we learn about the results of a matchup that pitted Emerald AI's technology against surging power demand in London. It's a story about a data center, but also about soccer and tea kettles. But first, I asked Varun to explain just what problem Emerald AI's software is working to solve and how exactly it does that.

Varun Sivaram: Think about our power grid as a superhighway. It faces peak rush hour just two times a month. The rest of the time it's relatively open. The power plants are producing less power than they could. The power lines are carrying less power than they could. AI data centers today can fit on that superhighway without us building a new lane for it, which is what makes everyone's bills go up. Emerald makes that possible by reducing AI data centers' power consumption during those peak rush hour moments. And there are three ways to do this. The first way is by slowing down or pausing computations. We call this temporal flexibility. Maybe there's an AI job, like fine-tuning a model, that you don't have to run right this second, or that you can slow down a little bit. The next kind of flexibility is spatial flexibility.

This is where you take an AI workload. Say, Brad, you're asking ChatGPT a question, and you move it somewhere else. It moves at the speed of light. And so you, Brad, don't see the difference. You asked a question, you still got your answer. It was just 10 milliseconds longer because it took that long to move from Virginia to Chicago. The third kind of flexibility is called onsite resource flexibility. You might have a battery or a fuel cell or a generator. I think we should harness all three of these types of flexibility to make AI a truly power-flexible resource. And Emerald is the brain that does all of this. Emerald is an operating system for AI and power. It talks to the grid, it understands what the grid's constraints are and forecasts them, and it orchestrates all of the resources at a data center: the onsite batteries, but also the computers, the actual computations that can be slowed down or moved elsewhere offsite, in order to meet the grid's constraints in one location.

Let's say there are a million air conditioners running in Phoenix and it's a really hot day. What Emerald demonstrated with Oracle and Nvidia and EPRI and the local utility Salt River Project is that when the utility needed us to reduce at that data center, we were able to reduce the AI power consumption by 25% for three hours. That research was published in the scientific journal Nature Energy. And we're just delighted. That's one of the world's top scientific journals. And this research is the first peer-reviewed, rigorously tested demonstration that AI can actually be a flexible and friendly grid user.

Brad Langley: What are the mechanics of a deal like that? Is the data center provider your primary customer, with partnership from Salt River? How are you expanding those kinds of relationships across service territories? Maybe just dig a little into how that go-to-market function works for Emerald AI.

Varun Sivaram: That's a great question, Brad. We work principally with two kinds of partners. One is utilities, where we help them to interact with and dispatch flexible data centers on their network. So a utility will say, "Ah, I would like to bring on many more customers, but I don't want to charge all my ratepayers to expand the grid to add that lane to the highway at peak times." And so they use Emerald to help them talk to all these data centers and say, "You can come on my grid, but I need to tell you when to reduce your load during peak demand moments." So that's one go-to-market approach. The other go-to-market approach is on the other side. We provide a product, actually a suite of products, called Emerald Conductor that helps data centers control their power consumption in those peak load moments.

So if they receive the signal from the utility, "Hey, reduce your power." They're able to do so.

Brad Langley: I understand that you've also demoed your software for a project in London. Can you tell us more about that?

Varun Sivaram: We worked with Nebius at their brand new AI factory in London, and we addressed a very quirky, UK-specific issue. At halftime of a football game (what we in America call soccer), all of the British people turn on their tea kettles. They go into the kitchen to make some tea. And during the Euro 2020 match where England beat Germany, the tea kettle spike caused a gigawatt of new load on the system. We proved in this demo that AI's power consumption could fall and counterbalance the spikes from tea kettles. So at halftime, we had the AI power consumption fall, and at full time, when there's another tea kettle pickup, we had the power consumption fall again. We demonstrated also that we could reduce power at the signal of a utility within 30 seconds. So the utility kind of tried to surprise the Emerald AI system running on these Nebius Nvidia GPUs.

And within 30 seconds, the system reduced power to 30% below baseline. Within roughly a minute, it was down 40%. So we're able to precisely control the power draw while at the same time protecting the mission-critical workflows. Brad, you asked about the mechanics. We want to make sure that we're slowing down the jobs that the customer has designated as flexible, and not the jobs they've designated as mission critical.

Brad Langley: So Varun, I gotta admit, when I came into this conversation, I wasn't thinking about a connection between tea kettle spikes and data center usage. Was that a known phenomenon you were trying to solve for, or was it something you discovered through your conversations with the utility?

Varun Sivaram: A bit of both. It's a well-known phenomenon, but the utility did send us the national system operator's write-up of what happened during that Euro 2020 match. And I think that the point here is to generally prove that AI can do a lot of stuff. It can be flexible to meet lightning strikes or tea kettle spikes or air conditioning peak load surges in Phoenix. No matter where you are, there'll be some idiosyncratic thing that you need to help the grid with. And AI is this very versatile energy user. We want to change the narrative from AI being an inflexible, always-on source of demand to a power user that is intelligent. We want AI to be an engine for good, and I think there's no way to do it unless it has this power-flexible capability.

Brad Langley: So back in October, I know that Emerald AI, along with Nvidia, Digital Realty and PJM announced the Aurora AI Factory in Virginia, which is set to be completed by midyear. Can you give us more details on that project and what it entails?

Varun Sivaram: This 100-megawatt data center will be the world's first commercial-scale AI factory built to Nvidia's new reference design that's able to flex in response to grid signals. We're partnering with PJM, the grid operator, and with Dominion, the local utility, under the auspices of EPRI, the national and international utility research association, to test what this data center is capable of: say, reducing by 25% in response to a peak load moment. If the data center can do that, responding to signals, it'll become a more polite citizen of the grid. It'll help the utility to meet all of its other obligations and keep the electric system in balance. And the benefit is we can fit so many more data centers. There's a hundred gigawatts of new data centers that we could fit onto American power grids across the country if they had this capability of reducing for just a couple hours at a time during those peak rush hour moments.

So that's what we're excited to deploy for the first time in the world.

Brad Langley: When it comes to data centers, uptime, speed, reliability is so critical. When you guys first set off on this endeavor, how much pushback did you get from the hyperscalers? Were they skeptical of this? Obviously the customer results that you've had go a long way towards proving the concept now, but how much of your early work was just convincing the hyperscalers that this could in fact work and not diminish their operations or their ability to deliver?

Varun Sivaram: Look, we are delighted. Google, for example, has been such a pioneer in this field. Many of Google's top executives, like Google's chief sustainability officer and Google's chief scientist, Jeff Dean, are our angel investors. I think what Google has done is demonstrate to the world that power flexibility is possible. They did so to advance clean energy, and Emerald's doing it not just for the Google ecosystem but for the rest of the world, in order to demonstrate that AI can be a grid-friendly citizen. So a slightly different objective. But I really think that a lot of the foundational work was done by folks like Google or folks at Microsoft Research. We're big fans of the extraordinary technical sophistication of the hyperscalers. I will say that Emerald works with a lot of folks. For example, in addition to our demonstrations with Oracle, as I mentioned, we're announcing this brand new London proof point with Nebius and NeoCloud, and many of Nvidia's cloud partners are some of our closest partners.

We go hand in hand with Nvidia to say, "Hey, if you're an Nvidia cloud partner and you want to get access to power, you want to skip the line, come work with Emerald AI. We'll get you quicker access to power so that we can democratize this AI revolution around the world." There are these exciting and emerging cloud companies. It's not just a few folks now. It's many folks that are developing AI clouds, GPUs as a service, bare metal access, and all of them can benefit from this power flexible capability following Nvidia's reference design that Emerald's really proud to be a part of.

Brad Langley: So we've established that data centers are using your software to coordinate and optimize energy use, but there's also other conversations and other tools around things like energy storage and onsite DERs and such to support data center load growth. Looking ahead, what do you expect data centers will use to better manage energy through flexibility and what approach can do the most to improve or address load growth through data centers?

Varun Sivaram: Look, it will definitely need an “all of the above” strategy. We are going to need more generation on the grid. We are going to expand our transmission and distribution grids. We are going to have behind-the-meter resources at data centers, and we're going to have computational flexibility, both temporal, slowing stuff down, and spatial, moving stuff from one place to another. All of these will be important parts of a holistic solution. And I think Emerald is going to play a key role across these domains, orchestrating both the onsite resources and the computational flexibility. So in this “all of the above” solution, I think flexibility will emerge as one of the top tools, and every data center should have it, as we move toward a world where inference needs to be as cheap as possible so we can use as much of this scaled intelligence as we can.

AI inference, as Sam Altman says, will converge toward the cost of power. Well, if that's the case, we need to make sure that we are minimizing the cost of that energy system and the best way to do that is through flexibility.

Brad Langley: So we know that hyperscalers are under growing pressure from the federal government to lessen the impact of their AI data centers on the grid. Curious what your take is on the degree to which solutions like Emerald AI’s can move the needle on that problem. Might this come down to data centers just having to build more generation and pay more for the energy they use, or is that more of a nuclear option?

Varun Sivaram: Again, in the long run, you want to make AI as cheap as possible. And so if you're forced to build two natural gas plants redundantly in your backyard to power your data center, that's just an inefficient way to go. Now, we'll certainly see a lot of folks do that in the near term. But if what you really want is a grid connection, to take advantage of all the benefits the grid offers, being flexible will probably be a prerequisite, because that's the only way you keep costs down, meet the Trump administration's goals of affordability, and also get to build as much AI as we can to scale our competitiveness and innovation.

So I think flexibility has quickly risen from a contrarian thesis when I founded this company ("That's a dumb idea") to a leading contender ("That seems like an interesting idea") to a prerequisite and requirement ("Can't do this without flexibility"). That's the progression we're in the middle of, and I feel more confident than ever that this was the right direction to go.

Brad Langley: And there's no doubt you're intercepting the market at the right time because this is a growing need for sure. So last question for you. We call the show With Great Power, which is a nod to the energy industry. It's also a famous Spider-Man quote, with great power comes great responsibility. So Varun, what superpower do you bring to the energy transition?

Varun Sivaram: I think I bring versatility. I've been fortunate to work in so many domains: deep scientific academic research; public policy as a diplomat, where I led American clean energy diplomacy for John Kerry; the private sector as an executive at large companies; and now a small company as a startup CEO and founder. I've been fortunate to see so many different settings that, as I build a company that's effectively an interface company, a company that has to be good at AI and good at energy, I think we've got a chance of succeeding. And it's a hard thing to do.

You learn in business school that a company should do one thing really well. We're trying to do four things really well, and we have to straddle the energy-AI interface. We have to be native in AI. We need to understand how AI computations work and how Nvidia chips work.

We have to understand the regulations of power systems and communicate very well with utilities to ensure they can reliably meet their service obligations. There's a lot of stuff Emerald needs to do well. And my hope is that the versatility that I and our phenomenal team bring will enable us to succeed. Of course, with great power comes great responsibility. Our great responsibility is that we’ve got to keep the grid safe, reliable, and affordable for everyone, while at the same time making AI grow like a rocket ship. AI is the most important advancement in human history.

Brad Langley: Well, Varun, congrats to you and the team at Emerald AI on your success so far. And thank you very much for your time.

Varun Sivaram: Thanks for having me, Brad.

Brad Langley: Varun Sivaram is the CEO of Emerald AI. With Great Power is produced by GridX in partnership with Latitude Studios. Delivering on our clean energy future is complex. GridX exists to simplify the journey. GridX is the enterprise rate platform that modern utilities rely on to usher in our clean energy future. We design and implement emerging rate structures and we increase consumer investment in clean energy, all while managing the complex billing needs of a distributed grid. Mary Catherine O'Connor produced the show. Anne Bailey is our senior editor. Stephen Lacey is our executive editor. Sean Marquand composed the original theme song and mixed the show. The GridX production team includes Jenni Barber, Samantha McCabe, and me, Brad Langley.

If this show is providing value for you and we really hope it is, we'd love it if you could help us spread the word. You can rate or review us at Apple and Spotify, or you can share a link with a friend, colleague, or the energy nerd in your life.

As always, thanks for listening. I'm Brad Langley.