
Overview

Artificial intelligence may live in “the cloud,” but its footprint is firmly on the ground. As AI systems grow more powerful, the data centers that train and run them are consuming massive amounts of land, water and electricity—as well as reshaping regional power grids. What does this surge in demand mean for the environment, energy infrastructure, and the future of innovation?

In this episode, we speak with Prof. Andrew Chien, a computer scientist at the University of Chicago and a senior computer scientist at Argonne National Laboratory. An expert in large-scale computing systems and cloud computing, he explains why these data centers require so much power and why they're stirring such controversy, and he proposes a sustainable approach to data centers that could keep our energy use in check.

Transcript

Paul Rand: Data centers seem to be a hot topic of conversation everywhere these days.

Tape: Now to an ABC News investigation, half of all the new demand for electricity over the next five years in the US is expected to come from data centers, they are the backbone of AI technology.

Paul Rand: Construction of these massive facilities is booming because of the growing demands of the AI tools that these centers power.

Andrew Chien: Everything is driven by computing these days, right? So, all of those things all require horsepower and compute services to implement them and so we, in some fundamental way or the businesses in some fundamental way, are driving this rapid growth.

Paul Rand: That’s Andrew Chien, a professor of computer science at the University of Chicago. He’s a leading researcher in large scale computing systems and an expert in sustainable computing.

Andrew Chien: It’s absolutely clear that this rapid growth of power that was happening a little bit before but now really accelerated with the advent of AI is far outracing our ability to make progress so, in a very literal and direct way, the environmental damage that computing is causing is growing rapidly.

Paul Rand: Even though data centers power much of the new technology we use, they're also drawing concerns about their environmental impact, along with worries about whether they'll drive up household power bills or even cause blackouts.

Tape: Thousands of data centers are operating nationwide with hundreds more planned, some in the Chicago area. Companies point to economic benefits but residents are raising concerns.

Paul Rand: By the year 2030, it's estimated that data centers could be using up to 9% of the total electricity generated in the United States. But Chien and others have proposed innovative approaches that could make data centers more sustainable going forward, from using our power grid more efficiently to running them on renewable energy.

Andrew Chien: And so, what we really need to do is take the technology we have and figure out how we keep society happy, protect environments and health and neighborhoods but build out the capacity we need in the power grid to support these data centers.

Paul Rand: From the University of Chicago Podcast Network, welcome to Big Brains, where we explore the ground-breaking ideas and the discoveries that are changing our world. I'm your host, Paul Rand. Join me as we meet the minds behind the breakthroughs. On today's episode: what's driving the data center boom and how we can build more sustainably.

I have to tell you, every time I turn around and open up a newspaper, a magazine or a podcast, there is a story about data centers going on. This must be rather all encompassing for you these days.

Andrew Chien: Well, yeah, they’re everywhere, right? So, the good side of it is the US is the world leader in data centers.

Paul Rand: Okay. So, is that a good thing?

Andrew Chien: I think it’s a good thing. I think it reflects the fact that data centers are synonymous with computing capability and computing capability, I think we all believe, is a part of advancing society through digitalization and AI and other kinds of higher level information services.

Paul Rand: Why does almost everything in our digital lives actually depend on one? And if somebody walked into a modern data center today, what would they actually see?

Andrew Chien: Yeah. So, why does everything digital depend on them? Well, it’s this idea of a factory. So, if you want to build things efficiently or, in this case, deliver digital services efficiently, in many different areas, we do it more efficiently when we do it at scale. So, this is why the cloud data centers are more efficient than separate enterprise IT data centers.

Paul Rand: Okay.

Andrew Chien: So, you put them together, you’re able to make this thing slightly more efficient, manage it efficiently and so on. So, that’s actually why you’re seeing this aggregation. Now, why is there overall growth and the answer is just walk around you and look at you, all the things you’re doing and seeing and touching and hearing and the answer is everything is driven by computing these days, right?

Paul Rand: Yup.

Andrew Chien: And it’s not just a little bit of stuff, it’s multiplying almost exponentially every year, layers and layers and layers on top, analysis, data collection, optimization for business, for entertainment and for advertising certainly. So, all of those things all require horsepower and compute services to implement them and so, we in some fundamental way, or the businesses in some fundamental way, are driving this rapid growth.

Paul Rand: Okay. You talked about where this really got supercharged was when ChatGPT came onto the scene. And I wonder if you can talk a little bit more about what makes these AI models so much more energy hungry, for example, than streaming or even cloud storage.

Andrew Chien: Yeah. The reason is that they all depend on what we call machine learning or AI and machine learning is not the traditional way of writing programs. Nobody writes code for machine learning models, what they do is they take a huge amount of data and they distill probabilities by learning from that data. So, they crunch over this data and they compute statistics over it and those statistics are then embodied in this generative AI large language model.

So, that large language model has literally billions of weights, in fact, some of the big ones have hundreds of billions of weights and, instead of running a program to respond to your request or something like that, they use this model to compute probabilities to predict what the next word of response should be to your request. So, think every time you get a new word, they compute 100 billion or maybe a trillion compute operations, adds and multiplies and this kind of stuff.
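Chien's per-word figure lines up with a common back-of-envelope rule: roughly two arithmetic operations (one multiply, one add) per model weight for each generated word, or token. A minimal sketch of that arithmetic; the two-operations-per-weight constant is a rule-of-thumb assumption, not a measured figure:

```python
# Back-of-envelope: operations needed to generate one token,
# assuming ~2 floating-point ops (multiply + add) per model weight.
def ops_per_token(num_weights: int) -> int:
    return 2 * num_weights

# A 100-billion-weight model, as in the interview:
print(f"{ops_per_token(100_000_000_000):.1e}")  # 2.0e+11 ops per token
```

By this rule, the models with "hundreds of billions of weights" that Chien mentions land right in his quoted range of 100 billion to a trillion operations per word.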

Paul Rand: So, that’s a really clear explanation of what’s driving this. And I’m imagining that most of the folks that are listening right now are probably using Chat or Claude or Gemini or some other tool on a pretty regular basis. What is it that you think they need to understand about what this is costing in energy and the implications of that usage?

Andrew Chien: Gosh. I think the simple truth is they’re driving explosive growth of power consumption in the internet in a very substantial way collectively, of course.

Paul Rand: Yup.

Andrew Chien: And I think it’s a huge problem because, for the last 10 or 20 years in Europe and in the United States and in many places around the world, we were trying very hard to reduce carbon emissions-

Paul Rand: Yes.

Andrew Chien: ... and environmental destruction that comes from electricity use and particularly computing electricity use has been an area that I focused on. And we actually were making progress but it’s absolutely clear that this rapid growth of power that was happening a little bit before but now really accelerated with the advent of AI is far outracing our ability to make progress.

Paul Rand: Okay.

Andrew Chien: So, in a very literal and direct way, the environmental damage that computing is causing is growing rapidly.

Paul Rand: So, that raises ... It ought to raise some real questions and I think I saw a stat that you talked about that computing could hit eight to 10% of US electricity by, gosh, 2030 or so and that’s really only four years away at this point. Are we at the point of being alarmed or should we be?

Andrew Chien: I think we should be alarmed-

Paul Rand: Okay.

Andrew Chien: ... because we need to work very hard to find ways to reduce the environmental impact of that power consumption. I think it’s probably forlorn hope that we could stop or slow the increase of power consumption from AI. At this point, I think that horse has left the barn.

Paul Rand: It’s not going to happen.

Andrew Chien: Yeah. And I would say the numbers you quoted are exactly right, it could be eight to 10% by 2030. But really what you should be thinking about is could it be 20% or even 25% by 2035?

Paul Rand: My gosh.

Andrew Chien: Because there’s no one who says it ends in 2030, right?

Paul Rand: And likely it won’t be for that matter. It’s interesting, one of the things that is ... You hear about limits that go onto systems and I’ve only experienced it with Claude but they’ll let you go to a certain point under your plan and then they cut you off. Is that because of the energy consumption?

Andrew Chien: It’s some combination of energy and then the cost of that share of the GPUs that you’re consuming, it’s somehow costs to the company. But yeah, I think that’s a good sign of indicating to you that it actually is costing something significant every time you do one of these queries. And there’s a lot of people who are in the financial side who are saying, “Oh, my God, how can these companies continue to give away so many free prompts as they call them or tokens as they call them?” And I think you’re beginning to see some of the limits around that where they’re saying, “Oh, my gosh, we’re losing so much money, we’re consuming so much power, we have to limit it or we have to start getting real money from people to pay for these things.”

Paul Rand: Well, I guess in some industry that would be called giving them a taste.

Andrew Chien: Giving them a taste, yeah.

Paul Rand: Giving them a taste.

Andrew Chien: There’s something called the freemium model, right?

Paul Rand: Yes.

Andrew Chien: Yeah.

Paul Rand: So, now, you personally have worked on the design of some really gigawatt scale systems for major companies whether it’s Google or Meta, Amazon, Microsoft, and others. Tell us a little bit about how you got involved in doing that and what has surprised you as you get into these types of opportunities.

Andrew Chien: Gosh. Well, we’ve collaborated with researchers in those companies for many, many years and the way I got into that is, when I came to U Chicago in 2011, I had been the VP of research for Intel for the previous five years and learned all kinds of things about the future of technology, the end of Dennard scaling, the end of Moore’s law, some of these technical silicon advances that have been happening and it was pretty clear to me that all of that stuff was at an end.

Now, Dennard scaling was definitely already at an end, Moore’s law limped along for maybe another 10 years but now has been roundly declared to be at an end by folks no less luminary than Jensen Huang of Nvidia so it’s pretty much over. So, what we figured out-

Paul Rand: And by the way, for Moore’s law, that basically meant more computation for less energy, is that then the whole doubling component of it?

Andrew Chien: Actually, Moore’s law was more transistors for the same money: two times the transistors every two years for the same money. Dennard scaling was more computing for less power.

Paul Rand: Ah, okay.

Andrew Chien: And that actually ended 2008, 2010, people can argue about exactly when.

Paul Rand: Okay, okay.

Andrew Chien: And so, what I figured out coming out of that, and starting to think deeply as professors at U Chicago want to do, is what does this mean for the future? Computing scaling is changing in a fundamental way, what does it mean? And my conclusion was we’re going to have a big power problem. So, we started working on power for data centers, under the clear assumption that it was going to be a massive problem, in 2015, actually. So, that's now 10 years ago and, for the first seven years of that, as you’ve just heard, it was focused on cloud scaling which, I’m going to tell you, is forgotten now because the journalists like to talk about how AI caused this big problem, but cloud power consumption was growing at 25% a year in that era.

So, anything that grows at 25% a year eventually becomes a problem. And now with AI, we’re growing much faster than that because we added on top of that another 10, 15, 20% and that’s why we have this gigantic problem.

Paul Rand: Okay. So-

Andrew Chien: So, anyway, you asked how I got into this-

Paul Rand: Yeah, please.

Andrew Chien: ... so that’s when we started thinking about data centers and power grids and how do we actually green this growing power consumption, how do we make it less damaging.

Paul Rand: Okay. And all these companies had an interest in looking at it from both sides as well?

Andrew Chien: I think the researchers definitely have an interest, I think these companies have, for a long time, marketed themselves as being green, environmentally friendly and so on. Depending on the company, it’s true to a greater or lesser degree but it’s definitely aspirational for all of these companies so we worked with researchers on that. What did we learn from that? We learned a couple of things. We learned all kinds of technological things about the data centers and what’s possible and what’s not, that was good technical research. But we also learned that, fundamentally, these companies are not capable of governing themselves to reduce environmental damage.

Paul Rand: So, my understanding is that we’ve got, at least in this country, and you referenced the high number in the beginning, over 5,400 data centers already in the US. Is that about right? But that’s not going to keep up; that demand is going to double or triple in the next few years. So, what risk, if you got really down to it, does that doubling or tripling mean for society, for the environment, for our ability to progress?

Andrew Chien: I think of it as a secular change in the environment and how we live. It’s like if somebody told you some company was going to build 5,000 more big box stores, you wouldn’t be saying, “Oh, what’s the fundamental societal risk for that?” You’d be like, “Oh, I don’t want one in my neighborhood,” or something like that. So, I don’t think there’s qualitative risks like that, however, if you go to the climate side, you’d have to say those folks and people worried about climate are worried about risks. So, to the degree to which those additional data centers produce more carbon emissions, produce more material depletion and so on, they exacerbate the problems we have with the environment so I suppose that’s a form of a risk. They risk, frankly, accelerating climate change or, maybe a little bit more optimistically, slowing the rate at which we’re able to slow climate change and making it harder for us to meet our 2035 or 2050 goals.

Paul Rand: Okay. So, as you see this, it’s interesting, in your mind, you’re not necessarily framing this ... You’re framing it more as an opportunity than a risk. Does that mean you think the risks are not as prominent or that they can be overcome or we ought not to be putting our energy into that right now?

Andrew Chien: There’s no question that the data centers add value to the economy in a large and indirect way globally. You and I are benefiting from data centers in Iowa, in Washington State and so on because some service we use runs in those data centers. But when the data center comes to my neighborhood, what you’re seeing in the community, actually, and the backlash against the data centers is I don’t get a direct and tangible benefit from that data center being my neighborhood and I don’t like the look of it or it causes traffic or it’s consuming our water resources and so on and so forth.

So, I think that the community challenge that the data center companies face, and I think I’m beginning to see some signs of progress in this area, is that they need to openly engage the communities and deliver value directly into the community. What could that be? That could be job training programs, that could be high-tech jobs sited in the area and that could be donation and facilitation of AI services for schools, for the governments, all kinds of things. And they so far haven’t really done that very much, they've viewed it as, well, I’m just going to be a building there and sit there in your neighborhood, but I think they need to do much more.

Paul Rand: If there is a data center going up and they’re growing, does that represent any risks to our current grid and do we need to think about upgrades to that grid to be able to provide power to them? And is it possible that people are concerned that that power, there won’t be enough energy to provide for other parts of their lives?

Andrew Chien: Yeah, it’s absolutely a risk in the sense that you’re describing if the power grid folks do nothing. So, if you analyze it in that sense. But, actually, we already know how to fix these problems, we have technology, more transmission, more generation, all of these kinds of things, what we haven’t done is put in place the regulatory policies and the approval structures so that we can build the resources we need in order to support this large number of data centers. In the United States, for the last 10 years, we’ve had minimal power increase on the national level and that’s because-

Paul Rand: Production of power?

Andrew Chien: Production of power or consumption of power equally.

Paul Rand: Okay.

Andrew Chien: And the reason for that is that we have a heavily services-based economy and the manufacturing has been slowly decreasing so we don’t have the need to run big blast furnaces or chip foundries or something like that that consume lots of power. Now, with AI and these big data centers, all of a sudden we need to go from flat capacity to increasing capacity, at really not such a huge growth rate, a couple of percent per year, because it’s not just the data center growth, it’s on that whole base of all of the power growth. This is not extraordinary in either the history of the United States or in the recent history of other countries in the world. China, on an annual basis, grows their power grid by 5% and they have done so for the last 30-

Paul Rand: Annually?

Andrew Chien: ... 30 or 40 years.

Paul Rand: Huh.

Andrew Chien: And furthermore, their power grid is now larger than the United States’ power grid so their annual increase is by far greater than what we need to do in the United States. And so, what we really need to do is take the technology we have and figure out how we keep society happy, protect environments and health and neighborhoods but build out the capacity we need in the power grid to support these data centers.

Paul Rand: How much does location matter for the data centers and for emissions and grid impact? Does that come into this and is there a reason that they should be in certain areas, i.e., more close to dense populations or is that not really relevant?

Andrew Chien: Yeah. So, that question’s complicated and the reason is that proximity matters to different people for different reasons. So, the data center companies will tell you I want some of my data centers to be near big urban centers because I want to serve low latency loads in those big urban centers. But as you just mentioned, Meta is building a giant AI training facility in Louisiana, not in a densely populated area, not near large urban centers. You can look at the population map of the United States, there’s nothing large close to there, anywhere close to there. Likewise, xAI is building a giant data center in western Tennessee, bordering on Mississippi, again, not near any large urban centers.

So, there are some data centers that are being placed in certain places because they can get cheap land, because they can get access to power and, frankly, because regulation is lax in those areas. To be very honest about what’s going on, they can have their way with the local regulatory authorities. But we also know that some companies are placing data centers in places like the Northwest in Washington State, near big hydro dams and so on, so they can have clean power.

So, the power grid or the climate side of it is there are certain power grids in the United States that have much lower carbon emissions per unit electricity and, if you’re worried about the climate, you’d like to have the data centers in those power grids because all power consumption is local, you only get power from your local grid, you don’t get it from across the country.

So, the reason location is complicated is because these different constituencies all want different things. And then, by the way, you and I, or the NIMBY folks, they just care about their neighborhood, right?

Paul Rand: Right.

Andrew Chien: If you allow these data centers to build their own power generation, then, all of a sudden, they’re outside of the system, right?

Paul Rand: Yeah, that’s true.

Andrew Chien: So, they’re doing what’s called behind the meter generation and I think it becomes much more difficult to gather data, to report, much less regulate those kinds of things. So, another way to do this would be to have them construct facilities and put them under the management of the power grids.

Paul Rand: I got you, okay.

Andrew Chien: And then you would maintain that regulatory integrity while bringing the power and the assets and the capabilities of these data center companies to help the power grid forward.

Paul Rand: Okay, interesting solution. And my understanding also is the data centers are using a tremendous amount of water for cooling. And certainly, if that’s the case, there’s certain parts of the country or world where that may have a bigger material impact than others. How significant is that impact and is it getting worse?

Andrew Chien: Yes. The data center community discovered 10 or 15 years ago that the cheapest way, and that’s the key observation, cheapest way to cool a data center is to spray water in the air, evaporative cooling, which we used to call, when I was a kid, Texas air conditioning, right?

Paul Rand: Okay, yup.

Andrew Chien: Yeah. So, the water, when it sprays into the air, it evaporates, the thermal absorption of that phase change from liquid to gas we learned in high school chemistry cools the surrounding air so that’s what they’re doing. The problem with that is you basically are destroying water in the ecosystem. So, you’re taking surface water, potable water and you’re spraying it out in the air and then the air moves away and it’s gone out of the ecosystem, it’s evaporation, accelerated evaporation. And the data centers that were built over a 15-year period heavily made use of evaporative cooling because it was the best practice and so, as a result, these data centers that exist already consume massive amounts of water.

Now, in some areas, it’s less of a problem. I live in Chicago, in Chicago, we’re next to the Great Lakes which is the largest single assemblage of fresh water in the world so you could argue maybe we could afford to do that. If you’re in Arizona or you’re in Utah, there is no such plentiful water supply and it would be irresponsible to pump groundwater in Arizona and use it to evaporative cool. So, there’s been a big push and the data center companies are starting to respond to that. Not all of them have responded but a number of leaders have actually made commitments to not build any new data centers that use evaporative cooling. So, if you [inaudible 00:23:42]-

Paul Rand: Okay. What’s the option to that, by the way?

Andrew Chien: Sorry? What’s the option?

Paul Rand: What’s the option to evaporative cooling?

Andrew Chien: Yeah. So, your air conditioner in your house runs a compressor and it transfers the heat from inside your house to the outside. So, if you don’t use evaporative cooling, then you use a system like that, HVAC system that transfers the heat to the outside. So, that works, that’s great-

Paul Rand: Okay.

Andrew Chien: ... the only problem is that you have to have bigger capacity of these cooling units and you have to consume more electricity in order to do that. We all know, when our air conditioning runs, it  consumes electricity. So, that further exacerbates this growth of power requirement but people are pushing on that and that’s where they’re going. So, like I said, a couple of the leading companies have promised that all their future data centers will not use evaporative cooling but, as far as I know, none of them are going to convert the existing fleet. And some of the companies have not yet promised to not build evaporative cooling data centers so I think it’s a continuing problem.

Paul Rand: Let me, if I can, change direction for a second and talk a little bit about some of your work. You have proposed something that you’re calling the zero carbon cloud or even the ZC cloud. Before we get into how that works, if you can tell me about what the insight is that led you to this idea.

Andrew Chien: So, we started studying this phenomenon called curtailment, which is when the electricity is not allowed in the grid, and we started studying this phenomenon of negative pricing and we realized, if we used electricity in the right time and right place, then we could actually acquire large quantities of electricity for free or for even a negative price, where they would pay you to take it; hence, the idea of zero carbon cloud. So, now the idea is, if I have this excess renewable energy and I consume it for my data center, then I can actually plausibly claim that that data center has zero carbon emissions, operational carbon emissions, to be precise.

So, that was the idea. Now, what’s the price of that? The price of making that work is you got to be in the right place, that place might be in the middle of Texas, actually, like where Stargate is or in the middle of Iowa or something like that.

Paul Rand: By the way, I was just there a couple of weeks ago, it was amazing.

Andrew Chien: That’s an amazing place.

Paul Rand: Yeah.

Andrew Chien: And then the second thing about it is you have to consume electricity at the right time, okay?

Paul Rand: Yup.

Andrew Chien: So, you don’t always have this excess electricity. So, we engaged, being good scientists, in all kinds of research studies to figure out how much of it, when and where and we were able to find places where you could get electricity 90% of the time which is pretty good. 90%, pretty good, that’s an A in a lot of classes but 90% is not enough to make data centers happy, they typically want 99.999%. But the idea of zero carbon cloud is I can deliver this 90% capability with zero carbon.

Paul Rand: So, how is the ... And by the way, I’ve heard of this as talked about as stranded power, is that the same thing?

Andrew Chien: Yeah. We use the term stranded power to apply to the combination of curtailed energy and negative price power.

Paul Rand: Got it, okay.

Andrew Chien: Yeah.

Paul Rand: So, the two together is stranded power?

Andrew Chien: That’s right.

Paul Rand: Understood. Tell me the reaction that you’ve gotten to this both on the encouraging side and the discouraging side.

Andrew Chien: On the discouraging side, I think the traditional data center folks said, “You’ve only got one nine of reliability, we want five, you’re nowhere close.” The encouraging side was a whole community actually took this up and it was the Bitcoin mining community. So, they do this intermittent use of power and they usually do it for low cost but, in fact, a number of them took this up for zero carbon, so a low-carbon kind of Bitcoin mining. Now, I’m not a big fan of Bitcoin mining but that was super encouraging to see that there were some uses that would take up this idea and do it at scale, and they did it at gigawatt scale, so very, very large scale stuff.

The other thing that happened is that I’ve described this in classic academic absolutes, I put a whole data center there, I turn the whole thing on and then I turn the whole thing off. The world doesn’t really work like that, that’s a black and white characterization of something that has many shades of gray. In the cloud, everything is virtualized, virtual this, virtual that so it’s really easy to think about a data center being virtualized into two parts. One part that has reliable power and one part that operates in this zero carbon way, it goes on and off and on and off. And you put those two things together and you have a data center that doesn’t go on and off, its capacity goes up and down slightly. And that refined version of that zero carbon cloud idea is what we’ve been pushing which is data centers collaborating with the power grid to help the power grid reduce its emissions overall.
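The "split the virtual data center in two" refinement Chien describes reduces to a one-line capacity model. A toy sketch; the 50/50 split and the 90% stranded-power availability below are illustrative assumptions, not figures from his studies:

```python
# Toy model of the refined zero-carbon cloud idea: part of the
# facility runs on firm (always-on) power, the rest runs only
# when stranded renewable power is available.
def effective_capacity(firm_frac: float, stranded_avail: float) -> float:
    """Long-run average capacity, as a fraction of the full facility."""
    opportunistic_frac = 1.0 - firm_frac
    return firm_frac + opportunistic_frac * stranded_avail

# Half firm, half riding 90%-available stranded power: capacity
# never drops below 50% and averages 95%, instead of going on/off.
print(round(effective_capacity(0.5, 0.9), 2))  # 0.95
```

The design point is exactly what Chien says: the facility as a whole never goes dark; only its available capacity breathes up and down with the grid's excess renewable energy.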

Paul Rand: Can you give a little context for how much power does exist or do you think exists from this concept of stranded power? How much are we talking about that, for lack of a better word, could be recaptured?

Andrew Chien: Oh, it’s a lot, it’s a lot. We did a bunch of studies of the Midcontinent ISO, so that’s the power grid that runs from Manitoba down through the center of the country, covers Iowa, Minnesota, most of Illinois except for Chicago, all the way down to Louisiana, so Midcontinent ISO. And that power grid has a good bit of wind energy and a good bit of solar energy and the study we did showed that, for much of the time, 70 to 80% of the time, grid wide, there were several gigawatts of excess energy, so that’s one characterization of it, stranded power. We did a study of the California power grid, the California ISO, and we’ve written a recent paper on this that shows that, in recent times in the spring when they have the most excess energy, the stranded power is peaking at five gigawatts during the day when they have a lot of sun and hydro runoff and all of those kinds of things.

So, the quantities of this power are really massive and they’re only growing because, the more renewables you add, the more of this you have. And people are combating it by trying to buy storage but it’s very expensive to buy that storage, it’s much better to find some other way to consume this excess energy when it’s created.
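Translating those figures into annual energy gives a feel for the scale. A purely illustrative calculation; the 3 GW average and 75% availability below are example values picked from the middle of the ranges quoted above:

```python
# Illustrative: annual stranded energy if avg_gw of excess power
# is available avail_frac of the 8,760 hours in a year.
def stranded_energy_gwh(avg_gw: float, avail_frac: float) -> float:
    return avg_gw * avail_frac * 8760.0

# e.g. 3 GW of excess available 75% of the year:
print(round(stranded_energy_gwh(3.0, 0.75)))  # 19710
```

That is on the order of 20 TWh per year in a single grid, which is why finding loads that can absorb this energy when it appears looks so much more attractive than buying storage for it.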

Paul Rand: Let us keep going down this solution path beyond even ZC cloud and you talked a little bit about cooling. There is this idea about advanced cooling, I’m not sure if what example you were giving falls into that category or there are other models of cooling that you think could actually be advantageous as well.

Andrew Chien: Gosh. Boy, cooling is a big topic. Who would’ve thought that cooling would be so interesting?

Paul Rand: Absolutely.

Andrew Chien: It’s interesting because power levels are becoming astronomical. Nvidia, in their reference design racks, have gone from 160 kilowatts to 250 kilowatts to a megawatt, they literally have a demonstration rack for 2028 or ’29 or something that’s a megawatt, it’s huge, huge numbers. So, getting the heat out from that is really difficult and DARPA has some advanced programs in how do you get the heat out of chips and packages and so on, and the industry is bringing to scale all kinds of conventional technologies like traditional water cooling, and phase change cooling where you are boiling off a coolant and using the phase change to move the heat out very efficiently. I think we’re going to see all of those things coming in the near future because the power densities continue to increase.

We’re in big trouble at the data center level because some of our recent studies show that the amount of equipment required to do cooling is starting to be as large as the amount of equipment to do the computing.

Paul Rand: Okay, wow.

Andrew Chien: So you think you have a data center, it has 10,000 square feet in it, you might think, oh, that’s 10,000 square feet of computers. Well, no, it’s 10,000 square feet of computers and then ... Sorry, 5,000 square feet of computers and then 5,000 square feet of cooling equipment.

Paul Rand: Extraordinary, okay.

Andrew Chien: Well, that’s pretty bad already but now you take the power density and you double it again, now you’ve got one quarter computing equipment and three quarters cooling and then you doubled it again and there’s almost no computing in the building, right?

Paul Rand: Yeah, wow.

Andrew Chien: So it’s all cooling equipment. So, clearly, that’s a problem and no one is happy about that and people are trying to find solutions.

Paul Rand: Give me a look five years out, where are we? What’s working better? What are we still concerned about?

Andrew Chien: Well, let me give you an optimistic view of that.

Paul Rand: Please.

Andrew Chien: I think there’s no question that there’s a huge amount more power being consumed by AI but I hope, in terms of progress, that we adopt and regulate and incentivize cooperation with the power grid so that the data centers can actually grow with the minimum negative impact on the grid and the amount of generation you need to add and all of those kinds of things. And I hope that, five years from now, we also have a much better-established expectation for community contribution for investment in data centers in their community, it can be jobs, it can be job programs. I think the data center companies say they want speed and they want predictability, they should want this, they should want a standard formula. I go to your community, here’s the standard formula, everyone’s happy, please come to our community, please deliver on the formula and that would help everybody, I think.

So, I hope those two things happen but I’m afraid it’s the case that, five years from now, we will be looking with some great trepidation at growth in data center power consumption from the eight or 10% that we expect today to some much larger number thinking, dear, how are we going to deal with that. And then the other thing is that I want people to understand that I don’t really think this is a choice of whether or not to do this, whether or not to have more data centers and so on.

Paul Rand: The cat is out of the bag.

Andrew Chien: It’s something we all need and are going to depend on and it’s going to be critical to our economy and so on and so forth. Suppose we were talking about electrification in the 1950s, and folks were saying, “We don’t want electricity, we don’t need electricity,” gosh, that’s laughable now. So, I think that’s the situation, so I don’t think there’s an option to not go forward; the question is how to best go forward in this current situation.
