Engineering | Microsoft Research

Can we AI our way to a more sustainable world?

Doug Burger, sustainability expert Amy Luers, and optimization researcher Ishai Menache examine the global emissions implications of datacenter operations, efficiency gains, and AI's potential across electrification, ma...


Technical advancement is moving at such a rapid pace that it can be challenging to define the tomorrow we’re working toward. In The Shape of Things to Come, Microsoft Research leader Doug Burger and experts from across disciplines tease out the thorniest AI issues facing technologists, policymakers, business decision-makers, and other stakeholders today. The goal: to amplify the shared understanding needed to build a future in which the AI transition is a net positive.

In this episode, Burger is joined by Amy Luers, head of sustainability science and innovation at Microsoft, and Ishai Menache, an optimization researcher at Microsoft Research, to explore how AI can both contribute to and help address climate change, emphasizing the need to separate hype from data and understand its real impact. While datacenters account for a small share of global emissions, their rapid growth raises local infrastructure concerns, even as AI offers powerful tools to optimize complex systems and accelerate climate solutions. The discussion frames AI as a critical but double-edged technology that must be steered carefully to support a sustainable future.

Subscribe to the Microsoft Research Podcast:

  • Apple Podcasts
  • Email
  • Android
  • Spotify
  • RSS Feed

Transcript

[MUSIC]

DOUG BURGER: This is The Shape of Things to Come, a Microsoft Research Podcast. I’m your host, Doug Burger. In this series, we’re going to venture to the bleeding edge of AI capabilities, dig down into the fundamentals, really try to understand them, and think about how these capabilities are going to change the world—for better and worse.

In today’s podcast, I’m bringing in two experts to have a dialogue about the future of AI and sustainability. One, Amy Luers, is an expert on sustainability and the intersection of sustainability, technology, and science. And the other, Ishai Menache, is a world-renowned expert in optimization.

And so thinking about how technology can optimize systems, we’re going to talk about whether AI has the potential to help with climate change and sustainability and the degree to which there are challenges associated with AI. And we’re going to try to get to the root of the issue because that will determine the shape of things to come.

[MUSIC FADES]

I’m really excited about the two distinguished guests I have today. We have Amy Luers, who’s Microsoft’s senior global director for sustainability science and innovation. And we have Ishai Menache, who is a partner research manager at Microsoft Research.

And then the topic, of course, is AI and the climate and sustainability, which I think is on a lot of people’s minds. You know, we have a climate crisis happening. I’ve been a climate hawk since the 1990s. It’s something I, you know, worry a lot about. I care a lot about. Of course, Amy has devoted her career to it, so I can’t really talk, but it’s a really important issue.

And now, you know, we have this AI transition happening. We’re doing across the tech industry a large build-out of large, large, large computing systems, mega datacenters. And there’s a lot of concern in the world about how this might affect the climate, how … what this means.

Amy, we’ll talk, I think, about local communities, as well. And so I really wanted to dig in to the facts. What does this really mean? What do we think the impact’s actually going to be? Like, let’s separate the data from the hype and then also talk about some of the opportunities ahead because I do think there are things we’ll be able to do, and that’s why we have Ishai here.

So maybe I’ll first turn it over to Amy. Can you tell us a little bit about your job at Microsoft and what got you into this space, maybe a little bit of your story?

AMY LUERS: So as you said, I lead sustainability science and innovation on the Microsoft corporate sustainability team, which really means I get to work with really smart people around the company, around the world, at MSR [Microsoft Research], on shaping and informing sustainability solutions for Microsoft but also for the world.

And part of that is leading our strategy on AI and sustainability. And how I got into it, I’ve been working on sustainability and climate my whole life. And I’ve worked, from the tech sector, I was at Google actually previously. Also was in the White House working at the intersection of the CTO’s office and environment and resources and energy.

I also led an international research institution, UN-based network rather, focused on sustainability. And in that context, after coming out of Google, where I was really … started to think about the power of compute and digital tools for transformation, and— which is why I was brought into the White House to work at that intersection—when I started leading the sustainability network, research network globally, Future Earth, I really brought this need to think about innovation and digital technologies in that space.

And I will say the sustainability science network at that time, you know, it was 10 years ago, eight years ago maybe, was a little resistant to thinking about AI and technologies in this space.

And I started a global initiative called Sustainability in the Digital Age, where I really brought together the digital technology and AI community. It was in Montreal, which there’s a lot, a big AI community there, and the sustainability scientists globally. And really started to think about what are the potentials, what are the risks, and led a big international study to put together a research and innovation agenda in this space.

And that sort of really shifted my approach from just “big compute can help things,” which I was really focused on at Google, to this role of AI and machine learning in this space.

BURGER: And, Ishai, so you are a world-renowned expert in, now, ML and optimization. You know, you’ve published extensively. You’re, I think, famous in your research community. You’ve had, I think, broader—you’ve had a lot of impact on Microsoft’s business. You’ve also been published in the Harvard Business Review.

So, you know, you’re a little bit polymathy and sometimes a little intimidating to me.

ISHAI MENACHE: Yeah. [LAUGHTER]

BURGER: You know, but I’d like to hear a little bit about your background, just, you know, a short version of your story for the listening audience.

MENACHE: Yeah. My background is actually in engineering. However, my graduate studies were, as you mentioned, like in ML, reinforcement learning, and later on distributed optimization, game theory, a little bit more on the theory side.

So my story is that when I was doing my postdoc at MIT [Massachusetts Institute of Technology], you know, the cloud was kind of on the rise, circa 2009 or so. And I got fascinated by the cloud. My initial interest was actually in the economics of the cloud and, you know, pricing. How you price the cloud. And I got to know about MSR because, you know, around that time there was a new kind of lab opening just by MIT, MSR New England. And I got fascinated by the cloud and, you know, not only the economic aspects of it, but more fundamentally, you know, how do you utilize resources more efficiently?

And that’s what got me to Microsoft Research in 2011. So I was consulting in MSR New England but then moved to Redmond in 2011 to join a lab called Extreme Computing group that was actually dealing with the cloud futures.

And if I can mention, Doug, you were also part of that, so …

BURGER: That’s right.

MENACHE: I’ve known you for quite some time.

BURGER: Yep.

MENACHE: And, you know, so sort of my, let’s say, my angle into that, so there were like a lot of systems people thinking about the, you know, infrastructure of cloud. And then at the other extreme, there were theoreticians. They were thinking about like, you know, the next kind of wave or like, you know, innovating in the area of algorithms.

But I think what was sort of missing is a little bit of bridging between, you know, algorithms and then cloud infrastructure. And that’s where I sort of found a very interesting niche for myself, and later on, for the group, which I founded in 2019.

BURGER: So you recently announced the system called OptiMind. And I, you know, I did a LinkedIn post about it because I was really excited about it.

And just tell us what the system does, like why … it got a lot of attention. So what does the system do? And then maybe we’ll dig a little bit into optimization for the audience. And then, but then we have to get back to AI.

MENACHE: For sure. So, you know, stepping back a little bit. So what is actually optimization or mathematical optimization? So optimization or mathematical optimization is a way of using mathematics to make the best decisions when there are many choices and some limitations. OK.

And, you know, just a little bit more, you know, concretely, so in every optimization problem you have, first of all, a description of the problem that you need to solve. Then you have a bunch of decisions that are actually, in mathematical terms, these are the variables. You have an objective. What is your goal? What are you trying to optimize?

It could be something that you’re maximizing, revenue, but it could be that you’re minimizing costs, so there are different versions or different kinds of goals or objectives. And then there are constraints, which is like you cannot do whatever you want. There are some sort of limitations such as capacity constraints in the cloud setting or other factors that you have to account for in order to come up with the best possible decisions.
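The pieces Menache lists, decision variables, an objective, and constraints, can be written out directly in code. The sketch below uses a hypothetical two-product planning problem with invented numbers, and a brute-force search stands in for a real solver since the search space is tiny; the point is only how the three ingredients map onto the formulation:

```python
from itertools import product

# Hypothetical toy problem: choose how many units of two products to make
# (the decision variables), maximizing revenue (the objective) subject to
# capacity limits (the constraints). All numbers are invented.
REVENUE = (3, 5)       # revenue per unit of product 0 and product 1
HOURS = (1, 2)         # machine hours consumed per unit
MATERIAL = (3, 1)      # raw material consumed per unit
MAX_HOURS, MAX_MATERIAL = 14, 18

def feasible(x0, x1):
    """A candidate decision is feasible only if it violates no constraint."""
    return (HOURS[0] * x0 + HOURS[1] * x1 <= MAX_HOURS
            and MATERIAL[0] * x0 + MATERIAL[1] * x1 <= MAX_MATERIAL)

# The search space is tiny, so exhaustive enumeration stands in for a solver.
best = max(
    ((x0, x1) for x0, x1 in product(range(20), repeat=2) if feasible(x0, x1)),
    key=lambda x: REVENUE[0] * x[0] + REVENUE[1] * x[1],
)
print(best, REVENUE[0] * best[0] + REVENUE[1] * best[1])  # → (4, 5) 37
```

In a real deployment the enumeration would be replaced by a mathematical-programming solver, but the description of the problem, the variables, the objective, and the constraints, stays the same.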

BURGER: So what … so maybe a simple … just to be silly for a sec, so I have a, you know, complex drive to work, and one day I find that the way I usually take is blocked and my brakes are really worn and I can only stop twice. So your framework might be able to figure out, like, what path gets me there saving the most gas.

MENACHE: Right. So that’s one example. And maybe you don’t want to pay for tolls for some reason, so that limits, you know, the roads that you can take. You know, there’s speed limits and such things. These are all constraints that you have to account for.

BURGER: There might be some speed traps, but I’m willing to go by a speed trap if the route is much shorter.

MENACHE: Maybe.

BURGER: So stuff like that. So it gets pretty complicated, doesn’t it?

MENACHE: Right. It gets pretty complicated because especially when the, you know, maybe you’re a single driver, but in optimization settings, think of like some of the problems that we worked on actually with Dynamics 365, which was also in the context of field service, which is about managing technicians at scale. So think of like not just you, but thousands of technicians that have to fulfill or that have to take care of certain work orders. So it would be thousands or tens of thousands of work orders.

And then you need to assign the technicians to these work orders. And there’s a bunch of constraints. Maybe not every technician can do every work order. You have to account for the traveling of the technicians, right. So it’s like you’re not going to send a technician that is in Spokane to do something in, let’s say, in Seattle because, you know, all day will be wasted on traveling.

BURGER: And it’s not sustainable.

MENACHE: It’s not sustainable. [LAUGHTER] And also, you know, the gas, obviously. So all these kinds of considerations, you can map it formally into mathematical optimization. And then there are techniques of solving this problem to optimality.

So essentially there is some machinery and there are experts that can take these problems and come up with the algorithms, but not everyone can do it. So it requires some expertise. In fact, graduate-level expertise in operations research or in, you know, algorithms, computer science type of algorithms. And when gen AI was emerging, we saw an opportunity to democratize optimization with gen AI, in the following sense: a person that is not an expert can define what they want to do.

You gave your example about, you know, getting to work. It could be like, you know, a simple example of packing, which is like I have a suitcase that I, you know, I have a limit of, like, 20 pounds. And I have a bunch of things that I have to … that I want to fit in like, you know, I have with certain importance. Some are more critical, like, you know, I don’t know, like my laptop, and all that. But then there are books that are quite heavy, and maybe I still want to read books.

BURGER: Or I’m running an airline, and I have to schedule the flights, and I want to minimize fuel.

MENACHE: Yeah, that, too. And essentially, so you want to be able to describe what you need to solve in plain English, specify the problem, say what the decisions are, like I mentioned, like what your goal is, and then what constraints need to be accounted for.

And you want to use AI that will help you take all these considerations and essentially formulate the algorithm itself. So write down the recipe, the mathematical recipe, that would produce an optimal solution.
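The suitcase example Menache sketches is the classic 0/1 knapsack problem, and it is a good illustration of the "mathematical recipe" he describes. A minimal dynamic-programming sketch follows, with the item names, weights, and importance scores invented for illustration:

```python
def knapsack(items, capacity):
    """0/1 knapsack: pick a subset of (name, weight, value) items whose
    total weight fits the capacity while maximizing total value."""
    # best[w] = highest value achievable with total weight <= w
    best = [0] * (capacity + 1)
    choice = [[] for _ in range(capacity + 1)]
    for name, weight, value in items:
        # Iterate weights downward so each item is used at most once.
        for w in range(capacity, weight - 1, -1):
            if best[w - weight] + value > best[w]:
                best[w] = best[w - weight] + value
                choice[w] = choice[w - weight] + [name]
    return best[capacity], choice[capacity]

# Hypothetical packing list: (item, weight in pounds, importance score)
items = [("laptop", 4, 10), ("camera", 3, 6), ("books", 9, 4),
         ("clothes", 8, 7), ("shoes", 5, 3)]
print(knapsack(items, capacity=20))
```

With these invented numbers the heavy books get left behind, which is exactly the trade-off in the spoken example: the optimizer weighs importance against the 20-pound limit rather than packing by instinct.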

BURGER: Got it.

MENACHE: So that’s what OptiMind is about. [It] is a small language model that was trained especially for these kinds of scenarios of, you know, taking natural language and mapping it into an optimization algorithm.

BURGER: So this is really great, and I think we’re going to come back to this. I want to now go back to Amy. When we think about AI and these datacenters that the industry is building and they, you know, they use water, they use electricity, you know, there’s contention in some communities about them being placed there. If I, you know, I’d love to be really data-driven and just kind of very factual.

So if I look at the overall picture, like, what is the real impact we think of this transition on, you know, climate, sustainability? And it’s complicated, right, because there are many sources of emissions. Electricity generation is one, but you have renewable energy. But it takes materials to build these things. So can you kind of give us some framing to help us understand it?

LUERS: Yeah. So first of all, you know, I think when we think about AI and climate, a lot of people think about just the infrastructure side. And I think it’s really important to think about this holistically. I actually personally believe that AI will be one of the most influential factors determining our climate future, for better or worse. But I also believe that we actually need AI to solve the climate crisis. So with that as context, let’s talk about the infrastructure, remembering we have to really think about the full context. You know, let me put this into context.

So from a climate perspective, what matters is the emissions to the world, the emissions of greenhouse gases to the world, heat-trapping gases, …

BURGER: Right.

LUERS: … to the climate, not specifically energy, right, because energy can be in different forms.

BURGER: Right. It’s, what are you putting in the air?

LUERS: What are you putting in the atmosphere?

BURGER: That’s right.

LUERS: So, you know, if you think about it from a global perspective, the world uses about … energy itself accounts for about 75% of all of the emissions that go into the atmosphere.

BURGER: Wow, that’s a lot.

LUERS: So that’s a lot. But a lot of people think it’s the whole thing. So there’s other things that are not energy. [LAUGHS]

BURGER: OK, three-quarters, three-quarters …

LUERS: But in the context of … so from a climate perspective, datacenters account for about 0.5%, less than 0.5%, of all emissions as of 2024.

BURGER: OK. But they’re growing?

LUERS: But they’re growing. And so if you’re growing and you think about … there are lots of projections. It’s hard to project really beyond a couple years, as you both know, because things are changing so quickly, both on the demand, on the efficiency, what we’re using. Are we going to be using small language models? Like, we don’t know what the future looks like.

The IEA [International Energy Agency] projects that by 2035 electricity use could double. And so from electricity use, actually, datacenters use about 1.5% of global electricity,…

BURGER: Yup.

LUERS: … and that could double, could be between three and fi— even more than double.

But that’s still, from their projections, it would still be less than 1% of global emissions. So even if that would double in that space. So it’s still in terms of a global emissions perspective, which is what the climate cares about, …

BURGER: Right.

LUERS: …it’s a small percentage.

BURGER: Can I just go back for a second, break that down?

LUERS: Yeah.

BURGER: So, so energy is three-quarters of … generates three-quarters of emissions. But that includes burning fuel, …

LUERS: Yeah.

BURGER: …transport. And then what fraction of that three-quarters or let’s say just total emissions do we think electricity is?

LUERS: So electricity …

BURGER: Generation …

LUERS: …is about 20% … or, no, electricity … the energy that’s produced is consumed … about 20% of it is consumed as electricity.

BURGER: Got it.

LUERS: Now, in terms of emissions, about 35% of the emissions from energy is from electricity. And part of that is because electricity … the reason there’s that difference is …

BURGER: You’ve got coal plants.

LUERS: You’ve got coal plants, and it’s not as efficient when you do coal plants. You actually get efficiencies when you go right from solar to … in terms of just the energy because you lose a lot of heat in the thermoelectric plants, right? So there’s an efficiency there. But, so about 35% of the energy emissions are from electricity, and electricity production is really the key issue. You know, the key issue of today is, like, electricity and datacenters, right? How are you going to get enough electricity? How are you going to get enough clean electricity?

And that is something that is often more of an infrastructure problem than actually the energy problem. I mean, they’re both true. But it’s getting that electricity in the right location at the right time. And that’s sort of a big …

BURGER: It’s a big messy problem.

LUERS: It’s a big, messy problem that we can unpack a little bit. Because I do think there’s a role for AI, a huge role for AI.

BURGER: Maybe even an optimization problem.

LUERS: Maybe even an optimization problem. Exactly.

MENACHE: Maybe.

BURGER: We’ve got this guy in the room. This is exciting.

LUERS: So we should unpack that. But I think before we go off that, just two points that I think are relevant. One thing is that, which is often not necessarily realized by people who don’t spend their lives … haven’t spent their lives thinking about climate, but to tackle the climate problem, we need massive amounts more of electricity. That is, that’s just part … I said we had 170,000 terawatts [read: terawatt hours] of energy. Most of that, to be able to solve the problem, has to come in the form of electricity because that’s what we can decarbonize easiest. So one of those challenges is actually more electricity.

BURGER: Got it. So let me again try to break this down to a simple statement. So we’ve got, you know, about 35% of emissions are due to electricity use …

LUERS: Thirty-five percent of energy emissions.

BURGER: Thirty-five percent of energy emissions, which is three-quarters of the pie. So we can do the multiplication. And, of course, you know, as we decarbonize electricity, there is a probably too-slow but ongoing transition towards lower-carbon generation of electricity. You know, you convert a coal plant to a natural gas plant, it gets better. You move it to solar or wind, it gets even better. But then the demand for electricity is going up, in part fueled by, you know, the tech industry and building the datacenters and AI, but in part because, like, we have to stop burning fossil fuels.
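Doing the multiplication Burger alludes to, using the approximate figures quoted in the conversation (a back-of-the-envelope sketch, not a precise statistic):

```python
# Approximate figures quoted in the conversation above
energy_share_of_global_emissions = 0.75       # energy ≈ 75% of global emissions
electricity_share_of_energy_emissions = 0.35  # electricity ≈ 35% of energy emissions

electricity_share_of_global = (energy_share_of_global_emissions
                               * electricity_share_of_energy_emissions)
print(f"electricity ≈ {electricity_share_of_global:.0%} of global emissions")
# → electricity ≈ 26% of global emissions
```

So, on these rough numbers, electricity generation accounts for roughly a quarter of global emissions, which is the slice the datacenter buildout draws on.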

LUERS: Right. Electric vehicles, turning all of our heating into electric heat—turning everything … the phrase is sort of electrify everything ahead of … the IEA, you know, says we’re in the era of electricity, right?

BURGER: So, so we have a huge pressure on electricity demand. And we need to have it be, you know, low carbon electricity generation, but the demand is just going to go up and up and up with or without, you know, this datacenter buildout and this AI buildout. But, but of course this is happening, and the hope is we can use it to provide a lot of value.

LUERS: Yeah. So there’s one other sort of context which is really important here, and that is that the other concerns that are being raised around this issue of electricity and datacenter growth is at that local community level, right?

BURGER: Right.

LUERS: So datacenters contribute a very small percentage globally in terms of emissions and even in terms of electricity, but they’re really concentrated. They’re one of the most concentrated industries in the world. There’s this great figure if you ever want to look up the recent IEA report on energy and AI, which really shows the concentration of different industries, and steel’s way on one side, which produces about 7% of global emissions. And datacenters are all the way on the other side in terms of the level of concentration, how close they are together.

And the reason that’s important is that there’s certain pockets of the world where there’s really a lot of datacenters, and they keep going into those areas. It’s changing a bit now because of the dynamics that are happening. But when that happens in the … then in those big … in those areas, yeah, they’re a major user of electricity, right. And when the growth happens quickly in those areas, then it can be …

BURGER: It affects the local grid.

LUERS: … then it can … it can put a strain on local grid. And so there are concerns that are being raised in certain regions in the world about datacenter growth. And, and you know, I’m really optimistic that those are mainly infrastructure problems, and they can be addressed.

And we need to figure out how to do that. And I’m really … that’s why I’m so excited about … in January, we announced our community-first infrastructure initiative where we’re really focusing all of our work now on how do we design our datacenter development to ensure that that rapid growth is not a net negative, but actually a net positive for those communities. And that includes, you know, committing to paying all of the costs required to meet our electricity needs so that our datacenters do not drive price increases in communities.

BURGER: So I can see how we can get to a net level like, you know, we basically don’t drive up local prices …

LUERS: Yeah.

BURGER: …don’t exacerbate local water supplies and bring it in, however you can do that. But how do you make it a net positive?

LUERS: Well, I think that, you know, we have been saying we’ve been net positive at a global scale, and I think we’re shifting that to say, what do we mean … what does it mean for net positive at a local scale? And I think at a local scale, for example, datacenter water use for cooling can go net positive in the sense that in the datacenters themselves, we are actually beginning to design systems that use essentially zero water for cooling.

BURGER: Right. It’s recycling. Yup.

LUERS: Certainly. So we can, you know, that isn’t in place everywhere, but it’s happening now.

And then we replenish water, so we can do things … so for example, it turns out in many places—water in some cities is lost with leaking pipes, and AI, it turns out, can help identify those leaking pipes. So even if you get half of that, even if you save half of that, that can be way … the use of water in datacenters is only a fraction of what we would save. And so you can, you can amplify and save water oftentimes even by using AI, another optimization problem, in many ways.

BURGER: So, so this is more of a, this is more of a commitment from the company to work with the community to get them to a better place. But of course, using our global compute because we’re not going to just run the AI analysis that makes the pipe better in the datacenter that that can be …

LUERS: Oh, no, no.

BURGER: Right. But we’re just trying to … yeah …

LUERS: And it’s not just about … we’re not doing it just with AI.

BURGER: Of course not. Of course not.

LUERS: We’re also investing in training. We’re investing in NGOs. The real focus is to really understand what that looks like. And, you know, my interest is really to say, you know, can we increasingly co-design? What does community positive mean? This is all new because this rapid growth was the first time that this became such a serious issue of concern.

BURGER: Right, right. Well, I’m really glad you’re pushing on this, and I think it’s really important for the communities because if, you know, if we can put enough focus and enough innovation to make these things a net positive, I’ll certainly feel a lot better about it.

Going back to the emissions. So one of the reasons I brought you both here today was because Amy gave a talk at our research showcase, which really inspired me. And you talked about a lot of the emissions being generated through things like transport and inefficient management of some of our large, you know, nondigital systems.

And I … I heard that and I’m like, wow, this might be a place where some of our research could help. And of course, that’s my job, right, is to find places where our researchers can amplify the effect of their work.

So, Ishai, maybe just very briefly, can you talk a little bit about when you worked with the supply chain and you applied your optimization techniques? And this was even before OptiMind, which lets you do scenario planning from natural language. What sort of efficiencies were you able to drive just working internally with our business teams?

MENACHE: Yes, so I’ll give you a couple of examples, but starting off with what you mentioned about transport actually. So as part of this intelligent fulfillment service we are actually accounting for shipping costs. And one thing that one can do is account … also take more explicitly into account, you know, emissions and the sustainability considerations when doing the shipping itself, right? So that’s one kind of one option.

BURGER: So we could say, like, get … have the same, like, level of risk and time to delivery, but as your objective just try to drive your emissions down.

MENACHE: Correct.

BURGER: As one example.

MENACHE: Correct. There are other related examples of when, you know, what kind of hardware you use, when you have to fulfill the requirements, and for example, you want to use hardware that has been sitting for a long time in the warehouses. So that also has some implications.

Now, you know, we’ve worked over the years on various systems where, you know, efficiency has been a major goal of ours, but sustainability is closely related to efficiency, and I can illustrate it through concrete examples.

So one is virtual machine allocation, which operates at a [very fast] time scale. It’s the process that essentially maps VM requests to physical servers.

BURGER: That’s in our cloud, you know, …

MENACHE: That’s in our cloud. Correct.

BURGER: … when a customer wants to use something. Just again, we’re keeping it, like, not too geeky here.

MENACHE: So one of our goals was to increase packing density, which means that we want to operate the servers close to 100% of utilization. And it is well known, actually, it’s a study by Google. I guess you’re familiar with it. Actually, you know, as you increase packing density, actually you reduce the power per unit of useful compute, right? So that’s well known. So for example, I don’t remember the exact numbers, but if your server is utilized at 50%, you still consume, like, close to 100% of the power.

BURGER: Yeah, you’ve provisioned it. You know, it’s being transmitted. You know, your chips leak, they have static power dissipation, all the stuff, sort of stuff I used to work on.

MENACHE: That’s actually … Doug knows better than me for sure. You know that’s been his area of research.

So another example that we’ve worked on since 2022 is rack placement. So you have these demands, and you have to decide how to exactly place them, these racks of servers within the datacenters. And there you have to account for power and cooling and, you know, space, and all that. And one of the things that we were able to achieve with our optimization is reduce power fragmentation by 1 to 2%.


Source

Microsoft Research - microsoft.com

View original publication