
Episode 73: Feeding the Matrix: Nvidia

The complete transcript for episode 73.


Molly Wood Voice-Over: Welcome to Everybody in the Pool, the podcast where we dive deep into the innovative solutions and the brilliant minds who are tackling the climate crisis head-on. I'm Molly Wood.


This week I’m excited to bring you the first interview in my special series on AI and energy use, which I’m calling Feeding the Matrix. We’re all hearing quite a lot about the energy required to train and operate generative AI models, and about the need to construct more and more data centers worldwide to power these models. News stories about how Microsoft, for example, has struck a deal to re-open part of Three Mile Island to get more power. Google, Meta, and Amazon Web Services are all looking to nuclear power and other energy options. And it’s reasonable to wonder what a suddenly inexhaustible need for electricity might do to global sustainability goals and carbon emissions overall, and what companies are doing about it. So. I called some!

This week I’m starting at the foundational layer. Nvidia is the company that has become synonymous with AI. In a lot of ways the chipmaker dominates the AI market. The graphics chips it developed in the ’90s for gaming have turned out to be perfect for training large AI models, and its software platform, called CUDA, makes it easier to use those chips for general-purpose computing.

It’s doubled down on specialized chips and software for AI, and its stock price has DEFINITELY matched the increase in demand for its products over the last year or so.

But what responsibility does Nvidia have to make sure its tech doesn’t cook the planet and turn humans into batteries? You know. Metaphorically.

Let’s dig in.


Josh Parker 

My name is Josh Parker. I lead corporate sustainability at Nvidia. In previous lives, I was an electrical engineer and a lawyer, but I have been in sustainability now for several years. And yeah, I'm excited about what AI is doing to advance sustainability.


Molly Wood 

Give me, I guess, let's start with sort of the overall picture of leading sustainability for Nvidia. Like, what does that look like, and what are, you know, broadly at least, the sustainability goals for a company like Nvidia?


Josh Parker 

So sustainability at Nvidia does look a little bit unique, because if you think about our products and services, the place where potential greenhouse gas emissions, and sustainability impacts more broadly, happen is really largely downstream from us. And so when we think about trying to have the maximum positive impact on sustainability, we focus our attention primarily downstream. So how do we make our technology more efficient?

How do we innovate around data centers, chips, networking, all sorts of ways to reduce the impacts of our products and services when they're being used? And then also, we have this second order impact of AI for climate and AI for sustainability. So in addition to the direct footprint of our products, we also try to enable people to do really cool things to advance sustainability with our technology. So those are our two key focuses. Then of course, we look upstream as well.

Because we're a fabless semiconductor company, a platforms company now, our primary impacts are not within our direct operations. So we're always looking downstream and upstream to try to manage those impacts.


Molly Wood 

Let's put a finer point on that before we dive into AI. I know everybody is freaking out, but we will get there. So when you say downstream and upstream, it's like, on one hand, you're manufacturing, right? So there are first-order impacts, scope one and two, if you will, of Nvidia's own operations. But downstream means that you are effectively someone else's scope three. That's maybe an even more jargony way to put it, but let's just break it down a little bit for…


Josh Parker 

That's right. Yeah.


Molly Wood 

people who may not be familiar with this, the corporate usage.


Josh Parker 

Sure, yeah. So the greenhouse gas protocol scopes one, two, and three. We talk about all of that all the time. But for us and for our customers and our suppliers, we all try to manage emissions across the value chain. So not only do we look at our direct emissions, but we look at downstream emissions generated when our products are being used, and end-of-life impacts. And then upstream, we look at our suppliers and the emissions generated there.

And we're running some analyses now, hoping to publish some of the results this year: product carbon footprints and life cycle assessments that really put a finer point on where those impacts happen. And so for us, our direct impacts from our direct energy consumption are really small compared to what happens with our customers when they use our products and our suppliers when they manufacture the products. So our customers care about

our emissions and our upstream emissions because they have to report those emissions themselves as well. So all of us are reporting kind of across the value chain as best we can. And so it makes sense for all of us to try to minimize impacts across the entire value chain. So that's why we're looking kind of outside of our direct operations primarily because that's where the largest impacts are.


Molly Wood

 And then say more about the transition to a platforms company.


Josh Parker

Yeah, it's actually been a very long time coming. If you look, especially in the AI space, at the software stack that we started developing to enable AI, called CUDA, that was developed back in 2006. We started development back in 2006. So we're about 19 years into this evolution of the company towards AI and towards a platform company.

The software really is a key component of our value. When we try to make AI accessible to our customers and to the public and to governments and organizations, enabling the use of our advanced chips and servers with software is really, really critical. So it's been a long time that we've been kind of preparing for this transition. But as you mentioned, we have really become

a platform company because we find opportunities. If you look across the entire stack, you have so many more levers to pull to try to increase performance and also to reduce impacts. I'm looking at everything, of course, from a sustainability perspective, but it allows us to innovate across the stack in ways that we wouldn't if we were focused exclusively on chips. And also, of course, to enhance the performance, which is critical for our customers as well.

Nvidia is a little bit unique in the fact that although we innovate across the entire stack, we actually sell products and services across the stack as well. We don't say you have to buy our entire server or our data center or just our services. We sell chips, we sell networking equipment, we sell cards, we sell servers and racks, anything across the stack. We really want to enable this ecosystem through our technology.

We're happy to work with partners wherever they are in the stack.


Molly Wood 

So, okay, let's turn more specifically to AI and climate. You actually gave an interview in which you said you were surprised that this is not a more nuanced conversation, that the primary response to the AI energy use conversation has been concern. Can you talk a little bit?


Molly Wood 

more about that. I think we're all still, you know, I think a lot of people are wondering, like, cool, everyone says if you just put some AI on the climate problem, we'll be good. And I think we want that to be the case, but I'd love to hear more of your thoughts on that.


Josh Parker  

Yeah, those of us who are techno optimists are eager to see what we can do with AI. And there's so much reason to be optimistic, more than we ever have before. I really think AI is the best tool for sustainability the world has ever seen. And we're just starting to unlock its benefits. It does make sense, though, for those of us who've been focused on climate and sustainability for a long time, you see the near term uptick in energy. And you say, wait a second, this is going to lead to increased emissions.

We've got this sense of urgency now. We can't allow emissions to rise at all, even in the near term. We need to try to manage this. So it's natural to have that initial response. But we also need to have that more thoughtful response afterwards, which is, okay, we see energy going up in the near term or potentially staying flat, you know, maybe some fossil fuel plants are staying online longer than they were expected to. All things that we need to take into account. But if you look,

forward past the next couple of years, you see the impacts of accelerated computing, the platform that AI is being built on, and AI itself having dramatically positive impacts for sustainability. And there are so many variables involved in trying to figure out what the net impact is of AI and of accelerated computing. If you take a very simplistic approach and say, okay, NVIDIA is selling more platforms and services, so their energy is going up,

This is a problem. The most simplistic analysis is the most alarming, but if you look kind of behind the curtain and see things like energy efficiency, and how the energy efficiency of the accelerated computing platform is dramatically improving, even just every two years, you have a lot of variables that add color to the story, that make it a lot more palatable and a lot less alarming, and that actually lead to that optimism that I was talking about.


Molly Wood 

Is there, I want to talk about Nvidia's efforts toward creating more sustainable and energy-efficient products. Before we get there, though: is some of the concern warranted, do you think, in the sense that, in a rush to build more data centers to keep up in the AI race, if you will, is there reason to think there are companies who might abandon or delay any of their sustainability goals?


Josh Parker 

I haven't seen any evidence of that. So, you know, we work, like I said, kind of across the ecosystem with large companies, small companies, startups. We haven't seen any deterioration in the commitment to sustainability goals. Now, you have seen some increases in kind of scope three emissions for some of the hyperscalers, because they're building out new platforms. So that's something to take note of and to make sure that we're managing.

But I think of this as kind of similar to fusion. And I know you did an interview at Climate Week with Commonwealth, right? If you focus on, okay, to build a new fusion reactor, you're going to need to invest in concrete. You're going to need to invest in the technology. There are going to be some environmental impacts to that. But the promise of the technology is so overwhelmingly positive that it makes sense to make those investments. And in the case of AI,

the near-term positive impacts are really already happening. So fusion, it's out in the future. Absolutely, we should be pursuing it. But we don't know exactly when we're going to solve that. With AI and accelerated computing, we're already seeing very, very dramatic benefits that are starting to offset the impacts of the technology itself. One example I'll give is in manufacturing: we have a manufacturing partner, Foxconn, who's building a plant in Mexico,

actually to manufacture some of our products. And they used AI and 3D modeling, our Omniverse platform, in developing the plant, the factory, on the front end. And they're forecasting that they will reduce the energy consumption during manufacturing by 30% because they were using AI for it. And because manufacturing is such a big energy consumer, you know, the footprint of the AI to run those models was tiny compared to the savings that we're going to see in the manufacturing sector

through AI. So absolutely, we should be.


Molly Wood 

Right. So they used, let's be even more specific in that example, they used AI to predict in advance, to model out how to create this plant so that it would be maximally efficient.


Josh Parker 

That's right. Yeah, they optimized the design of the factory to try to reduce energy consumption, and they're achieving a 30% projected reduction in energy. Those are some of the near-term impacts that we're seeing that I think make us optimistic, reasonably optimistic, about the near-term immediate impacts of AI for good and for climate.


Molly Wood 

Right. I'm gonna ask you more about digital twins in a minute, but let's also talk about the Nvidia products themselves. I understand that there is like a shift in chip architecture that hopefully will also continue to bring more energy efficiency in data centers and computing platforms.


Josh Parker 

That's right. Yeah. So the architecture that Nvidia innovated and deployed is called accelerated computing. Traditional servers, traditional data centers, are CPU-focused, and the data center is basically built around the CPUs. In accelerated computing, we still use CPUs, but we supplement those with very, very powerful, they're called GPUs because of our legacy as a gaming and graphics company, graphics processing units.

Those are really, really good at running math very, very efficiently, much, much more efficiently than CPUs can. And they can do a ton of it in parallel. And the result is, because AI, it turns out, relies very, very heavily on matrix math, these GPUs are ideally situated to run all of the computations required for AI. And they do it so much more efficiently than the CPU-only architecture could. So the hardware that we've developed, in addition to the software stack that I mentioned and the proliferation of data available to train models, is really what has enabled the AI explosion over the past few years. We have enough compute and we have enough data to be able to enable AI, and that's why we're seeing AI proliferate the way it is: because we finally have both of those two things in a way that allows us to run the AI algorithms.


Molly Wood  

How much of, I mean, one thing I love about the efficiency conversation, right, is that we know that energy efficiency, saving energy, using less, is a massive emissions reducer. But efficiency itself is usually, like, more efficient, not to speak in tautologies. It saves you money, it saves you time.


Josh Parker 

That is the beautiful thing. Yeah. So if you have competing priorities, you know, how do you balance them? And that's really the beautiful thing about this moment, both for Nvidia and, I think, for the ecosystem more broadly, for our partners: the economics line up very, very well with sustainability goals. To be able to innovate, to drive the increases in computation that we're looking for, the increases in performance,

to help enable new frontier models and new achievements with AI, we absolutely need to manage energy, and we need to improve energy efficiency dramatically. Otherwise, you're just spending all of that energy as heat. It's wasted. You get temperatures where you're melting chips and things like that. So the performance gains are tied directly to energy efficiency improvements. They go very much hand in hand. And of course, that

helps our customers, that helps our partners with their bottom line, because they're doing things much more cheaply, ultimately, than they could have previously, because of the energy efficiency. And to put this in context, one data point I'd love to talk about, because it's just really mind-blowing, is, I mentioned generational improvements in energy efficiency. So between our last platform, Hopper, and the current-generation platform that we have, Blackwell, we've seen a 25x reduction

in energy for the same task. So 96% less energy is consumed for the same inference task on the new Blackwell platform than on Hopper. And that's just a two-year innovation. It's incredible.
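(For listeners who want to check that arithmetic: a 25x reduction means the new platform uses 1/25 of the energy, which works out to 96% less. A quick sketch in Python, with the 25x figure taken from Josh's comment:)

```python
# A 25x reduction in energy means the new platform uses 1/25
# of the old platform's energy for the same task.
reduction_factor = 25
energy_fraction = 1 / reduction_factor       # fraction of the old energy still used
percent_saved = (1 - energy_fraction) * 100  # percent less energy consumed

print(percent_saved)  # → 96.0
```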


Molly Wood 

Wow. I was just going to say, and what was the time lag there? So you could imagine, in theory, a platform comes out in two years that, I mean, 25x, that's a pretty high bar. But if you do that again in two years, that feels like a pretty big deal.


Josh Parker 

Yeah, it's huge. This is one of those variables that I was talking about. If you're trying to look at the overall net impacts, you have to take into account these really, really dramatic gains in energy efficiency, and other efficiencies on water and so forth, that we're seeing. And that type of innovation, we can't predict exactly how much energy efficiency we'll gain with our next architecture after Blackwell. But if you look over the past decade, that type of reduction

has been very, very consistent. So with every generation, we're making dramatic improvements in energy efficiency. And if you look back over the last decade, we're actually a hundred thousand times more energy efficient than we were just 10 years ago for AI inference. It's really hard to wrap your head around those types of numbers, but that's really been what's enabled the performance gains, and those performance gains, again, have been enabling for AI generally.


Molly Wood Voice-Over: Time for a quick break. When we come back, we’ll talk more about efficiency gains and when they will outstrip emissions growth, and using AI to actually do things in a more sustainable way.


Molly Wood Voice-Over: Welcome back to Everybody in the Pool. We’re talking with Josh Parker, who leads corporate sustainability at Nvidia, about how to move into an AI future without making the climate crisis worse. I asked Josh: OK, so these efficiency gains you’re making…


Molly Wood  

At what point does that start to decrease overall consumption? There's sort of the Jevons paradox, right? Which is that as much as is available, we will use. And so it's like LEDs, right? Great, it's 96% more efficient; we'll just build 96% more capacity. Is that happening? Or is there a point at which it will sort of, like you said, there may be a short uptick and then a leveling off, or hopefully even a decline?


Josh Parker  

Mm-hmm.


Josh Parker  

So if you look just at the kind of first-order impacts: okay, we're building out new data centers, we're using a lot more accelerated computing because, you know, there's this value here and there are opportunities for new innovation. That's going to go up, and we expect that to continue for a while. And the Jevons paradox, yeah, we expect, because we're driving down the cost of compute and we're enabling more for the same dollar,

there's going to be more demand for it. You know, we're moving up the demand curve. So that's going to happen. And that's why we expect, you know, we'll continue to see Nvidia hardware being very much in demand for the foreseeable future. The other side of that, however, is what happens when AI and accelerated computing come in and start doing things that are being done elsewhere, but less efficiently.

Transitioning traditional data centers over to accelerated computing is more energy efficient. So as old existing infrastructure starts to deteriorate and it's time to replace it, we expect more and more of those data centers to actually be transitioned over to accelerated computing. And then, of course, you have the benefits of AI longitudinally across other sectors, in manufacturing and transportation, that will end up having much bigger impacts to reduce energy

than the data center footprint itself, which is still, estimates are that data centers overall use maybe 2 to 3% of global energy, and AI is a small fraction of that, less than 1%. So even if AI grows significantly over the next several years, its direct footprint is unlikely to be larger than its net savings in other sectors.


Molly Wood 

Right. And then let's talk a little bit more about using AI itself to solve some of these issues. So you alluded to, with the Foxconn example, this idea of a digital twin, of the kind of modeling that we're capable of now that we weren't capable of before. Can you talk a little more about that? I know that there's this Earth-2 project that's been happening within Nvidia.


Josh Parker 

Yeah, yeah, there are so many exciting things to talk about with the applications of AI in sustainability. Digital twinning is a very exciting technology, especially because, again, manufacturing is such a big energy consumer and has such a big footprint generally. Digital twinning is really a unique combination of some of Nvidia's strengths, where we use our 3D modeling technology. You know, we've been a graphics company for 30-plus years now.

We combine that 3D modeling technology with AI in a way that helps plan, optimize, and train new factories and robotics and so forth, to make them smarter from the get-go. They're designed specifically for efficiency, in ways that really end up having dramatic improvements down the road once the factory's actually built. It means you don't have to test things in the real world. You don't have to consume the energy,

the resources, the physical resources, the manpower. It can be done digitally first, and then when you build it, it's already optimized and more efficient. So that's digital twinning, very exciting, and we're just starting to see the really dramatic impacts of it. You mentioned Earth-2. That is also a very exciting initiative. We're trying to enable more accurate and faster weather and climate modeling through AI.

And AI actually has a huge potential to help reduce the computation required for advanced, accurate physics simulations. If you use AI to supplement existing physics-based simulations, you can do things much more quickly, much more accurately, and much, much more energy efficiently than you have been able to in the past. So we're working with governments, with other organizations, to help enable

integration of this technology into the overall climate effort and modeling.


Molly Wood 

How about model optimization? Like, there's this idea of sort of right-sizing models, not necessarily asking AI to, please don't feel that you need to analyze the entire corpus of human history to answer this question, right? I could create a model that's actually the right size for that, slash, I could train models in a much more efficient way. How are you addressing that?


Josh Parker

Mm-hmm.


Josh Parker 

There's so much innovation happening in this space specifically, which you would imagine, because we really are still in the very early days of the AI renaissance, of this fourth industrial revolution. So there's a ton of innovation happening to figure out what the right size is and what techniques we can use to try to optimize models, both on the training side and on the inference side.

So we're seeing all sorts of new techniques being innovated almost weekly in AI: new types of pruning, overtraining to try to reduce model size while still keeping a lot of the performance and the functionality, small language models, medium language models, and mixture-of-experts technologies. It's easy to get very wonky, and a lot of this ends up being very technical

to try to understand how exactly these models are being massaged and tweaked and adjusted. But we're definitely seeing a proliferation of more sizes of models. We're seeing optimization of models where, with far fewer parameters than we used previously, we can get the same performance, which means you're more efficient on the training and on the inference that way. So it's really across the spectrum.

And a lot of the work is being done by our partners. We're doing some of it at Nvidia as well. And downstream, the big model companies are doing amazing things to optimize.


Molly Wood 

And so that's primarily, that's not necessarily in your wheelhouse. That's mostly about the Microsofts and the OpenAIs and the Amazons sort of saying, we can optimize these models. Or is there a component to that that you participate in also?


Josh Parker 

Yeah, yeah, we train some of our own models and release them as well. We released a really neat coding model called StarCoder in 2024. We do some of that work ourselves, and we certainly try to partner with them to help. But you are seeing a lot of the innovation there with our partners, because they are training the frontier models and doing amazing things. So they have a vested interest, of course, in trying to manage that. And that's why we're seeing so much innovation.


Molly Wood 

This is unfair and not totally yours to answer, but I think a question a lot of people have is where the energy consumption mostly occurs. Is it in the training or the inference, like the query process?


Josh Parker 

That is a difficult question, because the landscape is evolving so rapidly. We do expect, you know, over time, for inference to dominate, to become the majority of the energy consumption. But it really comes down to a question of how durable you expect your models to be. If you train a frontier model, and you're able to use a variety of versions of that model for


Molly Wood 

or a little bit of both? Is that the question? Yeah, exactly.


Josh Parker 

eight years, then absolutely, the inference is going to become the majority of the energy consumption. But we're still in this phase where we're seeing so much innovation in the types of models that are being developed, how they're being used, how they interact with each other, you know, one model calling another, models being pruned, being overtrained, etc. There is a lot of innovation happening there. But the conclusion is, we do expect inference to become more significant than training soon,

and for that to last.


Molly Wood 

And then I guess I wonder, you've mentioned partners, like, how big a conversation is this with hyperscalers? Obviously this is a robust ecosystem. I imagine you're all talking all the time. Like, to what extent do you feel that you are in the room bringing up sustainability all the time?


Josh Parker 

That's definitely what's happening. So I mentioned earlier that we haven't seen any indication that organizations are retreating from their commitments, their sustainability commitments, which is fantastic. And so that conversation continues. Within the ecosystem, we all recognize, because we report value chain emissions and we're looking at value chain water and so forth,

it's important for us to collaborate, to make sure that we're working together, because we do have aligned incentives and aligned goals. So we are talking to our large partners regularly, suppliers and customers. And the great news is that we really are rowing in the same direction. We're trying to decarbonize, we're trying to reduce water consumption, and we're making significant progress, in part because it's an industry-wide effort.


Molly Wood 

How often, I want to go back to something you said about retrofitting inefficient data centers. When we talk about deployment of these much more efficient technology stacks, are we talking primarily about new data center construction? Or, like, how often is a data center retrofitted? How often do they replace the equipment inside?


Josh Parker

So it's a mix, and at Nvidia we don't always have visibility into how our products are being used: whether, number one, they're replacing existing hardware, and number two, even if it's a new build, whether it might be replacing an existing software load. AI, of course, can do some work that traditional data centers, with traditional software algorithms, were doing previously. So even if it's a new build, it might be taking some of those workloads over and doing them on a new

platform, in a new way. So we definitely see a mix, and again, we don't have perfect visibility into this, but it makes sense that in the near term the focus would be on new builds, because we have existing traditional data center infrastructure that's already busy doing things. Over time, though, we expect to see the substitution and kind of the upgrade effect being very significant, because it is a more energy-efficient platform.

And as data centers get retired, we expect to see more of our partners adopt the accelerated computing infrastructure for those data centers as well.


Molly Wood 

Right. And do you have much visibility on the kind of scope of new building?


Josh Parker 

Yes, yeah. Business has been good, which is good for the business, but also, again, I think it's good for the industry and good for sustainability because it ends up being more efficient. I think if you look at our public financials and so forth, you get a sense of the pace of deployment of NVIDIA hardware. There are other companies participating as well.


Molly Wood 

You're selling a lot of chips.


Josh Parker 

And it is. Growth is very robust, and I think we've said in our public statements that we expect that to continue. We don't see a change in that. But, you know, that kind of leads into the question we just talked about, which is, okay, but how much of that is net new? How much of that is substitution and upgrades of existing infrastructure? And I don't think anybody has a good handle on that. One other thing to mention, though, is that there's a lot of talk about the net new infrastructure,

because it was a little bit of a surprise, right? When ChatGPT burst onto the scene, there was a huge rush by all these companies to deploy AI, because they saw, this is proof that there is value here, that we can do amazing things with AI. So there has been a near-term supply-demand challenge, with both energy and with data center hardware, that we're managing.

We expect, now that companies are in this, that we're all better able to forecast how this is going to roll out, what future demand is going to be. So we expect the future to be a lot less chaotic, because this is now something that we're all expecting. And we're also seeing, you know, there's concern about data centers, huge data centers, being deployed in the same locations and straining the grid and so forth.


Molly Wood 

Mm-hmm.


Josh Parker 

One of the wonderful things about AI is that it doesn't care where it goes to school. So the training portion of AI and to some extent actually inference as well, can happen in locations where you previously couldn't have data centers. We see partners like Crusoe who are building data centers at stranded energy sites or at renewable energy sites because you don't have the same latency dependency that you do for traditional workloads.

You can build a data center in West Texas where there's extra gas that's being flared, capture that, use it to train your model, and then put your model on a hard drive and take it to wherever it needs to be used. So it's really a fantastic aspect of the technology that helps manage the impacts of the deployment as well.


Molly Wood 

Right.


Molly Wood 

I was actually wondering if that was an innovation opportunity, the extent to which Nvidia might be designing for intermittency. Like, you don't have, like you said, the uptime requirements. You could certainly train models when the sun is shining or the wind is blowing, for example.


Josh Parker 

Mm-hmm.


Josh Parker 

Yeah. And that type of issue and feature is built into our software and our hardware. When you're training a model, absolutely, you have checkpoints, where if you decide you need to take the data center offline, or if there is some kind of a failure, you go back to your last checkpoint and you just pick up from where you left off. So that type of functionality is available to companies that want to use it in the near term.

Everybody wants to get the most out of their hardware that they can. So, you know, they're looking to try to maximize the utility of it. But certainly there is the opportunity to take it offline based on intermittency of energy as well.
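(For the curious: the checkpoint-and-resume pattern Josh describes can be sketched in a few lines of Python. This is an illustrative toy, not Nvidia's actual training stack; the function name, file format, and `stop_at` parameter are invented for the example.)

```python
import json
import os

def train_with_checkpoints(total_steps, checkpoint_path, step_fn, stop_at=None):
    """Run training steps, saving progress to disk after each one.

    If a checkpoint file already exists, resume from the step it records
    instead of starting over. `stop_at` simulates the data center going
    offline partway through a run.
    """
    start = 0
    if os.path.exists(checkpoint_path):
        with open(checkpoint_path) as f:
            start = json.load(f)["next_step"]
    for step in range(start, total_steps):
        if stop_at is not None and step == stop_at:
            return step          # interrupted; progress is already on disk
        step_fn(step)            # one unit of training work
        with open(checkpoint_path, "w") as f:
            json.dump({"next_step": step + 1}, f)
    return total_steps
```

Calling this once with `stop_at` set, to simulate losing power, and then again without it resumes exactly where the first run stopped, so no work is repeated; that resumability is the property that makes training tolerant of intermittent energy.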


Molly Wood 

Right. And then finally, tell me about the innovation pipeline. Obviously, NVIDIA is an innovator, but no one can do it alone. Is there a corporate venture arm? How are you getting new ideas in to continue to solve some of these problems, other than just asking AI, which at this point I think we're all just doing: hey, how do I make this better? Right.


Josh Parker 

Yeah, right. With varying degrees of success. But yeah, the innovation, and I know I sound totally over the top here, like I've drunk the Kool-Aid, but the innovation in AI and sustainability is really amazing. And I think it's largely because AI is a democratizing tool for technology. You're able to, with AI, become a coder, you know, and use technology in ways


Molly Wood 

haha


Josh Parker 

that you didn't have the technical skills for before, because it's basically a layer that makes technology more available, more accessible to all of us. So we're seeing a proliferation of startups across AI, but specifically in climate tech. We have a program called Inception, where we try to enable startups to use AI and our platform. And at last count,

I can't remember the exact number, but it's thousands of companies in this Inception program, and over a thousand of them are in climate tech specifically. So lots of people who want to do amazing things with technology now have a new tool that allows them to do more. And you don't have to be, you know, a PhD physicist to work on climate change if you have really powerful AI tools to help you bridge that gap to the technology.

Very, very optimistic. It's early days, of course, too early to see direct results, but there is a ton of innovation happening, and I expect in the next several years we'll start to see very exciting and promising results: carbon capture and storage, weather and climate modeling, all sorts of technologies.


Molly Wood 

I mean, I think this is where I started, with that kind of dismissive comment. Not dismissive, exactly, but there's the sense that everybody's sort of waiting for these results. I feel like it's important to note here that the technology is new, that the race and freak-out for more energy to run it is new, and that we all hope it proves its capability over time so that soon we'll be able to solve more than just composing better-sounding emails.


Josh Parker 

Right, right.


Molly Wood 

Not that that's not important, but I think there is this sense of like, okay, cool, if we're gonna like build this out and consume a bunch of energy, when are we gonna turn to using it to solve really big problems?


Josh Parker 

Yeah. Yeah. And in the near term, there's a bit of a storytelling problem, and I like to focus on stories. People's appreciation of AI, especially if they haven't used it in the last six months, may be, yeah, I can use AI to create an image of, you know, my cousin riding a donkey in the jungle. But behind the scenes it's doing so much more than what the average consumer sees, and so it's about getting that narrative out there, getting those stories out there


Molly Wood 

Mm-hmm.


Josh Parker 

about carbon capture and storage, about optimizing traffic systems, about optimizing manufacturing, and about really step-change innovations around quantum computing or fusion. If we can advance fusion development by just five years using AI, absolutely, that's game-changing. And those types of things are feasible. It's
reasonable to expect AI to help accelerate innovation across those industries significantly.


Molly Wood

Love it. Love it. Thanks for ending on a storytelling note. Josh Parker is head of sustainability at Nvidia. I really, really appreciate the time.


Josh Parker

Thanks, Molly. My pleasure.


Molly Wood Voice-Over:

That's it for this episode of Everybody in the Pool. Thank you so much for listening.

Next week, we’ll dig into data centers specifically with Chris Walker, who leads sustainability for AWS, you know, the Amazon cloud infrastructure. We’ll talk about surprisingly nerdy and sometimes low-tech ways to make EXISTING data centers a lot more efficient, in addition to building newer, better ones.

Email me your thoughts and suggestions to in at everybody in the pool dot com, and find all the latest episodes and more at everybody in the pool dot com, the website. And if you want to become a subscriber and get an ad-free version of the show, hit the link in the description in your podcast app of choice.

Thank you to those of you who already have. See you next week.

bottom of page