Insider's Guide to Energy
The energy industry is evolving in unique ways as traders come under increasing pressure to manage costs, cash, limits, and risks. The Insider’s Guide to Energy Podcast addresses the current and emerging challenges business executives face daily, through stories shared by peers and industry experts, covering topics such as innovation, disruptive technologies, and emerging trends.
190 - NVIDIA’s AI Revolution in Energy: Smart Grids, Renewables, and Carbon Capture
In this episode of the Insider's Guide to Energy Podcast, we dive deep into how NVIDIA is revolutionizing the energy sector through the power of AI and accelerated computing. Discover how NVIDIA's cutting-edge technology is being integrated into smart grids, enabling more efficient and reliable energy distribution. The conversation covers the company's collaboration with energy and utility companies, offering insights into how AI is optimizing legacy infrastructures and paving the way for a smarter, more resilient grid.
Beyond smart grids, the podcast explores NVIDIA's role in advancing renewable energy. Learn how AI-driven simulations and digital twins are helping to optimize wind farm production and other renewable energy sources, making them more cost-effective and efficient. NVIDIA's innovative approaches are not only enhancing energy production but also contributing to significant reductions in carbon footprints, a crucial step in combating climate change.
The episode also delves into the exciting potential of AI in carbon capture and sequestration technologies. NVIDIA's collaboration with industry leaders like Shell demonstrates how AI can accelerate the development of carbon storage solutions, making them more accurate and efficient. By leveraging advanced AI models, NVIDIA is playing a key role in addressing some of the biggest challenges in the energy industry, from reducing energy consumption in data centers to supporting groundbreaking research in sustainable technologies.
We were pleased to host: https://www.linkedin.com/in/dionharris/
Visit our website: https://insidersguidetoenergy.com/
Transcript
00:00:01 Dion Harris
Everyone knows NVIDIA is very involved in AI, but what many may not know is that we're closely working with a number of energy and utility companies to bring AI to the energy sector.
00:00:16 Chris Sass
Trusted source for information on the energy transition. This is the Insider’s Guide to Energy podcast.
00:00:28 Chris Sass
Hello, energy leader. This is Chris Sass, host of the Insider's Guide to Energy, and I'm asking for just a few moments of your time to make you aware of a brand new program that I think you'll be excited about. The program is called Podcast Power: Media Training for Energy Leaders.
00:00:42 Chris Sass
You're probably already aware that podcasts are a dominant force in media today. As an executive in the energy space, you are often asked to be a guest on a podcast, and your legacy media training leaves a lot of gaps in the skills you need to be successful on a podcast. Common sense alone doesn't do it.
00:00:58 Chris Sass
And to avoid these pitfalls, Insider's Guide to Energy has partnered with the Antenna Group and their media trainers to create a hands-on, interactive two-day workshop designed specifically for executives in the energy and clean tech space. You'll get all the skills you need to have successful interviews and to reach the audience you intend to reach. So whether you're reaching out to customers, future employees, or investors, this is the class for you. Space is limited to the first ten, and those who are selected will get the opportunity to be a guest on the Insider's Guide to Energy, practicing your new skills.
00:01:32 Chris Sass
Welcome to another edition of the Insider's Guide to Energy. I'm your host, Chris Sass, and with me today is my co-host Jeff McAulay. Jeff, another great day in energy. How are you doing today?
00:01:39 Jeff McAulay
Always exciting, and this topic, boy, I'm amazed we haven't gone into more depth before now. Really excited to have the AI conversation with you today.
00:01:50 Chris Sass
Well, if you're going to talk AI, there's no way to talk about it without talking about NVIDIA. They're in the news all the time, they're a leader in the field, and I am super excited to have our guest today from the product group at NVIDIA, Dion Harris. Welcome to the podcast.
00:02:03 Dion Harris
Thanks for having me. Pleasure to be here.
00:02:05 Chris Sass
So we've set the bar high. We're expecting amazing things from today's conversation, so don't let us and our audience down. You know, NVIDIA has, you know, a lot of credibility, a lot of things to stand up to. I guess the first thing is, we're on an energy podcast and we're talking to what's perceived as a chip manufacturer. Let's start there. Why is an energy podcast talking to NVIDIA?
00:02:26 Dion Harris
Well, when you think about NVIDIA, like you said, we're known for GPUs, and of course we make world-class GPUs, and that is sort of the bedrock of our platform. But in order to make those GPUs useful, and accelerated computing as a computing platform useful, it takes a lot of domain-specific software to really activate and optimize that underlying infrastructure. And so NVIDIA has spent a number of years doing accelerated computing domain by domain. Energy is one of those domains that we've been tackling for a number of years.
00:03:03 Dion Harris
And not just in terms of how we can leverage AI, but how we can leverage various simulation techniques, leveraging materials science, leveraging renewable energy research. So there are all these different confluences of technologies that we're able to bring to bear and really offer up for the energy sector to take advantage of.
00:03:23 Chris Sass
I hate doing infomercials and commercials, but I do want to know: what kind of space are you in, or what products do you have in the energy space right now?
00:03:31 Dion Harris
Absolutely.
00:03:32 Dion Harris
So like I mentioned before, NVIDIA is an accelerated computing platform company. And so the way that we approach every industry is we think about, OK, what can we add uniquely from an underlying infrastructure standpoint and from a software and acceleration library standpoint, and then how can we enable the platform companies within that industry or domain to take that to market, to their customers? So for example, we're working with companies like Utilidata to build out smart utility grids, to help enable them to take advantage of AI to really transform a lot of the legacy grids that exist in the country today. In that particular instance, we're leveraging our core platform technology, we're making some adjustments to hardware to integrate with their hardware platforms, but it's the software that we enable through our AI platforms, whether it be object detection, whether it be sensing, whether it be the data analytics packages. And so we're making all of those available to companies that can then build on top of that to deliver services to their customers.
00:04:40 Jeff McAulay
So it sounds like NVIDIA goes way beyond just delivering chips. You're an active participant now. Help us understand: you're not running the data centers, but you are an active participant in helping data centers run better and more energy efficiently. So we definitely want to hear about how you can help those data centers use less juice than they may be forecasted to use today. And then, absolutely, come back to how you see the energy ecosystem and the electric grid as an application that you are actively developing solutions for with other ecosystem partners. That's way more levels than I understood NVIDIA to be involved in. Is that an accurate description?
00:05:23 Dion Harris
Very accurate. And so, first, let me just unpack accelerated computing, because I always like to start there; that's the crux of what we do. It's a full-stack problem. It starts with, like I mentioned, processors, whether it be CPUs, DPUs, or GPUs, but it's very reliant on the network. Like you described, the data center today is really the new unit of compute, meaning we're no longer just looking at leveraging a single server, or even a single rack, to service some of these workloads. It really is the full data center working in concert.
00:05:53 Dion Harris
So we have to optimize at the networking layer, and then all of this takes very robust orchestration and software to make it all work as one. So we invest a lot in our software; in fact, NVIDIA has more software engineers than hardware engineers, believe it or not. And when you look at the data center as a whole, we have to innovate at all the layers of the stack. What that translates into, once you've done that, is orders of magnitude of overall speedup. And because you're able to make an application run faster, you're often able to make that application consume less energy. It seems kind of counterintuitive, right? Because you're taking a GPU and adding it to a server, therefore adding more workload and more power consumption to that server. But because we can improve the productivity by 10, 30, 40X in a lot of cases, you can actually drive down the energy consumption of that particular application.
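To make that energy arithmetic concrete, here is a minimal Python sketch. The wattages and the 20x speedup are illustrative assumptions, not figures from the episode; the point is simply that energy is power multiplied by time, so a large enough speedup outweighs a higher power draw.

```python
# Minimal sketch of the energy-vs-speedup arithmetic described above.
# The wattages and speedup below are illustrative assumptions, not NVIDIA figures.

def job_energy_kwh(power_kw: float, runtime_hours: float) -> float:
    """Energy consumed by one job: power (kW) x time (h) = energy (kWh)."""
    return power_kw * runtime_hours

cpu_only = job_energy_kwh(power_kw=1.0, runtime_hours=10.0)    # baseline: 10.0 kWh
accelerated = job_energy_kwh(power_kw=3.0, runtime_hours=0.5)  # 20x faster, higher draw: 1.5 kWh

print(f"CPU-only job:     {cpu_only:.1f} kWh")
print(f"Accelerated job:  {accelerated:.1f} kWh")
print(f"Energy reduction: {100 * (1 - accelerated / cpu_only):.0f}%")
```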
00:06:49 Chris Sass
So all the stories I see in the news are about data centers taking over the world and needing more and more power. What you're saying is you've driven power consumption down by 10, 20 percent. So does that mean the demand is just that there are that many more applications coming into the world?
00:07:03 Dion Harris
So I think there are these competing forces, right?
00:07:09 Dion Harris
Basically, the growth in data is exploding. When you look at some of the estimates, they suggest that you'll have over 175 zettabytes of data produced by 2025.
00:07:21 Dion Harris
So you have that growth in data, but then you also have this emerging capability, brought together by both accelerated computing and AI, that's improving overall compute efficiency by hundreds of thousands of X factors. We're not just talking about 5 to 10X; over the last 10 years, for example, we've improved the efficiency of our platform for certain AI workloads by 100,000X. These are numbers that don't even make sense when you're used to looking at computing frameworks and generation-over-generation improvements. But on top of that, the key thing that we try to drive home is that once you deploy a lot of these models, AI and some of these accelerated applications not only reduce the time to solution, they reduce the overall energy consumption. And so there are a number of use cases we can talk about, and I'm sure we'll get into them, that describe how that trade-off happens.
00:08:22 Jeff McAulay
I'm excited to get into the use cases. Just on data center operations: I've seen numbers on the order of eight to ten gigawatts of data center power consumption today, and that's going to 15 to 20, so more than doubling by 2030. Because that load growth is so large, it's important that you're able to provide some energy efficiency, so maybe it's not as taxing. I'm also curious how much of this power can be supplied by renewable energy. We see big targets from the big tech companies: Apple, Google, Amazon.
00:09:00 Jeff McAulay
How involved are you in those renewable energy targets, or in facilitating the use of renewable energy in data centers?
00:09:08 Dion Harris
Yeah. So first, what NVIDIA is, first and foremost: we're focused on making our platform as efficient as possible. That really is where we try to spend most of our effort, making sure that we're delivering value to our customers so they can do the most work with the least amount of energy, the least amount of cost, and the least amount of people. It's all about helping them optimize their investments. So that's really where NVIDIA is focused, first and foremost. When you look at some of the downstream effects of what's happening in the data center, I think there are a couple of things that we like to highlight.
00:09:42 Dion Harris
As you look at that growth of, say, one to two percent of overall energy consumption for data centers, what we're really looking to figure out is how we can help data centers affect that other 98 percent of energy consumption. In other words, how can we accelerate research in renewable energy sources? How can we help improve the efficiency of the fleets that exist out there in aeronautics and automotive? How can we also reduce the overall waste that happens in large industry, like manufacturing and heavy-industry use cases? A lot of the attention goes to data center consumption, but we think much of the innovation happening across industry that's going to help us tackle the problem of reducing carbon footprint, and reducing overall impact on climate, is going to be driven by computational science. So we try to make sure our platform is as efficient as possible so that we can help power some of those breakthroughs.
00:10:41 Jeff McAulay
One more question on the location of the data centers, because I've heard this come up in a couple of different cases and heard both sides of the argument: that data centers need to be located close to population centers because they're performing tasks that require low latency, and we have very little patience these days for how long it takes something to load.
00:11:02 Jeff McAulay
On the other hand, I've heard that for batch processes you could have remotely located data centers; you could do model training and other more advanced, computationally intensive work in areas that are maybe very windy but don't have the transmission.
00:11:18 Jeff McAulay
How would you approach that? Is there an opportunity for remote data centers to take advantage of stranded renewable energy today?
00:11:26 Dion Harris
Absolutely, and we're seeing that happen. In other words, when you look at where data centers need to be built, there are a couple of key functions that we talk about, especially as it relates to AI workflows. Some of the training workloads don't care where they are. In other words, they can be trained anywhere, in remote areas that have lots of access to renewable resources, whether it's wind or sun; in some cases there might be opportunities to locate data centers in deserts where you have lots of access to solar energy. The key thing is they don't need to be close to densely populated areas, because they're not serving up workloads to users; they're performing a function that's very centralized and batch-oriented. There are some inference-based workloads where you do want proximity to usage, or proximity to the data, so that you can make it available and hit the latency targets that facilitate a great user experience. But for the most part, we are starting to see a huge shift toward what we call AI foundries, where people are building these models and a lot of these AI capabilities, and the data centers are being placed where resources are plentiful and where they're less disruptive to highly populated areas.
00:12:40 Chris Sass
So one of the uses we've talked about so far is data centers: location, and kind of the demand side of the equation, making things more efficient. But folks use AI every day in the energy business and utilities. How are the grids using AI? What are we getting into there, and what is NVIDIA helping us with?
00:13:00 Dion Harris
So grids are really an interesting problem to solve. Just some background on my career: I spent about five years working at PG&E, which is our Northern California utility.
00:13:12 Dion Harris
And one of the things you really come to appreciate about an electricity grid is that it is truly a grid. It is a distributed set of infrastructure, components, and endpoints that all need to work together to be able to deliver power reliably and efficiently.
00:13:35 Dion Harris
So the unique challenge about this particular problem is that it's not a centralized problem. It's not like you can just put something in a data center and have it actively manage this entire infrastructure. So what we've done is leverage a lot of our edge-based AI use cases. Like I mentioned before, that could be remote sensing, it could be image detection, say for wildfire detection, or image classification for predictive maintenance. We have brought to bear a number of edge AI sensing use cases that enable smart grids to take all of those endpoints, whether they're meters, whether they're transformers, whether they're substations, and allow them to take the data that exists within their portion of the grid and make it available to the broader system. And once you can do that, you truly unlock the power to improve efficiency and deliver better utilization. Today most grids are only about 30 percent utilized, so by baking this intelligence into the grid, you can deliver more utilization, more reliability, and ultimately more efficiency.
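As an illustration of the kind of edge inference described here (a vision model scoring inspection imagery for predictive maintenance or wildfire detection), below is a minimal, generic Python sketch using a pretrained classifier. This is not NVIDIA's or any partner's grid software; the model choice and the placeholder image path are assumptions for illustration, and in practice such a model would be fine-tuned on utility-specific defect classes.

```python
# A generic sketch of edge image classification for asset inspection.
# Illustrative only: a small pretrained model scoring one field photo.
import torch
from PIL import Image
from torchvision.models import resnet18, ResNet18_Weights

weights = ResNet18_Weights.DEFAULT
model = resnet18(weights=weights).eval()      # small model, plausible for edge devices
preprocess = weights.transforms()             # resize/normalize as the model expects

image = Image.open("pole_inspection.jpg")     # placeholder path for a field photo
batch = preprocess(image).unsqueeze(0)        # shape: (1, 3, H, W)

with torch.no_grad():
    scores = model(batch).softmax(dim=1)

top_prob, top_class = scores.max(dim=1)
print(f"Predicted class index {top_class.item()} with probability {top_prob.item():.2f}")
```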
00:14:46 Chris Sass
Where are we along this? Where are folks using the software, and how much are they embracing and leaning into the AI today?
00:14:54 Dion Harris
Well, like I mentioned, we're partnering with some key partners. I mentioned Utilidata; they're one that has taken some of our core technologies, fully baked them into their platform, and is having a lot of success.
00:15:07 Dion Harris
We're also working with other meter companies, for example Hubbell. They've done some really incredible work, and they're along this journey of moving from pure smart meters to actual endpoint sensors. So what I would say is it's really a continuum. Right now, like I said, we have smart meters, and we're getting to a point where we can do predictive analysis. But as we move forward and start to really adopt and leverage AI in a broader sense across the grid, you can think about the coordinated efforts we can run in real time across those resources, and you can also start to think about the dynamic control mechanisms you can use to optimize the grid. So I would say that's definitely more future tense. But the huge installed base of smart meters and smart devices now going into the grid is going to unlock a whole new set of opportunities that we can tap into.
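As a toy sketch of the "predictive analysis" step on smart meter data mentioned above, the snippet below flags readings that deviate sharply from a meter's recent behavior. The column names, window size, and threshold are assumptions for illustration, not part of any product discussed in the episode.

```python
# Toy anomaly flagging on hourly smart meter readings (illustrative only).
import pandas as pd

def flag_anomalies(readings: pd.DataFrame, window: int = 24, z_threshold: float = 3.0) -> pd.DataFrame:
    """Add a rolling z-score per meter and mark readings that exceed the threshold.

    Expects columns: meter_id, timestamp, kwh (assumed schema for this sketch).
    """
    out = readings.sort_values(["meter_id", "timestamp"]).copy()
    grouped = out.groupby("meter_id")["kwh"]
    rolling_mean = grouped.transform(lambda s: s.rolling(window, min_periods=window).mean())
    rolling_std = grouped.transform(lambda s: s.rolling(window, min_periods=window).std())
    out["zscore"] = (out["kwh"] - rolling_mean) / rolling_std
    out["anomaly"] = out["zscore"].abs() > z_threshold
    return out

# Example usage:
# flagged = flag_anomalies(df)
# print(flagged[flagged["anomaly"]])
```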
00:16:02 Jeff McAulay
Dion, as we're looking ahead here, what other exciting applications are on the horizon? I'm thinking in particular about digital twins and predictive maintenance. That could be for grid infrastructure, but it could also help replace major energy-using components; you mentioned a few earlier. Old equipment uses more energy, and if you can replace it earlier, there's an energy impact there as well. What other simulations in that category do you think will be coming online soon?
00:16:32 Dion Harris
Yeah. So, I mean, digital twins is a really exciting area. We could spend a whole hour talking about that.
00:16:38 Chris Sass
We will. I think we're going to have NVIDIA back to do an episode just on digital twins in the future.
00:16:42 Dion Harris
Yeah, I'm looking forward to that; that should be fun. I'll tune into that one as well. But what's really exciting about digital twins, and I'll just describe it in this context:
00:16:53 Dion Harris
it is truly a confluence of a number of technologies coming together. In other words, you have simulation, so you can take your first-principles-based simulation, which is often used to model grid technologies or just the general operation of plants, for example.
00:17:10 Dion Harris
And you can also take AI surrogate-based approaches that help accelerate that and create more real-time, interactive approaches to that modeling capability. Then you have the visualization elements that bring all of those data sources into a single model, which gives you a feedback loop and a control mechanism, but also a representation of the full system.
00:17:32 Dion Harris
And so there are a number of different examples we've highlighted. We've worked with Siemens Gamesa to model wind farm production capabilities. When you think about how they're designing and building a wind farm, they need to understand the optimal placement, accounting for the wake of each wind turbine, to maximize not just the output but also to optimize the cost, to make sure that the renewable resource is cost-effective. We had some great use cases where we showcased that. There's also an interesting opportunity where we worked with Siemens Energy on a heat recovery steam generator plant, where we demonstrated the capability of leveraging AI to do real-time modeling that enables them to optimize the uptime and availability of certain resources. The examples are countless, and like I said, I'll save some of those for the person who gets to talk about them during the digital twin segment. But it's a really exciting area that we're doing a lot of work in.
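To illustrate the AI-surrogate idea behind these digital twins, here is a toy Python sketch: a small neural network is trained to mimic an "expensive" simulation so it can be queried interactively. The stand-in function and network are assumptions; real wind farm or plant surrogates are far more sophisticated.

```python
# Toy sketch of an AI surrogate: fit a small network to mimic a slow solver.
# The "simulation" is a stand-in function, not a real wind-farm or plant model.
import torch
import torch.nn as nn

def expensive_simulation(x: torch.Tensor) -> torch.Tensor:
    """Stand-in for a slow physics solver: maps operating conditions to an output."""
    return torch.sin(3 * x[:, :1]) * torch.exp(-x[:, 1:2] ** 2)

# Generate training data by running the "simulation" offline.
inputs = torch.rand(2048, 2) * 2 - 1
targets = expensive_simulation(inputs)

surrogate = nn.Sequential(nn.Linear(2, 64), nn.Tanh(), nn.Linear(64, 64), nn.Tanh(), nn.Linear(64, 1))
optimizer = torch.optim.Adam(surrogate.parameters(), lr=1e-3)

for step in range(2000):
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(surrogate(inputs), targets)
    loss.backward()
    optimizer.step()

# Once trained, the surrogate answers "what if" queries without re-running the
# solver, which is what makes interactive digital twins practical.
query = torch.tensor([[0.2, -0.5]])
print(surrogate(query), expensive_simulation(query))
```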
00:18:36 Chris Sass
Right. So you keep saying "we've done work with." I'm good at waving my arms around and pointing at whiteboards and PowerPoints and things like that; help me understand what that means. Is there an NVIDIA engineering or dev team that's working with Siemens? Is it some other relationship? Is it working through APIs? What does it mean that you're working with them? I don't quite understand.
00:18:58 Dion Harris
That's a good question.
00:19:00 Dion Harris
It's not an obvious connection, because NVIDIA is known for chips and infrastructure. But we also have world-class PhD scientists who are practitioners in a lot of these scientific domains. For example, in climate, we have a number of climate tech efforts where we're building AI surrogate models, and we have practitioners who've led research, both in higher ed and in industry, on developing these particular models. So we're not just building the hardware and the software that runs the infrastructure; we're also bringing domain-specific knowledge.
00:19:38 Dion Harris
And we do that in concert with a lot of our partners. So when I say we're working with Siemens Gamesa, or working with The Weather Company, for example, it's really about helping them understand how they can leverage our underlying platform to achieve their goal. In a lot of cases it's not just our platform from a hardware perspective, but understanding the latest and greatest AI technologies they can leverage to go tackle their problem. Oftentimes they are very rich in domain expertise, but the AI expertise, because it is such a nascent space and evolving so quickly, is where NVIDIA, as you mentioned at the outset of the call, has a lot of experience, looking not just at how it applies to one industry but at how it applies to a number of different industries and domains. So we usually bring that to bear in a lot of our partnership engagements and help them along that continuum.
00:20:31 Jeff McAulay
Makes sense. That's great. So does NVIDIA have its own LLM? Are you working with open-source models, or with a range of different models depending on the application?
00:20:41 Dion Harris
So in terms of large language models, there are a number of key foundation models out there, of course; OpenAI has done a lot of great work. We have the open-source models like Llama 3, for example, which was recently released. NVIDIA has developed a framework called NeMo Megatron, which is basically an open-source framework that helps people fine-tune and build their own foundation models. The key thing about NVIDIA is that we want to be a very open platform, so we support all of those major foundation models that I described. We also build models ourselves, just so that we become practitioners and not just vendors and suppliers in the space.
00:21:23 Dion Harris
And I say that to say LLMs are a very rapidly evolving space, and we don't think there'll be any single model that rules them all. We think every customer will want their own model that speaks to their own business, their own scientific domain, and their own industry. So we try to help facilitate that through a lot of our foundation models and the tools we make available, whether it's our NeMo platform, which helps people build, train, and develop models themselves, or things like guardrails that let them build models that adhere to their specific policies and, sort of, desires. We want to, like I said, enable the broader community to take advantage of AI and of a lot of these technologies, so that it's not just a select few that can leverage them.
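For a rough sense of what "leveraging an open foundation model" looks like in code, here is a minimal sketch using the Hugging Face transformers library rather than NVIDIA's NeMo stack, which is what is actually described above. The model checkpoint name is illustrative (Llama 3 weights are gated behind a license and require authentication), and the prompt is made up.

```python
# Minimal sketch of calling an open foundation model (illustrative, not NeMo).
# Requires transformers and accelerate; the model below is a gated checkpoint.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="meta-llama/Meta-Llama-3-8B-Instruct",  # assumed/illustrative checkpoint
    device_map="auto",
)

prompt = "In two sentences, explain why grid operators care about transformer load data."
result = generator(prompt, max_new_tokens=80, do_sample=False)
print(result[0]["generated_text"])
```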
00:22:16 Jeff McAulay
One of the biggest challenges in the face of all this growth has not been a lack of demand; it's that people just can't get enough chips. There's a line around the block. So what does NVIDIA look like from a supply perspective to meet all of this growing demand?
00:22:33 Dion Harris
Well, again, NVIDIA is in a sort of unique position in that we have really been there since the beginning of this whole AI revolution.
00:22:44 Dion Harris
We've been investing in this for the last couple of decades, let's say, and so what's really exciting is to now see the impact it's having across a number of industries. To answer your question around supply and demand: we are making our platform available across every cloud provider, and we're engaged with every OEM and every systems maker, to make sure that as we have supply available, it can be accessed ubiquitously.
00:23:14 Dion Harris
And right now, when you see how these systems come online, and you see the plans to bring these systems online, they're very robust; a lot of our partners are building very capable, robust systems. So we feel like, as they come online, that will help meet a lot of that demand. But of course, this is one of those things where you don't have a crystal ball to know where everything will land. I'm very excited to see how our partners are bringing our systems and our GPUs to market, both in the cloud and in on-premise systems, via a lot of our OEM partners.
00:23:48 Chris Sass
Now, you're calling AI a revolution. It definitely seems to have been a disruptor across all industries.
00:23:57 Chris Sass
But we're also pretty early, right? I mean, we've been doing machine learning and neural networks for a long time; I think I took graduate classes in that more than 20 years ago. But with the language models and where we are today, it seems you've taken a huge leap forward.
00:24:11 Chris Sass
What transformative things do you see coming on the horizon that are going to impact the energy industry, or the development of new technologies? We're trying to get greener, we're trying to get fusion; there are all kinds of things going on. Where does this technology help us, and how?
00:24:28 Dion Harris
Well, there are a couple of points you made there that I'll just underscore. One is that you said it's very nascent. Although we've been at AI for a number of years, and like you said a lot of these machine learning models have been around for decades, in terms of being applied in real-world contexts it's very new. Where I see incredible opportunity, especially for energy and the broader sustainable tech ecosystem as a whole, is this ability to leverage AI in place of traditional methods, whether for simulation, for modeling, or for predictive analytics. A lot of those capabilities are going to unlock a whole new set of efficiencies, and even services, I think, across a lot of the industrial sector.
00:25:19 Dion Harris
One thing that I'm really excited about is seeing a lot of the work happening in the fusion reactor modeling space. For example, there's some great work happening at the UK Atomic Energy Authority, where they basically built a digital twin. But it's not just a digital twin for the sake of having visualization elements; it was also able to model the plasma physics using AI, and it coupled that with the hardware and the structure of the CAD designs, so you now have this incredible representation of how you can model and keep the plasma stable to create higher-efficiency fusion reactions, but also in the context of the actual physical structure. What's really exciting is that this confluence of AI, simulation, and digital twins is truly, I think, going to unlock breakthroughs as we get into this space of trying to tackle renewable energy. So that's the part that's really exciting. And then there's lots of great work happening here and now around carbon capture and sequestration as well. So I think that's another really exciting area where we're seeing AI applied.
00:26:30 Chris Sass
Let's take a second there. What kind of work in carbon capture?
00:26:35 Dion Harris
Sure. So we've done some work with Shell to bring to bear a technology; it's an AI-based approach called Fourier neural operators, or FNOs for short. What's really incredible about it is that it basically leverages an AI surrogate model to enhance the simulation of CO2 plume migration in subsurface environments. It's trained on a lot of real-world conditions, and it allows you to understand, in great depth, how you can do real-time simulations. It delivers a lot of computational acceleration, so you can run the simulations faster and iterate very quickly to understand the optimal placement and storage for some of these carbon storage reservoirs. It also enables super-resolution techniques, so you can do a lot of high-resolution simulations very quickly and efficiently. What's really cool about these techniques is that they're brand new, and a lot of them were developed for other industries and sectors, like climate and weather or automotive design, but we're actually using them in this carbon sequestration modeling use case. So it's pretty exciting.
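For readers curious what a Fourier neural operator actually does, here is a stripped-down Python sketch of its core building block: a spectral convolution that learns weights on low Fourier modes. It is a generic illustration, not the Shell/NVIDIA reservoir model; the channel counts, mode counts, and field sizes are arbitrary assumptions, and a full FNO adds pointwise paths and multiple layers.

```python
# Stripped-down sketch of an FNO spectral convolution layer (illustrative only).
import torch
import torch.nn as nn

class SpectralConv2d(nn.Module):
    def __init__(self, channels: int, modes: int):
        super().__init__()
        self.modes = modes  # number of low-frequency modes kept in each dimension
        scale = 1.0 / (channels * channels)
        self.weights = nn.Parameter(
            scale * torch.randn(channels, channels, modes, modes, dtype=torch.cfloat)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, height, width), e.g. gridded pressure/saturation fields
        x_ft = torch.fft.rfft2(x)                       # to frequency space
        out_ft = torch.zeros_like(x_ft)
        m = self.modes
        # Mix channels for the retained low-frequency modes only (simplified).
        out_ft[:, :, :m, :m] = torch.einsum(
            "bixy,ioxy->boxy", x_ft[:, :, :m, :m], self.weights
        )
        return torch.fft.irfft2(out_ft, s=x.shape[-2:])  # back to physical space

# Example: one spectral layer acting on a batch of 64x64 subsurface fields.
layer = SpectralConv2d(channels=4, modes=12)
fields = torch.randn(8, 4, 64, 64)
print(layer(fields).shape)  # torch.Size([8, 4, 64, 64])
```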
00:27:56 Jeff McAulay
So, with maybe a last question here, I've got to push the envelope a little bit. There is an important discussion around ethics in AI, in particular in the context of geopolitics; there's a global race for the commanding heights of artificial intelligence. How do you think about including ethics or geopolitical concerns in the work that you're doing, or does it come up?
00:28:21 Dion Harris
Absolutely. I mean, NVIDIA is definitely trying to approach this topic of AI, and the adoption of AI, in a very responsible way. I'll use an example: when we built a service called Picasso, which is basically an image generation service, we wanted to make sure that all the underlying content was ethically sourced. So we worked with partners to make sure we had properly licensed images and content to go and train that model. Whenever we're developing models and looking at our AI use cases, we try to take into consideration both the ethical and, overall, the most responsible way to do that.
00:29:09 Dion Harris
Another key thing I would say we often do: when we develop these tools and technologies, a lot of times we've understood them and had them for years before we even make them available, because with a lot of these technologies you don't know how useful or harmful they can be until they're in the wild, and you want to make sure you do as much due diligence as you can. So we take a very prudent approach to releasing things into the wild, to make sure we can at least mitigate as much as possible. And then the third thing I would say we're really adamant about is creating tools so that the people who are building these technologies can make sure they're built in responsible ways. We have things like NeMo Guardrails, which allow people to build models that adhere to, and behave in, the way they want them to behave. In other words, you don't want a customer service model using profane language, right? So you can build in guardrails and make sure it adheres to your brand policies and guidelines. We also make sure models adhere to the laws of physics: we have physics-informed neural networks, so that when you're deploying some of these solutions you're not creating hallucinations in physical simulations, and you get accurate, reliable information. All of those things are being brought to bear to help make sure we can do things as ethically and responsibly as possible when deploying AI.
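Below is a compact sketch of the physics-informed idea mentioned above: in addition to any data loss, the network is penalized on how badly it violates a governing equation at sampled points. The 1D heat equation, the network size, and the diffusivity value are generic assumptions, not a grid-specific or NVIDIA model.

```python
# Compact sketch of a physics-informed training loss (illustrative only).
import torch
import torch.nn as nn

net = nn.Sequential(nn.Linear(2, 64), nn.Tanh(), nn.Linear(64, 64), nn.Tanh(), nn.Linear(64, 1))
alpha = 0.1  # assumed diffusivity for the toy heat equation u_t = alpha * u_xx

def pde_residual(xt: torch.Tensor) -> torch.Tensor:
    """Residual of u_t - alpha * u_xx at the sampled (x, t) collocation points."""
    xt = xt.clone().requires_grad_(True)
    u = net(xt)
    grads = torch.autograd.grad(u, xt, torch.ones_like(u), create_graph=True)[0]
    u_x, u_t = grads[:, 0:1], grads[:, 1:2]
    u_xx = torch.autograd.grad(u_x, xt, torch.ones_like(u_x), create_graph=True)[0][:, 0:1]
    return u_t - alpha * u_xx

optimizer = torch.optim.Adam(net.parameters(), lr=1e-3)
collocation = torch.rand(1024, 2)  # random (x, t) points in the unit square
for step in range(500):
    optimizer.zero_grad()
    physics_loss = pde_residual(collocation).pow(2).mean()
    # In a real PINN this term is added to a data/boundary-condition loss.
    physics_loss.backward()
    optimizer.step()
print(float(physics_loss))
```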
00:30:32 Jeff McAulay
Well, it's really been a tremendous journey; we've covered so much ground in a limited amount of time. Dion, we started off with the chips themselves and their usage in data centers and locations, but that's very much just scratching the surface, and you've taken us on a journey through a range of applications, from the supply side to demand, to transmission, to carbon sequestration, to performance and operation. It's a huge range of applications, all very exciting, both here now and just around the corner. I can't imagine a better guide for this journey. Thank you so much, and we're excited to do another chapter with some of your colleagues. Really appreciate the time today.
00:31:14 Dion Harris
Thanks for having me. It was a pleasure.
00:31:17 Chris Sass
For our audience, we hope you've enjoyed this episode as much as we did making it. We try to look at all angles of the energy industry, and NVIDIA is an example of us thinking outside the box and showing you what's going on. It's a complex industry, as you all know, being members of it.
00:31:31 Chris Sass
There's a lot more to it than is seen on the surface, so we hope you enjoyed this episode. If you like content like this and want to keep seeing it, please like and subscribe, add comments on our YouTube channel, and follow us. That's how we know you're appreciating the content, and we will see you again next time on the Insider's Guide to Energy. Bye for now.