When you plunk a $100 million wind farm down on 98,000 acres of varying terrain, you want to know a few things. You want to know that you’re optimizing the location of your multi-million-dollar turbines. You want to know that the turbines you source can handle the gustiest gust of wind they will ever encounter without shattering dramatically in a viral video. And you want to test potential use cases and changes in software, which is cheap and changeable, rather than in hardware, which is expensive and hard to edit.
Which is why Siemens Gamesa, the global renewable energy company, is working with NVIDIA to generate AI-powered digital twins of its turbines.
In this TechFirst, I chat with Dion Harris, Lead Product Manager of Accelerated Computing at NVIDIA and Greg Oxley, Siemens Gamesa. Scroll down for the video, podcast, and transcript, or check out the story in my Forbes column …
(Subscribe to my YouTube channel)
TechFirst podcast: digital twins via AI
Transcript: chatting with NVIDIA and Siemens about AI, wind farms, and digital twins
- Dion Harris, Lead Product Manager of Accelerated Computing at NVIDIA
- Greg Oxley, Siemens Gamesa
(This transcript has been lightly edited for length and clarity.)
John Koetsier: So, you’re creating digital twins of wind farms. Why is that?
Dion Harris: Well, I’ll let you jump in there and then I can tell you why other customers are doing it, but would love to get your take specifically…
John Koetsier: Go ahead, Greg. Yeah.
Greg Oxley: When we look at operational wind farms, there’s obviously a lot of physics at play there. So we have wind farms that we’ve sold and have been put in various positions, and now we have to service them. And what we want to do in moving into the digital twin space is to be able to have an accurate digital model of the entire wind farm where we can play out scenarios.
So this could be realizations where we have, you know, incoming weather events and we want to see how to optimally operate that wind farm as we move through these types of events. We could be testing new control strategies, or something we want to look at moving into the future, and we want to see how the wind farm will perform under those new control paradigms.
So, for us, it’s first of all about having these accurate models where we can run, in real time, realizations of what could be occurring at the wind farm, and having the type of platform where we can immerse ourselves in that data. Because it’s fair enough to say that you have this digital model or that digital model, but when it all comes together, how do you immerse yourself and do decision-making based on the results coming out of that digital twin? That’s an important aspect, and it’s a big data problem. It’s a high technology problem to be able to wrap your head around those results. So for us, equally important to having the individual models that make up the digital twin is being able to immerse ourselves in those models and those results and come away with actionable information that we can proceed with.
John Koetsier: Greg, is it also about where you might want to place the individual windmills in a wind farm where they get optimal access to wind?
Greg Oxley: Right. So, when we think of digital twins, we can think of the initialization of that digital twin as the siting activities that we do in placing those wind farms. A lot of the models that we start with as our initial guess, before we release them into an operational digital twin, need to be corrected and informed as data comes in. We can tune those models for the site-specific scenarios that we’re seeing, take that information from the digital twin operation and the plant operation, and feed the learnings and the tuning we’re doing back into the model development that we have a priori, before the wind farm goes into operation… so, in the design phase. So it’s all a connected loop and everything comes together. We gotta get that data engineering and all that information flowing back and forth.
John Koetsier: Dion, let’s bring you in here. Is this something that you’re doing with other customers as well? Are you trying to create digital twins of different types of technologies, installs, machinery, to understand how it works?
Dion Harris: Absolutely. And so, what’s been really interesting is to see all the different use cases, right? Everything from medical use cases, like creating a digital twin of a blood vessel: literally, if you model different blood flows, understanding how that can help avoid hemorrhaging or potential ruptures of blood vessels, for example.
Also, obviously, in addition to wind farms, there’s modeling the globe itself. You know, we’ve had a lot of interesting insights that we’ve been able to glean from leveraging similar approaches to build digital twins of the climate to understand how different scenarios will play out.
And so, the whole notion of digital twins as it relates to scientific and industrial use cases is truly exploding. It’s really interesting to see how it’s being applied.
John Koetsier: Greg, talk about the speed here of modeling. You’ve got some massive speedup with using the new technology and the AI that you’re working on. What’s the speedup?
Greg Oxley: Well, for instance, I can say one use case where we’ve worked with NVIDIA to develop GPU-accelerated wake modeling. So, in this case, when you have a parcel of land and you need to place wind turbines around that land, there’s a lot of things that you have to take into consideration.
So you can think of the simplest case: you just have flat terrain with no barriers, and this becomes a very simple case. You’re just worried about the wakes of the wind turbines: how is the wake from each turbine going to interact with the others? But on top of that, we also have a lot of other things to take care of. And particularly in the onshore space, you’ve seen wind turbines placed along cliffs, you’ve seen wind turbines placed in and around forests.
And so what we have to do is, first of all, we have to model the flow all around the wind park, and we do this at the beginning.
And then on top of that, we overlay the layouts and their wake interactions on top of that prediction. Now, in the optimization loop, we don’t necessarily run that flow model again. We have this background flow model of how the wind is going to behave as it goes around mountains and valleys and how it’s going to be distributed across the park, and when it comes to doing the layout optimization, we have to change the turbine positions. And every time we move a turbine, we have to recalculate the wake model, right?
And now you can imagine our solution space is huge. We have 20 kilometer by 20 kilometer parks, let’s say, in the biggest cases. And so we’re talking about thousands and thousands of iterations. So with our traditional CPU-driven wake modeling, we hit a limit where we just have a bottleneck: we can’t explore the solution spaces as effectively as we want to.
Now with the GPU-accelerated wake model, where we’re seeing anywhere from 10 to 500 times speedups — I’m not sure if I have those numbers exactly correct, but they’re massive speedups — what this does is allow us to explore larger design spaces, many more potential layouts that we can look at, and ultimately come to the most optimal solution, according to our predictions.
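To make the wake recalculation concrete, here is a minimal sketch in the spirit of what Greg describes, using the classic Jensen (park) wake model. The constants, function names, and the root-sum-square wake combination are illustrative assumptions, not Siemens Gamesa’s actual model; the point is that only this cheap wake step, not the full terrain flow solve, sits inside the layout optimization loop.

```python
import numpy as np

# Hypothetical constants for a minimal Jensen (park) wake model sketch.
ROTOR_D = 100.0   # rotor diameter [m]
K_WAKE = 0.05     # wake expansion coefficient
CT = 0.8          # thrust coefficient
A_IND = 0.5 * (1.0 - np.sqrt(1.0 - CT))  # axial induction factor

def waked_speeds(xy, u_free, wind_dir_deg=270.0):
    """Background speed at each turbine minus Jensen wake deficits.

    xy: (n, 2) turbine positions [m].
    u_free: (n,) background speeds, e.g. sampled from the precomputed
    terrain flow field that is NOT re-run inside the optimization loop.
    """
    # Rotate coordinates so the wind blows along +x (270 deg = westerly).
    theta = np.deg2rad(270.0 - wind_dir_deg)
    rot = np.array([[np.cos(theta), -np.sin(theta)],
                    [np.sin(theta),  np.cos(theta)]])
    p = xy @ rot.T
    dx = p[:, 0][None, :] - p[:, 0][:, None]       # downstream distance i -> j
    dy = np.abs(p[:, 1][None, :] - p[:, 1][:, None])
    r0 = ROTOR_D / 2.0
    dxp = np.maximum(dx, 0.0)                      # only downstream matters
    in_wake = (dx > 0) & (dy < r0 + K_WAKE * dxp)  # expanding wake cone
    deficit = np.where(in_wake,
                       2.0 * A_IND / (1.0 + K_WAKE * dxp / r0) ** 2,
                       0.0)
    # Overlapping wakes combine by root-sum-square of the deficits.
    return u_free * (1.0 - np.sqrt((deficit ** 2).sum(axis=0)))

def park_power(xy, u_free):
    """Relative park output: power scales with wind speed cubed."""
    return float((waked_speeds(xy, u_free) ** 3).sum())
```

An optimizer can then score thousands of candidate layouts by calling `park_power` repeatedly; because it is pure array arithmetic, the same code ported to a GPU array library is what makes those thousands of iterations tractable.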
John Koetsier: It’s super interesting, actually, because I live in a mountainous area — I actually live on a mountain, and I love going for walks and hikes and stuff like that — and I’ve always marveled in my particular area where it seems like the wind is in my face every step of my walk. [Laughing] It’s like, it’s swirling around, it’s coming back, and I’m like, how is this possible? Does that sound insane to you? Or does that sound like you’ve heard that before?
Greg Oxley: That sounds like mountain wind to me [laughing].
Dion Harris: Certainly.
Greg Oxley: And it’s something that we deal with quite a lot. If you were to take, let’s say, an isosurface across that farm of how the wind direction is changing and how the wind is accelerating or recirculating — we get these re-circulation bubbles, as wind goes over a mountain and you get these very turbulent flow structures going behind it — you know, it can be so variable.
And, like you said, walking around mountainous terrain, going one way the wind can be in your face, then it’s at your back, you turn around, it’s in your face again, right? So, it’s a complex problem. And we know from modeling wind, from modeling weather, that you just have to look at your forecasts. How often are your forecasts accurate? You know?
John Koetsier: Yeahhh.
Greg Oxley: I’m sure you may have some complaints on that front. It’s a very difficult… and that’s not to put down weather modeling or anything, it’s a very difficult problem to solve. And this type of digital twin work that you’re talking about, in terms of modeling climate, for instance, or getting a better understanding of larger-scale processes than we’re looking at in wind farm design, you know, it really takes that type of feedback between the model and how it’s performing compared to what’s observed. That kind of tuning and feedback really has a chance to drive down those errors.
John Koetsier: Well, that’s really interesting. And, Dion, love to hear from you on this, because Greg mentioned weather. And it’s notoriously complex, as you mentioned, and notoriously difficult, but you’ve got something that you say, “Hey, makes predicting extreme weather events, 45,000 times faster.” Talk about that.
Dion Harris: Sure. Sure. Well, first, before I discuss that, I want to hit on a point that Greg mentioned, but I want to just drive home, which is, sort of, what’s delivering the speedup. Today, it’s really driven by AI, right? Leveraging different neural network approaches, whether it’s a physics-informed approach or a data-driven approach, that’s what gives you sort of this real-time nature of being able to iterate and change these parameters in real time, right? Because otherwise you would have to go and resimulate this in a first-principles based model using CPUs or GPUs, and that will take a long time.
And so, because you’re able to leverage AI… and in this case, we use a platform that we call Modulus that allows us to build models based on physics as well as data-driven models. And of course you can use your simulation data to train and inform the model, as well as observed data. So it’s taking all these different inputs to come up with a fairly accurate real-time model. So then if I take that and, like I said, look at the wind farm, that’s a very complex system, right? When you talk about just the dimensions of that, and I’m not sure… I’m getting some feedback [wind-like noise in background]. I don’t know if that’s you or me, but we’ll keep going … it’s getting louder here [noise suddenly stops].
John Koetsier: That’s better.
Dion Harris: There we go. Yeah. Anyway, so, like I said, as we take the example of the wind farm… very complex system.
Now, if you expand that by a thousand times, 10,000 times, and you’re modeling the Earth’s climate, you have to understand all the interdependencies… and when we say complex system, what we’re really describing is that something over here [stretching out one arm] — and my hand is out and far to the left, out of frame — affects something over here [stretches out other arm], which is, again, completely out of frame, but you get the idea. So the question is: how do you model those interdependencies?
And so what we’ve done is we’ve leveraged our understanding of physics. There are very popular physics equations, partial differential equations, which help you model change across complex systems. And then we’ve also built in the capability to use what are called Fourier neural operators. This is, again, a specific type of neural network that can use some physics to inform its understanding of complex systems, to accelerate that and still deliver the accuracy.
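To illustrate the Fourier neural operator idea Dion mentions: the core of an FNO layer is a convolution done in frequency space, keeping and reweighting only the lowest modes. The sketch below is a bare-bones, hypothetical NumPy illustration, not NVIDIA’s Modulus implementation; a real FNO learns these complex weights by gradient descent and stacks many such layers with pointwise nonlinearities.

```python
import numpy as np

def spectral_conv_1d(u, weights):
    """Core of a Fourier layer: pointwise multiply low FFT modes.

    u: (n,) real-valued field samples on a uniform grid.
    weights: (m,) complex weights for the m lowest Fourier modes
             (learned by gradient descent in a real FNO).
    """
    u_hat = np.fft.rfft(u)                  # to frequency space
    out_hat = np.zeros_like(u_hat)
    m = len(weights)
    out_hat[:m] = u_hat[:m] * weights       # keep and reweight low modes only
    return np.fft.irfft(out_hat, n=len(u))  # back to physical space
```

Because the weights live on Fourier modes rather than grid points, the same learned layer can be evaluated at any resolution, which is part of why this family of models scales to global fields.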
So now, that was sort of a windup to describe what we’re doing on the climate side. We announced something called FourCastNet. And this is something that’s really exciting, and the thing about it is, it’s not just NVIDIA. You know, we don’t like to make it sound like we’re doing these things in a vacuum. We’ve worked with researchers at Berkeley Lab, Caltech, Purdue, Michigan, and Rice University to develop this network. And so what’s really exciting is we’ve been able to simulate the climate.
And so, whenever you’re looking at climate simulation, what’s really important is resolution, right? And the reason why resolution is important is because it gives you the amount of detail and accuracy with which you can build that forecast, right?
So, when you’re talking about climate change, or weather in general, and you say, “Okay, the Earth’s climate went up by an extra two degrees,” that’s on a massive scale. But what you really want to know is: how is that going to affect wildfires in California? Or how is that going to affect the flooding that’s happening in Sri Lanka?
So, it’s about getting more specificity and granularity the more resolution you can sort of impute in the model. So what we were able to do with FourCastNet, compared to sort of the state-of-the-art, first-principles based simulations, is improve the overall performance per… I’ll call it “node,” that’s the best way to describe it.
So we were only using 22 GPU-accelerated nodes, and we were able to deliver the performance of roughly about 98,000… I’m sorry, 984,000 nodes on a specific system that’s out of ECMWF. It’s based on a platform called IFS.
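Taking those quoted node counts at face value, the per-node comparison lines up with the “45,000 times faster” figure mentioned earlier:

```python
cpu_nodes = 984_000  # nodes of the reference IFS-based system, as quoted
gpu_nodes = 22       # GPU-accelerated nodes running FourCastNet
print(round(cpu_nodes / gpu_nodes))  # prints 44727, roughly 45,000x per node
```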
And so the reason why that’s interesting is, when you think about modeling climate, because the systems are so complex, you’re also going to have significant load, whether you’re looking at the actual carbon footprint or the energy that it takes to simulate those. And so it’s really about how you can simulate these massively complex environments in the most efficient way possible. Because if money was no object, if power was no object, you could just throw CPUs at it all day and you’d get there. But because we’re trying to do it in an intelligent way, you know, AI is giving us some tools to model these very complex systems very efficiently, both in terms of time and energy.
So, it’s really about how do you harness the power of AI to then deliver the accuracy that you would have had from your standard first-principles based models to then give you that speedup in performance and overall efficiency. So, long-winded answer, but hopefully that connects the dots between like, say, I would get from sort of wind farms-based simulations to AI to then looking at how that applies to the globe.
John Koetsier: That connected a lot of dots, but I’m still stuck on these very popular physics equations that were never picked last at the ballpark or never attended prom alone or something. But I also want to know, like, what did you do? Did you, were you running these models on your computer and then stopped, because you had all this wind noise and all of a sudden it stopped? What did you do to get better audio all of a sudden? [wind noise in the background returns].
Dion Harris: Um, it’s back. I didn’t do anything, actually. [Laughter]
John Koetsier: Technology is still full of gremlins. It’s okay, no worries.
Dion Harris: Yeah, I didn’t do anything, but it stopped and now it’s back… and now it’s gone again.
John Koetsier: Well, let’s turn to Greg here. And, Greg, want to bring you back in. What is the real-world impact of what you’ve been doing here? Obviously you’re using it to plan, to put the wind farms in place. Clearly you’re also using it to estimate what could happen in different storms, wind speeds, that sort of thing. We’ve seen the video of windmills tearing themselves apart, right? Those massive wings just flying apart. What are you getting out of this? Are you getting increased safety, or increased income, better power generation?
Greg Oxley: Well, yeah. All of the above, I would say. So, I like what Dion mentioned there in that the climate scales are, of course, very important, but all the scales are so intrinsically linked. So we have climate scales; we have mesoscales, which is the mesoscale weather modeling side of things; and then we have the microscale, which is the wind farm scale. We call it microscale in wind energy, but it’s really quite large compared to what you might consider a microscale in your normal life. But what this allows us to do is really get rid of the unknown. I remember General Chuck Yeager, the famous test pilot who broke the sound barrier in the U.S.
John Koetsier: First.
Greg Oxley: Yeah, the first one. You know, he was a test pilot in California and they had a lot of deaths, and they always feared… they called it the “ughknown.”
And I think the ughknown is what we’re always trying to mitigate: what we don’t know, with appropriate buffers put in to take care of it. But that puts us in a non-ideal situation, right? We would rather clear that out and understand the unknowns as best as possible, and get to true optimization instead of just adding buffers on top of everything. We want a true understanding of how the wind turbine is performing in all expected situations, and to be able to react when extreme situations are coming in, and to react in a way that’s productive.
And I don’t mean in producing energy. I mean, maybe shutting down your turbines and not producing energy, but doing that when it’s necessary and making the correct decisions based on models that you trust and are proven to get to the optimal solution, whether it’s in operations, whether it’s in initially siting the turbines. You know, that’s the name of the game.
John Koetsier: That’s really challenging… [crosstalk]
Dion Harris: Yeah, one thing I think I’ll just add there.
John Koetsier: Go ahead.
Dion Harris: I think what’s interesting about, in terms of real-world impact… years ago, before I was even at NVIDIA, I worked at a utility company. And so I was a part of bringing renewable energy to the masses and we were going to give customers an option to choose: do you want to buy renewable, or do you want to buy your standard coal or fossil fuel energy? And it wasn’t just about the supply and the output, but it was truly about cost, right? And so, to the extent that these are going to help make this more attractive, that’s what’s going to drive adoption. It’s not just about creating these technologies and assuming everyone will be willing to pay a higher price for renewable energy.
So it’s really about how we can drive down the cost and make them more cost competitive, and then increase adoption, right? Because right now the problem isn’t, you know, that we can’t produce enough wind energy; it’s really whether you can purchase enough at the right cost.
Greg Oxley: Yeah. And technology is moving so quickly right now. You know, like the digital twin technology on the cloud, and the type of computing resources that we have access to, and the type of AI modeling that we can have to supplement the physics-based modeling. And to kind of… I guess it’s changing so quickly that in a big organization like Siemens Gamesa or Vestas, like the big OEMs, it’s difficult to react to that. And traditional tools are sticky. You know, the traditional things we use, and the engineers that have been there for a long time…
John Koetsier: [Laughing] Habits.
Greg Oxley: …are really a bit stuck in the way we have been. And to accelerate and drive through this digital transformation, it’s really difficult because the user adoption part of it is so key and so difficult. Because, I’m an engineer too. I love my tools. But we really have to, in order to accelerate this, we really have to do something about inserting new tooling and new methods into engineering workflows. It’s not just about developing new technologies. It’s about the adoption of that technology. And that is a big challenge, I can tell you, across any industry. I have experience in aerospace, in energy. We see it everywhere.
John Koetsier: Couldn’t agree more. What you were talking about earlier was the “ughknowns,” which for Chuck Yeager, before they broke the sound barrier, meant wondering: could the material last? Would it fly apart? Would it stay together? Would the planes still function past the sound barrier?
Dion Harris: Yeah.
John Koetsier: But that brings to mind the issue, look, there are known knowns, there’s known unknowns, and there’s unknown unknowns, and those are usually the most dangerous ones. You’re not even thinking about them, but they could totally get you. Now, you’re using modeling to predict what’ll happen.
How do you connect that model to the real world and ensure that it’s increasingly tied to the real world? Because if your model’s off and you think, oh, we’re going to be a hundred percent fine during a wind of 200 kilometers an hour, no problem whatsoever, keep the turbine going, right? [Laughing]. You know, then you’ve got a problem.
So how do you continue making sure that that model conforms to reality?
Greg Oxley: Right. So I can give an example from, let’s say, first-principles physics-based modeling. Let’s say we have a CFD model that we’ve run across a wide range of sites, and then we know what’s happening operationally in the wind farm after it’s been built. So we’ve run it a priori in the design phase, and then it’s been run afterwards. So we can do benchmarking, and we call this our continuous improvement. What we’re looking at, let’s say in this case, if we’re talking about quantity of wind and the potential power output of a wind farm: we have our a priori prediction, we have our physics-based wind models that got us to an AEP (annual energy production) prediction, and then we have what’s really happening in the field.
So we’re always benchmarking back and forth. In CFD, for instance, you can tune constants: in a physics-based model you’re always turning the knobs you need, across a wide range of parks, to get the least error against what’s actually happening in the field. Now, the same thing with machine learning models: you’re constantly training them, they’re constantly improving. So you need this feedback from actual performance in the field, the “reality” of what’s happening, feeding back to your original predictions, tuning back and forth all the time.
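The constant-tuning Greg describes can be sketched as a simple calibration loop: scan candidate values of one model constant and keep the value that minimizes total error against measured park output. Everything here, the function, its arguments, and the toy linear model in the usage example, is a hypothetical illustration of the feedback idea, not the actual Siemens Gamesa workflow.

```python
import numpy as np

def calibrate_constant(predict, observed_aep, candidates):
    """Pick the model constant with least total error across parks.

    predict: fn(constant) -> predicted AEP per park (stands in for
             re-running the wake/CFD model with that constant).
    observed_aep: measured AEP from the operating farms.
    candidates: 1-D array of constant values to try.
    """
    errors = [np.abs(predict(c) - observed_aep).sum() for c in candidates]
    return candidates[int(np.argmin(errors))]
```

For example, with a toy model whose output scales linearly with the constant, `calibrate_constant(lambda c: c * np.array([1.0, 2.0]), np.array([100.0, 200.0]), np.linspace(50.0, 150.0, 101))` recovers the value 100 that matches the field data exactly; the real loop would repeat this as new operational data arrives.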
Dion Harris: Yeah.
John Koetsier: So somebody’s got to be the guinea pig and somebody’s got to actually see: does it actually work in the real world? And sometimes those break, but eventually you get enough data that you’re pretty confident you’re covering 99.99 percent of situations and realities (and you add nines as you add more data).
Dion, let’s turn back to you and maybe we’ll end here. This is pretty exciting stuff: creating digital twins using AI modeling the world to understand what’s going to happen with your technology when it’s in place out there when you’ve actually spent $500 million placing it in the ground, or in the North Sea, or wherever you’ve put your turbines.
Dion Harris: Right.
John Koetsier: Where’s it going next? You’ve talked about 45,000 times faster at predicting extreme weather events. You’ve talked about 10,000 times faster at modeling certain things, but what’s the next step?
Dion Harris: So, I think, like I said, because it’s being applied in a bunch of different domains and disciplines, it’s going to take a different trajectory depending on the use case. I’ll just use climate, since we’ve been touching on that throughout. To your first question around validation, it’s always interesting with climate, because we have hundreds of years of recorded weather and climate data. So you can literally do a post hoc prediction: take the observed data points from 200 years ago, run them through your model, and see how it predicts, you know, 50 years out, 20 years out, 70 years out, etc. So you can literally start to validate your models against historical data, which is a lot of what they do in the climate space, because you have this rich history of data. And that’s on the simulation side. Then on the AI side, of course, we’re validating against both simulation and observation.
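The post hoc validation Dion describes can be sketched as a simple hindcast: fit a model on the early part of a historical record, then score it on the held-out later part. The toy linear-trend model below is purely illustrative; real climate validation uses far richer models and reanalysis datasets.

```python
import numpy as np

def hindcast_error(series, train_frac=0.7):
    """Backtest a toy trend model on held-out later observations.

    series: 1-D array of historical observations in time order.
    Fits a linear trend on the early portion and reports mean absolute
    error on the later portion, the post hoc idea in miniature.
    """
    n = len(series)
    split = int(n * train_frac)
    t = np.arange(n)
    slope, intercept = np.polyfit(t[:split], series[:split], 1)
    forecast = slope * t[split:] + intercept
    return float(np.abs(forecast - series[split:]).mean())
```

A perfectly linear record scores near-zero error, while noisy or regime-shifting records score worse; driving that held-out error down over time is exactly the "tuning back and forth" loop described above.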
And that’s really key when you talk about climate, because when we’re describing some of these outputs, right, we’re saying, “Okay, what is the climate gonna look like 50 years from now?” And that has very real implications, right? If you’re trying to determine how much we should limit carbon in our environment, what sorts of policies should we put in place in order to avoid some particular outcome?
You have to be very confident in that outcome to spend millions and billions of dollars today in order to drive real behavior.
So I would say that’s really the next step: as we start to improve the accuracy and the resolution of some of these models, as I described before, I think that’ll start to get us to a very defensible position for taking action and doing mitigative steps to address any sort of climate change issues that we’re facing.
So, I think what’s next is, like I said, there’s still room to grow on these models. They’re a lot faster, but they can get better. And so the iterative process that Greg was describing earlier, you know, we’ll continue doing that. And as these models become fairly well understood and adopted — ’cause, I mean, no matter how good it is, unless it’s used and embraced by the scientific community, you’re not going to get any real traction — as that happens, we think we can really start to drive real policy, real decision making, real mitigation and adaptation strategies that can hopefully overcome some of the issues we think we may be facing.
John Koetsier: Well, I hope that’s certainly the case, because we’ve certainly had a lot of people saying, “My gut trumps your data,” [laughing] right? Or, “My convenience trumps your freedom to breathe in 20 years,” or something like that. But hey, we’ll see how those things go. Greg, I want to thank you. Dion, I want to thank you. This has been super interesting. Thank you for your time.
Dion Harris: Absolutely.
TechFirst is about smart matter … drones, AI, robots, and other cutting-edge tech
Made it all the way down here? Wow!
The TechFirst with John Koetsier podcast is about tech that is changing the world, including wearable tech, and the innovators who are shaping the future. Guests include former Apple CEO John Sculley. The head of Facebook gaming. Amazon’s head of robotics. GitHub’s CTO. Twitter’s chief information security officer. Scientists inventing smart contact lenses. Startup entrepreneurs. Google executives. Former Microsoft CTO Nathan Myhrvold. And much, much more.
Consider supporting TechFirst by becoming a $SMRT stakeholder, connect to my YouTube channel, and subscribe on your podcast platform of choice: