AI’s Energy Demands: The Future of Data Centers and Grid Challenges


ENB #223 AI’s Energy Demands: The Future of Data Centers and Grid Challenges

STU TURLEY

In this Energy News Beat – Conversation in Energy, Stu Turley interviews Riley Trettel, VP of Data Center Development at Hut 8, about the growing energy demands driven by AI and Bitcoin mining and the challenges facing U.S. grid operators. Riley explains Hut 8’s focus on building large-scale data centers for AI model training, emphasizing the importance of scalable power interconnections and the role of microgrids, natural gas, and nuclear power in meeting future energy needs. They also discuss AI biases, alignment issues, and the rapid advancement of AI technology, highlighting the potential for a transformative future in energy and computing.

#aiinenergy #ai #nuclearpower #naturalgas

Check out Hut 8 Here: https://hut8.io/

Highlights of the Podcast

00:00 – Intro

01:00 – Hut 8’s Role & AI Data Centers

02:30 – Future Grid Capacity Needs

03:47 – AI Energy Consumption & Global Impacts

05:36 – AI Deep Learning & Transformers

09:21 – AI Bias & Model Alignment

12:28 – Nuclear Power’s Role in Data Centers

14:21 – Microgrids & Energy for Data Centers

19:05 – Natural Gas as a Solution

22:53 – Hut 8’s Future & AI Training

26:50 – Closing Remarks & Contact Information


Automated Transcript and we disavow any errors unless they make Stu sound smarter or better looking.

Stuart Turley [00:00:07] Hello, everybody. Welcome to the Energy News Beat podcast. My name’s Stu Turley, president and CEO of the Sandstone Group. Today is just an outstanding day because I get to talk to a seriously fun expert here. We’ve got Riley Trettel. He is the VP of Data Center Development at Hut 8. But before we jump in, there are things going on in the United States grid that are very critical. We have AI that is driving so much demand. We have Bitcoin mining that is driving even further demand. And grid operators have a gigantic problem: how to fund this, and how to integrate things in. I just enjoyed getting to chat with Riley here over this, and this is going to be an outstanding energy discussion. Thank you, Riley, for hopping on the podcast today.

Riley Trettel [00:01:00] Thanks so much for having me, Stu. Really honored to be here.

Stuart Turley [00:01:03] Well, I tell you what, I really enjoy this. Tell us a little bit about what you’re doing there in Data Center Development at Hut 8. Tell us what that is.

Riley Trettel [00:01:13] So, Hut 8 is a digital infrastructure company. Today we have about a 1.3 GW footprint spread across North America, consisting of roughly 300 MW of power generation assets and about 1,000 MW of data centers. The data center development arm specifically is today targeting building large-scale data center campuses for AI training, for building the next generation of AI models that the largest tech companies in the world are trying to deliver to consumers. The way we got into this business of building these AI supercomputers is that, with our existing footprint and the pipeline we’ve developed over the years growing to the scale we have today, we have quite a few very large assets that are particularly well-suited for building these AI supercomputers. And as these computers scale, and as the use case for large power interconnects to the grid and to power plants directly becomes more valuable, we are uniquely positioned to provide that to the market, given that the traditional way of interconnecting and building these data centers is not quite fast enough to meet the demands the market has put forward for new models and new capabilities.

Stuart Turley [00:02:30] And I believe even the EIA has said, and I’ll have to fact-check myself, there are about 4 or 5 different reports, but one of the reports has said that ERCOT has to double its capacity within the next five years. You know, it took us 100 years to get here.

Riley Trettel [00:02:46] Yeah, I would say that a lot of the estimates of where capacity is going to go are undershooting what’s really going to happen. I mean, on a long, long time scale, a thing that I’m very passionate about is increasing energy consumption and the efficient use of that energy for useful technologies globally. That means generating large amounts of power and directing it into useful technologies, into things that improve quality of life, and that scales far beyond, you know, ERCOT’s or PJM’s or MISO’s capacity and so on, into the long-tail future of humanity. There are serious considerations about how much power will be consumed by computing as a percentage of the overall power that we produce. Today it’s only about 1%; that could scale up toward 99% over a very long-tail future. So we’re really just focused here on taking the power interconnects we have today, taking the pipeline of interconnects we’ve developed, and dropping the most useful technologies that improve quality of life onto those, taking those assets and directing humanity’s energy into things that can improve our society. That’s what we’re focused on and building today, and we think this is something that’s going to scale for hundreds, if not thousands, of years into the future. We’re really excited to be here in this new technological industrial revolution that’s happening. Geopolitically, things are moving and shaking in the energy world. Things are definitely moving and shaking in computing. We’re seeing a huge renaissance in data and software-side technologies that are driving physical use of energy and computing to very large scales. People in Silicon Valley are already talking about 4-gigawatt, 40-gigawatt computers, and the system was definitely not built to handle those types of loads. So we’re really trying to know where the puck is going and really understand the fundamental technology being deployed here and where it’s going to go in the future, as opposed to what’s needed now. So it’s fun to discuss the history of AI, and more broadly machine learning and deep learning, to understand why we are here and where we are going. Deep learning itself is simply the act of a computer learning, and being reinforced on, a pattern in data. You push data through a computer, it learns the patterns in that data, and then when the computer encounters new data, it will be able to recognize those patterns and give you a result. The early days of deep learning were very small models, very small datasets, and small compute. The result of that research was, wow, this computer learns these patterns fairly well; let’s try it with more data and with a larger computer. And then people were really surprised. They said, wow, this thing is pretty good at recognition. Let’s throw even more data at it. This has continued for 15 to 20 years, right, until we got to the point where they said, all right, let’s scrape the entire Internet for every single piece of text we can possibly get our hands on, every single piece of human interaction. Let’s boil that down into a massive dataset and let’s design a model that can take advantage of the connections within that dataset. And there was a really big innovation that happened with something called a transformer.
So what a transformer does, in a very simple sense, is increase the total context window that the model can think in. The models of yesteryear that operated sequentially could only think from one word to the next. Whereas with transformers, the model can now think about what it was saying 80 or 800 words ago, and it can also look forward 80 or 800 words to decide what it should say now based on what it’s going to say in the future. So it has a much greater attention span, and it selects its words more carefully based on that attention. This is what drove the huge revolution in large language models, which we now see with OpenAI becoming more and more ubiquitous in the business world. At least, I use it every single day to improve my productivity. And if you don’t use it, or if anyone listening doesn’t use it, you should try it out and you should see the capability it has. It can speak fairly well. But this is just the next iteration of the AI experiment, which is more data, a larger and more capable model, and a bigger computer, and it came out with a good result. So as we look into the future, you know, people are starting to think about how far this pattern recognition theme, this experiment, can truly go. If we improve the size and the technology and architecture of these models, and we feed them more, higher-quality, higher-fidelity data such as audio and video alongside text, what are these things going to be able to produce in terms of images, videos, speech, visualizations? What is it going to do for robots and for cars that navigate through the real world? The traditional way of thinking, and how this experiment has gone in the past, is that with more data and a better model, it has produced a very good result. And so you have companies like Google, Microsoft, Amazon, and Meta who essentially see the road converging toward an ultra-capable model, an AI agent, so to speak, which people like to talk about as artificial general intelligence or artificial superintelligence. Whether it reasons or not, there could be a very powerful AI hidden in here if you build the compute, build the model, and collect the data in order to train it. And a lot of these companies view one of their competitors having this AI, having access to it and building it, as an existential threat to their own business, right? So it’s essentially the most expensive nonmilitary technological arms race that’s ever taken place in the history of humanity. And the leaders of these huge companies are willing to spend a vast amount of capital to not be second place, or not be third, Stu, to one of their competitors.
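[Editor’s note: the attention idea Riley describes can be sketched in a few lines of code. The toy example below, in Python with NumPy, is purely illustrative; the sequence length, dimensions, and random weights are placeholder assumptions, not drawn from any real model or from Hut 8.]

import numpy as np

# Toy self-attention: every token position can draw on every other position,
# which is the wider "context window" described above, in contrast to strictly
# sequential models that only carry information word to word.

rng = np.random.default_rng(0)
seq_len, d_model = 8, 16                      # 8 tokens, 16-dim embeddings (arbitrary)
x = rng.normal(size=(seq_len, d_model))       # stand-in for token embeddings

# These projections are learned in a real transformer; random here for illustration.
Wq, Wk, Wv = (rng.normal(size=(d_model, d_model)) for _ in range(3))
Q, K, V = x @ Wq, x @ Wk, x @ Wv

scores = Q @ K.T / np.sqrt(d_model)                        # token-to-token relevance
weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
weights /= weights.sum(axis=-1, keepdims=True)             # softmax over the whole sequence
out = weights @ V                                          # each output mixes all positions

print(weights.shape)   # (8, 8): token i attends to tokens 0..7, before and after it

The key point is the (8, 8) weight matrix: every word’s output is a weighted mix of every other word in the window, past and future, rather than only the immediately preceding word.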

Stuart Turley [00:09:21] Being second is not too bad; being third, you’re out. But I’ll tell you, I’ve enjoyed watching the evolution, and there are about 19 questions I’ve had. You’ve had great, great comments here and I’m trying to get to them, but I’ve enjoyed watching Grok on X evolve, and I really, really enjoy how useful it is for me. I use Grok on X all the time and it is a phenomenal research tool for me. And I think Elon has done a great job getting his team doing that without a lot of the biases that I’ve seen in ChatGPT and all those. Early on it looked like they were trying to change history, because their machine learning was actually preprogrammed in certain ways. And when you have that in as it’s trying to learn... Hats off to Elon. I think that he has done a great job with Grok. I don’t know how it stacks up with everybody else’s opinion, but in my personal use of the others, their biases were coming across. It was very evident when I’m asking research questions of ChatGPT or any of those others, because I believe we should use the lowest-cost kilowatt hour to deliver to everyone on the planet to eliminate energy poverty. And I really don’t care what form of energy we use, whether it’s wind or solar, but if it’s not good for the environment, let’s not use it. Wind is not always the best solution to use, and that’s what gets me down. And then when you see the biases coming back out, you can see that they have biases built in. How do you eliminate biases? Does that make sense?

Riley Trettel [00:11:15] Yeah, it does make sense. And you’re spot on that Elon has done an excellent job on this front. He was able to spin up Grok relatively quickly. He was a bit behind the eight ball compared to some of the other companies, but today Elon will probably energize the largest AI supercomputer before the others. So he’s making very good progress.

Stuart Turley [00:11:36] And that’s great.

Riley Trettel [00:11:37] And that’s going to be a 100,000-GPU cluster, a very capable machine. So when that is fully spun up and really begins its training cycles, it’s going to enable much larger model architectures. And you’re going to see in real time, and I’m glad that you’re a Grok user, so you’ll see in real time, the capability of that system increase. But what you’re touching on there about how the model thinks and its biases is a huge conversation called alignment. Alignment was born out of AI safety research, and essentially a lot of people know far more about this technology than I do; I’m not the most equipped to talk about AI alignment. You’d have to ask the researchers at the universities and in the big tech companies who are dedicated to this.

Stuart Turley [00:12:27] Let’s get Elon on the podcast: you, me, and him. Let’s do that.

Riley Trettel [00:12:31] That would be great. I’d love to discuss alignment with him; he has very good thoughts. But you know, Elon acknowledges that this AI could be very dangerous if it’s not aligned with humanity, if it has improper incentives. And I think what is going to happen is what we’re already seeing play out. We have models that are released by various folks, and they have different biases in them. But naturally, because of that, you have different choices in the market for which model is right to use. And so maybe there won’t be a single model that’s perfectly unbiased and perfectly aligned with humanity. Humanity has a very wide range of thoughts and ideas and beliefs. And so, yes, we have different models trained by different people, trained on data that comes from different experiences, and then you have a selection in the market. But the key is that it remains decentralized, that there remains a race toward the top, as long as no one individual company has the master model or the most capable master supercomputer at any given time. We’re going to continue to see that variety, and that should definitely help with alignment. But on the energy front, you touched on a good point about what energy is going to be needed to build these supercomputers and, you know, the renewable attributes of it, or how good it is for the environment, so to speak. The most logical answer is definitely nuclear for these workloads. You’re seeing a lot of it in the marketplace, and you’re seeing Constellation fire up the Three Mile Island plant for Microsoft.

Stuart Turley [00:14:04] I did not have firing up the old Three Mile Island on my bingo card last week or the week before. I did not see that one coming. But it is cool. We need to fire them all up, fire all the old nuclear plants back up.

Riley Trettel [00:14:20] It’s amazing. I love nuclear. Nuclear is really what built the United States transmission system into what it is today. Our transmission system is insanely beautiful, highly capable and complex, and it has such high capacity and such high reliability because it was really built around these huge nuclear assets. So to see that technology have a renaissance is amazing. However, there’s a big headwind that everyone in this industry trying to develop these large data centers is facing, and that is regulatory. It’s not necessarily that people don’t want to see these data centers be built, but there’s a lot of red tape and paperwork, and there are a lot of different stakeholders in the process of getting connected to the system at this large a scale. This system is built around the largest power plants. Think of the Grand Coulee Dam in Washington, the Palo Verde nuclear plant in Arizona, or Vogtle in Georgia: roughly six, four, and four gigawatts. Those are the largest single generating assets in the entire system, and they are extremely robust. The transmission system essentially emanates from plants of that size. So when we’re talking about data centers at the 5 to 10 gigawatt scale, the system wasn’t really built or designed to serve a load of that size. And so what I think will happen, where we think the puck is going and where the work we’re doing on this solution is focused, is, it’s funny to say this, but microgrids. People have talked a lot about microgrids for the past decade, but the largest and most capable microgrids, coming up completely isolated from the system, are going to be built to serve these workloads. When you look at renewables, everyone wants to build solar in southern Arizona, interconnect to WECC, and sell power to the grid. The interconnection queue on that is extremely long, and people are looking at 7 to 8 years to interconnect. And so there are a lot of people with modules onshore, with equipment, with tens of thousands of acres of land, that can deploy these assets. Whereas what is a data center? A data center for training AI, or for other use cases in a large concentrated fashion, is essentially an analog of the consumer load profile of the grid: it may fluctuate up and down from time to time, but it requires a high level of reliability, and it’s also a buyer of power 24/7, just like consumers on the grid are. So there’s a world where you can build a massive islanded solar plant on tens of thousands of acres in Arizona, for folks who have modules on hand faster than they can deploy them interconnected into the system, and you can set up a PPA with that solar facility, an offtake that is just as financeable as connecting it to the system, where either the data center or a massive battery storage system is buying all the power that comes off the solar farm. The solar farm doesn’t experience any curtailment; it has a fixed-price or variable PPA, based on where it’s located, with the data center and a battery. And the key thing to understand is that a data center already has a battery. Data centers as they’re built today, traditional data centers for these types of workloads, have a UPS, an uninterruptible power supply. That is something that maintains a steady, constant flow of power to the computers if there are frequency deviations or if the grid goes down.
And while we’re waiting for backup generators to fire up, the battery in that UPS serves the data center in that period of time. Data centers are built today with roughly five-minute-duration batteries, typically 5 to 9 minutes depending on the start-up time of the generators. But the solar farm can make its economics pencil as long as we buy all of its power, and we can make the economics of a larger-capacity battery work if it means we are buying power that was sourced 100% from that islanded solar farm. And if this is the only way to energize a 5 to 10 gigawatt data center in any meaningfully short period of time, we can do that; if you wanted to actually connect it to the system, it could be 10 to 15 years, and you’d have to redesign massive parts of the transmission and generation to do that. So there is a world where, yes, nuclear is the most logical solution, but because of the regulatory headwinds that nuclear faces, and because of the NIMBY problems that nuclear has, it may be easier for data centers to build islanded solar plus storage. And that’s really kind of the point.
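[Editor’s note: a rough, back-of-the-envelope illustration in Python of the battery arithmetic Riley describes, comparing a short UPS ride-through with carrying an islanded load through the night. The load size, ride-through minutes, and overnight hours are placeholder assumptions, not Hut 8 figures.]

# How much usable battery energy a UPS needs to bridge the gap until
# backup generators start, versus carrying the same load overnight.

def ride_through_energy_mwh(load_mw: float, minutes: float) -> float:
    """Energy the UPS battery must deliver while generators spin up."""
    return load_mw * minutes / 60.0

load_mw = 1_000.0                      # hypothetical 1,000 MW campus
for minutes in (5, 9):                 # typical generator start-up windows mentioned above
    mwh = ride_through_energy_mwh(load_mw, minutes)
    print(f"{minutes} min ride-through -> ~{mwh:.0f} MWh of usable battery")

# An islanded solar-plus-storage system that must carry the same load through
# a night with no solar output is a very different battery problem:
overnight_hours = 14                   # assumed hours without meaningful solar
print(f"overnight carry -> ~{load_mw * overnight_hours:,.0f} MWh of usable battery")

With these placeholder numbers, the UPS only needs on the order of 80 to 150 MWh, while an overnight carry is on the order of 14,000 MWh, which is why the islanded approach pairs the data center’s existing UPS with a much larger dedicated storage plant.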

Stuart Turley [00:19:04] This is very interesting, because I think you’re right on track. I think microgrids are going to be the way that data centers are going to be able to do that. But I think the winning formula is based on physics, because physics and fiscal responsibility matter. And I think that you’re going to see a mix of those that can come up with natural gas power plants, funding for storage, and solar to offset for tax credits. That’s the winning formula until we can get the regulatory process right. Right now, the current Harris-Biden administration has cost the taxpayers more than $4 trillion in regulatory issuance across all of the sectors. It is absolutely horrific what legislation through regulation has happened, and I think you’re on the right track with this, to be in that microgrid. But boy, natural gas is the shortcut to getting to where we need to be with that, in my opinion. Is that a fair statement?

Riley Trettel [00:20:14] That’s a completely fair statement. I was just talking about islanded systems for building five-gigawatt clusters, but in the immediate term the market need is really for a 500 MW to 1 GW cluster size, and the transmission system today is capable of serving loads of that size. There are several Bitcoin mines in Texas of that size. So this is something that we can do today, and we spend a lot of time working with regulated utilities and with state public service commissions to energize and develop loads and interconnects of that size, and natural gas plays a very key role in getting those loads online. We have a lot of utilities today seeing load growth from the industrial sector, from a reshoring of a lot of our domestic production capacity of chemicals, metals, and things that are of strategic importance, and also data center load growth. Most utilities in the country are short generation, and so connecting very large data centers to the system isn’t feasible from the public service commission’s perspective, because then the ratepayers would be subsidizing a lot of those costs. And so while data centers may be looking at 5 or 6 years to interconnect because there isn’t any generation, the transmission may effectively be built, but there isn’t enough generation to serve the load. So gas plants that can be deployed quickly, you know, turbines that are onshore and ready to be deployed, solve a very particular need, in that we can go directly to a utility and say: we will build the data center, and we will also build the gas plant and the generation required to match that load. Essentially we can take advantage of these strong pockets of transmission, and by dropping in that additional generation, we can achieve speed with that utility. Because natural gas can be deployed so quickly, and because there are a lot of areas where you have pipeline capacity, where you can get fuel, jurisdictions where you can get air permits, locations on good fiber routes and in strong areas of the transmission system, this does solve the need for data centers, and it is also going to bolster the generation capacity of the system more broadly. And really, at the end of the day, the goal of the public service commission and the utility is to create a strong and reliable system that serves ratepayers. The grid is good for keeping people healthy, keeping people safe, and giving us good quality of life. And so if these data centers are looking for speed, then natural gas is definitely an answer to part of the problem that we’re facing.
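[Editor’s note: a small illustrative sketch in Python of the “build generation to match the load” arithmetic Riley describes. The PUE, turbine rating, and N+1 redundancy below are generic placeholder assumptions, not Hut 8 or utility figures.]

# Rough sizing for a bring-your-own-generation data center: how many gas
# turbines it might take to match a given IT load, including facility overhead.

import math

def gas_fleet(it_load_mw: float, pue: float, turbine_mw: float, spares: int = 1) -> tuple[float, int]:
    """Return (total facility load in MW, turbine count with N+spares redundancy)."""
    facility_mw = it_load_mw * pue                      # IT load plus cooling and overhead
    turbines = math.ceil(facility_mw / turbine_mw) + spares
    return facility_mw, turbines

# Hypothetical example: 500 MW of IT load, PUE of 1.3, 50 MW turbines, N+1.
facility_mw, turbines = gas_fleet(it_load_mw=500, pue=1.3, turbine_mw=50, spares=1)
print(f"~{facility_mw:.0f} MW facility load -> {turbines} x 50 MW turbines (N+1)")
# With these assumptions: ~650 MW facility load -> 14 x 50 MW turbines (N+1)

The same arithmetic scales to the 500 MW to 1 GW cluster sizes mentioned above, which is why pipeline capacity, air permits, and turbine availability end up gating the schedule.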

Stuart Turley [00:22:52] I tell you what, Riley, I have really enjoyed our conversation. I hope to have you back again and visit with you on this. In fact, I’ve got a couple of data center podcasts coming up, and I’d love to have you as a guest on those as a panelist; we’ll get that over to you. What do you see coming around the corner for Hut 8?

Riley Trettel [00:23:13] Yeah, so today Hut 8 is focused on developing data centers for AI training. Coming around the corner for us is continuing to build out that business: build these large-scale campuses, take the campuses we have today that are particularly well-suited for this technology, as well as the ones coming down the pipeline fairly soon, within the next 12 months, and try to build the most capable training facilities that we can. We’re really trying to solve the AI training and inference capacity shortfall that exists today. That is what’s coming down the pipe right now, and what we’re all working very hard toward, and we’re very excited about it. On a longer time scale, what Hut 8 is building is essentially a new type of company. People in energy will be familiar with the concept of an IPP, an independent power producer that builds power plants; this is essentially an inverse IPP, in that we are building load interconnection capacity and powered land at the scale of, if not larger than, the largest power plants in the system today. When we think about developing high-powered land and having these interconnections, it’s really around the central theme and motivation that drives a lot of folks within the company, which is increasing energy usage, driving it toward useful technologies, and improving quality of life. And that starts with having these large-scale assets that can consume power. Today the most profitable thing and the most valuable use of that energy is to build these large AI training data centers, but that use case may shift in the future. There may be other applications, such as synthetic natural gas or hydrogen storage, that these assets could be used for. We are really just trying to position ourselves as the inverse IPP, where we can take these assets, these interconnections, and consume power from the system strategically to drive it toward technologies that humanity values. So with this technological revolution that’s happening, and with the strides we’re seeing in the space industry, we’re just very excited about where humanity is going and where the United States in particular is leading the charge globally to drive new technologies. And the key to that is making our transmission system capable of deploying massive amounts of energy toward technologies that are strategically important for the United States, our allies, and our interests globally. So we’re very excited to be where we’re at today and building what we’re building. It’s going to be a very interesting future in the short term to see these large AI training facilities be built, and the capability that AI is going to have when these things come online is going to be just mind-blowing. I think it’s really going to drive a lot of efficiency and a lot more quality human interaction into our society. People are going to be able to focus on higher-level things that maybe the AI can’t do. Maybe the AI isn’t going to be able to think or rationalize; it’s going to be very good at recognition, and it will be able to do a lot more detailed analysis of things that humans just aren’t good at, which then frees up humans to do the things that we are good at, you know, more artistic and more creative efforts to really improve society.
So we’re excited to see that in the short term, and in the long term we’re excited to see where this can go and what humanity can really do if we start to maximize the use of energy.

Stuart Turley [00:26:50] Well, that’s really cool, Riley. How do people get a hold of you?

Riley Trettel [00:26:53] So I’m not super active on social media. It is something that I want to start working on, and I’d like to start putting out a little bit more content. I do have a LinkedIn profile, which you may be able to link in the show notes here, and I have an X account which is not very active. I do a lot of reading; I don’t do a lot of talking. I like to sit and observe and take in information. So I guess maybe on a future episode, if I come on, Stu, I’ll have a more developed social media profile that I can advertise.

Stuart Turley [00:27:24] Well, just LinkedIn is great for the business environment, and I think that you really are going to go very far in life. I can tell by the way that you think rather than just run down the road, and by what you’re doing. I’m just really excited to have gotten to meet you, I’m excited to find out about what’s going on today, and I’m looking forward to our future conversations. So thank you for stopping by the podcast today.

Riley Trettel [00:27:49] Yeah, I really appreciate the kind words, Stu. It’s great to meet you as well. I’d love to come on future episodes; there’s a lot more to dive into, especially with regard to energy, the market today, and how the market is going to evolve. There’s a lot going on and we’re really excited about it, but I appreciate you having me on.

Stuart Turley [00:28:08] Sounds great. Talk to you soon.

This article (AI’s Energy Demands: The Future of Data Centers and Grid Challenges) was created and published by Energy News Beat and is republished here under “Fair Use” with attribution to the author, Stu Turley.

