Powering AI could use as much electricity as small country: Researcher

(NewsNation) — Researchers have pointed to privacy concerns about the rapid growth of artificial intelligence, but emerging research suggests that, left unmanaged, the technology also poses major sustainability threats.

Alex de Vries, a researcher at VU Amsterdam and the creator of the tech and economy site Digiconomist, says powering AI could use as much electricity as a small country.

De Vries’ research article “The Growing Energy Footprint of Artificial Intelligence” was published in the journal Joule in October, highlighting the potential consequences of AI-related electricity consumption. He spoke with NewsNation on Tuesday about the “hype” surrounding AI and whether it can be created and used sustainably.

The following conversation has been edited for clarity and brevity.

NewsNation: Can you explain what energy consumption looks like for the average person’s AI usage?

De Vries: If you’re using a service like ChatGPT, you’re not confronted with the energy cost, but in the background you have this really massive model with billions of parameters, trained on a massive amount of data, that’s serving you those responses. Ultimately, that’s where the energy consumption is: in setting up those really massive models and then serving them. Every time you interact with these models, you need to use all those billions of parameters just to come up with your response. If you’re a user of AI, that’s the part you’re missing.

NewsNation: What other tools might we regularly use that require similar energy consumption?

De Vries: It really depends. If you have an interaction with ChatGPT or something similar, then the footprint of that interaction, or the average energy consumed per interaction, will be just three watt-hours, which, by itself, is not a tremendous amount.

Three watt-hours is like running a low-lumen LED light for one hour. That’s not that massive. But if you’re doing this at something like Google scale, where you’re talking about 9 billion of those interactions every day, then of course it starts to add up.

The Alphabet CEO said that an AI-powered Google search will take 10 times more power than a standard Google search.
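The figures quoted above can be turned into a rough back-of-envelope estimate. This is only a sketch using the numbers mentioned in the interview (3 watt-hours per interaction, roughly 9 billion Google searches per day), not a figure from de Vries’ paper itself:

```python
# Back-of-envelope estimate from the figures quoted above:
# ~3 Wh per AI interaction, ~9 billion Google searches per day.
WH_PER_INTERACTION = 3             # watt-hours per AI-assisted query (quoted estimate)
SEARCHES_PER_DAY = 9_000_000_000   # approximate daily Google searches (quoted estimate)

daily_wh = WH_PER_INTERACTION * SEARCHES_PER_DAY
daily_gwh = daily_wh / 1e9             # 1 GWh = 1e9 Wh
annual_twh = daily_gwh * 365 / 1000    # 1 TWh = 1,000 GWh

print(f"Daily:  {daily_gwh:.0f} GWh")
print(f"Annual: {annual_twh:.2f} TWh")
# Roughly 27 GWh per day, or about 10 TWh per year -- on the order of
# a small country's annual electricity consumption.
```

Under these assumptions, fully AI-powered search alone would land in the terawatt-hour range annually, which is the "small country" scale the headline refers to.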

NewsNation: What are some of the challenges regarding energy usage as AI expands?

De Vries: Bigger tends to be better in AI. If you make a bigger model, the model is more robust and will perform better. But obviously, bigger models also mean you need more computational resources to set up and operate the model and, in turn, more energy to run the machines that serve it.

That is a bit of an unfortunate thing with AI, which is something we haven’t really seen before. With regular data centers, over the past decades, we have seen a tremendous increase in demand for digital products. But efficiency gains in hardware have, at least historically, kind of offset the extra need for power to serve all those new digital products that we’ve been creating.

But in AI, you suddenly have this really interesting dynamic where even if you give companies more efficient hardware or a more efficient model, they can just leverage that to make their models even bigger than before and create better-performing models than before. So that kind of negates your efficiency gains.

NewsNation: Are there standards and parameters in place to make sure AI development is ethical and sustainable?

De Vries: When it comes to sustainability, there’s only so much you can do. The general principle that I just mentioned, bigger is better, is always going to be true. So ethical development would mean that if you want to somehow limit your environmental impact, you have to put a stop somewhere, but then you risk that your competitor is going to make a bigger, better model that lures everyone away from your own business.

You can commit to doing this with green or renewable energy but then still, I think you’re just making things look better on paper.

Whenever we are increasing our energy demand, we tend to have to fuel that from the backup source, which is typically fossil fuels. So even if we put our renewable energy in AI, it just means that something somewhere else is going to have to be powered with fossil fuels instead.

NewsNation: As AI becomes more pervasive, is there a sustainable way to interact with it?

De Vries: What you can personally do is just limit your use of AI to your actual needs.

With ChatGPT, we have seen a lot of curiosity, people trying it out just because they happen to be curious, without any real purpose to using it beyond that. That could be viewed as wasting a bit of resources for curiosity reasons, although curiosity is not necessarily a bad thing.

But I think the first responsibility is with the companies to make products that make sense, not to put AI in just because everyone else is putting AI in their products.

They always want to put the buzzword into whatever they have. So if the current buzzword is AI, everyone wants to do AI. They actually forget step No. 1, which is trying to figure out what problem you’re actually trying to solve, rather than just slamming AI on something.

It definitely has the potential to be a solution to multiple things, but at the same time, it’s also a technology with inherent limitations that just isn’t going to be the magic bullet that fixes everything.
