AI consumes more electricity than a Google search

 

Studies predict that artificial intelligence, including generative and language models like ChatGPT, along with data centers and cryptocurrencies, may double their electricity consumption by 2026, and that a single AI query already uses far more energy than a typical Google search. According to the May 2024 report by the Electric Power Research Institute (EPRI) and the January 2024 analysis by the International Energy Agency (IEA), one Google search currently consumes an average of 0.3 watt-hours of electricity, while a request to OpenAI’s ChatGPT uses around 2.9 watt-hours, roughly ten times as much.

 

With about nine billion searches made daily, answering all of these requests at ChatGPT’s rate would amount to almost 10 terawatt-hours (TWh) of electricity in a year. This covers only text-based queries; generating music, images, and videos from text prompts, along with other AI applications, is expected to require far more energy than the language models already consume. EPRI’s recent report adds that generative AI models and other energy-intensive platforms such as cryptocurrency are beginning to see widespread use, and evidence about their electricity demand is only starting to emerge.
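For a sense of scale, the arithmetic behind these figures is straightforward. The minimal Python sketch below uses only the per-query averages and the nine-billion-daily-searches figure cited above; the unit conversions are standard, and the rounding is ours rather than the reports’.

# Back-of-envelope arithmetic for the figures cited above.
GOOGLE_WH_PER_QUERY = 0.3    # average electricity per Google search, in watt-hours
CHATGPT_WH_PER_QUERY = 2.9   # average electricity per ChatGPT request, in watt-hours
SEARCHES_PER_DAY = 9e9       # roughly nine billion searches a day
DAYS_PER_YEAR = 365
WH_PER_TWH = 1e12            # 1 terawatt-hour = 10^12 watt-hours

ratio = CHATGPT_WH_PER_QUERY / GOOGLE_WH_PER_QUERY
annual_twh = CHATGPT_WH_PER_QUERY * SEARCHES_PER_DAY * DAYS_PER_YEAR / WH_PER_TWH

print(f"ChatGPT query vs. Google search: ~{ratio:.0f}x the electricity")          # ~10x
print(f"9 billion daily queries at ChatGPT rates: ~{annual_twh:.1f} TWh per year") # ~9.5 TWh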

OpenAI’s text-to-video model, Sora | video still courtesy of OpenAI

 

 

Doubled energy consumption of generative AI like ChatGPT

 

The IEA report states that the AI server market is currently dominated by the technology firm NVIDIA, with an estimated 95 percent market share. In 2023 alone, NVIDIA shipped 100,000 AI server units, which together consume an average of 7.3 TWh of electricity annually. The analysis projects that by 2026, generative AI models like ChatGPT will consume at least ten times the electricity they did in 2023. Cryptocurrencies, meanwhile, consumed around 110 TWh of electricity in 2022, accounting for 0.4 percent of global annual electricity demand, and the IEA anticipates that this will rise by more than 40 percent, to around 160 TWh, by 2026.
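Taken at face value, the fleet-level figure also implies a rough per-server average. The sketch below is a back-of-envelope estimate derived from the numbers quoted above, not a figure stated in either report; it simply divides the annual consumption across the shipped units and checks the projected cryptocurrency growth.

# Derived estimates from the IEA figures quoted above.
NVIDIA_UNITS_2023 = 100_000     # AI server units shipped in 2023
FLEET_TWH_PER_YEAR = 7.3        # combined annual consumption of those units, in TWh
HOURS_PER_YEAR = 8_760

avg_power_kw = FLEET_TWH_PER_YEAR * 1e12 / NVIDIA_UNITS_2023 / HOURS_PER_YEAR / 1e3
print(f"Implied average draw per AI server: ~{avg_power_kw:.1f} kW")     # ~8.3 kW

CRYPTO_TWH_2022 = 110
CRYPTO_TWH_2026 = 160
growth = (CRYPTO_TWH_2026 - CRYPTO_TWH_2022) / CRYPTO_TWH_2022
print(f"Projected cryptocurrency growth, 2022 to 2026: ~{growth:.0%}")   # ~45 percent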

 

As an overview, the computing equipment in data centers, which store information, run websites, and power generative AI models like ChatGPT, accounts for about 40 percent of a facility’s electricity use. All of these computers generate a lot of heat, so to keep them running smoothly, data centers rely on powerful cooling systems to prevent overheating; this constant cooling uses roughly another 40 percent of the total. The remaining 20 percent goes to the other equipment that keeps everything working, from security systems to backup generators in case of a power outage.
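To make the split concrete, here is a minimal sketch that applies the approximate 40 / 40 / 20 breakdown described above to a hypothetical facility; the 100,000 MWh figure is an invented example, not a number from either report.

# Approximate breakdown of a data center's electricity use, per the split above.
BREAKDOWN = {"computing": 0.40, "cooling": 0.40, "other equipment": 0.20}

def split_facility_demand(total_mwh: float) -> dict[str, float]:
    """Split a facility's total annual electricity use (in MWh) by category."""
    return {category: share * total_mwh for category, share in BREAKDOWN.items()}

# Hypothetical facility drawing 100,000 MWh in a year.
for category, mwh in split_facility_demand(100_000).items():
    print(f"{category}: ~{mwh:,.0f} MWh")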

Sam Altman on Bill Gates’ Unconfuse Me podcast | video still courtesy of GatesNotes and YouTube

 

 

Finding ways to make data centers consume energy more efficiently

 

To keep the energy consumption of AI, cryptocurrencies and data centers as efficient as possible, the EPRI study suggests using advanced scheduling and resource allocation to reduce the time data servers spend drawing electricity while their computational capacity sits underused. It also proposes virtualization and containerization. The former divides a single physical server into multiple virtual machines, which different companies can rent to store and process their data with their own software.

 

The latter packages applications and their dependencies into lightweight, isolated containers that share a single server’s operating system, so more workloads can run side by side on the same machine. In both cases, the underutilized computational capacity of data servers can be put to full use, and the electricity they draw is spent more efficiently (a 20-percent reduction per server is foreseen). Data centers, which AI platforms and apps also rely on, might additionally be able to buy and sell electricity based on real-time prices, much like stocks are traded. This could help data centers save money and help the power grid run more smoothly: by using less power during peak times, when everyone else is drawing a lot of electricity, data centers can reduce strain on the grid and help prevent blackouts.
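As a rough illustration of what consolidation through scheduling can look like in practice, the sketch below packs small workloads onto as few servers as possible so the rest can be powered down. This is a simplified example, not the EPRI study’s actual method; the server capacity, workload sizes, and idle-versus-busy power figures are invented for the demonstration.

# Simplified workload consolidation: first-fit decreasing bin packing, so that
# underutilized servers can be powered down. All numbers are illustrative only.
IDLE_POWER_W = 200      # assumed draw of a mostly idle server
BUSY_POWER_W = 500      # assumed draw of a fully loaded server
SERVER_CAPACITY = 1.0   # each server offers one unit of compute capacity

def consolidate(workloads: list[float]) -> list[list[float]]:
    """Assign workloads (fractions of one server) to servers, largest first."""
    servers: list[list[float]] = []
    for job in sorted(workloads, reverse=True):
        for assigned in servers:
            if sum(assigned) + job <= SERVER_CAPACITY:
                assigned.append(job)
                break
        else:
            servers.append([job])   # no existing server has room: power up another one
    return servers

def estimated_draw(servers: list[list[float]]) -> float:
    """Rough power model: idle baseline plus load-proportional draw per active server."""
    return sum(IDLE_POWER_W + (BUSY_POWER_W - IDLE_POWER_W) * sum(s) for s in servers)

jobs = [0.25, 0.5, 0.25, 0.125, 0.75, 0.5, 0.25, 0.375]   # eight small workloads
spread = [[job] for job in jobs]                          # one job per server, mostly idle
packed = consolidate(jobs)

print(f"servers in use: {len(spread)} -> {len(packed)}")                                   # 8 -> 3
print(f"estimated draw: {estimated_draw(spread):.0f} W -> {estimated_draw(packed):.0f} W")  # 2500 W -> 1500 W

In this toy example, the same workloads end up on three fully used servers instead of eight mostly idle ones, cutting the estimated draw by about 40 percent; the real savings depend on the hardware and workload mix, but this is the direction the study’s per-server estimate points in.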

render of AI and GPU processors | photo by Igor Omilaev via Unsplash

round silver and gold coins portraying cryptocurrencies | photo by David McBee via Pexels

 

 

project info:

 

institutions: Electric Power Research Institute (EPRI), International Energy Agency (IEA)

studies: Powering Intelligence (May 2024), Electricity 2024 (January 2024)