Monday, July 8, 2024

Energy-hungry AI: What is the hidden cost of your GenAI discovery?


According to a 2009 Google report, a traditional Google search consumes an average of 0.0003 kilowatt-hours (kWh) of energy, enough to run a 9-watt light bulb for about 2 minutes. By 2023, about 8.5 billion searches were being made on Google every day. That adds up to 2,550,000 kWh of electricity per day, roughly 2,000 times the electricity consumed by an average Indian in an entire year (1,255 kWh).

In May this year, Google announced that it would integrate AI into its search engine, powered by its most capable AI model, Gemini. According to Dutch data scientist Alex de Vries, who spoke to The New Yorker on the subject, a single AI-integrated Google search would consume about 10 times more energy, roughly 3 watt-hours (0.003 kWh), than a traditional search. At 8.5 billion searches a day, that works out to about 25.5 million kWh daily, roughly 20,000 times what an average Indian consumes in a year.
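As a quick sanity check, the figures above can be tallied in a few lines of Python. All inputs are the estimates quoted in this article (the 2009 per-search figure, de Vries' 10x multiplier, the 2023 search volume), not independent measurements:

```python
# Back-of-envelope check of the search-energy figures quoted above.
TRADITIONAL_KWH = 0.0003          # ~0.3 Wh per traditional Google search (2009 report)
AI_KWH = 10 * TRADITIONAL_KWH     # ~3 Wh per AI-integrated search (de Vries estimate)
SEARCHES_PER_DAY = 8.5e9          # estimated daily Google searches, 2023
INDIAN_ANNUAL_KWH = 1255          # average annual per-capita electricity use in India

daily_traditional = TRADITIONAL_KWH * SEARCHES_PER_DAY  # 2,550,000 kWh/day
daily_ai = AI_KWH * SEARCHES_PER_DAY                    # 25,500,000 kWh/day

# Express each daily total in "average-Indian-years" of electricity:
print(round(daily_traditional / INDIAN_ANNUAL_KWH))  # ≈ 2,032
print(round(daily_ai / INDIAN_ANNUAL_KWH))           # ≈ 20,319
```

The 20,000x figure, in other words, describes the daily total across all searches, not a single query.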

Alex de Vries, who is also the founder of Digiconomist, the organization behind the Bitcoin Energy Consumption Index, has said that if Google goes ahead with AI integration in its search, its energy consumption could reach around 29 terawatt-hours (TWh) per year. That is roughly the annual electricity consumption of Ireland, and more than that of Kenya.

Why are AI systems so power-hungry?

AI systems require enormous computational power to run complex algorithms over large and ever-growing corpora of data. When you enter a prompt in ChatGPT, the chatbot processes it on servers hosted in data centers. According to the International Energy Agency, these centers alone account for 1-1.5 percent of global electricity use.

“I think we still don’t understand the energy needs of this (AI) technology,” said OpenAI CEO Sam Altman at a public event in Davos this January. Altman stressed the urgent need for “leading-edge” technologies such as nuclear fusion to power AI operations, given the technology’s growth projections. This is a clear indication that the industry leaders behind large language model chatbots are looking for ways to sustain current levels of power consumption and secure supply for future demand.

AI’s carbon emissions – a setback for the UN’s 2050 net-zero emissions goal?

According to The Shift Project, a French nonprofit that works to reduce energy dependence on fossil fuels, data centers that power cloud computing and AI systems generate 2.5 to 3.5 percent of global greenhouse gas emissions. That’s equivalent to the entire aviation industry’s greenhouse gas emissions.

The energy consumption and consequent carbon footprint of different AI models varies considerably. For example, the BigScience project BLOOM, an AI model with 176 billion parameters (internal variables that the model learns during training), consumed 433 megawatt-hours (MWh) of electricity during training.

In contrast, OpenAI’s GPT-3 from 2020, which had a comparable number of parameters at 175 billion, consumed about 3 times more power at 1,287 MWh, according to data from the Artificial Intelligence Index Report 2024 published by the Stanford Institute for Human-Centered Artificial Intelligence (HAI).

The CO2-equivalent emissions, i.e. total greenhouse gas emissions expressed in terms of carbon dioxide, were 25 tonnes for BLOOM; for GPT-3, they were about 20 times higher, at 502 tonnes.
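The ratios implied by these AI Index figures can be checked directly (the four input numbers are the ones quoted above from the Stanford report):

```python
# Energy and emissions figures for BLOOM vs GPT-3, from the AI Index Report 2024.
bloom_mwh, gpt3_mwh = 433, 1287    # training energy, megawatt-hours
bloom_tco2, gpt3_tco2 = 25, 502    # CO2-equivalent emissions, tonnes

print(round(gpt3_mwh / bloom_mwh, 2))    # ≈ 2.97: GPT-3 used ~3x BLOOM's energy
print(round(gpt3_tco2 / bloom_tco2, 2))  # ≈ 20.08: and ~20x its emissions
```

The emissions gap is wider than the energy gap, which reflects differences in how carbon-intensive the electricity powering each training run was.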

Latest and breaking news on NDTV

The report also notes that there is a serious lack of transparency by AI developers regarding the environmental impact of their models, with most developers not making their carbon footprint public.

Google’s recently released Sustainability Report 2024 also reflects the energy hunger of this nascent AI technology. The company’s carbon emissions have risen by nearly 50 percent over the last 5 years, driven in part by the energy demands of its new AI technologies. Microsoft’s Sustainability Report 2024 shows a similar trend, with CO2 emissions up 29 percent against the company’s 2020 baseline.

According to SemiAnalysis, a US-based independent AI research and analysis company, AI will push data centres’ power consumption to 4.5 percent of global energy generation by 2030. The International Energy Agency estimates that the total power consumption of data centres could double from 2022 levels to around 1,000 TWh by 2026, roughly equivalent to Japan’s current power consumption. India has around 138 data centres, with 45 more expected to become operational by the end of 2025. The United States has the highest number of data centres, at 2,701.

Lawmakers are beginning to take stock of the situation. The European Union adopted a new regulation in March this year requiring all data center operators to report their energy and water consumption (water is used in cooling systems), along with information on the efficiency measures they are implementing to reduce both.

In February, US Democrats introduced the Artificial Intelligence Environmental Impact Act of 2024. The act proposes to establish an AI Environmental Impact Consortium of experts, researchers, and industry stakeholders to address the environmental impact of AI.
