Artificial intelligence is insatiable when it comes to electricity consumption – Kommersant

Rene Haas, CEO of the British company ARM, which made its name by minimizing smartphone power consumption and extending battery life, said the same must now be done for artificial intelligence, The Wall Street Journal reports.

AI models like ChatGPT are “simply insatiable” when it comes to electricity, the paper quotes Rene Haas as saying. The more information such models collect, the smarter they become, he says. But “the more information they collect to become smarter, the more energy they expend.”

Mr. Haas said that until this problem is resolved, breakthroughs in the field should not be expected. "It will be difficult to keep accelerating breakthroughs if the energy requirements of the large data centers where this research is done continue to grow and grow," he said.

According to Rene Haas, without more efficient use of electricity, "by the end of the decade, AI data centers could consume 20% to 25% of all US electricity needs. To be honest, that is hardly sustainable."

At the beginning of the year, the International Energy Agency noted that a single ChatGPT request consumes about 2.9 Wh of electricity. That is roughly the same as running a 60-watt incandescent bulb for three minutes, and about ten times more than a Google search requires. According to the agency's experts, electricity demand from the AI industry will grow at least tenfold in three years, from 2023 to 2026.
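As a rough check on those figures (the per-search estimate of about 0.3 Wh below is not stated in the article; it simply follows from the "ten times" comparison):

60 W × 3 min = 60 W × 0.05 h = 3 Wh, close to the 2.9 Wh cited per ChatGPT request
2.9 Wh ÷ 10 ≈ 0.3 Wh per Google search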

Read about the tasks the new version of the popular chatbot can handle in the article "ChatGPT has achieved an 'A'".

Victor Buk
