ChatGPT may not be as power-hungry as once assumed

ChatGPT, OpenAI's chatbot platform, may not be as power-hungry as once assumed. But its appetite largely depends on how ChatGPT is being used and which AI models are answering the queries, according to a new study.

A recent analysis by Epoch AI, a nonprofit AI research institute, attempted to calculate how much energy a typical ChatGPT query consumes. A commonly cited statistic is that ChatGPT requires around 3 watt-hours of power to answer a single question, or 10 times as much as a Google search.

Epoch believes that's an overestimate.

Using OpenAI's latest default model for ChatGPT, GPT-4o, as a reference, Epoch found that the average ChatGPT query consumes around 0.3 watt-hours, less than many household appliances.
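As a rough sanity check on the scale of these figures, the gap between the old and new estimates, and the comparison to household appliances, can be sketched with some back-of-envelope arithmetic. The appliance wattages below are illustrative assumptions, not figures from the Epoch analysis:

```python
# Back-of-envelope comparison of Epoch's per-query estimate with everyday
# energy uses. Appliance figures below are illustrative assumptions only.

QUERY_WH = 0.3          # Epoch's estimate for an average ChatGPT query
OLD_ESTIMATE_WH = 3.0   # the commonly cited older figure

# Illustrative household reference points (watt-hours), assumed for scale:
led_bulb_hour_wh = 10.0     # a 10 W LED bulb running for one hour
microwave_minute_wh = 20.0  # a ~1,200 W microwave running for one minute

print(f"Old estimate vs. Epoch estimate: {OLD_ESTIMATE_WH / QUERY_WH:.0f}x")
print(f"Queries per LED-bulb-hour of energy: {led_bulb_hour_wh / QUERY_WH:.0f}")
print(f"Queries per microwave-minute of energy: {microwave_minute_wh / QUERY_WH:.0f}")
```

Under these assumed figures, dozens of ChatGPT queries fit in the energy budget of a minute of microwave use, which is the sense in which You calls the per-query cost "not a big deal."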

“The energy use is really not a big deal compared to using normal appliances or heating or cooling your home, or driving a car,” Joshua You, the Epoch data analyst who conducted the analysis, told TechCrunch.

AI's energy consumption, and its environmental impact broadly speaking, is the subject of contentious debate as AI companies look to rapidly expand their infrastructure footprints. Just last week, a group of over 100 organizations published an open letter calling on the AI industry and regulators to ensure that new AI data centers don't deplete natural resources and force utilities to rely on nonrenewable sources of energy.

You told TechCrunch that his analysis was spurred by what he characterized as outdated previous research. He pointed out, for example, that the author of the report that arrived at the 3-watt-hours estimate assumed OpenAI used older, less efficient chips to run its models.

[Chart: Epoch AI's estimate of ChatGPT energy consumption]
Image credits: Epoch AI

“I've seen a lot of public discourse that correctly recognized that AI was going to consume a lot of energy in the coming years, but didn't really accurately describe the energy that was going to AI today,” You said. “Also, some of my colleagues noticed that the most widely reported estimate of 3 watt-hours per query was based on fairly old research, and based on some napkin math seemed to be too high.”

Granted, Epoch's 0.3-watt-hours figure is an approximation as well; OpenAI hasn't published the details needed to make a precise calculation.

The analysis also doesn't consider the additional energy costs incurred by ChatGPT features like image generation or input processing. You acknowledged that “long input” ChatGPT queries, such as queries with long files attached, likely consume more electricity upfront than a typical question.

You said he does expect baseline ChatGPT power consumption to rise, however.

“[The] AI will get more advanced, training this AI will probably require much more energy, and this future AI may be used much more intensely, handling many more tasks, and more complex tasks, than how people use ChatGPT today,” You said.

While there have been remarkable breakthroughs in AI efficiency in recent months, the scale at which AI is being deployed is expected to drive an enormous, power-hungry expansion of infrastructure. In the next two years, AI data centers may need close to all of California's 2022 power capacity (68 GW), according to a RAND report. By 2030, training a frontier model could demand power output equivalent to that of eight nuclear reactors (8 GW), the report predicted.
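The RAND projections can be related back to Epoch's per-query figure with some simple, heavily caveated arithmetic. The reactor size and the queries-per-hour calculation below are illustrative assumptions, not claims from either report:

```python
# Rough arithmetic on the RAND projections cited above. The ~1 GW reactor
# figure and the sustained-load assumption are illustrative simplifications.

CALIFORNIA_2022_GW = 68     # California's 2022 power capacity, per the report
FRONTIER_TRAINING_GW = 8    # projected 2030 frontier-model training draw
REACTOR_GW = 1.0            # assumed output of one large nuclear reactor

print(f"Reactors equivalent to training draw: {FRONTIER_TRAINING_GW / REACTOR_GW:.0f}")

# If 68 GW were sustained for one hour, that's 68e9 watt-hours of energy.
# At 0.3 Wh per query, how many queries could that serve? (illustrative)
wh_per_hour = CALIFORNIA_2022_GW * 1e9
queries_per_hour = wh_per_hour / 0.3
print(f"0.3 Wh queries servable per hour at 68 GW: {queries_per_hour:.1e}")
```

The point of the sketch is scale: even at a modest 0.3 Wh per query, the projected infrastructure corresponds to hundreds of billions of query-equivalents per hour, reflecting that training, deployment breadth, and heavier workloads, not any single query, drive the buildout.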

ChatGPT alone reaches an enormous, and expanding, number of people, making its server demands similarly massive. OpenAI, along with several investment partners, plans to spend billions of dollars on new AI data center projects over the next few years.

OpenAI's attention, along with the rest of the AI industry's, is also shifting to so-called reasoning models, which are generally more capable in terms of the tasks they can accomplish but require more computing to run. Unlike models like GPT-4o, which respond to queries nearly instantaneously, reasoning models “think” for seconds to minutes before answering, a process that consumes more computing, and thus more power.

“Reasoning models will increasingly take on tasks that older models can't and generate more [data] to do so, and both require more data centers,” You said.

OpenAI has begun to release more power-efficient reasoning models like o3-mini. But it seems unlikely, at least at this juncture, that the efficiency gains will offset the increased power demands from reasoning models' “thinking” process and growing AI usage around the world.

You suggested that people worried about their AI energy footprint use apps such as ChatGPT infrequently, or select models that minimize the computing required, to the extent that's realistic.

“You could try using smaller AI models like [OpenAI's] GPT-4o-mini,” You said, “and sparingly use them in a way that requires processing or generating a ton of data.”

