Why saying “thank you” and “please” to artificial intelligence has a huge energy impact

How many times, when making a request to an artificial intelligence, do we say “please”? And how often do we thank large language models for their responses? Our tendency to “humanize” AI has a price: replying “thank you” or “please” to ChatGPT, Gemini or other similar models has a significant energy impact, both in terms of the electricity and water consumed and, of course, money. But how can kind words have an energy impact? In this article we explain what happens.

How much energy using kind words with AI models consumes

Using “please” and “thank you” has a greater energy impact for a simple reason: every single request to a chatbot costs money and energy, and each additional word increases the energy consumed by the server. In other words, every “thank you” or “please” must be processed by the system, and every computation the system performs consumes energy and costs money.
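The scale of this effect can be sketched with a back-of-envelope calculation. The per-word energy figure below is an assumption derived from the IEA per-reply estimate quoted later in this article (roughly 0.3 Wh for a short text, spread over ~100 words), not a measured value:

```python
# Illustrative back-of-envelope estimate, NOT measured data:
# assume a fixed average energy cost per processed word,
# derived from the IEA's ~0.3 Wh figure for a ~100-word text reply.
ENERGY_PER_WORD_WH = 0.3 / 100  # assumption: Wh per single word

def extra_energy_wh(polite_words: int, requests: int) -> float:
    """Estimated extra energy from polite filler words across many requests."""
    return polite_words * requests * ENERGY_PER_WORD_WH

# Two extra words ("please" + "thank you") over a billion requests:
print(extra_energy_wh(2, 1_000_000_000), "Wh")  # ≈ 6 million Wh (6 MWh)
```

Tiny per-word costs become large only at enormous request volumes, which is exactly why the figure Altman cites below is plausible at OpenAI's scale.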

Sam Altman, CEO of OpenAI (the company behind ChatGPT), himself confirmed this huge energy expenditure when answering a question from an X user, who wondered how much money OpenAI had lost in energy costs caused by all the people saying “please” and “thank you” to its AI models.

To this question, Sam Altman responded: “tens of millions of dollars”, adding an ironic “well spent, you never know”.

[Embedded X post by Sam Altman]

Among other things, knowing the actual consumption of AI is not easy, given that the companies involved are not very transparent with their data. According to estimates by the International Energy Agency (IEA), generating a text reply consumes approximately 0.3 Wh, rising to 1.7 Wh to generate an image and 115 Wh to generate a short (6-second), low-quality video.

To get an idea, charging a smartphone requires about 15 Wh of energy, which rises to 60 Wh if the device being charged is a laptop.
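Putting the IEA estimates side by side with the device-charging figures quoted above makes the comparison concrete (these are simply the article's numbers, restated as ratios):

```python
# Compare the IEA per-task estimates (Wh) against device charging,
# using only the figures quoted in the article.
tasks = {"text reply": 0.3, "image": 1.7, "6s low-quality video": 115.0}
charges = {"smartphone charge": 15.0, "laptop charge": 60.0}

for name, wh in tasks.items():
    ratio = wh / charges["smartphone charge"]
    print(f"{name}: {wh} Wh = {ratio:.2%} of a smartphone charge")
```

A single text reply is a tiny fraction of a phone charge, while one short video already costs several full charges, which is why video generation dominates these estimates.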

And this only covers electricity consumption, to which must be added the water used for server cooling: according to a study from the University of California, a 100-word email written by GPT-4 can consume more than half a liter of water.

Should we stop being nice to AI?

The question at this point is simple: should we stop being kind to artificial intelligence models in order to waste less energy? Actually, no.

Logic would suggest that to avoid this waste of energy and money, it would be enough to stop saying “please” and thanking artificial intelligence. However, this hypothesis is refuted by a recent study, which showed that rudeness could actually make our interactions with artificial intelligence worse.

This would happen because, when faced with rude requests, the AI may generate inaccurate and vague content, even going so far as to end the conversation for safety reasons.

At the same time, when we try to express ourselves in a courteous and polite manner, we tend to be more accurate, formulating our requests more precisely and in greater detail. This makes it easier for the AI to respond coherently and comprehensively, and consequently less energy is consumed, because no further clarifications need to be generated.

In short, continuing to address artificial intelligence politely, without overdoing the formalities, could reduce energy expenditure while helping the AI provide us with more relevant answers.