
How much does AI pollute, and why? The environmental impact on water, energy and CO2 emissions

Alongside the advantages and possibilities created by large artificial intelligence models such as ChatGPT, there are several concerns regarding the origin of training data, copyright infringement and user privacy. An often overlooked problem, however, is the energy cost of developing and using these online models: dedicated servers perform complex calculations even to generate a simple image or a short text. According to a study by the startup Hugging Face, in collaboration with Carnegie Mellon University, generating a single image with the most common image-generation AI models consumes, in a few seconds, the same energy needed to fully charge a smartphone. Given the ever-increasing use of these models, which are queried billions of times a day for tasks such as managing emails or generating texts and images, these are problems that should not be underestimated. Producing the energy needed to power AI involves the emission of climate-altering gases such as CO2, on which several studies and analyses are beginning to appear. It must be considered, however, that comparing AI with the energy a human would consume to complete the same amount of work is not always straightforward.
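To give a sense of scale, here is a minimal back-of-the-envelope sketch in Python that takes the article's "one image ≈ one full smartphone charge" claim at face value; the battery size, daily image volume and household consumption figures are illustrative assumptions, not data from the cited study.

```python
# Back-of-the-envelope scale check for the "one image ≈ one phone charge" claim.
# All figures below are illustrative assumptions, not data from the cited study.

PHONE_CHARGE_KWH = 0.012      # assumed: ~3 Ah battery at ~3.85 V, ignoring charger losses
IMAGES_PER_DAY = 1_000_000    # assumed: hypothetical daily volume for one popular service

energy_per_image_kwh = PHONE_CHARGE_KWH          # the claim: one image ≈ one full charge
daily_energy_kwh = energy_per_image_kwh * IMAGES_PER_DAY

# Assumed round figure: an average household uses roughly 10 kWh of electricity per day.
households_equivalent = daily_energy_kwh / 10

print(f"Energy per generated image: ~{energy_per_image_kwh * 1000:.0f} Wh")
print(f"Energy for {IMAGES_PER_DAY:,} images/day: ~{daily_energy_kwh:,.0f} kWh")
print(f"Equivalent to the daily electricity use of ~{households_equivalent:,.0f} households")
```

Under these assumptions, a million generated images a day would draw as much electricity as roughly a thousand households.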

AI-related CO2 emissions

When speaking of AI's carbon footprint, the so-called "CO2 footprint" mainly linked to large generative models, emissions of this greenhouse gas depend both on the energy consumption of the servers and on the nature of the sources used to produce the electricity. In countries that generate more of their energy from renewable sources or nuclear power, the amount of carbon dioxide emitted can be much lower than in countries more dependent on fossil fuels, such as the USA, where a large share of AI startups are located.

Image
The United States currently leads the way in AI model research. Credit: Stanford Institute for Human-Centered Artificial Intelligence, via Wikimedia Commons

However, AI's carbon footprint must account not only for use by end users but also for the contribution of model creation and training. Researchers at the University of Massachusetts Amherst estimate that this entire process can produce up to 300 tonnes of CO2. In the case of GPT-3, the energy used was comparable to that consumed over the entire life cycle of 5 cars, from production to scrapping, each driven for 200,000 km.
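As a quick consistency check between the two figures, a short sketch: the per-car value below is an assumption chosen for illustration, not a number taken from the cited study.

```python
# Rough consistency check: "up to 300 tonnes of CO2" vs "the life cycle of 5 cars".
# The per-car figure is an assumption for illustration, not taken from the cited study.

CAR_LIFECYCLE_CO2_T = 57   # assumed: tonnes of CO2 per car over production, 200,000 km of use and scrapping
TRAINING_CO2_T = 300       # the article's upper estimate for model training

cars_equivalent = TRAINING_CO2_T / CAR_LIFECYCLE_CO2_T
print(f"{TRAINING_CO2_T} t CO2 / {CAR_LIFECYCLE_CO2_T} t per car ≈ {cars_equivalent:.1f} car lifetimes")
# ≈ 5.3 car lifetimes, consistent with the "5 cars" comparison.
```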

AI’s water consumption

The Washington Post and the University of California, Riverside have also confirmed that AI models are particularly "thirsty": server cooling needs sometimes lead to the use of cooling towers, where heat is transferred to water. The water used is partly lost through evaporation and partly recycled 3 to 10 times before being discharged into the sewer, to avoid the build-up of bacteria or mineral salts.

Based on the estimates of this study, a 100-word email written with GPT-4 can "consume" more than half a litre of water; if one in ten currently employed Americans sent such an email once a week, the servers would use about 435 million litres of water per year, equivalent to the water needs of the Rhode Island area (about 1 million people) for a day and a half.
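The annual figure can be reproduced from the per-email estimate with simple arithmetic; in the sketch below, the number of employed Americans is an assumed round figure, not a value from the study.

```python
# Reproducing the ~435 million litres/year estimate from the per-email figure.
# EMPLOYED_AMERICANS is an assumed round figure used only for illustration.

WATER_PER_EMAIL_L = 0.5          # "more than half a litre" per 100-word email
EMPLOYED_AMERICANS = 167_000_000 # assumed
SHARE_SENDING = 0.10             # one in ten
EMAILS_PER_WEEK = 1
WEEKS_PER_YEAR = 52

annual_water_l = (EMPLOYED_AMERICANS * SHARE_SENDING
                  * EMAILS_PER_WEEK * WEEKS_PER_YEAR * WATER_PER_EMAIL_L)
print(f"~{annual_water_l / 1e6:.0f} million litres per year")
# With these assumptions: ~434 million litres, in line with the article's ~435 million.
```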

The comparison with human beings and the ethical implications

As highlighted by a "provocative" article published in Scientific Reports, however, the comparison between the energy expended by a model and that used by a human being to complete the same task could still favour AI: emissions are estimated to be 130 to 1,500 times lower for a complex text and 310 to 2,900 times lower for an image.

According to the study, AI models have large consumption peaks but return results in a very short time, while a human artist or writer may take several hours to finish and revise a piece of work. The calculation of the energy used by the human includes the electricity needed to run a PC, but also a share of the CO2 emissions related to food production, heating of the workplace and other needs of the human operator. Moreover, while it is not correct to neglect the AI training phase when calculating the energy required, human operators also need years of study to be able to produce quality work, with considerable investments of time and energy.

Image
While training an AI model is certainly an energy-intensive process, our own training is also costly in terms of time and energy. Credit: Tulane Public Relations, via Wikimedia Commons

The authors of the work do not hide the obvious and serious implications for our society, including the ever-growing threat to jobs and, with them, to citizens' well-being. In addition to these consequences, AI models also raise concerns about the possible "hijacking" and manipulation of information, as well as the increasingly widespread legal disputes over the sources of the data they process.

The unusual perspective of this article, beyond the ethical implications, suggests possible positive effects of using AI to cut down on "wasted" time and the associated energy consumption: tasks that involve processing huge amounts of data, for example, are performed more efficiently by AI, leaving humans the more creative tasks, or those that call for thinking "outside the box".

This approach may seem futuristic, perhaps dystopian, but it is already used in the scientific field. In 2023, an AI model was able to virtually test the most efficient reactions and "design" catalysts for the production of oxygen, starting from the elements present in meteorite samples. The job was completed in just two months and, according to researchers at the University of Science and Technology of China, would have cost human colleagues about 2,000 years of work.