According to a recent study, data centers consume an extremely large amount of water to support ChatGPT, the artificial intelligence chatbot used by millions of people globally.
ChatGPT has attracted notice for its capacity to provide thorough responses to a wide variety of prompts. With over 100 million monthly active users, it is the fastest-growing consumer application in history, and it has also passed several professional and academic exams.
Meanwhile, researchers have pointed out that these successes may have come at the cost of substantial water consumption. While past research focused on the carbon footprint of these AI models, the researchers asserted that the large-scale water usage required to run them has "remained under the radar."
According to the research, a single conversation with an AI chatbot of roughly 20 to 50 questions may consume the equivalent of a "500ml bottle of water." To conduct the study, the team developed a new framework to estimate how much clean freshwater is used both for cooling the servers that run AI models and for generating the electricity that powers data centers.
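The per-question figure implied by that estimate can be sketched with simple arithmetic. The 500ml bottle and the 20-to-50-question range come from the study; the calculation below is only an illustrative back-of-envelope:

```python
# Back-of-envelope estimate of water attributed to a single question,
# based on the study's figure of one 500ml bottle per 20-50 questions.

BOTTLE_ML = 500  # water per conversation, per the study


def water_per_question_ml(questions_per_conversation: int) -> float:
    """Water (in ml) attributed to one question, given conversation length."""
    return BOTTLE_ML / questions_per_conversation


low = water_per_question_ml(50)   # long conversation: 10.0 ml per question
high = water_per_question_ml(20)  # short conversation: 25.0 ml per question
print(f"Each question uses roughly {low:.0f}-{high:.0f} ml of water")
```

In other words, under the study's assumptions, each individual question corresponds to only about 10 to 25 milliliters of water, which is why the footprint only becomes striking at the scale of millions of users.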
To illustrate the point, the scientists noted that for the training of GPT-3 alone, Microsoft is estimated to have used 700,000 liters (185,000 gallons) of water, enough to build 370 BMW vehicles. They added that Google's LaMDA is estimated to consume a "stunning" quantity of water, in the range of millions of liters.
In the study, the researchers stated that AI models can, and should, take social responsibility and set an example in the collective effort to combat global water scarcity by reducing their own water footprint. Given ChatGPT's enormous user base, they said that although a 500ml bottle of water might not seem like much, the cumulative water footprint of inference is still enormous.