
The cost of building an artificial intelligence product such as ChatGPT is hard to measure. But one thing Microsoft-backed OpenAI clearly needed was plenty of water, drawn from the Raccoon and Des Moines river watersheds in central Iowa to cool the powerful supercomputer that helped teach its AI systems how to mimic human writing.
Leading AI developers, including Microsoft, OpenAI and Google, have acknowledged that growing demand for their AI tools carries hefty costs, from expensive semiconductors to an increase in water consumption.
Yet they are often secretive about the specifics. Few people in Iowa knew the state was the birthplace of OpenAI’s advanced language model, GPT-4, until a top Microsoft executive said it had been “literally created amidst the cornfields of Des Moines.”
Building a large language model requires analyzing patterns across a huge trove of human-written text. All of that computing consumes a great deal of electricity and generates a great deal of heat. To keep the equipment from overheating, data centers must pump in water, often to a cooling tower outside their warehouse-sized buildings.
In its latest environmental report, Microsoft disclosed that its global water consumption jumped 34% from 2021 to 2022, to nearly 1.7 billion gallons, or more than 2,500 Olympic-sized swimming pools. Outside researchers tie the sharp increase, which far outpaces the company’s growth in previous years, to its AI research.
“It’s fair to say the majority of the growth is due to AI,” said Shaolei Ren, a researcher at the University of California, Riverside, who has been working to calculate the environmental impact of generative AI, pointing to Microsoft’s heavy investment in generative AI and its partnership with OpenAI.
In a forthcoming paper, Ren’s team estimates that ChatGPT uses about 500 milliliters of water each time it answers a series of 5 to 50 prompts or questions. The estimate varies depending on where the servers are located and the season.
The estimate includes indirect water use that the companies do not measure, such as the water needed to cool the power plants that supply the data centers with electricity.