Training GPT-3 in Microsoft's U.S. data centers directly consumed around 700,000 liters (185,000 gallons) of water, according to estimates outlined in a pre-print research paper. 

That water, used to cool the data centers, is roughly the amount needed to manufacture 320 Tesla electric vehicles or 370 BMW cars, the researchers claim.

Broken down, a basic ChatGPT exchange of 25 to 50 questions consumes roughly the equivalent of a 500-milliliter bottle of water, according to the paper.
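As a rough check, the paper's per-conversation figure can be converted into a per-question range. The snippet below uses only the numbers reported above (500 mL per 25-to-50-question exchange); the derived per-question values are this article's arithmetic, not a figure from the paper.

```python
# Back-of-the-envelope conversion of the paper's per-conversation estimate
# into a per-question water-consumption range.

BOTTLE_ML = 500                      # 500 mL per exchange (paper's estimate)
QUESTIONS_LOW, QUESTIONS_HIGH = 25, 50  # questions per exchange (paper's range)

ml_per_question_high = BOTTLE_ML / QUESTIONS_LOW   # fewer questions -> more water each
ml_per_question_low = BOTTLE_ML / QUESTIONS_HIGH   # more questions -> less water each

print(f"Roughly {ml_per_question_low:.0f}-{ml_per_question_high:.0f} mL of water per question")
```

That works out to roughly 10 to 20 milliliters of cooling water per question, under the paper's assumptions.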

  • The paper estimates that the 185,000 gallons consumed to train GPT-3 would have been triple that amount had the training run in Microsoft's Asia-based data centers.
  • OpenAI hasn't said how long it took to train GPT-3.
  • The researchers, from the University of California Riverside and the University of Texas Arlington, argue that AI model creators need to address their water footprints.
