Now guess how much power it took for each one of those wrong answers.
The upper limit for AI right now has nothing to do with the coding or with the companies programming it. The upper limit is dictated by the amount of power it takes to generate even simple answers (and it doesn’t take any less power to generate wrong answers).
Training a large language model like GPT-3, for example, is estimated to use just under 1,300 megawatt-hours (MWh) of electricity, about as much as 130 US homes consume in a year. To put that in context, streaming an hour of Netflix requires around 0.8 kWh (0.0008 MWh) of electricity, so you'd have to stream 1,625,000 hours to use the same amount of power it took to train GPT-3.
https://www.theverge.com/24066646/ai-electricity-energy-watts-generative-consumption
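If you want to check that arithmetic yourself, here's a quick back-of-envelope script. The 1,300 MWh, 0.8 kWh/hour, and roughly 10,000 kWh/year-per-home inputs are the estimates quoted above, not measurements:

```python
# Back-of-envelope check on the GPT-3 numbers quoted above.
GPT3_TRAINING_MWH = 1_300        # estimated training energy for GPT-3
NETFLIX_KWH_PER_HOUR = 0.8       # estimated energy per streamed hour
US_HOME_KWH_PER_YEAR = 10_000    # rough average annual US household use

training_kwh = GPT3_TRAINING_MWH * 1_000  # MWh -> kWh

print(f"{training_kwh / NETFLIX_KWH_PER_HOUR:,.0f} hours of Netflix")    # 1,625,000
print(f"{training_kwh / US_HOME_KWH_PER_YEAR:,.0f} home-years of power")  # 130
```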
It’s a drop in the bucket compared to what’s actually causing damage, like vehicles and plane travel.
Estimates for [training and building] Llama 3 are a little above 500,000 kWh, a value in the ballpark of the energy used by a big airliner on a seven-hour flight. That works out to roughly a month's electricity for 570 average American homes.
https://cacm.acm.org/blogcacm/the-energy-footprint-of-humans-and-large-language-models/
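A rough sanity check on the flight comparison. The fuel-burn rate and jet-fuel energy density below are my own ballpark assumptions (roughly right for a large twin-engine airliner), not figures from the linked article:

```python
# Does ~500,000 kWh really match a seven-hour flight of a big airliner?
FUEL_BURN_KG_PER_HOUR = 7_000   # assumed burn rate for a large airliner
JET_FUEL_KWH_PER_KG = 12        # approximate energy density of jet fuel
FLIGHT_HOURS = 7

flight_kwh = FUEL_BURN_KG_PER_HOUR * JET_FUEL_KWH_PER_KG * FLIGHT_HOURS
print(f"{flight_kwh:,} kWh burned in flight")   # 588,000 -- same ballpark

# The "570 homes" reading: ~877 kWh is a typical US household's month.
US_HOME_KWH_PER_MONTH = 877
print(f"{500_000 / US_HOME_KWH_PER_MONTH:,.0f} home-months")  # ~570
```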
That being said, it’s a malicious and badly framed comparison. It’s like comparing the cost of building a house to the cost of staying in a hotel for one night.
A model, once trained, can be reused and shared indefinitely. The Llama models have been downloaded millions of times. A better comparison would be the cost of making the movie, not streaming it.
An average film production with a budget of $70 million leaves behind a carbon footprint of 3,370 metric tons, the equivalent of powering 656 homes for a year.
https://thestarfish.ca/journal/2025/01/understanding-the-environmental-impact-of-film-sets#%3A~%3Atext=While+it's+easy+to+get%2C656+homes+for+a+year!
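To make the amortization point concrete: under the (illustrative) assumption that the one-time ~500,000 kWh training cost is spread across a few million downloads, each user's share is tiny:

```python
# Illustrative: amortize one-time training energy over model downloads.
TRAINING_KWH = 500_000       # Llama 3 estimate quoted above
DOWNLOADS = 5_000_000        # assumed; "millions of times" per the comment
NETFLIX_KWH_PER_HOUR = 0.8

kwh_each = TRAINING_KWH / DOWNLOADS
print(f"{kwh_each:.2f} kWh per download")                            # 0.10 kWh
print(f"~{kwh_each / NETFLIX_KWH_PER_HOUR:.2f} Netflix-hours each")  # ~0.13
```

That only spreads the one-time "house-building" cost, of course; actually running the model (inference) is the recurring "hotel night," and it isn't counted here.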
If the AI wars between powerful billionaire factions in the United States continue, get ready for rolling blackouts.
Time for nuclear to make a comeback.
The water consumed by data centers is a much bigger concern. They’re adding pressure to public water systems that are already strained.