Hacker News

> "Leaving aside [all of AI's potential benefits] it is clear that large-language A.I. engines are creating real harms to all of humanity right now [...] While a human being is responsible for five tons of CO2 per year, training a large neural LM [language model] costs 284 tons."

Presuming this figure is in the right ballpark, 284 tons is actually quite a lot.

I did some back-of-the-napkin math (with the help of GPT, of course). 284 tons is roughly equivalent to...

- a person taking 120 round-trip flights from Los Angeles to London

- 2 or 3 NBA teams traveling to all their away games over the course of a season

- driving 1 million miles in a car

- 42 years of energy usage by a typical U.S. household
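A rough sanity check of those equivalences, using ballpark per-unit emission factors I've assumed myself (none of these constants come from the thread or an authoritative source):

```python
# Sanity-checking the equivalences above.
# All per-unit factors are ballpark assumptions, not authoritative figures.
TRAINING_TONS = 284.0

ROUND_TRIP_LA_LONDON_TONS = 2.4   # assumed tons CO2 per passenger, round trip
CAR_TONS_PER_MILE = 284e-6        # ~284 g CO2 per mile (a fairly efficient car)
HOUSEHOLD_TONS_PER_YEAR = 6.8     # assumed tons CO2 per U.S. household per year

print(TRAINING_TONS / ROUND_TRIP_LA_LONDON_TONS)  # roughly 120 flights
print(TRAINING_TONS / CAR_TONS_PER_MILE)          # roughly 1,000,000 miles
print(TRAINING_TONS / HOUSEHOLD_TONS_PER_YEAR)    # roughly 42 years
```

Each quotient lands close to the figure in the list, so the comparisons hang together under these assumptions.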



284 tons is a lot until you divide by the number of end users. The CPU compute for Netflix encoding its video library is probably comparable, and likely more once you consider the permutations of output formats across their entire catalog. Yet the per-user emissions are still negligible.
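To make the amortization concrete, here is a minimal sketch; the user count is a purely hypothetical illustration, not a real figure for any model:

```python
# Amortizing one training run's emissions across end users.
# The user count is a hypothetical assumption for illustration only.
training_tons = 284.0
users = 100_000_000  # assume 100 million end users

grams_per_user = training_tons * 1_000_000 / users  # metric tons -> grams
print(grams_per_user)  # 2.84 g of CO2 per user
```

At that scale, the per-user share is a few grams, far below everyday activities.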


> - 42 years of energy usage by a typical U.S. household

Focus on that one! OpenAI (for example) has approximately 375 employees. By your calculations, the CO₂ emissions of those employees driving to work, etc., already dwarf the quoted 284 t of CO₂.
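Spelling that out with the article's own per-person figure (5 t CO₂/year) and the headcount from this comment:

```python
# Comparing total employee emissions to the quoted training cost.
# 5 t/person/year is the article's figure; 375 is the headcount above.
employees = 375
tons_per_person_per_year = 5.0

annual_employee_tons = employees * tons_per_person_per_year
print(annual_employee_tons)          # 1875.0 tons per year
print(annual_employee_tons / 284.0)  # about 6.6x the training run
```

Even using the article's general per-person figure rather than commuting alone, the workforce's annual footprint is several multiples of one training run.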


Or 500 Bitcoin transactions, although those would also generate 250 kg of e-waste.
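The per-transaction figures implied by that comparison (taking the comment's numbers at face value):

```python
# Per-transaction figures implied by the comparison above.
tons_co2 = 284.0
ewaste_kg = 250.0
transactions = 500

print(tons_co2 * 1000 / transactions)  # 568.0 kg of CO2 per transaction
print(ewaste_kg / transactions)        # 0.5 kg of e-waste per transaction
```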



