
It is quantifiable, but it's not easily attributable. That is, you can tell that e.g. New York used up 104 terawatt-hours in one year[0], but it's harder to tell what this energy was used for - and, given telecommunications, how much of it was used to fulfill services around the world that just happen to be using a data center in New York.

Price isn't a good approximation either, as it's pretty much arbitrary - prices are determined by trade and fluctuate all the time. Regular people only see a smoothed-out, low-frequency version of that.

--

[0] - 8.6 megapersons living in New York * 12 MWh US average electricity use per person per year.
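For the curious, the same back-of-envelope math spelled out in Python - both inputs are just round figures, not measurements:

    # back-of-envelope: ~NYC population times average US per-capita electricity use
    people = 8.6e6               # ~8.6 million residents
    mwh_per_person_year = 12     # US average electricity use, MWh per person per year
    total_twh = people * mwh_per_person_year / 1e6   # 1 TWh = 1e6 MWh
    print(f"{total_twh:.0f} TWh/year")               # ~103 TWh, i.e. roughly the 104 TWh above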




Maybe there's some misunderstanding here. I'm not saying cost is a good, or even viable, way of quantifying energy consumption.

Since energy suppliers meter what they deliver in order to bill for it, I'm wondering how hidden costs could exist (unless the software is somehow circumventing or undermining the meters?).

I would think it'd be a pretty straightforward experiment: use software from the late 80s / early 90s for a couple of weeks and measure the machine's energy consumption, then compare with the consumption of a machine running the latest version of everything for two weeks. Next month repeat, but also use a processor from the 90s for the first two weeks. Rinse, repeat, extrapolate, apply statistics, &c. Maybe try adjusting for, or comparing, differences in appliances common then and now. A rough sketch of the bookkeeping is below.
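Something like this sketch in Python - every reading here is invented, it's just the shape of the comparison between the two two-week runs:

    # hypothetical daily kWh readings from a plug-in power meter, two weeks each
    old_stack_kwh = [1.1, 1.0, 1.2, 1.1, 1.0, 1.3, 1.1, 1.0, 1.2, 1.1, 1.2, 1.0, 1.1, 1.2]
    new_stack_kwh = [2.3, 2.1, 2.4, 2.2, 2.5, 2.3, 2.2, 2.4, 2.3, 2.1, 2.4, 2.2, 2.3, 2.5]

    def daily_avg(readings):
        return sum(readings) / len(readings)

    old_avg = daily_avg(old_stack_kwh)
    new_avg = daily_avg(new_stack_kwh)
    print(f"old stack: {old_avg:.2f} kWh/day, new stack: {new_avg:.2f} kWh/day")
    # naive extrapolation to a year, per machine
    print(f"difference: {(new_avg - old_avg) * 365:.0f} kWh/year")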

If software energy costs are somehow hidden, I can only think they're hidden because we're deliberately not choosing to look at them.



