Does anyone have any useful rules of thumb or heuristics for balancing this trade-off of upfront cost vs. power cost? E.g. how much does an N100 cost to run for a year vs., say, an i5-2400S (the CPU in the first row on the linked site)?
I used to calculate the cost of lightbulbs as: 1 watt running the whole year, at €0.28/kWh, costs 1 euro per year. Until someone corrected me and it turned out that every 1 watt running 24/7 is about 2 euros per year (1 W × 8,760 h ≈ 8.76 kWh, × €0.28 ≈ €2.45).
In the US electric power might be cheaper. And if it's running only part of the time, you should adjust the calculation.
My desktop/server runs 24/7, so I prefer having a CPU with a 65 W TDP over one with a 125 W TDP. That might be up to 120 euros per year of difference for me (if it were running at 100% CPU).
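A quick sketch of that rule of thumb in Python; the €0.28/kWh price and the 24/7 duty cycle are assumptions, plug in your own:

    HOURS_PER_YEAR = 24 * 365  # 8760

    def yearly_cost(watts, price_per_kwh, duty_cycle=1.0):
        """Cost of a constant electrical draw over one year."""
        kwh = watts * HOURS_PER_YEAR * duty_cycle / 1000
        return kwh * price_per_kwh

    # 1 W running 24/7 at EUR 0.28/kWh: ~2.45 EUR/year,
    # i.e. roughly "2 EUR per watt-year".
    print(yearly_cost(1, 0.28))

    # The 65 W vs 125 W TDP comparison, at full load 24/7:
    # ~147 EUR/year at exactly 0.28/kWh, ~120 EUR with the
    # rounded 2-EUR-per-watt-year rule.
    print(yearly_cost(125 - 65, 0.28))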
Real-world energy use is nothing like what you see on spec sheets, and not just because manufacturers differ in how they compute TDP. TDP is also a poor indicator of energy use at (or near) idle. With underclocking/undervolting in the BIOS you can get a beefier CPU to outperform smaller CPUs per watt: because CPUs get really inefficient as they draw more power, an undervolted or power-capped high-TDP chip might be much more power efficient in the real world than its low-TDP counterparts.
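To make the per-watt point concrete, here's a toy comparison; every number in it is made up, since the real curves depend on the chip, the workload, and the BIOS settings:

    # Performance does not scale linearly with power, so a big chip
    # capped at a low power limit can beat a small chip per watt.
    # All scores below are hypothetical.
    measurements = {
        # (chip and power limit, limit in W): benchmark score
        ("big chip, stock 125 W", 125): 2400,
        ("big chip, capped 65 W", 65): 1900,  # ~20% less perf at ~half power
        ("small chip, stock 65 W", 65): 1400,
    }

    for (chip, watts), score in measurements.items():
        print(f"{chip}: {score / watts:.1f} points/W")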
My NUC13 with an i3 has a nominal 15 W TDP, but while idling on a KDE desktop with a browser open to reuters (1 tab) it hovers around 3-4 W (5% CPU usage). If there's REALLY nothing going on (no desktop even) it's 1.0-1.3 W (1% CPU usage).
Edit: I should note that there's no fan drawing power because I put it in an Akasa passively cooled case.
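For a software-only cross-check of numbers like these on Intel hardware, the kernel exposes RAPL energy counters under /sys/class/powercap. A rough sketch: the exact domain path varies by machine, reading it may need root, and RAPL covers the CPU package only, not the wall draw:

    import time

    # Cumulative package energy in microjoules; the counter wraps
    # eventually, but not within a short sample window.
    RAPL = "/sys/class/powercap/intel-rapl:0/energy_uj"

    def read_uj():
        with open(RAPL) as f:
            return int(f.read())

    e0, t0 = read_uj(), time.time()
    time.sleep(5)
    e1, t1 = read_uj(), time.time()
    print(f"avg package power: {(e1 - e0) / 1e6 / (t1 - t0):.2f} W")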
I tried to find this out myself. All I could find easily was the TDP of different processors, but I'm not sure whether that's a good measure of how much power they will actually use.
I went down this rabbit hole earlier this year. The best I came up with was to assume the CPU runs at full TDP for the whole year. Full TDP is unrealistic, but it gives us a worst-case "max running cost". Energy for me is roughly $0.12/kWh, so the yearly max running cost for a 35 W TDP is $36.79, for 65 W it's $68.33, and for 95 W it's $99.86.
I ended up going with an HP EliteDesk 800 G5 Mini (i5-9500T, 35 W) off eBay for $100, and it does the stuff I need it to do just fine. According to my current monthly power usage graph, it's averaged 7 W, which accounts for $0.61 of this month's power bill.
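The same arithmetic in a few lines, with the measured average alongside the worst case to show how far apart they are (price and wattages as above):

    PRICE = 0.12  # $/kWh

    def yearly_cost(watts):
        return watts * 24 * 365 / 1000 * PRICE

    for tdp in (35, 65, 95):
        print(f"{tdp} W TDP worst case: ${yearly_cost(tdp):.2f}/year")

    # Measured average of the EliteDesk: ~7 W, i.e. ~$7.36/year
    # (~$0.61/month, matching the power bill figure above).
    print(f"7 W measured average: ${yearly_cost(7):.2f}/year")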
No, sadly TDP tells us very little about the idle power cost, which, depending on the workload, might be where the machine spends most of its time.
Just from tweaking my laptop, I’ve noticed that when it is really idle (or I’ve intentionally put it in a low frequency mode), the big power drains are the wireless interfaces (don’t forget bluetooth) and the screen (OLED helps as long as the screen is mostly black). Gotta tweak the whole thing.
The only real way of knowing is to measure it. If you already have a system in place an energy monitoring smart plug can help you calculate the current running costs and help estimate the savings of using a lower-power machine.
When I did this I was surprised by how much - or how little - it cost to run various devices. It's quite addictive.
It's not always accurate, because a lower-power machine doing the same task will often need to run at full power more often, so the savings may be smaller. For example, a Raspberry Pi 5 may often be more power efficient than a Pi 4, despite drawing more power at full load on paper, because it spends less time at full load than the Pi 4 does.
On the other hand, when I upgraded my work PC I found it used less power but I also had to run my office heater more often in winter, as the new PC wasn't as efficient at heating the space.
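Once you have measured (or estimated) average draws for both machines, the original upfront-cost-vs-power-cost question reduces to a payback period. A sketch with placeholder numbers:

    def payback_years(extra_upfront, old_watts, new_watts, price_per_kwh):
        """Years until a lower-power machine repays its extra upfront cost."""
        yearly_savings = (old_watts - new_watts) * 24 * 365 / 1000 * price_per_kwh
        return extra_upfront / yearly_savings

    # Hypothetical: a $150 mini PC averaging 7 W vs. a $40 used desktop
    # averaging 40 W, at $0.12/kWh -> ~3.2 years to break even.
    print(f"{payback_years(150 - 40, 40, 7, 0.12):.1f} years")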