
> But I don't see how we're planning to meet such demand with our current power grid. Which of course in turn means that your electric bill will be in for a shock.

We are definitely going to use more electricity, which is why everyone has been planning for major grid upgrades. But that's also part of why solar is so useful: a lot of existing grid demand comes at the times when solar is generating at peak capacity (e.g. summer air conditioning). On top of that, we've seen a lot of efficiency improvements which reclaim some existing capacity: at the turn of the century, a desktop computer used 1kW, your lighting drew 10x the wattage, and your AC, fridge, etc. were far less efficient. That doesn't solve the problem, but it takes some of the sting out of it.

In the case of an EV, my house is entirely electric, including a heat pump plus resistive heating. Fully charging a Tesla Model 3 takes somewhere around one day's worth of my winter usage (~20℉ outside), and since that buys 200-350 miles of range, you're unlikely to be doing it every day or even every other day. Most importantly, that seems well within the output a home solar array can provide: since cars sit idle something like 95% of the time, there's plenty of opportunity to charge them off peak or when renewables like solar or wind are producing well.
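For a rough sense of scale, here's a back-of-envelope sketch. Every number in it is an assumption I'm plugging in for illustration (pack size, winter household usage, cold-weather efficiency, commute length), not a measurement of my house or car:

  # all values are rough, assumed figures for illustration only
  battery_kwh = 75                 # assumed long-range Model 3 pack size
  winter_house_kwh_per_day = 70    # assumed all-electric house at ~20F
  miles_per_kwh = 3.5              # assumed cold-weather efficiency
  daily_miles = 40                 # assumed daily driving

  daily_charge_kwh = daily_miles / miles_per_kwh
  print(f"full charge ~= {battery_kwh} kWh vs. house ~= {winter_house_kwh_per_day} kWh/day")
  print(f"typical daily top-up ~= {daily_charge_kwh:.0f} kWh "
        f"({daily_charge_kwh / winter_house_kwh_per_day:.0%} of winter household usage)")

The point isn't the exact figures, it's that a daily top-up is a modest fraction of what an all-electric house already draws in winter.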

To me that doesn't seem like an intractable problem but rather something which can be done incrementally, along with other desirable work such as hardening the grid against storms.




> a desktop computer used 1kW

Desktop computers didn't use 1kW on average back in 2000. That would have been an insanely high-end machine, nothing like the average home PC. A lot of components were passively cooled or had a tiny heatsink with a small fan; you're not dissipating 1,000W in a box that may not even have case fans. Most computers I had back then had 200-300W power supplies. Add another 100W for a CRT monitor and each desktop really drew something closer to 400W max, and since you usually don't run at the max rating of your PSU, it was really something less than that.

EDIT: The TDP of a Pentium III (released 1999) was 30W. Add another 20W for a hard drive, 10W for an optical drive, and 100W for the motherboard and RAM, and that's ~160W for a basic home computer in 1999. AGP allowed for ~50W of power, and extra power connectors on GPUs were pretty much unheard of back then, so even with a fancy GPU you're only looking at ~210W for a decent 2000s-era PC.
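Same budget as a quick sketch, if you want to sanity-check the arithmetic (the per-component wattages are just the rough estimates above, not datasheet numbers):

  # rough per-component estimates from the paragraph above
  components_w = {
      "Pentium III CPU (TDP)": 30,
      "hard drive": 20,
      "optical drive": 10,
      "motherboard + RAM": 100,
  }
  agp_card_w = 50   # AGP slot power budget; extra GPU power connectors were rare
  base_w = sum(components_w.values())
  print(f"basic box: ~{base_w} W, with a fancy AGP card: ~{base_w + agp_card_w} W")
  # add ~100 W for a CRT and you're still nowhere near 1 kW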


There were plenty of 400-500W power supplies in use back then, and 100W was on the low side for a non-tiny CRT. Don't forget that people commonly had multiple hard drives and/or external drives because storage hadn't saturated for the average user yet (oh, so many people learned the hard way that Iomega was not the answer…), and things like personal printers were more common since home networks weren't. I'm probably biased a bit since the home users I knew were mostly professionals rather than hobbyists, but when people were speccing out UPSs they tended to assume numbers which these days we'd only see for beefy servers.
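To put some illustrative numbers on that UPS-sizing habit (every figure here is made up but plausible for a circa-2000 setup, and the watts-to-VA conversion is just the old ~0.6 power-factor rule of thumb):

  # illustrative loads for a circa-2000 "power user" desk; all numbers assumed
  loads_w = {
      "tower with 450 W PSU (typical draw)": 270,
      "19-inch CRT": 110,
      "external / Zip drives": 40,
      "inkjet printer": 20,
  }
  total_w = sum(loads_w.values())
  ups_va = total_w / 0.6   # rough derating rule of thumb for UPS VA ratings
  print(f"~{total_w} W continuous -> shop for a UPS around {ups_va:.0f} VA")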

The bigger challenge is that while Energy Star made a big improvement, it took a number of years to become something you could count on. Buggy firmware and software support meant that a lot of people disabled power management entirely, so systems didn't spend as much time in lower-power states.



