Ask HN: Why aren't all heaters computers?
32 points by kyleyeats on Dec 21, 2023 | 61 comments
This is kind of a silly question, but waste heat is waste heat, right? So why isn't all waste heat generated by computers?

Obviously I'm talking about electricity here. How do we have something like Bitcoin but then have electric heaters? Won't a Bitcoin miner produce the same amount of heat given the same electricity? You can use heat pumps for extra gains on either, right?

A Bitcoin miner is more expensive to produce and maintain, yes, but over its lifetime shouldn't it pay for itself? I guess it just seems silly that we have datacenters with cooling (which takes even more electricity), and then also heating for homes.




Datacenter heat reuse is a thing, mostly in European countries with cold climates. Notably, Finland has a few datacenters like that (Remov, Telia). I believe some countries might even mandate a feasibility study on reusing the heat for new datacenters.

I have no idea why it's not everywhere, but I see some issues right off the bat:

- you need a district heating system to dump the heat into, and to be really close to consumers

- the integration into the heating system isn't free

- heating supply doesn't match demand well (it isn't seasonal; datacenter scaling follows computing demand, and heating is just a byproduct)


> heating supply doesn't match demand well

Yes. However, you still have to run chillers to dissipate heat into the environment, with or without heat re-use.

I was talking to one of the DC technicians about this and mentioned: hey, if a chiller ever breaks down (they are redundant), one could, even in summer, turn the radiators to max to help dissipate heat from the DC rooms. He said yes. I asked whether this is an actionable item in some risk plan or whatever - nope, that's not something they depend on. It would only buy time to do something, not prevent overheating.


The new Azure region they're building near Helsinki is also going to be used for district heating:

https://news.microsoft.com/europe/2022/03/17/microsoft-annou...


As a rule of thumb, electronics will take as much energy to produce as they will use in their lifetime. A silicon chip is a very expensive resistor that has already consumed enormous energy to produce (another rule of thumb: the cost of silicon is double the cost of the energy it took to produce).

Disclaimer: I don't know where I heard this.

Also, if you google "bitcoin miner radiator" there are a few attempts. Apparently there's a spa in NYC that heats its water with ASICs:

https://kotaku.com/bathhouse-nyc-spa-bitcoin-asics-185095816...


This rule is very wrong for things like an iPhone, which, while expensive, won't use much electricity over its lifetime.


I think they are including all the input resources (e.g. power for the machine that is capable of manufacturing the parts for the iPhone, power for the computers of the designers and engineers that designed it). It seems to be some sort of extrapolation of the Second Law of Thermodynamics, which states that the entropy of the universe is always increasing


Especially considering how power-hungry it is to mill/finish the phone chassis, plus the cost of cutting-edge lithography and low-yield wastage.


The specifics of your question have an obvious answer: heat pumps are more efficient, and computers have a high upfront cost relative to their overall heat production, so we wouldn't have enough computers to meet heating demand.

The context of your question is a lot more interesting. Why don't we do something with waste heat? We already have systems to efficiently move it, so why not move it somewhere that it can be captured and put to use? If we were motivated enough to do that, could we?

There are a few realities that get in the way:

- Heat is difficult to trap, and difficult to move. You can't put heat on pause. That puts a hard limit on its travel distance.

- Computers have different distance limits. Sometimes, you want them to be close to each other; other times, you want them close to you. These two things are inversely correlated with the utility of heat production: You are likely to generate more heat from a computer that is near you, but unlikely to have any need to put more computers near it.

If you could game on a datacenter, then that would change this dynamic. If everyone on your block hosted a giant liquid-cooled LAN party, and used the heat to warm a pool...


The most efficient heater is a heat pump, and it's not even close compared to resistive heat generators.


Sure, but if you're going to have a space heater (and people do), there won't be any difference in efficiency between the options, including computers.


Sure, but you should only be using space heaters and not heat pumps (or non-electric sources of heat) if you're using them infrequently, at which point the cost of silicon over resistive wire is not worth it.


Exactly. Heat pumps work by concentrating and moving heat, which is cheaper than generating it in the first place. There's still heat in 0°F weather if you concentrate it.


I believe this was from an XKCD What If? question asking how much sticking a toaster into a freezer would affect it. The answer was not much, because the toaster heats up to 500F; from a toaster's perspective everywhere is cold, and it wouldn't notice much difference between 70F and 0F. I remember this when talking about heat pumps, since to the refrigerant most places are hot.


We'd need two orders of magnitude more semiconductors than currently exist. We're already building them as fast as we can.

Global energy use for heating: ~150 trillion kWh/year

Waste heat from semiconductors: ~1 trillion kWh/year
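As a rough sanity check of that ratio, using the estimates above:

```python
# Rough scale check using the estimates above (not authoritative figures).
global_heating_kwh = 150e12        # ~150 trillion kWh/year used for heating worldwide
semiconductor_heat_kwh = 1e12      # ~1 trillion kWh/year of waste heat from semiconductors

ratio = global_heating_kwh / semiconductor_heat_kwh
print(f"Heating demand is roughly {ratio:.0f}x current semiconductor waste heat")
# -> ~150x, i.e. about two orders of magnitude more silicon would be needed
```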


Asian countries are already struggling to keep up with energy demands of semiconductor factories.

And then there are the other environmental impacts - waste chemicals, waste water, and greenhouse emissions.


Capital costs matter, especially for anything that's only run part-time. Buying servers that sit idle for half the year (during summer) would effectively mean they take twice as long to pay for themselves compared to servers that run all year. So that's at least twice as expensive.

Maybe the numbers would still work, but you have to actually do the math.
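A toy version of that math, with entirely made-up numbers just to show the ratio:

```python
# Hypothetical numbers purely for illustration; the point is the ratio, not the dollars.
server_cost = 5000.0           # assumed upfront hardware cost, $
revenue_per_month = 200.0      # assumed net compute revenue while running, $/month

payback_all_year = server_cost / (revenue_per_month * 12)     # runs 12 months/year
payback_winter_only = server_cost / (revenue_per_month * 6)   # idle half the year

print(f"Payback running year-round:   {payback_all_year:.1f} years")
print(f"Payback running winters only: {payback_winter_only:.1f} years")  # exactly 2x longer
```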


Just send the servers to the southern hemisphere for half the year (when it's summer here and winter there) and then back for the other half. Problem solved!


Well, our datacenter does use heat generated in the server room to run heat pumps for our offices. Except there is much more heat in the data center than we can use, so we still have to waste energy on cooling.

The premises nearby, as far as I know, were not much interested in harnessing this heat, as that requires a fair amount of infrastructure work.

But I know that a nearby waste plant grows tomatoes and cucumbers using electricity and heat from biogas (apparently when waste degrades it produces methane?!) - sorry for the off-topic, just some green thinking crossed my mind :)


Computers aren't free. If it costs even $1 to add a computer that generates enough heat (several hundred watts) to be useful as a heater... why not just omit the computer and make an extra dollar of profit, or charge less and undercut competitors that have computers?

Also consider the externalities...making unnecessary computers contributes to e-waste, it increases demand for rare-earth metals that are often mined in extraordinarily bad conditions (slavery and child labor)...

All this for what benefit? Is there a market of people who want to use a computer attached to a space heater? What useful thing is this computer going to do? Maybe it makes sense as a wi-fi repeater or something, but once again, as the manufacturer, what's easier/cheaper/simpler to design - a computer that generates heat and does some useful software task, or a fat 1500-watt resistor that converts electric current directly into heat?

All that said, you might be on to something. With some clever advertising and marketing promotions, I bet you can convert some portion of the space heater market into "premium" users of this contraption, all while making them feel eco-friendly through a greenwashing campaign.

So my answer is: this product's absence isn't due to a lack of technical creativity - it's because the advertising industry hasn't yet sunk to the depths of marketing it.


Qarnot, a French startup, sells compute-based water heaters and space heaters, which are used in some new buildings in France and even in swimming pools. They also sell cloud services and market themselves as low-carbon compute.

I find the water heater a smart thing, since you also need hot water in summer.

https://qarnot.com/en


Others have explored this space also:

https://braiins.com/blog/guide-home-hvac-heating-bitcoin-min...

https://www.coindesk.com/business/2023/01/03/heatbit-space-h...

https://hackaday.com/2023/05/13/home-heating-with-bitcoin-mi...

Some of the issues: the noise from the fans, and the need to move the heat out of the miner. Miners are computers; they fail in more ways and more often than a space heater does. And if you are heating only with miners, how do you regulate the heat - by slowing the miner, which makes it earn less and is less efficient?

In essence you can easily heat your space with electronics without using crypto miners. Just turn on all your gadgets and lights in the winter and leave them on. It's very convenient to have them on all the time, and the normal heating system will fill in with extra heat to regulate the overall temperature. It may even help your electronics last longer since turning them on/off tends to be what eventually destroys them (at least that was true of electronics 30 years ago, not sure if it's still true today.)


Generally you only want heat for half the year; the other half you want it cool (roughly). In the half where you want it cool, the miner's extra heat becomes extra cooling load... or you turn it off, which seems like a massive waste of compute power.


There was a company in the Netherlands, can't seem to find the name right now, that rented out GPU clusters as central heaters while using the GPUs to mine crypto. I believe they went bankrupt during the whole crypto crash and energy crisis.



"Space Heater Offers 50% Cash Back On Heating By Training AI In Your Home"

https://www.forbes.com/sites/johnkoetsier/2023/12/09/space-h...


That device and company seem like a joke. It costs around $1000, the payback takes multiple years (~3 years) if you use it every day, you only get 400W from the computer (the rest comes from a regular heating element also built in), and the payback is "up to" $28, which I assume is the figure for 24/7 use.

At my rate of using space heaters, I'd expect probably a 10-20 year payback period (pay schedule is not clear, so it is a guess, and generous). Drop $1000 on a computer I don't use, hoping that the company stays in business for decades? No thanks. I'll run a pihole or a server or something and at least get my own compute out of it if I want heat from compute.

But it's a great answer to the question. Computers are around $1000 for anything approaching heavy compute, and you are probably running a 400 to 750 watt power supply, so you get a few hundred watts of heat out of it (depending on load, obviously). Versus the $10 or so you'd pay for that much space heater (you get 1500W for ~$20-30). Loooonnnng payback period.
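Roughly the arithmetic behind that guess; reading the "up to $28" as a per-month figure for 24/7 use, and the usage pattern, are my own assumptions:

```python
# Back-of-the-envelope payback for a ~$1000 "compute heater", using the figures above.
# Assumes the "up to $28" is per month of 24/7 use, which isn't made clear anywhere.
device_cost = 1000.0
payback_per_month_24_7 = 28.0

months_at_24_7 = device_cost / payback_per_month_24_7
print(f"Payback at 24/7 use: {months_at_24_7:.0f} months (~{months_at_24_7 / 12:.1f} years)")

# Typical space-heater usage: ~8 hours a day for ~6 months a year (my assumption).
duty_cycle = (8 / 24) * (6 / 12)
months_realistic = months_at_24_7 / duty_cycle
print(f"Payback at realistic use: ~{months_realistic / 12:.0f} years")
```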


> A Bitcoin miner is more expensive to produce and maintain, yes, but over its lifetime shouldn't it pay for itself?

IIUC based on other news on HN, Bitcoin mining is already at the point where you need specialised hardware for it to pay for itself at all. 10 years ago, yes, it would've paid for itself, but it would've been a gamble.


Yep, heat is heat. If you manage to swap in your hardware for grandma's electric heater, you have free electricity for whatever you do.

But you need to make sure the hardware is operated long enough to pay for itself, and that end users don't harvest your hardware or data.

I think there could be a business here, but it is not trivial at all.


Disclaimer: I consider this a scam

There is a company I saw (and I found it again by googling but I think it's the same) that sells you a heater that pays you some portion of the cryptocurrency it mines:

https://heatbit.com/


I can use the tube amplifier from my grandpa as a heater :-)

It is this brand: https://ms-vint-audio.de/kleinhummel-ks-57-restauration-eine...


I had similar results with a Classic 50 4x10 (3x 12AX7 and 4x EL84)

The only amp I've owned that actually had a computer-sized fan inside to keep air flowing by the tubes


I have heated portions of my home with S9s for a while. I know a guy who heats his pool with ASIC miners. It's not hard, but it is more expensive than conventional methods (additional cost not totally offset by proceeds). You should do it if you're interested.



For electric heating: modern heat pumps have a 1:4 to 1:5 energy input to energy output ratio. They spend energy to move heat rather than generate it, which also means they can be run in reverse for cooling.
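A trivial illustration of what that ratio means in electricity terms (the COP values are illustrative):

```python
# Electricity needed to put 10 kWh of heat into a room.
heat_needed_kwh = 10.0

resistive_cop = 1.0   # resistive heater or computer: 1 kWh in, 1 kWh of heat out
heat_pump_cop = 4.0   # modern heat pump: roughly 4-5 kWh of heat per kWh of electricity

print(f"Resistive / computer: {heat_needed_kwh / resistive_cop:.1f} kWh of electricity")
print(f"Heat pump (COP ~4):   {heat_needed_kwh / heat_pump_cop:.1f} kWh of electricity")
```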


Simple. My servers do not sufficiently heat up the room in winter. An entire data center is one thing, but for regular people, heaters do a better job heating.


Conventional heaters are much more efficient in BTUs per watt.

EDIT: By conventional I meant typical/normal, my fault for forgetting that conventional has a specific meaning.


Ah...you might want to ask a physicist or engineer about that.


Resistive heaters literally turn 100% of energy into heat—same as a computer. Unless it's a heat pump, it's impossible to get more efficient than that: the heat has to come from somewhere.


I seriously doubt this is true, though I also couldn't find any readily available sources that seem credible. The closest seems to be a DIY comparison by Puget Systems[1]. A computer drawing 1000W is losing some energy to light and sound, but the vast majority is heat.

[1] https://www.pugetsystems.com/labs/articles/gaming-pc-vs-spac...


Even light and sound become heat eventually.


I'm not sure what you mean by "conventional"; certainly a plain resistive heater drawing 500W emits the same amount of energy as a computer drawing 500W. If you stretch the definition of efficiency, an infrared heater produces a similar subjective improvement in a room's temperature for less energy, as the infrared light skips heating the surrounding air and warms the subject directly. That is one downside to using computers as heaters - they produce almost no infrared, which can be less efficient for a given subjective experience.


Ah, but you shouldn't be using a plain resistive heater in most circumstances. Heat pumps are much more efficient.


I use a resistive heater. A heat pump would be about 1000 times noisier, and I live in an apartment, so I couldn't put the heat pump where I don't have to hear it.


That hasn't been true for at least 5+ years. New heat pumps are very, _very_ silent.


Hate to be rude, but I doubt that anything with a compressor can be silent. All heat pumps have compressors, right?

I've been around dozens of air conditioners, every one quite noisy compared to a resistive heater. Are heat pumps quieter than air conditioners?


I have both a heat pump and resistive heating in my house. The compressor goes outside the house so you never hear it (and it's not particularly loud, just sounds like a quiet powerful fan). The indoor split units, as they're called, just blow air over the coils. The fans are very quiet, quieter than a typical standing fan or an A/C vent blowing air.

Resistive heating is usually much quieter; however, it can make annoying ticking sounds as the metal expands against the wall, or an annoying electrical buzz as the power is cycled on and off multiple times a minute. This happens even if they're installed perfectly; houses and walls move over time.


Just remember that an A/C is also a heat pump. Lots of apartments around the world with A/C.


There are lots of resistive heaters out there. In areas where it gets cold only one day a year, they're the preferred option. They are also used as "emergency heat" for heat pump systems, again for the one day a year it's below -17F or whatever. These are absolutely no more efficient than cranking out some renders on your GPU. If you plug a computer into electricity and it draws 1000W, the room gets 1000W of heat. If you plug a space heater into electricity and it draws 1000W, the room gets 1000W of heat. Computers cost thousands of dollars; a 1000W space heater costs $9.99.

To answer the OP's original question, the reason that resistive heating is used for space heating is that the heater itself is cheap. A GPU burning power to heat and do computational work is 100x more expensive for the same heat output. If you have $1000 to spend on heating, you just get a heat pump or a real furnace.

The reason people choose inefficient heaters is the lower capital cost. Let's say that one day a year, you need heat. Getting 2kW of heat for 24 hours costs $20 once to buy the heater, plus $15 in electricity each of the 30 times you use it over the 30-year life of your house, so you're spending $470. Meanwhile, if you wanted to be 3x more efficient and go with a heat pump in that room, you're paying at least $1000 for the heat pump and installation, but only $150 for the electricity. You come out behind even though you saved a ton of energy. On the other hand, if you're heating every day for 3 months in the winter, you wouldn't want to be anywhere near resistive heating, because the higher energy costs add up almost instantly.
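The same comparison written out with those rough figures:

```python
# One heating day a year over a 30-year house life, using the rough figures above.
uses = 30

resistive_capital = 20.0              # cheap 2 kW resistive heater, $
resistive_energy_per_use = 15.0       # $ of electricity for ~48 kWh (2 kW * 24 h)

heat_pump_capital = 1000.0            # installed heat pump for that room, $
heat_pump_energy_per_use = resistive_energy_per_use / 3   # ~3x more efficient

resistive_total = resistive_capital + uses * resistive_energy_per_use
heat_pump_total = heat_pump_capital + uses * heat_pump_energy_per_use

print(f"Resistive heater, 30 years: ${resistive_total:.0f}")   # ~$470
print(f"Heat pump, 30 years:        ${heat_pump_total:.0f}")   # ~$1150
```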

TL;DR: the economics of resistive heating don't favor the initial capital outlay that would be required to make heaters do useful computation while heating. People use resistive heating for "an emergency" and don't want to pay a lot of money for something they never use.


Even if capital costs weren't a factor, space heaters make a lot of sense in very common situations involving heating a particular area. It doesn't matter if heat pumps are 3-4x more efficient if one has to heat 6x-8x the area instead of just the small room that's currently being occupied (e.g. a home office).


Yup, very true. Something else I thought of is that if you are a landlord paying the capital costs, but your tenant pays the electricity bill, then you're incentivized to pick the cheapest heating system, not the most efficient one. This factor probably makes resistive heating more widespread than it should be.


I'm not sure I follow. How could they be much more efficient? They turn electricity into heat at pretty much 100% efficiency, and a CPU also turns electricity into heat, with a small amount of computation occurring as a side effect. So all the electricity going into a computer becomes heat, aside from a tiny fraction spent on the motion of fans etc., which heaters also usually have.


Aren't all heaters 100% efficient?


Some amount of energy may be lost to the environment without heating up the intended space/object, although it is usually a relatively small amount.


Heat pumps are more efficient than that.


Ok, but they are moving heat, rather than generating it.


So a heater I have in the house should be a piece of subsidized equipment that does some unknown computation for a third party using electricity I pay for, or what is the idea here? Also, a lot, if not most, of the new datacenters utilise the waste heat.


Who's putting up all the trillions of dollars to replace large "just a big resistor" electric heaters with fancy new computers?

Where's the market for all that extra computing power? The market for bitcoin is quite limited. And, at scale, the resources poured into manufacturing bitcoin mining rigs are not available to meet other human needs.

For industrial heating applications, where the output temperature of the heater can easily be 500 °C or higher, what sort of computer hardware would work?


>A Bitcoin miner is more expensive to produce and maintain, yes, but over its lifetime shouldn't it pay for itself?

No. They won't pay for themselves at current prices with current difficulty even if you get "free" electricity

And the electricity isn't really free either; a heat-pump-style heater like a portable AC is about 300% efficient.


They break down too fast when exposed to that much heat, so it's not a good experience.


Whatever efficiency you gain in the winter will largely be offset during the summer.


I think there are a few reasons.

The parts of the country that get cold the most usually don't use electricity as a heating source. Natural gas is a ton cheaper than electric resistive heat. I don't know if a Bitcoin miner will produce heat as efficiently as an electric heater, but even if it does it's still a ton less efficient than natural gas.

If you live in an area that does use electricity for heat, an electric heat pump is going to be around 3x more efficient than electric resistive heat. Instead of creating heat, it's basically transferring heat (from the outside to inside your home). You're never in a situation where "there's no heat outside." It's never zero kelvin. Heat pumps do become less effective as the temperature gets colder, but you can get ones that are still over 2x more efficient even at 0F (-18C).

The part of the country that typically uses electricity for heat is often Maryland and south of that on the East Coast, or Washington and Oregon, which have temperate winters. Above freezing, a decent heat pump will be around 3x more efficient than electric resistive heating. At 50F (10C), it could be 4x more efficient. Even in a cold city like Boston, the mean daily temperature in December is over 35F. It does dip a bit below freezing for January/February at 29.9F and 31.8F, but at those temperatures a heat pump is likely to be at least 2.5x more efficient if not 3x more efficient. If New England electricity rates weren't so high, it could even be cheaper than natural gas (New England's electric rates are far higher than most of the country at 28.12c per kWh compared to 19.92c for Mid Atlantic, 16.53c for East North Central, 13.29c for West North Central, 15.11c for South Atlantic, 13.5c for East South Central, 14.07c for West South Central, 13.90c for Mountain, and 20.83c for Pacific Contiguous).
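One rough way to compare those rates per unit of delivered heat rather than per kWh of electricity (the COP values here are illustrative):

```python
# Effective cost per kWh of delivered heat = electricity rate / COP.
# Rates (cents/kWh) are the regional figures quoted above; COP values are illustrative.
rates = {
    "New England":        28.12,
    "Mid Atlantic":       19.92,
    "West North Central": 13.29,
    "Pacific Contiguous": 20.83,
}

for region, rate in rates.items():
    resistive = rate / 1.0   # resistive heat or a computer: COP ~1
    heat_pump = rate / 3.0   # decent heat pump above freezing: COP ~3
    print(f"{region:20s} resistive: {resistive:5.2f} c/kWh   heat pump: {heat_pump:5.2f} c/kWh")
```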

Basically, electric resistive heating is incredibly wasteful and using waste heat from computers wouldn't make that an effective heating plan compared to natural gas or heat pumps. Someone else noted that we use 150x more heating energy than computer energy so heat pumps will make a big impact while computer heat won't.

We'd never want to do more computing to harness the heat. We'd want to get that heat from more efficient sources.

> You can use heat pumps for extra gains on either, right?

No. That's not how a heat pump works. A heat pump takes heat from one place and puts it in another place. If the bitcoin miner is inside your home, all of its heat is already inside your home. If the bitcoin miner was outside your home, a heat pump could move that heat inside your home, but you'd be losing some of it along the way. Heat pumps don't multiply heat. They simply move it. If the heat is already inside your place, there's nothing to be moved.

I'd also note that data centers can potentially locate themselves near better sources of power. Some regions have a lot of hydro power which is a cheap and low-carbon way of getting electricity. Plugging in a bitcoin miner in New England where your additional demand will mean burning more natural gas or coal isn't going to be a good way to create heat. You're essentially taking natural gas, turning it into electricity and losing 60% of that heat/energy, transmitting that electricity and losing another 5% of it, and then wanting to turn that electricity back into heat. It'd be better to burn the natural gas in your home and use its heat directly.
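Roughly, using the loss figures above (the furnace efficiency is my own assumption):

```python
# Gas -> electricity -> resistive heat vs. burning the gas in a home furnace.
# Loss figures are the rough ones above; the 90% furnace efficiency is an assumed value.
gas_energy_kwh = 100.0

plant_efficiency = 0.40          # ~60% of the fuel energy lost at the power plant
transmission_efficiency = 0.95   # ~5% lost in the grid
furnace_efficiency = 0.90        # typical home gas furnace (assumption)

heat_via_electricity = gas_energy_kwh * plant_efficiency * transmission_efficiency
heat_via_furnace = gas_energy_kwh * furnace_efficiency

print(f"Gas -> electricity -> resistive load: {heat_via_electricity:.0f} kWh of heat")  # ~38
print(f"Gas burned directly in a furnace:     {heat_via_furnace:.0f} kWh of heat")      # ~90
```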

So: there's not nearly enough waste heat for it to really move the needle on heating; heat pumps can mean a 50-80% reduction in energy usage, versus a less-than-1% saving from re-using the small amount of waste heat; the coldest places typically use more efficient heating sources already; and data centers can locate themselves close to better/cheaper supplies of power than you typically get in your home.




