Datacenter heat reuse is a thing, mostly in European countries with cold climates. Notably, Finland has a few datacenters like that (Remov, Telia). I believe some countries even mandate a feasibility study on heat reuse for new datacenters.
I have no idea why it's not everywhere, but I see some issues right off the bat:
- you need a district heating system to dump the heat into, and the datacenter has to be really close to the consumers
- the integration into the heating system isn't free
- heat supply doesn't match demand well: heating demand is seasonal, while a datacenter's heat output isn't; it tracks computing demand, since heat is just a byproduct (see the back-of-envelope sketch after this list)
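A minimal sketch of that last mismatch, with entirely made-up numbers: if the datacenter's heat output is flat year-round but the district heating demand bottoms out in summer, only part of the waste heat ever finds a consumer, and the rest still has to go through the chillers.

```python
# Supply/demand mismatch, back of the envelope. All numbers are hypothetical.
# Hypothetical monthly district heating demand nearby, in MW (Jan..Dec):
heating_demand_mw = [9, 8, 7, 5, 3, 1.5, 1, 1, 2, 4, 6, 8]

dc_heat_output_mw = 4.0  # roughly flat: tracks IT load, not the season

# Each month you can only sell as much heat as the network will take.
usable = [min(dc_heat_output_mw, demand) for demand in heating_demand_mw]
utilization = sum(usable) / (dc_heat_output_mw * len(heating_demand_mw))

print(f"fraction of DC waste heat with a consumer: {utilization:.0%}")
# ~76% with these made-up numbers; the unsold summer heat still goes
# through the chillers, so heat reuse doesn't replace the cooling plant.
```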
Yes. However, you still have to run chillers that dissipate heat into the environment, with or without heat reuse.
I was talking to one of the DC technicians about this and mentioned: hey, if a chiller ever breaks down (they are redundant), one could, even in summer, turn the radiators to max to help dissipate heat from the DC rooms. He said yes. I asked whether this is an actionable item on some risk plan or the like: nope, that is not something they depend on. It would just buy some time before overheating, not prevent it (rough numbers in the sketch below).
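A rough sketch of why the radiators only buy time. Assuming a hypothetical 500 kW IT load and a heating loop that can carry away maybe 50 kW at max, the temperature still climbs; the net heat input just shrinks a bit. All parameters below are invented for illustration.

```python
# Why radiators stretch the timeline but don't prevent overheating.
# All parameters are hypothetical; thermal mass of racks/walls is ignored,
# so real timelines are longer, but the ratio is the point.

room_air_volume_m3 = 2000.0    # DC room air volume
air_heat_capacity = 1200.0     # J/(m^3*K), volumetric heat capacity of air
it_load_w = 500_000.0          # 500 kW of IT load turning into heat
radiator_offload_w = 50_000.0  # what the heating loop might carry at max
delta_t_allowed = 10.0         # K of headroom before things overheat

def time_to_overheat(net_heat_w: float) -> float:
    """Seconds until room air rises by delta_t_allowed under net_heat_w."""
    energy_headroom_j = room_air_volume_m3 * air_heat_capacity * delta_t_allowed
    return energy_headroom_j / net_heat_w

without = time_to_overheat(it_load_w)
with_radiators = time_to_overheat(it_load_w - radiator_offload_w)
print(f"without radiators: {without:.0f} s, with radiators: {with_radiators:.0f} s")
# The radiators remove ~10% of the heat here, so the clock runs ~10% slower;
# the room still overheats, which matches what the technician said.
```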