Seems like it would be simpler and easier to just build a datacenter near the ocean and pump seawater to it.
Regardless of how it works, I wonder if it's possible to get significantly more performance out of chips that are explicitly designed to operate at a much lower maximum temperature? Let's say you want to build a datacenter somewhere like the northern coast of Alaska or Antarctica, with year-round, reliable, and abundant access to seawater that's barely above freezing, and suppose you're willing to order enough CPUs / GPUs / bitcoin-mining ASICs or whatever to justify your supplier designing and manufacturing custom chips for a lower temperature range. Is there some potential boon to performance, or reduced manufacturing cost (due to looser tolerances), that might make such an effort worthwhile?
> Is there some potential boon to performance or reduced manufacturing cost
Guessing, but on balance, I would assume no. Power generation costs are higher and service is less reliable. Staffing is more costly. Transport for all other resources is also more expensive and much slower than in the areas traditionally used for datacenters.
From my own experience with high-power (~25 kW) FM radio transmitters, some of which we have at the tops of mountains in the middle of ski resorts: while cooling costs are reduced throughout the winter, they /aren't/ totally eliminated, so you still need to size and maintain the equipment for summer loads anyway. Getting to the site to do work and maintenance is an absolute chore; replacing the transmitter was a 10-man job where similar rigs in other locations only required 3.
Although, if it was summer, we got to take the alpine slide down the hill after we finished our work. As to why we put it there in the first place: signal coverage was excellent, and the power we required was already available because the chair lifts were nearby. If we could have put it somewhere else and gotten the same coverage, we would have.