|Why do computers use so much energy?|
|By Thom Holwerda on 2018-10-05 18:24:15|
Microsoft is currently running an interesting set of hardware experiments. The company is taking a souped-up shipping container stuffed full of computer servers and submerging it in the ocean. The most recent round is taking place near Scotland's Orkney Islands, and involves a total of 864 standard Microsoft data-center servers. Many people have impugned the rationality of the company that put Seattle on the high-tech map, but seriously - why is Microsoft doing this?
There are several reasons, but one of the most important is that it is far cheaper to keep computer servers cool when they're on the seafloor. This cooling is not a trivial expense. Precise estimates vary, but currently about 5 percent of all energy consumption in the U.S. goes just to running computers - a huge cost to the economy as a whole. Moreover, all that energy used by those computers ultimately gets converted into heat. This results in a second cost: that of keeping the computers from melting.
I use a custom water-cooling loop to keep my processor and video card cool, and aside from size and scale, data centers struggle with the exact same problem: computers generate a ton of heat, and that heat needs to go somewhere.
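To get a feel for the scale involved, here is a rough back-of-envelope sketch of the heat load of the submerged data center. The 864-server count comes from the article; the per-server power draw and the cooling overhead (PUE) are illustrative assumptions, not figures from Microsoft.

```python
# Back-of-envelope heat-load estimate for the underwater data center.
# Server count is from the article; per-server wattage and PUE are
# assumed values chosen purely for illustration.

SERVERS = 864
WATTS_PER_SERVER = 250   # assumed average draw per server
PUE = 1.15               # assumed power usage effectiveness (cooling overhead)

it_load_kw = SERVERS * WATTS_PER_SERVER / 1000
total_kw = it_load_kw * PUE

# Essentially all of that electrical power ends up as heat that the
# surrounding seawater has to carry away.
print(f"IT load:    {it_load_kw:.0f} kW")
print(f"Total load: {total_kw:.0f} kW")
```

Even with these modest assumptions, a single container dissipates on the order of a couple of hundred kilowatts of heat continuously, which is exactly why free seawater cooling is so attractive.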
- The Burroughs B5500 emulator hosting site - 2018-09-05
- Our USB-C dongle hell is almost over - 2018-09-04
- Z80 computer wirewrapped on perfboard - 2018-09-02
- Why the future of data storage is (still) magnetic tape - 2018-08-29