The IceCube Neutrino Observatory is located at the end of the world. Literally.
Encompassing a cubic kilometer of ice at the South Pole in Antarctica, IceCube observes bursts of neutrinos from cataclysmic astronomical events in order to study dark matter and neutrino physics. The observatory also maintains a data center with about 150 servers. Even at the southernmost point on Earth, where temperatures typically range from -20 F to -50 F, the servers still produce enough heat to make cooling the data center a major concern.
Hot Aisle/Cold Aisle Cooling

Servers produce a lot of heat. Making sure they don’t overheat (and stop working) is always top of mind for data center operators. Traditionally, data centers have been cooled through a combination of air conditioning and a “hot aisle/cold aisle” layout for the server racks. In this layout, the server fronts, which draw in cool air, face each other in one row, while the hot exhaust sides face each other in the next, creating alternating “cold” and “hot” aisles. The cold aisles face air conditioner output ducts, while the hot aisles face air conditioner return ducts.
Blowers also help maintain appropriate temperatures and control air flow to avoid mixing the cold and hot air. Some data centers use containment systems, consisting of robust physical barriers, to further isolate the hot and cold air.
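How much air those blowers must move can be estimated from the sensible-heat relation Q = ṁ·c_p·ΔT: the heat a rack dumps into the hot aisle equals the mass flow of air times its specific heat times the intake-to-exhaust temperature rise. A minimal sketch, using standard air properties and an illustrative 10 kW rack (the numbers are assumptions for the example, not from the article):

```python
# Rough estimate of the cold-aisle airflow needed to absorb a rack's heat
# load. Illustrative values only; real designs must also account for air
# bypass, recirculation, and altitude.

AIR_DENSITY = 1.2         # kg/m^3, air at roughly room temperature
AIR_SPECIFIC_HEAT = 1005  # J/(kg*K)

def required_airflow_m3s(heat_load_w: float, delta_t_k: float) -> float:
    """Volumetric airflow (m^3/s) to carry heat_load_w with a delta_t_k
    temperature rise between cold-aisle intake and hot-aisle exhaust."""
    mass_flow = heat_load_w / (AIR_SPECIFIC_HEAT * delta_t_k)  # kg/s
    return mass_flow / AIR_DENSITY

flow = required_airflow_m3s(10_000, 12)  # a 10 kW rack, 12 K rise
print(f"{flow:.2f} m^3/s (~{flow * 2118.88:.0f} CFM)")
```

A single 10 kW rack already needs on the order of 0.7 m³/s of cold air, which is why keeping the hot and cold streams from mixing matters so much.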
Hot aisle/cold aisle is by far the most common data center cooling method. Unless you’re located in a sufficiently cold climate, however, cooling the air this way can be energy-intensive.
Luckily, there are a growing number of new approaches to cooling that actually reduce energy consumption and costs.
Cold Climate Cooling

If the air outside is cold enough, why not use it to cool your servers? As it turns out, when located in sufficiently cold climates (such as Antarctica), data centers can use the outside air, as well as ocean or lake water, for cooling. Several data centers, including Facebook’s data centers in Luleå, Sweden, are located near the Arctic Circle for precisely this reason.
These natural cooling systems do more than simply circulate air or water through a facility, however. Because optimum humidity levels must always be maintained, air has to be properly treated and filtered. Also, if the outside air does get too warm, even those using these systems have to fall back on other cooling methods. Nevertheless, when feasible, building data centers in colder climates can significantly reduce energy usage and costs associated with cooling.
Chilled Water Cooling

Water conducts heat about 30 times better than air, making it much more efficient for cooling servers. In a chilled-water cooling system, a pump circulates cold water through a tubing system that runs through server cases. Since water conducts electricity and would damage the equipment, it is never brought into direct contact with the server components, but rather runs alongside them, absorbing the heat they generate.
Water-cooled servers enable significant performance increases and cost reductions compared to traditional air cooling. In fact, some data centers have reduced their energy costs by over 50% by switching to chilled water cooling.
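The efficiency gap follows from the same sensible-heat relation Q = ṁ·c_p·ΔT: water’s specific heat is roughly four times that of air, and per unit volume the advantage is far larger because water is about 800 times denser. A small comparison, with an illustrative 10 kW load (the scenario is an assumption for the example; the property values are standard):

```python
# Compare the coolant mass flow needed to remove 10 kW with a 10 K
# temperature rise, for water versus air.

WATER_CP = 4186  # J/(kg*K)
AIR_CP = 1005    # J/(kg*K)

def mass_flow_kg_s(heat_w: float, cp: float, delta_t_k: float) -> float:
    """Mass flow satisfying heat_w = mdot * cp * delta_t_k."""
    return heat_w / (cp * delta_t_k)

water = mass_flow_kg_s(10_000, WATER_CP, 10)  # a thin trickle of water
air = mass_flow_kg_s(10_000, AIR_CP, 10)      # roughly 4x the mass of air
print(f"water: {water:.2f} kg/s, air: {air:.2f} kg/s")
```

About a quarter of a kilogram of water per second does the work of a kilogram of air, and as a liquid it occupies a tiny fraction of the volume, which is what makes pumping water cheaper than blowing air.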
Liquid Immersion Cooling
Everyone knows you can’t let liquid touch electronic equipment.
Or can you?
A new method of cooling called liquid immersion, in which servers are completely submerged in a dielectric fluid that does not conduct electricity, has emerged. Because liquids carry heat far better than air, this method is hundreds of times more efficient than air cooling, and server density can far surpass what is possible in air-cooled data centers. High-performance computing (HPC) environments, where large concentrations of high-end servers produce too much heat for traditional air cooling to be practical, offer one promising use case for liquid immersion cooling. Needless to say, reducing reliance on, or even eliminating, air cooling would reduce power usage and expenses tremendously.
Evaporative Cooling

Evaporative cooling, also known as swamp cooling, uses water evaporation to cool data centers. It is a relatively simple technique in which warm air is funneled across wet pads or filters. Heat from the air is absorbed by the evaporating water, and the cooled air is funneled back into the data center. The technique requires only a fan and a water pump, rather than a compressor (which accounts for a large percentage of the energy used by most other cooling systems). In fact, evaporative cooling has even been nicknamed “free cooling.” While it isn’t exactly free, it is 75% cheaper than traditional air cooling.
Unfortunately, as with natural air and water cooling, evaporative cooling has geographic limits. The process makes the air very humid, which isn’t good for equipment. That’s why the method is only effective in dry climates like the American Southwest. Microsoft, for example, recently purchased land in Arizona, and plans to build data centers there with evaporative cooling systems. So, while evaporative cooling is likely the greenest and most cost-effective cooling method, it isn’t an option for everyone.
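The climate limit can be made concrete with the standard saturation-effectiveness model of an evaporative cooler: the supply air can only approach the wet-bulb temperature, and real pads achieve some fraction (the effectiveness) of that drop. A sketch with illustrative temperatures and an assumed 85% pad effectiveness:

```python
# Evaporative ("swamp") cooler supply temperature from the standard
# saturation-effectiveness relation:
#   T_supply = T_dry - effectiveness * (T_dry - T_wet)
# The wet-bulb temperature is the theoretical floor; the 0.85
# effectiveness and the temperatures below are illustrative assumptions.

def supply_temp_c(t_dry: float, t_wet: float, effectiveness: float = 0.85) -> float:
    """Supply air temperature (C) leaving the wet pads."""
    return t_dry - effectiveness * (t_dry - t_wet)

# Dry desert air: wet-bulb far below dry-bulb, so a large drop.
print(supply_temp_c(35, 20))  # 22.25

# Humid air: wet-bulb close to dry-bulb, so almost no cooling,
# which is why the method only works in dry climates.
print(supply_temp_c(30, 28))
```

In dry air the cooler knocks more than 12 C off the supply temperature; in humid air the same hardware delivers under 2 C of cooling.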
Geothermal Cooling

No matter where you are in the world, it’s probably colder underground. The concept of geothermal cooling isn’t new; people have long kept perishables cool by storing them underground, such as wine in a wine cellar. The same concept is increasingly being used to cool data centers.
Geothermal cooling uses an underground closed-loop piping system filled with water and/or a coolant. The water or coolant is circulated through the pipes, and carries heat from the data center below the earth’s surface, where it is absorbed into the ground, using the soil as a heatsink. The mass of the earth enables geothermal systems to cool intense sources of heat using very little power. Many data centers have already adopted this cooling method, including the American College Testing data center in Iowa City, Prairie Bunkers Data Center Park in Hastings, Nebraska, and Verne Global in Iceland.
AI-Powered Smart Monitoring

While not technically a cooling method, artificial intelligence (AI)-based smart monitoring is being used in data centers worldwide for more efficient and cost-effective cooling. AI systems can monitor temperatures and humidity levels, detecting energy inefficiencies and making real-time adjustments. Rather than wasting energy by constantly cooling an entire data center, smart sensors can detect individual hot spots and direct cold air to only those areas that need it. Historical data can be used to predict future temperatures in the data center, allowing data center operators to make informed plans that reduce energy consumption and cut costs. Google, for example, used AI to reduce its data center cooling bill by 40%, and eBay’s South Jordan, UT data center saved 50% on all its energy costs thanks to AI.
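The two ideas above, spot cooling and prediction, can be sketched in a few lines. This is a deliberately naive stand-in for the far more sophisticated machine-learning models such systems actually use; the sensor names, readings, threshold, and moving-average forecast are all hypothetical:

```python
# Toy hot-spot detection plus a naive moving-average temperature forecast.
# All rack names, readings, and the 27 C threshold are illustrative.

from collections import deque
from statistics import mean

HOT_SPOT_C = 27.0  # alert threshold (assumed)

readings = {"rack-a1": 24.1, "rack-a2": 28.6, "rack-b1": 23.9, "rack-b2": 29.3}

# 1. Direct cold air only where it is needed: flag individual hot spots.
hot_spots = [rack for rack, temp in readings.items() if temp > HOT_SPOT_C]
print("hot spots:", hot_spots)  # ['rack-a2', 'rack-b2']

# 2. Predict the next reading from recent history; here, a simple
#    moving average over the last three samples for one rack.
history = {"rack-a2": deque([27.8, 28.1, 28.6], maxlen=3)}

def forecast(rack: str) -> float:
    """Predicted next temperature (C) for a rack with recorded history."""
    return mean(history[rack])

print(f"rack-a2 forecast: {forecast('rack-a2'):.2f} C")
```

A real system would replace both steps with learned models over thousands of sensors, but the control loop (sense, localize, predict, adjust) is the same.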
Conventional data center cooling methods simply won’t hold up under the processing demands of AI, 5G wireless, Internet of Things (IoT), and the rise of Smart Cities. The cooling methods just described, as well as promising new methods, will be needed not only to ensure that data centers can keep up with data processing demands, but also to make data centers ever more sustainable. To learn more about Netrality’s data center cooling techniques, contact us.