What is your Problem with Liquid Cooling?

By Herb Zien, CEO, LiquidCool Solutions

It’s simple physics. Air is an insulator with negligible heat capacity and thermal mass. Because warm air rises and cold air sinks, does it make sense for a data center to have a raised floor and blow cold air up?

Air cooling is expensive. Up to 15 percent of the total power supplied to a data center can be used to circulate air, and another 15 percent is used by rack and blade fans. Over time, oxygen in the air degrades electronic components, and airborne pollutants accelerate that deterioration. Not only do fans fail, they are inefficient and waste space. They also limit power density, a critical factor in reducing the white-space footprint, maintenance cost and e-waste.
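As a rough illustration, the overhead percentages above can be translated into a floor on Power Usage Effectiveness (PUE, total facility power divided by IT power). The figures below are assumptions for the sketch, not measurements from any specific facility:

```python
# Rough sketch of how air-movement overhead inflates PUE.
# The percentages mirror the figures cited above; the 1,000 kW
# IT load is an arbitrary illustrative assumption.

def pue(it_kw, overhead_kw):
    """Power Usage Effectiveness = total facility power / IT power."""
    return (it_kw + overhead_kw) / it_kw

it_load = 1000.0                  # kW of useful IT load (assumed)
air_circulation = 0.15 * it_load  # ~15% of power moving air around the room
server_fans = 0.15 * it_load      # ~15% drawn by rack and blade fans

print(pue(it_load, air_circulation + server_fans))  # → 1.3
```

Even before any mechanical refrigeration is counted, moving air alone pushes the PUE toward 1.3 under these assumptions; eliminating fans removes that overhead entirely.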

"Immersive liquid cooling is the best way to cool PFMs by far, because power density can be increased by a factor of five and no water is needed to dissipate heat"

Last year, Emerson Network Power surveyed its Data Center Users’ Group; 36 percent of respondents said their data centers will reach capacity in less than 24 months. Over a quarter reported that available power prevented them from adding compute capacity, and more than 30 percent indicated that either the cooling system or available floor space was holding them back. With the emergence of hyperscale computing facilities and escalating trends in cloud computing and big data, there is growing concern about rising heat loads in the data center.

If you are among that group, you could extend the life of your existing facility by gradually migrating to liquid cooling as your IT equipment is refreshed. There are three liquid cooling technologies to consider: cold plates, in-row cooling and immersive cooling. All three support high power density, which saves space by reducing the number of racks needed to get the job done. However, immersive cooling completely eliminates fans and mechanical refrigeration, so its PUE is lower than that of any air-cooled or other liquid-cooled alternative. Immersive cooling also eliminates the need for an air-conditioned, humidity-controlled room, so infrastructure costs are lower.

Immersive liquid cooling means that electronics are totally immersed in a non-conducting dielectric fluid, thereby eliminating chassis fans and decoupling the electronics from the room. The volumetric heat capacity of dielectric fluids is more than 1,000 times greater than that of air, so it does not take much fluid to move heat out of the chassis efficiently. Additionally, dielectric fluids have thermal mass, which mitigates temperature fluctuations.
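The heat-capacity gap can be made concrete with the sensible-heat balance Q = (ρ·cp)·V̇·ΔT. The property values below are generic textbook-style assumptions (air at room temperature, a mineral-oil-class dielectric coolant), not data for any particular product:

```python
# Sketch of the sensible-heat balance Q = (rho * cp) * V_dot * dT.
# Property values are generic assumptions, not vendor data.

AIR_VOL_HEAT = 1.2       # J/(L*K): air at ~25 C (rho ~1.2 kg/m^3, cp ~1005 J/(kg*K))
FLUID_VOL_HEAT = 1600.0  # J/(L*K): assumed mineral-oil-class dielectric coolant

def flow_needed_l_per_s(heat_w, vol_heat_j_per_lk, delta_t_k):
    """Volumetric flow required to carry heat_w watts at a delta_t_k rise."""
    return heat_w / (vol_heat_j_per_lk * delta_t_k)

heat, dt = 1000.0, 10.0  # remove 1 kW with a 10 K coolant temperature rise
print(round(flow_needed_l_per_s(heat, AIR_VOL_HEAT, dt), 1))    # → 83.3 (L/s of air)
print(round(flow_needed_l_per_s(heat, FLUID_VOL_HEAT, dt), 4))  # → 0.0625 (L/s of fluid)
```

Under these assumptions, carrying away 1 kW takes over 1,000 times less volumetric flow of dielectric fluid than of air, which is why a modest pump can replace banks of fans.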

Replacing a bath of air with a dielectric fluid offers significant benefits. There are no thermal barriers or heat exchangers in the chassis, so cooling efficiency can be more than 90 percent better than air and significantly better than cold plates or in-row cooling systems. Almost all root causes of server failure are eliminated:

No rack or chassis fans to fail
No oxidation/corrosion of electrical contacts
Reduction in thermal fluctuations, which drive solder joint failures
Much lower operating temperatures for the board and components
No fretting corrosion of electrical contacts induced by structural vibration
No exposure to electrostatic discharge events
No sensitivity to ambient particulate, humidity, or temperature conditions

Some immersive systems are single phase where the dielectric fluid remains a liquid throughout the heat dissipation cycle. In others a two-phase system is used where the fluid boils and is then condensed. In both cases a closed cycle is used to dissipate heat, which facilitates energy reclamation.
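A back-of-envelope comparison shows why two-phase systems can move more heat per litre of fluid: a single-phase system absorbs only sensible heat (ρ·cp·ΔT), while a two-phase system also absorbs the latent heat of boiling (ρ·h_fg). The property values below are generic assumptions for illustration, not data for any specific coolant:

```python
# Heat absorbed per litre of coolant: single-phase (sensible only)
# vs two-phase (latent heat of vaporization).
# All property values are illustrative assumptions.

def single_phase_kj_per_l(rho_kg_per_l, cp_kj_per_kg_k, delta_t_k):
    """Sensible heat absorbed per litre over a delta_t_k temperature rise."""
    return rho_kg_per_l * cp_kj_per_kg_k * delta_t_k

def two_phase_kj_per_l(rho_kg_per_l, h_fg_kj_per_kg):
    """Latent heat absorbed per litre when the fluid boils."""
    return rho_kg_per_l * h_fg_kj_per_kg

# Assumed single-phase oil: rho ~0.8 kg/L, cp ~2.0 kJ/(kg*K), 10 K rise
print(single_phase_kj_per_l(0.8, 2.0, 10.0))  # prints 16.0 (kJ per litre)
# Assumed two-phase fluorocarbon: rho ~1.5 kg/L, h_fg ~100 kJ/kg
print(two_phase_kj_per_l(1.5, 100.0))         # prints 150.0 (kJ per litre)
```

Either way, the closed loop delivers the heat to a condenser or heat exchanger at a usefully high temperature, which is what makes energy reclamation practical.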

Infrastructure benefits resulting from immersive liquid cooling can add up. For example, Facebook data centers rely on evaporative cooling to eliminate chillers and increase energy efficiency. Comparing a recently constructed 7.5-megawatt Facebook data hall with an immersive liquid cooling alternative is illustrative.

In addition to the obvious space and power benefits, the immersive cooling alternative eliminates the need to furnish and maintain large room air handlers, a dehumidification system, water treatment equipment and air filtration. All of these factors contribute to an estimated initial capital cost saving of $40 million per data center and ongoing maintenance cost savings every hour the data center operates.

If your company does not plan to extend the life of its existing data center, but instead plans new construction, a prefabricated modular data center (PFM) makes sense from a capital efficiency standpoint. Immersive liquid cooling is by far the best way to cool PFMs, because rack density can be increased by a factor of five, power requirements can be reduced by as much as 40 percent, and no water is needed to dissipate heat.

Consider a Fortune 100 company that spent $160 million to build a 20-megawatt data center, a cost that included power and cooling infrastructure but not IT equipment. The plan was to fill the building over several years, and 5 megawatts of IT equipment were installed the first year. Using immersive-cooled PFMs, it would have cost less than $10 million to accommodate those 5 megawatts, freeing $150 million; at the lower capital commitment, the build decision would have devolved from the Board to the CFO. Furthermore, it is probable that IT equipment installed in the fourth year will be incompatible with power and cooling infrastructure built five years earlier. PFMs enable modular expansion as computing needs grow and equipment evolves.

There is a perception that immersive liquid cooling is expensive and messy. Cost and scalability will be important to widespread adoption, as will a user-friendly design that enables quick, clean rack maintenance. At least one commercially available immersive cooling technology has solved these problems, and it costs less than air cooling. It is inevitable that air-cooled data centers will become obsolete as a matter of cost and performance; the question is when your company will make the change.