In this blog, we’ll discuss liquid cooling vs air cooling in data centers.
With the rise of new technologies for data centers, and entirely new types of data centers in all manner of locations, it’s not surprising that systems and procedures change over time.
When it comes to cooling efficiency, the debate between liquid cooling and air cooling in data centers is an old one, and air cooling has dominated in recent years.
But what’s the real difference and which one will have a greater, more positive effect on your business?
This article will explore that difference as well as the specific situations that call for different cooling methods.
For a long time, air cooling has been the preferred method in data centers. And for the most part, air cooling is still a viable option.
Given low electronic densities and affordable energy prices, blowing cold air across electronics generally works.
At the design level, engineers have worked to improve heat exchangers in CPUs.
But electronics have become more and more compact, and high equipment densities are now the norm, making better cooling methods imperative.
Particularly as energy prices have increased, the inefficiency of air cooling has come under scrutiny.
The trick is to save energy and not lose cooling power; everyone wants to be as energy-efficient as possible.
Some solutions include more targeted air circulation through thermal imaging and computational fluid dynamics.
Oftentimes it’s as simple as re-examining basic data center architecture. But despite its popularity, the flaws in air cooling are showing more and more.
Not Enough Energy to Go Around
Simply put, air is just not a very effective heat transfer medium, and that’s becoming more and more apparent across the industry.
Air cooling usually relies on large fans. But most data centers are strapped for space as it is, with equipment continually increasing in density.
That means less room for fans, which in turn means data centers can’t move cool air through equipment at an efficient rate.
Not to mention the extra fins and heat sinks needed on hot spots to help transfer that heat away from its source and into the air.
Standard computer room air conditioning often isn’t enough anymore.
Add in the rising costs of energy, and the result is rather expensive.
Utilizing water may be a more cost-effective approach moving forward.
Liquids conduct and carry heat far better than air, which means that even a room-temperature liquid can cool more effectively than cold air.
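As a rough back-of-the-envelope check, here’s a short Python sketch comparing textbook thermophysical properties of air and water at around room temperature. The property values are standard reference figures, but they vary with temperature and pressure, so treat the ratios as order-of-magnitude only.

```python
# Rough comparison of air vs water as coolants, using textbook
# property values at ~25 C (illustrative, not a system design).

AIR = {"density": 1.2, "specific_heat": 1005, "conductivity": 0.026}
WATER = {"density": 997, "specific_heat": 4186, "conductivity": 0.6}
# units: kg/m^3, J/(kg*K), W/(m*K)

def volumetric_heat_capacity(fluid):
    """Heat absorbed per cubic metre per kelvin of temperature rise."""
    return fluid["density"] * fluid["specific_heat"]  # J/(m^3*K)

capacity_ratio = volumetric_heat_capacity(WATER) / volumetric_heat_capacity(AIR)
conductivity_ratio = WATER["conductivity"] / AIR["conductivity"]

print(f"Water carries ~{capacity_ratio:.0f}x more heat per unit volume than air")
print(f"Water conducts heat ~{conductivity_ratio:.0f}x better than air")
```

The takeaway: per unit volume moved, water absorbs thousands of times more heat than air, which is why even modest liquid flow rates can replace very large volumes of forced air.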
It may very well be the future industry standard.
Though air cooling has been more common over time, liquid cooling is on the rise.
Particularly as technology becomes more advanced, certain situations require more power.
However, generally speaking, general-purpose CPU performance hasn’t improved drastically in the last decade or so.
At the data center level, improvements to individual systems entail the implementation of accelerator processors: graphics processing units (GPUs), application-specific integrated circuits (ASICs), and field-programmable gate arrays (FPGAs).
Hardware accelerators are often used in machine learning, but industry trends suggest they’ll be used more often in IT services like:
- Data mining
- Video/live media
- Engineering simulation
- Load balancing
- Fraud detection
The Price of Power
In the context of these trends, there are some important considerations for the future of power/heating.
Hardware accelerators like GPUs need a much higher rate of cooling than standard CPUs because of their significantly higher thermal design power (TDP).
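To see why TDP drives cooling requirements, here’s a minimal Python sketch using the basic heat-balance relation Q = ρ · V̇ · c_p · ΔT, solved for the coolant flow rate. The 300 W heat load and 10 K allowed temperature rise are assumed illustrative figures, not specs for any particular chip.

```python
# Sketch: coolant flow rate needed to carry away a chip's heat load.
# From Q = rho * V_dot * c_p * dT  =>  V_dot = Q / (rho * c_p * dT).
# The 300 W / 10 K numbers below are illustrative assumptions.

def flow_rate_m3s(heat_watts, density, specific_heat, delta_t_k):
    """Volumetric coolant flow (m^3/s) to remove heat_watts
    with a delta_t_k rise in coolant temperature."""
    return heat_watts / (density * specific_heat * delta_t_k)

HEAT_LOAD = 300.0  # W, e.g. a high-TDP accelerator (assumed)
DELTA_T = 10.0     # K, allowed coolant temperature rise (assumed)

air_flow = flow_rate_m3s(HEAT_LOAD, 1.2, 1005, DELTA_T)     # air at ~25 C
water_flow = flow_rate_m3s(HEAT_LOAD, 997, 4186, DELTA_T)   # water at ~25 C

print(f"Air:   {air_flow * 2118.88:.0f} CFM")    # m^3/s -> cubic feet/min
print(f"Water: {water_flow * 60000:.2f} L/min")  # m^3/s -> litres/min
```

Roughly fifty cubic feet of air per minute versus a fraction of a litre of water per minute for the same heat load: scale that across a rack of accelerators and the appeal of liquid becomes obvious.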
A demand for more powerful chips means demand for better cooling and efficiency.
Accelerators can help cut down on rack density in data centers, one of the biggest drivers of cooling needs.
For many workloads, a GPU paired with an Intel processor can deliver more output at lower density than a CPU-only setup, which also helps avoid excess power consumption and heat.
Immersion cooling works well with both GPUs and CPUs.
This makes liquid cooling an effective method when using accelerators like GPUs.
Cooling the Edge
In recent years, demand for better, faster performance from data centers has pushed the industry toward a new type of facility.
These new facilities exist and operate on the network edge.
They can be in remote locations, engineered for machine learning and filled with high-density computer hardware.
Heavy workloads in confined spaces require better cooling methods, especially at remote sites.
Not all edge data centers require liquid cooling, but it offers the opportunity for lower energy consumption.
That means more edge data centers could be deployed in areas with limited power sources. Plus, traditional air cooling simply won’t always be practical in more remote areas.
Did you know that Microsoft has an underwater data center? Learn more from our blog: Everything You Need to Know About the Microsoft Underwater Data Center.
High-Density Storage Cooling
Efficiently cooling storage can be a difficult task. But as storage density continues to increase, liquid cooling may offer a solution.
Though non-sealed hard disk drives cannot be cooled using liquid (and they make up a big part of installed storage in data centers), newer trends offer a solution.
Most newer generations of storage hardware are sealed units because they’re helium-filled.
Since they’re sealed, they’re safe for liquid cooling. Additionally, solid-state drives can be cooled with full-immersion solutions.
With that in mind, the need to separate air-cooled storage from liquid-cooled processing would essentially be eliminated.
Plus the effects of heat and humidity on components can be reduced by immersing drives in cooling fluids.
Many approaches to liquid cooling are catchall solutions, which can make it difficult to find the right path to a full move to liquid cooling.
Rather than adopting it on a case-by-case basis and risking potential shutdowns and setbacks, the transition needs to be a manageable, feasible operation.
But since water has between 50 and 1,000 times air’s capacity to remove heat, the decision is a no-brainer.
Every data center is different, but operators should at least look into the potential of a switch.
Given a smooth transition, implementing liquid cooling should be cost-effective and non-disruptive.
It might be easier to stick with forced air in the short term, especially because a transition to liquid cooling can be more expensive at first.
But weigh the long-term effectiveness of liquid cooling, along with lower energy costs and better data center efficiency, and the answer is clear.
The point is, every data center’s situation is unique.
However, most data centers face the same obstacles when it comes to improving operational efficiency, and the choice between liquid cooling and air cooling is central to them.
Taking a closer look at your data center’s current situation and future plans will help you decide on what cooling options are best for you moving forward.
And if you find that any cooling improvements will involve a major overhaul, you may require a full-on data center decommission.
In that case, be sure to enlist the help of a certified IT asset disposition company.
At Exit Technologies, we offer full IT equipment services, from asset recovery and network equipment sales to recycling, data erasure, and complete data center decommissioning.
Have something to add? Let us know your thoughts in the comments below!