This Data Center Energy Efficiency Best Practices Guide was made to help data centers curb their primary operational cost: energy consumption. As data consumption continues its exponential growth, data centers will need to maximize their energy efficiency to avoid being crippled by power costs. This article will cover the best practices that data centers can follow to improve their energy efficiency.
First and Foremost: Measure to Improve
As Peter Drucker is so often quoted as saying:
“If you can’t measure it, you can’t improve it.”
Before you consider any of the other strategies in this article, it’s important that you have a measuring system in place already.
Google is so intent on measuring accurately that it samples its power usage effectiveness (PUE) at least once per second. By measuring across the full year, you account for seasonal weather variations and their effects on the data center’s cooling consumption. The Center of Expertise for Energy Efficiency in Data Centers has a helpful guide on measuring data center energy efficiency.
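To make the metric concrete, here is a minimal sketch of an energy-weighted PUE calculation over a set of power samples. The sample values and function name are illustrative, not from any particular monitoring system:

```python
# Minimal sketch: compute annual PUE from periodic power samples.
# Assumes two synchronized sample lists (facility kW, IT kW); the
# numbers below are illustrative.

def annual_pue(facility_kw: list[float], it_kw: list[float]) -> float:
    """Energy-weighted PUE: total facility energy / total IT energy.

    Summing over a full year of samples smooths out seasonal swings
    in cooling load, rather than averaging instantaneous PUE values.
    """
    if len(facility_kw) != len(it_kw) or not it_kw:
        raise ValueError("need equal-length, non-empty sample lists")
    return sum(facility_kw) / sum(it_kw)

# Example: cooler months run more efficiently than warmer ones.
facility = [120.0, 150.0, 135.0]   # total facility draw (kW)
it_load  = [100.0, 100.0, 100.0]   # IT equipment draw (kW)
print(round(annual_pue(facility, it_load), 2))  # 1.35
```

Note that the energy-weighted ratio (total over total) differs from the average of per-sample PUE values; the former is the figure normally reported.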
Cooling equipment is responsible for a large portion of a data center’s power consumption. For data center infrastructure specialists, finding new ways to improve power usage effectiveness (PUE) is critical, and therefore cooling is of primary interest.
Of the cooling equipment, chillers and computer room air conditioners (CRACs) consume the most energy, so minimizing their workload is critical for efficiency.
Standard Energy Efficiency Best Practices for Easy Wins
- For most data centers, hot aisle/cold aisle containment is a popular way to improve air cooling efficiency and is easy to retrofit into existing facilities. It helps eliminate hot spots and the chaotic mixing of hot and cold air.
- To better segregate the aisles, a common best practice is to block off empty rack slots with “blanking plates,” preventing heat leakage.
- In recent years, racks with attached liquid cooling units have been able to take some of the load off air cooling for a fraction of the energy cost, without requiring a data center retrofit.
- To bolster traditional containment-style air cooling, economizers are readily available to utilize “free” cooling where the climate permits, reducing power consumption further. Just keep in mind that outside air can carry contaminants and humidity, so filter or abstain accordingly.
- Another common method is to simply raise the data center’s ambient temperature. Most equipment functions perfectly well with a cold aisle above 26°C, and raising the ambient temperature can greatly reduce cooling needs.
- Computational fluid dynamics (CFD) can be used to create a thermal model of your data center and optimize cooling and airflow without necessarily reorganizing the data center. CFD can be used to test how the system responds to various conditions and to identify problem areas for improvement.
- CFD can also be used to great effect when designing a data center. Keep in mind that modelling is an incredibly complex process that interpolates vast amounts of data, and the tools do have limitations as of this writing in 2018. They are not really meant for real-time monitoring, and some inaccuracy is inevitable with heavier computations.
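The setpoint-raising practice above can be sanity-checked with a back-of-the-envelope estimate. The ~4% cooling-energy savings per degree Celsius used below is an assumed rule-of-thumb coefficient, not a measured figure; validate it against your own plant’s performance curves:

```python
# Rough sketch: estimate cooling energy saved by raising the cold-aisle
# setpoint. The 4%-per-degree-C coefficient is an assumed rule of thumb,
# not a vendor or measured figure.

def cooling_savings_kwh(baseline_cooling_kwh: float,
                        setpoint_raise_c: float,
                        savings_per_degree: float = 0.04) -> float:
    """Return estimated annual cooling energy saved (kWh)."""
    fraction = min(savings_per_degree * setpoint_raise_c, 1.0)
    return baseline_cooling_kwh * fraction

# Example: 500 MWh/yr of cooling, raising the setpoint from 22C to 26C.
print(cooling_savings_kwh(500_000, 4))  # 80000.0
```

Even with a conservative coefficient, a few degrees of headroom translates into a meaningful fraction of the cooling bill.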
While the practices above are widely adopted as standard, many data centers haven’t updated their infrastructure recently or captured these easy efficiency wins. Here’s a diagram from submer.com to illustrate the cold aisle/hot aisle concept:
Aside from optimizing existing cooling paradigms, there are several other methods of cooling which can provide a significant efficiency boost beyond the standard practices.
One of these methods is direct evaporative cooling.
Direct evaporative cooling
Direct evaporative cooling (DEC) uses misting to provide the substrate for evaporation. If you’ve ever seen fans blowing mist in Las Vegas to cool off poolside vacationers, you’ve seen this mechanism in action.
Here’s a diagram of how direct evaporative cooling works from dchuddle.com:
According to Anne Wood, an executive at Phoenix MFG, Inc., it is not uncommon to realize a 50% increase in efficiency from DEC. However, there are a few considerations:
- Obviously, to use water for cooling, the data center will need access to a reasonable volume of water.
- DEC also requires a system to purify the water, store backup water, and pump the mist, plus regulators to control water flow and pressure.
- DEC introduces humidity into the data center, which makes it more viable in some use cases than others. In particular, data centers in drier climates with access to water would likely benefit from a DEC solution. Before committing to one, humidity sensors and predictive modeling can help predict whether DEC will push ambient room humidity past the standard limit of 60%.
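The cooling DEC can deliver follows the standard evaporative effectiveness relation: supply temperature approaches the wet-bulb temperature as effectiveness approaches 1. The 0.85 effectiveness below is an assumed typical value, not a figure from any specific product:

```python
# Sketch of the standard evaporative-cooling effectiveness relation:
#   T_supply = T_db - e * (T_db - T_wb)
# where T_db is dry-bulb temp, T_wb is wet-bulb temp, and e is the
# cooler's effectiveness. The 0.85 default is an assumed typical value.

def dec_supply_temp_c(dry_bulb_c: float, wet_bulb_c: float,
                      effectiveness: float = 0.85) -> float:
    """Supply air temperature after direct evaporative cooling (deg C)."""
    return dry_bulb_c - effectiveness * (dry_bulb_c - wet_bulb_c)

# Dry-climate example: 38C dry bulb, 18C wet bulb. A large spread
# between the two means plenty of evaporative headroom.
print(dec_supply_temp_c(38.0, 18.0))  # 21.0
```

This is also why DEC favors dry climates: the wet-bulb temperature sets a hard floor on how cold the supply air can get, and humid air has a wet-bulb temperature close to its dry-bulb temperature.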
Indirect evaporative cooling is a somewhat less common, but equally viable solution using the same concept.
Indirect Evaporative Cooling
Indirect evaporative cooling takes warm air from outside the data center and passes it through a heat exchanger. The heat exchanger facilitates evaporation, cooling the air as it’s sent into the data center, while humidity and heat are expelled back out of the exchanger. The trade-off is that a heat exchanger loses a few degrees of cooling compared to direct evaporative cooling, and the indirect approach requires two fans instead of one.
The benefit of indirect evaporative cooling is that it does not introduce any humidity or outside elements into the data center environment, which may or may not be a concern.
- Where outside particulates or humidity are a concern, indirect evaporative cooling is generally more appropriate.
- Data centers with access to water and unconcerned with outside elements will enjoy greater efficiency from direct evaporative cooling.
The most innovative of recent cooling strategies however, is immersion cooling.
Alibaba has committed to using immersion cooling in its data centers, estimating it will save 75% of the space, increase power density, and reduce operational costs by 20%. As far as data center energy efficiency best practices go, immersion cooling will very soon be a standard item on the list.
Alibaba’s immersion cooling tech from datacenterdynamics.com:
Despite showing such significant promise to reduce data center PUE, the idea of liquid immersion is uncomfortable for many IT companies.
As the technology improves and more early adopters come forward, it’s become apparent that immersion cooling is not only viable but may be imperative if data centers are to keep up with rising efficiency demands.
In high-density data center deployments, immersion cooling will likely be essential, but existing data centers are slower to adopt the technology.
Some of the considerations with immersion cooling are expense, mess, infrastructure retrofitting, hard drive compatibility, vendor compatibility, weight, safety, floor space, and resource consumption. While these are valid concerns for extensive retrofit projects, they no longer present drawbacks sufficient to prevent widespread adoption in future deployments.
We’ll take a high-level look at each one.
Expense
Historically, immersion cooling came with cost premiums that offset many of its energy efficiency advantages, especially for retrofit use cases. Factors responsible for these cost premiums include:
Oil and other dielectric fluids can increase maintenance labor requirements, though this varies depending on the solution provider.
In the past, universally viable immersion solutions were not available, necessitating a total data center retrofit or a from-scratch build to accommodate liquid cooling technologies. While it may still be impractical to gut a legacy data center for new cooling tech, immersion cooling is the more efficient choice for most new data center builds.
Hard Drive Compatibility
Standard hard drives cannot be submerged in liquid cooling systems. However, sealed spinning-disk drives are an option, and solid-state drives make up a growing share of storage in modern data centers. Additionally, modified drive caddies can be used to keep drives above the oil’s surface.
Vendor Compatibility
In the past, vendors would void warranties on submerged equipment, but as immersion cooling has become more established in data center infrastructure, many vendors no longer void warranties for immersion cooling usage.
Additionally, many immersion cooling solutions, like GRC’s, are compatible with every major server vendor and most rack setups.
Weight
Weight has historically been a challenge for liquid immersion cooling, particularly for rack-mounted solutions. The weight can be a valid concern, but more as a result of system density than the weight of the fluid itself.
Additionally, air cooling infrastructure is no longer necessary with immersion cooling, which significantly reduces total weight by replacing heavy CRACs, chillers, economizers, and so on.
With the floor loading capabilities of modern data centers and space savings of immersion cooling, weight shouldn’t be a preventative concern.
Safety
Most immersion cooling systems use non-flammable, non-toxic fluids, though slip hazards from fluid spills are worth being conscientious of.
Floor Space
One of the bigger value adds of immersion cooling is the floor space savings it provides. Immersion cooling is space-efficient and eliminates the need for all of the infrastructure that air cooling requires.
Resource Consumption Recap: Cooling Systems
Immersion cooling consumes radically less energy and water than other cooling options. As data centers’ share of world resource consumption grows, it will be ever more critical to improve the resource efficiency of our data center infrastructure.
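Water consumption has its own standard metric alongside PUE: Water Usage Effectiveness (WUE, from The Green Grid), defined as annual site water use in liters per kWh of IT energy. A minimal sketch, with illustrative figures rather than real facility data:

```python
# Sketch: Water Usage Effectiveness (WUE) = annual site water use (L)
# divided by annual IT energy (kWh). The figures below are illustrative,
# not real facility data.

def wue(annual_water_liters: float, annual_it_kwh: float) -> float:
    """Liters of water consumed per kWh of IT energy."""
    return annual_water_liters / annual_it_kwh

# Evaporative cooling plant vs a closed-loop immersion system,
# both serving 20 GWh/yr of IT load.
print(round(wue(40_000_000, 20_000_000), 2))  # 2.0
print(round(wue(1_000_000, 20_000_000), 2))   # 0.05
```

Tracking WUE alongside PUE keeps a water-hungry "efficient" cooling design from looking like a free win.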
Based on all of the relevant considerations, many corporations are realizing the potential superiority of immersion cooling, with one going as far as building an entire data center underwater. While it may not be viable for your data center, give serious consideration to the potential of immersion cooling for your hardware.
Power Distribution
It’s estimated that a third of server energy is wasted before it ever gets used for computation.
Much of it is lost in the power supply, where AC is converted to DC, and in the voltage regulators, which step the PSU’s output down to the voltages that microchips use. As a result, investing in efficient power supplies and voltage regulators is key.
One small change is to place the backup batteries on the server racks themselves and cut out one AC-to-DC conversion stage.
Another best practice is to keep power at higher voltages for as much of the distribution path as possible, stepping down to lower voltages as close to the load as you can, which reduces line loss.
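The effect of cascaded conversion stages multiplies, which is why eliminating even one stage matters. The stage efficiencies below are illustrative assumptions, not vendor figures:

```python
# Sketch: cascaded conversion losses between the grid and the chips.
# Each stage's efficiency is an illustrative assumption, not a
# vendor-specified figure.

from math import prod

def delivered_fraction(stage_efficiencies: list[float]) -> float:
    """Fraction of input power that survives every conversion stage."""
    return prod(stage_efficiencies)

# Double-conversion UPS -> PSU (AC->DC) -> voltage regulator.
legacy = [0.90, 0.85, 0.85]
# Rack-level batteries remove one conversion stage from the UPS path.
rack_battery = [0.96, 0.85, 0.85]

print(round(delivered_fraction(legacy), 3))        # 0.65
print(round(delivered_fraction(rack_battery), 3))  # 0.694
```

In this illustrative chain, roughly a third of the input power is lost before computation, matching the estimate above, and trimming one conversion stage recovers several percentage points.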
Batteries: Li-ion batteries, VRLA, or Nickel-Zinc?
Though likely not the first thing you think about when data center energy comes up, batteries are an integral part of power within data centers. In this section we’ll go over three of the primary battery options.
Valve-regulated lead acid (VRLA) batteries have been the standard for years, but they do have drawbacks compared to other options:
- VRLA is less energy efficient than other options
- VRLA is significantly hazardous – the electrolyte (sulfuric acid) within is corrosive, while the lead component can cause nervous system damage among other issues
- Shorter battery life
- Slower to recharge
- Fewer discharge cycles
Li-ion batteries on the other hand:
- Have a greater up front cost
- Provide better efficiency
- Last longer – often the length of the UPS’s life
- Have lower self-discharge, faster charging, and more discharge cycles
- Are flammable (shouldn’t exceed 104°F / 40°C)
Nickel Zinc Batteries (From ZincFive):
- Charge faster than both
- Have more discharge cycles than both
- Comparable battery life to Li-ion
- Higher power density
- Low heat output
- Not flammable or toxic
- Comparable cost to high-end lead systems, and a lower price tag than high-output Li-ion batteries
While Nickel-Zinc technology itself is nothing new, it has not previously been applied to data centers in this way. It may prove a safer, more environmentally friendly option for data centers.
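The trade-offs listed above can be encoded as a small comparison table, which is handy when screening UPS options. The rankings below are qualitative summaries of the bullets in this section, not vendor specifications:

```python
# The battery trade-offs above, encoded as a small lookup table.
# Rankings are qualitative summaries of this section, not vendor specs.

from dataclasses import dataclass

@dataclass
class BatteryChemistry:
    name: str
    upfront_cost: str    # relative up-front cost
    cycle_life: str      # relative number of discharge cycles
    flammable: bool

options = [
    BatteryChemistry("VRLA", "low", "fewest cycles", False),
    BatteryChemistry("Li-ion", "high", "more cycles", True),
    BatteryChemistry("Nickel-Zinc", "moderate", "most cycles", False),
]

# Example screen: rule out chemistries with fire-suppression implications.
non_flammable = [b.name for b in options if not b.flammable]
print(non_flammable)  # ['VRLA', 'Nickel-Zinc']
```

A structured table like this also makes it easy to add site-specific columns (floor loading, ventilation requirements, replacement interval) before making a call.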
Server Virtualization
With server virtualization, a data center doesn’t need as many servers to handle its workload. With fewer servers, total energy consumption can be greatly reduced. While beyond the scope of this article, Gartner recently put out an update on the virtualization landscape.
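A rough consolidation estimate makes the savings tangible. The demand, core counts, and utilization target below are illustrative assumptions, not measurements from a real fleet:

```python
# Sketch: estimate how many hosts a virtualized fleet needs versus a
# sprawl of lightly loaded physical servers. All figures below are
# illustrative assumptions.

import math

def hosts_needed(total_vcpu_demand: float, cores_per_host: int,
                 target_utilization: float) -> int:
    """Hosts required to serve the demand at a target utilization."""
    return math.ceil(total_vcpu_demand / (cores_per_host * target_utilization))

before = 200                      # lightly loaded physical servers
after = hosts_needed(total_vcpu_demand=800, cores_per_host=32,
                     target_utilization=0.60)
print(after)                      # 42
print(f"servers eliminated: {before - after}")
```

Every eliminated server saves its idle power draw plus the cooling load that draw implies, which is where the big efficiency gains come from.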
AI and Deep Learning
AI can be invaluable in the data center, with Google citing a 40% reduction in cooling energy after letting DeepMind loose on its data centers. While it’s not likely that Google will be sharing DeepMind as an open-source giveaway any time soon, other companies like Verdigris offer potential solutions for leveraging deep learning for data center energy efficiency.
It is inevitable that other solutions will pop up to provide similar capabilities in the future, though the field is fairly sparse for now.
It’s important to note that deep learning and AI-based systems to improve energy efficiency are not the same as Data Center Infrastructure Management (DCIM) tools.
DCIM is fundamentally different from AI in that it leaves all of the agency in human hands. DCIM tools can be used to optimize data center operations, but given the gargantuan mass of data they collect, human operators simply can’t process and act on every insight the way an AI can.
That being said, energy-specific DCIM systems like Schneider Electric’s EcoStruxure IT can prove incredibly helpful, and more accessible than the fruits of AI’s labors. With increased visibility into the goings-on of your data center, you can make more informed decisions – at least until the deep-learning programs can handle it all for us.
Many organizations have servers powered on but doing nothing useful; DCIM tools allow managers and administrators to find these servers and clean up poorly optimized workloads, as well as easily manage building and environment controls for better energy efficiency. Don’t rely solely on the tools, however: best practices for data center energy efficiency still involve manual inspections to catch insights the network tools may miss.
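The comatose-server hunt that DCIM data enables can be sketched in a few lines: flag hosts whose CPU utilization never climbs above a threshold. The field names and threshold below are assumptions about a generic monitoring export, not a specific DCIM schema:

```python
# Sketch: flag "comatose" servers from a utilization export -- hosts
# whose CPU never exceeded a low threshold over the sample window.
# Field names and the 5% threshold are illustrative assumptions.

def flag_idle_servers(samples: dict[str, list[float]],
                      threshold_pct: float = 5.0) -> list[str]:
    """Return hostnames whose utilization never exceeded the threshold."""
    return sorted(host for host, cpu in samples.items()
                  if cpu and max(cpu) < threshold_pct)

metrics = {
    "web-01": [35.0, 60.2, 48.9],
    "legacy-batch-07": [0.4, 1.1, 0.9],   # powered on, doing nothing useful
    "db-02": [22.5, 30.1, 27.8],
}
print(flag_idle_servers(metrics))  # ['legacy-batch-07']
```

In practice you would sample over weeks, not three points, and confirm by hand before decommissioning; the manual inspection advice above applies here too.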
Conclusions: Improving Efficiency and Power Cost Savings
By utilizing the techniques, strategies, and technologies in this guide to data center energy efficiency best practices, you can steadily move closer to the near-perfect 1.0 PUE approached by companies like Google and Facebook, and with that, enjoy the operational cost savings and lower environmental impact that come with a more efficient data center. While there is a lot here, there’s no need to tackle everything at once.
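To put a dollar figure on a PUE improvement, multiply the overhead reduction by hours and tariff. The IT load, PUE values, and electricity price below are illustrative assumptions:

```python
# Sketch: annual electricity savings from a PUE improvement at a
# constant IT load. Load, PUE values, and tariff are illustrative.

def annual_savings_usd(it_load_kw: float, pue_before: float,
                       pue_after: float, usd_per_kwh: float) -> float:
    """Annual cost difference from reducing PUE at a constant IT load."""
    hours = 8760  # hours in a year
    delta_overhead_kw = it_load_kw * (pue_before - pue_after)
    return delta_overhead_kw * hours * usd_per_kwh

# 1 MW of IT load, PUE improved from 1.8 to 1.4, at $0.10/kWh.
print(round(annual_savings_usd(1000, 1.8, 1.4, 0.10)))  # 350400
```

Running this with your own facility’s numbers is a quick way to prioritize which of the practices above to tackle first.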
Do you know your cooling falls behind industry standards? Perhaps reach out to solutions providers to improve that area.
Do you lack true visibility to help you manage your data center effectively? Begin by evaluating the DCIM tools market to help you make informed decisions.
Need to clear floor space and retire aging servers? Look into your server virtualization options with your team. Regardless of the area, consistent effort on lagging areas will nearly always pay off in OPEX or environmental impact improvements.
Moving Forward: Everything Must be Considered for Future Proofing of New Data Centers
With data centers consuming more and more energy, we need to be cognizant of our impact on infrastructure, our energy systems, and the world. Retrofitting existing data centers may only make sense in limited capacities, as many solutions are neither cost-effective nor worth the effort for existing facilities.
Where these practices really matter is when retiring existing data centers and building out new ones. If you’re considering what type of servers to get for a new data center, make sure to check out our posts on white box servers and comparing HPE vs Dell servers to find what’s best for your needs.
Greater energy efficiency will make data centers more cost-effective for both operators and end users, while lowering each individual facility’s impact. As demand for data center capacity continues to rise, it will be increasingly important to have efficient systems in place to balance power needs and keep costs down for everyone.
As long as we put serious effort to improve our energy efficiency, we can keep our environmental impact to a minimum and enjoy this big blue marble in the sky we call home for years to come.