Ice cream and servers both need care, though maybe not the same environment.
Ice cream season is here in full force this hot, sticky Boston August. My favorite is pistachio, not the green stuff but the white kind, like Ben & Jerry’s Pistachio Pistachio. Pistachio is among my wife’s least favorite flavors, which means I get it all to myself; she picks her own, which these days has the words salted caramel in it. What does ice cream have to do with data center servers? Apparently data center managers keep their server rooms very cold, which means both require a significant amount of energy to maintain their integrity.
Figure 1 (Left, Link to Source) and Figure 2 (Right, Link to Source) have something in common: both require a significant amount of energy to keep them cool and happy.
According to the July/August 2015 issue of Data Center Dynamics magazine (Link to Source), one data center professional states that server rooms with server inlet temperatures below 24°C (75°F) are a sign of paranoia. Author Peter Judge notes that data center managers are overworking their cooling systems and overcooling their spaces, “wasting vast quantities of energy and - ironically - contributing to global warming and melting the world’s polar ice caps.” Judge points out that despite ASHRAE’s widely accepted guidelines allowing server inlet temperatures up to 27°C (80°F), enterprise data centers are “seriously lagging” in raising temperatures. If the polar ice caps are melting, can ice cream be far behind?
An early 2015 IDC survey contacted 404 US data center managers with 100 or more physical servers. The results found that 75% were operating below 24°C (75°F), with only 5% at or above 27°C (80°F). The article further notes that these sites have PUEs between 2.4 and 2.8, meaning roughly 60% to 65% of the power they draw goes to cooling and other overhead rather than to IT equipment.
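The overhead figure follows directly from the definition of PUE (total facility power divided by IT equipment power): the IT share of total power is 1/PUE, so everything else is 1 − 1/PUE. A quick back-of-the-envelope check of the article’s numbers:

```python
def non_it_fraction(pue: float) -> float:
    """Fraction of total facility power NOT delivered to IT equipment.

    PUE = total facility power / IT equipment power, so the IT share
    of total power is 1/PUE and the overhead share (cooling, lighting,
    power conversion losses) is 1 - 1/PUE.
    """
    return 1.0 - 1.0 / pue

for pue in (2.4, 2.8):
    print(f"PUE {pue}: {non_it_fraction(pue):.0%} of power is overhead")
# PUE 2.4 -> 58% overhead, PUE 2.8 -> 64% overhead
```

That works out to about 58% to 64%, broadly consistent with the article’s 60% to 65% figure.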
The paranoia stems from risk aversion, a common trait among data center operators who increasingly guarantee greater and greater uptime and availability to their customers as a competitive advantage. There is significant inertia against changing anything without knowing the consequences, and experimenting on a money-making data center is not generally allowed. Additionally, the author notes an upcoming study that surprisingly shows raising data center temperatures can actually increase energy usage, likely because server fans spin faster to assist cooling at higher temperatures.
The fact is that complex systems and environments like data centers can present surprises. The granularity of temperature mapping needed to visualize hot and cold spots in data centers is often not available, and even where it is, shifting server utilization can change the picture over time. Modeling such a system can be very complex, possibly too complex for existing modeling tools, and simpler static models may not be as useful as expected.
Figure 3 (Left, Link to Source), a Rube Goldberg competition entry, looks complex but is very simple and predictable compared to Figure 4 (Right, Link to Source), a dynamic data center temperature and airflow CFD model.
The author provides some alternatives to consider. Ensuring hot and cold aisle containment is in place will help improve the efficiency of traditional air conditioning technology. Employing direct outside air cooling can help in many if not most locations, even if the approach is used for only a portion of the day or year. And adding evaporative cooling on the cooling coils when outside temperatures rise can also help, though such an upgrade may not be physically feasible and can require increased maintenance.
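Whether direct outside air cooling pays off at a given site comes down to how many hours per year the outside air is cool enough to use. A minimal sketch of that feasibility check, using hypothetical randomly generated temperatures and an assumed 24°C economizer limit (a real analysis would use typical meteorological year data for the actual site):

```python
import random

random.seed(42)
# Hypothetical hourly dry-bulb temperatures for one year (degrees C).
# Stand-in for real site weather data, which this sketch does not have.
hourly_temps = [random.gauss(12.0, 9.0) for _ in range(8760)]

# Assumed maximum outside-air temperature at which the economizer
# can still supply acceptable server inlet air.
ECONOMIZER_LIMIT_C = 24.0

# Count the hours in the year when outside air alone could do the cooling.
free_hours = sum(1 for t in hourly_temps if t <= ECONOMIZER_LIMIT_C)
print(f"Outside air usable for {free_hours / 8760:.0%} of the year")
```

Even a partial-year result can matter: every economizer hour is an hour the compressors are off, which is exactly the “portion of the day or year” benefit the author describes.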
The fact is data center cooling is a complex system with hundreds if not thousands of moving parts, including site climate and weather. What works in one location and with one set of equipment may not provide the same results at a different site. Server, IT, HVAC, and energy equipment manufacturers can perform experiments, but not everywhere and under every condition. The semiconductor industry found that industry consortia can provide significant insight into advanced practices in building design and environmental control. This requires equipment suppliers to work together in a pre-competitive environment, as well as support from regional governmental entities. Lawrence Berkeley National Laboratory (LBL) performs experiments for the data center industry, but it is only one organization in one location. Perhaps as the industry continues to move toward standardization and consolidates, there will be the will to cooperate. For the sake of the industry’s image, the environment, and ice cream lovers worldwide, one can only say that sooner is better.
Temperature@lert provides cost-effective, reliable, fault-tolerant wired and wireless temperature monitoring solutions for organizations of all sizes. Our products and services can help bring peace of mind to small and mid-sized companies and their data centers with minimal training or effort. For information about Temperature@lert’s Cellular and SensorCloud offerings, visit our website at http://www.temperaturealert.com/ or call us at +1-866-524-3540.