Free Air, Hot Racks, and Cool Liquids

Handling our gear's heat has always been an issue for installations large and small. ICT equipment typically took 1x-2x as much energy to remove its heat as it took to power it in the first place (PUE of 2.0+), driving both energy costs and carbon footprints. Early efforts focused on the two obvious tactics: making both the ICT gear and the air conditioning more efficient. These are now being augmented by innovative new approaches to the problem, ranging from seawater cooling to variable-speed fan retrofits.
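Since PUE figures recur throughout this post, a quick illustrative calculation may help. The numbers below are hypothetical, not from any facility mentioned here:

```python
# Power Usage Effectiveness (PUE) = total facility energy / ICT equipment energy.
# A PUE of 2.0+ means every watt of compute takes at least another watt of
# overhead (mostly cooling). Figures below are made up for illustration only.

def pue(total_facility_kw: float, ict_kw: float) -> float:
    """Total facility power divided by ICT (IT equipment) power."""
    return total_facility_kw / ict_kw

# Legacy facility: 1 MW of ICT load plus 1.1 MW of cooling and other overhead.
legacy = pue(total_facility_kw=2100, ict_kw=1000)    # 2100/1000 = 2.1

# Free-air-cooled facility: same ICT load, far less overhead.
free_air = pue(total_facility_kw=1150, ict_kw=1000)  # 1150/1000 = 1.15

print(f"legacy PUE:   {legacy:.2f}")
print(f"free-air PUE: {free_air:.2f}")
```

Lower is better; 1.0 would mean all facility power goes to the ICT gear itself.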

'Free air cooling', 'air side cooling', or 'air side economization' goes beyond efficient chilling of recirculated interior air, using outside air to provide low-energy cooling. This has made locations where the air stays below 13°C most hours of the year particularly attractive.
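The economization decision itself can be sketched in a few lines. This is a toy dry-bulb control strategy with assumed thresholds; real economizer controllers also check humidity or enthalpy before admitting outside air:

```python
# Minimal sketch of an airside-economizer mode decision, assuming a simple
# dry-bulb strategy. Setpoint and deadbands are illustrative assumptions.

def cooling_mode(outside_c: float, supply_setpoint_c: float = 18.0) -> str:
    if outside_c <= supply_setpoint_c - 2.0:
        return "free cooling"           # outside air alone is cold enough
    elif outside_c <= supply_setpoint_c + 4.0:
        return "partial economization"  # mix outside air with mechanical cooling
    return "mechanical cooling"         # recirculate and chill

for t in (-5.0, 13.0, 20.0, 30.0):
    print(f"{t:5.1f}°C outside -> {cooling_mode(t)}")
```

With these assumed thresholds, the 13°C climate mentioned above lands squarely in free-cooling mode for most of the year.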

Running equipment rooms under 25°C has long been the standard in data center cooling, but new technologies and operating practices are pushing that limit. Higher room temperatures allow both less cooling via conventional techniques and the use of free air at higher exterior temperatures. The American Society of Heating, Refrigerating and Air-Conditioning Engineers' Technical Committee 9.9 on Mission Critical Facilities, Technology Spaces and Electronic Equipment (ASHRAE TC9.9) has raised its recommended maximum to 27°C, but the industry is already pushing beyond that. SGI/Rackable, for example, claims its CloudRack C2 cabinet can support server room temperatures up to 40°C, and Microsoft's new Dublin data center is said to be running at 35°C.* Google, on the other hand, typically runs its data centers at ~27°C.

Industry progress is reflected in Emerson Network Power's 2014 report Datacenter 2025. The top concern of surveyed operators in 2007 was "Heat Density". This had fallen to the number two concern by 2010 and to number three by 2013.

A related topic, covered in other posts, is the re-use of data center waste heat by external facilities.

Cautionary Notes

Innovative cooling technology alone will not lower energy costs; it must also be properly operated. US EPA data suggests many data centers do not properly operate their economizers.

White paper authors from Dell and APC take issue with the "higher is better" hypothesis: "It is anticipated that increasing data center temperatures will reduce the…energy consumption by the data center cooling infrastructure. However, the dynamic nature of IT Equipment cooling fans may diminish or even negate the cooling system gains. Server fans will typically respond to a demand for increased airflow as inlet temperature to the server reaches or exceeds 25°C (77°F), consequentially increasing server energy consumption…In the three scenarios studied, the lowest energy use occurred anywhere between 24°C (76°F) and 27°C (81°F). The trigger in each case is the IT fan power increasing and exceeding the incremental decreases in the energy required to cool."
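The tradeoff the white paper describes can be illustrated with a toy model. All coefficients below are assumptions for illustration, not figures from the Dell/APC study; the one physical anchor is that fan power grows roughly with the cube of fan speed:

```python
# Toy model: raising the room setpoint cuts chiller energy, but once server
# inlet temperature passes ~25°C the server fans ramp up, and fan power grows
# roughly with the cube of fan speed. Coefficients are illustrative only.

def cooling_kw(setpoint_c: float) -> float:
    # Assume chiller load falls ~6 kW per °C of setpoint increase.
    return max(0.0, 230.0 - 6.0 * (setpoint_c - 20.0))

def fan_kw(setpoint_c: float) -> float:
    # Fans idle at 10 kW; above 25°C, speed ramps 15% per °C (cube law).
    speed = 1.0 + 0.15 * max(0.0, setpoint_c - 25.0)
    return 10.0 * speed ** 3

# Find the setpoint minimizing total (cooling + fan) power.
best = min(range(20, 36), key=lambda t: cooling_kw(t) + fan_kw(t))
for t in (22, best, 32):
    print(f"{t}°C: cooling {cooling_kw(t):5.1f} kW + fans {fan_kw(t):5.1f} kW")
```

Even with made-up coefficients, the minimum lands in the mid-20s °C: past that point the cubic fan-power growth overwhelms the linear chiller savings, matching the paper's 24-27°C sweet spot.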

Facebook's Prineville data center (below) had some initial problems with humidity and condensation, causing server shutdowns. The problems and their solutions are presented in an Open Compute Project (OCP) blog. Innovation is rarely without challenges, and the payoff in PUE for Facebook has been substantial.

Field Implementations

2015

Uptime Institute awarded Swisscom AG a 2015 Brill Award for data center design, EMEA region. "Swisscom's Wankdorf data center combines adiabatic re-cooling with rainwater, which completely eliminates the need for mechanical chillers and harmful refrigerants, and significantly reduces energy use. Swisscom cools using redundant air-conditioning units equipped with separate cold-water networks to cool its ICT rooms. Hybrid recoolers and redundant circulating pumps produce and transfer cool water for CRACs. When the outside temperature exceeds 21° C, treated water is introduced into the stream of warm air to extract heat through water evaporation (adiabatic cooling)." Wankdorf is also the latest European data center to feed its waste heat into a district heating utility. Visit Swisscom Wankdorf Data Center pages for more information.

Several vendors now offer liquid-cooled servers, an older concept making a comeback.

2012

Data centers are not the only ICT facilities using innovative cooling. A British broadcaster is using free air cooling to lessen the HVAC load from studio lighting and equipment. A California postproduction company is using cold air bypass valves in its edit suites to cool the office space. More

2011

Facebook's Prineville data center: "The built-in penthouse houses the chiller-less air conditioning system that uses 100% airside economization and evaporative cooling to maintain the operating environment." More.

You can see examples of various hot/cold aisle containment techniques in this article's slideshow of German data centers.

Six Degrees Group won the 2011 Datacenter Leaders Award for Improved Data Center Efficiency. "...retrofit improvements...led to a 32% reduction in Power Usage Effectiveness (PUE). Six Degrees Group undertook a series of innovative improvements to reduce energy consumption, including:
-Replacing air conditioning units with Ambicool free-cooling system
-Introducing indirect and direct evaporation cooling
-Replacing legacy CRAC fans
Together, these improvements reduced PUE by one-third from 1.85 to 1.26 and delivered a return on investment in less than 12 months."
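The cited figures can be checked directly; the drop from 1.85 to 1.26 works out to roughly 32%, consistent with the "one-third" in the citation:

```python
# Quick arithmetic check of the award citation's PUE figures.
before, after = 1.85, 1.26
reduction = (before - after) / before
print(f"PUE reduction: {reduction:.1%}")  # roughly 32%
```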

Click here for Canadian data center design that proposes "seasonal ice cooling" and a Norwegian facility that cools with seawater.

2010

Voonami (UT-USA) has a new data center using evaporative cooling that is "along with other engineering innovations, …expected to trim energy costs by 80 percent over a typical giant data center." The technology, particularly suited to dry climates, is planned to be the sole cooling mechanism for about eight months of the year.

European operator Telecity received the 2010 Datacenter Leaders Award for Improved Data Center Efficiency. "...energy-saving intelligent fans [could] be retro-fitted easily. The bottom line was that by upgrading fans that run slower or can switch off at times, a 10% energy saving could be achieved. An investment of £1m on variable speed drives brought these savings return within a year."

2009

The Google/Belgium and Microsoft/Dublin data centers both combine higher operating temperatures and air side cooling to eliminate the massive air conditioning units (chillers) typically associated with large data centers. They are referred to as 'chillerless' data centers. (See more about these and other mega data centers.)

According to HP, its Wynyard (UK) data center uses the "legendary cold wind blowing off England's North Sea. This glacier-cooled coastal air, often bone-chillingly icy, is being innovatively harnessed into a new technology tool: lowering temperatures of IT equipment and plant rooms for an anticipated annual energy saving of 40 percent compared to conventional data centres." HP lists these features:
- Eight 2.2m diameter fans in each of the four halls in the data centre used to supply air and another eight used to exhaust air
- A mixing chamber in the facility recirculates air to maintain conditions in the 5m-high pressurised plenum below the computer equipment
- Humidification and cooling coils in the data centre to tune the outside air condition and remove contaminants

Innovative cooling solutions are not just for larger centers. Associated Banc-Corp, a regional bank serving the upper midwestern United States, converted cooling at two data centers from compressors to heat exchangers and expects to save US$115,000 per year in energy costs.

Interxion's Stockholm data center combines free air cooling and seawater cooling with waste heat reuse and 100% renewable energy. "[In 2009, we] designed and implemented one of the first seawater cooling systems that not only reduced energy costs on the Stockholm campus by 80 percent, but also lowered our PUE from 1.6 to 1.09." This PUE is better than that of many mega data centers, which can use scale to drive efficiency.