The term ‘free cooling’ has been used more and more in recent years in relation to data centre cooling. However, on the basis that there is no such thing as a free lunch, we suspect there’s no such thing as FREE cooling either. In this blog, Alan Beresford, Technical and Managing Director at EcoCooling explains the true costs of so-called free cooling.
If we took a rack of servers and put it in a field in a cool climate like the UK’s, we could just about claim we had totally free cooling. In truth, though, there is still the cost and power usage of the two or more blower fans in every server. So even on a cool day, in a cool field, free cooling isn’t actually free!
Let’s move on to the real data centre scenario. Provided the external fresh air is below 18-20C, and provided we can force enough of it through the data hall, then we have nearly-free cooling.
However, we now need big fans to blow around 10 cubic metres per second of fresh air through the data hall for every 100kW of IT load (about 30 to 50 racks). So, in addition to the server blower fans, we’re going to need power for these big air movers.
In practice we also need filtration, which increases air resistance and therefore the fan power requirement. We also need evaporative cooling (where the fresh air is cooled by the evaporation of water) to deal with the few days per year when the outside temperature is over 20C.
So the power budget is now up to 3-4kW per 100kW of IT load – a PUE of 1.03 to 1.04. Whilst this is still not ‘free’, it’s massively cheaper than the conventional refrigeration-based cooling systems that have been deployed for the last twenty years or more.
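As a rough sanity check on those figures, here’s a minimal Python sketch. The air density and specific heat are typical textbook values, and the ~8C air temperature rise through the servers is an illustrative assumption rather than a figure from any specific installation:

```python
# Minimal sketch: how much air moves how much heat, and what the fans cost in PUE terms.

AIR_DENSITY = 1.2   # kg/m^3, typical at ~20C
AIR_CP = 1005.0     # specific heat of air, J/(kg.K)

def airflow_needed_m3s(it_load_kw: float, delta_t_c: float) -> float:
    """Volumetric airflow (m^3/s) needed to carry it_load_kw of heat away
    with a delta_t_c rise in air temperature."""
    return it_load_kw * 1000.0 / (AIR_DENSITY * AIR_CP * delta_t_c)

def pue(it_load_kw: float, overhead_kw: float) -> float:
    """Power Usage Effectiveness: total facility power divided by IT power."""
    return (it_load_kw + overhead_kw) / it_load_kw

print(round(airflow_needed_m3s(100.0, 8.3), 1))  # ~10.0 m^3/s per 100kW, as above
print(round(pue(100.0, 3.5), 3))                 # 1.035 - the 1.03-1.04 PUE quoted
```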
As yet, probably less than two per cent of data centres are cooled with fresh-air and evaporative cooling. And whilst a lot more could be, it’s not appropriate for every data centre. But we’ll cover that later.
Refrigeration dominates:
Refrigeration-based cooling systems come in a number of formats. The main examples are:
DX CRACs – where a DX (direct expansion) compressor and heat exchanger sit inside a CRAC (computer room air conditioning) unit within the data centre hall. Pipework containing a refrigerant connects the CRAC to a fan-assisted condenser unit outside. The refrigeration unit inside the CRAC extracts the heat from the data centre’s hot air, and the hot refrigerant then carries it to the condenser, where the heat is expelled into the atmosphere.
Chilled-water systems – where a refrigeration unit generally sits outside the data centre. This uses the standard compressor, evaporator and condenser-plus-fans model of refrigeration, but requires an additional heat exchanger to chill a water circuit that transports low-temperature water either to a data-hall CRAH (computer room air handling) unit or to in-rack solutions like rear-door coolers (where yet another heat exchanger with fans extracts the heat from the data-hall air).
A legacy chilled-water refrigeration system can use up to 100 per cent of the IT load – that’s 100kW of cooling power per 100kW of IT load, or a PUE of 2.0!
Modern refrigeration systems have benefited significantly from variable-speed fans and consume somewhat less.
Refrigeration with free cooling:
A lot of manufacturers have realised, rightly, that in temperate countries there are a large number of days each year when the outside air is theoretically cool enough to cool the data centre without the refrigeration system running – and hence to save significant power and energy cost.
The trouble with this idea, however, is two-fold. Firstly, you still need to power the internal and external fans and pumps.
Secondly, in ‘free cooling’ mode a system designed for refrigeration is fairly inefficient. As a result, ‘free cooling’ days are not those with temperatures up to 18-20C: the inlet air temperature to the chiller unit would in theory need to be below 14C.
In practice, however, external chiller units are generally installed in ‘chiller farms’, either on the roof or on the ground, and there can be significant leakage of hot exhaust air back into the chiller inlet. This means the inlet is almost never below 14C. So, in some installations, despite the theory, you’ll get practically zero ‘free cooling’ days.
Higher temperature, hidden cost:
Theoretically, you can get more ‘free cooling’ days from a system if you increase the server inlet temperature. Under recently relaxed guidelines from ASHRAE (which sets data centre cooling standards), the inlet temperature to the servers can be elevated from 18C to 27C.
With, say, a 10C heat differential from front to back of the servers, this means the exhaust air will be at around 37C.
However, ASHRAE’s own published figures show that server component reliability is quite badly reduced – typically 30% more heat-related failures at 27C than at 20C inlet air temperature. Not great in a mission-critical facility.
It’s a little-known fact that the servers themselves use more energy at higher supply-air temperatures, largely because their internal blower fans have to spin faster. So a consequence of using high inlet temperatures to maximise cooling-plant efficiency can be a rise of around 3% in server energy use. The PUE may look good, but the actual operating cost may go up.
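To see how that plays out, here’s an illustrative comparison. The 3% server-power uplift is the figure above; the cooling overheads are hypothetical numbers chosen purely to show how PUE and total power can move in opposite directions:

```python
# Hypothetical comparison: a higher inlet temperature lets the cooling plant
# work less (overhead falls), but the servers themselves draw ~3% more power.

def facility(server_kw: float, cooling_kw: float) -> tuple[float, float]:
    """Return (total facility power in kW, PUE)."""
    total = server_kw + cooling_kw
    return total, total / server_kw

total_20c, pue_20c = facility(100.0, 5.0)  # 20C inlet, assumed 5kW cooling overhead
total_27c, pue_27c = facility(103.0, 3.0)  # 27C inlet: servers +3%, overhead falls

print(total_20c, round(pue_20c, 3))  # 105.0 kW, PUE 1.05
print(total_27c, round(pue_27c, 3))  # 106.0 kW, PUE 1.029 - 'better' PUE, bigger bill
```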
And then there are the implications for the engineers, and for all the power and data cabling, patching and network switches housed at the back of the rack – none of which were designed to work in ambient temperatures near 40C.
Indirect air cooling:
Some data centre operators are beginning to understand that in many situations, though not all, it’s OK to blow filtered fresh air through the data hall and servers.
Indirect air systems are a compromise: they use an air-to-air heat exchanger to keep the data-hall air separate from the external fresh air.
If, say, your data centre is close to a chemical works, or in an inner city full of exhaust fumes, indirect air can make sense. But the downside is that, with two air circuits, you need two sets of fans, and the convoluted airflow path increases the air resistance in both circuits. Add up all of the fan power and the practical usage is more like 10-15kW per 100kW of IT load – a PUE of 1.10 to 1.15.
Most indirect air systems, even those that use evaporative cooling, also need refrigeration for some days of the year, adding the cost of a refrigeration system to the expensive heat exchangers.
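Pulling together the overhead figures quoted so far – all per 100kW of IT load, and indicative rather than measured – the PUE picture looks roughly like this:

```python
# Indicative PUEs from the overhead figures in this article, per 100kW of IT
# load; mid-points are taken where the text gives a range.

overheads_kw = {
    "legacy chilled water": 100.0,            # up to 100% of the IT load
    "indirect air (two fan circuits)": 12.5,  # 10-15kW of total fan power
    "direct fresh air + evaporative": 3.5,    # 3-4kW of fan power
}

for system, cooling_kw in overheads_kw.items():
    print(f"{system}: PUE ~{(100.0 + cooling_kw) / 100.0:.3f}")
# legacy chilled water: PUE ~2.000
# indirect air (two fan circuits): PUE ~1.125
# direct fresh air + evaporative: PUE ~1.035
```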
Nearest to free:
I’d love to be able to tell you that direct fresh-air cooling is the panacea for nearly-free cooling. But, sadly, I have to say it’s only for some people – because you can’t deploy direct fresh-air cooling at every site, nor in every climate.
EcoCooling now has evidence from 200 installations, and from research studies by Cambridge University, showing that internal data-hall air can meet clean-room standards and ASHRAE humidity requirements without any need for dehumidification.
And all of those 200 data centres have been able to operate for 365 days/year without any need for refrigeration back-up.
But even at a PUE of 1.05 to 1.10, it’s still not quite free!
Guest blog by Alan Beresford, Technical and Managing Director at EcoCooling
For more information please contact sales@ecocooling.org or +44 (0) 1284 810586