Performance?

Yes, we did monitor performance, but there was simply nothing worth reporting: the results at a 20°C inlet are almost identical to those at a 40°C inlet. The only difference lower temperatures could make is a slight increase in the amount of time spent at higher Turbo Boost frequencies, but we could not measure any significant difference. The reason, of course, is that some of our VMs are also somewhat disk intensive.

Conclusion

The PUE-optimized servers can sustain up to a 40°C inlet temperature without a tangible increase in power consumption. It may not seem spectacular, but it definitely is. The "PUE optimized" servers are simply improved versions of standard designs; they do not need any expensive technology to sustain high inlet temperatures. As a result, the Supermicro SuperServer 6027R-73DARF costs only around $1300.

That means that even an older data center can save a massive amount of money by simply making sure that some sections only contain servers that can cope with higher inlet temperatures. An investment in air-side or water-side economizers could result in very large OPEX savings.

Reliability was beyond the scope of this article and the budget of our lab. But previous studies, for example by IBM and Google, have also shown that reasonably high inlet temperatures (lower than 40°C) have no significant effect on the reliability of the electronics.

Modern data centers should avoid servers that cannot cope with higher inlet temperatures at all costs, as the cost savings of free cooling range from significant to enormous. We quote a study done by Intel on a real-world data center:

"67% estimated power savings using the (air) economizer 91% of the time—an estimated annual savings of approximately USD 2.87 million in a 10MW data center"

A simple, solid, and very affordable no-frills server that allows you to lower your cooling costs is a very good deal.
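To put Intel's figure in perspective, below is a minimal back-of-envelope sketch of the savings arithmetic in Python. Only the 67% savings, the 91% economizer utilization, and the 10MW facility size come from the quote; the cooling share of facility power and the electricity rate are our own hypothetical assumptions.

```python
# Back-of-envelope check of the economizer savings quoted above.
# Only the 67%, 91%, and 10 MW figures come from the Intel quote;
# the cooling share and electricity rate are hypothetical assumptions.

FACILITY_POWER_MW = 10.0   # total facility power (from the quote)
COOLING_SHARE = 0.45       # assumed fraction of facility power spent on cooling
ECONOMIZER_SAVINGS = 0.67  # cooling power saved while the economizer runs (quote)
ECONOMIZER_UPTIME = 0.91   # fraction of the year the economizer can run (quote)
USD_PER_KWH = 0.12         # assumed commercial electricity rate

HOURS_PER_YEAR = 8760

cooling_kw = FACILITY_POWER_MW * 1000 * COOLING_SHARE
saved_kwh = cooling_kw * ECONOMIZER_SAVINGS * ECONOMIZER_UPTIME * HOURS_PER_YEAR
saved_usd = saved_kwh * USD_PER_KWH

print(f"Assumed cooling load: {cooling_kw:,.0f} kW")
print(f"Energy saved/year:    {saved_kwh:,.0f} kWh")
print(f"Estimated savings:    ${saved_usd:,.0f}/year")
```

With these assumed inputs the sketch lands at roughly $2.9 million per year, in the same ballpark as the USD 2.87 million Intel reports; the exact figure obviously shifts with the electricity rate and the cooling share.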

Comments

  • iTzSnypah - Tuesday, February 11, 2014 - link

    I wonder why nobody has tried geothermal liquid cooling. You could do it two ways: either with a geothermal heat pump setup, or cut out the middleman and just use the earth like you would a radiator in a liquid cooling loop. The only problem would be how many wells you would have to drill to cool up to 100MW (I'm thinking 20+ at a depth of at least 50ft).
  • ShieTar - Tuesday, February 11, 2014 - link

    It's easier to just use a nearby river than to dig for and pump up groundwater. That's what power stations and big chemical factories do. For everybody else, air cooling is just easier and less expensive.
  • iTzSnypah - Tuesday, February 11, 2014 - link

    You wouldn't be drilling for water. You drill a well so you can put pipe in it, fill it back in, and then pump water through the pipes, using the earth's constant temperature (~20°C) to cool your liquid, which is warmer (>~30°C).
  • looncraz - Tuesday, February 11, 2014 - link

    I experimented with this (mathematically) and found that heat soak is a serious, variable concern. If the new moisture is coming from the surface, this is not as much of an issue, but if it isn't, you could have a problem in short order. Then there are the corrosion and maintenance issues...

    The net result is that it is cheaper and easier to just install a few ten-thousand-gallon coolant holding tanks, keep them cool (but above ambient), and use them to cool the air in the server room(s). These tanks could be put inside a hill or in the ground for extra insulation, and a surface radiator system could allow using cold outside air to save energy.
  • superflex - Wednesday, February 12, 2014 - link

    You obviously don't have a clue about drilling costs.
    For a 2,000 sq. ft. home, a geothermal driller needs 200-300 linear feet of well bore to cool the house. In unconsolidated material, drilling costs range from $15-$30/foot, depending on the rig. For drilling in rock, up the cost to $45/foot.
    For something that uses 80,000x more power than a typical home, what do you think the drilling costs would be? (A rough scaling of these numbers is sketched after the thread.)
    Go back to heating up Hot Pockets.
  • chadwilson - Wednesday, February 19, 2014 - link

    That last statement was totally unnecessary. Your perfectly valid point was tarnished by your awful attitude.
  • nathanddrews - Tuesday, February 11, 2014 - link

    Small scale, but really cool. Use PV to power your pumps...

    http://www.overclockers.com/forums/showthread.php?...
  • Sivar - Tuesday, February 11, 2014 - link

    Geothermal heat pumps are only moderately more efficient than standard air conditioning and require an enormous amount of area. 20 holes at a depth of 50ft would handle the cooling requirements for a large residential home, but wouldn't even approach the requirements for a data center.
    One related possibility is to drill to a nearby aquifer and draw cool water, run it through a heat exchanger, then exhaust warm water into the same aquifer. Unfortunately, water overuse has drained aquifers such that even the pumping costs would be substantial, and the aquifers will eventually be drained to the point that vacuum-based pumps can no longer draw water.
  • rkcth - Tuesday, February 11, 2014 - link

    They are a lot more efficient at heating, but only mildly more efficient at cooling. They also really store heat in the ground in the summer and take it back in the winter, so if you only store heat you can actually have a problem long-term. You're essentially using the ground as a long-term heat storage device, since the ground is between 50-60°F depending on your area of the country, but use of the geothermal system changes that temperature. An air source makes much more sense, since you share the air with everyone else and the heat essentially just blows away.
  • biohazard918 - Tuesday, February 11, 2014 - link

    Wells don't use vacuum-based pumps; most aquifers are much too deep for that. Instead, you stick the pump at the bottom of the well and push the water to the surface.
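
As a rough check of superflex's scaling argument above, here is a minimal sketch of the drilling-cost arithmetic in Python. The per-home bore footage and the per-foot prices are the figures from that comment, the 80,000x multiplier is the comment's own, and the assumption that bore footage scales linearly with heat load is a simplification.

```python
# Rough scaling of the per-home geothermal drilling figures from the
# comment above to a data center. Assumes bore footage scales linearly
# with heat load, which is a simplification.

HOME_BORE_FEET = (200, 300)  # linear feet of bore per home (from the comment)
COST_PER_FOOT = {"unconsolidated material": (15, 30),  # USD/foot (from the comment)
                 "rock": (45, 45)}                     # USD/foot (from the comment)
POWER_RATIO = 80_000         # data center vs. typical home (comment's multiplier)

low_feet = HOME_BORE_FEET[0] * POWER_RATIO
high_feet = HOME_BORE_FEET[1] * POWER_RATIO
print(f"Bore needed: {low_feet/1e6:.0f}-{high_feet/1e6:.0f} million feet")

for ground, (lo, hi) in COST_PER_FOOT.items():
    print(f"{ground}: ${low_feet*lo/1e6:,.0f}M - ${high_feet*hi/1e6:,.0f}M in drilling alone")
```

Even at the low end that is on the order of $240 million in drilling, which is the point the comment makes.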
