Loading the Server

The server first gets a few warm-up runs, after which we measure for a period of about 1000 seconds. The blue lines represent the measurements done with the Xeon E5-2650L; the orange/red lines represent those done with the Xeon E5-2697 v2. We test with three settings:

  • No heating. Inlet temperature is about 20-21°C, regulated by the CRAC
  • Moderate heating. We regulate until the inlet temperature is about 35°C
  • Heavy heating. We regulate until the inlet temperature is about 40°C

We start with a stress test: what kind of CPU load do we attain? Our objective is to test a realistic load for a virtualized host, between 20% and 80% CPU load. Peaks above 80% are acceptable, but long periods of 100% CPU load are not.
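
For readers who want to reproduce this kind of run, here is a minimal logging sketch (not the tooling actually used for this article) that samples overall CPU load and a CPU temperature reading once per second for roughly 1000 seconds and writes them to a CSV file. It assumes a Linux host with the psutil package and a coretemp-style sensor exposed to the OS; the file name loadlog.csv is just a placeholder.

```python
# Minimal measurement-loop sketch (illustrative only, not our actual test harness).
# Assumes a Linux host with the psutil package and a temperature sensor visible to the OS.
import csv
import time

import psutil

DURATION_S = 1000   # measurement window, roughly matching the ~1000 s runs above
INTERVAL_S = 1.0    # sample once per second

def cpu_temp():
    """Return the first available CPU temperature reading in degrees C, or None."""
    for entries in psutil.sensors_temperatures().values():
        if entries:
            return entries[0].current
    return None

with open("loadlog.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["elapsed_s", "cpu_load_pct", "cpu_temp_c"])
    start = time.time()
    while time.time() - start < DURATION_S:
        # cpu_percent() with an interval blocks for that long and averages over it
        load = psutil.cpu_percent(interval=INTERVAL_S)
        writer.writerow([round(time.time() - start, 1), load, cpu_temp()])
```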

There are some small variations between the different tests, but the load curve is very similar on the same CPU. The 2.7GHz 12-core Xeon E5-2697 v2 sees a CPU load between 1% and 78%; during peak load, it stays between 40% and 80%.

The 8-core 1.8GHz Xeon E5-2650L is not as powerful and shows peak loads of 50% to 94%.

Let's check out the temperatures. The challenge is to keep the CPU temperature below the specified Tcase.

The low-power Xeon stays well below the specified Tcase: even though it starts at 55°C when the inlet is set to 40°C, the CPU never reaches 60°C.

The results on our 12-core monster are a different matter. With an inlet temperature of up to 35°C, the server is capable of keeping the CPU below 75°C (see the red line). When we increase the inlet temperature to 40°C, the CPU starts at 61°C and quickly rises to 80°C. Peaks of 85°C are measured, very close to the specified maximum of 86°C. Those values are acceptable, but at first sight there seems to be little headroom left.

The most extreme case would be to fill up all disk bays and DIMM slots and to set the inlet temperature to 45°C. Our heating element is not capable of sustaining a 45°C inlet, but we can get an idea of what would happen by measuring how hard the fans are spinning.
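
One way to log those fan readings out of band is to poll the BMC. The sketch below is a hedged illustration (not necessarily how our numbers were gathered): it assumes ipmitool is installed, that the BMC exposes its fan sensors via `ipmitool sdr type Fan`, and that the usual pipe-separated output ends each fan line with an RPM reading; the exact sensor names and column layout vary by vendor.

```python
# Hedged sketch: poll fan RPM via the BMC using ipmitool.
# Assumes ipmitool is installed and fan lines look like "FAN1 | 30h | ok | 7.1 | 5400 RPM".
import subprocess
import time

def read_fan_rpms():
    """Return a dict of {sensor_name: rpm} parsed from `ipmitool sdr type Fan`."""
    out = subprocess.run(["ipmitool", "sdr", "type", "Fan"],
                         capture_output=True, text=True, check=True).stdout
    rpms = {}
    for line in out.splitlines():
        fields = [f.strip() for f in line.split("|")]
        if len(fields) >= 5 and "RPM" in fields[-1]:
            rpms[fields[0]] = float(fields[-1].split()[0])
    return rpms

if __name__ == "__main__":
    for _ in range(10):          # ten samples, ten seconds apart
        print(time.strftime("%H:%M:%S"), read_fan_rpms())
        time.sleep(10)
```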

Comments

  • lwatcdr - Thursday, February 20, 2014 - link

    Here in South Florida it would probably be cheaper. The water table is very high and many wells are only 35 feet deep.
  • rrinker - Tuesday, February 11, 2014 - link

    It's been done already. I know I've seen it in an article on new data centers in one industry publication or another.
    A museum near me recently drilled dozens of wells under their parking lot for geothermal cooling of the building. Being large with lots of glass area, it got unbearably hot during the summer months. Now, while it isn't as cool as you might set your home air conditioning, it is quite comfortable even on the hottest days, and the only energy is for the water pumps and fans. Plus it's better for the exhibits, reducing the yearly variation in temperature and humidity. Definitely a feasible approach for a data center.
  • noeldillabough - Tuesday, February 11, 2014 - link

    I was actually talking about this today; the big cost for our data centers is air conditioning. What if we had a building up north (arctic) where the ground is always frozen, even in summer? Geothermal cooling for free, by pumping water through your "radiator".

    Not sure about the environmental impact this would do, but the emptiness that is the arctic might like a few data centers!
  • superflex - Wednesday, February 12, 2014 - link

    The enviroweenies would scream about you defrosting the permafrost.
    Some slug or bacteria might become endangered.
  • evonitzer - Sunday, February 23, 2014 - link

    Unfortunately, the cold areas are also devoid of people and therefore internet connections. You'll have to figure in the cost of running fiber to your remote location, as well as how the distance might affect latency. If you go into a permafrost area, there are additional complications, as constructing on permafrost is a challenge. A datacenter high in the mountains but close to population centers would seem like a good compromise.
  • fluxtatic - Wednesday, February 12, 2014 - link

    I proposed this at work, but management stopped listening somewhere between me saying we'd need to put a trench through the warehouse floor to outside the building, and that I'd need a large, deep hole dug right next to the building, where I would bury several hundred feet of copper pipe.

    I also considered using the river that's 20' from the office, but I'm not sure the city would like me pumping warm water into their river.
  • Varno - Tuesday, February 11, 2014 - link

    You seem to be reporting the junction temperature, which is what most measurement programs report, rather than the case temperature, which is impossible to measure directly without interfering with the results. How have you accounted for this in your testing?
  • JohanAnandtech - Tuesday, February 11, 2014 - link

    Do you mean case temperature? We did measure the outlet temperature, but it was significantly lower than the junction temperature. For the Xeon 2697 v2, it was 39-40°C at a 35°C inlet and 45°C at a 40°C inlet.
  • Kristian Vättö - Tuesday, February 11, 2014 - link

    Google's usage of raw seawater for cooling of their data center in Hamina, Finland is pretty cool IMO. Given that the specific heat capacity of water is much higher than air's, it is more efficient for cooling, especially in our climate where seawater is always relatively cold.
  • JohanAnandtech - Tuesday, February 11, 2014 - link

    I admit, I somewhat ignored the Scandinavian datacenters as "free cooling" is a bit obvious there. :-)

    I thought some readers would be surprised to find out that even in sunny California, free cooling is available most of the year.
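
A quick editorial footnote on the water-versus-air point raised in Kristian Vättö's comment above, using rough textbook values rather than anything measured for this article:

```latex
% Back-of-the-envelope volumetric heat capacities (approximate textbook values)
\rho_{\text{water}}\, c_{p,\text{water}} \approx 1000\ \tfrac{\mathrm{kg}}{\mathrm{m}^3} \times 4.18\ \tfrac{\mathrm{kJ}}{\mathrm{kg\,K}} \approx 4.2\ \mathrm{MJ/(m^3\,K)}
\qquad
\rho_{\text{air}}\, c_{p,\text{air}} \approx 1.2\ \tfrac{\mathrm{kg}}{\mathrm{m}^3} \times 1.0\ \tfrac{\mathrm{kJ}}{\mathrm{kg\,K}} \approx 1.2\ \mathrm{kJ/(m^3\,K)}
```

Per cubic metre and per kelvin of temperature rise, water therefore carries roughly 3500 times as much heat as air, which is why a modest seawater flow can do the work of a very large volume of chilled air.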
