Loading the Server

The server first gets a few warm-up runs, and then we measure over a period of about 1000 seconds. The blue lines represent the measurements done with the Xeon E5-2650L, the orange/red lines represent the Xeon E5-2697 v2. We test with three settings (a rough sketch of such a measurement loop follows the list):

  • No heating. Inlet temperature is about 20-21°C, regulated by the CRAC
  • Moderate heating. We regulate until the inlet temperature is about 35°C
  • Heavy heating. We regulate until the inlet temperature is about 40°C
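
As an illustration of what such a measurement loop can look like, the minimal Python sketch below samples CPU load and a CPU temperature sensor once per second over a ~1000-second window and writes the samples to a CSV file. This is not the lab's actual tooling; the warm-up handling, the "coretemp" sensor name, and the output format are all assumptions for illustration.

```python
# Hypothetical measurement sketch: after a warm-up pause, sample CPU load and CPU
# temperature once per second for ~1000 seconds, then write the samples to CSV.
# The "coretemp" sensor name is a platform-specific assumption (Linux + psutil).
import csv
import time

import psutil

WARMUP_S = 120        # assumed warm-up length, not from the article
MEASURE_S = 1000      # measurement window from the article


def cpu_temp_celsius():
    """Return the hottest reading from the coretemp sensor, or None if absent."""
    temps = psutil.sensors_temperatures().get("coretemp", [])
    return max((t.current for t in temps), default=None)


def main():
    time.sleep(WARMUP_S)  # let the warm-up runs finish
    samples = []
    start = time.time()
    while time.time() - start < MEASURE_S:
        load = psutil.cpu_percent(interval=1.0)   # % CPU load over the last second
        samples.append((round(time.time() - start), load, cpu_temp_celsius()))
    with open("measurements.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["elapsed_s", "cpu_load_pct", "cpu_temp_c"])
        writer.writerows(samples)


if __name__ == "__main__":
    main()
```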

First we start with a stress test: what kind of CPU load do we attain? Our objective is to test a realistic load for a virtualized host: between 20% and 80% CPU load. Peaks above 80% are acceptable, but long periods of 100% CPU load are not.
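
For readers who want to approximate a similar fluctuating load profile, the sketch below duty-cycles busy loops on every core. The 20-80% target band, the 100 ms time slice, and the 10-second re-randomization interval are assumptions for illustration; this is not the virtualized workload used in the article.

```python
# Hypothetical load generator: approximates a fluctuating CPU load by duty-cycling
# busy loops on every core. Runs until interrupted with Ctrl+C.
import multiprocessing as mp
import random
import time


def duty_cycle_worker(target_queue):
    """Burn CPU for target% of every 100 ms slice and sleep for the remainder."""
    target = 0.5
    while True:
        if not target_queue.empty():
            target = target_queue.get()
        busy_until = time.time() + 0.1 * target
        while time.time() < busy_until:
            pass                              # busy wait = CPU load
        time.sleep(0.1 * (1.0 - target))


if __name__ == "__main__":
    queues = [mp.Queue() for _ in range(mp.cpu_count())]
    workers = [mp.Process(target=duty_cycle_worker, args=(q,), daemon=True)
               for q in queues]
    for w in workers:
        w.start()
    # Re-randomize each core's target every 10 seconds, staying in the 20-80% band.
    while True:
        for q in queues:
            q.put(random.uniform(0.2, 0.8))
        time.sleep(10)
```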

There are some small variations between the different tests, but the load curve is very similar on the same CPU. The 2.7GHz 12-core Xeon E5-2697 v2 sees a CPU load between 1% and 78%; during the peak periods, the load is between 40% and 80%.

The 8-core 1.8GHz Xeon E5-2650L is not as powerful and reaches a peak load of 50% to 94%.

Now let's check out the temperatures. The challenge is to keep the CPU temperature below the specified Tcase.

The low-power Xeon stays well below the specified Tcase. Even though it starts at 55°C when the inlet is set to 40°C, the CPU never reaches 60°C.

The results on our 12-core monster are a different matter. With an inlet temperature of up to 35°C, the server is capable of keeping the CPU below 75°C (see the red line). When we increase the inlet temperature to 40°C, the CPU starts at 61°C and quickly rises to 80°C. We measured peaks of 85°C, which is very close to the specified maximum temperature of 86°C. Those values are acceptable, but at first sight there seems to be little headroom left.

The most extreme case would be to fill up all the disk bays and DIMM slots and to set the inlet temperature to 45°C. Our heating element is not capable of sustaining a 45°C inlet, but we can get an idea of what would happen by measuring how hard the fans are spinning.
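
Fan speed can be polled out of band over IPMI. The sketch below is a hypothetical example of such a poller (the sampling interval is an assumption, sensor naming varies per BMC, and this is not necessarily how the lab gathered its data): it shells out to ipmitool and keeps every sensor reported in RPM, so fan behaviour can be compared across inlet temperatures.

```python
# Hypothetical fan-speed poller: reads fan RPM sensors via `ipmitool sensor`.
# Requires local IPMI access; the pipe-delimited column layout is assumed.
import subprocess
import time


def read_fan_rpms():
    """Return {sensor_name: rpm} for every sensor reported in RPM."""
    out = subprocess.run(["ipmitool", "sensor"],
                         capture_output=True, text=True, check=True).stdout
    rpms = {}
    for line in out.splitlines():
        fields = [f.strip() for f in line.split("|")]
        if len(fields) >= 3 and fields[2] == "RPM":
            try:
                rpms[fields[0]] = float(fields[1])
            except ValueError:
                pass                      # sensor present but no reading ("na")
    return rpms


if __name__ == "__main__":
    for _ in range(10):                   # sample every 30 s for five minutes
        print(time.strftime("%H:%M:%S"), read_fan_rpms())
        time.sleep(30)
```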

Comments

  • extide - Tuesday, February 11, 2014

    Yeah, there is a lot of movement in this area these days, but the hard part is that at the low voltages used in servers (<=24V) you need a massive amount of current to feed several racks of servers, so you need massive power bars, and of course you can lose a lot of efficiency on that side as well (a quick worked example follows this comment thread).
  • drexnx - Tuesday, February 11, 2014

    afaik, the Delta DC stuff is all 48v, so a lot of the old telecom CO stuff is already tailor-made for use there.

    but yes, you get to see some pretty amazing buswork as a result!
  • Ikefu - Tuesday, February 11, 2014

    Microsoft is building a massive data center in my home state just outside Cheyenne, WY. I wonder why more companies haven't done this yet? It's very dry and days above 90F are few and far between in the summer. Seems like an easy cooling solution versus all the data centers in places like Dallas.
  • rrinker - Tuesday, February 11, 2014

    Building in the cooler climes is great - but you also need the networking infrastructure to support said big data center. Heck, for free cooling, build the data centers in the far frozen reaches of Northern Canada, or in Antarctica. Only, how will you get the data to the data center?
  • Ikefu - Tuesday, February 11, 2014

    It's actually right along the I-80 corridor that connects Chicago and San Francisco. Several major backbones run along that route, and it's why many mega data centers in Iowa are also built along I-80. Microsoft and the NCAR Yellowstone supercomputer are there, so the large pipe is definitely accessible.
  • darking - Tuesday, February 11, 2014

    We've used free cooling in our small datacenter since 2007. It's very effective from September to April here in Denmark.
  • beginner99 - Tuesday, February 11, 2014

    That map of Europe is certainly plain wrong. Spain especially, but also Greece and Italy, easily have some days above 35°C. It also happens a couple of days per year where I live, a lot further north than any of those.
  • ShieTar - Thursday, February 13, 2014

    Do you really get 35°C, in the shade, outside, for more than 260 hours a year? I'm sure it happens for a few hours a day in the two hottest months, but the map does cap out at 8500 out of 8760 hours.
  • juhatus - Tuesday, February 11, 2014

    What about wear and tear from running the equipment at hotter temperatures? I remember seeing a chart where higher temperature = shorter lifespan. I would imagine the OEMs have engineered a bit of margin for this, and warranties aside, it should be basic physics?
  • zodiacfml - Wednesday, February 12, 2014

    You just need a constant temperature and equipment that works at that temperature. Wear and tear mainly happens when the temperature changes.
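
To put rough numbers on extide's point about current at low distribution voltages, here is a back-of-the-envelope calculation; the 20 kW rack and the 1 mΩ busbar resistance are illustrative assumptions, not figures from the article or the comments.

```python
# Back-of-the-envelope illustration: at low distribution voltages the current,
# and therefore the resistive loss in the busbars, grows quickly.
# The 20 kW rack and 1 milliohm busbar resistance are assumed example values.
RACK_POWER_W = 20_000
BUSBAR_RESISTANCE_OHM = 0.001

for volts in (12, 24, 48, 230):
    amps = RACK_POWER_W / volts                    # I = P / V
    loss_w = amps ** 2 * BUSBAR_RESISTANCE_OHM     # P_loss = I^2 * R
    print(f"{volts:>3} V: {amps:7.0f} A, ~{loss_w:6.0f} W lost in the busbar")
```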
