How We Tested

To determine the optimal balance between data center temperature and system cooling performance, we created a controlled temperature testing environment, called a "HotBox". Basically, we placed a server inside an insulated box. The box consists of two main layers: at the bottom is the air inlet, where a heating element is placed. The hot air is blown into the box and is then drawn into the front of the server on the second layer. This way we can simulate inlet air coming from below, as in most data centers. The inlet and outlet are separated and insulated from each other, simulating the hot and cold aisles. Two thermistors measure the inlet temperature, one on the right and one on the left, just behind the front panel.

Just behind the motherboard, close to the back of the server, a pair of thermistors monitors the outlet temperature. We'd like to thank Wannes De Smet, who designed the HotBox!
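
As a concrete illustration, the sketch below shows one way the four thermistor readings could be combined into average inlet and outlet temperatures and the temperature delta across the server. The read_thermistor() helper and the two-sensor averaging are assumptions for illustration, not the actual interface of our PCB.

```python
# Hypothetical sketch: combining the four HotBox thermistor readings.
def read_thermistor(channel: int) -> float:
    """Return the temperature in °C for one thermistor channel (placeholder)."""
    raise NotImplementedError("replace with the real PCB read-out")

def hotbox_temperatures() -> dict:
    # Two thermistors just behind the front panel measure the inlet air...
    inlet = (read_thermistor(0) + read_thermistor(1)) / 2
    # ...and two behind the motherboard measure the outlet air.
    outlet = (read_thermistor(2) + read_thermistor(3)) / 2
    return {"inlet": inlet, "outlet": outlet, "delta_t": outlet - inlet}
```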

The server is fed by a standard European 230V (16 Amps max.) power line. We use the Racktivity ES1008 Energy Switch PDU to measure power consumption. The measurement circuits of most PDUs assume that the incoming AC is a perfect sine wave, but it never is. The Racktivity PDU, however, measures true RMS current and voltage at a very high sample rate: up to 20,000 measurements per second for the complete PDU.
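
The sketch below (our own illustration, not the Racktivity firmware) shows why this matters: with simultaneously sampled waveforms, real power is the average of the instantaneous voltage-current products, while the product of RMS voltage and RMS current only gives apparent power, which overstates the real draw whenever the waveform is distorted or the load is not purely resistive.

```python
import numpy as np

def true_power(voltage: np.ndarray, current: np.ndarray) -> dict:
    """voltage/current: simultaneously sampled waveforms, e.g. at 20,000 samples/s."""
    v_rms = np.sqrt(np.mean(voltage ** 2))     # true RMS voltage (V)
    i_rms = np.sqrt(np.mean(current ** 2))     # true RMS current (A)
    real_power = np.mean(voltage * current)    # mean of instantaneous power (W)
    apparent_power = v_rms * i_rms             # volt-amperes (VA)
    return {"W": real_power, "VA": apparent_power,
            "PF": real_power / apparent_power}
```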

Datamining on Hardware

Building the "HotBox" was one thing; getting all the necessary data was another serious challenge. A home-made PCB collects the data from the thermistors. Our vApus stress testing software interfaces with ESXi to collect hardware usage counters and temperatures; fan speeds are collected from the BMC; and power numbers come from the Racktivity PDU. All of this happens while a realistic load is placed on the ESXi virtual machines. The excellent programming work of Dieter of the Sizing Servers Lab resulted in a large amount of data in our Excel sheets.
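
Structurally, the logger boils down to a polling loop that merges the three sources into one time-stamped file. The sketch below is hypothetical: the three read_* helpers are placeholders, not the actual vApus or Sizing Servers tooling.

```python
import csv
import time

def read_esxi_counters() -> dict:
    """Placeholder: hardware usage counters and temperatures from ESXi."""
    return {}

def read_bmc_fan_speeds() -> dict:
    """Placeholder: fan RPMs from the BMC."""
    return {}

def read_pdu_power() -> float:
    """Placeholder: true RMS power (W) from the Racktivity PDU."""
    return 0.0

def collect(path: str, samples: int, interval_s: float = 1.0) -> None:
    """Poll all sources at a fixed interval and write one CSV row per sample."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["timestamp", "esxi_counters", "fan_speeds", "power_w"])
        for _ in range(samples):
            writer.writerow([time.time(), read_esxi_counters(),
                             read_bmc_fan_speeds(), read_pdu_power()])
            time.sleep(interval_s)
```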

To put a realistic load on the machine, we use our own real-life load generator called vApus. With vApus we capture real user interaction with a website, add some parameters that can be randomized, and then replay that log a number of times.
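
The capture-and-replay idea can be pictured with a short sketch (a hypothetical illustration, not vApus itself): a captured log of request templates is replayed several times, with a parameter randomized on each request and randomized think times in between.

```python
import random
import time
import urllib.request

# Hypothetical illustration of capture-and-replay; the URLs and the
# randomized "page" parameter are made up for the example.
captured_log = [
    "http://example.test/node/{page}",
    "http://example.test/forum/viewtopic.php?t={page}",
]

def replay(log: list, passes: int = 3) -> None:
    for _ in range(passes):
        for template in log:
            url = template.format(page=random.randint(1, 100))  # randomized parameter
            urllib.request.urlopen(url)                         # issue the request
            time.sleep(random.uniform(0.1, 1.0))                # simulated think time
```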

The workload consists of four VMs:

  • Drupal LAMP VM running sizingservers.be website
  • Zimbra 8 VM
  • phpBB LAMP VM running clone of real website
  • OLAP (news aggregator database)

The Drupal site gets regular site visitors mixed with the posting of new blog entries and the sending of email, resulting in a moderate system load. The Zimbra load is disk-intensive, consisting of users creating and sending emails, replying, and creating appointments, tasks, and contacts. The phpBB workload has a moderate CPU and network load, viewing and creating forum threads with rich content. Finally, the OLAP workload is based on queries from a news aggregator and is mostly CPU bound. These four VMs form one tile (similar to VMmark "tiles"). We ran two tiles in each test, resulting in a load of 10% to 80%.
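
To make the tile concept concrete, the sketch below (with placeholder VM names of our own) builds the eight-VM, two-tile configuration used in the tests.

```python
# Placeholder VM names; one tile = the four workload VMs described above.
TILE = ["drupal-lamp", "zimbra8", "phpbb-lamp", "olap-db"]

def build_tiles(count: int = 2) -> list:
    """Duplicate the tile 'count' times, as in our two-tile test runs."""
    return [[f"{vm}-tile{i}" for vm in TILE] for i in range(1, count + 1)]

# build_tiles() -> [["drupal-lamp-tile1", ...], ["drupal-lamp-tile2", ...]]
```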

Comments

  • lwatcdr - Thursday, February 20, 2014 - link

    Here in South Florida it would probably be cheaper. The water table is very high and many wells are only 35 feet deep.
  • rrinker - Tuesday, February 11, 2014 - link

    It's been done already. I know I've seen it in an article on new data centers in one industry publication or another.
    A museum near me recently drilled dozens of wells under their parking lot for geothermal cooling of the building. Being large with lots of glass area, it got unbearably hot during the summer months. Now, while it isn't as cool as you might set your home air conditioning, it is quite comfortable even on the hottest days, and the only energy is for the water pumps and fans. Plus it's better for the exhibits, reducing the yearly variation in temperature and humidity. Definitely a feasible approach for a data center.
  • noeldillabough - Tuesday, February 11, 2014 - link

    I was actually talking about this today; the big cost for our data centers is air conditioning. What if we had a building up north (arctic) where the ground is always frozen even in summer? Geothermal cooling for free, by pumping water through your "radiator".

    Not sure about the environmental impact this would have, but the emptiness that is the arctic might like a few data centers!
  • superflex - Wednesday, February 12, 2014 - link

    The enviroweenies would scream about you defrosting the permafrost.
    Some slug or bacteria might become endangered.
  • evonitzer - Sunday, February 23, 2014 - link

    Unfortunately, the cold areas are also devoid of people and therefore internet connections. You'll have to figure the cost of running fiber to your remote location, as well as how your distance might affect latency. If you go into a permafrost area, there are additional complications, as constructing on permafrost is a challenge. A datacenter high in the mountains but close to population centers would seem a good compromise.
  • fluxtatic - Wednesday, February 12, 2014 - link

    I proposed this at work, but management stopped listening somewhere between me saying we'd need to put a trench through the warehouse floor to outside the building, and that I'd need a large, deep hole dug right next to the building, where I would bury several hundred feet of copper pipe.

    I also considered using the river that's 20' from the office, but I'm not sure the city would like me pumping warm water into their river.
  • Varno - Tuesday, February 11, 2014 - link

    You seem to be reporting on the junction temperature which is reported by most measurement programs rather than the cast temperature that is impossible to measure directly without interfering with the results. How have you accounted for this in your testing?
  • JohanAnandtech - Tuesday, February 11, 2014 - link

    Do you mean case temperature? We did measure the outlet temperature, but it was significantly lower than the junction temperature. For the Xeon 2697 v2, it was 39-40°C at a 35°C inlet and 45°C at a 40°C inlet.
  • Kristian Vättö - Tuesday, February 11, 2014 - link

    Google's use of raw seawater for cooling its data center in Hamina, Finland is pretty cool IMO. Given that the specific heat capacity of water is much higher than air's, it is more efficient for cooling, especially in our climate where seawater is always relatively cold.
  • JohanAnandtech - Tuesday, February 11, 2014 - link

    I admit, I somewhat ignored the Scandinavian datacenters as "free cooling" is a bit obvious there. :-)

    I thought some readers would be surprised to find out that even in Sunny California free cooling is available most of the year.
