Test Bed

As per our testing policy, we take a high-end CPU suitable for the motherboard's socket at its initial launch, and equip the system with a suitable amount of memory running at the processor's maximum supported frequency. This is also typically run at JEDEC sub-timings where possible. Some users are not keen on this policy, noting that the maximum supported frequency can be quite low, that faster memory is often available at a similar price, or that JEDEC speeds can be prohibitive for performance. While these comments make sense, ultimately very few users apply memory profiles (XMP or otherwise), as doing so requires interaction with the BIOS, and most users fall back on JEDEC-supported speeds. This includes home users as well as industry buyers who might want to shave a cent or two from the cost, or stay within the margins set by the manufacturer. Where possible, we will extend our testing to include faster memory modules, either at the same time as the review or at a later date.
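As a rough illustration of what JEDEC versus XMP speeds mean for raw bandwidth, here is a back-of-the-envelope calculation (peak theoretical figures only, decimal GB/s; the DDR4-3200 figure stands in for a typical XMP kit):

```python
# Peak theoretical bandwidth of a dual-channel DDR4 kit.
# DDR4-2133 moves 2133 MT/s over a 64-bit (8-byte) bus per channel.
def peak_bandwidth_gbps(transfers_mt_s, channels=2, bus_bytes=8):
    """Return peak bandwidth in GB/s (decimal gigabytes)."""
    return transfers_mt_s * 1e6 * bus_bytes * channels / 1e9

jedec = peak_bandwidth_gbps(2133)  # JEDEC speed used in this test bed
xmp = peak_bandwidth_gbps(3200)    # a typical XMP kit, e.g. DDR4-3200
print(f"DDR4-2133 dual channel: {jedec:.1f} GB/s")  # ~34.1 GB/s
print(f"DDR4-3200 dual channel: {xmp:.1f} GB/s")    # ~51.2 GB/s
```

Real-world throughput is of course lower than these peaks, but the ratio between the two figures is a fair guide to the headroom left on the table by JEDEC settings.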

Test Setup
Processor: Intel Core i7-6700K (ES, Retail Stepping), 91W, $350
           4 Cores, 8 Threads, 4.0 GHz (4.2 GHz Turbo)
Motherboard: GIGABYTE X170-Extreme ECC
Cooling: Cooler Master Nepton 140XL
Power Supply: OCZ ZX Series 1250W Gold or Corsair AX1200i Platinum
Memory: Corsair DDR4-2133 C15 2x8 GB 1.2V or G.Skill Ripjaws 4 DDR4-2133 C15 2x8 GB 1.2V
Memory Settings: JEDEC @ 2133
Video Cards: ASUS GTX 980 Strix 4GB; MSI GTX 770 Lightning 2GB (1150/1202 Boost); ASUS R7 240 2GB
Hard Drive: Crucial MX200 1TB
Optical Drive: LG GH22NS50
Case: Open Test Bed
Operating System: Windows 7 64-bit SP1

Readers of our motherboard review section will have noted the trend in modern motherboards to implement a form of MultiCore Enhancement / Acceleration / Turbo (read our report here). This does several things, including better benchmark results at stock settings (not strictly needed if overclocking is an end-user goal), at the expense of power and temperature. It also applies, in essence, an automatic overclock, which may be against what the user wants. Our testing methodology is 'out of the box', with the latest public BIOS installed and XMP enabled, and is thus subject to the whims of this feature. It is ultimately up to the motherboard manufacturer to take this risk, and manufacturers take risks in their setups on every product (think C-state settings, USB priority, DPC latency / monitoring priority, overriding memory sub-timings at JEDEC). Processor speed is part of that risk: even if no overclocking is planned, the motherboard chosen will affect how fast that shiny new processor goes, and so can be an important factor in the system build.

For reference, on the GIGABYTE X170-Extreme ECC with our testing BIOS F2c, MCT was enabled by default. The FCLK 10x ratio was also not present in the BIOS at the time of testing.

Many thanks to...

We must thank the following companies for kindly providing hardware for our multiple test beds. Some of this hardware is not in this test bed specifically, but is used in other testing.

First up is Akitio, who kindly provided a Thunder3, one of their external Thunderbolt 3 based PCIe 3.0 x4 drive caddies, along with an Intel SSD 750 inside. Ganesh reviewed their unit earlier this year, and I was pleased to say it worked flawlessly with our testing. Inside the unit we were able to find the Intel Alpine Ridge controller for conversion, along with a few other ICs I’ve put in the gallery below.


After plugging the device in and granting it permission to access the system, I ran a quick CrystalDiskMark to verify the base speeds of the connection. Initially I was getting quite low speeds, and realized that I should look for the latest Intel Alpine Ridge driver update. With that installed, we achieved results comparable to the 1.2 TB Intel SSD 750 drive over U.2 tested earlier in the year.
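For anyone without CrystalDiskMark to hand, a crude sequential-read check can be sketched in a few lines of Python. The file path is a placeholder, and note that OS caching, block size, and queue depth will all skew the result relative to a proper benchmark:

```python
# A minimal stand-in for CrystalDiskMark's sequential-read pass: read a large
# file in big blocks, time it, and report MB/s (decimal megabytes).
import time

def seq_read_mb_s(path, block_size=1 << 20):
    """Sequentially read `path` in `block_size` chunks; return throughput in MB/s."""
    total = 0
    start = time.perf_counter()
    with open(path, "rb", buffering=0) as f:  # unbuffered to reduce Python-side caching
        while chunk := f.read(block_size):
            total += len(chunk)
    return total / (time.perf_counter() - start) / 1e6

# Hypothetical usage: seq_read_mb_s("testfile.bin")
```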

Again, thanks to Akitio for the short-term loan. The Thunder3 PCIe SSD is currently available directly from Amazon. Hopefully in the future we can source a TB3 device where the onboard controller is the limiting factor, similar to our USB 3.1 testing.

Thank you to AMD for providing us with the R9 290X 4GB GPUs. These are MSI branded 'Gaming' models, featuring MSI's Twin Frozr IV dual-fan cooler design and military class components. Bundled with the cards is MSI Afterburner for additional overclocking, as well as MSI's Gaming App for easy frequency tuning.

The R9 290X is a second-generation GCN card from AMD under the Hawaii XT codename, and uses their largest Sea Islands GPU die: 6.2 billion transistors in 438 mm2, built at TSMC on a 28nm process. For the R9 290X, that means 2816 streaming processors and 64 ROPs, with a 512-bit memory bus to GDDR5 (4GB in this case). The official power rating for the R9 290X is 250W.

The MSI R9 290X Gaming 4G runs the core at 1000 MHz to 1040 MHz depending on mode (Silent, Gaming or OC), and the memory at 5 GHz. Display outputs include one DisplayPort, one HDMI 1.4a, and two dual-link DVI-D connectors.

Further Reading: AnandTech's AMD R9 290X Review

Thank you to ASUS for providing us with GTX 980 Strix GPUs. At the time of release, the STRIX brand from ASUS was aimed at silent running, or to use the marketing term: '0dB Silent Gaming'. This enables the card to disable the fans when the GPU is dealing with low loads well within temperature specifications. These cards equip the GTX 980 silicon with ASUS' Direct CU II cooler and 10-phase digital VRMs, aimed at high-efficiency conversion. Along with the card, ASUS bundles GPU Tweak software for overclocking and streaming assistance.

The GTX 980 uses NVIDIA's GM204 silicon die, built upon their Maxwell architecture. This die packs 5.2 billion transistors into 298 mm2, built on TSMC's 28nm process. The GTX 980 uses the full GM204 core, with 2048 CUDA cores and 64 ROPs on a 256-bit memory bus to GDDR5. The official power rating for the GTX 980 is 165W.

The ASUS GTX 980 Strix 4GB (full name: STRIX-GTX980-DC2OC-4GD5) runs a reasonable overclock over a reference GTX 980 card, with frequencies in the range of 1178-1279 MHz. The memory runs at stock, in this case 7010 MHz. Video outputs include three DisplayPort connectors, one HDMI 2.0 connector, and a DVI-I.
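The headline memory bandwidth figures for both cards follow directly from the effective data rate and bus width; a quick sketch of the arithmetic:

```python
# Peak memory bandwidth = effective data rate (MT/s) x bus width / 8 bits per byte.
def gpu_mem_bandwidth_gbps(data_rate_mt_s, bus_bits):
    """Peak bandwidth in GB/s from data rate (MT/s) and bus width (bits)."""
    return data_rate_mt_s * 1e6 * bus_bits / 8 / 1e9

r9_290x = gpu_mem_bandwidth_gbps(5000, 512)  # 5 GHz GDDR5, 512-bit bus
gtx_980 = gpu_mem_bandwidth_gbps(7010, 256)  # 7010 MHz GDDR5, 256-bit bus
print(f"R9 290X: {r9_290x:.0f} GB/s")  # 320 GB/s
print(f"GTX 980: {gtx_980:.1f} GB/s")  # 224.3 GB/s
```

The 290X's wide 512-bit bus buys it more raw bandwidth despite the slower memory, which is the usual trade-off between bus width and memory clocks.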

Further Reading: AnandTech's NVIDIA GTX 980 Review

Thank you to Cooler Master for providing us with Nepton 140XL CLCs. The Nepton 140XL is Cooler Master's largest single-radiator liquid cooler, and pairs the radiator with dual 140mm 'JetFlo' fans designed for high performance, rated at 0.7-3.5 mm H2O static pressure. The pump is also designed to be faster and more efficient, and uses thicker tubing to assist cooling, with a rated pump noise below 25 dBA. The Nepton 140XL comes with mounting support for all major sockets, as far back as FM1, AM2 and LGA 775.

Further Reading: AnandTech's Cooler Master Nepton 140XL Review

Thank you to Corsair for providing us with an AX1200i PSU. The AX1200i was the first power supply to offer digital control and management via Corsair's Link system, and under the hood it carries a 1200W rating at 50C with 80 PLUS Platinum certification. This means a minimum of 89-92% efficiency at 115V and 90-94% at 230V. The AX1200i is fully modular, uses the larger 200mm chassis design, and has a dual ball bearing 140mm fan for high-performance use. Designed as a workhorse, it offers up to eight PCIe connectors, suitable for four-way GPU setups. The AX1200i also features a Zero RPM mode, which allows the fan to switch off when the power supply is under 30% load.
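As a back-of-the-envelope check on what Platinum certification implies at the wall, assuming the worst-case 89% efficiency figure at 115V:

```python
# At a given efficiency, input (wall) power = output load / efficiency;
# the difference is dissipated as heat inside the PSU.
def wall_draw_w(load_w, efficiency):
    """Input power in watts for a given DC output load and efficiency."""
    return load_w / efficiency

full_load = wall_draw_w(1200, 0.89)  # worst-case Platinum efficiency at 115V
print(f"Wall draw at full load: {full_load:.0f} W")  # ~1348 W
print(f"Heat dissipated: {full_load - 1200:.0f} W")  # ~148 W
```

That heat figure is why high-efficiency units can afford semi-passive fan modes at low loads: at 30% load the waste heat is a small fraction of this worst case.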

Further Reading: AnandTech's Corsair AX1500i Power Supply Review

Thank you to Crucial for providing us with MX200 SSDs. Crucial stepped up to the plate as our benchmark list grew larger with newer benchmarks and titles, and the 1TB MX200 units are strong performers. Based on Marvell's 88SS9189 controller and using Micron's 16nm 128Gbit MLC flash, these are 7mm-high, 2.5-inch drives rated for 100K random read IOPS and 555/500 MB/s sequential read/write speeds. The 1TB models we are using here support TCG Opal 2.0 and IEEE-1667 (eDrive) encryption, and have a 320TB rated endurance with a three-year warranty.
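The 320TB endurance rating works out to a generous daily write budget over the warranty period; a quick calculation (decimal units, 1 TB = 1000 GB):

```python
# Rated endurance spread evenly over the warranty period gives the
# sustained daily-write budget the drive is warranted for.
endurance_tb = 320
warranty_years = 3
gb_per_day = endurance_tb * 1000 / (warranty_years * 365)
print(f"Daily write budget: {gb_per_day:.0f} GB/day")  # ~292 GB/day
```

For context, that is well beyond typical desktop or even benchmark-heavy usage, so endurance is unlikely to be the limiting factor here.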

Further Reading: AnandTech's Crucial MX200 (250 GB, 500 GB & 1TB) Review

Thank you to G.Skill for providing us with memory. G.Skill has been a long-time supporter of AnandTech over the years, for testing beyond our CPU and motherboard memory reviews. We've reported on their high capacity and high-frequency kits, and every year at Computex G.Skill holds a world overclocking tournament with liquid nitrogen right on the show floor. One of the most recent deliveries from G.Skill was their 4x16 GB DDR4-3200 C14 Kit, which we are planning for an upcoming review.

Further Reading: AnandTech's Memory Scaling on Haswell Review, with G.Skill DDR3-3000

Thank you to Corsair for providing us with memory. Similarly, Corsair (along with its PSUs) is a long-time supporter of AnandTech. Being one of the first vendors with 16GB DDR4 modules was a big deal, and now Corsair is bringing LEDs back to its memory after a long hiatus, along with supporting specific projects such as the ASUS ROG versions of the Dominator Platinum range. We're currently looking at our review pipeline to see when our next DRAM round-up will be, and Corsair is poised to participate.

Further Reading: AnandTech's Memory Scaling on Haswell-E Review

Comments

  • SetiroN - Monday, October 17, 2016 - link

    There is only one thing that's worse than camo: pixelized camo.

    I honestly fail to understand who in the world would ever buy a socket 1150 Xeon solution instead of socket 2011.
  • dave_the_nerd - Monday, October 17, 2016 - link

    1) Digital camo has been standard-issue in the military for a while now.

    2) Anybody who only needs a 4c/8t system, but is otherwise doing "workstation" or server-grade work. (Uptime requirements, longevity requirements, need ECC ram for data crunching, need virtualization features, etc.)
  • zepi - Monday, October 17, 2016 - link

    4c/8t LGA2011 solution hardly costs much more, especially since this board is approaching the pricing of workstation mobos...
  • Einy0 - Monday, October 17, 2016 - link

    2) The supposed advantages are 95% marketing. Uptime is more about your OS if you select quality components to go with the CPU. Longevity, seriously??? I can show you desktops built 30+ years ago that run today the same as they did then. How many CPUs actually die? I personally have had one die, it was 7 years old. Virtualization, again no more features on the 1151 Xeon versus the i7. ECC, that's the one feature an 1151 Xeon has over a similar i7. Now when we start talking multi-socket and what not well that's obvious. I've had these conversations in the past with engineers and developers at work. Everyone just assumes that when Intel says they need a Xeon to do something there is a reason. Yup, there is a reason, they can make more money from the same chip with a Xeon badge on it.
  • HollyDOL - Tuesday, October 18, 2016 - link

Yep, you can show 30-year-old desktops still working, but how many of them were running 24/7? None.
  • mkaibear - Tuesday, October 18, 2016 - link

    Up until very recently I had a desktop of about that vintage running SCO Unix. That ran 24/7. In fact we were scared to turn it off because it ran chunks of the factory...
  • devol - Saturday, October 22, 2016 - link

There are more differences than just ECC memory. For instance, i7 CPUs don't support hugetlb/hugepages, and several other 'server'-focused virtualization extensions. Until Skylake, though, the PCH had basically no support for the features needed for SR-IOV.
  • bigboxes - Monday, October 17, 2016 - link

    I'm sorry. I can't see the motherboard. Where is it in the picture?
  • stardude82 - Friday, November 18, 2016 - link

    I think it's generally acknowledged now that the digital camouflage was a failure.
    https://en.wikipedia.org/wiki/MultiCam#United_Stat...
  • BrokenCrayons - Monday, October 17, 2016 - link

    Yeah, it's really off-putting to see camo. I think they're going for some kind of military/tactical thing, but Gigabyte's failed to realize that camo just makes a product look trashy and redneck to people in the US these days.
