Test Setup and Comparison Points

In our review kit from AMD, we were supplied with almost complete systems for testing. Inside the box of goods, AMD included:

  • AMD Threadripper 2990WX (32C, 250W, $1799)
  • AMD Threadripper 2950X (16C, 180W, $899)
  • ASUS ROG Zenith Extreme motherboard, rev 2
  • MSI X399 MEG Creation motherboard
  • 4x8 GB of G.Skill FlareX DDR4-3200 14-14-14
  • Wraith Ripper Cooler, co-developed with Cooler Master
  • Enermax Liqtech 240 TR4 Liquid Cooler, rated to 500W

For our usual testing, we stick to the same power supplies, the same storage, and ideally the same motherboard within a range of processors, and we always use the latest BIOS. Although AMD shipped us some reasonably fast memory, our standard policy is to test at the maximum supported frequency promoted by the processor manufacturer, which in this case is DDR4-2933 for the new Threadripper 2000-series processors.
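
As a sanity check that each board has actually trained the DIMMs to the intended speed, something along the lines of the short sketch below can read back the configured memory clock on a Windows test bed. The wmic query and parsing here are illustrative assumptions rather than part of our formal test procedure, and the exact values reported can vary by platform.

```python
# Minimal sketch (assumes a Windows test bed with the stock wmic tool available).
# Reads back each DIMM's configured clock so a run at, e.g., DDR4-2933 can be
# confirmed before benchmarking. Reported values depend on the SMBIOS tables.
import subprocess

def dimm_speeds():
    out = subprocess.run(
        ["wmic", "memorychip", "get", "ConfiguredClockSpeed,Speed", "/format:csv"],
        capture_output=True, text=True, check=True,
    ).stdout
    rows = [line.split(",") for line in out.strip().splitlines()[1:] if line.strip()]
    # Each row: Node, ConfiguredClockSpeed (current), Speed (rated), in MT/s
    return [(int(r[1]), int(r[2])) for r in rows if r[1].strip().isdigit()]

if __name__ == "__main__":
    for configured, rated in dimm_speeds():
        print(f"DIMM rated {rated} MT/s, currently configured at {configured} MT/s")
```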

For our testing, we compared the first-generation Threadripper processors with the new second-generation parts. We also have the 18-core Intel Core i9-7980XE, some results from the 10-core Core i9-7900X, and two mainstream processors, one from Intel and one from AMD. The relatively short list of comparison points is due to our new CPU testing suite, which takes effect today.

Due to an industry event occurring in the middle of our testing, we had to split some of the testing up and take 30 kg of kit halfway around the world to test in a hotel room during Flash Memory Summit. On the downside, this means there is some discontinuity in our testing, although not that much; on the plus side, the hotel room had a good amount of air conditioning to keep the hardware cool.

AMD Test Setup

CPU          Motherboard               BIOS    Cooler        Memory
TR 2990WX    ASUS ROG Zenith           0078    Liqtech TR4   4x8GB DDR4-2933
TR 2950X     ASUS ROG Zenith           0078    Liqtech TR4   4x8GB DDR4-2933
TR 1950X     ASUS X399-A Prime         0806    TRUE Cu       4x4GB DDR4-2666
TR 1920X     ASUS ROG Zenith           0078    Liqtech TR4   4x8GB DDR4-2666
TR 1900X     ASUS X399-A Prime         0806    TRUE Cu       4x4GB DDR4-2666
R7 2700X     ASUS Crosshair VI Hero    0508    Wraith Max    4x8GB DDR4-2933
EPYC 7601    GIGABYTE MZ31-AR0         -       Fryzen        8x128GB DDR4-2666

GPU          Sapphire RX 460 2GB (CPU Tests)
PSU          Corsair AX860i / Corsair AX1200i
SSD          Crucial MX300 1TB
OS           Windows 10 x64 RS3 1709, Spectre and Meltdown Patched

The memory for our test suites was mostly G.Skill, with some Crucial. For the EPYC system, Micron sent us some LRDIMMs, so we fired up 1TB of memory to get all eight channels working.

On the Intel side, we are still getting up to speed on our testing.

Intel Test Setup

CPU          Motherboard               BIOS    Cooler        Memory
i9-7980XE    ASRock X299 OC Formula    P1.40   TRUE Cu       4x8GB DDR4-2666
i9-7900X     ASRock X299 OC Formula    P1.40   TRUE Cu       4x8GB DDR4-2666
i7-8700K     ASRock Z370 Gaming i7     P1.70   AR10-115XS    4x4GB DDR4-2666

GPU          Sapphire RX 460 2GB (CPU Tests)
PSU          Corsair AX860i / Corsair AX1200i
SSD          Crucial MX300 1TB
OS           Windows 10 x64 RS3 1709, Spectre and Meltdown Patched
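
Both test beds are listed as patched against Spectre and Meltdown. As a quick sanity check before a run, a script along the lines of the sketch below can dump the OS-reported mitigation state; it assumes Microsoft's SpeculationControl PowerShell module is already installed, and is illustrative rather than part of our bench tooling.

```python
# Minimal sketch: query Windows' reported Spectre/Meltdown mitigation state.
# Assumes Microsoft's SpeculationControl PowerShell module is installed
# (Install-Module SpeculationControl). Illustrative only.
import subprocess

def speculation_control_report() -> str:
    cmd = [
        "powershell", "-NoProfile", "-Command",
        "Import-Module SpeculationControl; Get-SpeculationControlSettings",
    ]
    result = subprocess.run(cmd, capture_output=True, text=True)
    return result.stdout or result.stderr

if __name__ == "__main__":
    # Prints the branch target injection (Spectre v2) and rogue data cache load
    # (Meltdown) mitigation flags as reported by the OS.
    print(speculation_control_report())
```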

Over time, we will be adding more Intel CPUs to our results.

Many thanks to...

We must thank the following companies for kindly providing hardware for our multiple test beds. Some of this hardware is not in this test bed specifically, but is used in other testing.

Thank you to Crucial for providing us with MX200 SSDs and to Micron for LRDIMMs. Crucial stepped up to the plate as our benchmark list grows larger with newer benchmarks and titles, and the 1TB MX200 units are strong performers. Based on Marvell's 88SS9189 controller and using Micron's 16nm 128Gbit MLC flash, these are 7mm high, 2.5-inch drives rated for 100K random read IOPs and 555/500 MB/s sequential read and write speeds. The 1TB models we are using here support TCG Opal 2.0 and IEEE-1667 (eDrive) encryption and have a 320TB rated endurance with a three-year warranty.

Further Reading: AnandTech's Crucial MX200 (250 GB, 500 GB & 1TB) Review
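
For context on that endurance figure, here is a quick back-of-the-envelope calculation of our own (not a Crucial specification) showing what 320 TB over the three-year warranty works out to in daily writes for the 1TB model:

```python
# Back-of-envelope arithmetic for the rated endurance of the 1TB MX200.
ENDURANCE_TB = 320          # rated write endurance
WARRANTY_DAYS = 3 * 365     # three-year warranty
CAPACITY_TB = 1.0           # 1TB model

tb_per_day = ENDURANCE_TB / WARRANTY_DAYS   # ~0.29 TB of writes per day
dwpd = tb_per_day / CAPACITY_TB             # ~0.29 drive writes per day

print(f"~{tb_per_day * 1000:.0f} GB/day, or {dwpd:.2f} DWPD over the warranty period")
```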

Thank you to Corsair for providing us with an AX1200i PSU. The AX1200i was the first power supply to offer digital control and management via Corsair's Link system, and under the hood it commands a 1200W rating at 50C with 80 PLUS Platinum certification. This allows for a minimum of 89-92% efficiency at 115V and 90-94% at 230V. The AX1200i is completely modular, running the larger 200mm design, with a dual ball bearing 140mm fan to assist high-performance use. The AX1200i is designed to be a workhorse, with up to 8 PCIe connectors, making it suitable for four-way GPU setups. The AX1200i also comes with a Zero RPM mode for the fan, which allows the fan to be switched off when the power supply is under 30% load.

Further Reading: AnandTech's Corsair AX1500i Power Supply Review
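
To put those efficiency numbers in perspective, here is a short back-of-the-envelope sketch of our own (the 1000W load figure is hypothetical) showing what the quoted 89% minimum at 115V means for draw at the wall, and where the Zero RPM fan threshold sits:

```python
# Back-of-envelope arithmetic for the AX1200i figures quoted above.
RATED_W = 1200
EFF_115V_MIN = 0.89     # quoted minimum efficiency at 115V
dc_load_w = 1000        # hypothetical DC load on the test bed

wall_draw_w = dc_load_w / EFF_115V_MIN   # ~1124 W pulled from the socket
zero_rpm_limit_w = 0.30 * RATED_W        # fan stays off below ~360 W load

print(f"At {dc_load_w} W of DC load: ~{wall_draw_w:.0f} W from the wall")
print(f"Zero RPM (fan off) region: loads under ~{zero_rpm_limit_w:.0f} W")
```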

Thank you to G.Skill for providing us with memory. G.Skill has been a long-time supporter of AnandTech over the years, for testing beyond our CPU and motherboard memory reviews. We've reported on their high capacity and high-frequency kits, and every year at Computex G.Skill holds a world overclocking tournament with liquid nitrogen right on the show floor.

Further Reading: AnandTech's Memory Scaling on Haswell Review, with G.Skill DDR3-3000

171 Comments

  • plonk420 - Tuesday, August 14, 2018 - link

    worse for efficiency?

    https://techreport.com/r.x/2018_08_13_AMD_s_Ryzen_...
  • Railgun - Monday, August 13, 2018 - link

    How can you tell? The article isn’t even finished.
  • mapesdhs - Monday, August 13, 2018 - link

    People will argue a lot here about performance per watt and suchlike, but in the real world the cost of the software and the annual license renewal is often far more than the base hw cost, resulting in a long term TCO that dwarfs any differences in some CPU cost. I'm referring here to the kind of user that would find the 32c option relevant.

    Also missing from the article is the notion of being able to run multiple medium scale tasks on the same system, eg. 3 or 4 tasks each of which is using 8 to 10 cores. This is quite common practice. An article can only test so much though, at this level of hw the number of different parameters to consider can be very large.

    Most people on tech forums of this kind will default to tasks like 3D rendering and video conversion when thinking about compute loads that can use a lot of cores, but those are very different to QCD, FEA and dozens of other tasks in research and data crunching. Some will match the arch AMD is using, others won't; some could be tweaked to run better, others will be fine with 6 to 10 cores and just run 4 instances testing different things. It varies.

    Talking to an admin at COSMOS years ago, I was told that even coders with seemingly unlimited cores to play with found it quite hard to scale relevant code beyond about 512 cores, so instead, for the sort of work they were doing, the centre would run multiple simulations at the same time, which on the hw platform in question worked very nicely indeed (1856 cores of the SandyBridge-EP era, 14.5TB of globally shared memory, used primarily for research in cosmology, astrophysics and particle physics; squish it all into a laptop and I'm sure Sheldon would be happy. :D) That was back in 2012, but the same concepts apply today.

    For TR2, the tricky part is getting the OS to play nice, along with the BIOS, and optimised sw. It'll be interesting to see how 2990WX performance evolves over time as BIOS updates come out and AMD gets feedback on how best to exploit the design, new optimisations from sw vendors (activate TR2 mode!) and so on.

    SGI dealt with a lot of these same issues when evolving its Origin design 20 years ago. For some tasks it absolutely obliterated the competition (eg. weather modelling and QCD), while for others in an unoptimised state it was terrible (animation rendering, not something that needs shared memory, but ILM wrote custom sw to reuse bits of a frame already calculated for future frames, the data able to fly between CPUs very fast, increasing throughput by 80% and making the 32-CPU systems very competitive, but in the long run it was easier to brute force on x86 and save the coder salary costs).

    There are so many different tasks in the professional space, the variety is vast. It's too easy to think cores are all that matter, but sometimes having oodles of RAM is more important, or massive I/O (defense imaging, medical and GIS are good examples).

    I'm just delighted to see this kind of tech finally filter down to the prosumer/consumer, but alas much of the nuance will be lost, and sadly some will undoubtedly buy based on the marketing, as opposed to the golden rule of any tech at this level: ignore the published benchmarks, as the only test that actually matters is your specific intended task and data, so try to test with that before making a purchasing decision.

    Ian.
  • AbRASiON - Monday, August 13, 2018 - link

    Really? I can't tell if posts like these are facetious or kidding or what?

    I want AMD to compete so badly long term for all of us, but Intel have such immense resources, such huge infrastructure, they have ties to so many big business for high end server solutions. They have the bottom end of the low power market sealed up.

    Even if their 10nm is delayed another 3 years, AMD will only just begin to start to really make a genuine long term dent in Intel.

    I'd love to see us at a 50/50 situation here, heck I'd be happy with a 25/75 situation. As it stands, Intel isn't finished, not even close.
  • imaheadcase - Monday, August 13, 2018 - link

    Are you looking at the same benchmarks as everyone else? I mean, AMD's ass was handed to it in the encoding tests and it even went neck and neck against some 6c Intel products. If AMD got one of these out every 6 months with better improvements, sure, but they never do.
  • imaheadcase - Monday, August 13, 2018 - link

    Especially when you consider they are using double the core count to get the numbers they do have, it's not a very efficient way to get better performance.
  • crotach - Tuesday, August 14, 2018 - link

    It's happened before. AMD trashes Intel. Intel takes it on the chin. AMD leads for 1-2 years and celebrates. Then Intel releases a new platform and AMD plays catch-up for 10 years and tries hard not to go bankrupt.

    I dearly hope they've learned a lesson the last time, but I have my doubts. I will support them and my next machine will be AMD, which makes perfect sense, but I won't be investing heavily in the platform, so no X399 for me.
  • boozed - Tuesday, August 14, 2018 - link

    We're talking about CPUs that cost more than most complete PCs. Willy-waving aside, they are irrelevant to the market.
  • Ian Cutress - Monday, August 13, 2018 - link

    Hey everyone, sorry for leaving a few pages blank right now. Jet lag hit me hard over the weekend from Flash Memory Summit. Will be filling in the blanks and the analysis throughout today.

    But here's what there is to look forward to:

    - Our new test suite
    - Analysis of Overclocking Results at 4G
    - Direct Comparison to EPYC
    - Me being an idiot and leaving the plastic cover on my cooler, but it completed a set of benchmarks. I pick through the data to see if it was as bad as I expected

    The benchmark data should now be in Bench, under the CPU 2019 section, as our new suite will go into next year as well.

    Thoughts and commentary welcome!
  • Tamz_msc - Monday, August 13, 2018 - link

    Are the numbers for the LuxMark C++ test correct? Seems they've been swapped (2990WX and 2950X).
