Thermal Comparisons and XFR2: Remember to Remove the CPU Cooler Plastic!

Every machine build has its targets: performance, power, noise, thermals, or cost. It is certainly hard to hit all of them, so going after two or three is usually a sensible goal. Well, it turns out that there is one simple mistake that can make you lose on ALL FIVE COUNTS. Welcome to my world: the time I first tested the 32-core AMD Ryzen Threadripper 2990WX and forgot to remove the plastic from my CPU liquid cooler.

Don’t Build Systems After Long Flights

Almost every brand new CPU cooler, whether an air cooler, a liquid cooler, or a water block, comes pre-packaged with padding, foam, screws, fans, and instructions. Depending on the manufacturer and the type of packaging, the base of the CPU cooler will have been prepared in one of two ways:

  1. Pre-applied thermal paste
  2. A small self-adhesive plastic strip to protect the polished base during shipping

In our review kit, the Wraith Ripper massive air cooler, made by Cooler Master but promoted by AMD as the ‘base’ air cooler for new Threadripper 2 parts, had pre-applied thermal paste. It was across the whole base, and it was thick. It made a mess when I tried to take photos.

Also in our review kit was the Enermax Liqtech TR4 closed loop liquid cooler, with a small tube of thermal paste included. The bottom of the CPU block for the liquid cooler was covered in a self-adhesive plastic strip to protect the base in the packaging.  


[Image: example of a protective plastic strip left on a cooler cold plate, via TechTeamGB's Twitter]

So, confession time. Our review kit landed the day before I was travelling from the UK to San Francisco to cover Flash Memory Summit and Intel’s Datacenter Summit. In my suitcases I took an X399 motherboard (the ASUS ROG Zenith), three X399 chips (2990WX, 2950X, 1950X), an X299 motherboard (ASRock X299 OC Formula), several Skylake-X chips, a Corsair AX860i power supply, an RX 460 graphics card, a mouse, a keyboard, and cables – basically two systems, relying on the monitor in the hotel room as the display for testing. After an 11-hour direct flight, two hours at passport control, and a one-hour Uber ride to my hotel, I set up the system with the 2990WX.

I didn’t take the plastic off the Enermax cooler. I didn’t realize it at the time, either: I even put thermal paste on the processor, and it still didn’t register when I tightened the screws.

I set the system up at the maximum supported memory frequency, installed Windows, installed the security updates, installed the benchmarks, and set it to run overnight while I slept. I didn’t even realize the plastic was still attached. Come mid-morning, the benchmark suite had finished. I did some of the extra testing, such as base frequency latency measurements, and then went to replace the processor with the 2950X. It was at this time I performed a facepalm.

It was at that point, with thermal paste all over the processor and the plastic, that I realized I done goofed. I took the plastic off, re-pasted the processor, and set the system up again, this time with a better thermal profile. But rather than throw the results away, I kept them.

Thermal Performance Matters

The goal of any system is to keep it within a sufficient thermal window to maintain operation: most processors are rated to work properly from normal ambient temperatures up to around 105C, at which point they shut down to avoid permanent thermal damage. When a processor shuttles electrons around and does work, it consumes power. That power is lost as heat, which dissipates from the silicon out into two main areas: the socket and the heatspreader.

For AMD’s Threadripper processors, the thermal interface material between the silicon dies and the heatspreader is an indium-tin solder: a direct metal-to-metal bond for efficient heat transfer. Modern Intel processors use a silicone thermal grease instead, which does not transfer heat as well, but has one benefit: it survives thermal cycling for longer. As metals heat up, they expand, and when two metals with different thermal expansion coefficients are bonded together, enough heat cycles will crack the joint and render it ineffective. Thermal grease essentially eliminates that issue, and it also happens to be cheaper, so it is a trade-off between price and longevity on one side and performance on the other.
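To put a rough number on that expansion mismatch, here is a minimal sketch in Python using textbook thermal expansion coefficients for silicon and copper; the die span and the idle-to-load temperature swing are assumptions picked purely for illustration.

```python
# Rough illustration of the thermal expansion mismatch that stresses a
# soldered die-to-heatspreader joint. CTE values are textbook figures;
# the die span and temperature swing are assumed for illustration only.

CTE_SILICON = 2.6e-6   # 1/K, approximate coefficient of thermal expansion
CTE_COPPER = 17.0e-6   # 1/K, copper heatspreader

die_width_mm = 20.0    # assumed die span in contact with the heatspreader
delta_t_k = 60.0       # assumed swing between idle and full load, in kelvin

# How much each side of the joint wants to grow over one heat cycle
growth_si = CTE_SILICON * die_width_mm * delta_t_k   # in mm
growth_cu = CTE_COPPER * die_width_mm * delta_t_k
mismatch_um = (growth_cu - growth_si) * 1000.0       # mm -> micrometres

print(f"Silicon expands ~{growth_si * 1000:.1f} um, copper ~{growth_cu * 1000:.1f} um")
print(f"Mismatch per cycle: ~{mismatch_um:.1f} um of shear the solder must absorb")
```

A couple of dozen micrometres of shear per cycle does not sound like much, but repeated over thousands of heating and cooling cycles it is exactly the kind of stress that fatigues a rigid metal joint, which is the longevity argument in favour of grease.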

Above the heatspreader is the CPU cooler, but between the two is another thermal interface, and this one the user can choose. The cheapest options involve nasty silicone thermal grease that costs cents per gallon, while performance enthusiasts might look towards a silver-based thermal paste or a compound with good heat transfer characteristics; the ability of a paste to spread thinly under pressure is usually a good sign. Extreme users can go for a liquid metal interface, similar in spirit to the solder, which bonds the CPU to the CPU cooler pretty much permanently.

So what happens if you suddenly put some microns of thermally inefficient plastic between the heatspreader and the CPU cooler?

First of all, the conductive heat transfer is terrible. Thermal energy stays in the paste and the heatspreader for longer, causing heat soak in the processor and raising temperatures. This is essentially the same effect as when a cooler is overwhelmed by a large processor: heat soak is real, and it can be a problem. It typically leads to a runaway temperature rise until the temperature gradient can balance the heat output. This is when a processor gets too hot and a thermal emergency power state kicks in, reducing voltage and frequency to very low levels. Performance goes down the drain.
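As a sanity check on the scale of the problem, here is a minimal back-of-the-envelope sketch in Python of one-dimensional conduction through the interface, using delta-T = P x L / (k x A). The layer thicknesses, contact area, and conductivity figures are rough assumptions, and the model ignores contact resistance and heat spreading entirely, so treat the output as an order-of-magnitude illustration rather than a prediction.

```python
# Back-of-the-envelope: steady-state temperature drop across the interface
# between heatspreader and cooler cold plate, delta_T = P * L / (k * A).
# All thicknesses, conductivities, and the contact area are rough assumptions.

power_w = 250.0          # 2990WX rated TDP
area_m2 = 60e-3 * 45e-3  # assumed effective contact area under the cold plate

def delta_t(thickness_m, conductivity_w_mk):
    """Temperature drop across a uniform layer (1D conduction, no spreading)."""
    return power_w * thickness_m / (conductivity_w_mk * area_m2)

paste = delta_t(50e-6, 5.0)      # ~50 um of decent thermal paste
plastic = delta_t(100e-6, 0.2)   # ~100 um protective plastic film
stack = paste + plastic          # paste smeared on top of the plastic

print(f"Paste only:      ~{paste:.1f} K across the joint")
print(f"Plastic film:    ~{plastic:.1f} K on its own")
print(f"Paste + plastic: ~{stack:.1f} K before the cooler even starts working")
```

Even with generous assumptions, the film adds tens of kelvin of extra gradient that the silicon has to absorb as higher temperature, long before fan speed or radiator size gets a say.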

What does the user see in the system? Imagine a processor hitting 600 MHz while rendering, rather than a nice 3125 MHz at stock (see previous page). Base temperatures are higher, load temperatures are higher, case temperatures are higher. Might as well dry some wet clothes in there while you are at it. A little thermal energy never hurt a processor, but a lot can destroy an experience.

AMD’s XFR2

Ultimately this issue hurts AMD more than you might think. AMD's turbo modes are not a simple look-up table where the number of loaded cores determines the turbo frequency; they rely on the power, current, and thermal limits of a given chip. Where there is headroom, the platform is designed to add frequency and voltage. The thermal aspect of this is what AMD calls XFR2, or eXtended Frequency Range 2.
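AMD does not publish the exact algorithm, but the behaviour can be sketched as a simple control loop: keep stepping the frequency up while the chip stays inside its power, current, and thermal limits, and stop as soon as the next step would breach one of them. The limits, step size, and telemetry model in the sketch below are invented for illustration and are not AMD's real numbers.

```python
# Toy sketch of an opportunistic boost loop in the spirit of Precision Boost /
# XFR2: frequency is not a fixed table, it floats against power, current, and
# thermal limits. All limits and the telemetry model here are invented.

FREQ_MIN, FREQ_MAX, STEP = 2000, 4200, 25   # MHz, illustrative values only
LIMITS = {"package_power_w": 250.0, "current_a": 195.0, "temperature_c": 68.0}

def read_telemetry(freq_mhz):
    """Stand-in for on-die sensors; real silicon measures these directly."""
    power = 60.0 + 0.045 * (freq_mhz - FREQ_MIN)   # crude power-vs-frequency model
    return {"package_power_w": power,
            "current_a": power / 1.2,
            "temperature_c": 35.0 + 0.25 * power}

def pick_boost_frequency():
    freq = FREQ_MIN
    while freq + STEP <= FREQ_MAX:
        sample = read_telemetry(freq + STEP)
        if any(sample[key] >= LIMITS[key] for key in LIMITS):
            break            # the next step would breach a limit, stay put
        freq += STEP         # headroom remains, take the extra frequency
    return freq

print(f"Settled at ~{pick_boost_frequency()} MHz under these invented limits")
```

The point of the sketch is the coupling: raise the die temperature for a given power (say, with a plastic film under the cold plate) and the loop runs into its thermal limit at a much lower frequency.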

At AMD’s Tech Day for Threadripper 2, we were presented with graphs showing the effects of using better coolers on performance: around 10% better benchmark results due to having higher thermal headroom. Stick the system in an environment with a lower ambient temperature as well, and AMD quoted a 16% performance gain over a ‘stock’ system.

However, the reverse works too. That bit of plastic effectively lowered the thermal ceiling, from idle all the way through to load, which should show up as a drop in performance.

Plastic Performance

So despite being in a nice air-conditioned hotel room, that additional plastic did a number on most of our benchmarks. Here is the damage:

[Benchmark result graphs: 3D Particle Movement v2.1; Agisoft Photoscan 1.3.3, Complex Test; Corona 1.3 Benchmark; Blender 2.79b bmw27_cpu Benchmark; POV-Ray 3.7.1 Benchmark; WinRAR 5.60b3; PCMark10 Extended Score; Handbrake 1.1.0, 720p60 x264 6000 kbps Fast; FCAT Processing, ROTR 1440p GTX980Ti Data]

For all of our multi-threaded tests, where the CPU is hammered hard, there is a significant decrease in performance as expected. Blender saw a 20% decrease in throughput, POV-Ray was 10% lower, 3DPM was 19%. PCMark was only slightly lower, as it has a lot of single threaded tests, and annoyingly in some benchmarks we saw it swing the other way, such as WinRAR, which is more DRAM bound. Other benchmarks not listed include our compile test, where the plasticated system was 1% slower, or Dolphin, where there was a one-second difference.
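For completeness, the deltas quoted above are plain percentage changes between the plastic-on and plastic-off runs. A minimal helper is below, with placeholder scores rather than our measured data.

```python
# Percentage change between the 'plastic on' and 'plastic off' runs.
# The scores below are placeholders, not the review's measured numbers.

def percent_delta(with_plastic, without_plastic):
    """Negative result means the plastic-on run was slower."""
    return (with_plastic - without_plastic) / without_plastic * 100.0

runs = {
    "Blender (throughput)": (80.0, 100.0),      # placeholder scores
    "POV-Ray (score)":      (9000.0, 10000.0),  # placeholder scores
}

for name, (plastic_on, plastic_off) in runs.items():
    print(f"{name}: {percent_delta(plastic_on, plastic_off):+.1f}%")
```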

What Have I Learned?

Don’t be a fool. Building a test bed with new components when super tired may lead to extra re-tests.

Comments

  • MattZN - Monday, August 20, 2018 - link

    If it's idling at 80-85W, that implies you are running the memory fabric at 2800 or 3000 MHz or higher. Try running the fabric at 2666 MHz.

    Also keep in mind that a 2990WX running all 64 threads with a memory-heavy workload is almost guaranteed to be capped out by available memory bandwidth, so there's no point overclocking the CPU for those sorts of tests. In fact, you could try setting a lower PPT limit for the CPU core along with running the memory at 2666... you can probably chop 50-100W off the power consumption without changing the test results much (beyond the difference between 3000 and 2666).

    It's a bit unclear what you are loading the threads with. A computation-intensive workload will not load down the fabric much, meaning power will shift to the CPU cores and away from the fabric. A memory-intensive workload, on the other hand, will stall out the CPU cores (due to hitting the memory bandwidth cap that four memory channels gives you), yet run the fabric at full speed. This is probably why you are seeing the results you are seeing. The CPU cores are likely hitting so many stalls that they might as well be running at 2.8 GHz instead of 3.4 GHz, so they won't be using nearly as much power as you might expect.

    -Matt
  • XEDX - Monday, August 20, 2018 - link

    What happened to the Chromium compile rate for the 7980XE? In its own review, posted on Sep 25th 2017, it achieved 36.35 compiles per day, but in this review it dropped all the way down to 21.1.
  • jcc5169 - Saturday, August 25, 2018 - link

    Intel Will Struggle For Years And AMD Will Reap The Benefits-- SegmentNext https://segmentnext.com/
  • Relic74 - Wednesday, August 29, 2018 - link

    Regardless of the outcome, I went ahead and bought the 32-core version. As I run SmartOS, an OS designed to run and manage virtual machines, I decided to go this route over the 24-core Epyc. My setup includes the new MSI MEG X399, the 32-core TR, 128GB of DDR4 RAM, 3x Vega Frontier (used, $1000 for all three, no one wants them but I love them), and 1x Nvidia Titan Z (used, for only $700, an amazing find from a pawn shop; the seller did not know what he had and had it marked as an XP). Storage is 2x 1TB Samsung 970 Pro in RAID 0 and 5x 8TB SATA in RAID 5 with 8GB of cache on the card.

    The system is amazing and cost me much, much less than the iMac Pro I was about to buy. Now I can run any OS in a VM, including OSX, with a dedicated GPU per VM and cores allocated to each. This setup is amazing, SmartOS is amazing. I have stopped running OSes with every application installed; instead I create single-purpose VMs and install just one or maybe two applications in each. So for instance, the VM for a game like DCS, a fantastic flight simulator, only has DCS and Steam installed, allowing for the best performance possible. No, the loss of performance from running things in a VM is so minuscule that it's a non-issue. DCS with the Titan V runs at over 200 FPS at 4K with everything turned to its max values. I actually have to cap games to my gaming monitor's 144 Hz refresh rate. Not only that, but I can be playing the most demanding game there is, even in VR, while encoding a media file, while rendering something in Blender, while compiling an application, all tasks running under their own VM like an orchestra of perfection.

    Seriously, I will never go back to a one-OS-at-a-time machine again, not when SmartOS exists and especially not when 32 cores are available at your command. In fact, anyone who buys this CPU and just runs one single OS at a time is an idiot, as you will never, ever harness its full potential; no single application really can at the moment, or at least not to the point where it's worth doing.

    Most games don't need more than 4 cores, most design applications can't even use more than 2 cores, and rendering applications use more of the GPU than the CPU. In fact the only thing that really taxes my CPU is SmartOS itself, which is controlling everything, but even that doesn't need more than 6 cores to function perfectly; heck, I even gave it 12 cores but it didn't utilize them. So I have cores coming out of the yin-yang and more GPUs than I know what to do with. Aaaaahhhh poor, poor me.

    This computer will be with me for at least 10 years without my ever feeling that I need an upgrade, which is why I spent the money. Get it right the first time and then leave it alone, I say.

    Oh, and the memory management in SmartOS is incredible. I have set it up so that if a VM needs more RAM, it will just grab it from another that isn't using it at the moment; it's all dynamic. Man, I am in love.

    Anyway.....
  • Phaedra - Sunday, March 3, 2019 - link

    Hi Relic74,

    I enjoyed reading your lengthy post on the technical marvel that is SmartOS and the 32 Core TR.

    I am very much interested in the technical details of how you got SmartOS to work with AMD hardware. Which version of SmartOS, Windows, KVM (or BHYVE) with PCI passthrough etc?

    I am in the process of preparing my own threadripper hyper computer and would love some advice regarding the KVM + PCI passthrough process.

    You mention gaming in a VM so I assume that you used a Windows 10 guest via KVM with PCI passthrough?

    The following says SmartOS doesn't support KVM on AMD hardware: https://wiki.smartos.org/display/DOC/SmartOS+Techn...

    Did you build the special module with amd-kvm support:
    https://github.com/jclulow/illumos-kvm/tree/pre-ep...
    or
    https://github.com/arekinath/smartos-live

    I would appreciate any insight or links to documentation you could provide. I am familiar with Windows/Linux/BSD so you can let me have the nitty-gritty details, thanks
  • gbolcer - Wednesday, September 19, 2018 - link

    Curious why virtualization disabled?
  • Ozymankos - Sunday, January 27, 2019 - link

    Your tests are typical for a single-core machine, which is laughable.
    Please try to download a game with Steam, play some music, watch TV on a TV tuner card, play a game on 4, 6, or 8 monitors, and do some work like computing something in the background (not virus scanners, something intelligent like searching for life on other planets).
    Then you shall see the truth.
  • intel352 - Thursday, July 18, 2019 - link

    Old article obviously, but wth, numerous benchmark graphs are excluding the 2950X from the results. Pretty bad quality control.