AMD Stock Coolers: Wraith v2

When AMD launched the Wraith cooler last year, bundled with the premium FX CPUs and highest-performing APUs, it was a refreshing challenge to the long-held notion that a stock cooler isn't worth using if you want any sustained performance. The Wraith, and the 125W/95W silent versions of the Wraith, were built like third-party coolers, with a copper base/core, heatpipes, and a good fan. In our roundup of stock coolers, the Wraith clearly held the top spot, easily matching $30 coolers on the market, except now it was being given away with the CPUs/APUs that needed that level of cooling.

That was essentially a trial run for the Ryzen set of Wraith coolers. For the Ryzen 7 launch, AMD will have three models in play.

These are iterative designs on the original, with minor tweaks and aesthetic changes, but the concept is the same: a 65W near-silent design (Stealth), a 95W near-silent design (Spire), and a 95W/125W premium model (Max). The 125W models come with an RGB light (which can be disabled); however, AMD has stated that the premium model is currently destined for OEM and system integrator (SI) designs only. The other two will be bundled with the CPUs, or potentially be available at retail. We have asked to get the set in for review, to add to our Wraith numbers.

Memory Support

Every generation of CPUs comes with a ‘maximum supported memory frequency’. This is typically given as a single number, aligned with the industry-standard JEDEC sub-timings. Technically, most processors will go above and beyond this frequency, as the integrated memory controller supports a lot more; but the manufacturer only officially guarantees operation up to the maximum supported frequency on qualified memory kits.

For consumer chips, the frequency is usually given as a single number no matter how many memory slots are populated. In reality, when more memory modules are in play, more strain is put on the memory controller, so there is a higher potential for errors. This is why qualification is important: if the vendor guarantees a speed, any configuration of a qualified kit should work at that speed.

In the server market, a CPU manufacturer might list support a little differently: a supported frequency that depends on how many memory modules are in play, and what type of modules they are. This is arguably very confusing when applied at the consumer level, but at the server level it is expected that OEMs can handle the varying degrees of support.

For Ryzen, AMD is taking the latter approach. What we have is DDR4-2666 for the simplest configuration: one module per channel of single-rank UDIMMs. This moves down to DDR4-1866 for the most strenuous configuration: two modules per channel of dual-rank UDIMMs. For our testing, we ran the memory at DDR4-2400, for lack of a fixed option; however, we will have memory scaling numbers in due course. At present, ECC is supported.
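The configuration-dependent support described above can be sketched as a small lookup. This is a minimal illustration, not AMD's actual validation logic: only the two endpoint figures (DDR4-2666 and DDR4-1866) are stated here, so every other DIMM-count/rank combination is deliberately returned as unspecified rather than guessed.

```python
# Sketch of Ryzen's configuration-dependent official memory support.
# Only the two endpoint configurations from the text are filled in;
# other combinations are intentionally left out (None), not invented.

SUPPORTED_DDR4_SPEED = {
    # (modules per channel, module rank) -> officially supported speed (MT/s)
    (1, "single"): 2666,  # simplest configuration
    (2, "dual"): 1866,    # most strenuous configuration
}

def max_supported_speed(modules_per_channel: int, rank: str):
    """Return the officially supported DDR4 speed, or None if unlisted here."""
    return SUPPORTED_DDR4_SPEED.get((modules_per_channel, rank))

print(max_supported_speed(1, "single"))  # 2666
print(max_supported_speed(2, "dual"))    # 1866
```

Note that anything faster than the looked-up value is overclocking as far as the official specification is concerned, even if the memory controller handles it without issue.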

574 Comments


  • ABR - Sunday, March 5, 2017 - link

    Are there any examples of games at 1080p where this actually matters? (I.e., not a drop from 132 to 108 fps, but from 65 to 53 or 42 to 34?)
  • ABR - Monday, March 6, 2017 - link

    I mean at 1080p. (Edit, edit...)
  • 0ldman79 - Monday, March 6, 2017 - link

    That's my thought as well.

    Seriously, it isn't like we're talking unplayable, it is still ridiculous gaming levels. It is almost guaranteed to be a scheduler problem in Windows judging by the performance deficit compared to other applications. If it isn't, it is still running very, very well.

    Hell, I can play practically anything I can think of on my FX 6300, I don't really *need* a better CPU right now, I'm just really, really tempted and looking for excuses (I can't encode at the same speed in software as my Nvidia encoder, damn, I need to upgrade...)
  • Outlander_04 - Monday, March 6, 2017 - link

    Do you think anyone building a computer with a $500 US chip is going to just be spending $120 on a 1080p monitor?
    More likely they will be building it for higher resolutions.
  • Notmyusualid - Tuesday, March 7, 2017 - link

    I've seen it happen...
  • mdriftmeyer - Tuesday, March 7, 2017 - link

    Who gives a crap if you've seen it happen. Your experience is an anomaly relative to the totality of statistical data.
  • Notmyusualid - Wednesday, March 8, 2017 - link

    Or somebody was just happy with their existing screen?

    I can actually point to two friends with 1080 screens, both lovely water cooled rigs, one is determined to keep his high-freq 1080 screen, and the other one just doesn't care. So facts is facts son.

    I guess it is YOU that gives that crap after all.
  • Zaggulor - Thursday, March 9, 2017 - link

    Statistical data suggests that people don't actually often get a new display when they change a GPU and quite often that same display will be moved to a new rig too.

    Average upgrade times for components are:

    CPU: ~4.5 years
    GPU: ~2.5 years
    Display: ~7 years

    These days you can also use any unused GPU resources for downsampling even if your CPU can't push any more frames. Both GPU vendors have built-in support for it (VSR/DSR).
  • hyno111 - Wednesday, March 8, 2017 - link

    Or a $200 1080p/144Hz/Freesync monitor.
  • Marburg U - Sunday, March 5, 2017 - link

    I guess it's time to retire my Core 2 Quad.
