The Ryzen Die

Throughout the time leading up to the launch of Ryzen, AMD reaffirmed its commitment to at least a +40% IPC improvement over Excavator. This was specifically framed as a performance goal at equivalent energy per cycle, which amounts to a 40% increase in efficiency. At its Tech Day, AMD also listed an overall 2.7x performance-per-watt improvement, split between the new process and the design itself.
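
As a rough sanity check on those figures (an illustrative sketch only, not AMD's published breakdown), a +40% performance gain at the same energy per cycle is, by definition, a +40% performance-per-watt gain at a matched clock; the remaining factor needed to reach 2.7x would then have to come from process, frequency/voltage scaling and power management:

```python
# Illustrative arithmetic only: relates the quoted +40% IPC-at-iso-energy
# figure to the quoted 2.7x performance-per-watt figure. The split below is
# an assumption for illustration, not AMD's official attribution.

ipc_gain = 1.40                 # +40% IPC at equivalent energy per cycle
overall_perf_per_watt = 2.70    # AMD's quoted platform-level improvement

# At the same energy per cycle and the same clock, performance per watt
# scales directly with IPC.
design_contribution = ipc_gain

# Whatever is left over must come from elsewhere (process, frequency/voltage
# scaling, power management, uncore, and so on).
other_contribution = overall_perf_per_watt / design_contribution

print(f"Design (IPC at iso-energy): {design_contribution:.2f}x")
print(f"Remaining factor to reach 2.7x: {other_contribution:.2f}x")
# -> roughly 1.93x attributable to non-IPC factors under this naive model
```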

Obviously a number of benefits come from moving from the 28nm process to GloFo’s 14nm FinFET process, which is used via a Samsung license. Both the smaller node and the FinFET improvements have been well documented, so we won’t go over them here, but AMD is stating that Zen is much more than a process shrink: it is a direct improvement to performance, not just efficiency. While Zen launches first as a high-performance x86 core, it is designed to scale all the way from notebooks to supercomputers, or from where the Cat cores (such as Jaguar and Puma) sat all the way up to the old Opterons and beyond, all with at least +40% IPC.

The first image out of the presentation is the CPU Complex (a CCX), which shows the Zen core design as a four-core cluster with its caches. This shows the L2/L3 cache breakdown, and confirms 2 MB of L3 per core, for 8 MB of L3 per CCX. It also states that the L3 is mostly exclusive of the L2 cache, which follows from the L3 acting as a victim cache for L2 data. AMD states that the protocols involved in the L3 cache design allow each core to access the L3 of every other core, albeit with a range of latencies around the average.
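
To illustrate what a victim cache means in practice (a minimal, illustrative model only; it does not reflect Zen's actual line sizes, associativity or coherence protocol), the sketch below only moves a line into the L3 when it is evicted from the L2, so the two levels hold mostly disjoint data:

```python
from collections import OrderedDict

class Level:
    """Tiny fully-associative LRU cache model (illustrative only)."""
    def __init__(self, capacity_lines):
        self.capacity = capacity_lines
        self.lines = OrderedDict()          # address -> present

    def lookup(self, addr):
        if addr in self.lines:
            self.lines.move_to_end(addr)    # refresh LRU position
            return True
        return False

    def insert(self, addr):
        self.lines[addr] = True
        self.lines.move_to_end(addr)
        if len(self.lines) > self.capacity:
            # Return the evicted (victim) address
            return self.lines.popitem(last=False)[0]
        return None

# L3 as a victim cache: lines enter the L3 only when evicted from the L2,
# so L2 and L3 contents stay mostly exclusive of each other.
l2 = Level(capacity_lines=4)
l3 = Level(capacity_lines=16)

def access(addr):
    if l2.lookup(addr):
        return "L2 hit"
    if l3.lookup(addr):
        # Promote back to L2; the line this pushes out of L2 drops into L3.
        del l3.lines[addr]
        victim = l2.insert(addr)
        if victim is not None:
            l3.insert(victim)
        return "L3 hit"
    # Miss everywhere: fill straight into L2, spill any L2 victim into L3.
    victim = l2.insert(addr)
    if victim is not None:
        l3.insert(victim)
    return "memory"

for a in [0, 1, 2, 3, 4, 0]:
    print(a, access(a))   # the final access to 0 is served from the L3
```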

Over the next few pages, we’ll go through the slides. They detail more information about the application of Simultaneous Multithreading (SMT), new instructions, the sizes of various queues and buffers, and the back-end and front-end of the design, covering the fetch, decode, execute, load/store and retire segments.

Comments

  • ABR - Sunday, March 5, 2017 - link

    Are there any examples of games at 1080p where this actually matters? (I.e., not a drop from 132 to 108 fps, but from 65 to 53 or 42 to 34?)
  • ABR - Monday, March 6, 2017 - link

    I mean at 1080p. (Edit, edit...)
  • 0ldman79 - Monday, March 6, 2017 - link

    That's my thought as well.

    Seriously, it isn't like we're talking unplayable, it is still ridiculous gaming levels. It is almost guaranteed to be a scheduler problem in Windows judging by the performance deficit compared to other applications. If it isn't, it is still running very, very well.

    Hell, I can play practically anything I can think of on my FX 6300, I don't really *need* a better CPU right now, I'm just really, really tempted and looking for excuses (I can't encode at the same speed in software as my Nvidia encoder, damn, I need to upgrade...)
  • Outlander_04 - Monday, March 6, 2017 - link

    Do you think anyone building a computer with a $500 US chip is going to just be spending $120 on a 1080p monitor?
    More likely they will be building it for higher resolutions
  • Notmyusualid - Tuesday, March 7, 2017 - link

    I've seen it happen...
  • mdriftmeyer - Tuesday, March 7, 2017 - link

    Who gives a crap if you've seen it happen. Your experience is an anomaly relative to the totality of statistical data.
  • Notmyusualid - Wednesday, March 8, 2017 - link

    Or somebody was just happy with their existing screen?

    I can actually point to two friends with 1080 screens, both lovely water cooled rigs, one is determined to keep his high-freq 1080 screen, and the other one just doesn't care. So facts is facts son.

    I guess it is YOU that gives that crap after all.
  • Zaggulor - Thursday, March 9, 2017 - link

    Statistical data suggests that people don't actually often get a new display when they change a GPU and quite often that same display will be moved to a new rig too.

    Average upgrade times for components are:

    CPU: ~4.5 years
    GPU: ~2.5 years
    Display: ~7 years

    These days you can also use any unused GPU resources for downsampling even if your CPU can't push any more frames. Both GPU vendors have built-in support for it (VSR/DSR).
  • hyno111 - Wednesday, March 8, 2017 - link

    Or a $200 1080p/144Hz/Freesync monitor.
  • Marburg U - Sunday, March 5, 2017 - link

    I guess it's time to retire my Core 2 Quad.
