Performance - An Update

The Chipworks PS4 teardown last week told us a lot about how the Xbox One and PlayStation 4 compare in terms of hardware. It turns out that Microsoft’s silicon budget was actually a little larger than Sony’s, at least for the main APU. The Xbox One APU is a 363mm² die, compared to 348mm² for the PS4’s APU. Both use a similar 8-core Jaguar CPU (2 x quad-core islands), but they feature different implementations of AMD’s Graphics Core Next GPUs. Microsoft elected to implement 12 compute units, two geometry engines and 16 ROPs, while Sony went for 18 CUs, two geometry engines and 32 ROPs. How did Sony manage to fit more compute and ROP hardware into a smaller die? By not including any eSRAM on-die.

While both APUs implement a 256-bit wide memory interface, Sony chose GDDR5 memory running at a 5.5GHz data rate. Microsoft stuck with more readily available DDR3 memory running at less than half that speed (2133MHz data rate). To make up for the bandwidth deficit, Microsoft included 32MB of eSRAM on its APU to service some of the GPU’s bandwidth needs. The eSRAM is accessible in 8MB chunks and offers a total of 204GB/s of bandwidth (102GB/s in each direction). The eSRAM is designed for GPU access only; CPU access requires a copy to main memory.
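
As a sanity check on those figures, peak DRAM bandwidth is just bus width times data rate. A minimal sketch of the arithmetic, using the specs quoted above:

```python
# Peak DRAM bandwidth = (bus width in bytes) x (data rate in MT/s).
# These are theoretical peaks; real-world effective bandwidth is lower.

def peak_bw_gb_s(bus_width_bits: int, data_rate_mts: int) -> float:
    """Peak bandwidth in GB/s for a DRAM interface."""
    return (bus_width_bits / 8) * data_rate_mts / 1000

print(f"Xbox One DDR3-2133, 256-bit: {peak_bw_gb_s(256, 2133):.1f} GB/s")  # ~68.3
print(f"PS4 GDDR5-5500, 256-bit:     {peak_bw_gb_s(256, 5500):.1f} GB/s")  # ~176.0
print(f"Xbox One eSRAM, 2 x 102GB/s: {2 * 102} GB/s total")                # 204
```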

Unlike Intel’s Crystalwell, the eSRAM isn’t a cache - instead it’s mapped to a specific address range in memory. And unlike the embedded DRAM in the Xbox 360, the eSRAM in the One can hold more than just a render target or Z-buffer. Virtually any GPU-accessible surface or buffer can now be stored in eSRAM (e.g. Z-buffer, G-buffer, stencil buffer, shadow buffer, etc.). Developers can also choose to store things like important textures there; nothing dictates that the eSRAM hold one of these buffer types, just whatever data the developer finds important. It’s also possible for a single surface to be split between main memory and eSRAM.
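
To make the "mapped, not cached" distinction concrete, here’s a deliberately simplified sketch. The constants, helper name and splitting logic are all invented for illustration; the real XDK interface isn’t public and certainly differs:

```python
# Hypothetical sketch: address-mapped embedded memory, not a cache.
# The developer explicitly decides what lives in the 32MB window; a
# surface that doesn't fit can be split across eSRAM and DDR3.
# All names and sizes below are invented for illustration only.

MB = 1024 * 1024
ESRAM_SIZE = 32 * MB   # 32MB, exposed as 8MB chunks

def place_surface(name: str, size: int, esram_free: int) -> dict:
    """Split a surface between eSRAM and DDR3 when it doesn't fully fit."""
    in_esram = min(size, esram_free)
    return {"surface": name, "esram_mb": in_esram / MB,
            "ddr3_mb": (size - in_esram) / MB}

# e.g. a fat G-buffer that overflows what's left of the 32MB window:
free = ESRAM_SIZE
for surf, size in [("depth", 8 * MB), ("g-buffer", 30 * MB)]:
    placement = place_surface(surf, size, free)
    free -= int(placement["esram_mb"] * MB)
    print(placement)
```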

Sticking important buffers and other frequently used data here can obviously reduce demands on the memory interface, which should help Microsoft get by with only ~68GB/s of system memory bandwidth. Microsoft has claimed publicly that actual bandwidth to the eSRAM is somewhere in the 140 - 150GB/s range, which is likely about equal to the effective bandwidth (after overhead/efficiency losses) of the PS4’s GDDR5 memory interface. The difference is that on the Xbox One you only get that bandwidth to your most frequently used data. It’s still not clear to me what effective memory bandwidth looks like on the Xbox One; I suspect it’s still a bit lower than on the PS4, but after talking with Ryan Smith (AT’s Senior GPU Editor) I’m now wondering whether memory bandwidth is really the issue here.
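
For a rough sense of how those claims line up (the efficiency factors here are my assumptions, not published figures):

```python
# Microsoft's claimed 140-150GB/s of realized eSRAM bandwidth implies
# roughly 70% efficiency against the 204GB/s peak. Applying a typical
# (assumed) GDDR5 efficiency to the PS4's peak lands in the same range.
esram_peak, esram_claimed = 204, 145      # GB/s; 145 = midpoint of claim
print(f"eSRAM efficiency: {esram_claimed / esram_peak:.0%}")   # ~71%
print(f"PS4 GDDR5 at an assumed 80%: {176 * 0.80:.0f} GB/s")   # ~141 GB/s
```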

Microsoft Xbox One vs. Sony PlayStation 4 Spec Comparison

|                           | Xbox 360            | Xbox One                               | PlayStation 4     |
|---------------------------|---------------------|----------------------------------------|-------------------|
| CPU Cores/Threads         | 3/6                 | 8/8                                    | 8/8               |
| CPU Frequency             | 3.2GHz              | 1.75GHz                                | 1.6GHz            |
| CPU µArch                 | IBM PowerPC         | AMD Jaguar                             | AMD Jaguar        |
| Shared L2 Cache           | 1MB                 | 2 x 2MB                                | 2 x 2MB           |
| GPU Cores                 | -                   | 768                                    | 1152              |
| GCN Geometry Engines      | -                   | 2                                      | 2                 |
| GCN ROPs                  | -                   | 16                                     | 32                |
| GPU Frequency             | -                   | 853MHz                                 | 800MHz            |
| Peak Shader Throughput    | 0.24 TFLOPS         | 1.31 TFLOPS                            | 1.84 TFLOPS       |
| Embedded Memory           | 10MB eDRAM          | 32MB eSRAM                             | -                 |
| Embedded Memory Bandwidth | 32GB/s              | 102GB/s bi-directional (204GB/s total) | -                 |
| System Memory             | 512MB 1400MHz GDDR3 | 8GB 2133MHz DDR3                       | 8GB 5500MHz GDDR5 |
| System Memory Bus         | 128-bit             | 256-bit                                | 256-bit           |
| System Memory Bandwidth   | 22.4GB/s            | 68.3GB/s                               | 176.0GB/s         |
| Manufacturing Process     | -                   | 28nm                                   | 28nm              |
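
The peak shader throughput rows in the table fall out of core count and clock directly; each GCN ALU can retire one fused multiply-add (two FLOPs) per cycle:

```python
# Peak single-precision throughput = shader cores x clock x 2 FLOPs/cycle
# (one FMA per ALU per cycle on GCN).

def peak_tflops(shader_cores: int, clock_mhz: int) -> float:
    return shader_cores * clock_mhz * 2 / 1e6

print(f"Xbox One (768 @ 853MHz): {peak_tflops(768, 853):.2f} TFLOPS")   # 1.31
print(f"PS4 (1152 @ 800MHz):     {peak_tflops(1152, 800):.2f} TFLOPS")  # 1.84
```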

In order to accommodate the eSRAM on die, Microsoft not only had to move to a 12 CU GPU configuration, it also dropped down to 16 ROPs (half that of the PS4). The ROPs (render output units/raster operations pipes) are responsible for final pixel output, and at the resolutions these consoles are targeting, 16 ROPs makes the Xbox One the odd man out compared to PC GPUs. AMD’s GPUs targeting 1080p typically come with 32 ROPs, which is where the PS4 sits, but the Xbox One ships with half that. The difference in raw shader performance (12 CUs vs. 18 CUs) can definitely show up in games that run more complex lighting routines and other long shader programs on each pixel, but the more recent reports of resolution differences between Xbox One and PS4 games at launch are likely the result of being ROP bound on the One. This is probably why Microsoft claimed it saw a bigger increase in realized performance from raising the GPU clock from 800MHz to 853MHz than from adding two extra CUs. The ROPs operate at the GPU clock, so in a ROP-bound scenario an increase in GPU clock buys more performance than additional compute hardware.
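
Some quick math (mine, not Microsoft’s) on the two upgrade options makes the ROP-bound argument clearer:

```python
# Fill rate scales with ROPs x clock; shader throughput with CUs x clock.
# Compare Microsoft's two options: enable 2 more CUs, or raise the clock.

base_mhz, bumped_mhz = 800, 853

cu_option    = (14 / 12 - 1) * 100                # +16.7% compute, +0% fill rate
clock_option = (bumped_mhz / base_mhz - 1) * 100  # +6.6% to CUs AND ROPs

print(f"+2 CUs:     +{cu_option:.1f}% compute, fill rate unchanged")
print(f"clock bump: +{clock_option:.1f}% compute and fill rate")

# Resulting peak fill rates (Gpixels/s = ROPs x clock):
print(f"Xbox One: {16 * bumped_mhz / 1000:.1f} Gpix/s vs PS4: {32 * base_mhz / 1000:.1f} Gpix/s")
```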

The PS4's APU - Courtesy Chipworks

Microsoft’s admission that Xbox One dev kits have 14 CUs does make me wonder what the Xbox One die looks like. Chipworks found that the PS4’s APU actually features 20 CUs, despite exposing only 18 to game developers. I suspect those last two are there for defect mitigation, i.e. to increase effective yields when a die has bad CUs; I wonder if the same isn’t true for the Xbox One.
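
A toy yield model shows why building 20 CUs and shipping 18 pays off. The per-CU survival probability below is invented purely for illustration:

```python
# Toy binomial yield model: if each CU independently survives fab with
# probability p, requiring only 18 of 20 good CUs salvages many dice
# that a strict 18-of-18 design would scrap. p is an assumed value.
from math import comb

def usable_fraction(needed: int, built: int, p: float) -> float:
    return sum(comb(built, k) * p**k * (1 - p)**(built - k)
               for k in range(needed, built + 1))

p = 0.98  # assumed per-CU survival rate, for illustration only
print(f"18 CUs built, all must work: {usable_fraction(18, 18, p):.1%}")  # ~69.5%
print(f"20 CUs built, 18 must work:  {usable_fraction(18, 20, p):.1%}")  # ~99.3%
```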

At the end of the day, Microsoft appears to have ended up with its GPU configuration not for silicon cost reasons, but for platform power/cost and component availability reasons. Sourcing DDR3 is much easier than sourcing high density GDDR5. Sony obviously managed to launch with a ton of GDDR5 just fine, but I can definitely understand why Microsoft would be hesitant to go down that route in the planning stages of the Xbox One. To put some numbers in perspective: Sony has shipped 1 million PS4s thus far. That's 16 million GDDR5 chips, or 7.6 petabytes of RAM. Had both Sony and Microsoft tried to do this, I do wonder if GDDR5 supply would've become a problem. That's a ton of RAM in a very short period of time. The only other major consumer of GDDR5 is video cards, and the list of cards sold in the last couple of months that would ever use that kind of RAM is a narrow one.
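
Checking the arithmetic on that (the chip density is inferred from the 16-chips-per-console figure above):

```python
# 8GB of GDDR5 per PS4 across 16 chips implies 4Gb (512MB) parts.
consoles    = 1_000_000
gb_per_ps4  = 8
gb_per_chip = 0.5            # 4Gb GDDR5 chips

chips     = consoles * gb_per_ps4 / gb_per_chip
petabytes = consoles * gb_per_ps4 / 1024**2   # GiB -> PiB, binary units

print(f"{chips / 1e6:.0f} million GDDR5 chips")  # 16 million
print(f"{petabytes:.1f} PB of RAM")              # ~7.6
```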

Microsoft will obviously have an easier time scaling its platform down over the years (eSRAM should shrink nicely at smaller geometry processes), but that’s not a concern to the end user unless Microsoft chooses to aggressively pass along cost savings.

Comments

  • lilkwarrior - Friday, November 22, 2013 - link

    It is a shame to hear about the reported use of SATA II and the lack of 802.11ac in both consoles.

    Given that some of the games are over 40GB in size, something tells me that'll need to be addressed with the inevitable Xbox One Slim and PS4 Slim models that'll come out about 2-3 years from now.

    Especially Microsoft's odd stance on not allowing the hard drive to be removed. The PS4's wireless limitations are sort of an odd decision given the stigma of their PSN network being slow and unresponsive; any help from the hardware to be at current standards and future-proof would have been welcome.

    Excluding China obviously, some of the fastest broadband infrastructures in the world (e.g. South Korea) are in Asia. I would think that they would have at least taken Microsoft's route and made dual-band 802.11n connections available.

    It's weird that even Sony's standard phones connect to the internet and download over WiFi faster than their flagship console. It makes little sense.

    Disappointed I'll have to wait this gen out for 2-3 years. By then, hopefully the ripple effect of SteamOS, Steam Controller, G-Sync, Mantle, Oculus Rift, 4K gaming, and so on will be evident enough to even consider buying either console outside of exclusives.
  • lilkwarrior - Friday, November 22, 2013 - link

    *any help from the hardware to at least accommodate common American wireless speeds and be a bit more future-proof would have helped improve the perception of PSN for Sony.
  • vol7ron - Friday, November 22, 2013 - link

    In the Gravity demo, about 0:02 in, it was interesting to see the difference in the astronaut falling.

    To me, it appeared that the 360 had higher contrast, but there were also other inconsistencies. A black bar ran across the leg of another astronaut in the scene -- I suspect this was debris -- but more notably the astronaut's face shield was blacked out on the 360, whereas the XB1 showed the full face.

    In terms of quality, due to the higher contrast, it actually seemed like the 360 won out there. However, as expressed, in all the other scenes, despite brighter lighting, the XB1 had much better detail and more noticeable edges -- the 360 was much softer and less defined.

    What I don't understand is the naming convention. Why XB1? It's not the first XBox,
  • Blaster1618 - Friday, November 22, 2013 - link

    No Minecraft yet on PS4; that's a deal breaker for our family. I want to drive a McLaren P1.
  • blitzninja - Saturday, November 23, 2013 - link

    All this talk about specs and even the "higher spec console loses the war" nonsense is so stupid, just stop.

    You guys here on AnandTech need to realize that you live in your own little bubble, and while you may know a lot about the consoles, the casual consumer market (which is most people) has different priorities. So why did Nintendo beat its competitors with the Wii despite horrible specs? The experience.

    Yes, there is a performance difference between the PS4 and the XO, but what really matters is how the console feels and whether it does what people want it to do. This is where the Wii comes in (the Wii U was a flop because they actually went backwards in this regard). Most of the console market is made up of casual gamers. Casual gamers like to invite their friends over and have a LAN party or play party games, play with their family (this includes younger audiences), watch movies together and play music at times. The Wii dominated the market because its new control interface(s) added the missing piece for this market; it was extremely versatile and made playing all that more fun than the other consoles.

    This is why Nintendo didn't really beef up the Wii U, they just added the extra power to allow for more advanced and precise gesture computation.

    So why isn't the Wii U dominating again? Well for starters, most people who have a Wii are satisfied with it and are not out to buy a new one; the Wii U doesn't add anything spectacular that would make the majority of its target market want to upgrade.

    The reason the higher spec console ended up losing is that when the company developed the console, it focused its resources on performance and as a result cut back on the usability and experience aspect. But that isn't necessarily always the case; it all depends on what the console's focus experience (its market) is and how well polished that experience is.

    If Microsoft wants to win the war it needs to pander to the needs of the casual market. Not to say it should copy Nintendo, but it has another market: the all-in-one market. That is to say, make the XO a future PVR, set-top-box and media/streaming centre. Replace the HTPC with a low cost alternative. Most decent HTPCs fall into the $500-$700 range for those who want some light gaming too. The XO would absolutely destroy this market with the proper hardware and software support, being a console for mid-high end gaming while still being a multimedia powerhouse that does a multitude of things. This includes voice recognition, a killer feature, if done right.

    If I could say "latest episode of the walking dead" or some other show and it worked, then gg Sony, you just got rolled.

    @AnandTech: Fix your forum/comment software, not having an edit button is really annoying
  • Hixbot - Sunday, November 24, 2013 - link

    The Wii dominated sales at first; it captured a market of casual gamers that otherwise wouldn't have bought a console. That market didn't buy many games (a low attach rate), and it grew tired of the Wii once smartphone and Facebook games and the like arrived.
    Wii sales slumped, and in the end the 360 and PS3 each surpassed it in total by 2012.
    For those of us hardcore gamers who are also Nintendo fans, the Wii left a bad taste in our mouths. The outstanding titles were few and far between, and the rest was shovelware. True motion control never really materialized in many games; most just made use of a "waggle" gimmick.
    Wii-u comes out, casual gamers have already moved on, and the hardcores are reluctant to jump into another gimmick "tablet" just for the Nintendo software.

    Disclosure: As a big N fan, I bought a wii-u for the Nintendo 1st party titles. Others like me are the only people buying this thing.
  • Exodite - Saturday, November 23, 2013 - link

    Thanks for the mini-review, much appreciated! Some interesting technical information no doubt.

    Personally I'm more keen on the PS4, primarily due to having had good experiences with Sony equipment in general, as well as the price. We currently have a Sony Blu-ray player (the BDP-470S) and I'd have loved to replace it with a gaming-capable alternative (that also does Netflix), but alas that's unlikely unless Sony can squeeze CD, MP3 and most importantly DLNA support into the machine.

    Anyway, I'm also concerned about the noise levels of the machines, as I have quite sensitive ears and find even my current Blu-ray player to be something of a hair dryer when playing back discs, BD discs in particular.
  • ol1bit - Saturday, November 23, 2013 - link

    That was an awesome mini review! One of the best reviews I've read about these new titans.

    I'm really surprised that the 8 year old 360 hardware is as close as it is! A tad old now, but a great book to read on the old hardware is "The Race for a New Game Machine". This book really shows how MS pulled some fast ones on Sony and ended up with the better plan.

    This time it looks like Sony really kept everything under wraps better and has at least a slight upper hand. There's no way MS can make its hardware better/faster at this point. Good enough? Maybe... time will tell.
  • nerd1 - Saturday, November 23, 2013 - link

    It's funny that those 'next gen' consoles are actually on par with some gaming laptops and nothing better. PC is the best gaming console again.
