Performance - An Update

The Chipworks PS4 teardown last week told us a lot about what's happened between the Xbox One and PlayStation 4 in terms of hardware. It turns out that Microsoft's silicon budget was actually a little larger than Sony's, at least for the main APU. The Xbox One APU is a 363mm² die, compared to 348mm² for the PS4's APU. Both use a similar 8-core Jaguar CPU (2 x quad-core islands), but they feature different implementations of AMD's Graphics Core Next GPUs. Microsoft elected to implement 12 compute units, two geometry engines and 16 ROPs, while Sony went for 18 CUs, two geometry engines and 32 ROPs. How did Sony manage to fit more compute and ROP partitions into a smaller die area? By not including any eSRAM on-die.

While both APUs implement a 256-bit wide memory interface, Sony chose GDDR5 memory running at a 5.5GHz data rate, while Microsoft stuck with more readily available DDR3 memory running at less than half that speed (a 2133MHz data rate). To make up for the bandwidth deficit, Microsoft included 32MB of eSRAM on its APU to alleviate some of the GPU's bandwidth needs. The eSRAM is accessible in 8MB chunks, with a total of 204GB/s of bandwidth on offer (102GB/s in each direction). The eSRAM is designed for GPU access only; CPU access requires a copy to main memory.
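As a quick sanity check on those numbers, peak DRAM bandwidth is simply bus width times data rate. A minimal sketch in Python, using the bus widths and data rates quoted above (real-world efficiency will be lower):

```python
# Peak DRAM bandwidth = (bus width in bytes) x (data rate in GT/s).
# Note: the eSRAM's 204GB/s figure is a separate spec (102GB/s read +
# 102GB/s write), not derived from this formula.

def peak_bandwidth_gbps(bus_width_bits: int, data_rate_mts: int) -> float:
    """Peak bandwidth in GB/s for a DDR-style memory interface."""
    return (bus_width_bits / 8) * (data_rate_mts / 1000)

print(f"Xbox One DDR3-2133, 256-bit: {peak_bandwidth_gbps(256, 2133):.1f} GB/s")  # ~68.3
print(f"PS4 GDDR5-5500, 256-bit:     {peak_bandwidth_gbps(256, 5500):.1f} GB/s")  # ~176.0
```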

Unlike Intel's Crystalwell, the eSRAM isn't a cache - instead it's mapped to a specific address range in memory. And unlike the embedded DRAM in the Xbox 360, the eSRAM in the One can hold more than just a render target or Z-buffer. Virtually any type of GPU accessible surface/buffer can be stored in eSRAM (e.g. Z-buffer, G-buffer, stencil buffer, shadow buffer, etc.). Developers can also choose to store other data, such as important textures, in eSRAM; nothing requires the contents to be one of these buffer types, just whatever the developer finds important. It's also possible for a single surface to be split between main memory and eSRAM.
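To illustrate the distinction, here's a minimal, purely hypothetical sketch of address-range placement (the base address and the function are illustrative only, not the actual Xbox One allocation APIs): a surface lives in eSRAM simply because its allocation falls inside, or straddles, the 32MB window, rather than being cached transparently.

```python
# Hypothetical model of address-mapped eSRAM vs. transparent caching.
# The eSRAM occupies a fixed physical address window; a surface "lives"
# there because its allocation falls inside that range, and a single
# surface can straddle the boundary, part in eSRAM and part in DRAM.

ESRAM_BASE = 0x0000_0000          # illustrative base address, not the real map
ESRAM_SIZE = 32 * 1024 * 1024     # 32MB window

def placement(base_addr: int, size: int) -> str:
    """Report how much of an allocation lands inside the eSRAM window."""
    esram_end = ESRAM_BASE + ESRAM_SIZE
    overlap = max(0, min(base_addr + size, esram_end) - max(base_addr, ESRAM_BASE))
    if overlap == size:
        return "entirely in eSRAM"
    if overlap == 0:
        return "entirely in DRAM"
    return f"split: {overlap // (1024 * 1024)}MB in eSRAM, rest in DRAM"

# e.g. a 32MB G-buffer allocated 8MB below the end of the eSRAM window
print(placement(ESRAM_BASE + 24 * 1024 * 1024, 32 * 1024 * 1024))
# split: 8MB in eSRAM, rest in DRAM
```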

Obviously, keeping important buffers and other frequently used data here can reduce demands on the memory interface, which should help Microsoft get by with only ~68GB/s of system memory bandwidth. Microsoft has claimed publicly that actual bandwidth to the eSRAM is somewhere in the 140 - 150GB/s range, which is likely about equal to the effective bandwidth (after overhead/efficiency losses) of the PS4's GDDR5 memory interface. The difference is that on the Xbox One you only get that bandwidth to your most frequently used data. It's still not clear to me what effective memory bandwidth looks like on the Xbox One; I suspect it's still a bit lower than on the PS4, but after talking with Ryan Smith (AT's Senior GPU Editor) I'm now wondering if memory bandwidth isn't really the issue here.
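Putting rough numbers on that claim: assuming a typical ~80% efficiency factor for GDDR5 (my assumption, not a measured figure), the PS4's effective bandwidth lands right in Microsoft's quoted eSRAM range:

```python
# Back-of-the-envelope estimate; the 80% efficiency factor is an assumption.
ps4_peak_gbps = 176.0
assumed_efficiency = 0.80

print(f"PS4 effective GDDR5 bandwidth: ~{ps4_peak_gbps * assumed_efficiency:.0f} GB/s")
# ~141 GB/s -- in the same ballpark as the 140-150GB/s Microsoft quotes for
# the eSRAM, except on the Xbox One that bandwidth only covers the 32MB of
# data you keep there.
```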

Microsoft Xbox One vs. Sony PlayStation 4 Spec Comparison

                            Xbox 360              Xbox One                PlayStation 4
CPU Cores/Threads           3/6                   8/8                     8/8
CPU Frequency               3.2GHz                1.75GHz                 1.6GHz
CPU µArch                   IBM PowerPC           AMD Jaguar              AMD Jaguar
Shared L2 Cache             1MB                   2 x 2MB                 2 x 2MB
GPU Cores                   -                     768                     1152
GCN Geometry Engines        -                     2                       2
GCN ROPs                    -                     16                      32
GPU Frequency               -                     853MHz                  800MHz
Peak Shader Throughput      0.24 TFLOPS           1.31 TFLOPS             1.84 TFLOPS
Embedded Memory             10MB eDRAM            32MB eSRAM              -
Embedded Memory Bandwidth   32GB/s                102GB/s bi-directional  -
                                                  (204GB/s total)
System Memory               512MB 1400MHz GDDR3   8GB 2133MHz DDR3        8GB 5500MHz GDDR5
System Memory Bus           128-bit               256-bit                 256-bit
System Memory Bandwidth     22.4GB/s              68.3GB/s                176.0GB/s
Manufacturing Process       -                     28nm                    28nm
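The peak shader throughput figures in the table fall straight out of the GCN math: each CU contains 64 ALUs, and each ALU can execute one fused multiply-add (two FLOPs) per clock.

```python
# Peak GCN shader throughput = CUs x 64 ALUs x 2 FLOPs (FMA) x clock (GHz).

def peak_tflops(compute_units: int, clock_ghz: float) -> float:
    return compute_units * 64 * 2 * clock_ghz / 1000

print(f"Xbox One (12 CUs @ 853MHz): {peak_tflops(12, 0.853):.2f} TFLOPS")  # ~1.31
print(f"PS4 (18 CUs @ 800MHz):      {peak_tflops(18, 0.800):.2f} TFLOPS")  # ~1.84
```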

To accommodate the eSRAM on die, Microsoft not only had to move to a 12 CU GPU configuration, it also cut back to 16 ROPs (half the PS4's count). The ROPs (render output/raster operations pipes) are responsible for final pixel output, and at the resolutions these consoles are targeting, 16 ROPs definitely makes the Xbox One the odd man out compared to PC GPUs. AMD's GPUs targeting 1080p typically come with 32 ROPs, which is where the PS4 sits; the Xbox One ships with half that. The difference in raw shader performance (12 CUs vs. 18 CUs) can certainly show up in games that run more complex lighting routines and other long shader programs on each pixel, but the more recent reports of resolution differences between Xbox One and PS4 games at launch are likely the result of being ROP bound on the One. This is probably why Microsoft claimed it saw a bigger increase in realized performance from raising the GPU clock from 800MHz to 853MHz than from adding two extra CUs. The ROPs operate at the GPU clock, so in a ROP bound scenario a clock increase improves performance more than adding compute hardware would.
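To put the ROP deficit in numbers, peak pixel fill rate is just ROPs times GPU clock, which also shows why a clock bump helps a ROP-bound workload while extra CUs would not (a quick sketch):

```python
# Peak pixel fill rate = ROPs x GPU clock. When ROP bound, this (not shader
# throughput) caps how many pixels per second can be written out.

def fill_rate_gpixels(rops: int, clock_mhz: int) -> float:
    return rops * clock_mhz / 1000

print(f"Xbox One: {fill_rate_gpixels(16, 853):.1f} Gpixels/s")  # ~13.6
print(f"PS4:      {fill_rate_gpixels(32, 800):.1f} Gpixels/s")  # ~25.6

# The two options Microsoft weighed:
print(f"800 -> 853MHz clock bump: +{853/800 - 1:.1%} fill rate")                    # ~+6.6%
print(f"12 -> 14 CUs:             +{14/12 - 1:.1%} shader math, no extra fill rate") # ~+16.7%
```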

The PS4's APU - Courtesy Chipworks

Microsoft's admission that the Xbox One dev kits have 14 CUs does make me wonder what the Xbox One die looks like. Chipworks found that the PS4's APU actually features 20 CUs, despite only exposing 18 to game developers. I suspect those last two are there for defect mitigation, to increase effective yields in the case of bad CUs; I wonder if the same isn't true for the Xbox One.

At the end of the day Microsoft appears to have arrived at its GPU configuration not for silicon cost reasons, but for platform power/cost and component availability reasons. Sourcing DDR3 is much easier than sourcing high density GDDR5. Sony obviously managed to launch with a ton of GDDR5 just fine, but I can definitely understand why Microsoft would be hesitant to go down that route in the planning stages of the Xbox One. To put some numbers in perspective, Sony has shipped 1 million PS4s thus far. That's 16 million GDDR5 chips, or 7.6 petabytes of RAM. Had both Sony and Microsoft tried to do this, I do wonder if GDDR5 supply would've become a problem. That's a ton of RAM in a very short period of time. The only other major consumer of GDDR5 is video cards, and the subset of cards sold in the last couple of months that would ever use that much RAM is a narrow one.
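For reference, the arithmetic behind those figures, assuming the PS4's 8GB comes from sixteen 4Gb (512MB) GDDR5 packages, which is consistent with the 16-chips-per-console count above:

```python
# Rough arithmetic behind the GDDR5 volume figure above.
consoles       = 1_000_000   # PS4s shipped so far
chips_per_ps4  = 16          # 16 x 4Gb (512MB) GDDR5 packages = 8GB
gb_per_console = 8

total_chips = consoles * chips_per_ps4              # 16,000,000 chips
total_pb    = consoles * gb_per_console / 1024**2   # GB -> PB (binary units)

print(f"{total_chips:,} GDDR5 chips, ~{total_pb:.1f} PB of RAM")
# 16,000,000 GDDR5 chips, ~7.6 PB of RAM
```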

Microsoft will obviously have an easier time scaling its platform down over the years (eSRAM should shrink nicely at smaller geometry processes), but that’s not a concern to the end user unless Microsoft chooses to aggressively pass along cost savings.

286 Comments

  • F00L1Sh - Friday, November 22, 2013

    I found this explanation very helpful.
  • beefgyorki - Wednesday, November 20, 2013

    Anand, when MS initially talked about the Xbox One OS design, from their description it certainly sounded like the Xbox OS (i.e. the gaming OS) was just a VM running on top of a hypervisor. Given that, in theory that VM could be modified to run on, say, a Windows desktop PC or potentially even a tablet.

    With one in hand now, is there anything that can be done to shed some light on that possibility?

    To me the most intriguing aspect of XB1 is the OS if it truly is just a VM because that could open up some really interesting possibilities down the road.
  • flyingpants1 - Wednesday, November 20, 2013

    What do you mean "just a VM", don't you realise the Xbox 360 OS was running in a VM too?
  • Elooder2 - Thursday, November 21, 2013

    This. Was the Xbox 360 on an x86 CPU? No. But the Xbone is. Therefore it seems logical to consider that if there is a possibility of somehow "extracting" the actual VM from the XBone, it could be made to run on a normal Windows PC with much less modification and hassle than the Xbox 360 VM, because there's no need to worry about the difference in architecture. Basically, I perceive that the biggest deterrent to making an "emulator" of the XBone (via a VM) is some form of software or hardware DRM. The Mac has a similar mechanism in Mac OS which will not let you install that OS on a regular PC because the regular PC doesn't have some extra chip that the boot code of the OS install disc looks for. As we all know, this was quite successfully cracked and Hackintoshes are plentiful. Ok, so Microsoft is not Apple and they may come down on anyone releasing an XBone emulator, but that doesn't mean it can't be done. It would seem much easier to produce an emulator for a console that uses, basically, almost off-the-shelf parts.
  • PliotronX - Wednesday, November 20, 2013

    Good lord the Xbone falls short. The embedded SRAM is irrelevant, trading outright strength in 3D for faster operations tied to the subsystem is a failing strategy dating back to the PSX and Sega Saturn.
  • Teknobug - Wednesday, November 20, 2013

    Looks like PS4 wins not only in hardware specs, but graphics visuals. The only difference maker between the two seems to be game titles. I would have bought the Playstation 4 if Gran Turismo 6 was coming out for it but nope they released it for the PS3, bummer. I have Forza 2, 3, 4 for X360 and will not get Forza 5 after how Turn10 turned Forza 4 into a cash cow with DLC cars.
  • warezme - Wednesday, November 20, 2013

    Exactly, it is a huge failure on the MS side and I suspect many a game developer will eventually reveal just how limiting that decision has been. Overall, for two consoles that I would consider to be an investment of 3 to 5 years, these are pretty pathetic hardware examples. Current gen PCs are already way ahead and the gap will only continue to grow.
  • Homeles - Wednesday, November 20, 2013

    Actually, what's wrong with you? It's pretty common knowledge that ROPs are huge consumers of memory bandwidth in a GPU, and with the Xbone having half of them, memory bandwidth becomes far less of an issue.

    Get educated.
  • Spunjji - Tuesday, November 26, 2013

    Less of an issue at a given performance level. Your performance becomes gated by the ROPs instead, so it's still a bloody stupid design decision for a "next gen" console.
  • Sabresiberian - Wednesday, November 20, 2013

    Frankly, I'm disappointed in both of them. In an age where PCs are moving to 2560x1440 as a standard, 120Hz, and G-Sync, these consoles are simply already dated, even more so than the Xbox 360 and PS3 were at their release. Good on the upgrades, but I simply can't see buying one over a PC I can build for around $500. (To be fair, it would cost you closer to $700 if you buy pre-made, but I'll point out that almost everyone already has a PC. $500 for a PC and $400 for a console means spending more money, not less, for less capability; it only makes sense if you need two different pieces of hardware so one person in the family can use one while another uses something else.)

    The only thing consoles offer is an existing community. If all your friends play on an Xbox or a Playstation, it is hard to buy a PC instead. However, that isn't a plus, it is a minus, because it sets apart gamers who want to play together. It polarizes gamers who are emotionally attached to one or the other, and that is just bad for everyone. The good news is that Microsoft is talking about making it so PC players can play with Xbone players - but how is that going to affect the quality of the PC versions? Are they going to have to be capped in terms of game responsiveness and frame rates in order to level the playing field?

    Don't get me wrong; I'm not bashing console players themselves. And I get the attraction to cool hardware, I'm even tempted a bit myself, just because of the "cool hardware" despite the limitations involved. And there's the whole playing-with-others thing; having both consoles would mean I didn't have to exclude people I want to game with. But I'd feel like I was supporting a part of gaming that I really believe is bad for gamers in this day and age, so I won't be buying a console.

    (And don't give me any freakin' tired, old arguments about controllers and a "different experience". It simply is not true; you can use any console controller on a PC. There is absolutely, categorically nothing you can do on a console that you can't do on a PC, except connect with exclusive communities and play exclusive games. Exclusive communities are bad for gamers as a whole, and exclusive games are bad for gamers, too. Crappy hardware is bad for everyone.)
