Performance - An Update

The Chipworks PS4 teardown last week told us a lot about what’s happened between the Xbox One and PlayStation 4 in terms of hardware. It turns out that Microsoft’s silicon budget was actually a little more than Sony’s, at least for the main APU. The Xbox One APU is a 363mm^2 die, compared to 348mm^2 for the PS4’s APU. Both use a similar 8-core Jaguar CPU (2 x quad-core islands), but they feature different implementations of AMD’s Graphics Core Next GPUs. Microsoft elected to implement 12 compute units, two geometry engines and 16 ROPs, while Sony went for 18 CUs, two geometry engines and 32 ROPs. How did Sony manage to fit more compute and ROP partitions into a smaller die? By not including any eSRAM on-die.

While both APUs implement a 256-bit wide memory interface, Sony chose GDDR5 memory running at a 5.5GHz data rate, while Microsoft stuck to more readily available DDR3 running at less than half that speed (2133MHz data rate). To make up for the bandwidth deficit, Microsoft included 32MB of eSRAM on its APU to alleviate some of the GPU’s bandwidth needs. The eSRAM is accessible in 8MB chunks and offers a total of 204GB/s of bandwidth (102GB/s in each direction). The eSRAM is designed for GPU access only; CPU access requires a copy to main memory.
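For reference, the raw system memory bandwidth figures fall straight out of the bus widths and data rates; a quick sanity check (peak theoretical numbers only, before any overhead):

```python
def bus_bandwidth_gb_s(data_rate_mhz, bus_width_bits):
    """Peak bandwidth = transfers/sec * bytes moved per transfer."""
    return data_rate_mhz * 1e6 * (bus_width_bits / 8) / 1e9

xbox_one_ddr3 = bus_bandwidth_gb_s(2133, 256)  # ~68.3 GB/s (DDR3-2133, 256-bit)
ps4_gddr5 = bus_bandwidth_gb_s(5500, 256)      # 176.0 GB/s (5.5GHz GDDR5, 256-bit)
```

The eSRAM’s 204GB/s figure is separate from this: it’s 102GB/s in each direction over the eSRAM’s own on-die interface.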

Unlike Intel’s Crystalwell, the eSRAM isn’t a cache - instead it’s mapped to a specific address range in memory. And unlike the embedded DRAM in the Xbox 360, the eSRAM in the One can hold more than just a render target or Z-buffer. Virtually any type of GPU accessible surface/buffer can be stored in eSRAM (e.g. Z-buffer, G-buffer, stencil buffers, shadow buffer, etc.). Developers can also choose to store things like important textures in eSRAM; nothing dictates that it hold one of these buffer types, just whatever the developer finds most important. It’s also possible for a single surface to be split between main memory and eSRAM.

Sticking important buffers and other frequently used data here can definitely reduce demands on the memory interface, which should help Microsoft get by with only ~68GB/s of system memory bandwidth. Microsoft has claimed publicly that actual bandwidth to the eSRAM is somewhere in the 140 - 150GB/s range, which is likely roughly equal to the effective bandwidth (after overhead/efficiency losses) of the PS4’s GDDR5 memory interface. The difference is that on the Xbox One you only get that bandwidth to your most frequently used data. It’s still not clear to me what effective memory bandwidth looks like on the Xbox One; I suspect it’s still a bit lower than on the PS4, but after talking with Ryan Smith (AT’s Senior GPU Editor) I’m now wondering if memory bandwidth isn’t really the issue here.

Microsoft Xbox One vs. Sony PlayStation 4 Spec Comparison

                             Xbox 360              Xbox One                               PlayStation 4
CPU Cores/Threads            3/6                   8/8                                    8/8
CPU Frequency                3.2GHz                1.75GHz                                1.6GHz
CPU µArch                    IBM PowerPC           AMD Jaguar                             AMD Jaguar
Shared L2 Cache              1MB                   2 x 2MB                                2 x 2MB
GPU Cores                    -                     768                                    1152
GCN Geometry Engines         -                     2                                      2
GCN ROPs                     -                     16                                     32
GPU Frequency                -                     853MHz                                 800MHz
Peak Shader Throughput       0.24 TFLOPS           1.31 TFLOPS                            1.84 TFLOPS
Embedded Memory              10MB eDRAM            32MB eSRAM                             -
Embedded Memory Bandwidth    32GB/s                102GB/s bi-directional (204GB/s total) -
System Memory                512MB 1400MHz GDDR3   8GB 2133MHz DDR3                       8GB 5500MHz GDDR5
System Memory Bus            128-bit               256-bit                                256-bit
System Memory Bandwidth      22.4 GB/s             68.3 GB/s                              176.0 GB/s
Manufacturing Process        -                     28nm                                   28nm
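The peak shader throughput rows above follow directly from CU count and clock; a quick sketch, assuming GCN’s 64 shader lanes per CU and one FMA (2 FLOPs) per lane per clock:

```python
def peak_tflops(compute_units, clock_mhz, lanes_per_cu=64, flops_per_lane=2):
    # Each GCN compute unit has 64 shader lanes; an FMA counts as 2 FLOPs per clock.
    return compute_units * lanes_per_cu * flops_per_lane * clock_mhz * 1e6 / 1e12

xbox_one = peak_tflops(12, 853)  # ~1.31 TFLOPS
ps4 = peak_tflops(18, 800)       # ~1.84 TFLOPS
```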

In order to accommodate the eSRAM on die, Microsoft not only had to drop to a 12 CU GPU configuration, but also down to 16 ROPs (half that of the PS4). The ROPs (render outputs/raster operations pipes) are responsible for final pixel output, and at the resolutions these consoles are targeting, 16 ROPs definitely makes the Xbox One the odd man out in comparison to PC GPUs. Typically AMD’s GPUs targeting 1080p come with 32 ROPs, which is where the PS4 sits, but the Xbox One ships with half that. The difference in raw shader performance (12 CUs vs. 18 CUs) can definitely creep up in games that run more complex lighting routines and other long shader programs on each pixel, but all of the more recent reports of resolution differences between Xbox One and PS4 games at launch are likely the result of being ROP bound on the One. This is probably why Microsoft claimed it saw a bigger increase in realized performance from raising the GPU clock from 800MHz to 853MHz than from adding two extra CUs: the ROPs operate at the GPU clock, so in a ROP bound scenario a clock increase helps more than additional compute hardware.
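A back-of-the-envelope fill rate comparison makes the ROP-bound argument concrete (peak figures only; real workloads blend, overdraw, and stall, so treat these as upper bounds):

```python
def fill_rate_gpix_s(rops, clock_mhz):
    # Each ROP can retire at most one pixel per clock at peak.
    return rops * clock_mhz * 1e6 / 1e9

xb1_before = fill_rate_gpix_s(16, 800)  # 12.8 Gpixels/s at the original clock
xb1_after = fill_rate_gpix_s(16, 853)   # ~13.6 Gpixels/s after the clock bump (+6.6%)
ps4 = fill_rate_gpix_s(32, 800)         # 25.6 Gpixels/s

# For scale: one full-screen pass at 1080p60 consumes ~0.12 Gpixels/s,
# but modern renderers write each pixel many times per frame.
per_pass_1080p60 = 1920 * 1080 * 60 / 1e9
```

Note that adding two more CUs would leave these fill rates unchanged, which is consistent with Microsoft’s claim that the clock bump delivered the bigger real-world gain.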

The PS4's APU - Courtesy Chipworks

Microsoft’s admission that the Xbox One dev kits have 14 CUs does make me wonder what the Xbox One die looks like. Chipworks found that the PS4’s APU actually features 20 CUs, despite only exposing 18 to game developers. I suspect those last two are there for defect mitigation/to increase effective yields in the case of bad CUs, I wonder if the same isn’t true for the Xbox One.

At the end of the day Microsoft appears to have ended up with its GPU configuration not for silicon cost reasons, but for platform power/cost and component availability reasons. Sourcing DDR3 is much easier than sourcing high density GDDR5. Sony obviously managed to launch with a ton of GDDR5 just fine, but I can definitely understand why Microsoft would be hesitant to go down that route in the planning stages of the Xbox One. To put some numbers in perspective, Sony has shipped 1 million PS4s thus far. That's 16 million GDDR5 chips, or 7.6 petabytes of RAM. Had both Sony and Microsoft tried to do this, I do wonder if GDDR5 supply would've become a problem. That's a ton of RAM in a very short period of time. The only other major consumer of GDDR5 is video cards, and the list of cards sold in the last couple of months that would ever use that memory is narrow.
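The supply arithmetic is easy to verify (a quick check, assuming 16 x 4Gbit GDDR5 devices per 8GB console, which matches the chip count above):

```python
consoles = 1_000_000
chips_per_console = 16          # 16 x 4Gbit GDDR5 devices = 8GB per PS4
gib_per_console = 8

total_chips = consoles * chips_per_console               # 16 million chips
total_pib = consoles * gib_per_console * 2**30 / 2**50   # ~7.6 "petabytes" (PiB)
```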

Microsoft will obviously have an easier time scaling its platform down over the years (eSRAM should shrink nicely at smaller geometry processes), but that’s not a concern to the end user unless Microsoft chooses to aggressively pass along cost savings.


286 Comments


  • IKeelU - Thursday, November 21, 2013 - link

    Those "obsessions" in the PC-sphere are academic exercises to underline the differences between otherwise very similar pieces of silicon. Good GPU reviews (and good PC builders) focus on actual game performance and overall experience, incl. power and noise.

    And of course it matters that the PS4 has a better GPU. It's just that native 1080p vs upscaled 720p (+AA) isn't a world of difference when viewed from 8-10 feet away (don't take my word for it, try for yourself).

    But like Anand states in the article, things might get interesting when PS4 devs use this extra power to do more than just bump up the res. I, for one, would trade 1080p for better effects @ 60fps.
  • chelsea2889 - Thursday, November 21, 2013 - link

    Great comparison of both products! Has anyone else heard of Why Remote though? I heard it has face and hand gesture recognition and apparently integrates with different types of social media and streaming apps. It seems pretty cool, I'm looking forward to seeing them at the upcoming CES convention!
  • greywolf0 - Thursday, November 21, 2013 - link

    Wow and I thought the Xbox One was just significantly handicapped in both memory bandwidth and GPU cores. Now I learn about this magical third thing called ROP where the Xbox One literally has only half that of the PS4 and it noticeably affects perceived resolution and is even lower than the standard AMD configuration for proper 1080p output. More nails in the Microsoft coffin.

    If you want to talk exclusive games and variety, the PS4 has more than enough bald headed space marine games and yet-another-space-marine-FPS-OMG-oversaturation to satiate any Halo desires, if you even had one to begin with. What you won't find on the Xbox One, however, is all the exclusive Japanese-made games, because let's face it, the Xbox is gonna sell poorly in Japan regardless, and that means no incentive to even make a half-ass port for the Xbox. This means all the JRPG fans and quirky Japanese adventure and indie games are not coming to Xbox, just like last gen.

    And Microsoft just opened a Scroogled store selling more anti-Google paraphernalia, a continuation of their asinine and low-brow tactics and culture. They continue to be nothing but assholes day in and day out. They may have curbed their evil corporation ambitions with the backlash from their Xbox mind-control "features", but they show no sign of letting up anywhere else. I didn't think I could care much about tech companies, as they are all in it for money, but Microsoft continues to be the most morally reprehensible one around. A company not worth supporting or saving. To be shunned. It helps that all their recent products have been absolute out of touch flops, from Windows Phone to Windows RT and 8. Ditto Xbox power grab.
  • UltraTech79 - Thursday, November 21, 2013 - link

    >More nails in the Microsoft coffin.

    Drama queen. This shit just doesn't matter in consoles unless you're a fanboy of one side or another. What matters is how good the game plays when they are done and it's in your hands.
  • immanuel_aj - Friday, November 22, 2013 - link

    Have to agree with you on the Japanese exclusives. They either take forever to get ported or don't get ported at all, unless it's a big title. I never got a PS3, but the PS4 seems like a good place to start and hopefully there'll be more indie stuff from Japan as well. I'm just waiting for a limited edition console to be released before getting one! Though using a Japanese PSN account is a bit of a pain sometimes.

    However, I don't think the PS4 has that many bald headed space marines ;)
  • jonjonjonj - Thursday, November 21, 2013 - link

    I agree, there's always someone crying about power costs. If the $5 a year in power is that big of a deal then you probably shouldn't be spending $500 on an Xbox and $60 a year on Xbox Live.
  • tuxfool - Friday, November 22, 2013 - link

    Or alternatively they might care for the environment. Multiply all that "wasted" power by everyone and it adds up. This is doubly true when the apparent tasks this power is used on don't really require it.
  • maxpwr - Friday, November 22, 2013 - link

    Both "next generation" systems are incredibly weak and outdated. Not enough performance for Oculus Rift, let alone 4K displays.
  • cheshirster - Friday, November 22, 2013 - link

    Please, stop your squarephobia.
  • Origin64 - Friday, November 22, 2013 - link

    I actually feel this generation is pretty bad for innovation. The PS3 and 360 made sense at the time. They were very fast machines for the money. Sony sold PS3s at a loss for years. MS I dunno.

    I feel like time has kind of caught up with that kind of console. What's the use of building a whole new OS when these machines are x86 and fast enough to run Linux? Why focus on all kinds of closed proprietary online features when all that has been done before - and better - by volunteers building freeware? You can build a PC that's comparable performance-wise and competitive on price with these machines if you rip some parts out of an old one and replace the PSU/mobo/CPU/gfx. Everyone can find a battered old PC to screw new parts into. People throw the things away when they get a little slow.

    Then you can have the power of running what you want on the machine you paid for. Complete control. It'll save money in the long run.
