Performance - An Update

The Chipworks PS4 teardown last week told us a lot about how the Xbox One and PlayStation 4 differ in terms of hardware. It turns out that Microsoft’s silicon budget was actually a little larger than Sony’s, at least for the main APU. The Xbox One APU is a 363mm^2 die, compared to 348mm^2 for the PS4’s APU. Both use a similar 8-core Jaguar CPU (2 x quad-core islands), but they feature different implementations of AMD’s Graphics Core Next GPU. Microsoft elected to implement 12 compute units, two geometry engines and 16 ROPs, while Sony went for 18 CUs, two geometry engines and 32 ROPs. How did Sony manage to fit more compute and ROP hardware into a smaller die? By not including any eSRAM on-die.

While both APUs implement a 256-bit wide memory interface, Sony chose GDDR5 memory running at a 5.5GHz data rate, whereas Microsoft stuck with more conventionally available DDR3 running at less than half that speed (2133MHz data rate). To make up for the bandwidth deficit, Microsoft included 32MB of eSRAM on its APU to alleviate some of the GPU’s bandwidth needs. The eSRAM is arranged in 8MB chunks and offers a total of 204GB/s of bandwidth (102GB/s in each direction). The eSRAM is designed for GPU access only; CPU access requires a copy to main memory.
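
As a quick sanity check on those figures, peak bandwidth is just bus width times data rate. A minimal Python sketch using the numbers quoted above:

```python
# Peak memory bandwidth = bus width (bytes) x data rate (transfers/sec).
def peak_bandwidth_gbs(bus_width_bits, data_rate_mtps):
    """Bus width in bits, data rate in MT/s -> GB/s."""
    return (bus_width_bits / 8) * data_rate_mtps * 1e6 / 1e9

xbox_one_ddr3 = peak_bandwidth_gbs(256, 2133)   # ~68.3 GB/s
ps4_gddr5     = peak_bandwidth_gbs(256, 5500)   # ~176.0 GB/s
esram_total   = 102 * 2                         # Microsoft's 102GB/s each way -> 204GB/s combined

print(f"Xbox One DDR3: {xbox_one_ddr3:.1f} GB/s, PS4 GDDR5: {ps4_gddr5:.1f} GB/s, eSRAM: {esram_total} GB/s")
```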

Unlike Intel’s Crystalwell, the eSRAM isn’t a cache - instead it’s mapped to a specific address range in memory. And unlike the embedded DRAM in the Xbox 360, the eSRAM in the One can hold more than just a render target or Z-buffer. Virtually any type of GPU accessible surface/buffer can now be stored in eSRAM (e.g. z-buffer, G-buffer, stencil buffers, shadow buffer, etc…). Developers can also choose to store other data, like important textures, in eSRAM; nothing requires that it hold one of these buffer types, just whatever the developer finds important. It’s also possible for a single surface to be split between main memory and eSRAM.
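
Because the eSRAM is address-mapped rather than a transparent cache, deciding what lives there is the developer's job. The sketch below is purely illustrative (the surface list, sizes, and greedy heuristic are my own assumptions, not anything from the Xbox One SDK); it just shows the kind of budgeting decision involved in filling a 32MB scratchpad with the most bandwidth-hungry surfaces:

```python
# Illustrative only: a toy placement pass deciding which GPU surfaces go into a
# 32MB software-managed scratchpad (eSRAM-like) vs. main memory.
ESRAM_BYTES = 32 * 1024 * 1024

# (name, size in bytes, rough bandwidth demand) -- hypothetical 1080p surfaces
surfaces = [
    ("color_target",    1920 * 1080 * 4, 100),
    ("depth_buffer",    1920 * 1080 * 4,  90),
    ("gbuffer_normals", 1920 * 1080 * 8,  80),
    ("shadow_map",      2048 * 2048 * 4,  60),
    ("hud_texture",     1024 * 1024 * 4,   5),
]

def place_surfaces(surfaces, budget=ESRAM_BYTES):
    """Greedy: the most bandwidth-hungry surfaces go to eSRAM until the budget
    runs out. A surface that doesn't fit could also be split across eSRAM and DRAM."""
    placement, remaining = {}, budget
    for name, size, _demand in sorted(surfaces, key=lambda s: -s[2]):
        if size <= remaining:
            placement[name] = "eSRAM"
            remaining -= size
        else:
            placement[name] = "DRAM (or split)"
    return placement

print(place_surfaces(surfaces))
```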

Obviously sticking important buffers and other frequently used data here can definitely reduce demands on the memory interface, which should help Microsoft get by with only ~68GB/s of system memory bandwidth. Microsoft has claimed publicly that actual bandwidth to the eSRAM is somewhere in the 140 - 150GB/s range, which is likely comparable to the effective bandwidth (after overhead/efficiency losses) of the PS4’s GDDR5 memory interface. The difference is that on the Xbox One you only get that bandwidth to your most frequently used data. It’s still not clear to me what effective memory bandwidth looks like on the Xbox One; I suspect it’s still a bit lower than on the PS4, but after talking with Ryan Smith (AT’s Senior GPU Editor) I’m now wondering if memory bandwidth isn’t really the issue here.

Microsoft Xbox One vs. Sony PlayStation 4 Spec Comparison

| | Xbox 360 | Xbox One | PlayStation 4 |
|---|---|---|---|
| CPU Cores/Threads | 3/6 | 8/8 | 8/8 |
| CPU Frequency | 3.2GHz | 1.75GHz | 1.6GHz |
| CPU µArch | IBM PowerPC | AMD Jaguar | AMD Jaguar |
| Shared L2 Cache | 1MB | 2 x 2MB | 2 x 2MB |
| GPU Cores | - | 768 | 1152 |
| GCN Geometry Engines | - | 2 | 2 |
| GCN ROPs | - | 16 | 32 |
| GPU Frequency | - | 853MHz | 800MHz |
| Peak Shader Throughput | 0.24 TFLOPS | 1.31 TFLOPS | 1.84 TFLOPS |
| Embedded Memory | 10MB eDRAM | 32MB eSRAM | - |
| Embedded Memory Bandwidth | 32GB/s | 102GB/s bi-directional (204GB/s total) | - |
| System Memory | 512MB 1400MHz GDDR3 | 8GB 2133MHz DDR3 | 8GB 5500MHz GDDR5 |
| System Memory Bus | 128-bit | 256-bit | 256-bit |
| System Memory Bandwidth | 22.4 GB/s | 68.3 GB/s | 176.0 GB/s |
| Manufacturing Process | - | 28nm | 28nm |
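
The peak shader throughput numbers in the table fall straight out of the CU counts and clocks: each GCN CU has 64 shader ALUs, and each ALU can retire two FLOPs per clock via a fused multiply-add. A quick sketch:

```python
# Peak single-precision throughput for a GCN GPU: CUs x 64 ALUs x 2 FLOPs x clock.
def peak_tflops(cus, clock_mhz, alus_per_cu=64, flops_per_alu_clk=2):
    return cus * alus_per_cu * flops_per_alu_clk * clock_mhz * 1e6 / 1e12

print(f"Xbox One: {peak_tflops(12, 853):.2f} TFLOPS")   # ~1.31 TFLOPS
print(f"PS4:      {peak_tflops(18, 800):.2f} TFLOPS")   # ~1.84 TFLOPS
```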

In order to accommodate the eSRAM on die, Microsoft not only had to move to a 12 CU GPU configuration, it also had to drop to 16 ROPs (half that of the PS4). The ROPs (render outputs/raster operations pipes) are responsible for final pixel output, and at the resolutions these consoles are targeting, 16 ROPs makes the Xbox One the odd man out compared to PC GPUs. AMD’s GPUs targeting 1080p typically come with 32 ROPs, which is where the PS4 sits, but the Xbox One ships with half that. The difference in raw shader performance (12 CUs vs 18 CUs) can definitely show up in games that run more complex lighting routines and other long shader programs on each pixel, but the more recent reports of resolution differences between Xbox One and PS4 launch titles are likely the result of being ROP bound on the One. This is probably why Microsoft claimed it saw a bigger increase in realized performance from raising the GPU clock from 800MHz to 853MHz than from adding two extra CUs. The ROPs operate at the GPU clock, so in a ROP bound scenario increasing the clock helps more than adding compute hardware.
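
To put that argument in numbers: peak fill rate is ROP count times clock (assuming the usual one pixel per ROP per clock), and the two upgrade options Microsoft weighed scale different parts of the pipeline. A rough sketch:

```python
# Peak pixel fill rate, assuming one pixel per ROP per clock.
def fill_rate_gpix(rops, clock_mhz):
    return rops * clock_mhz * 1e6 / 1e9

xbox_one = fill_rate_gpix(16, 853)   # ~13.6 Gpixels/s
ps4      = fill_rate_gpix(32, 800)   # ~25.6 Gpixels/s

# Microsoft's two options: raise the clock (helps ROPs *and* shaders) or add CUs (shaders only).
clock_gain   = 853 / 800 - 1         # +6.6% across the whole GPU, ROPs included
compute_gain = 14 / 12 - 1           # +16.7%, but only to shader throughput

print(f"Fill rate: {xbox_one:.1f} vs {ps4:.1f} Gpixels/s")
print(f"Clock bump: +{clock_gain:.1%}, extra CUs: +{compute_gain:.1%}")
```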

The PS4's APU - Courtesy Chipworks

Microsoft’s admission that the Xbox One dev kits have 14 CUs does make me wonder what the Xbox One die looks like. Chipworks found that the PS4’s APU actually features 20 CUs, despite only exposing 18 to game developers. I suspect those last two are there for defect mitigation, to increase effective yields in the case of bad CUs; I wonder if the same isn’t true for the Xbox One.
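
To see why two spare CUs are worth the area, consider a toy yield model (the 5% per-CU defect probability below is entirely made up for illustration): with 20 CUs on die and only 18 needing to work, a die survives up to two bad CUs.

```python
# Illustrative yield math: probability a die has at least `needed` working CUs
# out of `total`, assuming independent per-CU defects. Numbers are hypothetical.
from math import comb

def usable_fraction(total, needed, p_bad):
    p_ok = 1 - p_bad
    return sum(comb(total, k) * p_ok**k * p_bad**(total - k)
               for k in range(needed, total + 1))

p_bad = 0.05  # made-up 5% chance that any individual CU is defective
print(f"18 CUs, no spares:  {usable_fraction(18, 18, p_bad):.1%}")  # ~39.7% of dies usable
print(f"20 CUs, 18 enabled: {usable_fraction(20, 18, p_bad):.1%}")  # ~92.5% of dies usable
```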

At the end of the day Microsoft appears to have ended up with its GPU configuration not for silicon cost reasons, but for platform power/cost and component availability reasons. Sourcing DDR3 is much easier than sourcing high density GDDR5. Sony obviously managed to launch with a ton of GDDR5 just fine, but I can definitely understand why Microsoft would be hesitant to go down that route in the planning stages of the Xbox One. To put some numbers in perspective, Sony has shipped 1 million PS4s thus far. That’s 16 million GDDR5 chips, or 7.6 petabytes of RAM. Had both Sony and Microsoft tried to do this, I do wonder if GDDR5 supply would’ve become a problem. That’s a ton of RAM in a very short period of time. The only other major consumer of GDDR5 is video cards, and the list of cards sold in the last couple of months that would ever use that much RAM is a narrow one.
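
The arithmetic behind that, spelled out (assuming 16 x 4Gb GDDR5 chips per console, which is how the 8GB is reached):

```python
# Back-of-the-envelope GDDR5 supply math from the paragraph above.
consoles      = 1_000_000
chips_per_ps4 = 16            # 16 x 4Gb (512MB) chips = 8GB per console
gib_per_chip  = 0.5

total_chips = consoles * chips_per_ps4       # 16,000,000 chips
total_gib   = total_chips * gib_per_chip     # 8,000,000 GiB
total_pib   = total_gib / 1024**2            # ~7.6 PiB (the "7.6 petabytes" above)

print(f"{total_chips:,} chips -> {total_pib:.1f} PiB of GDDR5")
```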

Microsoft will obviously have an easier time scaling its platform down over the years (eSRAM should shrink nicely at smaller geometry processes), but that’s not a concern to the end user unless Microsoft chooses to aggressively pass along cost savings.

Comments

  • airmantharp - Wednesday, November 20, 2013 - link

    Having actual CPU resources, a GPU architecture shared with desktops (and many mobile SoCs), and tons of RAM are all big differences over the last generation's introduction.

    The Xbox expands on that by adding in co-processors that allow for lots of difficult stuff to happen in real-time without affecting overall performance.
  • mikeisfly - Thursday, November 21, 2013 - link

    Thank god people didn't think like this when computers first started with switches and paper tape. Remember, we have to start somewhere to move the technology forward. I want the Jarvis computer in Iron Man! You don't get there by making a console that can play games. You get there by making a console that can play games and has voice recognition and gestures and ......
    People get used to interacting with new input sources and then you find yourself in a situation where you say, how did I ever live without this? You guys sound like I did in the 80s when Microsoft was coming out with this stupid GUI crap. "You will have to rip the command line from my cold dead fingers!" Where would we be today if everyone thought like me? Where would the Internet be if it was just command line? I for one applaud Microsoft for trying to expand the gaming market not just for hard core gamers but for people like my girl too. I know the PS4 might have more power in terms of compute performance but that is not what games are about; it's about story line, immersiveness (made-up word), and to some extent graphics. Truth is there is really no difference between 1080 and 720 on a big screen - remember, people, this is not a PC monitor. And the X1 can do 1080p. I'm looking forward to what both systems can offer in this next generation but I'm more interested in the X1 due to its forward-thinking aspects. Only time will tell though.
  • douglord - Thursday, November 21, 2013 - link

    Rule of thumb is you need a 10x increase in power to get a 100% increase in visual fidelity. Look at 360 vs One. 6x the power and maybe games look 50% better. So we are talking about the PS4 looking 5% better than Xbox One. In this gen, it really is about who has the exclusives you want.

    And if you are looking out 5+ years you have to take into account Xbox's cloud initiative. Have you used OnLive? I can play Borderlands 2 on an Intel Atom. If MS puts the $ behind it, those 8 pitiful CPU cores could be used just to power the OS and a cloud terminal. Only way these consoles can keep up with midrange PCs.
  • Revdarian - Sunday, November 24, 2013 - link

    Interesting that you use numbers referring to visual fidelity, when it is a non-quantifiable, perceptual quality.

    Also there is no such rule of thumb regarding it, but what is known is that in certain games like CoD: Ghosts, due to certain choices, the xb1 is able to pump less than half the pixels that the ps4 can.

    If you believe in the Cloud for that kind of gaming, Sony has bought Gaikai, and that project started sooner than the MS counterpart; heck, the MS counterpart hasn't even been named.
  • RubyX - Wednesday, November 20, 2013 - link

    How do the noise levels of the consoles compare?
    According to other reviews they both seem to be fairly quiet, which is great, but is there a noticeable difference between them?
  • szimm - Wednesday, November 20, 2013 - link

    I'm wondering the same - I've seen lots of people point out that the Xbox One is designed to be bigger but cooler and quieter. However, I haven't seen any confirmation that it is in fact quieter than the PS4.
  • bill5 - Wednesday, November 20, 2013 - link

    15W standby seems a bit high.

    Let's say you leave it on standby 24/7, as you would; that's 360 watt-hours a day, or almost 11 kWh a month. I pay ~10 cents per kWh in general, so about $1.10/month.

    Could add up to $60+ over 5 years. More if the EPA enforces more regulations raising the cost of electricity, as they typically are doing.
  • ydeer - Thursday, November 21, 2013 - link

    Yes, the standby power of the XBone and PS4 bothers me too. I often leave my TV and Consoles untouched for weeks, so the only sensible thing is to put them on a Master/Slave powerstrip which cuts them off the grid when the TV isn’t on.

    Of course that defeats the entire point of standby background downloads, but in the case of Sony, I have to wonder why they put a whole proprietary ARM SoC* (with 2GB of DDR3 RAM) on the board for "low power standby and background downloads" and then end up with unbelievable 70W figures.

    This is essentially a mobile phone without a display; I don’t think it should use more than 3 watts idle with the HD spun down.

    My only explanation is that they couldn’t get the ARM software/OS side of things wrapped up in time for the launch, so for now they use the x86 CPU for background downloads even though it was never intended to do that.

    * http://www.ifixit.com/Teardown/PlayStation+4+Teard...
  • ydeer - Thursday, November 21, 2013 - link

    Correction, the SoC only has access to 2Gb (= 256 MB) of DDR3 RAM.

    However, I found a document that seems to confirm that the ARM Subsystem did not work as planned and Sony currently uses the APU for all standby/background tasks.

    Maybe somebody who is fluent in Japanese could give us a short abstract of the part that talks about the subsystem.

    http://translate.google.com/translate?u=http%3A//p...
  • tipoo - Wednesday, November 20, 2013 - link

    Hey Anand, did you see the Wii U GPU die shots? How many shaders do you think are in there? I think it's almost certainly 160 at this point, but there are a few holdouts saying 320 which seems impossible with the shader config/size. They are basing that off the clusters being a bit bigger than normal shader cores, but that could be down to process optimization.
