Performance - An Update

The Chipworks PS4 teardown last week told us a lot about what’s happened between the Xbox One and PlayStation 4 in terms of hardware. It turns out that Microsoft’s silicon budget was actually a little larger than Sony’s, at least for the main APU. The Xbox One APU is a 363mm^2 die, compared to 348mm^2 for the PS4’s APU. Both use a similar 8-core Jaguar CPU (2 x quad-core islands), but they feature different implementations of AMD’s Graphics Core Next GPUs. Microsoft elected to implement 12 compute units, two geometry engines and 16 ROPs, while Sony went for 18 CUs, two geometry engines and 32 ROPs. How did Sony manage to fit more compute and ROP partitions into a smaller die area? By not including any eSRAM on-die.

While both APUs implement a 256-bit wide memory interface, Sony chose to use GDDR5 memory running at a 5.5GHz data rate. Microsoft stuck with more conventionally available DDR3 memory running at less than half that speed (2133MHz data rate). To make up for the bandwidth deficit, Microsoft included 32MB of eSRAM on its APU to alleviate some of the GPU’s bandwidth needs. The eSRAM is accessible in 8MB chunks and offers a total of 204GB/s of bandwidth (102GB/s in each direction). The eSRAM is designed for GPU access only; CPU access requires a copy to main memory.

Unlike Intel’s Crystalwell, the eSRAM isn’t a cache - instead it’s mapped to a specific address range in memory. And unlike the embedded DRAM in the Xbox 360, the eSRAM in the One can hold more than just a render target or Z-buffer. Virtually any type of GPU-accessible surface/buffer can now be stored in eSRAM (e.g. z-buffer, G-buffer, stencil buffers, shadow buffer, etc…). Developers could also choose to store things like important textures in eSRAM; nothing requires it to hold one of these buffers, just whatever the developer finds important. It’s also possible for a single surface to be split between main memory and eSRAM.

Obviously sticking important buffers and other frequently used data here can definitely reduce demands on the memory interface, which should help Microsoft get by with only ~68GB/s of system memory bandwidth. Microsoft has claimed publicly that actual bandwidth to the eSRAM is somewhere in the 140 - 150GB/s range, which is likely comparable to the effective bandwidth (after overhead/efficiency losses) of the PS4’s GDDR5 memory interface. The difference is that on the Xbox One you only get that bandwidth to your most frequently used data. It’s still not clear to me what effective memory bandwidth looks like on the Xbox One; I suspect it’s still a bit lower than on the PS4, but after talking with Ryan Smith (AT’s Senior GPU Editor) I’m now wondering if memory bandwidth isn’t really the issue here.
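
The headline system memory numbers fall straight out of bus width and data rate. Here’s a quick sketch of that arithmetic (peak theoretical figures from the spec table, not effective bandwidth; the function name is just for illustration):

```python
def peak_bandwidth_gbs(bus_width_bits: int, data_rate_mhz: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width in bytes) x (data rate)."""
    return (bus_width_bits / 8) * data_rate_mhz / 1000

# Xbox One: 256-bit DDR3 at a 2133MHz data rate
xbox_one_ddr3 = peak_bandwidth_gbs(256, 2133)   # ~68.3 GB/s
# PS4: 256-bit GDDR5 at a 5.5GHz data rate
ps4_gddr5 = peak_bandwidth_gbs(256, 5500)       # 176.0 GB/s

print(f"Xbox One DDR3: {xbox_one_ddr3:.1f} GB/s")
print(f"PS4 GDDR5:     {ps4_gddr5:.1f} GB/s")
```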

Microsoft Xbox One vs. Sony PlayStation 4 Spec Comparison

                          | Xbox 360            | Xbox One                               | PlayStation 4
CPU Cores/Threads         | 3/6                 | 8/8                                    | 8/8
CPU Frequency             | 3.2GHz              | 1.75GHz                                | 1.6GHz
CPU µArch                 | IBM PowerPC         | AMD Jaguar                             | AMD Jaguar
Shared L2 Cache           | 1MB                 | 2 x 2MB                                | 2 x 2MB
GPU Cores                 | -                   | 768                                    | 1152
GCN Geometry Engines      | -                   | 2                                      | 2
GCN ROPs                  | -                   | 16                                     | 32
GPU Frequency             | -                   | 853MHz                                 | 800MHz
Peak Shader Throughput    | 0.24 TFLOPS         | 1.31 TFLOPS                            | 1.84 TFLOPS
Embedded Memory           | 10MB eDRAM          | 32MB eSRAM                             | -
Embedded Memory Bandwidth | 32GB/s              | 102GB/s bi-directional (204GB/s total) | -
System Memory             | 512MB 1400MHz GDDR3 | 8GB 2133MHz DDR3                       | 8GB 5500MHz GDDR5
System Memory Bus         | 128-bit             | 256-bit                                | 256-bit
System Memory Bandwidth   | 22.4 GB/s           | 68.3 GB/s                              | 176.0 GB/s
Manufacturing Process     | -                   | 28nm                                   | 28nm
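
The peak shader throughput row follows directly from the GCN configuration: each compute unit has 64 shader lanes, and a fused multiply-add counts as two FLOPs per lane per clock. A quick check against the table (the function name is illustrative):

```python
def peak_tflops(cus: int, clock_mhz: float) -> float:
    lanes_per_cu = 64    # GCN: 64 shader ALUs per compute unit
    flops_per_lane = 2   # a fused multiply-add = 2 FLOPs per clock
    return cus * lanes_per_cu * flops_per_lane * clock_mhz / 1e6

xbox_one = peak_tflops(12, 853)  # ~1.31 TFLOPS
ps4 = peak_tflops(18, 800)       # ~1.84 TFLOPS
print(f"{xbox_one:.2f} vs {ps4:.2f} TFLOPS")
```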

In order to accommodate the eSRAM on die, Microsoft not only had to move to a 12 CU GPU configuration, but also had to drop down to 16 ROPs (half that of the PS4). The ROPs (render outputs/raster operations pipes) are responsible for final pixel output, and at the resolutions these consoles are targeting, having 16 ROPs definitely makes the Xbox One the odd man out in comparison to PC GPUs. AMD’s GPUs targeting 1080p typically come with 32 ROPs, which is where the PS4 is, but the Xbox One ships with half that. The difference in raw shader performance (12 CUs vs 18 CUs) can definitely crop up in games that run more complex lighting routines and other long shader programs on each pixel, but all of the more recent reports of resolution differences between Xbox One and PS4 games at launch are likely the result of being ROP bound on the One. This is probably why Microsoft claimed it saw a bigger increase in realized performance from increasing the GPU clock from 800MHz to 853MHz than from adding two extra CUs. The ROPs operate at the GPU clock, so in a ROP-bound scenario a clock increase improves performance more than additional compute hardware would.
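
A rough sketch of why the One looks ROP bound: each ROP can retire one pixel per clock, so peak fill rate scales with ROP count times GPU clock (illustrative numbers from the spec table):

```python
def fill_rate_gpix(rops: int, clock_mhz: float) -> float:
    # Peak fill rate: each ROP retires one pixel per clock.
    return rops * clock_mhz / 1000

xb1_fill = fill_rate_gpix(16, 853)   # ~13.6 Gpixels/s
ps4_fill = fill_rate_gpix(32, 800)   # 25.6 Gpixels/s

# The PS4 has ~1.9x the peak fill rate, but only ~1.4x the shader
# throughput (1.84 vs 1.31 TFLOPS) - consistent with the resolution
# gap being ROP-bound rather than shader-bound.
print(round(ps4_fill / xb1_fill, 1))
```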

The PS4's APU - Courtesy Chipworks

Microsoft’s admission that the Xbox One dev kits have 14 CUs does make me wonder what the Xbox One die looks like. Chipworks found that the PS4’s APU actually features 20 CUs, despite only exposing 18 to game developers. I suspect those last two are there for defect mitigation/to increase effective yields in the case of bad CUs, I wonder if the same isn’t true for the Xbox One.

At the end of the day Microsoft appears to have ended up with its GPU configuration not for silicon cost reasons, but for platform power/cost and component availability reasons. Sourcing DDR3 is much easier than sourcing high density GDDR5. Sony obviously managed to launch with a ton of GDDR5 just fine, but I can definitely understand why Microsoft would be hesitant to go down that route in the planning stages of the Xbox One. To put some numbers in perspective, Sony has shipped 1 million PS4s thus far. That's 16 million GDDR5 chips, or 7.6 petabytes of RAM. Had both Sony and Microsoft tried to do this, I do wonder if GDDR5 supply would've become a problem. That's a ton of RAM in a very short period of time. The only other major consumer of GDDR5 is video cards, and only a narrow slice of the cards sold in the last couple of months would ever use that much RAM.
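
The RAM figure checks out as back-of-the-envelope arithmetic: each PS4 carries sixteen 4Gbit (512MB) GDDR5 chips for its 8GB, so a million consoles works out as follows (a sketch of the math, not official supply figures):

```python
consoles = 1_000_000         # PS4s shipped at the time
chips_per_console = 16       # 16 x 4Gbit (512MB) GDDR5 chips = 8GB
gb_per_console = 8

total_chips = consoles * chips_per_console       # 16 million chips
total_pib = consoles * gb_per_console / 1024**2  # GiB -> pebibytes
print(total_chips, round(total_pib, 1))          # 16000000 7.6
```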

Microsoft will obviously have an easier time scaling its platform down over the years (eSRAM should shrink nicely at smaller geometry processes), but that’s not a concern to the end user unless Microsoft chooses to aggressively pass along cost savings.


286 Comments


  • JDG1980 - Wednesday, November 20, 2013 - link

    So, based on the numbers shown here, it looks like the PS4's GPU is roughly on par with a Radeon HD 7850 (more shaders, but slightly lower clock). Meanwhile, the XB1's GPU is considerably weaker, with performance falling somewhere between a 7770 and 7790. Considering that this is a game console we're talking about (notwithstanding Microsoft's attempt to position it as a do-everything set-top box), that's going to hurt the XB1 a lot.

    I just don't see any real advantage to the *consumer* in Microsoft's design decisions here, regardless of supply chain considerations, and I think Anandtech should have been more pro-active in calling them out on this.
  • mikeisfly - Thursday, November 21, 2013 - link

    The right question to ask is can both cards do 1080p gaming. Remember these aren't PCs where people are running games at much higher resolutions than 1920x1080 on multiple monitors.
  • douglord - Thursday, November 21, 2013 - link

    Take a 7850 and 7770 and put them next to each other with the frame rate locked to 60 fps. Sit back 6 feet and play an fps. Tell me which is which. Maybe a 5% difference in visual fidelity.
  • Revdarian - Sunday, November 24, 2013 - link

    Lol no, by the way, what will you do if a game is heavy enough to run at 720p 30 on the ps4, at which resolution will you run it on the xb1?... yeap, it will be notoriously different.
  • jeffrey - Wednesday, November 20, 2013 - link

    With the PS4 offering up such a more powerful system, the argument turned to Xbox One's eSRAM and "cloud power" to equalize things. Even with Microsoft boosting clocks, the Xbox One simply does not deliver game play graphics the way the PS4 has now been demonstrated to do.

    The PS4 graphics look much better. In COD Ghosts it almost looks like the PS4 is a half-generation ahead of the Xbox One. This actually makes sense with the PS4 offering 50% more GPU cores and 100% more ROPs.

    Considering the PS4 is $100 cheaper and with the bundled Kinect being a non-starter, the decision seems easy.

    The troubling piece is that both systems are dropping features that previous gen systems had, like Blu-ray 3D.
  • bill5 - Wednesday, November 20, 2013 - link

    heh, half generation? Do you have visual problems?

    Looking at all the Anand evidence, pics and yt's, you're quibbling over a 1% visual difference, seriously. It's shocking how little difference there is in COD for example, and that's a full 720 vs 1080 split! I expect in the future Xone will close that gap to 900 vs 1080 and the like.

    I would say even the average *gamer* wont be able to tell much difference, let alone your mom.

    Hell, half the time it's hard to spot much different between "current" and "next" gen versions at a glance, let alone between the PS4/Xone versions.

    I'd say that, sad as it is, MS won that war. Their box will be perceived as "good enough". I've already seen reviews today touting Forza 5 as the best looking next gen title on any console, and the like.

    All you really need is ports. Mult plat devs are already showing all effects and textures will be the same, the only difference might be resolution (even then games like NFS Rivals and NBA 2K are 1080P on Xone).

    Then you'll get to exclusives, where PS4 could stretch its lead if it has one. However these are the vast vast minority of games (and even then I'd argue early exclusives prove nothing)

    I hate what Ms did going low power, it was stupid. But they'll probably get away with it because, Sony.
  • Philthelegend - Wednesday, November 20, 2013 - link

    You trolling?
    You are the visually impaired if you don't see the difference! Just look at the screenshots and if you have a low resolution screen zoom them in and see the difference. The difference is like playing a game on very high settings(ps4) to medium(xbone) on PC.

    "MS won that war. Their box will be perceived as "good enough"." hehehehe you're an obvious troll or a blind fanboy, no one says that the loser won a battle because he was good enough

    You say the Forza 5 is the best looking next gen title, then you go on talking about ps4 exclusives prove nothing?

    The actual graphics are not the top priority, xbone could have the same graphics as the ps4 but the most important thing is to keep the framerate at or above 60 at all times.
  • TEAMSWITCHER - Wednesday, November 20, 2013 - link

    You and I must have watched the different videos. There is a pronounced "Shimmering" effect on the Xbox One - caused by weaker anti-aliasing. It's far more distracting than a mere 1%. In every video the PS4 image looks more solid and consistent. I'm less than an average Gamer and I can see the difference immediately.

    Microsoft simply didn't "Bring It" this time and when you're in a tough competitive situation like game consoles you really can't afford not to. I really don't want to buy a "Good Enough" console. Thank you, but no thanks.
  • Hubb1e - Wednesday, November 20, 2013 - link

    I really didn't see much difference between the two. If I tried really hard I could see some more detail in the PS4 and it had a little less "shimmering" effect. In actual use on a standard 50" TV sitting the normal 8-10 feet away I doubt there will be much difference. Shit, most people don't even realize their TV is set to stretch 4:3 content and they have it set to torch mode because the "colors pop" better. It's probably going to come down to price and Kinect and in this case an extra $100 is a lot of extra money to pay. $449 would have been a better price, but we'll see since there is plenty of time for MS to lower prices for their console after first adopters have paid the premium price.
  • Kurge - Wednesday, November 20, 2013 - link

    Fail. All of that has more to do with the developers than the hardware.
