CPU & GPU Hardware Analyzed

Although Microsoft did its best to minimize AMD's role in all of this, the Xbox One features a semi-custom 28nm APU designed with AMD. If this sounds familiar, it's because the strategy is very similar to what Sony employed for the PS4's silicon.

The phrase semi-custom comes from the fact that AMD is leveraging much of its already developed IP for the SoC. On the CPU front we have two Jaguar compute units, each with four independent processor cores and a shared 2MB L2 cache. The combination of the two gives the Xbox One its 8-core CPU. This is the same basic layout as the PS4's SoC.

If you're not familiar with it, Jaguar is the follow-on to AMD's Bobcat core - think of it as AMD's answer to the Intel Atom. Jaguar is a 2-issue, out-of-order (OoO) architecture with roughly 20% higher IPC than Bobcat thanks to a number of tweaks. In ARM terms we're talking about something that's faster than a Cortex A15. I expect Jaguar to be close to but likely fall behind Intel's Silvermont, at least at the highest shipping frequencies. Jaguar is the foundation of AMD's Kabini and Temash APUs, where it will ship first. I'll have a deeper architectural look at Jaguar later this week. Update: It's live!

Inside the Xbox One, courtesy Wired

There’s no word on clock speed, but Jaguar at 28nm is good for up to 2GHz depending on thermal headroom. Current rumors point to both the PS4 and Xbox One running their Jaguar cores at 1.6GHz, which sounds about right. In terms of TDP, on the CPU side you’re likely looking at 30W with all cores fully loaded.

The move away from PowerPC to 64-bit x86 cores means the One breaks backwards compatibility with all Xbox 360 titles. Microsoft won't be pursuing any sort of backwards compatibility strategy, although a game developer could port an older title to the new console if it wanted to. Interestingly enough, the first Xbox was also an x86 design - from a hardware/ISA standpoint the new Xbox One is backwards compatible with its grandfather, although Microsoft would have to enable that as a feature in software - something that's quite unlikely.

Microsoft Xbox One vs. Sony PlayStation 4 Spec Comparison

|                           | Xbox 360            | Xbox One         | PlayStation 4     |
|---------------------------|---------------------|------------------|-------------------|
| CPU Cores/Threads         | 3/6                 | 8/8              | 8/8               |
| CPU Frequency             | 3.2GHz              | 1.6GHz (est)     | 1.6GHz (est)      |
| CPU µArch                 | IBM PowerPC         | AMD Jaguar       | AMD Jaguar        |
| Shared L2 Cache           | 1MB                 | 2 x 2MB          | 2 x 2MB           |
| GPU Cores (ALUs)          | -                   | 768              | 1152              |
| Peak Shader Throughput    | 0.24 TFLOPS         | 1.23 TFLOPS      | 1.84 TFLOPS       |
| Embedded Memory           | 10MB eDRAM          | 32MB eSRAM       | -                 |
| Embedded Memory Bandwidth | 32GB/s              | 102GB/s          | -                 |
| System Memory             | 512MB 1400MHz GDDR3 | 8GB 2133MHz DDR3 | 8GB 5500MHz GDDR5 |
| System Memory Bus         | 128-bit             | 256-bit          | 256-bit           |
| System Memory Bandwidth   | 22.4 GB/s           | 68.3 GB/s        | 176.0 GB/s        |
| Manufacturing Process     | -                   | 28nm             | 28nm              |
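The system memory bandwidth figures in the table are easy to sanity check: peak bandwidth is just the effective transfer rate multiplied by the bus width. A minimal sketch in Python, using the table's numbers (which, for the unreleased consoles, are still estimates based on leaks):

```python
# Peak bandwidth (GB/s) = transfer rate (MT/s) x bus width (bytes) / 1000
def peak_bandwidth_gbs(transfer_rate_mts, bus_width_bits):
    return transfer_rate_mts * (bus_width_bits // 8) / 1000

print(peak_bandwidth_gbs(1400, 128))  # Xbox 360 GDDR3: 22.4 GB/s
print(peak_bandwidth_gbs(2133, 256))  # Xbox One DDR3-2133: ~68.3 GB/s
print(peak_bandwidth_gbs(5500, 256))  # PS4 GDDR5 (5.5 GT/s effective): 176.0 GB/s
```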

On the graphics side it's once again obvious that Microsoft and Sony are shopping at the same store, as the Xbox One's SoC integrates an AMD GCN based GPU. Here's where things start to get a bit controversial. Sony opted for an 18 Compute Unit GCN configuration, totaling 1152 shader processors/cores/ALUs. Microsoft went for a far smaller configuration: 768 ALUs (12 CUs).

Microsoft can't make up the difference in clock speed alone (AMD's GCN seems to top out around 1GHz on 28nm), and based on current leaks it looks like both MS and Sony are running their GPUs at the same 800MHz clock. The result is a 33% reduction in compute power, from 1.84 TFLOPS in the PS4 to 1.23 TFLOPS in the Xbox One. We're still talking about over 5x the peak theoretical shader performance of the Xbox 360, likely even more given increases in efficiency thanks to AMD's scalar GCN architecture (MS quotes up to 8x better GPU performance) - but there's no escaping the fact that Microsoft has given the Xbox One less GPU hardware than Sony gave the PlayStation 4. Note that unlike the Xbox 360 vs. PS3 era, Sony's hardware advantage here won't need any clever developer work to extract - the architectures are near identical, Sony just has more resources available to use.
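Those peak figures fall out of the GCN architecture directly: each compute unit packs 64 ALUs, and each ALU can retire one fused multiply-add (two floating point operations) per clock. A quick back-of-the-envelope check, assuming the rumored 800MHz GPU clock:

```python
# GCN peak throughput: CUs x 64 ALUs x 2 FLOPs (one FMA) per clock
def gcn_peak_tflops(compute_units, clock_mhz):
    alus = compute_units * 64
    return alus * 2 * clock_mhz / 1e6  # ALUs * 2 * MHz = MFLOPS; /1e6 -> TFLOPS

print(gcn_peak_tflops(12, 800))  # Xbox One: 768 ALUs -> ~1.23 TFLOPS
print(gcn_peak_tflops(18, 800))  # PS4: 1152 ALUs -> ~1.84 TFLOPS
```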

Remember all of my talk earlier about a slight pivot in strategy? Microsoft seems to believe that throwing as much power as possible at the next Xbox wasn’t the key to success and its silicon choices reflect that.

Comments

  • Rogatti - Wednesday, May 22, 2013 - link

    If Sony allows a Linux OS... PS4, GO!

    Next generation of EyeToy... E3 2013?
  • blacks329 - Thursday, May 23, 2013 - link

    They've already shown the next-gen PS Eye, and it was announced back in February that it would ship with every PS4 (just like Kinect)... Sony needs better marketers lol.
  • Majeed Belle - Sunday, September 8, 2013 - link

    All of you who are trying to use the "Sony will ship with the PS Eye" thing are missing a very big point.

    The PS Eye IS NOT required for ANY functionality. That's a big difference there, big man.

    I personally don't give a damn about the Eye or Kinect; I didn't use either of them last gen and I won't use either now. If you are going to try and make a point, you should state ALL the facts, not just the ones that you think will validate your argument.
  • rasatouche - Thursday, May 23, 2013 - link

    Memory bandwidth is going to be a huge issue, no? I mean, we've all seen the benchmarks where you take an AMD APU-based x86 PC, change the speed of the RAM, and see a 30-40% difference in FPS - and that's on a PC, with all its glorious overhead. In games the GPU is the deciding factor, and probably still will be for some time; not to mention the PS4 will likely have a lighter RAM overhead for the OS to boot.

    What this is going to mean this gen, IMO, is that Sony's first-party titles will probably look better, and third-party games - the 'ports', if you will - will be the reverse of last generation. I remember one of the main reasons I got a 360 was that it was the better console for third-party titles; they ran better, with less texture popping and fewer FPS dips/tearing than on the PS3 at the time. It will likely be the reverse this generation, seeing as games should be easier to get running optimally on the PS4, simply because it has more GPU. 50% more shaders in the GPU is nothing to sneeze at; to put it in PC performance terms, it's probably about the relative difference between a 660 Ti and a 680.
  • elitewolverine - Thursday, May 23, 2013 - link

    You might want to rethink how the internals will work.

    GDDR5 is great at bandwidth, something a video card needs because it's moving large amounts of predetermined data, so latency is not a real issue. DDR3 is used in PCs because it's cheaper, but also because it has lower latency in general.

    Offload one of the main things a GPU does, framebuffering, and GDDR5 becomes an overpriced memory module.

    Don't forget, the X1 is carrying a set of DDR3-2133; what you are talking about is people going from DDR3-1333 to 1800 or 2133. Good thing the X1 already has 2133.
  • mayankleoboy1 - Thursday, May 23, 2013 - link

    Could MS or Sony, two years down the line, overclock the CPU to something like 1.8GHz through a firmware update?
  • tipoo - Thursday, May 23, 2013 - link

    Why would they not launch at that speed then, since every single shipped PS4 would have to be validated for the higher speed first?
  • WoodyPWX - Thursday, May 23, 2013 - link

    Sony is the winner here. The architecture is the same, so developers can easily tune some numbers (higher quality shaders, more particles, higher FPS, higher resolution, etc.) to get noticeably better results on the PS4. I would prefer the PS4 as my next gaming device. Hopefully Sony will not screw up the developer tools.
  • kensa99 - Thursday, May 23, 2013 - link

    I prefer the Xbox One too! I've always liked playing Xbox 360 games and have read many articles about them in Aneesoft's Xbox 360 column. Now I will choose the Xbox as my favorite game console.
  • alex@1234 - Thursday, May 23, 2013 - link

    The GPU is the most important factor in choosing a console, and the PS4 holds the advantage here. Unless Microsoft changes the Xbox One's GPU to something similar to the PS4's, I will not opt for it. Beyond that, the integration of TV and Internet isn't necessary for most gamers. Xbox should still change the GPU, otherwise it will lose.
