CPU & GPU Hardware Analyzed

Although Microsoft did its best to minimize AMD’s role in all of this, the Xbox One features a semi-custom 28nm APU designed with AMD. If this sounds familiar it’s because the strategy is very similar to what Sony employed for the PS4’s silicon.

The phrase semi-custom comes from the fact that AMD is leveraging much of its already developed IP for the SoC. On the CPU front we have two Jaguar compute units, each with four independent processor cores and a shared 2MB L2 cache. The combination of the two gives the Xbox One its 8-core CPU. This is the same basic layout as the PS4's SoC.

If you’re not familiar with it, Jaguar is the follow-on to AMD’s Bobcat core - think of it as AMD’s answer to the Intel Atom. Jaguar is a 2-issue OoO architecture, but with roughly 20% higher IPC than Bobcat thanks to a number of tweaks. In ARM terms we’re talking about something that’s faster than a Cortex A15. I expect Jaguar to be close but likely fall behind Intel’s Silvermont, at least at the highest shipping frequencies. Jaguar is the foundation of AMD’s Kabini and Temash APUs, where it will ship first. I’ll have a deeper architectural look at Jaguar later this week. Update: It's live!

Inside the Xbox One, courtesy Wired

There’s no word on clock speed, but Jaguar at 28nm is good for up to 2GHz depending on thermal headroom. Current rumors point to both the PS4 and Xbox One running their Jaguar cores at 1.6GHz, which sounds about right. In terms of TDP, on the CPU side you’re likely looking at 30W with all cores fully loaded.

The move away from PowerPC to 64-bit x86 cores means the One breaks backwards compatibility with all Xbox 360 titles. Microsoft won't be pursuing any sort of backwards-compatibility strategy, although a game developer that wanted to could port an older title to the new console. Interestingly enough, the first Xbox was also an x86 design - from a hardware/ISA standpoint the new Xbox One is backwards compatible with its grandfather, although Microsoft would have to enable that as a feature in software - something that's quite unlikely.

Microsoft Xbox One vs. Sony PlayStation 4 Spec Comparison

|                           | Xbox 360            | Xbox One         | PlayStation 4     |
|---------------------------|---------------------|------------------|-------------------|
| CPU Cores/Threads         | 3/6                 | 8/8              | 8/8               |
| CPU Frequency             | 3.2GHz              | 1.6GHz (est)     | 1.6GHz (est)      |
| CPU µArch                 | IBM PowerPC         | AMD Jaguar       | AMD Jaguar        |
| Shared L2 Cache           | 1MB                 | 2 x 2MB          | 2 x 2MB           |
| GPU Cores                 |                     | 768              | 1152              |
| Peak Shader Throughput    | 0.24 TFLOPS         | 1.23 TFLOPS      | 1.84 TFLOPS       |
| Embedded Memory           | 10MB eDRAM          | 32MB eSRAM       | -                 |
| Embedded Memory Bandwidth | 32GB/s              | 102GB/s          | -                 |
| System Memory             | 512MB 1400MHz GDDR3 | 8GB 2133MHz DDR3 | 8GB 5500MHz GDDR5 |
| System Memory Bus         | 128-bit             | 256-bit          | 256-bit           |
| System Memory Bandwidth   | 22.4 GB/s           | 68.3 GB/s        | 176.0 GB/s        |
| Manufacturing Process     |                     | 28nm             | 28nm              |
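The system-memory bandwidth figures quoted above fall directly out of bus width and effective data rate. A quick sanity check (the MHz figures are effective transfer rates, as is conventional for DDR/GDDR marketing numbers):

```python
def bandwidth_gbps(bus_bits: int, effective_mtps: float) -> float:
    """Peak bandwidth in GB/s: (bus width in bytes) x (transfers per second)."""
    return bus_bits / 8 * effective_mtps / 1000

# Xbox 360: 128-bit bus, GDDR3 at 1400 MT/s effective
print(bandwidth_gbps(128, 1400))   # 22.4 GB/s
# Xbox One: 256-bit bus, DDR3 at 2133 MT/s
print(bandwidth_gbps(256, 2133))   # ~68.3 GB/s
# PS4: 256-bit bus, GDDR5 at 5500 MT/s
print(bandwidth_gbps(256, 5500))   # 176.0 GB/s
```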

On the graphics side it’s once again obvious that Microsoft and Sony are shopping at the same store as the Xbox One’s SoC integrates an AMD GCN based GPU. Here’s where things start to get a bit controversial. Sony opted for an 18 Compute Unit GCN configuration, totaling 1152 shader processors/cores/ALUs. Microsoft went for a far smaller configuration: 768 (12 CUs).

Microsoft can’t make up the difference in clock speed alone (AMD’s GCN seems to top out around 1GHz on 28nm), and based on current leaks it looks like both MS and Sony are running their GPUs at the same 800MHz clock. The result is a 33% reduction in compute power, from 1.84 TFLOPS in the PS4 to 1.23 TFLOPS in the Xbox One. We’re still talking about over 5x the peak theoretical shader performance of the Xbox 360, likely even more given increases in efficiency thanks to AMD’s scalar GCN architecture (MS quotes up to 8x better GPU performance) - but there’s no escaping the fact that Microsoft has given the Xbox One less GPU hardware than Sony gave the PlayStation 4. Note that unlike the Xbox 360 vs. PS3 era, Sony's hardware advantage here won't need any clever developer work to extract - the architectures are near identical, Sony just has more resources available to use.
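Those peak throughput numbers are simple arithmetic: ALU count x 2 FLOPs per clock (one fused multiply-add per ALU per cycle, as with GCN) x clock speed, assuming the rumored 800MHz:

```python
def peak_tflops(alus: int, clock_ghz: float) -> float:
    """Peak single-precision throughput: each ALU retires one FMA (2 FLOPs) per clock."""
    return alus * 2 * clock_ghz / 1000

print(peak_tflops(768, 0.8))    # Xbox One (12 CUs): ~1.23 TFLOPS
print(peak_tflops(1152, 0.8))   # PS4 (18 CUs): ~1.84 TFLOPS
```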

Remember all of my talk earlier about a slight pivot in strategy? Microsoft seems to believe that throwing as much power as possible at the next Xbox wasn’t the key to success and its silicon choices reflect that.

244 Comments

  • geniekid - Wednesday, May 22, 2013 - link

    As alluded to in the article, only PS4 exclusives are likely to take advantage of the additional processing power. Most developers will probably use the same textures/lighting/etc. on both platforms to lower porting costs, so you'd never see an improvement.

    I think they were correct to focus more on the Kinect 2.
  • dysonlu - Wednesday, May 22, 2013 - link

    It's not difficult at all to include different levels of textures and lighting. As we all know, PC game makers have been doing that for years. And these new consoles are nothing but PCs.
  • Flunk - Wednesday, May 22, 2013 - link

    Frankly, even if they don't program for it, it means that everything will run just a little smoother on the PS4. I'm now leaning toward the PS4 to replace my 360. If they go the same pay-for-multiplayer route they did this generation, it will cement my decision.
  • Voldenuit - Wednesday, May 22, 2013 - link

    The reality is that games are never fully optimized for any hardware configuration, so even if PS4 users never see higher res textures or higher poly models, having 50% (!!!) more GPU power means they will see smoother framerates with less dips.

    I'd take that over some Big Brother contraption in my living room (Kinect) that will be broken into by creepy hackers trying to spy on teenage girls. Or I would, if I were buying a console, which still hasn't been decided (cost/affordability rather than any ideological divides).
  • lmcd - Wednesday, May 22, 2013 - link

    Exactly.

    Of course Move wasn't better given that a camera was used for that too...
  • Ramon Zarat - Tuesday, May 28, 2013 - link

    Sony's cam is not required to be plugged in for the rest of the console to work. The XB1's is: no Kinect, no console, period.

    Sony's cam is not hooked to an always-on console. I could be offline forever if I wanted and the console would still work, and it would be impossible to hack if it's not online. If your XB1 is off the net for more than 24 hours: no console, period.

    Sony's cam can actually be turned off, and I mean completely off. The XB1's cannot: it's on even when everything else is off. Just in case you're too lazy to get up off the couch and press power on your console, your Kinect is always on to accept your voice commands.

    Always-online console + always-on cam and mic with no way to shut any of those things off = sooner or later some Chinese hackers WILL record you f*cking your wife on that couch and blackmail you for money, threatening to post that video on YouTube.

    I just seriously can't wait for this to happen! :)
  • blacks329 - Wednesday, May 22, 2013 - link

    FYI, the new PS Eye (Kinect-like camera) will be included in every PS4 box. This was confirmed in February. Whether it is required to be plugged in like the X1's remains to be seen, but I wouldn't be surprised.
  • piiman - Saturday, June 22, 2013 - link

    "I'd take that over some Big Brother contraption in my living room (Kinect) that will be broken into by creepy hackers trying to spy on teenage girls"

    Paranoid much?
    Just place something in front of the camera if you’re really that worried.
  • lmcd - Wednesday, May 22, 2013 - link

    Rather the opposite -- any engine-licensed game will take advantage of additional processing power and/or have way better framerates.
  • Ramon Zarat - Tuesday, May 28, 2013 - link

    Seriously? LOD, rendering distance, anti-aliasing level, texture filtering level, post-processing level, tessellation, shadows and every other conceivable graphics parameter can be turned up or down on the fly by ANY modern game engine. Even fluids and particles are now fully simulated and not fixed, prerendered objects, so their level of complexity can be easily adjusted up or down.

    The PS4 would be displaying graphics on high or very high and the XB1 only on medium. The same textures, the same number of polygons, etc... The foundation would remain the same, but the PS4 would display more complex scenes at higher quality. You don't need two different sets of games to produce drastically different results; it's all built into the engine already. It's only a matter of processing power, and the PS4's GPU and GDDR5 are just that: more powerful.

    The only real inherent limitation would be the number of items/accessories simultaneously on screen (trees, cars, spectators, chairs, cups, books, etc.) in a given scene, which would be fixed by the game developers, and even then they could build in some flexibility to allow the more powerful system to display more stuff.
