CPU & GPU Hardware Analyzed

Although Microsoft did its best to minimize AMD’s role in all of this, the Xbox One features a semi-custom 28nm APU designed with AMD. If this sounds familiar it’s because the strategy is very similar to what Sony employed for the PS4’s silicon.

The phrase semi-custom comes from the fact that AMD is leveraging much of its already developed IP for the SoC. On the CPU front we have two Jaguar compute units, each with four independent processor cores and a shared 2MB L2 cache. The combination of the two gives the Xbox One its 8-core CPU. This is the same basic layout as the PS4's SoC.

If you're not familiar with it, Jaguar is the follow-on to AMD's Bobcat core - think of it as AMD's answer to the Intel Atom. Like Bobcat, Jaguar is a 2-issue, out-of-order architecture, but with roughly 20% higher IPC thanks to a number of tweaks. In ARM terms we're talking about something faster than a Cortex A15. I expect Jaguar to be close to, but likely fall behind, Intel's Silvermont, at least at the highest shipping frequencies. Jaguar is the foundation of AMD's Kabini and Temash APUs, where it will ship first. I'll have a deeper architectural look at Jaguar later this week. Update: It's live!

Inside the Xbox One, courtesy Wired

There’s no word on clock speed, but Jaguar at 28nm is good for up to 2GHz depending on thermal headroom. Current rumors point to both the PS4 and Xbox One running their Jaguar cores at 1.6GHz, which sounds about right. In terms of TDP, on the CPU side you’re likely looking at 30W with all cores fully loaded.

The move away from PowerPC to 64-bit x86 cores means the One breaks backwards compatibility with all Xbox 360 titles. Microsoft won't be pursuing any sort of backwards compatibility strategy, although a game developer could port an older title to the new console if it wanted to. Interestingly enough, the first Xbox was also an x86 design - from a hardware/ISA standpoint the new Xbox One is backwards compatible with its grandfather, although Microsoft would have to enable that as a feature in software - something that's quite unlikely.

Microsoft Xbox One vs. Sony PlayStation 4 Spec Comparison

| | Xbox 360 | Xbox One | PlayStation 4 |
|---|---|---|---|
| CPU Cores/Threads | 3/6 | 8/8 | 8/8 |
| CPU Frequency | 3.2GHz | 1.6GHz (est) | 1.6GHz (est) |
| CPU µArch | IBM PowerPC | AMD Jaguar | AMD Jaguar |
| Shared L2 Cache | 1MB | 2 x 2MB | 2 x 2MB |
| GPU Cores | - | 768 | 1152 |
| Peak Shader Throughput | 0.24 TFLOPS | 1.23 TFLOPS | 1.84 TFLOPS |
| Embedded Memory | 10MB eDRAM | 32MB eSRAM | - |
| Embedded Memory Bandwidth | 32GB/s | 102GB/s | - |
| System Memory | 512MB 1400MHz GDDR3 | 8GB 2133MHz DDR3 | 8GB 5500MHz GDDR5 |
| System Memory Bus | 128-bit | 256-bit | 256-bit |
| System Memory Bandwidth | 22.4 GB/s | 68.3 GB/s | 176.0 GB/s |
| Manufacturing Process | - | 28nm | 28nm |
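The system memory bandwidth figures above fall straight out of the listed (effective) data rates and bus widths: peak bandwidth is simply transfer rate multiplied by bus width in bytes. A quick back-of-envelope sketch reproduces the table's values:

```python
# Peak theoretical memory bandwidth = effective data rate (MT/s) x bus width (bytes), in GB/s.
def bandwidth_gb_s(data_rate_mt_s: float, bus_width_bits: int) -> float:
    return data_rate_mt_s * (bus_width_bits / 8) / 1000

print(f"Xbox 360 (1400MHz GDDR3, 128-bit): {bandwidth_gb_s(1400, 128):.1f} GB/s")  # ~22.4
print(f"Xbox One (2133MHz DDR3, 256-bit):  {bandwidth_gb_s(2133, 256):.1f} GB/s")  # ~68.3
print(f"PS4 (5500MHz GDDR5, 256-bit):      {bandwidth_gb_s(5500, 256):.1f} GB/s")  # ~176.0
```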

On the graphics side it's once again obvious that Microsoft and Sony are shopping at the same store, as the Xbox One's SoC integrates an AMD GCN based GPU. Here's where things start to get a bit controversial. Sony opted for an 18 Compute Unit (CU) GCN configuration, totaling 1152 shader processors/cores/ALUs. Microsoft went with a far smaller configuration: 12 CUs, for a total of 768.

Microsoft can’t make up the difference in clock speed alone (AMD’s GCN seems to top out around 1GHz on 28nm), and based on current leaks it looks like both MS and Sony are running their GPUs at the same 800MHz clock. The result is a 33% reduction in compute power, from 1.84 TFLOPS in the PS4 to 1.23 TFLOPS in the Xbox One. We’re still talking about over 5x the peak theoretical shader performance of the Xbox 360, likely even more given increases in efficiency thanks to AMD’s scalar GCN architecture (MS quotes up to 8x better GPU performance) - but there’s no escaping the fact that Microsoft has given the Xbox One less GPU hardware than Sony gave the PlayStation 4. Note that unlike the Xbox 360 vs. PS3 era, Sony's hardware advantage here won't need any clever developer work to extract - the architectures are near identical, Sony just has more resources available to use.
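Because both SoCs share the same GCN architecture, the peak shader throughput numbers follow directly from ALU count and clock speed. A minimal sketch, assuming the rumored (not confirmed) 800MHz GPU clocks and GCN's two FLOPs per ALU per clock (a fused multiply-add):

```python
# Peak shader throughput for a GCN GPU: ALUs x 2 FLOPs per clock (FMA) x clock speed, in TFLOPS.
def peak_tflops(alus: int, clock_ghz: float) -> float:
    return alus * 2 * clock_ghz / 1000

xbox_one = peak_tflops(12 * 64, 0.8)  # 12 CUs x 64 ALUs = 768  -> ~1.23 TFLOPS
ps4 = peak_tflops(18 * 64, 0.8)       # 18 CUs x 64 ALUs = 1152 -> ~1.84 TFLOPS
print(f"Xbox One: {xbox_one:.2f} TFLOPS, PS4: {ps4:.2f} TFLOPS")
print(f"Xbox One deficit: {1 - xbox_one / ps4:.0%}")  # ~33%
```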

Remember all of my talk earlier about a slight pivot in strategy? Microsoft seems to believe that throwing as much power as possible at the next Xbox wasn’t the key to success and its silicon choices reflect that.

Comments

  • BSMonitor - Wednesday, May 22, 2013 - link

    Nexus tablet doesn't have CoD for free.
  • elitewolverine - Thursday, May 23, 2013 - link

    No one will make a $1 game with the visuals of CoD, BF2, Halo, the list goes on. They would make no money.

    Google taking HDTV gaming seriously? They make all their money on ads; you honestly think people constantly want ads in a video game? And not product placement...ads. Before you matchmake, just watch this 30sec video about Vagisil...yea right...

    Also, what is a few generations? A few is more than 2; 3 generations ago we were at the PS1, 14yrs ago.

    You're telling me that it's going to take 19yrs for a tablet to have today's Xbox One graphics? By that time what the hell will the PS5 have, or the X5....

    The biggest thing the X1 has going for it, that everyone is forgetting...cloud/Azure.

    This is huge, so huge that time will show just how little the X1 will need to compute locally in multiplayer games.
  • Majeed Belle - Sunday, September 8, 2013 - link

    I think you are putting way too much stake in the cloud, especially when we are talking about computing anything, graphics or otherwise. People can barely download music on a steady connection right now. Consoles can't even get you solid updates in a timely manner, and you are talking about offloading real work over the internet?

    ok.
  • Mathos - Wednesday, May 22, 2013 - link

    After reading a lot of articles about these two consoles and their SoCs, there are some things we can extrapolate from this info.

    Both systems are based on the same 8-core x86 amd64 CPU, which means the main logic and memory controllers in the APUs are the exact same. The comment about the PS4 being married to GDDR5 may not be true, as we all know that the GPUs can also run on DDR3, plus it may be possible that the CPU memory controller is also capable of running GDDR5 or DDR3 in either system.

    Both systems are using a 256-bit memory bus. Since these are x86 AMD CPUs, that likely points to Jaguar using a quad-channel memory controller (64+64+64+64=256), which could be good news when they hit the desktop, if they retain said quad-channel controller. It would also be nice to see that in AMD's mainstream chips as well.
  • tipoo - Wednesday, May 22, 2013 - link

    GDDR5 is on the PS4 official spec sheet.
  • Kevin G - Wednesday, May 22, 2013 - link

    Going with eSRAM is an odd choice. I would have thought capacity would have been more important than absolute latency. By merit of being on-die, using eDRAM would have lower latency than external DDR3. If they had chosen eDRAM, they could have had 128 MB on die. That is enough for three 32-bit, 4K resolution buffers. In such a case, I'd have that 128 MB of eDRAM directly accessible and not a cache. Sure, software would need to be aware of the two different memory pools for optimizations, but most of that would be handled by API calls (i.e. a DirectX function call would set up a frame buffer in the eDRAM for the programmer).

    The bandwidth figures for the eSRAM seem to be a bit on the low side too. The Xbox 360 had 256 GB/s of bandwidth between the ROPs and eDRAM. With high clock speeds and a wider bus, I would have thought the Xbox One had ~820 GB/s bandwidth there.

    I'm also puzzled by MS using DDR3 for main memory. While lower power than GDDR5, for a console plugged into a wall the bandwidth benefits would outweigh the power savings in my judgement. There is also another option: DDR4. Going for a 3.2GHz effective clock on DDR4 should be feasible as long as MS could get a manufacturer to start producing those chips this year. (DDR4 is ready for manufacture now but they're holding off volume production until a CPU with an on-die DDR4 memory controller becomes available.) With 3.2GHz DDR4, bandwidth would move to 102.4 GB/s. Still less than what the PS4 has, but not drastically so. At the end of the Xbox One's life span, I'd see DDR4 being cheaper than acquiring DDR3.

    As far as the Xbox One's AV capabilities go, I'd personally have released two consoles. One with the basic HDMI switching and another with CableCARD + tuner + DVR. And for good measure, the model with CableCARD + tuner + DVR would also have an Xbox 360 SoC to provide backwards compatibility and run the DVR software while the x86 CPUs handle gaming and the basic apps. If MS is going to go in this direction, might as well go all the way.

    Good to see 802.11n and dual band support. With HDMI in and out, I'd have included HDMI+Ethernet support there as well. Basically the Xbox One would have a nice embedded router between the Gigabit Ethernet port, the two HDMI ports and the 802.11n wireless.
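For reference, the framebuffer and DDR4 arithmetic in the comment above does check out under the commenter's own assumptions (32 bits per pixel at 3840 x 2160, and a 256-bit bus for the hypothetical DDR4 configuration):

```python
# Sanity check of the comment's numbers; the buffer format and bus width are the
# commenter's assumptions, not official specs.
bytes_per_4k_buffer = 3840 * 2160 * 4             # 32-bit pixels -> ~33.2 MB each
three_buffers_mb = 3 * bytes_per_4k_buffer / 1e6  # ~99.5 MB, fits in 128 MB of eDRAM

ddr4_bandwidth_gb_s = 3200 * (256 / 8) / 1000     # 3.2GHz effective on a 256-bit bus -> 102.4 GB/s

print(f"Three 32-bit 4K buffers: {three_buffers_mb:.1f} MB")
print(f"3.2GHz DDR4, 256-bit bus: {ddr4_bandwidth_gb_s:.1f} GB/s")
```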
  • jabber - Wednesday, May 22, 2013 - link

    Remember though that the DDR3 in the Xbox will be hardwired directly, with no legacy or other PC-related stuff getting in the way. This will be optimised DDR3, not working exactly how it's standardised in our PCs.
  • Kevin G - Wednesday, May 22, 2013 - link

    The only advantage DDR3 in the Xbox One has over a PC is that it is soldered. This allows for marginally better signaling without the edge connector of a DIMM.
  • kamil - Wednesday, May 22, 2013 - link

    That was surprisingly fair, considering a lot of what I've seen since yesterday. Sony tried hard to do what it thought would "improve the gaming experience" and ended up with a lot of social integration and considerably more aggressive hardware. Microsoft didn't really add much to actually playing games (though they do have some cloud-based stuff including game streaming) but has made a play for becoming a full living room experience, with games, live and recorded television, no hassle cable integration, and seemingly several exclusive partnerships. I'm not convinced that core gamers will see much use for those options (though most of the people I know in that group were already PC-focused if not exclusive) or the social things with the PS4, but the raw power would be a nice draw, assuming Sony doesn't accidentally pull a 360 and overheat with any noteworthy extended use.

    Of course, if the rumors of Microsoft pushing developers toward always-online DRM, including on-console mandatory check-ins every 24 hours, fees for pre-owned or shared games, forced hard drive installs, etc. all pan out, a lot of people are going to boycott on principle even if they don't buy used games and have great internet.
  • blacks329 - Thursday, May 23, 2013 - link

    I fall into that category of not buying used games and having decent internet (but capped - damn you Canadian duopoly!!), but I definitely won't be picking up the X1 if this holds (at least early on).

    Additionally, I hate paying for XBL and have no intention of doing so going forward; hopefully Sony doesn't follow this route and maintains PS+ as a value-add rather than a requirement for playing games online.
