It’s that time of decade again. Time for a new Xbox. It took four years for Microsoft to go from the original Xbox to the Xbox 360. The transition from the Xbox 360 to the newly announced Xbox One will take right around eight years, and the 360 won’t be going away anytime soon either. The console business demands long upgrade cycles in order to make early investments in hardware (often sold at a loss) worthwhile. This last round was much longer than it ever should have been, so the Xbox One arrives to a very welcoming crowd.

Yesterday Microsoft finally took the covers off the new Xbox, a console it hopes will last for many years to come. At a high level, here’s what we’re dealing with:

- 8-core AMD Jaguar CPU
- 12 CU/768 SP AMD GCN GPU
- 8GB DDR3 system memory
- 500GB HDD
- Blu-ray drive
- 2.4/5.0GHz 802.11 a/b/g/n, multiple radios with WiFi Direct support
- 4K HDMI in/out (for cable TV passthrough)
- USB 3.0
- Available later this year

While Microsoft was light on technical details, I believe we have enough to put together some decent analysis. Let’s get to it.


The Xbox 360 was crafted during a time that seems so long ago. Consumer electronics styled in white were all the rage, and we were still a few years away from the aluminum revolution that engulfs us today. Looking at the Xbox One tells us a lot about how things have changed.

Microsoft isn’t so obsessed with size here, at least initially. Wired reports that the Xbox One is larger than the outgoing 360, although it’s not clear whether that comparison is against the newer slim model or the original design. Either way, given what’s under the hood, a chassis that doesn’t skimp on cooling and ventilation is a good thing.

The squared off design and glossy black chassis scream entertainment center. Microsoft isn’t playing for a position in your games cabinet, the Xbox One is just as much about consuming media as it is about playing games.

In its presentation Microsoft kept referencing how the world has changed. Smartphones, tablets, even internet connectivity are very different today than they were when the Xbox 360 launched in 2005. It’s what Microsoft didn’t mention that really seems to have played a role in its decision making behind the One: many critics didn’t see hope for another generation of high-end game consoles.

With so much of today’s focus on mobile, free-to-play and casual gaming on smartphones and tablets, would anyone even buy a next-generation console? For much of the past couple of years I’ve been going around meetings saying that before consolidation comes great expansion. I’ve been saying this about a number of markets, but I believe the phrase is very applicable to gaming. Casual gaming, the advent of free-to-play titles and even the current mobile revolution won’t do anything to dampen the demand for high-end consoles today or in the near term; they simply expand the market for gamers. Eventually those types of games and gaming platforms will grow to the point where they start competing with one another, and then the big console players might have an issue to worry about, but I suspect that’s still some time away. The depth offered by big gaming titles remains unmatched elsewhere. You can argue that many games are priced too high, but the Halo, BioShock, Mass Effect and Call of Duty experiences still drive a considerable portion of the market.

The fact that this debate is happening at all, however, has to have impacted Microsoft. Simply building a better Xbox 360 wasn’t going to guarantee success, and I suspect there was a not-insignificant contingent within the company who felt that even making the Xbox One as much of a gaming machine as it is would be a mistake. What resulted was a subtle pivot in strategy.

The Battle for the TV

Last year you couldn’t throw a stone without hitting a rumor of Apple getting into the TV business. As of yet those rumors haven’t gone anywhere other than to point to continued investment in the Apple TV. Go back even further and Google had its own TV aspirations, although those met with far less success. More recently, Intel threw its hat into the ring. I don’t know for sure how things have changed under the new CEO, but as far as I can tell he’s a rational man, and Intel Media’s plans for an IPTV service should proceed. All of this is a roundabout way of saying that TV is clearly important, and viewed by many as one of the next ecosystem battles in tech.

Combine the fact that TV is important, with the fact that the Xbox 360 has evolved into a Netflix box for many, add a dash of uncertainty for the future of high end gaming consoles and you end up with the formula behind the Xbox One. If the future doesn’t look bright for high-end gaming consoles, turning the Xbox into something much more than that will hopefully guarantee its presence in the living room. At least that’s what I suspect Microsoft’s thinking was going into the Xbox One. With that in mind, everything about the One makes a lot of sense.

CPU & GPU Hardware Analyzed

  • xaml - Thursday, May 23, 2013 - link

    If every third Xbox 360 user had to get at least one console repaired, and after that one died, bought a new one, until finally salvaged by the 'Slim'...
  • Niabureth - Wednesday, May 29, 2013 - link

    And just how do you expect them to do that? Decisions on what hardware to use were made long before Sony's PS4 presentation, meaning that train has already left the station. I'm guessing AMD is mass-producing the hardware by now. Microsoft: "Oh, we saw that Sony is going for a much more powerful architecture and we don't want any of the millions of APUs you've just produced for us!"
  • JDG1980 - Wednesday, May 22, 2013 - link

    If AMD is using Jaguar here, isn't that basically an admission that Bulldozer/Piledriver is junk, at least for gaming/desktop usage? Why don't they use a scaled-up Jaguar in their desktop APUs instead of Piledriver? The only thing Bulldozer/Piledriver seems to be good for is very heavily threaded loads - i.e. servers. Most desktop users are well served by even 4 cores, and it looks like they've already scaled Jaguar to 8. And AMD is getting absolutely killed on the IPC front on the desktop - if Jaguar is a step in the right direction then by all means it should be taken. BD/PD is a sunk cost, it should be written off, or restricted to Opterons only.
  • tipoo - Wednesday, May 22, 2013 - link

    Too big.
  • Slaimus - Wednesday, May 22, 2013 - link

    Bulldozer/Piledriver needs SOI. Steamroller is not ready yet, and it is not portable outside of GlobalFoundries' gate-first 28nm process. Jaguar is bulk 28nm and gate-last, which can be made by TSMC in large quantities at lower cost per wafer.
  • JDG1980 - Wednesday, May 22, 2013 - link

    All the more reason for AMD to switch to Jaguar in their mass-market CPUs and APUs.
    I'd be willing to bet money that a 4-core Jaguar clocked up to 3 GHz would handily beat a 4-module ("8-core") Piledriver clocked to 4 GHz. BD/PD is AMD's Netburst, a total FAIL of an architecture that needs to be dropped before it takes the whole company down with it.
  • Exophase - Wednesday, May 22, 2013 - link

    Jaguar can't be clocked at 3GHz - 2GHz is closer to the hard limit as far as we currently know. It's clock-limited by design; just look at the cycle latencies of its FPU operations. IPC is at best similar to Piledriver (in practice probably a little worse), so in tasks heavily limited by single-threaded performance Jaguar will do much worse. Consoles can tolerate limited single-threaded performance to some extent, but PCs can't.
  • Spunjji - Wednesday, May 22, 2013 - link

    It's effectively a low-power optimised Athlon 64 with added bits, so it's not going to scale any higher than Phenom did. That already ran out of steam on the desktop. Bulldozer/Piledriver may not have been the knockout blow AMD needed but they're scaling better than die-shrinking the same architecture yet again would have.
  • JDG1980 - Wednesday, May 22, 2013 - link

    Bobcat/Jaguar is a new architecture specifically designed for low-power usage. It's not the same as the K10 design, though it wouldn't surprise me if they did share some parts.
    And even just keeping K10 with tweaks and die-shrinks would have worked better on the desktop than the Faildozer series. Phenom II X6 1100T was made on an outdated 45nm process, and still beat the top 32nm Bulldozer in most benchmarks. A die-shrink to 28nm would not only be much cheaper to manufacture per chip than Bulldozer/Piledriver, but would perform better as well. It's only pride and the refusal to admit sunk costs that has kept AMD on their trail of fail.
  • kyuu - Wednesday, May 22, 2013 - link

    That's a nice bit of FUD there. K10 had pretty much been pushed as far as it was going to go. Die-shrinking and tweaking it was not going to cut it. AMD needed a new architecture.

    Piledriver already handily surpasses K10 in every metric, including single-threaded performance.
