It’s that time of decade again. Time for a new Xbox. It took four years for Microsoft to go from the original Xbox to the Xbox 360. The transition from Xbox 360 to the newly announced Xbox One will take right around eight years, and the 360 won’t be going away anytime soon either. The console business demands long upgrade cycles in order to make early investments in hardware (often sold at a loss) worthwhile. This last round was much longer than it ever should have been, so the Xbox One arrives to a very welcoming crowd.

Yesterday Microsoft finally took the covers off the new Xbox, a console it hopes will last for many years to come. At a high level, here’s what we’re dealing with:

- 8-core AMD Jaguar CPU
- 12 CU/768 SP AMD GCN GPU
- 8GB DDR3 system memory
- 500GB HDD
- Blu-ray drive
- 2.4/5.0GHz 802.11 a/b/g/n, multiple radios with WiFi Direct support
- 4K HDMI in/out (for cable TV passthrough)
- USB 3.0
- Available later this year

While Microsoft was light on technical details, I believe we have enough to put together some decent analysis. Let’s get to it.

Chassis

The Xbox 360 was crafted during a time that now seems so long ago. Consumer electronics styled in white were all the rage, and we were still a few years away from the aluminum revolution that engulfs us today. Looking at the Xbox One tells us a lot about how things have changed.

Microsoft isn’t so obsessed with size here, at least initially. Wired reports that the Xbox One is larger than the outgoing 360, although it’s not clear whether that comparison is against the new slim or the original design. Either way, given what’s under the hood, not skimping on cooling and ventilation is a good thing.

The squared off design and glossy black chassis scream entertainment center. Microsoft isn’t playing for a position in your games cabinet, the Xbox One is just as much about consuming media as it is about playing games.

In its presentation Microsoft kept referencing how the world has changed. Smartphones, tablets, even internet connectivity are very different today than they were when the Xbox 360 launched in 2005. It’s what Microsoft didn’t mention that really seems to have played a role in its decision making behind the One: many critics didn’t see hope for another generation of high-end game consoles.

With so much of today’s focus on mobile, free-to-play and casual gaming on smartphones and tablets, would anyone even buy a next-generation console? For much of the past couple of years I’ve been going around meetings saying that before consolidation comes great expansion. I’ve been saying this about a number of markets, but I believe the phrase is very applicable to gaming. Casual gaming, the advent of free-to-play and even the current mobile revolution won’t do anything to the demand for high-end consoles today or in the near term; they simply expand the market for gamers. Eventually those types of games and gaming platforms will grow to the point where they start competing with one another, and then the big console players might have an issue to worry about, but I suspect that’s still some time away. The depth offered by big gaming titles remains unmatched elsewhere. You can argue that many games are priced too high, but the Halo, BioShock, Mass Effect and CoD experience still drives a considerable portion of the market.

The fact that this debate is happening at all has to have impacted Microsoft. Simply building a better Xbox 360 wasn’t going to guarantee success, and I suspect there was a not-insignificant contingent within the company that felt even making the Xbox One as much of a gaming machine as it is would be a mistake. What resulted was a subtle pivot in strategy.

The Battle for the TV

Last year you couldn’t throw a stone without hitting a rumor of Apple getting into the TV business. So far those rumors haven’t gone anywhere other than to point to continued investment in the Apple TV. Go back even further and Google had its own TV aspirations, although it met with far less success. More recently, Intel threw its hat into the ring. I don’t know for sure how things have changed under the new CEO, but as far as I can tell he’s a rational man and things should proceed with Intel Media’s plans for an IPTV service. All of this is a roundabout way of saying that TV is clearly important, and viewed by many as one of the next ecosystem battles in tech.

Combine the fact that TV is important with the fact that the Xbox 360 has evolved into a Netflix box for many, add a dash of uncertainty about the future of high-end gaming consoles, and you end up with the formula behind the Xbox One. If the future doesn’t look bright for high-end gaming consoles, turning the Xbox into something much more than that will hopefully guarantee its presence in the living room. At least that’s what I suspect Microsoft’s thinking was going into the Xbox One. With that in mind, everything about the One makes a lot of sense.

CPU & GPU Hardware Analyzed

245 Comments


  • tipoo - Wednesday, May 22, 2013 - link

    I wonder how close the DDR3 plus small, fast eSRAM can get to the peak performance of the PS4's GDDR5. The GDDR5 will be better in general for the GPU, no doubt, but how much will be offset by the eSRAM? And how much will GDDR5's high latency hurt the CPU in the PS4?
  • Braincruser - Wednesday, May 22, 2013 - link

    The CPU is running at a low frequency, ~1.6 GHz, which is half the frequency of most mainstream processors. And GDDR5's latency shouldn't be more than double the DDR3 latency. So in effect the latency stays the same, relatively speaking.
  • MrMilli - Wednesday, May 22, 2013 - link

    GDDR5 actually has around 8-10x worse latency compared to DDR3. So the CPU in the PS4 is going to be hurt. Everybody's talking about bandwidth, but the Xbox One is going to have such a huge latency advantage that maybe in the end it's going to be better off.
  • mczak - Wednesday, May 22, 2013 - link

    GDDR5 having much worse latency is a myth. The underlying memory technology is all the same after all; just the interface is different. Yes, the memory controllers of GPUs are more optimized for bandwidth than latency, but that's not inherent to GDDR5. The latency may be very slightly higher, but it probably won't be significant enough to be noticeable (no way a factor of even 2, let alone the 8 you're claiming).
    I don't know anything about the specific memory controller implementations of the PS4 or Xbox One (well, other than one using DDR3 and the other GDDR5...), but I'd have to guess latency will be similar.
  • shtldr - Thursday, May 23, 2013 - link

    Are you talking latency in cycles (i.e. relative to memory's clock rate) or latency in seconds (absolute)? Latency in cycles is going to be worse, latency in seconds is going to be similar. If I understand it correctly, the absolute (objective) latency expressed in seconds is the deciding factor.
  • MrMilli - Thursday, May 23, 2013 - link

    I got my info from Beyond3D but I went to dig into whitepapers from Micron and Hynix and it seems that my info was wrong.
    Micron's DDR3-2133 has a CL14 read latency specification, but it's possibly set as low as CL11 on the Xbox. Hynix's GDDR5 (I don't know which brand of GDDR5 the PS4 will use, but they'll all be more or less the same) is specified at CL18 up to CL20 for GDDR5-5500.
    So even though this doesn't give actual latency information, since that depends a lot on the memory controller, it probably won't be worse than 2x.
  • tipoo - Monday, May 27, 2013 - link

    Nowhere near as bad as I thought GDDR5 would be given what everyone is saying about it to defend DDR3, and given that it runs at such a high clock rate the CL effect will be reduced even more (that's measured in clock cycles, right?).
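The cycles-versus-seconds point made in this thread can be checked with quick arithmetic. A minimal sketch, assuming CL counts command-clock cycles (command clock taken as data rate ÷ 2 for DDR3 and ÷ 4 for GDDR5) and using the CL figures quoted above:

```python
# Convert CAS latency from clock cycles to nanoseconds.
# Assumption: CL is referenced to the command clock, which is
# data_rate/2 for DDR3 and data_rate/4 for GDDR5.
def cas_ns(cl_cycles, data_rate_mtps, pump_factor):
    clock_mhz = data_rate_mtps / pump_factor
    return cl_cycles / clock_mhz * 1000.0  # MHz -> ns conversion

print(cas_ns(14, 2133, 2))  # DDR3-2133 CL14  -> ~13.1 ns
print(cas_ns(18, 5500, 4))  # GDDR5-5500 CL18 -> ~13.1 ns
print(cas_ns(20, 5500, 4))  # GDDR5-5500 CL20 -> ~14.5 ns
```

On these assumptions the absolute access times land within roughly 10% of each other despite the larger GDDR5 cycle counts, which supports the "similar latency in seconds" argument.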
  • Riseer - Sunday, June 23, 2013 - link

    For game performance, GDDR5 has no equal at the moment. There is a reason why it's used in GPUs. MS is building a media center, while Sony is building a gaming console. Sony won't need to worry so much about latency for a console that puts games first and everything else second. Overall the PS4 will play games better than the Xbone. Also, eSRAM isn't a good thing; the only reason Sony didn't use it is because it would complicate things more than they should be. This is why Sony went with GDDR5: it's a much simpler design that will streamline everything. This time around it will be MS with the more complicated console.
  • Riseer - Sunday, June 23, 2013 - link

    Also, let's not forget you only have 32MB worth of eSRAM. At 1080p devs will push for more demanding effects. On PS4 they have 8 gigs of RAM with around 70GB/s more bandwidth. Since DDR3 isn't good for graphics, that only leaves 32MB of true VRAM. That said, the Xbone can use the DDR3 RAM for graphics, the issue being DDR3 has low bandwidth. MS had no choice but to use eSRAM to claw back some performance.
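For context on the bandwidth figures being thrown around in these comments, the peak-bandwidth arithmetic is simple. A sketch only; it assumes the widely reported 256-bit memory buses and DDR3-2133 / GDDR5-5500 data rates:

```python
# Peak theoretical bandwidth = data rate (MT/s) x bus width (bytes per transfer).
def peak_gbs(data_rate_mtps, bus_width_bits):
    return data_rate_mtps * (bus_width_bits / 8) / 1000.0  # GB/s

print(peak_gbs(2133, 256))  # Xbox One DDR3-2133 -> ~68.3 GB/s
print(peak_gbs(5500, 256))  # PS4 GDDR5-5500     -> 176.0 GB/s
```

On those assumptions the main-memory gap is closer to 108 GB/s than 70 GB/s, before counting whatever the Xbox One's 32MB of eSRAM contributes on its side.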
  • CyanLite - Sunday, May 26, 2013 - link

    I've been a long-term Xbox fan, but the silly Kinect requirement scares me. It's only a matter of time before somebody hacks that. And I'm a casual sit-down kind of gamer. Who wants to stand up and wave arm motions playing Call of Duty? Or shout multiple voice commands that are never recognized the first time around?

    If the PS4 eliminates the camera requirement, gets rid of the phone-home Internet connections, and lets me buy used games, then I'm willing to reconsider my console loyalty.
