It’s that time of decade again. Time for a new Xbox. It took four years for Microsoft to go from the original Xbox to the Xbox 360. The transition from the Xbox 360 to the newly announced Xbox One will take right around eight years, and the 360 won’t be going away anytime soon either. The console business demands long upgrade cycles in order to make early investments in hardware (often sold at a loss) worthwhile. This last round ran much longer than it ever should have, so the Xbox One arrives to a very welcoming crowd.

Yesterday Microsoft finally took the covers off the new Xbox, a platform it hopes will last for many years to come. At a high level, here’s what we’re dealing with:

- 8-core AMD Jaguar CPU
- 12 CU/768 SP AMD GCN GPU
- 8GB DDR3 system memory
- 500GB HDD
- Blu-ray drive
- 2.4/5GHz 802.11a/b/g/n, multiple radios with Wi-Fi Direct support
- 4K HDMI in/out (for cable TV passthrough)
- USB 3.0
- Available later this year

While Microsoft was light on technical details, I believe we have enough to put together some decent analysis. Let’s get to it.
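
To put the raw numbers above in perspective, here’s a quick back-of-envelope sketch. Keep in mind that Microsoft hasn’t confirmed the GPU clock or the DDR3 configuration, so the ~800MHz clock and 256-bit DDR3-2133 interface used below are assumptions based on pre-announcement rumors, not announced specs.

```python
# Back-of-envelope math on the announced Xbox One specs. The GPU clock and
# DDR3 interface are NOT confirmed by Microsoft; both values below are
# rumored figures used purely for illustration.

CU_COUNT         = 12      # announced GCN compute units
SP_PER_CU        = 64      # stream processors per GCN CU
FLOPS_PER_SP_CLK = 2       # one fused multiply-add = 2 FLOPs per clock
GPU_CLOCK_GHZ    = 0.8     # assumption (rumored, not confirmed)

shader_count = CU_COUNT * SP_PER_CU                    # 768, matches the spec list
peak_tflops = shader_count * FLOPS_PER_SP_CLK * GPU_CLOCK_GHZ / 1000
print(f"{shader_count} SPs -> ~{peak_tflops:.2f} TFLOPS peak")    # ~1.23 TFLOPS

# Main memory bandwidth = transfer rate (MT/s) x bus width (bytes)
DDR3_MTS  = 2133           # assumption (rumored DDR3-2133)
BUS_BYTES = 256 // 8       # assumption (rumored 256-bit bus)
ddr3_gbps = DDR3_MTS * BUS_BYTES / 1000
print(f"DDR3 system memory -> ~{ddr3_gbps:.1f} GB/s")             # ~68.3 GB/s
```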

Chassis

The Xbox 360 was crafted during a time that now seems so long ago. Consumer electronics styled in white were all the rage, and we were still a few years away from the aluminum revolution that engulfs us today. Looking at the Xbox One tells us a lot about how things have changed.

Microsoft isn’t so obsessed with size here, at least initially. Wired reports that the Xbox One is larger than the outgoing 360, although it’s not clear whether we’re talking about the new slim or the original design. Either way, given what’s under the hood, skimping on cooling and ventilation wouldn’t have been a good idea.

The squared off design and glossy black chassis scream entertainment center. Microsoft isn’t playing for a position in your games cabinet; the Xbox One is just as much about consuming media as it is about playing games.

In its presentation Microsoft kept referencing how the world has changed. Smartphones, tablets and even internet connectivity are very different today than they were when the Xbox 360 launched in 2005. But it’s what Microsoft didn’t mention that really seems to have played a role in the decision making behind the One: many critics didn’t see hope for another generation of high-end game consoles.

With so much of today’s focus on mobile, free-to-play and casual gaming on smartphones and tablets, would anyone even buy a next-generation console? For much of the past couple of years I’ve been going around meetings saying that before consolidation comes great expansion. I’ve been saying this about a number of markets, but I believe the phrase is very applicable to gaming. Casual gaming, the advent of free-to-play and even the current mobile revolution won’t do anything to dampen demand for high-end consoles today or in the near term; they simply expand the market of gamers. Eventually those types of games and gaming platforms will grow to the point where they start competing with one another, and then the big console players might have an issue to worry about, but I suspect that’s still some time away. The depth offered by big gaming titles remains unmatched elsewhere. You can argue that many games are priced too high, but the Halo, BioShock, Mass Effect and CoD experience still drives a considerable portion of the market.

The fact that this debate is happening at all, however, has to have impacted Microsoft. Simply building a better Xbox 360 wasn’t going to guarantee success, and I suspect a not-insignificant number of people within the company felt that even making the Xbox One as much of a gaming machine as it is was a mistake. What resulted was a subtle pivot in strategy.

The Battle for the TV

Last year you couldn’t throw a stone without hitting a rumor of Apple getting into the TV business. So far those rumors haven’t gone anywhere other than to point to continued investment in the Apple TV. Go back even further and Google had its own TV aspirations, although they met with far less success. More recently, Intel threw its hat into the ring. I don’t know for sure how things have changed under the new CEO, but as far as I can tell he’s a rational man, and Intel Media’s plans for an IPTV service should proceed. All of this is a roundabout way of saying that TV is clearly important, and is viewed by many as one of the next ecosystem battles in tech.

Combine the fact that TV is important with the fact that the Xbox 360 has evolved into a Netflix box for many, add a dash of uncertainty about the future of high-end gaming consoles, and you end up with the formula behind the Xbox One. If the future doesn’t look bright for high-end gaming consoles, turning the Xbox into something much more than a game console will hopefully guarantee its presence in the living room. At least that’s what I suspect Microsoft’s thinking was going into the Xbox One. With that in mind, everything about the One makes a lot of sense.

Comments

  • elitewolverine - Thursday, May 23, 2013 - link

    It's the same GPU at heart; sure, the shader count is lower, because of the eSRAM. You might want to rethink how the internals work. The advantage will be very minimal.
  • alex@1234 - Friday, May 24, 2013 - link

    Everywhere it's mentioned that the PS4 has around 32% higher GPU power; I don't think a GTX 660 Ti and a GTX 680 are equal. For sure the PS4 holds the advantage: fewer shaders and lower specs across the board compared to the PS4, and DDR3 in the Xbox One versus GDDR5 in the PS4. As for the eSRAM, I'll tell you something: you can have an SSD and 32GB of RAM, and it still can't make up for a weaker GPU.
  • cjb110 - Thursday, May 23, 2013 - link

    In some ways this is the opposite of the previous generation. The 360 screamed games (at least with its original dashboard), whereas the PS3 had all the potential media support (though the XMB interface let it down) as well as being an excellent Blu-ray player (which is the whole reason I got mine).

    This time around MS has gone all-out entertainment that can also do games, whereas Sony seems to have gone games first. I'm imagining the PS4 will physically be flashier too, like the PS3 and 360 were... game devices, not family entertainment boxes.

    Personally I'm keeping the 360 for my games library, and the One will likely replace the PS3.
  • Tuvok86 - Thursday, May 23, 2013 - link

    Xbox One ~ Radeon HD 7770 GHz Edition
    PS4 ~ Radeon HD 7850
  • jnemesh - Thursday, May 23, 2013 - link

    One of my biggest concerns with the new system is the Kinect requirement. I have my Xbox and other electronics in a rack in the closet. I would need to extend USB 3.0 (and I am assuming that this time around the Kinect uses a standard USB connector on all models) over 40 feet to get the wire from my closet to the spot beneath or above my wall-mounted TV. With the existing Kinect for the 360 I never bothered, but you COULD buy a fairly expensive USB-over-Cat5 extender (Gefen makes one of the more reliable models, but it's $499!). I know of no such adapter for USB 3.0, and since the Kinect HAS to be connected for the console to operate, this means I won't be buying an Xbox One! Does anyone know of a product that will extend USB 3.0 over a Cat5 or Cat6 cable? Or any other solution?
  • epobirs - Saturday, May 25, 2013 - link

    There are USB 3.0-over-fiber solutions available, but frankly I doubt anyone at MS is losing sleep over the few homes with such odd arrangements.
  • Panzerknacker - Thursday, May 23, 2013 - link

    Is it just me, or are these new-gen consoles seriously lacking in CPU performance? Going by the benchmarks of the A4-5000, of which you could say the consoles have two, the CPU power isn't even going to come close to any i5, or maybe even i3, chip.

    Considering that they are running the x86 platform this time, which probably isn't the most efficient for running games (probably the reason why consoles in the past never used x86), and that they run lots of secondary applications alongside the game (which leaves maybe 6 of the 8 cores for the game on average), I think CPU performance is seriously lacking. CPU-intensive games will be a no-no on this next generation of consoles.
  • Th-z - Saturday, May 25, 2013 - link

    The first Xbox used an x86 CPU. Cost was the main reason not many consoles used x86 CPUs in the past: unlike IBM Power and ARM, x86 isn't licensed out to whatever company wants to make its own CPU. But this time they probably see the benefit outweighing the cost (or even lowering it) with the x86 APU design from AMD: good performance per dollar and per watt for both CPU and GPU. I'm not sure Power today can reach that kind of performance per dollar and per watt for a CPU, or that ARM has the CPU performance to run high-end games. Also bear in mind that consoles spend fewer CPU cycles running games than PCs do.
  • hfm - Thursday, May 23, 2013 - link

    "Differences in the memory subsytems also gives us some insight into each approach to the next-gen consoles. Microsoft opted for embedded SRAM + DDR3, while Sony went for a very fast GDDR5 memory interface. Sony’s approach (especially when combined with a beefier GPU) is exactly what you’d build if you wanted to give game developers the fastest hardware. Microsoft’s approach on the other hand looks a little more broad. The Xbox One still gives game developers a significant performance boost over the previous generation, but also attempts to widen the audience for the console."

    I don't quite understand how their choice of memory is going to "widen the audience for the console", unless it's going to make the Xbox One truly cheaper, which I doubt. Or maybe you're referring to the entire package with Kinect, though it didn't seem that way in the context of the statement.
  • FloppySnake - Friday, May 24, 2013 - link

    It's my understanding (following an AMD statement during a phone conference about the 8000M announcement) that ZeroCore has been enhanced for graceful fallback, powering down individual GPU segments rather than just the entire GPU. If this is employed we could see the PS4 delivering power as needed (I'm not sure what control they'll have over GDDR5 clocks, if any), so it's potentially not power-hungry unless it needs to be. Perhaps this warrants further investigation?

    I agree with the article that, used appropriately, the 32MB SRAM buffer could compensate for limited bandwidth, but only in a traditional pipeline; it could severely limit GPGPU potential, as there's limited back-and-forth bandwidth between the CPU and GPU and a buffer won't help there.

    For clarity, the new Kinect uses a time-of-flight depth sensor, a completely different technology from the previous Kinect's. This offers superior depth resolution and frame rate, but the XY resolution is actually something like 500x500 (or some combination that adds up to 250,000 pixels).
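
A quick sketch to put numbers on the GPU debate above (elitewolverine, alex@1234, Tuvok86). Sony has publicly quoted 1.84 TFLOPS for the PS4, which corresponds to 18 GCN CUs at 800MHz; the Xbox One’s GPU clock hasn’t been announced, so the 800MHz figure used for it here is an assumption.

```python
# Hedged comparison behind the percentages quoted in the comments. The PS4
# numbers follow Sony's published 1.84 TFLOPS figure; the Xbox One GPU
# clock (0.8 GHz) is a rumored value, not an announced spec.

def gcn_peak_tflops(cus, clock_ghz, sp_per_cu=64, flops_per_clock=2):
    """Peak single-precision throughput of a GCN GPU in TFLOPS."""
    return cus * sp_per_cu * flops_per_clock * clock_ghz / 1000

xbox_one = gcn_peak_tflops(cus=12, clock_ghz=0.8)   # ~1.23 TFLOPS (clock assumed)
ps4      = gcn_peak_tflops(cus=18, clock_ghz=0.8)   # ~1.84 TFLOPS (matches Sony's figure)

print(f"PS4 peak compute is ~{(ps4 / xbox_one - 1) * 100:.0f}% higher")      # ~50%
print(f"Xbox One peak compute is ~{(1 - xbox_one / ps4) * 100:.0f}% lower")  # ~33%
# Same gap, described from two directions -- which is roughly where the
# ~50% and ~32-33% figures floating around in the comments come from.
```

None of this captures the eSRAM or memory bandwidth side of the equation, which the next sketch looks at.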
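
On the memory subsystem side of the thread (hfm, FloppySnake), here’s the rough bandwidth picture. The 176GB/s GDDR5 figure is Sony’s announced number; the Xbox One’s DDR3 configuration and the ~102GB/s eSRAM figure come from pre-launch leaks and aren’t confirmed.

```python
# Rough bandwidth comparison. PS4 GDDR5 bandwidth matches Sony's announced
# 176 GB/s; the Xbox One DDR3-2133/256-bit setup and the ~102 GB/s eSRAM
# number are leaked, unconfirmed figures used for illustration only.

def peak_bandwidth_gbps(transfer_mts, bus_bits):
    """Peak bandwidth = transfer rate (MT/s) x bus width (bytes)."""
    return transfer_mts * (bus_bits // 8) / 1000

xb1_ddr3  = peak_bandwidth_gbps(2133, 256)   # ~68 GB/s  (assumed DDR3-2133, 256-bit)
xb1_esram = 102                              # GB/s, rumored, 32MB on-die pool only
ps4_gddr5 = peak_bandwidth_gbps(5500, 256)   # ~176 GB/s (matches Sony's figure)

print(f"Xbox One: ~{xb1_ddr3:.0f} GB/s DDR3 + ~{xb1_esram} GB/s eSRAM (32MB)")
print(f"PS4:      ~{ps4_gddr5:.0f} GB/s unified GDDR5")
# The eSRAM can hide the DDR3 deficit for render targets that fit in 32MB,
# but anything streaming through main memory still sees roughly a 2.5x gap.
```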
