Power & Thermals

Microsoft made a point to focus on the Xbox One’s new power states during its introduction. Remember that when the Xbox 360 was introduced, power gating wasn’t present in any shipping CPU or GPU architectures. The Xbox One (and likely the PlayStation 4) can power gate unused CPU cores. AMD’s GCN architecture supports power gating, so I’d assume that parts of the GPU can be power gated as well. Dynamic frequency/voltage scaling is also supported. The result is that we should see a good dynamic range of power consumption on the Xbox One, compared to the Xbox 360’s more on/off nature.

AMD’s Jaguar is quite power efficient, capable of low single-digit idle power, so I would expect far lower idle power consumption than even the current slim Xbox 360 (under 50W would be easy; 20W should be doable when truly idle). Under heavy gaming load I’d expect to see higher power consumption than the current Xbox 360, but still less than the original 2005 Xbox 360.
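
To put rough numbers on why DVFS and power gating widen that range, recall the classic CMOS dynamic power relation P ≈ C·V²·f: dropping voltage and frequency together cuts per-core power superlinearly, and gating a core removes its contribution entirely. Here's a minimal sketch; every constant is a hypothetical placeholder, not a measured Xbox One figure:

```python
# Back-of-the-envelope model of a power-gated, DVFS-capable CPU.
# All constants are invented placeholders, not Xbox One measurements.

def dynamic_power(c_eff, voltage, freq_hz):
    """Classic CMOS dynamic power approximation: P = C * V^2 * f."""
    return c_eff * voltage**2 * freq_hz

C_EFF = 1.0e-9  # effective switched capacitance per core (F), made up

full   = dynamic_power(C_EFF, 1.2, 1.6e9)  # full voltage/frequency
scaled = dynamic_power(C_EFF, 0.9, 0.8e9)  # DVFS-reduced operating point
gated  = 0.0                               # power-gated core: ~zero dynamic power

print(f"full: {full:.2f} W, dvfs: {scaled:.2f} W, gated: {gated:.2f} W")
# full: 2.30 W, dvfs: 0.65 W, gated: 0.00 W -> a wide dynamic range
```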

Compared to the PlayStation 4, Microsoft should have the cooler-running console under load. Fewer GPU ALUs and lower-power memory don’t help with performance, but they do at least offer this one side benefit.

OS

The Xbox One is powered by two independent OSes running on a custom version of Microsoft’s Hyper-V hypervisor. Microsoft made the hypervisor very lightweight, and created hard partitions of system resources for the two OSes that run on top of it: the Xbox OS and the Windows kernel.

The Xbox OS is used to play games, while the Windows kernel effectively handles all apps (as well as things like some of the processing for Kinect inputs). Since both OSes are just VMs on the same hypervisor, they are both running simultaneously all of the time, enabling seamless switching between the two. With much faster hardware and more cores (8 vs 3 in the Xbox 360), Microsoft can likely dedicate Xbox 360-like CPU performance to the Windows kernel while running games without any negative performance impact. Transitioning in/out of a game should be very quick thanks to this architecture. It makes a ton of sense.

Similarly, you can now multitask with apps. Microsoft enabled Windows 8-like multitasking where you can snap an app to one side of the screen while watching a video or playing a game on the other.

It would be nice to know more about the hard partitioning of resources. The easiest approach would be to dedicate one Jaguar compute module (four cores) to each OS, but that might end up being overkill for the Windows kernel and insufficient for some gaming workloads. I suspect ~1GB of system memory ends up being carved off for Windows.
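
As a thought experiment, here's a minimal sketch of what a static, hard partition could look like, using the module-per-OS split described above and the ~1GB memory guess; none of these numbers are confirmed, and the Partition type is illustrative, not a real Hyper-V interface:

```python
# Hypothetical resource split between the two Xbox One OSes.
# Core counts and memory sizes are speculation, not Microsoft's figures.
from dataclasses import dataclass

@dataclass(frozen=True)
class Partition:
    name: str
    cpu_cores: tuple  # core IDs pinned to this VM
    memory_gb: float  # RAM carved off at boot, never rebalanced

TOTAL_CORES, TOTAL_MEMORY_GB = 8, 8.0

# "Easiest" split: one four-core Jaguar module per OS (possibly overkill
# for Windows); memory split follows the ~1GB guess above.
windows = Partition("Windows kernel", cpu_cores=(4, 5, 6, 7), memory_gb=1.0)
xbox    = Partition("Xbox OS",        cpu_cores=(0, 1, 2, 3),
                    memory_gb=TOTAL_MEMORY_GB - 1.0)

# A hard partition is static: games can always rely on their share,
# and the app OS can never starve them.
assert len(windows.cpu_cores) + len(xbox.cpu_cores) == TOTAL_CORES
assert windows.memory_gb + xbox.memory_gb == TOTAL_MEMORY_GB
```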

Kinect & New Controller

All Xbox One consoles will ship with a bundled Kinect sensor. Game console accessories generally don’t do all that well if they’re optional. Kinect seemed to be the exception to the rule, but Microsoft is very focused on Kinect being a part of the Xbox going forward, so integration here makes sense.

The One’s introduction was done entirely via Kinect-enabled voice and gesture controls. You can even wake the Xbox One from a sleep state using your voice (say “Xbox on”), leveraging Kinect and good power gating at the silicon level. You can use large two-hand pinch and stretch gestures to quickly move in and out of the One’s home screen.

The Kinect sensor itself is one of five semi-custom silicon elements in the Xbox One - the other four are the SoC, PCH, Kinect IO chip and Blu-ray DSP (read: the end of optical drive based exploits). In the One’s Kinect implementation, Microsoft goes from a 640 x 480 sensor to 1920 x 1080 (I’m assuming 1080p for the depth stream as well). The camera’s field of view was increased by 60%, allowing support for up to 6 recognized skeletons (compared to 2 in the original Kinect). Taller users can now get closer to the camera thanks to the larger FOV; similarly, the sensor can be used in smaller rooms.
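
For a sense of scale, the resolution bump alone is a 6.75x jump in pixel count; a quick sanity check of the figures above:

```python
# Quick arithmetic on the quoted Kinect sensor figures.
old_px = 640 * 480     # original Kinect:   307,200 pixels
new_px = 1920 * 1080   # Xbox One Kinect: 2,073,600 pixels

print(f"pixel count increase: {new_px / old_px:.2f}x")  # 6.75x
print("recognized skeletons: 2 -> 6 (helped by the 60% wider FOV)")
```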

The Xbox One will also ship with a new redesigned wireless controller with vibrating triggers:

Thanks to Kinect's higher resolution and more sensitive camera, the console should be able to identify who is gaming and automatically pair the user to the controller.
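
Microsoft hasn't detailed how that pairing works; as a purely hypothetical illustration, the core idea could be as simple as matching each tracked player to the nearest controller the camera can localize:

```python
# Hypothetical sketch of Kinect-driven controller pairing -- not
# Microsoft's actual mechanism, just the nearest-neighbor idea.

def pair_controllers(players, controllers):
    """Map each identified player to the closest detected controller.

    players:     {player_id: (x, y)} positions from skeleton tracking
    controllers: {controller_id: (x, y)} inferred controller positions
    """
    def dist2(a, b):
        return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2

    return {pid: min(controllers, key=lambda cid: dist2(pos, controllers[cid]))
            for pid, pos in players.items()}

# Two players, two controllers: each pairs with the nearer one.
print(pair_controllers({"anand": (0.2, 1.0), "guest": (1.5, 1.0)},
                       {"ctrl_1": (0.3, 0.9), "ctrl_2": (1.4, 1.1)}))
# {'anand': 'ctrl_1', 'guest': 'ctrl_2'}
```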

TV

The Xbox One features an HDMI input for cable TV passthrough (from a cable box or some other tuner with HDMI out). Content passed through can be viewed with overlays from the Xbox, or just as you would if the Xbox weren’t present. Microsoft built its own electronic program guide that allows you to tune channels by name, not just channel number (e.g. say “Watch HBO”). The implementation looks pretty slick, and should hopefully keep you from having to switch inputs on your TV - the Xbox One should drive everything. Microsoft appears to be doing its best to merge legacy TV with the new world of buying/renting content via Xbox Live. It’s a smart move.
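
As a sketch of what name-based tuning implies under the hood, here's a hypothetical "Watch <channel>" handler; the channel map and parsing are invented for illustration, not Microsoft's implementation:

```python
# Hypothetical name-based tuner: resolves a spoken channel name to a
# channel number. The mapping and grammar are made up for illustration.
CHANNEL_MAP = {"hbo": 501, "espn": 206, "cbs": 2}  # name -> tuner number

def handle_voice_command(utterance: str):
    """Parse a 'Watch <channel name>' phrase into a tune action."""
    words = utterance.lower().split()
    if len(words) >= 2 and words[0] == "watch":
        name = " ".join(words[1:])
        if name in CHANNEL_MAP:
            return ("tune", CHANNEL_MAP[name])
    return ("unrecognized", utterance)

print(handle_voice_command("Watch HBO"))  # ('tune', 501)
```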

One area where Microsoft is being a bit more aggressive is in its work with the NFL. Microsoft demonstrated fantasy football integration while watching NFL passed through to the Xbox One.

Comments

  • elitewolverine - Thursday, May 23, 2013 - link

    It's the same GPU at heart; sure, the shader count is lower, because of the eSRAM. You might want to rethink how the internals work. The advantage will be very minimal.
  • alex@1234 - Friday, May 24, 2013 - link

    Every place mentions ~32% higher GPU power; I don't think a GTX 660 Ti and a GTX 680 are equal. For sure the PS4 holds the advantage: fewer shaders and lower specs in everything compared to the PS4, plus DDR3 in the Xbox One versus GDDR5 in the PS4. As for the eSRAM, I will tell you something: you can have an SSD and 32 GB of RAM, but it cannot make up for a better GPU.
  • cjb110 - Thursday, May 23, 2013 - link

    In some ways this is the opposite of the previous generation. The 360 screamed games (at least its original dashboard did), whereas the PS3 had all the potential media support (though the XMB interface let it down) as well as being an excellent Blu-ray player (which is the whole reason I got mine).

    This time around MS has gone all-out entertainment that can also do games, whereas Sony seems to have gone games first. I imagine the PS4 is physically more flashy too, like the PS3 and 360 were... game devices, not family entertainment boxes.

    Personally I'm keeping the 360 for my games library, and the One will likely replace the PS3.
  • Tuvok86 - Thursday, May 23, 2013 - link

    Xbox One ~ Radeon HD 7770 GHz Edition
    PS4 ~ Radeon HD 7850
  • jnemesh - Thursday, May 23, 2013 - link

    One of my biggest concerns with the new system is the Kinect requirement. I have my Xbox and other electronics in a rack in the closet. I would need to extend the USB 3.0 connection (and I am assuming this time around the Kinect is using a standard USB connector on all models) over 40 feet to get the wire from my closet to the location beneath or above my wall-mounted TV. With the existing Kinect for the 360, I never bothered with it, but you COULD buy a fairly expensive USB-over-Cat5 extender (Gefen makes one of the more reliable models, but it's $499!). I know of no such adapter for USB 3.0, and since Kinect HAS to be used for the console to operate, this means I won't be buying an Xbox One! Does anyone know of a product that will extend USB 3.0 over a Cat5 or Cat6 cable? Or any solution?
  • epobirs - Saturday, May 25, 2013 - link

    There are USB 3.0 over fiber solutions available but frankly, I doubt anyone at MS is losing sleep over those few homes with such odd arrangements.
  • Panzerknacker - Thursday, May 23, 2013 - link

    Is it just me, or are these new-gen consoles seriously lacking in CPU performance? According to the benchmarks of the A4-5000, of which you could say the consoles have two, the CPU power is not even going to come close to any i5, or maybe even an i3 chip.

    Considering that they are running the x86 platform this time, which probably is not the most efficient for running games (probably the reason why consoles in the past never used x86), and that they run lots of secondary applications next to the game (which leaves maybe 6 of the 8 cores for the game on average), I think CPU performance is seriously lacking. CPU-intensive games will be a no-no on this next gen of consoles.
  • Th-z - Saturday, May 25, 2013 - link

    The first Xbox used an x86 CPU. Cost was the main reason not many consoles used x86 CPUs in the past: unlike IBM's Power and ARM, x86 isn't licensed out to whatever company wants to make its own CPU. But this time they probably saw the benefits outweighing the cost (or even lowering it) with an x86 APU design from AMD - good performance per dollar and per watt for both CPU and GPU. I am not sure if Power today can reach this kind of performance per dollar/per watt for a CPU, or if ARM has the CPU performance to run high-end games. Also bear in mind that consoles use fewer CPU cycles to run games than PCs.
  • hfm - Thursday, May 23, 2013 - link

    "Differences in the memory subsytems also gives us some insight into each approach to the next-gen consoles. Microsoft opted for embedded SRAM + DDR3, while Sony went for a very fast GDDR5 memory interface. Sony’s approach (especially when combined with a beefier GPU) is exactly what you’d build if you wanted to give game developers the fastest hardware. Microsoft’s approach on the other hand looks a little more broad. The Xbox One still gives game developers a significant performance boost over the previous generation, but also attempts to widen the audience for the console."

    I don't quite understand how their choice of memory is going to "widen the audience for the console", unless it's going to cause the Xbox One to truly be cheaper, which I doubt. Or maybe you are referring to the entire package with Kinect, though it didn't seem so in the context of the statement.
  • FloppySnake - Friday, May 24, 2013 - link

    It's my understanding (following an AMD statement during a phone conference about the 8000M announcement) that ZeroCore has been enhanced for graceful fallback, powering down individual GPU segments, not just the entire GPU. If this is employed, we could see the PS4 delivering power as needed (I'm not sure what control they'll have over GDDR5 clocks, if any), making it potentially not power hungry unless it needs to be. Perhaps this warrants further investigation?

    I agree with the article that if used appropriately, the 32MB SRAM buffer could compensate for limited bandwidth, but only in a traditional pipeline; it could severely limit GPGPU potential, as there's limited back-and-forth bandwidth between the CPU and GPU, and a buffer won't help there.

    For clarity, the new Kinect uses a time-of-flight depth sensor, a completely different technology from the previous Kinect's. This offers superior depth resolution and frame rate, but the XY resolution is actually something like 500x500 (or some combination that adds up to 250,000 pixels).
