Power Consumption

As always, I ran the Xbox One through a series of power consumption tests. I’ve described the tests below:

Off - Console is completely off, standby mode is disabled.
Standby - Console is asleep, can be woken up by voice commands (if supported). Background updating is allowed in this mode.
Idle - Ethernet connected, no disc in drive, system idling at dashboard.
Load (BF4) - Ethernet connected, Battlefield 4 disc in drive, running Battlefield 4, stationary in test scene.
Load (BD Playback) - Ethernet connected, Blu-ray disc in drive, average power across Inception test scene.
CPU Load - SunSpider - Ethernet connected, no disc in drive, running SunSpider 1.0.2 in web browser.
CPU Load - Kraken - Ethernet connected, no disc in drive, running Kraken 1.1 in web browser.

Power Consumption Comparison

Total System Power        Off     Standby   Idle    Load (BF4)    Load (BD Playback)
Microsoft Xbox 360 Slim   0.6W    -         70.4W   90.4W (RDR)   -
Microsoft Xbox One        0.22W   15.3W     69.7W   119.0W        79.9W
Sony PlayStation 4        0.45W   8.59W     88.9W   139.8W        98.0W

When I first saw the PS4’s idle numbers I was shocked. 80 watts is what our IVB-E GPU testbed idles at, and that’s with a massive 6-core CPU and a Titan GPU. Similarly, my Haswell + Titan testbed idles at even lower power. The Xbox One’s numbers are a little better at 69W, but still 50 - 80% higher than I was expecting.

Standby power is also surprisingly high for the Xbox One. Granted, in this mode you can wake the entire console by saying “Xbox On,” but Motorola deployed always-on voice recognition on the Moto X within a far smaller power budget.

The only good news on the power front is really what happens when the consoles are completely off. I’m happy to report that I measured between 0.22W (Xbox One) and 0.45W (PS4) of draw while off, far less than previous Xbox 360s.

Power under load is pretty much as expected. In general the Xbox One appears to draw ~120W under max load, which isn’t much at all. I’m actually surprised by how small the delta between idle power and loaded GPU power is (~50W), which makes me wonder whether Microsoft is doing much power gating of unused CPU cores and/or GPU resources. The same is true for Sony on the PS4. It’s entirely possible that AMD hasn’t offered the same hooks into power management that you’d see on a PC equipped with an APU.
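
To put that delta in perspective, here’s a quick back-of-the-envelope check using the figures from the table above; the ~100W APU budget in the comment is my assumption, not a published spec:

```python
# Idle-to-load deltas, using the measurements from the table above.
xbox_idle, xbox_load = 69.7, 119.0   # watts
ps4_idle, ps4_load = 88.9, 139.8     # watts

print(f"Xbox One delta: {xbox_load - xbox_idle:.1f}W")  # 49.3W
print(f"PS4 delta:      {ps4_load - ps4_idle:.1f}W")    # 50.9W

# If the APU draws roughly 100W flat out (my assumption, not a spec),
# a ~50W swing implies around half of that budget is still being
# burned at an idle dashboard: consistent with little or no gating.
```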

Blu-ray playback power consumption is more reasonable on the Xbox One than on the PS4. In both cases, though, the numbers are much higher than I’d like them to be.

I threw in some browser-based CPU benchmarks and power numbers as well. Both the Xbox One and PS4 ship with integrated web browsers. Neither experience is particularly well optimized for performance, but the PS4 definitely has the edge, at least in JavaScript performance.

Power Consumption Comparison (lower is better)

                     SunSpider 1.0.2 (Performance)   SunSpider 1.0.2 (Power)   Kraken 1.1 (Performance)   Kraken 1.1 (Power)
Microsoft Xbox One   2360.9 ms                       72.4W                     111892.5 ms                72.9W
Sony PlayStation 4   1027.4 ms                       114.7W                    22768.7 ms                 114.5W

Power consumption while running these CPU workloads is interesting. The marginal increase in system power consumption while running both tests on the Xbox One indicates one of two things (or both): we’re only taxing one or two cores here, and/or Microsoft isn’t power gating unused CPU cores. I suspect it’s the former, since IE on the Xbox technically falls under the Windows kernel’s jurisdiction and I don’t believe it has more than one or two cores allocated for its needs.
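
A quick sanity check on that theory, using the idle and SunSpider figures from the two tables above (the per-core inferences in the comments are my speculation, not measured data):

```python
# System power increase over idle while running SunSpider.
xbox_idle, xbox_js = 69.7, 72.4   # watts
ps4_idle, ps4_js = 88.9, 114.7    # watts

print(f"Xbox One: +{xbox_js - xbox_idle:.1f}W over idle")  # +2.7W
print(f"PS4:      +{ps4_js - ps4_idle:.1f}W over idle")    # +25.8W

# A ~3W bump is about right for one or two lightly loaded Jaguar
# cores; a ~26W bump suggests many more cores (and/or higher clocks)
# waking up. Core allocations per app are assumptions on my part.
```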

The PS4, on the other hand, shows a far bigger increase in power consumption during these workloads. For one, we’re talking about higher levels of performance, but it’s also possible that Sony is allowing apps access to more CPU cores.

There’s definitely room for improvement in driving down power consumption on both next-generation platforms, though I don’t know that there’s huge motivation to do so outside of me complaining about it. I’d like to see idle power drop below 50W, standby power shouldn’t be anywhere near this high on either platform, and the same goes for power consumption while playing back a Blu-ray movie.

Comments

  • kyuu - Wednesday, November 20, 2013

    I don't care. Why should I? The only thing that goes on in my living room is playing games and watching TV. So even in the unlikely event that the Kinect camera is feeding somebody (NSA? Microsoft interns? Who exactly am I supposed to be afraid of again?) a 24/7 feed of my living room and somebody is actually looking at it, big whoop.

    I'm not planning on purchasing either console, btw. Just irritated by the tin-foil hat brigade pretending it's reasonable to be scared by the Kinect.
  • kyuu - Wednesday, November 20, 2013

    Oh, and not to mention that if that is actually taking place, it'll be found out pretty quickly and there'll be a huge backlash against Microsoft. The huge potential for negative press and lost sales for absolutely no gain makes me pretty sure it's not going on, though.
  • prophet001 - Thursday, November 21, 2013

    How sad.

    Microsoft, Google, Sony, and every other corporation out there have absolutely zero right to invade my privacy, whether or not I’m doing anything “wrong.” You, my friend, will not know what you’ve lost until it is truly gone.
  • mikato - Monday, November 25, 2013

    I don't think it will be a problem (see kyuu), but I really disagree with your "nothing to hide" attitude.
    http://en.wikipedia.org/wiki/Nothing_to_hide_argum...
  • Floew - Wednesday, November 20, 2013

    I recently built a Steam box. With a 360 controller/wireless adapter and Steam Big Picture set to launch on startup, it's a surprisingly console-like experience. Works much better than I had expected, frankly. My motivation to plunk down cash for the new consoles is now very low.
  • Quidam67 - Wednesday, November 20, 2013

    Anand, just wondering if the Xbox One controller works with a Windows based PC (as per the 360 controller)? Would be great if you could try that out and let us know :)
  • The Von Matrices - Wednesday, November 20, 2013

    The wireless Xbox 360 controller required a special USB receiver to work with a PC, and it took a few years for that to be released. I don't know if Xbox One controllers are compatible with the 360 wireless controller receiver or if a new one is required. I actually liked the wired Xbox 360 controller for certain PC games, and I'm curious to know if Microsoft will make wired Xbox One controllers.
  • Quidam67 - Sunday, November 24, 2013

    Targeted to work with PCs in 2014, apparently: http://www.polygon.com/2013/8/12/4615454/xbox-one-...
  • errorr - Wednesday, November 20, 2013

    There is a lot of discussion about the memory bandwidth issues, but what I want to know is how latency affects the performance picture. The eSRAM latency might be an order of magnitude lower even if its capacity is small. Which workloads are latency-dependent enough that the Xbox design might have a performance advantage?
  • khanov - Wednesday, November 20, 2013

    It is important to understand that GPUs work in a fundamentally different way to CPUs. The main difference when it comes to memory access is how they deal with latency.

    CPUs require cache to hide memory access latency. If the required instructions/data are not in cache there is a large latency penalty and the CPU core sits there doing nothing useful for hundreds of clock cycles. For this reason CPU designers pay close attention to cache size and design to ensure that cache hit rates stay north of 99% (on any modern CPU).

    GPUs do it differently. Any modern GPU has many thousands of threads in flight at once (even if it has, for example, only 512 shader cores). When a memory access is needed, it is queued up and attended to by the memory controller in a timely fashion, but there is still a latency of hundreds of clock cycles to consider. So what the GPU does is switch to a different group of threads and process those other threads while it waits for the memory access to complete.

    In fact, whenever the needed data is not available, the GPU will switch thread groups so that it can continue to do useful work. If you consider that any given frame of a game contains millions of pixels, and that GPU calculations need to be performed for each and every pixel, then you can see how there would almost always be more threads waiting to switch over to. By switching threads instead of waiting and doing nothing, GPUs effectively hide memory latency very well. But they do it in a completely different way to a CPU.

    Because a GPU has many thousands of threads in flight at once, and each thread group is likely at some point to require some data fetched from memory, the memory bandwidth becomes a much more important factor than memory latency. Latency can be hidden by switching thread groups, but bandwidth constraints limit the overall amount of data that can be processed by the GPU per frame.
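
    If you want to put rough numbers on that, here is a minimal sketch of the occupancy math (both figures are illustrative assumptions, not specs for either console's GPU):

    ```python
    # Little's law applied to GPU latency hiding. Numbers are illustrative.
    mem_latency = 400          # cycles for a DRAM round trip (assumed)
    alu_cycles_per_access = 8  # cycles of math between memory requests (assumed)

    # Thread groups a core must keep resident so that while some groups
    # wait on memory, others always have ALU work ready to go:
    groups_needed = 1 + mem_latency / alu_cycles_per_access
    print(f"~{groups_needed:.0f} resident thread groups per core")  # ~51

    # A 1080p frame has ~2 million pixels, so the GPU never runs short
    # of threads to switch to; latency vanishes behind the switching,
    # and bandwidth becomes the real limit on data processed per frame.
    ```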

    This is, in a nutshell, why all modern pc graphics cards at the mid and high end use GDDR5 on a wide bus. Bandwidth is king for a GPU.

    The Xbox One attempts to offset some of its apparent lack of memory bandwidth by storing frequently used buffers in eSRAM. The eSRAM has a fairly high effective bandwidth, but its size is small. It still remains to be seen how effectively it can be used by talented developers. But you should not worry about its latency. Latency is really not important to the GPU.
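
    To give a sense of how tight that small size is, here is a quick sizing check (the 32MB capacity is Microsoft's published figure; the render-target layout is my own example, not any shipping game's):

    ```python
    # Does a basic 1080p frame fit in the Xbox One's 32MB of eSRAM?
    width, height = 1920, 1080
    color = width * height * 4  # RGBA8 color target, bytes
    depth = width * height * 4  # 32-bit depth/stencil target, bytes

    total_mib = (color + depth) / 2**20
    print(f"color + depth: {total_mib:.1f} MiB of 32 MiB")  # ~15.8 MiB

    # A simple color+depth pair fits, but a deferred G-buffer with a few
    # extra render targets quickly exceeds 32MB, forcing developers to
    # tile passes or spill some buffers out to the slower DDR3 pool.
    ```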

    I hope this helps you to understand why everyone goes on and on about bandwidth. Sorry if it is a little long-winded.
