Power Consumption

As always, I ran the Xbox One through a series of power consumption tests, described below:

Off - Console is completely off, standby mode is disabled
Standby - Console is asleep, can be woken up by voice commands (if supported). Background updating is allowed in this mode.
Idle - Ethernet connected, no disc in drive, system idling at dashboard.
Load (BF4) - Ethernet connected, Battlefield 4 disc in drive, running Battlefield 4, stationary in test scene.
Load (BD Playback) - Ethernet connected, Blu-ray disc in drive, average power across Inception test scene.
CPU Load - SunSpider - Ethernet connected, no disc in drive, running SunSpider 1.0.2 in web browser.
CPU Load - Kraken - Ethernet connected, no disc in drive, running Kraken 1.1 in web browser.

Power Consumption Comparison

| Total System Power | Off | Standby | Idle | Load (BF4) | Load (BD Playback) |
|---|---|---|---|---|---|
| Microsoft Xbox 360 Slim | 0.6W | - | 70.4W | 90.4W (RDR) | - |
| Microsoft Xbox One | 0.22W | 15.3W | 69.7W | 119.0W | 79.9W |
| Sony PlayStation 4 | 0.45W | 8.59W | 88.9W | 139.8W | 98.0W |
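The idle-to-load deltas discussed on this page fall straight out of the table; a quick sketch (figures transcribed from the table above, nothing measured here):

```python
# Total system power in watts, transcribed from the table above.
power = {
    "Xbox One":      {"idle": 69.7, "load_bf4": 119.0},
    "PlayStation 4": {"idle": 88.9, "load_bf4": 139.8},
}

for console, p in power.items():
    delta = p["load_bf4"] - p["idle"]
    pct = (p["load_bf4"] / p["idle"] - 1) * 100
    print(f"{console}: idle -> load delta {delta:.1f}W ({pct:.0f}% increase)")
```

Both consoles come out to roughly a 50W idle-to-load delta, which is what makes the high idle figures stand out.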

When I first saw the PS4’s idle numbers I was shocked. 80 watts is what our IVB-E GPU testbed idles at, and that’s with a massive 6-core CPU and a Titan GPU. Similarly, my Haswell + Titan CPU testbed has a lower idle power than that. The Xbox One’s numbers are a little better at 69W, but still 50 - 80% higher than I was otherwise expecting.

Standby power is also surprisingly high for the Xbox One. Granted, in this mode you can wake the entire console by saying "Xbox On," but always-on voice recognition is also something Motorola deployed on the Moto X, and it did so within a far smaller power budget.

The only good news on the power front is really what happens when the console is completely off. I’m happy to report that I measured between 0.22 and 0.45W of draw while off, far less than previous Xbox 360s.

Power under load is pretty much as expected. In general the Xbox One appears to draw ~120W under max load, which isn't much at all. I'm actually surprised by how small the delta is between idle power and loaded GPU power (~50W). It makes me wonder how much power gating Microsoft is doing on unused CPU cores and/or GPU resources; the same question applies to Sony and the PS4. It's entirely possible that AMD hasn't offered the same hooks into power management that you'd see on a PC equipped with an APU.

Blu-ray playback power consumption is more reasonable on the Xbox One than on the PS4. In both cases though the numbers are much higher than I’d like them to be.

I threw in some browser-based CPU benchmarks and power numbers as well. Both the Xbox One and PS4 ship with integrated web browsers. Neither experience is particularly well optimized for performance, but the PS4 definitely has the edge, at least in JavaScript performance.

Power Consumption Comparison

| Lower is Better | SunSpider 1.0.2 (Performance) | SunSpider 1.0.2 (Power) | Kraken 1.1 (Performance) | Kraken 1.1 (Power) |
|---|---|---|---|---|
| Microsoft Xbox One | 2360.9 ms | 72.4W | 111892.5 ms | 72.9W |
| Sony PlayStation 4 | 1027.4 ms | 114.7W | 22768.7 ms | 114.5W |
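Average power alone can mislead for a fixed-work benchmark: the PS4 draws far more power here, but it also finishes much sooner. A rough energy-to-completion estimate (runtime times average system power, both from the table above; this ignores the idle baseline, so treat it as a ballpark only):

```python
# (runtime in seconds, average system power in watts) from the table above.
results = {
    "Xbox One":      {"SunSpider": (2.3609, 72.4), "Kraken": (111.8925, 72.9)},
    "PlayStation 4": {"SunSpider": (1.0274, 114.7), "Kraken": (22.7687, 114.5)},
}

for console, tests in results.items():
    for name, (seconds, watts) in tests.items():
        print(f"{console} {name}: ~{seconds * watts:.0f} J to complete")
```

By this estimate the PS4 actually consumes less total energy per run on both tests, a classic race-to-idle result.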

Power consumption while running these CPU workloads is interesting. The marginal increase in system power while running both tests on the Xbox One indicates one of two things: either we're only taxing 1 - 2 cores here, or Microsoft isn't power gating unused CPU cores (or both). I suspect it's the former, since IE on the Xbox technically falls under the Windows kernel's jurisdiction and I don't believe it has more than 1 - 2 cores allocated for its needs.

The PS4 on the other hand shows a far bigger increase in power consumption during these workloads. For one we’re talking about higher levels of performance, but it’s also possible that Sony is allowing apps access to more CPU cores.

There’s definitely room for improvement in driving down power consumption on both next-generation platforms. I don’t know that there’s huge motivation to do so outside of me complaining about it though. I would like to see idle power drop below 50W, standby power shouldn’t be anywhere near this high on either platform, and the same goes for power consumption while playing back a Blu-ray movie.

Comments

  • djboxbaba - Wednesday, November 20, 2013 - link

    you have the cutest name ever bra, "wolfpup" kudos ^^
  • melgross - Wednesday, November 20, 2013 - link

    Hey noob, it doesn't work that way. SRAM is not equivalent to high-speed GDDR5. This has been well established already. You do get some boost at some points, but it's not covering every area of performance the way GDDR5 does.
  • CubesTheGamer - Wednesday, November 20, 2013 - link

    Newb talk? No, you can't add them together. Let me tell you why, in technical terms.

    ESRAM is meant to be a cache, and what a cache does is hold data that you're going to need a lot (say, instruction code or other data that needs to be read frequently). You put that data in the ESRAM, and it gets read 10+ times before being swapped for some other data. What you're saying makes it seem like we can constantly write and read from the ESRAM. That's not how it works.

    tl;dr: You can't add them together, because you should only hit the ESRAM about 1/10th as often as the main DDR3 RAM the Xbox One has. So your argument is invalid; don't say things you don't know about.
  • smartypnt4 - Thursday, November 21, 2013 - link

    He's indeed wrong, but I'd be willing to bet good money your hit rate on that eSRAM is way higher than 10% if it's used as a cache. Typical last-level caches reach hit rates in the 80% range thanks to prefetching, and large ones like this have even higher hit rates.

    If it's hardware mapped like the article indicates (i.e. not a global cache, but more like it was on the 360), it won't hit quite as often with a naive program, but a good developer could ensure that the bulk of memory accesses hit that eSRAM and not main memory.
  • Da W - Friday, November 22, 2013 - link

    XBone is ROP bound. Will you stop bitching around with your geometry and bandwidth? It's all about ROP!
  • MadMan007 - Wednesday, November 20, 2013 - link

    I can add $2 and a whore together too, that doesn't make it good.
  • looncraz - Thursday, November 21, 2013 - link

    You actually only get about 100GB/s READ or 100GB/s WRITE... The best-case scenario on the XBox One is 68GB/s + 100GB/s - still NOT matching the PS4's capabilities for reading/writing ANY memory... and only in certain situations where you are streaming from both memory systems.

    Xbone PEAKS below PS4's AVERAGE memory performance.
  • daverasaro - Saturday, November 23, 2013 - link

    Huh? Actually you are wrong. The Xbox One uses 8GB of DDR3 RAM at 2133 MHz for 68.3 GB/s of bandwidth, but also adds an extra 32 MB of ESRAM for 102 GB/s of embedded memory bandwidth. The PS4 uses 8GB of GDDR5 RAM at 5500 MHz for 170.6 GB/s of bandwidth.
  • daverasaro - Saturday, November 23, 2013 - link

    32GB*
  • SunLord - Wednesday, November 20, 2013 - link

    I don't get this constant worrying about power usage on non-mobile devices. They plug into a wall, and as long as it's not some obscene amount of draw (300+W) I don't care, damn it... Heat can be an issue, but I'm personally not even remotely concerned that it might cost me $3 more a year in power to use my $400 PS4. If I were, I shouldn't be buying a PS4 or Xbox One, let alone games for them.
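On the eSRAM argument running through the comments above: rather than summing the 68.3 GB/s DDR3 and ~102 GB/s eSRAM figures, a simple way to reason about it is a hit-rate-weighted average. This is only a back-of-envelope model using the bandwidth numbers quoted in the thread; it ignores the concurrent read/write streaming case where the two pools genuinely add:

```python
def effective_bandwidth(hit_rate, esram_gbps=102.0, ddr3_gbps=68.3):
    """Average bandwidth if a fraction `hit_rate` of accesses are served
    from eSRAM and the remainder fall through to DDR3."""
    return hit_rate * esram_gbps + (1.0 - hit_rate) * ddr3_gbps

for hr in (0.1, 0.5, 0.8):
    print(f"hit rate {hr:.0%}: ~{effective_bandwidth(hr):.1f} GB/s effective")
```

Even at an optimistic 80% hit rate this lands around 95 GB/s, well short of simply adding the two peak figures together.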
