Power Consumption

As always I ran the Xbox One through a series of power consumption tests. I’ve described the tests below:

Off - Console is completely off, standby mode is disabled.
Standby - Console is asleep, can be woken up by voice commands (if supported). Background updating is allowed in this mode.
Idle - Ethernet connected, no disc in drive, system idling at dashboard.
Load (BF4) - Ethernet connected, Battlefield 4 disc in drive, running Battlefield 4, stationary in test scene.
Load (BD Playback) - Ethernet connected, Blu-ray disc in drive, average power across the Inception test scene (see the averaging sketch after this list).
CPU Load - SunSpider - Ethernet connected, no disc in drive, running SunSpider 1.0.2 in the web browser.
CPU Load - Kraken - Ethernet connected, no disc in drive, running Kraken 1.1 in the web browser.
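
The article doesn’t detail the metering setup, but a figure like “average power across the Inception test scene” is presumably just the mean of periodic wall-power samples taken over the scene. Below is a minimal sketch of that kind of averaging; read_watts() is a hypothetical stand-in for whatever polling interface a logging power meter exposes:

```python
import time

def read_watts():
    """Hypothetical stand-in for polling a logging wall-power meter (USB/serial/etc.)."""
    raise NotImplementedError("replace with your meter's read call")

def average_power(duration_s, interval_s=1.0):
    """Average wall power over a test scene by sampling every interval_s seconds."""
    samples = []
    end = time.monotonic() + duration_s
    while time.monotonic() < end:
        samples.append(read_watts())
        time.sleep(interval_s)
    return sum(samples) / len(samples)

# e.g. average_power(20 * 60) for a ~20 minute Blu-ray test scene
```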

Power Consumption Comparison

Total System Power       | Off    | Standby | Idle   | Load (BF4)   | Load (BD Playback)
Microsoft Xbox 360 Slim  | 0.6W   | -       | 70.4W  | 90.4W (RDR)  | -
Microsoft Xbox One       | 0.22W  | 15.3W   | 69.7W  | 119.0W       | 79.9W
Sony PlayStation 4       | 0.45W  | 8.59W   | 88.9W  | 139.8W       | 98.0W

When I first saw the PS4’s idle numbers I was shocked. 80 watts is what our IVB-E GPU testbed idles at, and that’s with a massive 6-core CPU and a Titan GPU; my Haswell + Titan testbed idles even lower. The Xbox One’s number is a little better at 69.7W, but it’s still 50 - 80% higher than I was otherwise expecting.

Standby power is also surprisingly high for the Xbox One. Granted, in this mode you can turn on the entire console by saying “Xbox On,” but always-on voice recognition is something Motorola also deployed on the Moto X, and it did so within a far smaller power budget.

The only good news on the power front is really what happens when the consoles are completely off. I’m happy to report that I measured between 0.22W and 0.45W of draw while off, far less than previous Xbox 360s.

Power under load is pretty much as expected. In general the Xbox One appears to draw ~120W under max load, which isn’t much at all. I’m actually surprised the delta between idle power and loaded GPU power is only ~50W, which makes me wonder how much (if any) power gating of unused CPU cores and/or GPU resources Microsoft is really doing. The same question applies to Sony and the PS4. It’s entirely possible that AMD hasn’t offered the same hooks into power management that you’d see on a PC equipped with an APU.
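
For reference, here’s the arithmetic behind that ~50W figure, pulled straight from the table above (a trivial sketch):

```python
# Idle-to-load delta under Battlefield 4, using the measured total system power (watts).
measurements = {
    "Xbox One": {"idle": 69.7, "load_bf4": 119.0},
    "PS4":      {"idle": 88.9, "load_bf4": 139.8},
}

for console, m in measurements.items():
    delta = m["load_bf4"] - m["idle"]
    print(f"{console}: +{delta:.1f}W from idle to Battlefield 4")
# Xbox One: +49.3W, PS4: +50.9W -- both right around the ~50W delta discussed above
```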

Blu-ray playback power consumption is more reasonable on the Xbox One than on the PS4. In both cases though the numbers are much higher than I’d like them to be.

I threw in some browser-based CPU benchmarks and power numbers as well. Both the Xbox One and PS4 ship with integrated web browsers. Neither experience is particularly well optimized for performance, but the PS4 definitely has the edge, at least in JavaScript performance.

Power Consumption Comparison

Lower is Better      | SunSpider 1.0.2 (Performance) | SunSpider 1.0.2 (Power) | Kraken 1.1 (Performance) | Kraken 1.1 (Power)
Microsoft Xbox One   | 2360.9 ms                     | 72.4W                   | 111892.5 ms              | 72.9W
Sony PlayStation 4   | 1027.4 ms                     | 114.7W                  | 22768.7 ms               | 114.5W

Power consumption while running these CPU workloads is interesting. The marginal increase in system power consumption while running both tests on the Xbox One indicates one of two things (or possibly both): we’re only taxing 1 - 2 cores here, and/or Microsoft isn’t power gating unused CPU cores. I suspect it’s the former, since IE on the Xbox technically falls under the Windows kernel’s jurisdiction and I don’t believe it has more than 1 - 2 cores allocated for its needs.

The PS4 on the other hand shows a far bigger increase in power consumption during these workloads. Part of that is simply its higher performance, but it’s also possible that Sony is allowing apps access to more CPU cores.
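
To put numbers on that, the marginal draw over idle and the energy used per benchmark run fall straight out of the two tables above. Keep in mind the energy figure is total system power multiplied by runtime, so it lumps in whatever else the console is doing; a quick sketch:

```python
# Marginal power over idle and energy per run, using the idle and browser-benchmark
# measurements above (runtimes converted from ms to seconds, power in watts).
IDLE = {"Xbox One": 69.7, "PS4": 88.9}

RESULTS = {
    "Xbox One": {"SunSpider": (2.3609, 72.4), "Kraken": (111.8925, 72.9)},
    "PS4":      {"SunSpider": (1.0274, 114.7), "Kraken": (22.7687, 114.5)},
}

for console, benches in RESULTS.items():
    for bench, (seconds, watts) in benches.items():
        delta = watts - IDLE[console]   # extra power the workload pulls over idle
        energy = watts * seconds        # joules for one run, whole system
        print(f"{console:8s} {bench:9s} +{delta:5.1f}W over idle, {energy:7.1f} J per run")

# The Xbox One adds only ~3W over idle; the PS4 adds ~26W but finishes Kraken roughly
# 5x faster, so its energy per Kraken run is actually lower.
```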

There’s definitely room for improvement in driving down power consumption on both next-generation platforms, though I don’t know that there’s huge motivation to do so outside of me complaining about it. I’d like to see idle power drop below 50W, and standby power shouldn’t be anywhere near this high on either platform; the same goes for power consumption while playing back a Blu-ray movie.

Comments

  • melgross - Wednesday, November 20, 2013

    Because nobody used Media Center.
  • Da W - Friday, November 22, 2013

    Because nobody made an off-the-shelf, plug-and-play HTPC. Since MS is making hardware now, I don't know why they didn't try to repackage Media Center as a Windows 8 app and make another go of it. The whole world is fighting for your TV; Microsoft has been here since 2005, and somehow they called it quits (for the PC) and put all their eggs in the Xbox basket.

    How expensive would it be to offer two options instead of one? I know a good number of enthusiasts who would kill for a $2K HTPC with full XBone capabilities. It would cut the grass out from under the Steambox's feet too.
  • taikamya - Wednesday, November 20, 2013

    So wait... that IGN review where they stated that the PS4 has a 2.75GHz clock is false?
    'Cause that could explain the faster response times and higher power usage, since the GPUs are not THAT different. I don't think all of that 20W-30W power difference is GPU only.

    Okay, "max frequency of 2.75GHz"... either way, that could explain a lot (including the overheating problems some people are having now).

    http://goo.gl/Fd6xJY
  • taikamya - Wednesday, November 20, 2013

    Excuse me, I'm new here, so... I'm sorry if we're not supposed to post links or anything for that matter. The IGN review is called "Playstation 4 Operating Temperature Revealed".

    I would be glad if someone could clear this up for me, since this Anand review states that the PS4 runs at 1.6GHz.
  • althaz - Wednesday, November 20, 2013

    It runs at 1.6GHz; IGN is incorrect.
  • A5 - Wednesday, November 20, 2013

    Don't go to IGN for technical information. Or anything, really. They're just plain wrong on this.
  • cupholder - Thursday, November 21, 2013

    Yeah, double the ROPs = not THAT different.

    Each of my 770s is totally the same as a Titan... Totally.
  • bill5 - Wednesday, November 20, 2013

    14 CUs, yes, it does have 14 for redundancy.

    The worst part is, as I tweeted you, as recently as weeks before launch MS was strongly considering enabling the two redundant CUs, but chose not to. Both of my reliable sources told me this, and it was also somewhat referenced by MS engineers in a Digital Foundry article.

    Anyway, I strongly wish they had; 1.5 teraflops just would have felt so much better, even if on paper it's only a small increase.

    MS was so dumb not to beef up the hardware more; charging $499 for essentially an HD 7770 GPU in nearly 2014 is sad.

    Hell, my ancient 2009 HD 4890, factory overclocked to 950MHz, has more flops in practice, even if the 7770/XO GPU is probably faster due to being more advanced.

    Think about that: the 4890 is a 5-year-old GPU, and the XO is a brand new console expected to last 7+ years. So sad I don't even want to think about it.

    Ah well, the sad thing is that by the looks of your comparison vids MS will very likely get away with it. Even in the 720p vs 1080p Ghosts comparison there is not much difference (and I imagine over time the XO will close the resolution gap to something more like 900p vs 1080p).

    One of the most interesting parts of your article, though, was the speculation that the XO is ROP limited. Not something I hadn't heard before, but still interesting. Shortsighted on MS's part if so.

    Overall it feels like, as usual, MS is misguided: a focus on live TV when it's probably slowly fading away (if not for that pesky sports problem...), and other things that seem cute and cool but half-assed (voice recognition, Snap, Skype, etc.).

    Yet for all that I can still see them doing well, mostly because Sony is even more incompetent. If they were up against Samsung or Apple they would already be dead in consoles, but fortunately for them they're not; they're up against Sony, who loses pretty much every market they're in.

    I think if the XO struggles it could get a nice rebrand as a Kinect-less, games-focused machine at $299. At that price it'd arguably be a nice buy, and the cheap DDR3 base should make it possible. But if it sells OK at $499 with Kinect, and it probably will, we'll probably never get a chance to find out.
  • djboxbaba - Wednesday, November 20, 2013

    It really is sad... Good post.
  • augiem - Wednesday, November 20, 2013

    I agree for the most part, but 14, or even 18 CUs isn't going to be enough to really make a big difference. I think the sad part technology-wise is how not one of the 3 major console gaming companies this time around focused on pushing the horsepower or even doing anything very innovative. Don't get me wrong, I for one don't think graphics is primarily what makes a good game, but since the days of Atari -> NES, this really feels like the smallest technological bump (was gonna say "leap", but that just doesn't seem to apply) from gen to gen. What makes it worse is that the last gen lasted longer than any before it. You know the rise of the dirt-cheap phone/tablet/FB/freemium game had something to do with it...
