Power Consumption

As always, I ran the Xbox One through a series of power consumption tests, described below (a sketch of the sampling approach follows the list):

Off - Console is completely off, standby mode is disabled.
Standby - Console is asleep but can be woken by voice commands (if supported). Background updating is allowed in this mode.
Idle - Ethernet connected, no disc in drive, system idling at the dashboard.
Load (BF4) - Ethernet connected, Battlefield 4 disc in drive, running Battlefield 4, stationary in the test scene.
Load (BD Playback) - Ethernet connected, Blu-ray disc in drive, average power across the Inception test scene.
CPU Load - SunSpider - Ethernet connected, no disc in drive, running SunSpider 1.0.2 in the web browser.
CPU Load - Kraken - Ethernet connected, no disc in drive, running Kraken 1.1 in the web browser.
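
For the load tests the interesting number is the average draw over a window, not an instantaneous reading. Here's a minimal sketch of that kind of sampling loop, assuming a hypothetical read_watts() hook for whatever power meter is in use (the article doesn't specify the meter or its interface):

```python
import time

def read_watts() -> float:
    """Hypothetical meter hook: return one instantaneous wall-power
    sample in watts. Swap in whatever API your power meter exposes."""
    raise NotImplementedError

def average_power(duration_s: float = 60.0, interval_s: float = 0.5) -> float:
    """Sample wall power at a fixed interval and return the mean draw."""
    samples = []
    deadline = time.monotonic() + duration_s
    while time.monotonic() < deadline:
        samples.append(read_watts())
        time.sleep(interval_s)
    return sum(samples) / len(samples)

# Example: average draw across a 60-second Blu-ray test scene.
# print(f"BD playback: {average_power(60.0):.1f}W")
```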

Power Consumption Comparison (Total System Power)

Console                    Off      Standby   Idle     Load (BF4)     Load (BD Playback)
Microsoft Xbox 360 Slim    0.6W     -         70.4W    90.4W (RDR)    -
Microsoft Xbox One         0.22W    15.3W     69.7W    119.0W         79.9W
Sony PlayStation 4         0.45W    8.59W     88.9W    139.8W         98.0W

(The Xbox 360 load test was run with Red Dead Redemption.)

When I first saw the PS4’s idle numbers I was shocked. At 88.9W, the PS4 idles higher than our IVB-E GPU testbed (~80W), and that machine has a massive 6-core CPU and a Titan GPU. Similarly, my Haswell + Titan testbed idles below the PS4's figure. The Xbox One does a little better at 69.7W, but that's still 50 - 80% higher than I was otherwise expecting.

Standby power is also surprisingly high on the Xbox One. Granted, in this mode you can wake the entire console by saying "Xbox On," but always-on voice recognition is something Motorola deployed on the Moto X within a far smaller power budget.

The only good news on the power front is what happens when the consoles are completely off. I'm happy to report that I measured between 0.22W and 0.45W of draw while off, far less than the 0.6W I measured on the Xbox 360 Slim.

Power under load is pretty much as expected. In general the Xbox One appears to draw ~120W under maximum load, which isn't much at all. What actually surprises me is how small the delta is between idle power and fully loaded power (~50W); it makes me wonder how much power gating of unused CPU cores and/or GPU resources Microsoft is doing at idle. The same goes for Sony on the PS4. It's entirely possible that AMD hasn't offered the same power management hooks here that you'd get on a PC equipped with an APU.
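
For reference, here's the idle-to-load arithmetic on the measured numbers from the table above; nothing new, just the deltas spelled out:

```python
# Idle-to-load deltas from the measurements above (total system power, watts).
power = {
    "Xbox One":      {"idle": 69.7, "load_bf4": 119.0},
    "PlayStation 4": {"idle": 88.9, "load_bf4": 139.8},
}

for console, w in power.items():
    delta = w["load_bf4"] - w["idle"]
    print(f"{console}: {w['idle']}W idle -> {w['load_bf4']}W load, +{delta:.1f}W")
# Xbox One: +49.3W, PlayStation 4: +50.9W; in both cases idle accounts
# for well over half of the fully loaded figure.
```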

Blu-ray playback power consumption is more reasonable on the Xbox One than on the PS4, though in both cases the numbers are much higher than I'd like them to be.

I threw in some browser-based CPU benchmarks and power numbers as well. Both the Xbox One and PS4 ship with integrated web browsers. Neither experience is particularly well optimized for performance, but the PS4 definitely has the edge, at least in JavaScript performance.

Power Consumption Comparison (Lower is Better)

Console               SunSpider 1.0.2 (Perf)   SunSpider 1.0.2 (Power)   Kraken 1.1 (Perf)   Kraken 1.1 (Power)
Microsoft Xbox One    2360.9 ms                72.4W                     111892.5 ms         72.9W
Sony PlayStation 4    1027.4 ms                114.7W                    22768.7 ms          114.5W

Power consumption while running these CPU workloads is interesting. The marginal increase in system power while running both tests on the Xbox One points to one of two explanations (or a combination): we're only taxing one or two cores here, and/or Microsoft isn't power gating unused CPU cores. I suspect it's the former, since IE on the Xbox One technically falls under the Windows kernel's jurisdiction and I don't believe it has more than one or two cores allocated to its needs.

The PS4, on the other hand, shows a far bigger increase in power consumption during these workloads. Part of that is simply higher performance, but it's also possible that Sony gives apps access to more CPU cores.
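
Two derived figures make the contrast concrete: the marginal draw over idle while Kraken runs, and the energy consumed per run (power × runtime). This is a back-of-the-envelope sketch using only the numbers in the tables above; these are total-system figures, so idle overhead is baked into the energy number:

```python
# Marginal draw over idle and energy per Kraken run, from the tables above.
idle_w = {"Xbox One": 69.7, "PlayStation 4": 88.9}
kraken = {"Xbox One": (111892.5, 72.9),      # (runtime ms, avg system W)
          "PlayStation 4": (22768.7, 114.5)}

for console, (ms, watts) in kraken.items():
    marginal = watts - idle_w[console]
    energy_kj = watts * ms / 1_000_000  # W x ms -> J (via /1000), then -> kJ
    print(f"{console}: +{marginal:.1f}W over idle, {energy_kj:.1f} kJ per run")
# Xbox One:      +3.2W over idle, ~8.2 kJ per run
# PlayStation 4: +25.6W over idle, ~2.6 kJ per run
```

The Xbox One adds barely 3W over idle while running the benchmark, consistent with a lightly threaded workload; the PS4 adds ~26W but finishes roughly 5x sooner, so it ends up using about a third of the energy per run.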

There's definitely room to drive down power consumption on both next-generation platforms, though I don't know that there's huge motivation to do so beyond me complaining about it. I'd like to see idle power drop below 50W; standby power shouldn't be anywhere near this high on either platform, and the same goes for power consumption during Blu-ray playback.

Comments

  • 3DoubleD - Thursday, November 21, 2013 - link

    You sit closer than 7 ft (4 ft optimal) to your 60" TV? This must be in a bedroom, office, or a tiny apartment. I live in what I consider a small apartment and I still sit 10 ft away. Perhaps you just put your couch in the center of the room so that it is really close to your TV? Either way, this is not most people's setup. Average seating distances are far greater than 7 ft. UHD TVs will need to be ~100+" for any benefit to be apparent to regular consumers.

    You must also live in Europe or Asia to get an internet rate like that. I pay $45/mo for 45/4 Mbit with a 300GB cap - although it's unlimited between 2am - 8am, which I take full advantage of.
  • nathanddrews - Thursday, November 21, 2013 - link

    We've got three rows of seating in our home theater. 115" 1080p projection with seating at approximately 7', 11', and 15'. I choose my seating positions based completely upon my audio configuration which is calibrated to the room's acoustic strengths, not upon one-size-fits-all visual acuity seating calculators. We generally sit in the front row when we don't have guests. It's immersive without being nauseating. Pixels are visible in the back row with good eyesight, so I'm anxiously awaiting a 4K upgrade, but probably not until laser projection becomes affordable.

    We've got Comcast Business Class 50/10 for $99/mo. No cap and 50/10 are the guaranteed minimum speeds, unlike the residential service which has a cap (sort of) and sells you a max speed instead of a minimum. Comcast BC also has a $59 plan with no cap that is 12/3, but we wanted more speed. Still can't get gigabit fiber... :-(
  • 3DoubleD - Friday, November 22, 2013 - link

    Sweet setup! You definitely have the screen real estate and seating arrangement to take advantage of 4k. I'd like a similar setup when I move on from apartment style living to a house. Awesome Internet setup too. I could get unlimited as well, and did for a while, but I realized I could pay half as much and get away without hitting my cap by downloading during "happy hours", but that takes some planning.

    I've been anxiously waiting for laser projection systems as well... Will they ever come or is it vaporware? Hopefully that is what my next TV purchase will be.
  • douglord - Thursday, November 21, 2013 - link

    More BS from the anti-4K crowd. I'm sitting 8 feet from my TV right now; in fact, it's difficult to sit further away in a standard apartment living room. For a 60-inch TV, 4K resolution is recommended at anything 8 feet or closer. For an 85-inch TV it's 11 feet, and for a 100-inch screen it's 13 feet.
  • 3DoubleD - Friday, November 22, 2013 - link

    I'm hardly the anti-4K crowd. I think 4K is great; I just think it's only great when properly implemented. That means 4K TVs should start at 60", since very few people sit close enough to begin to see the difference. At 8 ft, that's the optimal distance for 1080p on a 60" set; if you really wanted to take advantage of 4K on a 60" set you'd sit at 4 ft.
  • A5 - Wednesday, November 20, 2013 - link

    PS3 didn't launch with DLNA support, either. I'm guessing it will get patched in at some point.

    As for the rest of it, I'm guessing they're betting that 4K won't really catch on during the lifespan of these systems, which seems fairly safe to me.
  • Hubb1e - Wednesday, November 20, 2013 - link

    And with only 16 ROPs, Microsoft has trouble even pushing 1080p gaming. It seems they targeted 720p gaming, which is fine with me since most TVs aren't big enough for the difference to matter. Microsoft did target 4K video, though, and designed the video decode blocks specifically to handle that load. It will likely be high-resolution but low-bitrate video, which in most cases is not an improvement over high-bitrate 1080p.
  • piroroadkill - Wednesday, November 20, 2013 - link

    2005? The consoles then being well specced? I disagree in part: they were mostly pretty great, but I recall very distinctly thinking 512MiB of RAM was pretty poor.
  • airmantharp - Wednesday, November 20, 2013 - link

    It was horrific, and the effects of that decision still haunt us today.
  • bill5 - Wednesday, November 20, 2013 - link

    Of course it matters. Here the Xbox One has an edge, with an awesome 204 GB/s of eSRAM bandwidth plus 68 GB/s of DDR3 bandwidth, for a total of 272 GB/s.

    And yes, you can add them together, so don't even start that noob talk.
