Power Usage

There’s a lot of performance on tap in the Xbox One X, and that performance doesn’t come without strings attached. Like the Xbox One S, the APU inside is built on TSMC’s 16 nm FinFET process, which should help keep power usage under control. In addition, the Xbox One X is outfitted with a power supply that Microsoft equates to an 80 Plus Gold unit, meaning it should be 90% efficient at 50% load from a 115 V source, so relatively little extra power should be wasted in the PSU’s AC-to-DC conversion.
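As a rough illustration of what that efficiency figure implies, here is a minimal sketch. It assumes, for simplicity, a flat 90% conversion efficiency across all loads, even though 80 Plus Gold only specifies 90% at the 50% load point, so treat the numbers as approximations rather than measurements:

```python
# Rough sketch of AC-to-DC conversion losses, assuming a flat 90%
# efficiency (80 Plus Gold only guarantees 90% at 50% load on 115 V,
# so this is an approximation, not a measurement).

def dc_power(wall_watts: float, efficiency: float = 0.90) -> float:
    """Power actually delivered to the console internals."""
    return wall_watts * efficiency

def psu_loss(wall_watts: float, efficiency: float = 0.90) -> float:
    """Watts dissipated as heat inside the power supply."""
    return wall_watts * (1.0 - efficiency)

# At a 172 W wall draw (the gaming peak measured below):
print(dc_power(172))  # ~154.8 W reaches the system
print(psu_loss(172))  # ~17.2 W lost in conversion
```

In other words, even at the console’s peak draw, only on the order of 17 W is being burned inside the supply itself.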

There are several scenarios we tested for power usage:

Off – Xbox One X is powered off in Energy Savings mode, which means standby mode is disabled.

Standby – Xbox One X is powered off in Instant-On mode, which enables background updating and voice activation (if supported).

Idle – Ethernet connected, no disc in the drive, system idling at dashboard.

Load (UHD BD Playback) – Ethernet connected, UHD Blu-ray disc in the drive playing Planet Earth II, compared to The Hobbit on Blu-ray on the original Xbox One.

Load (GoW4) – Ethernet connected, no disc in the drive, playing Gears of War 4 in UHD/HDR.

Load (The Wolf Among Us) – Ethernet connected, no disc in the drive, playing The Wolf Among Us in FHD SDR.

We’ve been able to compare against the original Xbox One, although not the S model as we didn’t have one on hand. The Wolf Among Us was chosen as an older game which caps at 1080p and SDR, and Gears of War 4 shows the power draw at full 4K HDR rendering. The comparison against the original for this game will of course be for the 1080p version though, since that’s the max it supports.

Power Consumption Comparison (total system power at the wall)

             Energy-Saving   Instant-On   Idle    Load (UHD BD)   Load (GoW4)   Load (The Wolf Among Us)
Xbox One     < 2 W           14.2 W       53 W    80 W            107 W         102 W
Xbox One X   < 1 W           10 W         56 W    64 W            172 W         101 W

As with the original Xbox One, when Instant-On is disabled, the console is practically fully off. There’s a small amount of draw, but overall, not very much. Most people are likely going to want the console in Instant-On mode, though, so games and the console can update while the system is off, startup is much quicker, and games can remain loaded in RAM. In Standby mode, power draw is a reasonable 10 W, which is lower than the original console when it first launched. That’s still a fair bit of power, but when you factor in that it needs to keep 12 GB of GDDR5 memory powered up (among other things), it is not an unreasonable amount of draw.
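To put that 10 W standby figure in context, a quick back-of-the-envelope calculation helps; note that the $0.12/kWh electricity rate below is an illustrative assumption, not a figure from the review:

```python
# Annual energy consumed by 10 W of continuous Instant-On standby.
# The $0.12/kWh rate is an illustrative assumption.

HOURS_PER_YEAR = 24 * 365  # 8760

def annual_kwh(watts: float) -> float:
    """Energy in kWh for a constant draw held for one year."""
    return watts * HOURS_PER_YEAR / 1000.0

standby = annual_kwh(10)          # 87.6 kWh per year
print(round(standby, 1))          # 87.6
print(round(standby * 0.12, 2))   # ~10.51 (dollars per year)
```

So leaving Instant-On enabled costs roughly ten dollars a year at that rate, which is the trade-off for fast startup and background updates.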

Idling at the dashboard draws around 55 W, and most non-gaming tasks add little, if anything, to that total. If you’re using your Xbox to pass HDMI through from a cable box, expect the same power draw. It may be an impossible pipe dream, but it would be nice to see the Xbox One pass HDMI through while in Standby mode as well.

Playing back a UHD Blu-ray (standard Blu-ray on the original Xbox One) drew only a tiny bit more than idle, which is good to see. Some of that draw is the disc drive itself, but much of the playback is offloaded to fixed-function hardware in the media block, so it’s not surprising to see it sit so close to idle.

Clearly, running older Xbox One games is not much of a chore for the Xbox One X, since the power draw is only about 50 W over idle. But when gaming with an Xbox One X Enhanced title, such as Gears of War 4, the power draw jumps significantly, to a peak of 172 W. That is quite a leap over the original console, and it makes the cooling system, which is barely audible even under these loads, all the more impressive. Compared to a high-end gaming PC, though, the power draw is still quite a bit less.
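One way to read the table above is in terms of draw over idle, which roughly isolates what each workload itself costs. A quick sketch using the Xbox One X numbers:

```python
# Wall-power deltas over idle for the Xbox One X, using the numbers
# from the table earlier on this page (all values in watts).

IDLE = 56

loads = {
    "UHD BD playback": 64,
    "Gears of War 4 (4K HDR)": 172,
    "The Wolf Among Us (1080p SDR)": 101,
}

deltas = {name: watts - IDLE for name, watts in loads.items()}
print(deltas)
# UHD BD adds ~8 W, the 1080p game ~45 W, and full 4K HDR
# rendering ~116 W over idle.
```

The spread makes the point plainly: the media block barely registers, while the Enhanced 4K HDR rendering path is responsible for well over 100 W on its own.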


  • abrowne1993 - Friday, November 3, 2017 - link

    Wow, surprised to see a console review here. And before it even launches, at that.
  • yhselp - Friday, November 3, 2017 - link

    Great article. Thanks. Just one thing I didn't understand - why would XBO's DDR3 account for its performance deficit when its eSRAM buffer actually has higher bandwidth than PS4's GDDR5, albeit limited to 32 MB? I thought XBO's performance deficit comes from its weaker GPU. Can you explain, please?
  • yhselp - Friday, November 3, 2017 - link

    Ignore me, I apologize, just realized what you meant to say - that picking DDR3 necessitated an eSRAM buffer, which limited their ability to put more stuff on-chip, which resulted in a weaker GPU. Have never thought of it this way, thanks for making it clear.
  • yhselp - Friday, November 3, 2017 - link

    It's also quite ironic that by deciding against GDDR5, Microsoft essentially guaranteed that Sony would not run into production limitations with the PS4.
  • rasmithuk - Friday, November 3, 2017 - link

    I think the article is wrong here.

    GDDR5 was available in volume, but not in 4Gb packages (which were required to get a sensible board layout).
    2Gb were available, but without going for riser cards they could only get 16 on the board, giving a limit of 4GB of RAM.

    4Gb chips became available late 2012, too late for Microsoft to change the whole console design to use, but early enough for Sony (who were happy with 4GB of RAM on the PS4) to switch.
  • Rufnek - Monday, November 6, 2017 - link

    @rasmithuk
    You may be entirely correct, but the reality is that Sony found a way to get GDDR5 memory into the console, with a built-in power supply no less, in a much smaller space than the Xbox One managed without an internal power supply. The reasoning is really that one company chose to spend real money on a compact design, while the other was looking to keep costs at a minimum.
  • novastar78 - Monday, November 6, 2017 - link

    This is actually 100% correct based on the sources I've talked to.

    MS wanted 8GB because of the virtualized OS; they wanted to give it more headroom, so they needed 8GB.
  • tipoo - Friday, November 3, 2017 - link

    32MB of high bandwidth memory vs all 8GB being that bandwidth.
  • Samus - Saturday, November 4, 2017 - link

    The real problem is the eSRAM just isn't much faster than GDDR5 in the first place. It only looks fast when you compare it to DDR3.

    Microsoft really screwed up going with DDR3. There was no excuse, really. I agree with rasmithuk that the packaging they needed wasn't available for the board footprint, but that could have been worked out in the short term (in the form of bigger initial losses on hardware sales). The real head-scratcher is that consoles don't even use high-speed GDDR5.

    They use the 6-7 GHz variety, which is not only substantially less expensive, but has never been remotely close to a "supply shortage." When you read about shortages, it's of stuff at the ultra high end. Look at 3D NAND, GDDR5X, HBM2. These all have low yields because they are new technology. GDDR5 at 8 GHz initially had low yields as well, and it was never considered for consoles, particularly because AMD has historically stayed away from 8 GHz GDDR5.
  • StevoLincolnite - Friday, November 3, 2017 - link

    No mention of how the ROPs don't align perfectly with the memory bus? Resulting in a Radeon HD 7970-like memory situation?
