Image Quality - Xbox 360 vs. Xbox One

Before I get to the PS4 comparison, I wanted to start with some videos showcasing the improvement you can expect from launch day titles that are available on both the Xbox 360 and Xbox One. I turned to Call of Duty: Ghosts for this comparison as it’s broadly available on all platforms I’m comparing today.

Note that cross-platform launch titles, particularly those also available on previous-generation consoles, tend to be the worst examples of what’s possible on a next-generation platform. For the most part they’re optimized for the platform with the larger installed base (i.e. prior-gen hardware), so the visual uplift on new hardware isn’t as big as it could be. My subjective experience playing a lot of the launch titles on Xbox One and PS4 mirrors this sentiment. Basic things like the lack of accurate/realistic cloth physics in games like CoD: Ghosts just scream port rather than something designed specifically for these next-gen systems. Just as we’ve seen in prior generations, it’s likely going to be a good 12 to 24 months before we see great examples of games on this new generation of hardware.

Now that I’ve adequately explained why this is a bad comparison, let’s get to it. I’ve captured HDMI output on both consoles. Both were set to full range (0-255); however, the Xbox One didn’t seem to respect this setting for some reason. That, combined with differences in how Ghosts renders on the two platforms, left me with black levels that don’t appear equalized between the platforms. If you can look past that, we can get to the comparison at hand.
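
For reference, this sort of mismatch usually comes down to limited-range (16-235) video being interpreted as full range (0-255), or vice versa. Below is a minimal sketch of the standard range expansion you’d apply to equalize captures before comparing them, assuming 8-bit frames loaded as NumPy arrays (the function name is my own):

```python
import numpy as np

def limited_to_full(frame: np.ndarray) -> np.ndarray:
    """Expand 8-bit limited-range (16-235) values to full range (0-255)."""
    # Map 16 -> 0 and 235 -> 255; clip anything outside the nominal range.
    scaled = (frame.astype(np.float32) - 16.0) * (255.0 / 219.0)
    return np.clip(scaled, 0.0, 255.0).astype(np.uint8)
```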

All of these videos are encoded at 4K, with two 1080p captures placed side by side. Be sure to select the highest quality playback option YouTube offers.
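
If you’re wondering how the side-by-side framing works out: two 1920x1080 captures placed next to each other produce a 3840x1080 frame, which is why YouTube serves these through its 4K-width pipeline. A simple sketch of the composition step, again assuming frames as NumPy arrays (the actual encode pipeline used here may differ):

```python
import numpy as np

def side_by_side(left: np.ndarray, right: np.ndarray) -> np.ndarray:
    """Place two 1080p frames (1080x1920x3 uint8 arrays) side by side,
    yielding a single 1080x3840x3 frame for comparison encodes."""
    assert left.shape == right.shape == (1080, 1920, 3)
    return np.hstack((left, right))
```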

The first scene is the intro to Ghosts. Here you can see clear differences in lighting, details in the characters, as well as some basic resolution/AA differences (Xbox 360 image sample, Xbox One image sample).

The second scene is best described as Call of Duty meets Gravity. The scene goes by pretty quickly, so you’ll have to pause the video to get a good feel for any differences between the platforms. What’s most apparent here, though, is that many present-day users can likely get by sticking with older hardware given the lack of titles truly optimized for the Xbox One/PS4.

Now getting to scenes more representative of actual gameplay, we have Riley riding around, badly wanting to drive the military vehicle. Here the differences are huge. The Xbox One features more realistic lighting, you can see texture in Riley’s fur, shadows are more detailed, and there seems to be a resolution/AA advantage as well. What’s funny is that although the Xbox One appears to have a resolution advantage, the 360 appears to have less aliasing simply because everything is so blurry.

Speaking of aliasing, our final IQ test is really the perfect test case for high resolution/AA. Once again the scene looks completely different on the Xbox One compared to the Xbox 360: completely different lighting, and much more detail in the environments as well as in objects on the ground. The 360 version of Ghosts is just significantly blurrier than what you get on the One, which unfortunately makes aliasing stand out even more on the One.

Even though it’ll be a little while before we get truly optimized next-gen titles, there’s an appreciable improvement in the games we have today for anyone upgrading from an older console. The difference may be more subtle than in previous generations, but it’s there.

Comments

  • elerick - Wednesday, November 20, 2013 - link

    Thanks for the power consumption measurements. Could the Xbox One's standby power be higher due to power on/off voice commands?
  • althaz - Wednesday, November 20, 2013 - link

    I suspect this is almost certainly the case. I wonder if it drops below 10W if you turn off the Kinect (which I would never do myself)?

    I also hope Sony updates their software - the Xbox One stays in the 16-18W range when downloading updates, whereas the PS4 jumps up to 70W when in standby and downloading an update (and still takes 30 seconds to start up!).
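
    For scale, here's a rough monthly-cost illustration using the wattages quoted above (the electricity rate is an assumed example, not a measured figure):

    ```python
    # Monthly energy use at the standby/download wattages quoted above.
    # The $0.12/kWh rate is an assumed example.
    HOURS_PER_MONTH = 24 * 30

    for name, watts in (("Xbox One, downloading", 18), ("PS4, downloading", 70)):
        kwh = watts * HOURS_PER_MONTH / 1000
        print(f"{name}: {kwh:.1f} kWh/month (~${kwh * 0.12:.2f})")
    ```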
  • mutantsushi - Saturday, November 23, 2013 - link

    It seems that the PS4's extremely high standby/download power draw is due to the ARM co-processor not being up to the task. It was supposed to handle basic I/O and other background features, but apparently it wasn't quite spec'd sufficiently, forcing Sony to keep the main CPU powered on for those duties. The rumor is that they will "soon" release a new revision with a more powerful ARM core that is up to the task, which should allow powering down the x86 CPU completely, as per the original plan. (Either that, or reworking the "standby" software functions so that the existing ARM core can handle them would also do the trick.)

    I believe MS is now also rumored to "soon" release a revision of the Xbone, although what that might entail is unknown. An SKU without the Kinect could allow them to drop the price $100 to better compete with PS4.

    Incidentally, the Xbone seems to be running HOTTER than the PS4, so MS' design certainly can't be called the more efficient cooling design; it's more that they have more open space that isn't being used as efficiently as in the PS4's design. The temp differential is also probably down to MS' last-minute decision to give the APU a 5% clock speed bump.

    I'm looking forward to the 'in depth' article covering each. As far as performance applies to actual use scenarios, i.e. games, I'm interested in the real low-down... The vast similarity in most respects really narrows the number of factors to consider, so the implications of the actual differentiating factors should be able to be addressed comprehensively.

    Like Anand says, I don't think memory throughput is a gross differentiator per se; at the least we can say that the Xbone's ESRAM can be equivalent under certain plausible scenarios, even if it is less flexible than GDDR5 and thus restricts possible development paths. For cross-platform titles at least, that isn't really a factor IMHO.

    The ROP difference is probably the major factor in any delta in frame buffer resolution, but the PS4's +50% compute unit advantage also remains a factor in future exploitability... And if one wants to look at future exploitability, then addressing the GPU and the PS4's features applicable to it is certainly necessary. I have seen discussion of GPGPU approaches that can achieve 'traditional graphics' tasks more efficiently than a conventional pure-GPU approach, so this is directly applicable to 'pure graphics' itself, as well as to the other applications of GPGPU: game logic/controls like collisions, physics, audio raycasting, etc. (There's a quick back-of-the-envelope sketch of these deltas after this comment.)

    When assessing both platforms' implications for future development, I just don't see anything on the Xbone's side that presents a unique architectural advantage that isn't portable to the PS4 without serious degradation, while the reverse very much is the case. While cross-platform games of course won't truly leverage architectural advantages that allow for 'game changing' differences, the PS4's CU, ROP, and GPGPU queue advantages should pretty consistently allow for 'turning up the quality dial' on any cross-platform title... And to the extent that the techniques exploiting them become widely used, we could in fact see some 'standardized' design approaches that exploit e.g. the GPGPU techniques in ways easily 'bolted on' to a generic cross-platform design... Again, that's not going to change the ultimate game experience, but it is another vector for increasing the qualitative experience. Even in the first release games there are differences in occlusion techniques, and this is almost certainly without significant exploitation of GPGPU.
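
    A back-of-the-envelope sketch of those ROP/CU deltas, using the commonly cited specs (PS4: 18 CUs, 32 ROPs at 800MHz; Xbox One: 12 CUs, 16 ROPs at 853MHz after the 5% clock bump). Treat the results as theoretical peaks only; real workloads won't hit them:

    ```python
    # Theoretical peaks from the commonly cited GPU specs; illustrative only.
    consoles = {
        # name: (compute_units, rops, core_clock_mhz)
        "PS4":      (18, 32, 800),
        "Xbox One": (12, 16, 853),
    }

    for name, (cus, rops, mhz) in consoles.items():
        fill_gpix = rops * mhz / 1000       # Gpixels/s (ROPs x clock)
        tflops = cus * 64 * 2 * mhz / 1e6   # 64 ALUs/CU, 2 FLOPs/clock (FMA)
        print(f"{name}: {fill_gpix:.1f} Gpix/s fill rate, {tflops:.2f} TFLOPS")
    ```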
  • mutantsushi - Saturday, November 23, 2013 - link

    If the Xbone's resolution is considered satisfactory, I do wonder what the PS4 could achieve at the same resolution while utilizing the extra CU and GPGPU capacity to deliver an actually unique difference, not just a similar experience at higher resolution (i.e. 1080p vs. 900p). If 900p upscaled is considered fine, what could be done if that extra horsepower were allocated elsewhere instead of to a higher pixel count?
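
    For a sense of the headroom in question, the raw pixel-count arithmetic behind 1080p vs. 900p (illustrative only):

    ```python
    # Raw per-frame pixel counts: the shading headroom 900p frees up vs. 1080p.
    px_1080 = 1920 * 1080   # 2,073,600 pixels
    px_900  = 1600 * 900    # 1,440,000 pixels
    print(f"1080p renders {px_1080 / px_900:.2f}x the pixels of 900p")  # ~1.44x
    ```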
  • errorr - Wednesday, November 20, 2013 - link

    That is still ridiculous considering the Moto X and some other future phones can do the same thing at orders of magnitude less power draw.
  • Teknobug - Wednesday, November 20, 2013 - link

    I love my Moto X. The X8 8-core processor means each core has its own job: 1 core for always listening and 1 core for active notifications. Very easy on the battery, which is why it's one of the best phones for battery life right now.
  • uditrana - Thursday, November 21, 2013 - link

    Do you even know what you are talking about?
  • blzd - Thursday, November 21, 2013 - link

    The Moto X is dual core. X8 is a co-processor, not eight cores.
  • errzone - Monday, November 25, 2013 - link

    That's not entirely true either. The Moto X uses the Qualcomm MSM8960. The SoC is a dual-core processor with an Adreno 320 GPU, which has 4 cores. Adding the 2 co-processors equals 8; hence Motorola's "X8" marketing speak.
  • Roy2001 - Wednesday, December 11, 2013 - link

    Kinect?
