Image Quality - Xbox One vs. PlayStation 4

This is the big one. We’ve already established that the PS4 has more GPU performance under the hood, but how does that delta manifest in games? My guess is we’re going to see two situations. The first is what we have here today: for the most part I haven’t noticed huge differences in frame rate between Xbox One and PS4 versions of the same game, but I have noticed appreciable differences in resolution/AA. This could very well be the One’s ROP limitations coming into play. Quality per pixel seems roughly equivalent across consoles; the PS4 just has an easier time delivering more of those pixels.
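As a back-of-the-envelope illustration of the pixel delta involved (a quick sketch; 1600×900 and 1280×720 are the common sub-1080p render targets this generation):

```python
# Pixel counts for the render targets commonly seen this generation.
res_1080p = 1920 * 1080   # 2,073,600 pixels
res_900p  = 1600 * 900    # 1,440,000 pixels
res_720p  = 1280 * 720    #   921,600 pixels

# A 1080p frame carries 44% more pixels than a 900p frame...
print(res_1080p / res_900p)  # 1.44
# ...and 2.25x the pixels of a 720p frame.
print(res_1080p / res_720p)  # 2.25
```

That 1.44x–2.25x range is roughly the extra per-frame work a GPU takes on when stepping up to native 1080p, which is why the resolution gap tracks the raw GPU delta so closely.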

The second situation is one where an eager developer puts the PS4’s hardware to use and creates a game that scales not only in resolution but in other aspects of image quality as well. My guess is the titles that fall into this second category will end up being PS4 exclusives (e.g. Uncharted 4) rather than cross-platform releases. There’s little motivation for a cross-platform developer to spend a substantial amount of time optimizing for one console.

Call of Duty: Ghosts

Let’s start out with Call of Duty: Ghosts. Here I’m going to focus on two scenes: what we’ve internally been calling Let the Dog Drive, and our aliasing test. Once again I wasn’t able to completely normalize black levels across both consoles in Ghosts for some reason.

In motion both consoles look pretty good. You really start to see the PS4’s resolution/AA advantages at the very end of the sequence though (PS4 image sample, Xbox One image sample). The difference between these two obviously isn’t as great as the jump from the Xbox 360 to the Xbox One, but there is a definite resolution advantage to the PS4. It’s even more obvious if you look at our aliasing test.

Image quality otherwise looks comparable between the two consoles.

NBA 2K14

NBA 2K14 is one cross-platform title where I swear I could sense slight frame rate differences between the two consoles (during high-quality replays), but it’s not something I managed to capture on video. Once again we find ourselves in a situation where there’s a difference in resolution and/or AA levels between the Xbox One and PS4 versions of the game.

Both versions look great. I’m not sure how much of this is down to the next-gen consoles, since the last time I played an NBA 2K game was back in college, but man have console basketball games improved in their realism over the past decade. On a side note, NBA 2K14 does seem to make good use of the impulse triggers on the Xbox One’s controller.

Battlefield 4

I grabbed a couple of scenes from early on in Battlefield 4. Once again the differences here are almost entirely limited to the amount of aliasing in the scene as far as I can tell, with the Xbox One version definitely being the more distracting of the two. In practice I notice the difference in resolution, but it’s never enough to force me to pick one platform over another. I’m personally more comfortable with the Xbox One’s controller than the PS4’s, which makes for an interesting set of tradeoffs.


286 Comments


  • elerick - Wednesday, November 20, 2013

    Thanks for the power consumption measurements. Could the Xbox One standby power be higher due to power on/off voice commands?
  • althaz - Wednesday, November 20, 2013

    I suspect this is almost certainly the case. I wonder if it drops below 10W if you turn off the Kinect (which I would never do myself)?

    I also hope Sony updates their software - the Xbox stays in the 16-18W range when downloading updates, whereas the PS4 jumps up to 70W when in standby and downloading an update (and still takes 30 seconds to start up!).
  • mutantsushi - Saturday, November 23, 2013

    It seems that the PS4's extremely high standby/download power draw is due to the ARM co-processor not being up to the task. It was supposed to handle basic I/O and the other features needed in standby, but apparently it wasn't quite spec'd sufficiently, forcing Sony to keep the main CPU powered on instead. The rumor is that they will "soon" release a new revision with a more powerful ARM core that is up to the task, which should allow powering down the x86 CPU completely, as per the original plan. (Either that, or reworking the "standby" software functions so that the existing ARM core can handle them would also do the trick.)

    I believe MS is now also rumored to "soon" release a revision of the Xbone, although what that might entail is unknown. An SKU without the Kinect could allow them to drop the price by $100 to better compete with the PS4.

    Incidentally, the Xbone seems to be running HOTTER than the PS4, so MS's design certainly can't be called the more efficient cooling solution - it's more that they have more open space that isn't being used efficiently compared to the PS4's design. The temp differential is also probably down to MS's last-minute decision to give the APU a 5% clock-speed bump.

    I'm looking forward to the 'in depth' article covering each. As far as performance is applicable to actual use scenarios, i.e. games, I'm interested to get the real low-down... The vast similarity in most aspects really narrows the number of factors to consider, so the actual differentiating factors really should be able to be comprehensively addressed in their implications.

    Like Anand says, I don't think memory throughput is a gross differentiator per se - or at least we could say that the Xbone's ESRAM can be equivalent under certain plausible scenarios, even if it is less flexible than GDDR5 and thus restricts possible development paths. For cross-platform titles at least, that isn't really a factor IMHO.

    The ROP difference is probably the major factor behind any delta in frame buffer resolution, but the PS4's +50% compute unit advantage still remains as a factor in future exploitability... And if one wants to look at future exploitability, then addressing the GPU features the PS4 brings to bear there is certainly necessary. I have seen discussion of GPGPU approaches which can essentially achieve 'traditional graphics' tasks more efficiently than a conventional pure-GPU approach, so this is directly applicable to 'pure graphics' itself, as well as to the other applications of GPGPU - game logic/controls like collisions, physics, audio raycasting, etc.
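For reference, the widely reported specs behind that ROP/CU comparison are 18 vs. 12 GCN compute units (64 ALUs each, 2 FLOPs per cycle), 32 vs. 16 ROPs, and 800 vs. 853 MHz clocks. A quick sketch of the peak figures those numbers imply:

```python
# Back-of-the-envelope peak throughput from the public GPU specs.
def gpu_peaks(cus, rops, clock_ghz):
    # GCN: 64 ALUs per compute unit, 2 FLOPs per ALU per cycle (FMA).
    tflops = cus * 64 * 2 * clock_ghz / 1000
    # Peak fill rate: one pixel per ROP per cycle.
    gpixels_per_s = rops * clock_ghz
    return tflops, gpixels_per_s

ps4 = gpu_peaks(cus=18, rops=32, clock_ghz=0.800)  # ~1.84 TFLOPS, 25.6 Gpix/s
xb1 = gpu_peaks(cus=12, rops=16, clock_ghz=0.853)  # ~1.31 TFLOPS, 13.6 Gpix/s

print(f"Compute advantage:   {ps4[0] / xb1[0]:.2f}x")  # ~1.41x
print(f"Fill-rate advantage: {ps4[1] / xb1[1]:.2f}x")  # ~1.88x
```

These are theoretical peaks that real workloads won't hit, but note that the fill-rate gap is considerably larger than the compute gap - which fits the observation that the deltas show up mostly as resolution rather than per-pixel quality.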

    When assessing both platforms' implications for future developments, I just don't see anything on the Xbone's side that presents much of a unique architectural advantage which isn't portable to the PS4 without serious degradation, while the reverse very much is the case. While cross-platform games of course will not truly leverage architectural advantages that allow for 'game changing' differences, the PS4's CU, ROP, and GPGPU queue advantages should pretty consistently allow for 'turning up the quality dial' on any cross-platform title... And to the extent that their exploitation techniques become widely used, we could in fact see some 'standardized' design approaches which exploit e.g. the GPGPU techniques in ways easily 'bolted on' to a generic cross-platform design... Again, that's not going to change the ultimate game experience, but it is another vector for increasing the qualitative experience. Certainly even in the first release games there are differences in occlusion techniques, and this is almost certainly without significant exploitation of GPGPU.
  • mutantsushi - Saturday, November 23, 2013

    If the Xbone's resolution is considered satisfactory, I do wonder what the PS4 could achieve at the same resolution by utilizing the extra CU and GPGPU capacity for an actually unique difference, not just a similar experience at higher resolution (i.e. 1080 vs. 900). If 900 upscaled is considered fine, what can be done if that extra horsepower is allocated elsewhere instead of to increasing the pixel count?
  • errorr - Wednesday, November 20, 2013

    That is still ridiculous considering the Moto X and some other future phones can do the same thing at orders of magnitude less power draw.
  • Teknobug - Wednesday, November 20, 2013

    I love my Moto X - the X8 8-core processor means each core has its own job: 1 core for always listening and 1 core for active notifications. Very easy on the battery, which is why it's one of the best battery-life phones right now.
  • uditrana - Thursday, November 21, 2013

    Do you even know what you are talking about?
  • blzd - Thursday, November 21, 2013

    The Moto X is dual-core. The X8 is a co-processor, not eight cores.
  • errzone - Monday, November 25, 2013

    That's not entirely true either. The Moto X uses the Qualcomm MSM8960. The SoC is a dual-core processor with an Adreno 320 GPU, which has 4 cores. Adding the 2 co-processors equals 8; hence Motorola's X8 marketing speak.
  • Roy2001 - Wednesday, December 11, 2013

    Kinect?
