Image Quality - Xbox One vs. PlayStation 4

This is the big one. We’ve already established that the PS4 has more GPU performance under the hood, but how does that delta manifest in games? My guess is we’re going to see two different situations. The first is what we have here today: for the most part I haven’t noticed huge frame rate differences between Xbox One and PS4 versions of the same game, but I have noticed appreciable differences in resolution/AA. This could very well be the One’s ROP limitations coming into play. Quality per pixel seems roughly equivalent across the two consoles; the PS4 just has an easier time delivering more of those pixels.
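
A quick back-of-the-envelope on the ROP point: peak pixel fill rate is simply ROP count times GPU clock. The clock and ROP figures below are the commonly cited specs for the two consoles rather than numbers from this article, so treat this as a sketch:

```python
# Rough peak pixel fill rate: ROP count x GPU clock.
# Assumed (commonly cited) specs: Xbox One 16 ROPs @ 853 MHz,
# PS4 32 ROPs @ 800 MHz.

def fill_rate_gpix(rops: int, clock_mhz: float) -> float:
    """Peak pixel fill rate in gigapixels per second."""
    return rops * clock_mhz / 1000.0

xbox_one = fill_rate_gpix(16, 853)   # ~13.6 Gpix/s
ps4      = fill_rate_gpix(32, 800)   # ~25.6 Gpix/s

# 1080p carries 2.25x the pixels of 720p, so the PS4's roughly
# 1.9x fill-rate advantage lines up with a resolution bump.
pixel_ratio = (1920 * 1080) / (1280 * 720)
```

The PS4’s roughly 1.9x fill-rate edge doesn’t buy much extra per-pixel quality, but it maps neatly onto the resolution/AA gaps seen in the comparisons below.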

The second situation could be one where an eager developer puts the PS4’s hardware to use and creates a game that scales not just in resolution but in other aspects of image quality as well. My guess is the titles that fall into this second category will end up being PS4 exclusives (e.g. Uncharted 4) rather than cross-platform releases. There’s little motivation for a cross-platform developer to spend a substantial amount of time optimizing for one console.

Call of Duty: Ghosts

Let’s start out with Call of Duty: Ghosts. Here I’m going to focus on two scenes: what we’ve been calling internally Let the Dog Drive, and the aliasing test. Once again I wasn’t able to completely normalize black levels across both consoles in Ghosts for some reason.

In motion both consoles look pretty good. You really start to see the PS4’s resolution/AA advantages at the very end of the sequence though (PS4 image sample, Xbox One image sample). The difference between these two obviously isn’t as great as the jump from the Xbox 360 to the Xbox One, but there is a definite resolution advantage to the PS4. It’s even more obvious if you look at our aliasing test:

Image quality otherwise looks comparable between the two consoles.

NBA 2K14

NBA 2K14 is one cross-platform title where I swear I could sense slight frame rate differences between the two consoles (during high quality replays), but it’s not something I managed to capture on video. Once again we find ourselves in a situation where there is a difference in resolution and/or AA levels between the Xbox One and PS4 versions of the game.

Both versions look great. I’m not sure how much of this is down to the next-gen consoles since the last time I played an NBA 2K game was back in college, but man, console basketball games have significantly improved in their realism over the past decade. On a side note, NBA 2K14 does seem to make good use of the impulse triggers on the Xbox One’s controller.

Battlefield 4

I grabbed a couple of scenes from early on in Battlefield 4. Once again the differences here are almost entirely limited to the amount of aliasing in the scene as far as I can tell. Aliasing on the Xbox One version is definitely more distracting. In practice I notice the difference in resolution, but it’s never enough to force me to pick one platform over another. I’m personally more comfortable with the Xbox One’s controller than the PS4’s, which makes for an interesting set of tradeoffs.

Comments

  • djboxbaba - Wednesday, November 20, 2013 - link

    you have the cutest name ever bra, "wolfpup" kudos ^^
  • melgross - Wednesday, November 20, 2013 - link

    Hey noob, it doesn't work that way. SRAM is not equivalent to high-speed GDDR5. This has been well established already. You do get some boost, at some points, but it's not covering every area of performance the way GDDR5 is.
  • CubesTheGamer - Wednesday, November 20, 2013 - link

    Newb talk? No, you can't add them together. Let me tell you why, in technical terms.

    ESRAM is meant to be a cache, and what a cache does is hold data that you're going to need a lot (say there's some instruction code or other code / data that needs to be read frequently). You put that data in the ESRAM, and it gets read 10+ times before being swapped for some other data. What you're saying makes it seem like we can constantly write and read from the ESRAM. That's not how it works.

    tl;dr: You can't add them together because you should only hit the ESRAM about 1/10th as often as the main DDR3 RAM that the Xbox One has. So your argument is invalid, and don't say things that you don't know about.
  • smartypnt4 - Thursday, November 21, 2013 - link

    He's indeed wrong, but I'd be willing to bet good money your hit rate on that eSRAM is way higher than 10% if it's used as a cache. Typical last-level caches have a hit rate getting into the 80% range due to prefetching, and large ones like this have even higher hit rates.

    If it's hardware mapped like the article indicates (i.e. not a global cache, but more like it was on the 360), it won't hit quite as often with a naive program, but a good developer could ensure that the bulk of the memory accesses hit that eSRAM and not main memory.
  • Da W - Friday, November 22, 2013 - link

    XBone is ROP bound. Will you stop bitching around with your geometry and bandwidth? It's all about ROPs!
  • MadMan007 - Wednesday, November 20, 2013 - link

    I can add $2 and a whore together too, that doesn't make it good.
  • looncraz - Thursday, November 21, 2013 - link

    You actually only get about 100GB/s READ or 100GB/s WRITE... The best-case scenario on the Xbox One is 68GB/s + 100GB/s - still NOT matching the PS4's capabilities for reading/writing ANY memory... and only in certain situations where you are streaming from both memory systems.

    Xbone PEAKS below PS4's AVERAGE memory performance.
  • daverasaro - Saturday, November 23, 2013 - link

    Huh? Actually you are wrong. The Xbox One uses 8GB of DDR3 RAM at 2133 MHz for 68.3 GB/s of bandwidth, but also adds an extra 32 MB of ESRAM for 102 GB/s of embedded memory bandwidth. The PS4 uses 8GB of GDDR5 RAM at 5500 MHz for 176 GB/s of bandwidth.
  • daverasaro - Saturday, November 23, 2013 - link

    32GB*
  • SunLord - Wednesday, November 20, 2013 - link

    I don't get this constant worrying about power usage on non-mobile devices. They plug into a wall, and as long as it's not some obscene (300+W) amount of draw I don't care, damn it... Heat can be an issue, but I'm personally not even remotely concerned that it might cost me $3 more a year in power to use my $400 PS4. If I were, I shouldn't be buying a PS4 or Xbox One, let alone games for them.
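
To put some numbers on the bandwidth back-and-forth above: treating the eSRAM as a cache, the Xbox One's effective bandwidth is roughly a weighted average of its two pools, with the hit rate doing the weighting. This is a deliberate simplification (it ignores concurrent streaming from both pools), using the peak figures quoted in the thread:

```python
# Effective bandwidth of a split memory system, modeled as a weighted
# average of the two pools. Figures assumed from the thread above:
# 68 GB/s DDR3 and 102 GB/s eSRAM on Xbox One, 176 GB/s GDDR5 on PS4.

def effective_bw(hit_rate: float, fast_bw: float, slow_bw: float) -> float:
    """Average bandwidth when hit_rate of accesses land in the fast pool."""
    return hit_rate * fast_bw + (1.0 - hit_rate) * slow_bw

# A 10% eSRAM hit rate (the "1/10th" claim) barely moves the needle:
low  = effective_bw(0.10, 102.0, 68.0)   # ~71 GB/s
# An 80% hit rate, typical for a well-used last-level cache, helps more:
high = effective_bw(0.80, 102.0, 68.0)   # ~95 GB/s
```

Even at an 80% hit rate this simple model stays below the PS4's 176 GB/s peak, which is essentially the point being argued above; the figures only add when both pools stream concurrently.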
