Image Quality - Xbox One vs. PlayStation 4

This is the big one. We've already established that the PS4 has more GPU performance under the hood, but how does that delta manifest in games? My guess is we're going to see two different situations. The first is what we have here today: for the most part I haven't noticed huge differences in frame rate between the Xbox One and PS4 versions of the same game, but I have noticed appreciable differences in resolution/AA. This could very well be the One's ROP limitations coming into play. Quality per pixel seems roughly equivalent across the two consoles; the PS4 just has an easier time delivering more of those pixels.

The second situation could be one where an eager developer puts the PS4's hardware to use and creates a game that scales not just in resolution but in other aspects of image quality as well. My guess is the titles that fall into this second category will end up being PS4 exclusives (e.g. Uncharted 4) rather than cross-platform releases. There's little motivation for a cross-platform developer to spend a substantial amount of time optimizing for one console.

Call of Duty: Ghosts

Let’s start out with Call of Duty: Ghosts. Here I’m going to focus on two scenes: what we’ve been calling internally Let the Dog Drive, and the aliasing test. Once again I wasn’t able to completely normalize black levels across both consoles in Ghosts for some reason.

In motion both consoles look pretty good. You really start to see the PS4's resolution/AA advantages at the very end of the sequence though (PS4 image sample, Xbox One image sample). The difference between the two obviously isn't as great as the jump from the Xbox 360 to the Xbox One, but there is a definite resolution advantage to the PS4. It's even more obvious if you look at our aliasing test.

Image quality otherwise looks comparable between the two consoles.

NBA 2K14

NBA 2K14 is one cross-platform title where I swear I could sense slight frame rate differences between the two consoles (during high-quality replays), but it's not something I managed to capture on video. Once again we find ourselves in a situation where there is a difference in resolution and/or AA levels between the Xbox One and PS4 versions of the game.

Both versions look great. I'm not sure how much of this is down to the next-gen consoles, since the last time I played an NBA 2K game was back in college, but man, console basketball games have improved significantly in realism over the past decade. On a side note, NBA 2K14 does seem to make good use of the impulse triggers on the Xbox One's controller.

Battlefield 4

I grabbed a couple of scenes from early on in Battlefield 4. Once again, as far as I can tell the differences here are almost entirely limited to the amount of aliasing in the scene, and the aliasing on the Xbox One version is definitely more distracting. In practice I notice the difference in resolution, but it's never enough to force me to pick one platform over the other. I'm personally more comfortable with the Xbox One's controller than the PS4's, which makes for an interesting set of tradeoffs.

Comments

  • kyuu - Wednesday, November 20, 2013 - link

    I don't care. Why should I? The only thing that goes on in my living room is playing games and watching TV. So even in the unlikely event that the Kinect camera is feeding somebody (NSA? Microsoft interns? Who exactly am I supposed to be afraid of again?) a 24/7 feed of my living room and somebody is actually looking at it, big whoop.

    I'm not planning on purchasing either console, btw. Just irritated by the tin-foil hat brigade pretending it's reasonable to be scared by the Kinect.
  • kyuu - Wednesday, November 20, 2013 - link

    Oh, and not to mention that if that is actually taking place, it'll be found out pretty quickly and there'll be a huge backlash against Microsoft. The huge potential for negative press and lost sales for absolutely no gain makes me pretty sure it's not going on, though.
  • prophet001 - Thursday, November 21, 2013 - link

    How sad.

    Microsoft, Google, Sony, and any other corporation out there have absolutely zero right to my privacy, whether or not I am doing anything "wrong." You, my friend, will not know what you've lost until it is truly gone.
  • mikato - Monday, November 25, 2013 - link

    I don't think it will be a problem (see kyuu), but I really disagree with your "nothing to hide" attitude.
    http://en.wikipedia.org/wiki/Nothing_to_hide_argum...
  • Floew - Wednesday, November 20, 2013 - link

    I recently built a Steam box. With a 360 controller/wireless adapter and Steam Big Picture set to launch on startup, it's a surprisingly console-like experience. Works much better than I had expected, frankly. My motivation to plunk down cash for the new consoles is now very low.
  • Quidam67 - Wednesday, November 20, 2013 - link

    Anand, just wondering if the Xbox One controller works with a Windows based PC (as per the 360 controller)? Would be great if you could try that out and let us know :)
  • The Von Matrices - Wednesday, November 20, 2013 - link

    The wireless XBOX 360 controller required a special USB receiver to work with a PC, and that took a few years to be released. I don't know if XBOX One controllers are compatible with the 360 wireless controller receiver or if a new one is required. I actually liked the wired XBOX 360 controller for certain PC games, and I'm curious to know if Microsoft will make wired XBOX One controllers.
  • Quidam67 - Sunday, November 24, 2013 - link

    Targeted to work with PC in 2014, apparently: http://www.polygon.com/2013/8/12/4615454/xbox-one-...
  • errorr - Wednesday, November 20, 2013 - link

    There is a lot of discussion about the memory bandwidth issues, but what I want to know is how latency affects the performance picture. That SRAM's latency might be an order of magnitude lower even if its capacity is small. Which workloads are more latency dependent, such that the Xbox design might have a performance advantage?
  • khanov - Wednesday, November 20, 2013 - link

    It is important to understand that GPUs work in a fundamentally different way to CPUs. The main difference when it comes to memory access is how they deal with latency.

    CPUs require cache to hide memory access latency. If the required instructions/data are not in cache there is a large latency penalty and the CPU core sits there doing nothing useful for hundreds of clock cycles. For this reason CPU designers pay close attention to cache size and design to ensure that cache hit rates stay north of 99% (on any modern CPU).

    GPUs do it differently. Any modern GPU has many thousands of threads in flight at once (even if it has, for example, only 512 shader cores). When a memory access is needed, it is queued up and attended to by the memory controller in a timely fashion, but there is still a latency of hundreds of clock cycles to consider. So what the GPU does is switch to a different group of threads and process those other threads while it waits for the memory access to complete (there's a small sketch of this at the end of this comment).

    In fact, whenever the needed data is not available, the GPU will switch thread groups so that it can continue to do useful work. If you consider that any given frame of a game contains millions of pixels, and that GPU calculations need to be performed for each and every pixel, then you can see how there would almost always be more threads waiting to switch over to. By switching threads instead of waiting and doing nothing, GPUs effectively hide memory latency very well. But they do it in a completely different way to a CPU.

    Because a GPU has many thousands of threads in flight at once, and each thread group is likely at some point to require some data fetched from memory, the memory bandwidth becomes a much more important factor than memory latency. Latency can be hidden by switching thread groups, but bandwidth constraints limit the overall amount of data that can be processed by the GPU per frame.

    This is, in a nutshell, why all modern PC graphics cards at the mid and high end use GDDR5 on a wide bus. Bandwidth is king for a GPU.

    The Xbox One attempts to offset some of its apparent lack of memory bandwidth by storing frequently used buffers in eSRAM. The eSRAM has a fairly high effective bandwidth, but its size is small. It still remains to be seen how effectively it can be used by talented developers. But you should not worry about its latency. Latency is really not important to the GPU.

    I hope this helps you to understand why everyone goes on and on about bandwidth. Sorry if it is a little long-winded.
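
    P.S. If a concrete example helps, here is a tiny CUDA-style sketch of my own (purely illustrative, made-up names, nothing to do with either console's actual toolchain). The only point is that you launch far more threads than there are shader cores, so the scheduler always has other groups of threads to run while any one group waits on memory:

        // Illustrative memory-bound kernel: one thread per element (think one per pixel).
        // The global load below can stall for hundreds of cycles, but the GPU simply
        // switches to another resident group of threads while this one waits.
        __global__ void scale(const float* in, float* out, int n, float k)
        {
            int i = blockIdx.x * blockDim.x + threadIdx.x;
            if (i < n)
                out[i] = in[i] * k;   // latency hidden by running other warps
        }

        // Hypothetical launch: one thread per pixel of a 1920x1080 frame (~2 million
        // threads), vastly more than the number of cores, so there is always other work:
        // scale<<<(n + 255) / 256, 256>>>(d_in, d_out, n, 1.5f);

    The flip side is that every one of those threads still has to pull its data across the memory bus eventually, which is exactly why bandwidth rather than latency ends up being the ceiling.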
