Image Quality - Xbox 360 vs. Xbox One

Before I get to the PS4 comparison, I wanted to start with some videos showcasing the improvement you can expect from launch day titles that are available on both the Xbox 360 and Xbox One. I turned to Call of Duty: Ghosts for this comparison as it’s broadly available on all platforms I’m comparing today.

Note that cross-platform launch titles, particularly those also available on previous-generation consoles, tend to be the worst examples of what’s possible on a next-generation platform. For the most part they’re optimized for the platform with the larger installed base (i.e. prior-gen hardware), and the visual uplift on new hardware isn’t as large as it could be. My subjective experience playing a lot of the launch titles on Xbox One and PS4 mirrors this sentiment. Basic omissions, like the lack of accurate/realistic cloth physics in games like CoD: Ghosts, scream port rather than something designed specifically for these next-gen systems. Just as we’ve seen in prior generations, it’s likely going to be a good 12 - 24 months before we see great examples of games on this new generation of hardware.

Now that I’ve adequately explained why this is a bad comparison, let’s get to it. I’ve captured HDMI output on both consoles. Both were set to full range (0-255); however, the Xbox One didn’t seem to respect this setting for some reason. That, combined with differences in Ghosts across the two platforms, left me with black levels that don’t appear equalized between the platforms. If you can look past that, we can get to the comparison at hand.
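
The black-level mismatch is easier to understand once you remember how black is encoded: limited-range video puts black at code 16 and white at 235, while full range uses 0-255. As a rough Python sketch of the standard 8-bit expansion (illustrative only, not what either console’s video pipeline actually does internally):

def limited_to_full(value):
    """Expand an 8-bit limited-range (16-235) video level to full range (0-255)."""
    expanded = (value - 16) * 255.0 / 219.0
    return int(round(min(max(expanded, 0.0), 255.0)))  # clamp, then round

# Black encoded as 16 becomes 0 and white encoded as 235 becomes 255; if a capture
# is interpreted with the wrong range, blacks end up looking raised or crushed.
print(limited_to_full(16))   # -> 0
print(limited_to_full(235))  # -> 255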

All of these videos are encoded at 4K, with two 1080p captures placed side by side. Be sure to select the highest quality playback option YouTube offers.

The first scene is the intro to Ghosts. Here you can see clear differences in lighting and in character detail, as well as some basic resolution/AA differences (Xbox 360 image sample, Xbox One image sample).

The second scene is best described as Call of Duty meets Gravity. The scene goes by pretty quickly, so you’ll have to pause the video to get a good feel for any differences between the platforms. What’s most apparent here, though, is that many present-day users can likely get by sticking with older hardware, given the lack of titles that are truly optimized for the Xbox One/PS4.

Now getting to scenes more representative of actual gameplay, we have Riley riding around, badly wanting to drive the military vehicle. Here the differences are huge. The Xbox One features more realistic lighting, you can see texture in Riley’s fur, shadows are more detailed, and there seems to be a resolution/AA advantage as well. What’s funny is that although the Xbox One appears to have a resolution advantage, the 360 appears to have less aliasing simply because everything is so blurry.

Speaking of aliasing, we have our final IQ test, which is really the perfect test case for high resolution/AA. Once again the Xbox One renders what looks like a completely different scene from the Xbox 360: different lighting, and much more detail in the environments as well as in objects on the ground. The 360 version of Ghosts is just significantly blurrier than what you get on the One, which unfortunately makes aliasing stand out even more on the One.

Even though it’ll be a little while before we get truly optimized next-gen titles, there’s an appreciable improvement in the games we have today for anyone upgrading from an older console. The difference may be more subtle than in previous generations, but it’s there.

Comments

  • kyuu - Wednesday, November 20, 2013

    I don't care. Why should I? The only thing that goes on in my living room is playing games and watching TV. So even in the unlikely event that the Kinect camera is feeding somebody (NSA? Microsoft interns? Who exactly am I supposed to be afraid of again?) a 24/7 feed of my living room and somebody is actually looking at it, big whoop.

    I'm not planning on purchasing either console, btw. Just irritated by the tin-foil hat brigade pretending it's reasonable to be scared by the Kinect.
  • kyuu - Wednesday, November 20, 2013

    Oh, and not to mention that if that is actually taking place, it'll be found out pretty quickly and there'll be a huge backlash against Microsoft. The huge potential for negative press and lost sales for absolutely no gain makes me pretty sure it's not going on, though.
  • prophet001 - Thursday, November 21, 2013

    How sad.

    Microsoft, Google, Sony, and every other corporation out there have absolutely zero right to intrude on my privacy, whether or not I am doing anything "wrong." You, my friend, will not know what you've lost until it is truly gone.
  • mikato - Monday, November 25, 2013

    I don't think it will be a problem (see kyuu), but I really disagree with your "nothing to hide" attitude.
    http://en.wikipedia.org/wiki/Nothing_to_hide_argum...
  • Floew - Wednesday, November 20, 2013

    I recently built a Steam box. With a 360 controller/wireless adapter and Steam Big Picture set to launch on startup, it's a surprisingly console-like experience. It works much better than I had expected, frankly. My motivation to plunk down cash for the new consoles is now very low.
  • Quidam67 - Wednesday, November 20, 2013

    Anand, just wondering if the Xbox One controller works with a Windows based PC (as per the 360 controller)? Would be great if you could try that out and let us know :)
  • The Von Matrices - Wednesday, November 20, 2013

    The wireless Xbox 360 controller required a special USB receiver to work with a PC, and that took a few years to be released. I don't know if Xbox One controllers are compatible with the 360 wireless controller receiver or if a new one is required. I actually liked the wired Xbox 360 controller for certain PC games, and I'm curious to know whether Microsoft will make wired Xbox One controllers.
  • Quidam67 - Sunday, November 24, 2013

    Targeted to work with PC in 2014, apparently: http://www.polygon.com/2013/8/12/4615454/xbox-one-...
  • errorr - Wednesday, November 20, 2013

    There is a lot of discussion about the memory bandwidth issues, but what I want to know is how latency affects the performance picture. The SRAM latency might be an order of magnitude lower even if the capacity is small. Which workloads are latency-dependent enough that the Xbox design might have a performance advantage?
  • khanov - Wednesday, November 20, 2013

    It is important to understand that GPUs work in a fundamentally different way to CPUs. The main difference when it comes to memory access is how they deal with latency.

    CPUs require cache to hide memory access latency. If the required instructions/data are not in cache there is a large latency penalty and the CPU core sits there doing nothing useful for hundreds of clock cycles. For this reason CPU designers pay close attention to cache size and design to ensure that cache hit rates stay north of 99% (on any modern CPU).

    GPUs do it differently. Any modern GPU has many thousands of threads in flight at once (even if it has, for example, only 512 shader cores). When a memory access is needed, it is queued up and attended to by the memory controller in a timely fashion, but there is still the latency of hundreds of clock cycles to consider. So what the GPU does is switch to a different group of threads and process those other threads while it waits for the memory access to complete.

    In fact, whenever the needed data is not available, the GPU will switch thread groups so that it can continue to do useful work. If you consider that any given frame of a game contains millions of pixels, and that GPU calculations need to be performed for each and every pixel, then you can see how there would almost always be more threads waiting to switch over to. By switching threads instead of waiting and doing nothing, GPUs effectively hide memory latency very well. But they do it in a completely different way to a CPU.

    Because a GPU has many thousands of threads in flight at once, and each thread group is likely at some point to require some data fetched from memory, the memory bandwidth becomes a much more important factor than memory latency. Latency can be hidden by switching thread groups, but bandwidth constraints limit the overall amount of data that can be processed by the GPU per frame.

    This is, in a nutshell, why all modern PC graphics cards at the mid and high end use GDDR5 on a wide bus. Bandwidth is king for a GPU.

    The Xbox One attempts to offset some of its apparent lack of memory bandwidth by storing frequently used buffers in eSRAM. The eSRAM has a fairly high effective bandwidth, but its size is small. It still remains to be seen how effectively it can be used by talented developers. But you should not worry about its latency. Latency is really not important to the GPU.

    I hope this helps you to understand why everyone goes on and on about bandwidth. Sorry if it is a little long-winded. Some rough numbers below put the cache, latency, and bandwidth points into perspective.
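
To put rough numbers on the cache point above, the usual back-of-the-envelope metric is average memory access time (AMAT). The cycle counts and hit rates in this Python sketch are assumptions chosen purely for illustration, not measurements of any particular CPU:

# Average memory access time = hit time + miss rate * miss penalty.
# All figures below are assumed for illustration only.
hit_time_cycles = 4        # assumed L1 hit latency
miss_penalty_cycles = 300  # assumed round trip to main memory

for hit_rate in (0.99, 0.95, 0.90):
    amat = hit_time_cycles + (1 - hit_rate) * miss_penalty_cycles
    print(f"hit rate {hit_rate:.0%}: ~{amat:.0f} cycles per access")

# Even a few percent of misses dominates the average, which is why CPU designers
# chase very high cache hit rates.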
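
The thread-switching argument reduces to a similar calculation: to keep its shader cores busy, a GPU needs roughly (memory latency ÷ ALU work per memory request) independent thread groups in flight per scheduler. Again, the figures below are illustrative assumptions rather than real hardware numbers:

# Rough model of GPU latency hiding: how many thread groups must be resident
# so there is always useful work while other groups wait on memory?
memory_latency_cycles = 400    # assumed DRAM latency as seen by the GPU
alu_cycles_per_request = 10    # assumed ALU work done between memory requests

groups_needed = memory_latency_cycles / alu_cycles_per_request
print(f"~{groups_needed:.0f} resident thread groups needed to cover the latency")

# A single 1080p frame supplies plenty of parallel work to switch to:
pixels_per_frame = 1920 * 1080
print(f"{pixels_per_frame:,} pixels per 1080p frame")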
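
Finally, the bandwidth figures everyone argues about are just bus width multiplied by transfer rate. The numbers below are the commonly cited, rounded launch-era figures; the eSRAM bandwidth in particular depends on the read/write mix, so treat this as a sketch rather than a spec sheet:

def peak_bandwidth_gb_s(bus_width_bits, transfers_per_second):
    """Peak bandwidth = bus width in bytes * transfer rate."""
    return (bus_width_bits / 8) * transfers_per_second / 1e9

# Approximate main memory figures for the two consoles:
print(f"Xbox One DDR3-2133 on a 256-bit bus:    ~{peak_bandwidth_gb_s(256, 2.133e9):.0f} GB/s")
print(f"PS4 GDDR5 at 5.5 GT/s on a 256-bit bus: ~{peak_bandwidth_gb_s(256, 5.5e9):.0f} GB/s")

# The Xbox One's eSRAM is a small (32 MB) but fast pool, with peak bandwidth
# commonly quoted in the 100-200 GB/s range depending on how reads and writes
# overlap, which is why frequently used render targets are placed there.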
