Image Quality - Xbox 360 vs. Xbox One

Before I get to the PS4 comparison, I wanted to start with some videos showcasing the improvement you can expect from launch day titles that are available on both the Xbox 360 and Xbox One. I turned to Call of Duty: Ghosts for this comparison as it’s broadly available on all platforms I’m comparing today.

Note that cross-platform launch titles, particularly those also available on previous-generation consoles, end up being the worst examples of what's possible on a next-generation platform. For the most part they're optimized for the platform with the larger installed base (i.e. prior-gen hardware), and the visual uplift on new hardware isn't as much as it could be. My subjective experience playing a lot of the launch titles on Xbox One and PS4 mirrors this sentiment. Basic things like the lack of accurate/realistic cloth physics in a game like CoD: Ghosts scream port rather than something designed specifically for these next-gen systems. Just as we've seen in prior generations, it's likely going to be a good 12 - 24 months before we see great examples of games on this new generation of hardware.

Now that I’ve adequately explained why this is a bad comparison, let’s get to it. I captured HDMI output on both consoles. Both were set to full range (0 - 255), but for some reason I had issues with the Xbox One respecting this setting. That, combined with differences in Ghosts across the two platforms, left me with black levels that don’t appear equalized. If you can look past that, on to the comparison at hand.

All of these videos are encoded at 4K, with two 1080p captures placed side by side. Be sure to select the highest quality playback option YouTube offers.

The first scene is the intro to Ghosts. Here you can see clear differences in lighting, detail in the characters, and some basic resolution/AA differences (Xbox 360 image sample, Xbox One image sample).

The second scene is best described as Call of Duty meets Gravity. The scene goes by pretty quickly, so you’re going to have to pause the video to get a good feel for any differences between the platforms. What’s most apparent here, though, is that many present-day users can likely get by sticking with older hardware due to the lack of titles that are truly optimized for the Xbox One/PS4.

Now getting to scenes more representative of actual gameplay, we have Riley riding around, badly wanting to drive the military vehicle. Here the differences are huge. The Xbox One features more realistic lighting, you can see texture in Riley’s fur, shadows are more detailed, and there seems to be a resolution/AA advantage as well. What’s funny is that although the Xbox One appears to have a resolution advantage, the 360 seems to have less aliasing simply because everything is so blurry.

Speaking of aliasing, our final IQ test is really the perfect test case for high resolution/AA. Once again we see a completely different scene comparing the Xbox One to the Xbox 360: completely different lighting, and much more detail in the environments as well as objects on the ground. The 360 version of Ghosts is just significantly blurrier than what you get on the One, which unfortunately makes aliasing stand out even more on the One.

Even though it’ll be a little while before we get truly optimized next-gen titles, there’s an appreciable improvement in the games we have today for anyone upgrading from an older console. The difference may be more subtle than in previous generations, but it’s there.

Comments

  • CubesTheGamer - Wednesday, November 20, 2013 - link

    You...you are a special kind of idiot.

    The PS4 has a more powerful GPU since it has more compute cores. What this means is that while the Xbone has a slightly higher clock speed, there are more compute cores to do the work on the PS4, so the work can be split up and done faster. Also, while the GPU might be able to read from both pools of memory at one time, that doesn't mean the RAM bandwidth is (60GB/s + 200GB/s) or whatever the numbers are.
  • Egg - Wednesday, November 20, 2013 - link

    "Microsoft has claimed publicly that actual bandwidth to the eSRAM is somewhere in the 140 - 150GB/s range, which is likely equal to the effective memory bandwidth (after overhead/efficiency losses) to the PS4’s GDDR5 memory interface. The difference being that you only get that bandwidth to your most frequently used data on the Xbox One."

    "The difference being that you only get that bandwidth to your most frequently used data on the Xbox One."

    No. This is effective bandwidth to the eSRAM only after protocol overhead, nothing more.
  • editorsorgtfo - Wednesday, November 20, 2013 - link

    Uhhh, no you can't... Are you serious?
  • bill5 - Wednesday, November 20, 2013 - link

    the gpu can read both pools at once. this is a fact. you can.

    period. no arguing this, it's a fact.

    it's not the same as a single pool, but you can add them.
  • szimm - Wednesday, November 20, 2013 - link

    You do realize that telling someone they are not allowed to argue something will only make them much more eager to do just that?
  • melgross - Wednesday, November 20, 2013 - link

    No, you can't. Well, maybe YOU can, but the systems can't.
  • Owls - Wednesday, November 20, 2013 - link

    Please provide some technical insight as to how you can magically add the two together to get your ridiculous throughput. We'll wait.

    Or don't since you are clearly astroturfing for MS.
  • Wolfpup - Wednesday, November 20, 2013 - link

    Good grief, we've got fanbois on Anandtech too? LOL Umm... the specs are right there. One quite obviously does not "have an edge".
  • editorsorgtfo - Wednesday, November 20, 2013 - link

    Ah so graphics card manufacturers can replace GDDR5 with cheap low frequency DDR3 on all of their boards and get equal/greater performance so long as they add a little chunk of SDRAM to the chip... Good to know man, thanks for that brilliant analysis. They should have come to you years ago to tap your knowledge of memory subsystems. Just think of all the money AMD and NVIDIA could have saved by doing so.
  • extide - Wednesday, November 20, 2013 - link

    Well, in theory they can... but it would cost nVidia/AMD MORE money as the GPU die would be bigger, and thus have fewer shader cores. So it's not a good solution for a discrete GPU, but it IS a decent solution in SOME cases, see Crystalwell, for instance. Honestly, I would say I think the PS4's setup is better, simple and fast, versus MS's more complex setup (and they ended up with a bigger die too, lol).
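
For readers trying to follow the numbers being argued over in this thread, here's a rough back-of-the-envelope sketch in Python. The CU counts, clock speeds, and bus figures are the publicly reported launch specs; the eSRAM access-mix fractions are purely hypothetical illustrations, not measurements of any real workload:

```python
# Back-of-the-envelope numbers for the two consoles' GPUs, using
# publicly reported launch specs (not measured figures).

def gcn_tflops(cus, clock_ghz):
    # GCN: 64 shaders per CU, 2 FLOPs per shader per clock (FMA).
    return cus * 64 * 2 * clock_ghz / 1000.0

xbox_one = gcn_tflops(12, 0.853)   # 12 CUs at 853MHz -> ~1.31 TFLOPS
ps4      = gcn_tflops(18, 0.800)   # 18 CUs at 800MHz -> ~1.84 TFLOPS

# PS4 memory bandwidth: 256-bit GDDR5 at 5.5Gbps per pin.
ps4_bw = 256 / 8 * 5.5             # = 176.0 GB/s

# Why "68 + 204 GB/s" is misleading on the Xbox One: the eSRAM is only
# 32MB, so only the fraction of traffic that actually lands in it sees
# the higher bandwidth. Sustained effective bandwidth looks more like a
# weighted average over the access mix than a simple sum of the peaks.
def effective_bw(esram_fraction, esram_bw=204.0, ddr3_bw=68.3):
    return esram_fraction * esram_bw + (1 - esram_fraction) * ddr3_bw

for frac in (0.25, 0.5, 0.75):     # hypothetical mixes, illustrative only
    print(f"{frac:.0%} of accesses to eSRAM -> {effective_bw(frac):.0f} GB/s")
```

The takeaway from the weighted average is that the 32MB of eSRAM only helps the working set that actually fits in it; the two pools' peak figures don't simply add up to a sustained number, even if the GPU can touch both pools in the same cycle.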
