Image Quality - Xbox One vs. PlayStation 4

This is the big one. We’ve already established that the PS4 has more GPU performance under the hood, but how does that delta manifest in games? My guess is we’re going to see two different situations. The first being what we have here today. For the most part I haven’t noticed huge differences in frame rate between Xbox One and PS4 versions of the same game, but I have noticed appreciable differences in resolution/AA. This could very well be the One’s ROP limitations coming into play. Quality per pixel seems roughly equivalent across consoles, the PS4 just has an easier time delivering more of those pixels.

The second situation could be one where an eager developer puts the PS4’s hardware to use and creates a game that scales not only in resolution, but in other aspects of image quality as well. My guess is the types of titles to fall into this second category will end up being PS4 exclusives (e.g. Uncharted 4) rather than cross-platform releases. There’s little motivation for a cross-platform developer to spend a substantial amount of time optimizing for one console.
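To put the resolution gap in concrete terms, here’s a quick sketch of the pixel math, assuming the 720p/1080p split seen in cross-platform titles like Call of Duty: Ghosts:

```python
# Rough pixel-count math for the common cross-platform resolution split
# (720p on Xbox One vs. 1080p on PS4, e.g. Call of Duty: Ghosts).
xb1_pixels = 1280 * 720     # 921,600 pixels per frame
ps4_pixels = 1920 * 1080    # 2,073,600 pixels per frame
ratio = ps4_pixels / xb1_pixels
print(f"PS4 renders {ratio:.2f}x the pixels of the Xbox One")  # 2.25x
```

At that split the PS4 is pushing 2.25x the pixels per frame, which is consistent with the resolution/AA differences described here.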

Call of Duty: Ghosts

Let’s start out with Call of Duty: Ghosts. Here I’m going to focus on two scenes: what we’ve been calling internally Let the Dog Drive, and the aliasing test. Once again I wasn’t able to completely normalize black levels across both consoles in Ghosts for some reason.

In motion both consoles look pretty good. You really start to see the PS4’s resolution/AA advantages at the very end of the sequence though (PS4 image sample, Xbox One image sample). The difference between these two obviously isn’t as great as from the 360 to Xbox One, but there is a definite resolution advantage to the PS4. It’s even more obvious if you look at our aliasing test:

Image quality otherwise looks comparable between the two consoles.

NBA 2K14

NBA 2K14 is one cross-platform title where I swear I could sense slight frame rate differences between the two consoles (during high quality replays), but it’s not something I managed to capture on video. Once again we find ourselves in a situation where there is a difference in resolution and/or AA levels between the Xbox One and PS4 versions of the game.

Both versions look great. I’m not sure how much of this is the next-gen consoles since the last time I played an NBA 2K game was back when I was in college, but man have console basketball games significantly improved in their realism over the past decade. On a side note, NBA 2K14 does seem to make good use of the impulse triggers on the Xbox One’s controller.



Battlefield 4

I grabbed a couple of scenes from early on in Battlefield 4. Once again the differences here are almost entirely limited to the amount of aliasing in the scene as far as I can tell. The aliasing in the Xbox One version is definitely more distracting. In practice I notice the difference in resolution, but it’s never enough to force me to pick one platform over the other. I’m personally more comfortable with the Xbox One’s controller than the PS4’s, which makes for an interesting set of tradeoffs.

286 Comments

  • airmantharp - Wednesday, November 20, 2013 - link

    Having actual CPU resources, a unified GPU architecture with desktops (and many mobile SoCs), and tons of RAM are all big differences over the last generation's introduction.

    The Xbox expounds on that by adding in co-processors that allow for lots of difficult stuff to happen in real-time without affecting overall performance.
  • mikeisfly - Thursday, November 21, 2013 - link

    Thank god people didn't think like this when computers first started with switches and paper tape. Remember, we have to start somewhere to move the technology forward. I want the Jarvis computer in Iron Man! You don't get there by making a console that can play games. You get there by making a console that can play games and has voice recognition and gestures and ......
    People get used to interacting with new input sources, and then you find yourself in a situation where you say, how did I ever live without this? You guys sound like I did in the 80s when Microsoft was coming out with this stupid GUI crap: "You will have to rip the command line from my cold dead fingers!" Where would we be today if everyone thought like me? Where would the Internet be if it was just a command line? I for one applaud Microsoft for trying to expand the gaming market, not just for hardcore gamers but for people like my girl too. I know the PS4 might have more power in terms of compute performance, but that is not what games are about; it's about story line, immersiveness (made-up word), and to some extent graphics. Truth is there is really no difference between 1080p and 720p on a big screen; remember, people, this is not a PC monitor. And the X1 can do 1080p. I'm looking forward to what both systems can offer in this next generation, but I'm more interested in the X1 due to its forward-thinking aspects. Only time will tell though.
  • douglord - Thursday, November 21, 2013 - link

    Rule of thumb is you need a 10x increase in power to get a 100% increase in visual fidelity. Look at the 360 vs. the One: 6x the power and maybe games look 50% better. So we are talking about the PS4 looking 5% better than the Xbox One. In this gen, it really is about who has the exclusives you want.

    And if you are looking out 5+ years, you have to take into account Xbox's cloud initiative. Have you used OnLive? I can play Borderlands 2 on an Intel Atom. If MS puts the $ behind it, those eight pitiful CPU cores could end up powering just the OS and a cloud terminal. It's the only way these consoles can keep up with midrange PCs.
  • Revdarian - Sunday, November 24, 2013 - link

    Interesting that you use numbers referring to visual fidelity, when it is a non-quantifiable, perceptual quality.

    Also, there is no such rule of thumb. What is known is that in certain games like CoD: Ghosts, due to certain choices, the XB1 is able to pump less than half the pixels that the PS4 can.

    If you believe in the cloud for that kind of gaming, Sony has bought Gaikai, and it is a project that started sooner than the MS counterpart; heck, the MS counterpart hasn't even been named.
  • RubyX - Wednesday, November 20, 2013 - link

    How do the noise levels of the consoles compare?
    According to other reviews they both seem to be fairly quiet, which is great, but is there a noticeable difference between them?
  • szimm - Wednesday, November 20, 2013 - link

    I'm wondering the same - I've seen lots of people point out that the Xbox One is designed to be bigger but cooler and quieter. However, I haven't seen any confirmation that it is in fact quieter than the PS4.
  • bill5 - Wednesday, November 20, 2013 - link

    15 W standby seems a bit high.

    Let's say you leave it in standby 24/7, as you would; that's 360 Wh a day, almost 11 kWh a month. I pay ~10 cents per kWh in general, so about $1.10/month.

    Could add up to $60+ over 5 years. More if the EPA enforces more regulations raising the cost of electricity, as it typically does.
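The arithmetic in the comment above works out as follows (a quick sketch; the 15 W standby draw and ~$0.10/kWh rate are the commenter's figures, not measured values):

```python
# Back-of-the-envelope standby energy cost, using the comment's assumed figures.
STANDBY_WATTS = 15     # claimed standby draw
PRICE_PER_KWH = 0.10   # USD, the commenter's electricity rate

daily_wh = STANDBY_WATTS * 24               # 360 Wh per day
monthly_kwh = daily_wh * 30 / 1000          # ~10.8 kWh per month
monthly_cost = monthly_kwh * PRICE_PER_KWH  # ~$1.08 per month
five_year_cost = monthly_cost * 12 * 5      # ~$65 over five years

print(f"{monthly_kwh:.1f} kWh/month, ${monthly_cost:.2f}/month, "
      f"${five_year_cost:.2f} over 5 years")
```

That lands right at the "$60+ over 5 years" figure, before any rate increases.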
  • ydeer - Thursday, November 21, 2013 - link

    Yes, the standby power of the XBone and PS4 bothers me too. I often leave my TV and Consoles untouched for weeks, so the only sensible thing is to put them on a Master/Slave powerstrip which cuts them off the grid when the TV isn’t on.

    Of course that defeats the entire point of standby background downloads, but in the case of Sony, I have to wonder why they put a whole proprietary ARM SoC* (with 2GB of DDR3 RAM) on the board for "low power standby and background downloads" and then end up with unbelievable 70 W figures.

    This is essentially a mobile phone without a display, I don’t think it should use more than 3 Watt idle with the HD spun down.

    My only explanation is that they couldn't get the ARM software/OS side of things wrapped up in time for the launch, so for now they use the x86 CPU for background downloads even though it was never intended to do that.

    * http://www.ifixit.com/Teardown/PlayStation+4+Teard...
  • ydeer - Thursday, November 21, 2013 - link

    Correction, the SoC only has access to 2Gb (= 256 MB) of DDR3 RAM.

    However, I found a document that seems to confirm that the ARM Subsystem did not work as planned and Sony currently uses the APU for all standby/background tasks.

    Maybe somebody who is fluent in Japanese could give us a short abstract of the part that talks about the subsystem.

    http://translate.google.com/translate?u=http%3A//p...
  • tipoo - Wednesday, November 20, 2013 - link

    Hey Anand, did you see the Wii U GPU die shots? How many shaders do you think are in there? I think it's almost certainly 160 at this point, but there are a few holdouts saying 320 which seems impossible with the shader config/size. They are basing that off the clusters being a bit bigger than normal shader cores, but that could be down to process optimization.
