Image Quality and Settings

In retrospect, I probably should have just skipped the Ultra quality setting and opted for some form of custom settings. The texture data at Ultra simply overwhelms most GPUs, and even High strains many cards. Even more problematic is that there are only three texture quality options: Low, High, and Ultra.

I also want to point you to NVIDIA's Assassin's Creed: Unity Graphics and Performance Guide; if you want a better look at what the various graphics options actually mean in terms of quality, that article has everything you need to know. One item particularly worth noting is that NVIDIA recommends Low textures for 2GB cards, High for 3GB cards, and Ultra for 4GB (or 6GB/8GB) cards.
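
To put that rule of thumb in code form, here's a minimal illustrative sketch (my own, not an NVIDIA tool) mapping a card's VRAM to the recommended texture setting:

```python
# Illustrative sketch of NVIDIA's guideline: 2GB -> Low, 3GB -> High,
# and 4GB (or more) -> Ultra texture quality.
def recommended_texture_quality(vram_gb: float) -> str:
    if vram_gb >= 4:
        return "Ultra"
    if vram_gb >= 3:
        return "High"
    return "Low"

for vram in (2, 3, 4, 6, 8):
    print(f"{vram}GB card -> {recommended_texture_quality(vram)} textures")
```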

Anyway, here's a quick look at what the various presets do for quality, starting with a table showing which specific settings each preset applies. Again, the NVIDIA page linked above has a good explanation of what each setting does, and more importantly it has image sliders that let you do A/B comparisons for each one. (Disregard their AA images, though, as it looks like they used 2560x1440 shots and shrunk them to 1080p – oops.)

Assassin's Creed: Unity Image Quality Presets

Setting             Low   Medium   High     Very High   Ultra
Environmental       Low   Medium   High     Very High   Ultra
Texture             Low   High     High     Ultra       Ultra
Shadow              Low   Low      High     High        Soft (PCSS)
Ambient Occlusion   Off   SSAO     SSAO     HBAO+       HBAO+
Anti-Aliasing       Off   FXAA     2xMSAA   2xMSAA      4xMSAA
Bloom               Off   On       On       On          On

The main things to note are that there's a rather noticeable difference between Low and High texture quality, but not so much from High to Ultra. Environmental quality has a generally minor effect on the appearance of the game, especially above Medium (though a few areas are exceptions to this statement). The difference between Low and High shadows is also quite small, but Soft Shadows implements PCSS (Percentage Closer Soft Shadows), which looks quite nice while causing a moderate performance hit.

Anti-aliasing has a ton of settings, but the most useful are generally the MSAA options; they're also the most demanding. FXAA is, as usual, nearly "free" to enable and helps remove jaggies, though it smooths away some fine image detail along the way; for many users it may be the best compromise. TXAA performance is roughly similar to 4xMSAA from what I've seen, which means it's mostly for high-end rigs. Bloom is on at every preset except the lowest. Finally, ambient occlusion has two options besides off: SSAO and HBAO+. NVIDIA developed HBAO+ as an improved form of ambient occlusion, and in general I think they succeeded. It's also supposed to be faster than SSAO, at least on NVIDIA GPUs, so if you have NVIDIA hardware you'll probably want to enable it.

Looking at the presets, the difference between Ultra and Very High is visible in the right areas (e.g. places with shadows), but overall they're pretty similar. There's a more noticeable drop from Very High to High, mostly due to the change in textures, and at least in our test images the Medium and High settings look almost the same.

There are a few last items to note on benchmarking, just by way of reference. First, Assassin's Creed: Unity uses "dynamic" day/night cycles. They're not truly dynamic; Ubisoft cycles through four preset times: morning, noon, dusk, and night. This matters because benchmarking the same sequence at different times of day can produce quite different results. There's also "dynamic" weather (or at least clouds) that can throw things off. Second, if you change certain image quality settings, specifically Texture Quality, you have to restart the game for the changes to take effect. Last, the game has dynamic crowds, which means the runs aren't fully deterministic, but in repeat testing the variance is generally less than 3% and usually closer to 1%.
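
For reference, this is a minimal sketch of how that run-to-run variance can be quantified; the FPS numbers here are made up purely for illustration:

```python
import statistics

# Hypothetical average-FPS results from repeating the same benchmark run.
runs = [41.8, 42.3, 42.1, 41.7]

mean = statistics.mean(runs)
cv_percent = 100 * statistics.stdev(runs) / mean  # coefficient of variation

# With the dynamic crowds, this typically lands closer to 1% than 3%.
print(f"mean: {mean:.1f} fps, run-to-run variation: {cv_percent:.2f}%")
```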

The good news is that the game always loads at the morning time slot, so keeping the time of day consistent simply means exiting and reloading between every setting change. Yes, it's quite tedious if you're benchmarking a dozen or so GPUs…
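
If you end up scripting that grind, the loop looks something like the sketch below. The paths and the config-swap step are hypothetical placeholders, since the game offers no command-line switch for changing presets:

```python
import shutil
import subprocess
import time

# Hypothetical paths -- adjust for your install. The idea is to swap in a
# pre-made config file for each preset, since Texture Quality (among other
# settings) only takes effect after a full restart.
GAME_EXE = r"C:\Games\ACU\ACU.exe"
CONFIG = r"C:\Users\me\Documents\Assassin's Creed Unity\ACU.ini"
PRESETS = ["low.ini", "medium.ini", "high.ini", "veryhigh.ini", "ultra.ini"]

for preset in PRESETS:
    shutil.copy(preset, CONFIG)          # apply the preset's config file
    game = subprocess.Popen([GAME_EXE])  # fresh load = morning time slot
    time.sleep(180)                      # run the benchmark pass by hand
    game.terminate()                     # exit before the next preset
```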

Comments

  • RafaelHerschel - Monday, November 24, 2014

    I don’t mind somebody saying: “this game is perfectly playable for me at 40 fps”. I do mind it if people say that there is no perceivable difference between 40 fps and 60 fps (as stated in the comments) or when people say “the game runs smooth as butter” when it doesn't. The article was fair, some of the comments weren't.

    For me a game is not enjoyable at anything below 50 fps and I much prefer it to have Vsync enabled.

    I would say that most people accept 60 fps as a reasonable goal at medium settings (whatever those may be) with a high-end GPU. Depending on personal taste (graphics settings) and budget, people can then choose to sacrifice fps for MSAA, AO, high-res textures and/or money.
    I strongly believe that studios should aim for 60 fps at medium settings with a high-end card and 60 fps at low settings with a mid-range card (both at 1080p).

    With smart design choices and quality control that is certainly possible. As it stands, I’m disappointed with both Far Cry 4 and Unity.
  • HisDivineOrder - Monday, November 24, 2014

    1) Wonder if an i5 vs i7 (hyperthreading) matters.
    2) Wonder why you guys don't borrow a Titan Black and test it to see if the extra VRAM improves things. Surely, a contact at Asus, Gigabyte, nVidia, etc has a Titan Black with 6GB of RAM to lend you. Probably two for SLI. I'm curious to see if the game can use the VRAM because I'm hearing reports of Ultra taking 4GB and gobbling it up.
    3) Ultra settings preset includes MSAA. That's the first setting I'd turn off if my settings were taking a dive. It gobbles up memory AND processing like nobody's business. What happens if you turn it off?

    Seems like obvious questions to me. Until Batman Arkham Knight, this looks to be The Benchmark game in terms of crushing your system. Assuming they ever finish patching it.
  • RafaelHerschel - Monday, November 24, 2014

    If the available VRAM makes a difference, then lowering texture quality and turning off all forms of AA will make a big difference.

    Unfortunately, Ubisoft games don't scale well when you lower the settings.
  • Evenload - Wednesday, November 26, 2014

    VRAM clearly makes a very big difference in this game. To answer the question above, I maxed out the settings at 1080p on my GTX Titan (original) and just ran/jumped around Paris a bit while GPU-Z was set to data log. The file shows constantly high memory usage, maxing out at about 4.4GB. Interestingly, with stock settings the GPU was often being pushed to relatively high clock rates by GPU Boost, so it looks like the GPU was not being worked extremely hard.

    Not a scientific test, but potentially bad news for people with 2GB and 3GB cards, as tweaking will not recover the difference. Interestingly, I noticed that the game's main system memory footprint is not that large, and I wonder if the issues people are experiencing are related to the way the game has been programmed around the unified memory model the PS4 and Xbox One use. On the consoles the distinction between "graphics" memory and "system" memory does not matter in the same way that it does in a gaming PC with a discrete graphics card.
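
For anyone wanting to repeat that check, here's a minimal sketch for pulling the peak out of a GPU-Z sensor log. The log is comma-separated text; the "Memory Used (MB)" column name is an assumption and may differ on your setup:

```python
import csv

# GPU-Z writes "Sensor Log.txt" as comma-separated values; column names
# vary by card and driver, so check your own log's header line.
peak_mb = 0.0
with open("Sensor Log.txt", newline="") as f:
    for row in csv.DictReader(f, skipinitialspace=True):
        try:
            peak_mb = max(peak_mb, float(row["Memory Used (MB)"]))
        except (KeyError, TypeError, ValueError):
            continue  # skip malformed or incomplete rows

print(f"Peak VRAM usage: {peak_mb / 1024:.1f} GB")
```
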
  • joeh4384 - Tuesday, November 25, 2014

    Lol at needing freaking SLI 970s for 60+ fps at 1080p. Do you think patches can eventually make this playable on Ultra for high-end single-card setups like a 290X?
  • Lerianis - Sunday, November 30, 2014

    Unity is a good game once you get past the glitchfest. No, it is not a revolution for the Assassin's Creed series, more an evolution of Assassin's Creed 4. It is one awesome game (I played it on a friend's console and another one's PC) once you get past those issues.
    The only thing I don't like about it is that it is VERY VERY hungry for graphics power, even at 1080p.
    To the point where the latest 980Ms from NVIDIA struggle to push more than 30fps on Ultra at that resolution.
    I'm wondering (considering I do not see much additional graphical prettiness) whether that is a sign that the game was not properly optimized for desktop and notebook PCs. If it is, that is something that Ubisoft (and other game makers) are going to have to take note of and fix.
  • Ramon Zarat - Sunday, November 30, 2014

    I'll only say this: Fuck Ubisoft, the new E.A.
  • IUU - Tuesday, December 2, 2014

    At last, a breath of fresh air. Instead of getting everyone excited about how well you can play Pac-Man at 10K, one company still serves as a reminder of the distance we have yet to cross.

    Way to go, Ubisoft, and if you make a game that's hardly playable at 1280x720, I will make a donation to you and create a church for you. We have had enough of the mobile devolution, touting meaningless resolutions (3 megapixels on a tablet, oh my god). You will serve as a reminder that high resolution is good, but you have to have some real content to show on it.

    We need a new Crysis, and not just one but several in succession.
  • wrayj - Tuesday, December 2, 2014

    I've seen videos where dropping the resolution to 1600x900 is really the way to claw back performance.
  • is4u2p - Tuesday, December 9, 2014

    I got way better results than this with my i5-3570K and R9 290.
