Image Quality and Settings

In retrospect, I probably should have just skipped the Ultra quality setting and opted for some form of custom settings. The texture data just overwhelms most GPUs at Ultra, and even High still struggles in many cases. Even more problematic is that there are only three texturing options: Low, High, and Ultra.

I also want to point you to NVIDIA's Assassin's Creed: Unity Graphics and Performance Guide; if you want a closer look at what the various graphics options actually mean in terms of quality, that article has everything you need to know. One item particularly worth noting is that NVIDIA recommends Low textures for 2GB cards, High for 3GB cards, and Ultra for 4GB (or 6GB/8GB) cards.
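
As a minimal sketch of that VRAM rule of thumb, here's how you might encode it; the function name and thresholds are my own, based on the recommendations quoted above:

```python
def recommended_texture_quality(vram_gb: float) -> str:
    """Map GPU VRAM (in GB) to NVIDIA's suggested texture setting
    for AC: Unity: 2GB -> Low, 3GB -> High, 4GB or more -> Ultra."""
    if vram_gb >= 4:
        return "Ultra"
    if vram_gb >= 3:
        return "High"
    return "Low"

# Example: a 2GB card vs. a 4GB card
print(recommended_texture_quality(2))  # Low
print(recommended_texture_quality(4))  # Ultra
```

Note that these are guidelines rather than hard limits; you can force Ultra on a 2GB card, but expect stuttering as textures stream in and out of memory.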

Anyway, here's a quick look at what the various presets do for quality. Let me start with a table showing what specific settings are applied for each of the presets. Again, the NVIDIA page linked above has a good explanation for what each of the settings does, and more importantly it has image sliders to let you do A/B comparisons for each setting. (Disregard their AA images, though, as it looks like they used 2560x1440 and shrunk them to 1080p – oops.)

Assassin's Creed: Unity Image Quality Presets

Setting             Low    Medium   High     Very High   Ultra
Environmental       Low    Medium   High     Very High   Ultra
Texture             Low    High     High     Ultra       Ultra
Shadow              Low    Low      High     High        Soft (PCSS)
Ambient Occlusion   Off    SSAO     SSAO     HBAO+       HBAO+
Anti-Aliasing       Off    FXAA     2xMSAA   2xMSAA      4xMSAA
Bloom               Off    On       On       On          On

The main thing to note is that there's a rather noticeable difference between Low and High texture quality, but not so much from High to Ultra. Environmental quality has a generally minor effect on the appearance of the game, especially above Medium (though a few areas are exceptions). The difference between Low and High shadows is also quite small, but Soft Shadows implements PCSS (Percentage Closer Soft Shadows), which looks quite nice while also causing a moderate performance hit.

Anti-aliasing has a ton of settings, but the most useful are generally the MSAA options; those are also the most demanding. FXAA is, as usual, nearly "free" to enable and can help remove jaggies (along with some fine image detail), which might make it the best compromise. TXAA performance is roughly similar to 4xMSAA, which means it's mostly for high-end rigs. Bloom is on at every preset except the lowest. Finally, ambient occlusion has two options besides Off: SSAO and HBAO+. NVIDIA developed HBAO+ as an improved form of AO, and in general I think they're right. It's also supposed to be faster than SSAO, at least on NVIDIA GPUs, so if you have NVIDIA hardware you'll probably want to use HBAO+.

Looking at the presets, the difference between Ultra and Very High is visible in the right areas (e.g. places with shadows), but overall they're pretty similar. There's a more noticeable drop from Very High to High, mostly due to the change in textures, and at least in our test images the Medium and High settings look almost the same.

There are a few last items to note on benchmarking, just by way of reference. First, Assassin's Creed: Unity uses "dynamic" day/night cycles. They're not truly dynamic; Ubisoft cycles through four preset times: morning, noon, dusk, and night. This matters because benchmarking the same sequence at different times of day can produce quite different results. There's also "dynamic" weather (or at least clouds) that can throw things off. Second, if you change certain image quality settings, specifically Texture Quality, you have to restart the game for the changes to take effect. Last, the game has dynamic crowds, which means the runs aren't fully deterministic, but in repeat testing the variance is generally less than 3% and closer to 1%.
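
To make that variance claim concrete, here's a minimal sketch of how run-to-run spread can be computed from repeated benchmark passes; the FPS numbers below are hypothetical, not actual test results:

```python
import statistics

def run_variance(fps_runs):
    """Spread of repeated benchmark runs (max - min) as a
    percentage of their mean average frame rate."""
    mean = statistics.mean(fps_runs)
    return 100 * (max(fps_runs) - min(fps_runs)) / mean

# Three hypothetical repeat runs of the same benchmark sequence
runs = [41.2, 41.8, 41.5]
print(f"{run_variance(runs):.1f}% spread")  # 1.4% spread
```

With dynamic crowds in play, averaging three or more runs like this is the usual way to keep that 1-3% noise from skewing the charts.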

The good news (if you can call it that) is that the game always loads at the morning time slot, so for consistent results you basically have to exit and reload between every setting change. Yes, it's quite tedious if you're benchmarking a dozen or so GPUs…

Comments

  • funnyferrell - Thursday, November 20, 2014 - link

    Unless I'm totally blind, your CPU benchmarks don't appear to be up there.
  • JarredWalton - Thursday, November 20, 2014 - link

    As noted in the text, I only ran the i3-4330 simulation with one GPU, and furthermore I only ran it at 1080p (Ultra/High/Medium). Basically it couldn't do more than that, so I left off further testing.
  • FITCamaro - Thursday, November 20, 2014 - link

    Yes but you mention charts and don't show any.
  • JarredWalton - Thursday, November 20, 2014 - link

    The i3-4330 + GTX 980 numbers are in black in the 1080p charts.
  • P39Airacobra - Tuesday, January 13, 2015 - link

    How was an i3 doing so bad? This game uses basically the same engine as Black Flag, except not optimized at all. And the i3 almost always performs nearly identically to the i5 and i7 in games. Are you sure you did not fake that?
  • P39Airacobra - Tuesday, January 13, 2015 - link

    Also, I know of some Pentiums, like the G3258 model, playing the game perfectly with a 970.
  • P39Airacobra - Thursday, December 11, 2014 - link

    I suppose you are, because the benchmarks are there. You just have to know how to use a webpage instead of only worrying about trends.
  • os6B8dbVUesnzqF - Thursday, November 20, 2014 - link

    I'm not sure why any of these frame rates are considered playable. Unless you have a gsync monitor, anything less than 60fps minimum frame rate is going to be awful.
  • JarredWalton - Thursday, November 20, 2014 - link

    "Playable" is not the same as "ideal". I've logged plenty of hours over the years playing games at well under 60 FPS. 30FPS is usually the point where things get "smooth enough" to play well. 40+ is definitely sufficient. G-SYNC is merely icing on the cake if you have it.
  • raghu78 - Thursday, November 20, 2014 - link

    Jarred,
    Testing must be done at settings which are playable. Why are you testing QHD with Ultra and 4K with High settings, where not even a GTX 980 is playable? You didn't even show what setting is playable at 1440p/4K on a GTX 980. My guess is High at 1440p and Medium or Low at 4K would have been playable on a GTX 980. GameWorks features like PCSS are killing fps on all cards. AMD definitely needs to improve performance in AC: Unity.
