Image Quality and Settings

In retrospect, I probably should have just skipped the Ultra quality setting and opted for some form of custom settings. The texture data just overwhelms most GPUs at Ultra, and even High still struggles in many cases. Even more problematic is that there are only three texturing options: Low, High, and Ultra.

I also want to point you to NVIDIA's Assassin's Creed: Unity Graphics and Performance Guide; if you want a better look at what the various graphics options actually mean in terms of quality, that article has everything you need to know. One item particularly worth noting is that NVIDIA recommends Low textures for 2GB cards, High for 3GB cards, and Ultra for 4GB cards (or maybe 6GB/8GB cards).

Anyway, here's a quick look at what the various presets do for quality. Let me start with a table showing the specific settings applied by each preset. Again, the NVIDIA page linked above has a good explanation of what each of the settings does, and more importantly it has image sliders that let you do A/B comparisons for each setting. (Disregard their AA images, though, as it looks like they used 2560x1440 and shrunk them to 1080p – oops.)

Assassin's Creed: Unity Image Quality Presets

| Setting | Low | Medium | High | Very High | Ultra |
|---------------------|-----|--------|--------|-----------|-------------|
| Environmental | Low | Medium | High | Very High | Ultra |
| Texture | Low | High | High | Ultra | Ultra |
| Shadow | Low | Low | High | High | Soft (PCSS) |
| Ambient Occlusion | Off | SSAO | SSAO | HBAO+ | HBAO+ |
| Anti-Aliasing | Off | FXAA | 2xMSAA | 2xMSAA | 4xMSAA |
| Bloom | Off | On | On | On | On |
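
For anyone scripting image quality comparisons, the preset matrix is easy to express as data. The following Python mapping is purely illustrative; the key names are mine and have nothing to do with the game's actual configuration format.

```python
# The preset table above expressed as data. Purely illustrative --
# these names are mine, not the game's config file format.
PRESETS = {
    # preset:    (environmental, texture, shadow,      AO,      AA,       bloom)
    "Low":       ("Low",       "Low",   "Low",         "Off",   "Off",    "Off"),
    "Medium":    ("Medium",    "High",  "Low",         "SSAO",  "FXAA",   "On"),
    "High":      ("High",      "High",  "High",        "SSAO",  "2xMSAA", "On"),
    "Very High": ("Very High", "Ultra", "High",        "HBAO+", "2xMSAA", "On"),
    "Ultra":     ("Ultra",     "Ultra", "Soft (PCSS)", "HBAO+", "4xMSAA", "On"),
}

# Example: which presets enable HBAO+?
print([p for p, (_, _, _, ao, _, _) in PRESETS.items() if ao == "HBAO+"])
# -> ['Very High', 'Ultra']
```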

The main things to note are that there's a rather noticeable difference between Low and High texture quality, but not so much from High to Ultra. Environmental quality has a generally minor effect on the appearance of the game, especially above Medium (though a few areas are exceptions to this statement). The difference between Low and High shadows is also quite small, but Soft Shadows implement PCSS (Percentage Closer Soft Shadows), which looks quite nice while also causing a moderate performance hit.

Anti-aliasing has a ton of settings, but the most useful are generally the MSAA options; those are also the most demanding. FXAA is, as usual, nearly "free" to enable and helps remove jaggies, though it also softens some fine image detail; for many systems it may be the best compromise. TXAA performance appears to be roughly similar to 4xMSAA, which means it's mostly for high-end rigs. Bloom is always on except at the lowest setting. Finally, ambient occlusion has two options besides Off: SSAO and HBAO+. NVIDIA developed HBAO+ as an improved form of ambient occlusion, and in general I think they succeeded. It's also supposed to be faster than SSAO, at least on NVIDIA GPUs, so if you have NVIDIA hardware you'll probably want to enable it.

Looking at the presets, the difference between Ultra and Very High is visible in the right areas (e.g. places with shadows), but overall they're pretty similar. There's a more noticeable drop from Very High to High, mostly due to the change in textures, and at least in our test images the Medium and High settings look almost the same.

There are a few last items to note on benchmarking, just by way of reference. First, Assassin's Creed: Unity uses "dynamic" day/night cycles. They're not truly dynamic; Ubisoft has four preset times: morning, noon, dusk, and night. This matters because benchmarking the same sequence at different times of day can produce quite different results. There's also "dynamic" weather (or at least clouds) that can throw things off. Second, if you change certain image quality settings, specifically Texture Quality, you have to restart the game for the changes to take effect. Last, the game has dynamic crowds, which means the runs aren't fully deterministic, but in repeat testing the variance is generally less than 3% and usually closer to 1%.
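
To make that last point concrete, here's a minimal sketch of how run-to-run variation might be quantified; the FPS values below are invented for illustration, not measured results.

```python
# Minimal sketch: quantifying run-to-run benchmark variation.
# The FPS values below are invented for illustration only.
from statistics import mean, stdev

runs = [52.1, 52.6, 51.9, 52.4, 52.3]  # average FPS from five repeat runs

variation_pct = stdev(runs) / mean(runs) * 100  # coefficient of variation
print(f"mean: {mean(runs):.1f} FPS, run-to-run variation: {variation_pct:.2f}%")
# With the dynamic crowds and clouds we expect this to stay under ~3%,
# and usually closer to 1%.
```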

The good news is that the game always loads at the morning time slot, so to keep conditions consistent you basically have to exit and reload between every setting change. Yes, it's quite tedious if you're benchmarking a dozen or so GPUs…

Comments

  • silverblue - Saturday, November 22, 2014

    Understood - definite incompetence, and on a grand scale too, considering somebody with multiple cards has put x times the money into the vendor compared with somebody who purchased just the one. I would find it hard to believe that they were unaware from their own internal testing. There's the possibility that whoever presides over this was given their marching orders and AMD set about fixing the damage, but I guess we'll never know.

    I apologise for the pedantry as well.
  • D. Lister - Saturday, November 22, 2014

    No problem at all; it takes a big man to accept an opposing argument with such candor - well done.
  • FlushedBubblyJock - Wednesday, November 26, 2014

    It's AMD's responsibility to work with game devs to make certain their cards work properly.
    Of course, AMD has been notorious for not doing that for many, many years, and then of course it's nVidia's fault.
    AMD might do well to say: "We take full responsibility."
    That would mean, of course, having the Catalyst makers do more than emailing and whining - showing up at game dev studios, taking an active hand, and having game-day drivers ready.
    Of course, if they did that, what would their fans, with their unbelievably misplaced blame, have to do?
    I mean seriously, it's as bad as the worst politicians we've ever seen, pointing fingers in every direction but their own.
  • Lerianis - Friday, November 28, 2014

    Agreed... it should be a year and a half at least for a game of this scale, given the manpower allotted to Ubisoft Montreal.
  • JarredWalton - Thursday, November 20, 2014

    1440p High is probably playable on a single GTX 980 -- I just ran the GTX 970 on that and got results of 30.4/23.6 avg/min FPS, which is about 40% faster (44% to be precise) on average FPS and 65% faster on minimum FPS than the same card at our 1440p Ultra settings. If the 980 sees the same scaling, it will be around 35/26 FPS at 1440p High. There's not a huge difference in performance or quality between the High and Medium presets, which means you really would need to drop to Low (or close to it) for 4K gaming.
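
    As a quick back-of-the-envelope check of that scaling math (a sketch in Python: the 44%/65% speedups and the GTX 970 numbers come from above, while the GTX 980 Ultra baseline is back-solved from the ~35/26 estimate rather than read from the charts):

    ```python
    # Sanity-check of the Ultra -> High scaling projection above.
    # The 1.44/1.65 speedups and GTX 970 numbers come from the comment;
    # the GTX 980 Ultra baseline (24.3/15.8) is back-solved from the
    # ~35/26 estimate, NOT taken from the article's charts.
    AVG_GAIN, MIN_GAIN = 1.44, 1.65  # Ultra -> High speedups on the GTX 970

    # GTX 970, 1440p High (measured): 30.4 avg / 23.6 min FPS
    # Implied GTX 970 Ultra baseline:
    print(f"970 Ultra (implied): {30.4 / AVG_GAIN:.1f} / {23.6 / MIN_GAIN:.1f} FPS")

    # Hypothetical GTX 980 Ultra baseline consistent with the estimate:
    avg_ultra_980, min_ultra_980 = 24.3, 15.8
    print(f"980 High (projected): {avg_ultra_980 * AVG_GAIN:.1f} / "
          f"{min_ultra_980 * MIN_GAIN:.1f} FPS")  # ~35.0 / ~26.1
    ```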

    Why did I test these settings? Because you have to choose something, and we generally go with "Ultra" at 1440p -- just to see how the GPUs fare. I've tested 4K at Ultra in the past, but that was completely unplayable across the board so I dropped to High for this game. If I had dropped 1440p to High, I'm sure I'd get people wanting to see Ultra numbers -- you can't please everyone.

    Anyway, as someone that has a 4K display, I can tell you I'd rather play at 1440p or even 1080p with better graphics quality (High, Very High, or Ultra) than run at native 4K with the Low/Medium settings. YMMV.
  • AnnonymousCoward - Saturday, November 22, 2014

    IMHO, as a 30" owner I'm more interested in 2560-benchmarks at a quality setting that gives 60fps on non-SLI cards.
  • Akrovah - Thursday, November 20, 2014

    I disagree. I find 30 perfectly playable. That's the effective frame rate of television. Movies are 24, and nobody has issues with them not being "smooth enough." Heck, people almost got out pitchforks when someone dared film a movie at 48 fps.

    I mean yes, for gaming 60 fps is preferable and looks and feels better, but to call anything under that "awful" is going a little far. Especially when the game in question is not a twitch shooter. Action/adventure games like Assassin's Creed are perfectly enjoyable at 30 fps.
  • HanzNFranzen - Thursday, November 20, 2014

    Well you know, this is the internet... comments must be exaggerated for effect. Either something is the greatest of all time or it's awful, never any middle ground. Anyways, I have a GTX 980 and a 5820K @ 4.0GHz, and I would say that my experience with "playability" in this game doesn't really mirror the benchmarks at 2560x1440/Ultra. Perhaps there are more taxing areas of the game that I haven't seen yet, but I'm not seeing frames dropping into the teens. I feel the controls hurt the playability of the game more than anything, as they just seem clunky.
  • theMillen - Friday, November 21, 2014

    Exactly my remarks: 3770K @ 4.8GHz and an EVGA 980 ACX OC'd to 1550, and at 1440/Ultra it is completely playable. I'm about 4 hours in and am completely satisfied with the results. Would I love to stay above 60fps at all times? Yes. Am I satisfied? Yup!
  • foxtrot1_1 - Thursday, November 20, 2014

    There is a big difference between passively watching a 24fps film and interacting with a 24fps video game. I'm far from a pedant on these things, yet I find anything under 45-50 fps distractingly unplayable.
