Image Quality and Settings

In retrospect, I probably should have just skipped the Ultra quality setting and opted for some form of custom settings. The texture data just overwhelms most GPUs at Ultra, and even High still struggles in many cases. Even more problematic is that there are only three texturing options: Low, High, and Ultra.

I also want to point you to NVIDIA's Assassin's Creed: Unity Graphics and Performance Guide; if you want a better look at what the various graphics options actually mean in terms of quality, that article has everything you need to know. One item particularly worth noting is that NVIDIA recommends Low textures for 2GB cards and High for 3GB cards, with Ultra reserved for 4GB cards (or maybe 6GB/8GB cards).

Anyway, here's a quick look at what the various presets do for quality. Let me start with a table showing what specific settings are applied for each of the presets. Again, the NVIDIA page linked above has a good explanation for what each of the settings does, and more importantly it has image sliders to let you do A/B comparisons for each setting. (Disregard their AA images, though, as it looks like they used 2560x1440 and shrunk them to 1080p – oops.)

Assassin's Creed: Unity Image Quality Presets

Setting             Low      Medium     High       Very High     Ultra
Environmental       Low      Medium     High       Very High     Ultra
Texture             Low      High       High       Ultra         Ultra
Shadow              Low      Low        High       High          Soft (PCSS)
Ambient Occlusion   Off      SSAO       SSAO       HBAO+         HBAO+
Anti-Aliasing       Off      FXAA       2xMSAA     2xMSAA        4xMSAA
Bloom               Off      On         On         On            On

The main things to note are that there's a rather noticeable difference between Low and High texture quality, but not so much from High to Ultra. Environmental quality has a generally minor effect on the appearance of the game, especially above Medium (though a few areas are exceptions to this). The difference between Low and High shadows is also quite small, but the Soft Shadows setting implements PCSS (Percentage Closer Soft Shadows), which looks quite nice while also causing a moderate performance hit.
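
If you want to script benchmark passes across these presets, here's a minimal sketch of the preset table above expressed as a plain configuration mapping in Python; the key names are my own shorthand, not the game's internal setting identifiers.

```python
# The preset table expressed as a dictionary, e.g. for driving automated benchmark
# runs or logging which settings were active. Keys are illustrative shorthand only.
PRESETS = {
    "Low":       {"environment": "Low",       "texture": "Low",   "shadow": "Low",
                  "ao": "Off",   "aa": "Off",    "bloom": False},
    "Medium":    {"environment": "Medium",    "texture": "High",  "shadow": "Low",
                  "ao": "SSAO",  "aa": "FXAA",   "bloom": True},
    "High":      {"environment": "High",      "texture": "High",  "shadow": "High",
                  "ao": "SSAO",  "aa": "2xMSAA", "bloom": True},
    "Very High": {"environment": "Very High", "texture": "Ultra", "shadow": "High",
                  "ao": "HBAO+", "aa": "2xMSAA", "bloom": True},
    "Ultra":     {"environment": "Ultra",     "texture": "Ultra", "shadow": "Soft (PCSS)",
                  "ao": "HBAO+", "aa": "4xMSAA", "bloom": True},
}

# Quick summary of the settings that matter most for image quality and performance.
for name, cfg in PRESETS.items():
    print(f"{name:>9}: textures={cfg['texture']}, AA={cfg['aa']}, AO={cfg['ao']}")
```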

Anti-aliasing has a ton of settings, but the most useful are generally the MSAA options; those are also the most demanding. FXAA is, as usual, nearly "free" to enable and helps remove jaggies, though it also softens some fine image detail; for many systems it may be the best compromise. TXAA performance appears to be roughly on par with 4xMSAA, which means it's mostly for high-end rigs. Bloom is pretty much always on except at the lowest setting. Finally, ambient occlusion has two options besides Off: SSAO and HBAO+. NVIDIA developed HBAO+ as an improved form of ambient occlusion, and in general I think they're right. It's also supposed to be faster than SSAO, at least on NVIDIA GPUs, so if you have NVIDIA hardware you'll probably want to enable it.

Looking at the presets, the difference between Ultra and Very High is visible in the right areas (e.g. places with shadows), but overall they're pretty similar. There's a more noticeable drop from Very High to High, mostly due to the change in textures, and at least in our test images the Medium and High settings look almost the same.

There are a few last items to note on benchmarking, just by way of reference. First, Assassin's Creed: Unity uses "dynamic" day/night cycles. They're not truly dynamic; Ubisoft has four preset times of day: morning, noon, dusk, and night. This is important because benchmarking the same sequence at different times of day can produce quite different results. There's also "dynamic" weather (or at least clouds) that can throw things off. Second, if you change certain image quality settings, specifically Texture Quality, you have to restart the game for the changes to take effect. Last, the game has dynamic crowds, which means the runs aren't fully deterministic, but in repeat testing the variance is generally less than 3% and usually closer to 1%.
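
To put that variance figure in context, here's a minimal sketch of how run-to-run variation can be quantified from repeated benchmark passes; the FPS numbers below are hypothetical, purely to illustrate the calculation.

```python
# Hypothetical average FPS results from five repeated runs of the same benchmark
# sequence (same GPU, same settings, same time-of-day slot).
fps_runs = [47.2, 46.8, 47.5, 46.9, 47.1]

mean_fps = sum(fps_runs) / len(fps_runs)
# Spread between the best and worst run, expressed as a percentage of the mean.
spread_pct = (max(fps_runs) - min(fps_runs)) / mean_fps * 100

print(f"Mean: {mean_fps:.1f} FPS, run-to-run spread: {spread_pct:.1f}%")
```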

The good news is that the game always loads at the morning time slot; the catch is that to keep conditions consistent you basically have to exit and reload between every setting change. Yes, it's quite tedious if you're benchmarking a dozen or so GPUs…

122 Comments

  • poohbear - Friday, November 21, 2014 - link

    let's be honest, this is a poorly optimized game with an enormous number of bugs that was so ridiculously messed up it made the BBC news and their shares dropped 11%! It's a complete debacle.
  • dwade123 - Friday, November 21, 2014 - link

    Good thing I didn't buy a GTX 980 for $460. It can't run next-gen ports maxed comfortably. Bring out the real next-gen GPUs!
  • maroon1 - Friday, November 21, 2014 - link

    Core i3 4130 with GTX 750 Ti runs this game as well as the console version

    Eurogamer did a test matching the PC's graphics quality to the console version (running at 900p with settings similar to the PS4), and the result was that the GTX 750 Ti plays it as well if not slightly better.
  • cmdrmonkey - Friday, November 21, 2014 - link

    When a game is barely playable on the most high-end video cards on the market at resolutions and settings PC gamers are accustomed to, you have utterly failed. Bravo Ubisoft. Bravo.
  • P39Airacobra - Friday, November 21, 2014 - link

    You can forget about Ubicrap fixing this! This is why Ubicrap gave the unreal PC requirements! They are getting money from GPU/CPU hardware makers to help market for them! And they don't care to spend more money on us scum customers anyway! So I say XXXXXXXXXXXX UBICRAP!!!!!
  • P39Airacobra - Friday, November 21, 2014 - link

    They should be arrested for doing this!
  • mr. president - Sunday, November 23, 2014 - link

    Any chance of testing CPU performance on AMD vs nvidia GPUs? I've seen a *ton* of recent games underperform on AMD GPUs due to what I think is their lack of support for deferred contexts aka 'multithreaded rendering'. It's particularly low-end CPUs that are affected.

    Unity pushes something like 50,000 draw calls each frame. Note the enormous disparity in minimum framerates between the two vendors at 1080p/medium, where even slower nvidia GPUs get higher minimums than faster AMD GPUs. I think it's worth exploring, as even low-end FX CPUs can almost double their performance on high-end nvidia GPUs vs. high-end AMD GPUs.
  • FlushedBubblyJock - Tuesday, November 25, 2014 - link

    That last line you have tells me AMD is offloading multiple boatloads of work to the cpu --- isn't that exactly why Mantle is for low end cpu's - it relieves the gigantic overburdening cheaty normal driver of AMD that hammers the puny AMD cpus.

    It's sad really - shortcuts and angles and scammy drivers that really only hurt everyone.
  • RafaelHerschel - Sunday, November 23, 2014 - link

    A few observations:

    60 frames per second isn’t some arbitrary value. With Vsync enabled and a refresh rate of 60Hz, dips below 60 fps are far more unpleasant. Adaptive Vsync addresses that but isn’t available to everybody. Disabling Vsync leads to screen tearing, which some people (me included) find extremely annoying.

    In a game, every frame consists of discrete information. In a movie, each frame is slightly blurred or at least partially blurred, a natural effect of capturing moving objects in a frame. For a game to feel fluid at 24 or 30 fps it needs to add artificial blurring.

    In movies each frame has the same length. In games the length of each frame is different. So even 60 fps can feel choppy.

    Different people have different sensibilities. I always notice a low frame rate and frame drops. A steady 60 fps with Vsync enabled works best for me. Anything below 50 fps (in a game) feels off to me and above 60 I don’t notice that much difference. Likewise for gaming and movies I use screens with a fast response time since ghosting really distracts me.

    I feel that with a decent system a 60 fps minimum should be attainable. What bugs me is that in some games lowering the quality settings has little impact on the minimum frame rate.

    I’m always surprised by blanket statements like “30 fps is perfectly playable”. Depending on the game, the settings, and the person playing, it’s often not. For me another factor is how close I am to the screen.
  • JarredWalton - Monday, November 24, 2014 - link

    FWIW, I've been playing games for about 35 years now (since I was 6, on a Magnavox Odyssey II), and when I say a game is "playable" at 40 FPS, what I'm saying is that someone with years of game playing behind them feels the game works fine at that frame rate. I've also played ACU for many hours at sub-60 FPS rates (without G-SYNC enabled) and didn't mind the experience. Of course I wasn't the one saying it was "perfectly playable" above, but it is most definitely playable and IMO acceptable performance. If you want *ideal*, which is completely different, then yes: 60+ FPS is what you want. But then there are those with LCDs running at 120Hz who would want even higher frame rates. YMMV.
