Image Quality and Settings

In retrospect, I probably should have skipped the Ultra quality setting and opted for some form of custom settings. The Ultra texture data simply overwhelms most GPUs, and even High still struggles in many cases. Even more problematic is that there are only three texture quality options: Low, High, and Ultra.

I also want to point you to NVIDIA's Assassin's Creed: Unity Graphics and Performance Guide; if you want a closer look at what the various graphics options actually mean for image quality, that article has everything you need to know. One item particularly worth noting is that NVIDIA recommends Low textures for 2GB cards, High for 3GB cards, and Ultra for 4GB cards (or maybe 6GB/8GB cards).
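To make that VRAM guidance concrete, here's a minimal sketch (purely my own illustration, not anything shipped with the game or part of NVIDIA's guide) that maps card memory to the recommended texture setting:

```python
# Illustrative helper only: it just encodes the VRAM guidance quoted above
# (2GB -> Low, 3GB -> High, 4GB or more -> Ultra).
def recommended_texture_quality(vram_gb: float) -> str:
    if vram_gb >= 4:
        return "Ultra"
    if vram_gb >= 3:
        return "High"
    return "Low"

for vram in (2, 3, 4, 6, 8):
    print(f"{vram}GB card -> {recommended_texture_quality(vram)} textures")
```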

Anyway, here's a quick look at what the various presets do for quality. Let me start with a table showing what specific settings are applied for each of the presets. Again, the NVIDIA page linked above has a good explanation for what each of the settings does, and more importantly it has image sliders to let you do A/B comparisons for each setting. (Disregard their AA images, though, as it looks like they used 2560x1440 and shrunk them to 1080p – oops.)

Assassin's Creed: Unity Image Quality Presets

Setting           | Low | Medium | High   | Very High | Ultra
Environmental     | Low | Medium | High   | Very High | Ultra
Texture           | Low | High   | High   | Ultra     | Ultra
Shadow            | Low | Low    | High   | High      | Soft (PCSS)
Ambient Occlusion | Off | SSAO   | SSAO   | HBAO+     | HBAO+
Anti-Aliasing     | Off | FXAA   | 2xMSAA | 2xMSAA    | 4xMSAA
Bloom             | Off | On     | On     | On        | On

The main things to note are that there's a rather noticeable difference between Low and High texture quality, but not so much from High to Ultra. Environmental quality has a generally minor effect on the appearance of the game, especially above Medium (though a few areas are exceptions to this). The difference between Low and High shadows is also quite small, but the Soft Shadows setting implements PCSS (Percentage Closer Soft Shadows), which looks quite nice while causing a moderate performance hit.
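If you want to script settings comparisons or batch benchmark runs, the preset table above can be captured as plain data; this is just a sketch, with keys mirroring the in-game labels:

```python
# Sketch of the preset table as a dictionary (labels mirror the in-game settings).
PRESETS = {
    "Low":       {"Environmental": "Low",       "Texture": "Low",   "Shadow": "Low",
                  "Ambient Occlusion": "Off",   "Anti-Aliasing": "Off",    "Bloom": "Off"},
    "Medium":    {"Environmental": "Medium",    "Texture": "High",  "Shadow": "Low",
                  "Ambient Occlusion": "SSAO",  "Anti-Aliasing": "FXAA",   "Bloom": "On"},
    "High":      {"Environmental": "High",      "Texture": "High",  "Shadow": "High",
                  "Ambient Occlusion": "SSAO",  "Anti-Aliasing": "2xMSAA", "Bloom": "On"},
    "Very High": {"Environmental": "Very High", "Texture": "Ultra", "Shadow": "High",
                  "Ambient Occlusion": "HBAO+", "Anti-Aliasing": "2xMSAA", "Bloom": "On"},
    "Ultra":     {"Environmental": "Ultra",     "Texture": "Ultra", "Shadow": "Soft (PCSS)",
                  "Ambient Occlusion": "HBAO+", "Anti-Aliasing": "4xMSAA", "Bloom": "On"},
}

def preset_diff(a: str, b: str) -> dict:
    """Return only the settings that differ between two presets."""
    return {k: (PRESETS[a][k], PRESETS[b][k])
            for k in PRESETS[a] if PRESETS[a][k] != PRESETS[b][k]}

print(preset_diff("Very High", "Ultra"))
```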

Anti-aliasing has a ton of settings, but the most useful are generally the MSAA options; those are also the most demanding. FXAA is, as usual, nearly "free" to enable, and while it softens some fine detail along with the jaggies, it may be the best overall compromise. TXAA performance is pretty similar to 4xMSAA, I think, which means it's mostly for high-end rigs. Bloom is on for every preset except Low. Finally, ambient occlusion has two options besides Off: SSAO and HBAO+. NVIDIA developed HBAO+ as an improved form of ambient occlusion, and in general I think they're right. It's also supposed to be faster than SSAO, at least on NVIDIA GPUs, so if you have NVIDIA hardware you'll probably want to enable it.
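As a quick cheat sheet, here's a sketch that just encodes the rough relationships described above; it isn't based on any measurements beyond what's stated in this article:

```python
# Illustrative only: FXAA is nearly free, MSAA scales up in cost, and TXAA is
# roughly as demanding as 4xMSAA.
AA_BY_COST = ["Off", "FXAA", "2xMSAA", "4xMSAA", "TXAA"]  # roughly cheapest to most demanding

def suggested_ao(gpu_vendor: str) -> str:
    # HBAO+ reportedly looks better and runs faster than SSAO on NVIDIA hardware;
    # on other GPUs, SSAO is the more conservative pick.
    return "HBAO+" if gpu_vendor.lower() == "nvidia" else "SSAO"

print(suggested_ao("NVIDIA"))  # -> HBAO+
```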

Looking at the presets, the difference between Ultra and Very High is visible in the right areas (e.g. places with shadows), but overall they're pretty similar. There's a more noticeable drop from Very High to High, mostly due to the change in textures, and at least for our test images the Medium and High settings look almost the same.

There are a few last items to note on benchmarking, just by way of reference. First, Assassin's Creed: Unity uses "dynamic" day/night cycles. They're not really dynamic, but Ubisoft has four preset times of day: morning, noon, dusk, and night. This matters because benchmarking the same sequence at different times of day can produce quite different results, and the "dynamic" weather (or at least clouds) can throw things off as well. Second, changing certain image quality settings, specifically Texture Quality, requires restarting the game for the change to take effect (more on that in a moment). Last, the game has dynamic crowds, which means the runs aren't fully deterministic, but in repeat testing the variance is generally less than 3% and usually closer to 1%.
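As a rough illustration of what "variance under 3%" means in practice, here's a minimal sketch that computes average/minimum FPS from per-frame times and the spread across repeat runs; all the numbers in it are hypothetical:

```python
# Sketch of a run-to-run variance check across repeated benchmark passes.
# The frame times and FPS values below are hypothetical examples.
from statistics import mean

def fps_stats(frame_times_ms):
    """Average and minimum FPS from a list of per-frame times in milliseconds."""
    fps = [1000.0 / t for t in frame_times_ms]
    return mean(fps), min(fps)

def run_to_run_spread(avg_fps_per_run):
    """Relative spread of average FPS across repeated runs, as a percentage."""
    lo, hi = min(avg_fps_per_run), max(avg_fps_per_run)
    return (hi - lo) / lo * 100.0

avg_fps, min_fps = fps_stats([16.4, 16.7, 15.9, 17.2])  # one hypothetical pass
runs = [61.2, 60.7, 61.5]  # hypothetical average FPS from three repeat runs
print(f"Run-to-run spread: {run_to_run_spread(runs):.1f}%")  # should stay under ~3%
```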

The good news is that whenever you load up the game, it always starts at the morning time slot, so to keep the time of day consistent you basically just have to exit and reload between every setting change. Yes, it's quite tedious if you're benchmarking a dozen or so GPUs...

Comments

  • FlushedBubblyJock - Thursday, November 20, 2014 - link

    Well, only nVidia stock holders, since AMD is the pit of Hades, deep in the red and hollowing out everyone's investment pockets.
  • Dribble - Thursday, November 20, 2014 - link

    Looks like it's basically CPU limited. The difference between Ultra and Medium is only a few FPS for something like a 970 at 1080p. It would be interesting to try it with a 6 or 8 core Intel processor and see how it scales with more cores.
  • JarredWalton - Thursday, November 20, 2014 - link

    On which setup are you seeing "only a few FPS"? 1080p Medium is 22% faster with 970 SLI on average FPS and 31% faster on minimums, and a single 970 is 49% faster average and minimum on Medium vs. Ultra. That's far more than a few FPS.

    The gap between Medium and High is much smaller, but then they both use High quality textures and honestly they look very similar. There the performance is only about 10-30% faster (depending on GPU), though minimums still heavily favor cards with more than 2GB of VRAM.
  • Dribble - Thursday, November 20, 2014 - link

    Well, I'd expect a bigger performance difference between Medium and Ultra. Looking at the CPUs, the 4-core pretty well doubles the 2-core's minimum frame rates, which shows the CPU is having a much bigger impact. If that's the case, what would 6 or 8 cores do?
  • JumpingJack - Thursday, November 20, 2014 - link

    Hahaha, we have a new king .... "but can it run Assassins Creed Unity"
  • Calista - Thursday, November 20, 2014 - link

    If you have the time, I would like you to test further at even lower resolutions. There's not much point knowing GPU X can do 18 fps at 1080p, since it's much easier to adapt to a lower resolution than to a lower frame rate. Maybe you could take the slowest of the bunch and try out 1600x900 and 1280x720 as well? If the system is still up and running, I guess it wouldn't take much more than a few hours.
  • JarredWalton - Thursday, November 20, 2014 - link

    I did run 768p Low on most of the GPUs... I don't want to make a graph because really, desktop users don't want to run games below 1080p IMO. But if you're wondering about the laptops and lower end hardware...

    Performance at 768p Low (Avg/Min):
    860M: 35/25
    870M: 45/32
    880M: 49/37
    980M: 56/40
    R7-250X: 25/12
    R9-280: 37/24
    R9-280X: 43/26
    R9-290X: 49/27
    Intel HD 4600: 6.6/3.2 (Hahaha...)

    Of those, I should note that only the 860M and 250X are unable to hit "playable" frame rates at 900p Medium.
  • huaxshin - Thursday, November 20, 2014 - link

    The CPU plays a big role in Assassin's Creed: Unity, so the GTX 980M comparison against the desktop GPUs is skewed. The desktop GPUs are paired with 84W+ CPUs while the GTX 980M is paired with a 47W soldered, lower clocked CPU.

    I expect the GTX 980M to be closer to the GTX 780 if they ran at the same clocks. Something that would be interesting to see from AnandTech: a review of the GTX 980M against desktop cards if both had roughly the same CPU power.
    http://gamegpu.ru/images/remote/http--www.gamegpu....
  • JarredWalton - Thursday, November 20, 2014 - link

    The i3-4330 numbers are there for a look at where the CPU bottleneck would lie on lower end CPUs. I would guess that the mobile quad-core CPUs like the i7-4710HQ are generally keeping the GPU "filled" with work. 980M might be a bit faster with a higher clocked CPU, but I don't think it would come anywhere near the 780 or 970.

    I've got some numbers and basically across a large selection of games the 780 (with a desktop CPU compared to a mobile CPU) is around 25% faster than the 980M (and the 780 and 970 are basically tied in overall rankings -- like literally within 0.1% of each other).
  • anubis44 - Thursday, November 20, 2014 - link

    Jarred, I'd like to see these benchmarks on an AMD FX CPU as well. Forget the APUs, as they don't have level 3 cache, but the FX chips do.
