Image Quality and Settings

In retrospect, I probably should have just skipped the Ultra quality setting and opted for some form of custom settings. The texture data just overwhelms most GPUs at Ultra, and even High still struggles in many cases. Even more problematic is that there are only three texturing options: Low, High, and Ultra.

I also want to point you to NVIDIA's Assassin's Creed: Unity Graphics and Performance Guide; if you want a better look at what the various graphics options actually mean in terms of quality, that article has everything you need to know. One item particularly worth noting is that NVIDIA recommends Low textures for 2GB cards, High for 3GB cards, and Ultra for 4GB cards (or maybe 6GB/8GB cards).
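To put that rule of thumb in concrete terms, here's a minimal sketch (my own illustration, not anything from NVIDIA's guide) that maps a card's VRAM to the texture setting suggested above:

```python
def recommended_texture_quality(vram_gb: float) -> str:
    """Map a card's VRAM to a texture setting, per the rule of thumb
    quoted above: 2GB -> Low, 3GB -> High, 4GB and up -> Ultra."""
    if vram_gb >= 4:
        return "Ultra"
    if vram_gb >= 3:
        return "High"
    return "Low"

# e.g. a 2GB GTX 770 lands on Low, a 3GB GTX 780 on High, a 4GB GTX 980 on Ultra
for vram in (2, 3, 4, 6):
    print(f"{vram}GB -> {recommended_texture_quality(vram)}")
```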

Anyway, here's a quick look at what the various presets do for quality. Let me start with a table showing what specific settings are applied for each of the presets. Again, the NVIDIA page linked above has a good explanation for what each of the settings does, and more importantly it has image sliders to let you do A/B comparisons for each setting. (Disregard their AA images, though, as it looks like they used 2560x1440 and shrunk them to 1080p – oops.)

Assassin's Creed: Unity Image Quality Presets

Setting                Low     Medium   High     Very High   Ultra
Environmental Quality  Low     Medium   High     Very High   Ultra
Texture Quality        Low     High     High     Ultra       Ultra
Shadow Quality         Low     Low      High     High        Soft (PCSS)
Ambient Occlusion      Off     SSAO     SSAO     HBAO+       HBAO+
Anti-Aliasing          Off     FXAA     2xMSAA   2xMSAA      4xMSAA
Bloom                  Off     On       On       On          On

The main thing to note is that there's a rather noticeable difference between Low and High texture quality, but not so much from High to Ultra. Environmental quality has a generally minor effect on the appearance of the game, especially above Medium (though there are a few areas that are exceptions). The difference between Low and High shadows is also quite small, but Soft Shadows implements PCSS (Percentage Closer Soft Shadows), which looks quite nice while also causing a moderate performance hit.

Anti-aliasing has a ton of settings, but the most useful are generally the MSAA options; they're also the most demanding. FXAA is, as usual, nearly "free" to enable and can help remove jaggies (along with a bit of fine detail), which might make it the best solution. TXAA performance is pretty similar to 4xMSAA, I think, which means it's mostly for high-end rigs. Bloom is on at every preset except the lowest. Finally, ambient occlusion has two options besides Off: SSAO and HBAO+. NVIDIA developed HBAO+ as an improved version of ambient occlusion, and in general I think they're right. It's also supposed to be faster than SSAO, at least on NVIDIA GPUs, so if you have NVIDIA hardware you'll probably want to enable it.
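For a rough sense of why the MSAA modes are the demanding ones, here's a back-of-envelope sketch (my own numbers, not Ubisoft's or NVIDIA's) of the extra render-target memory multisampling needs at 1080p, assuming 4 bytes of color and 4 bytes of depth per sample; a deferred renderer with multiple G-buffer targets would pay considerably more:

```python
def msaa_target_mb(width: int, height: int, samples: int,
                   bytes_per_sample: int = 8) -> float:
    """Approximate size of a multisampled color+depth render target.

    bytes_per_sample assumes 4 bytes of color plus 4 bytes of depth/stencil
    per sample; resolve targets, G-buffers, and post buffers are extra.
    """
    return width * height * samples * bytes_per_sample / (1024 ** 2)

for samples in (1, 2, 4):
    print(f"{samples}x MSAA at 1920x1080: ~{msaa_target_mb(1920, 1080, samples):.0f} MB")
# 1x: ~16 MB, 2x: ~32 MB, 4x: ~63 MB -- before any textures or geometry
```

Sixty-odd megabytes for a 4xMSAA target isn't huge on its own, but stack it on top of Ultra textures and the various resolve and post-processing buffers and a 2GB card runs out of room quickly.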

Looking at the presets, the difference between Ultra and Very High is visible in the right areas (e.g. places with shadows), but overall they're pretty similar. There's a more noticeable drop from Very High to High, mostly due to the change in textures, and at least for our test images the Medium and High settings look almost the same.

There are a few last items to note on benchmarking, just by way of reference. First, Assassin's Creed: Unity uses "dynamic" day/night cycles. They're not really dynamic, but Ubisoft has four preset times: morning, noon, dusk, and night. The reason this is important is that benchmarking the same sequence at different times of day can result in quite different results. There's also "dynamic" weather (or at least clouds) that can throw things off. Second, if you change certain image quality settings (which I'll get to next), specifically Texture Quality, you have to restart the game for the changes to take effect. Last, the game has dynamic crowds, which means the runs aren't fully deterministic, but in repeat testing the variance is generally less than 3% and closer to 1%.
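As a sanity check on that variance claim, here's roughly how the run-to-run spread can be quantified; the helper and the FPS numbers below are hypothetical, not our actual benchmark script:

```python
import statistics

def run_to_run_variation(fps_runs):
    """Return mean FPS and run-to-run variation (stdev as % of mean)."""
    mean = statistics.mean(fps_runs)
    return mean, statistics.stdev(fps_runs) / mean * 100

# Hypothetical results from three passes of the same benchmark sequence:
mean_fps, variation = run_to_run_variation([44.8, 45.3, 45.1])
print(f"mean: {mean_fps:.1f} FPS, variation: {variation:.1f}%")  # well under 3%
```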

The good news is that the game always loads at the morning time slot, so to keep results consistent you basically have to exit and reload between every setting change. Yes, it's quite tedious if you're benchmarking a dozen or so GPUs….

Comments

  • Jon Tseng - Thursday, November 20, 2014 - link

    Jarred, I know you didn't test for it, but any thoughts on how system memory affects things? The minimum is 6GB with 8GB recommended; I wonder what impact this has?

    (I've just gone from 4GB => 6GB to run this game; wondering if I need to replace the other two sticks too, or whether the fact that the swapfile will be on an SSD is enough.)
  • WatcherCK - Thursday, November 20, 2014 - link

    I was looking forward to TC: The Division, but given Ubisoft's recent track record and inherent game factors (new engine, MMO, RPG aspects) I'm just not sure that it will be anything except a colossal ballsup.
  • Mondozai - Thursday, November 20, 2014 - link

    I agree with many other commenters about the strangely sanguine tone of this article, breezing past the massive performance bottlenecks and urging people to upgrade their hardware instead of pointing the finger where it belongs (Ubisoft) and attacking them for releasing what is essentially a botched game in terms of performance. You should be running 60+ at high 1080p settings with a 290/780. Instead you barely get to 45 frames with a 780.

    The fact that even a 980(!) can't get over 60 fps at 1080p high means the game needs to be canned, not the reader base's hardware. Do better, Jarred.
  • JarredWalton - Thursday, November 20, 2014 - link

    That's certainly not what I'm doing. Just because the last sentence says, "And if this is the shape of things to come, a lot of people might want a GPU upgrade this holiday season" doesn't mean I'm telling everyone to upgrade. What I am saying is that IF you want to run THIS game (and IF other games end up having similar requirements in the near future), then yes, a lot of people will need new hardware (or lower quality settings).

    When Crysis came out, nothing -- NOTHING -- could run it properly at maximum quality settings. People skewered Crytek and said they were lousy programmers, etc. and "the game doesn't even look that good". And yet, I don't really think that was the case -- they just decided to enable settings that pushed beyond what was currently available.

    Is Ubisoft intentionally doing that with their latest releases? Perhaps not in quite the same way (it is the holiday season after all), but the decision to drop support for older generation consoles in order to enable a higher quality experience certainly wasn't made to improve the sales of the game. Believe it or not, there are game developers that just really want to use the latest and greatest technologies, performance be damned.

    Fundamentally, we're not a sensationalist website. We're not in the market of pointing fingers, casting blame, etc. All I can say is how the game works right now on the hardware I tested, and it's up to the readers to draw conclusions. Was the game pushed out early? Almost certainly. Should they design all games so that 1080p High gets 60+ FPS? I'm not one to dictate whether that's the best thing to do or not, and I do like seeing companies push the performance envelope on occasion.

    It hurts when your high-end GPU can't run a game with settings you are accustomed to using, but I do have to say that their recreation of eighteenth century France is quite remarkable.
  • mcmilhouse - Friday, November 21, 2014 - link

    ^This. Plus, the NVIDIA 900 series is still 28nm. We haven't had a 20nm card; Apple took all the TSMC production lines.
  • piroroadkill - Saturday, November 22, 2014 - link

    Crysis absolutely blew everything else away, graphically.

    That moment when you're done coming through the first forest and you hit the rays coming through the trees, and you look down over the cliffs.

    I don't think many people said it was coded badly (although they probably did), but it was such an incredible step up visually that people really took notice.

    Assassin's Creed Unity may also be a fantastic game visually, and I will get it at some point, but the fact is, console hardware is a measly set of Jaguar cores and low to midrange previous generation Radeons.

    People are right to expect their massively more powerful machine could run the game at 60 FPS.
  • Milite777 - Thursday, November 20, 2014 - link

    I've just a laptop with a 2nd gen i7, 8GB RAM, and a 6770M (2GB VRAM). I know this config is too poor for any serious gaming session... but I'd like to play ACU like I did the previous episodes... Could I get at least 25 fps with the lowest settings and a resolution of 1366x768? I don't need the best graphics, I just want to know the story... And of course I have to buy the game to try it... Need help guys :)
  • chizow - Thursday, November 20, 2014 - link

    Interesting findings with the texture setting, Jarred; it looks like Santa (Ryan?) sent you some early X-mas presents too with the GPU upgrades. I would also be interested to see a kind of "feature expense" comparison, where you go through some of the main settings to give an idea of what kind of perf hit you take when enabling them at different settings.

    For example, I remember a time when setting textures to max was automatic, but now, with 2K and even 4K textures in next-gen console ports, that's no longer possible since those textures fill VRAM in a heartbeat. Also, did you have any >4GB cards or high-bandwidth cards to test, to see if they helped with the texture situation at all? Like a Titan Black?

    But lately I have seen textures and MSAA creating a much bigger perf hit than in the past due to the amount of VRAM they take up. There was a time when VRAM didn't make as much of a difference as shading power, and you could just crank up the textures and use MSAA without the crazy hits to perf we see today.
  • iceveiled - Thursday, November 20, 2014 - link

    I opted to turn everything up to max on my 970 (but with soft shadows turned off) and with a 30 fps locked frame rate (1080p). It plays butter smooth, but man, if any game benefits from a 60 fps frame rate it's Assassin's Creed, with its wonky input lag (I play it with a controller) and even wonkier world traversal / parkour.

    Takes a bit of getting used to, but at 30 fps it ain't all that bad and it's a damn nice looking game with the settings maxed out.
  • D. Lister - Friday, November 21, 2014 - link

    Ubisoft is just forming a pattern here of poorly optimised software. They have some of the best artists, but apparently some of the worst software developers. Also, I don't believe them for a second when they try to offload their incompetence on a hardware manufacturer.
