Test System and Benchmarks

With that introduction out of the way, let's just get straight to the benchmarks, and then I'll follow up with a discussion of image quality and other aspects at the end. As usual, the test system is what I personally use, which is a relatively high-end Haswell configuration. Most of the hardware was purchased at retail over the past year or so, and that means I don't have access to every GPU configuration available, but I did just get a second ZOTAC GTX 970 so I can at least finally provide some SLI numbers (which I'll add to the previous Benchmarked articles in the near future).

Gaming Benchmarks Test Systems

CPU: Intel Core i7-4770K (4x 3.5-3.9GHz, 8MB L3)
     - Overclocked to 4.1GHz
     - Underclocked to 3.5GHz with two cores ("i3-4330")
Motherboard: Gigabyte G1.Sniper M5 Z87
Memory: 2x8GB Corsair Vengeance Pro DDR3-1866 CL9
GPUs:
     Desktop GPUs:
     - Sapphire Radeon R9 280
     - Sapphire Radeon R9 280X
     - Gigabyte Radeon R9 290X
     - EVGA GeForce GTX 770
     - EVGA GeForce GTX 780
     - Zotac GeForce GTX 970
     - Reference GeForce GTX 980
     Laptops:
     - GeForce GTX 980M (MSI GT72 Dominator Pro)
     - GeForce GTX 880M (MSI GT70 Dominator Pro)
     - GeForce GTX 870M (MSI GS60 Ghost 3K Pro)
     - GeForce GTX 860M (MSI GE60 Apache Pro)
Storage: Corsair Neutron GTX 480GB
Power Supply: Rosewill Capstone 1000M
Case: Corsair Obsidian 350D
Operating System: Windows 7 64-bit

We're testing with NVIDIA's 344.65 drivers, which are "Game Ready" for Assassin's Creed: Unity. (I also ran a couple sanity checks with the latest 344.75 drivers and found no difference in performance.) On the AMD side, testing was done with the Catalyst 14.11.2 driver that was released to better support ACU. AMD also released a new beta driver for Far Cry 4 and Dragon Age: Inquisition (14.11.2B), but I have not had a chance to check performance with that yet. No mention is made of improvements for ACU with the driver, so it should be the same as the 14.11.2 driver we used.

One final note: thanks to the unlocked nature of the i7-4770K and the Gigabyte motherboard BIOS, I'm able to at least mostly simulate lower-performance Haswell CPUs. I didn't run a full suite of tests with a second "virtual" CPU, but I did configure the i7-4770K to run similarly to a Core i3-4330 (3.5GHz, 2C/4T) – the main difference being that the simulated CPU still has 8MB of L3 cache where the i3-4330 only has 4MB. I tested just one GPU with the slower CPU configuration, the GeForce GTX 980, but this should be the best-case result for what you could get from a Core i3-4330.

Assassin's Creed: Unity 4K High

Assassin's Creed: Unity QHD Ultra

Assassin's Creed: Unity 1080p Ultra

Assassin's Creed: Unity 1080p High

Assassin's Creed: Unity 1080p Medium

Did I mention that Assassin's Creed: Unity is a beast to run? Yeah, OUCH! 4K gaming is basically out of the question on current hardware, and even QHD is too much at the default Ultra settings. Also notice how badly the GTX 770 does at the Ultra settings, which appears to be due to its 2GB of VRAM; I logged system usage for the GTX 770 at QHD Ultra and found that the game was trying to allocate nearly 3GB of VRAM, which on a 2GB card means there's going to be a lot of texture thrashing. (4K with High quality also uses around 3GB of VRAM, if you're wondering.) The asterisk on that result is there because I couldn't actually run the benchmark, so I used a "Synchronize" from the top of a tower instead, which is typically slightly less demanding than our actual benchmark run.
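
For anyone wondering how that sort of VRAM figure gets collected, it's really just a matter of polling the GPU's reported memory use while the game runs. I won't claim this is the exact tool used for the numbers above, but here's a minimal sketch of that kind of logging in Python, assuming an NVIDIA card with nvidia-smi on the PATH (the output filename and one-second interval are arbitrary):

    # vram_log.py - log reported VRAM usage once per second while a game runs.
    # A minimal sketch, assuming an NVIDIA GPU and nvidia-smi on the PATH;
    # not necessarily the tool used for the numbers quoted in this article.
    import csv
    import subprocess
    import time

    with open("vram_log.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["timestamp", "vram_used_mb"])
        while True:  # stop with Ctrl+C once the test run is finished
            used_mb = subprocess.check_output(
                ["nvidia-smi", "--query-gpu=memory.used",
                 "--format=csv,noheader,nounits"],
                text=True).strip()
            writer.writerow([time.strftime("%H:%M:%S"), used_mb])
            f.flush()
            time.sleep(1)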

Anyway, all of the single GPUs are basically unplayable at QHD Ultra settings, and a big part of that looks to be the higher-resolution textures. Dropping the texture quality to High can help, but really the game needs a ton of GPU horsepower to make QHD playable. GTX 970 SLI basically gets there, though again I'd suggest dropping the texture quality to High in order to keep minimum frame rates closer to 30. Even at 1080p, I'd suggest avoiding the Ultra setting – or at least Ultra texture quality – as there's just a lot of stutter. Sadly, the GTX 980M and 880M both have 8GB GDDR5, but their performance at Ultra settings is too low to really be viable, though they do show slightly better minimums relative to the other GPUs.

As we continue down the charts, NVIDIA's GTX 780 and 970 (and faster) cards finally reach the point where performance is totally acceptable at 1080p High (and you can tweak a few settings like turning on HBAO+ and Soft Shadows without too much trouble). What's scary is that looking at the minimum frame rates along with the average FPS, the vast majority of GPUs are still struggling at 1080p High, and it's really only 1080p Medium where most midrange and above GPUs reach the point of playability.
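
A quick aside on where the average and minimum figures in these charts come from: each run produces a log of per-frame render times, which then gets boiled down to an average FPS and some flavor of minimum FPS. The sketch below shows one common way to do that reduction; it assumes a FRAPS-style "frametimes.csv" with one frame time in milliseconds per line, and it reports the slowest 1% of frames as the minimum, which is not necessarily how the numbers in these charts were computed.

    # fps_summary.py - reduce a per-frame time log to average and minimum FPS.
    # A sketch assuming a FRAPS-style "frametimes.csv" with one frame time in
    # milliseconds per line; the article's own capture and reduction may differ.
    import csv

    with open("frametimes.csv") as f:
        frame_ms = [float(row[0]) for row in csv.reader(f) if row]

    avg_fps = 1000.0 * len(frame_ms) / sum(frame_ms)

    # Average the slowest 1% of frames rather than taking a single outlier
    # frame; this is one common way of reporting a "minimum" frame rate.
    worst = sorted(frame_ms, reverse=True)[:max(1, len(frame_ms) // 100)]
    min_fps = 1000.0 * len(worst) / sum(worst)

    print(f"Average FPS: {avg_fps:.1f}")
    print(f"Minimum FPS (slowest 1%): {min_fps:.1f}")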

There's a secondary aspect to the charts that you've probably noticed as well. Sadly, AMD's GPUs really don't do well right now with Assassin's Creed: Unity. Some of it is almost certainly drivers, and some of it may be due to the way things like GameWorks come into play. Whatever the cause, ACU is not going to be a great experience on any of the Radeon GPUs right now.

I did some testing of CrossFire R9 290X as well, and while it ran, performance was no better than a single 290X – and minimum frame rates were actually lower – so CrossFire (without any attempt to create a custom profile) isn't viable yet. Also note that while SLI "works", there are rendering issues at times. Entering/exiting the menu/map, or basically any time there's a full-screen post-processing filter, you get severe flicker (a good example is when you jump off a tower into a hay cart: you'll notice flicker around the periphery as well as on Arno's clothing). I believe these issues happen on all the multi-GPU rigs, so it might be more of a game issue than a driver issue.

I even went all the way down to 1600x900 Medium to see if that would help any of AMD's GPUs; average frame rates on the R9 290X basically top out at 48 FPS with minimums still at 25 or so. I did similar testing on NVIDIA and found that with the overclocked i7-4770K, ACU maxes out at just over 75 FPS with minimums of 50+ FPS. We'll have to see if AMD and/or Ubisoft Montreal can get things working better on Radeon GPUs, but for now it's pretty rough. That's not to say the game is unplayable on an R9 290X, as you can certainly run 1080p High, but there are going to be occasional stutters. Anything less than the R9 290/290X and you'll basically want to use Low or Medium quality (with some tweaking).

Finally, I mentioned how 2GB GPUs are really going to have problems, especially at higher texture quality settings. The GeForce GTX 770 is a prime example of this; even at 1080p High, minimum frame rates are consistently dropping into the low teens and occasionally even single digits, and Medium quality still has very poor minimum frame rates. Interestingly, at 1600x900 Medium the minimum FPS basically triples compared to 1080p Medium, so if the game is using more than 2GB VRAM at 1080p Medium it's not by much. This also affects the GTX 860M (1366x768 Low is pretty much what you need to run on that GPU), and the 1GB R7 250X can't even handle that. And it probably goes without saying, but Intel's HD 4600 completely chokes with ACU – 3-7 FPS at 1366x768 is all it can manage.

What About the CPU?

I mentioned earlier that I also underclocked the Core i7-4770K and disabled a couple CPU cores to simulate a Core i3-4330. It's not a fully accurate simulation, but just by way of reference, the multi-threaded Cinebench 11.5 score went from 8.08 down to 3.73, which looks about right, give or take a few percent. I only tested the GTX 980 with the slower CPU, but this is basically the "best case" for what a Core i3 could do.

Looking at the above 1080p charts, you can see that with the slower CPU the GTX 980 takes quite the hit to performance. In fact, the GTX 980 with a "Core i3" Haswell CPU starts looking an awful lot like the R9 290X: it's playable in a pinch, but the minimum frame rates will definitely create some choppiness at times. I don't have an AMD rig handy to do any testing, unfortunately, but I'd be surprised if the APUs are much faster than the Core i3.

In short, not only do you need a fast GPU, but you also need a fast CPU. And the "just get a $300 console" argument doesn't really work either, as frame rates on the consoles aren't particularly stellar from what I've read. At least one site has found that both the PS4 and Xbox One fail to maintain a consistent 30 FPS or higher frame rate.

Comments

  • RafaelHerschel - Monday, November 24, 2014 - link

    I don’t mind somebody saying: “this game is perfectly playable for me at 40 fps”. I do mind it if people say that there is no perceivable difference between 40 fps and 60 fps (as stated in the comments) or when people say “the game runs smooth as butter” when it doesn't. The article was fair, some of the comments weren't.

    For me a game is not enjoyable at anything below 50 fps and I much prefer it to have Vsync enabled.

    I would say that most people accept 60 fps as a reasonable goal at medium settings (whatever they may be) with a high-end GPU. Depending on personal taste (graphics settings) and budget, people can then choose to sacrifice fps for MSAA, AO, high-res textures and/or money.
    I strongly believe that studios should aim for 60 fps at medium settings with a high-end card and 60 fps at low settings with a mid-range card (both at 1080p).

    With smart design choices and quality control that is certainly possible. As it stands, I’m disappointed with both Far Cry 4 and Unity.
  • HisDivineOrder - Monday, November 24, 2014 - link

    1) Wonder if an i5 vs i7 (hyperthreading) matters.
    2) Wonder why you guys don't borrow a Titan Black and test it to see if the extra VRAM improves things. Surely a contact at Asus, Gigabyte, NVIDIA, etc. has a Titan Black with 6GB of VRAM to lend you. Probably two for SLI. I'm curious to see if the game can use the VRAM because I'm hearing reports of Ultra taking 4GB and gobbling it up.
    3) The Ultra settings preset includes MSAA. That's the first setting I'd turn off if my frame rates were taking a dive. It gobbles up memory AND processing like nobody's business. What happens if you turn it off?

    Seems like obvious questions to me. Until Batman Arkham Knight, this looks to be The Benchmark game in terms of crushing your system. Assuming they ever finish patching it.
  • RafaelHerschel - Monday, November 24, 2014 - link

    If the available VRAM makes a difference, then lowering texture quality and turning off all forms of AA will make a big difference.

    Unfortunately Ubisoft games don't scale well with lowering the settings.
  • Evenload - Wednesday, November 26, 2014 - link

    VRAM clearly makes a very big difference in this game. To answer the question above, I maxed out the settings at 1080p on my GTX Titan (original) and just ran/jumped around Paris a bit while GPU-Z was set to data log. The file shows constantly high memory usage, maxing out at about 4.4GB. Interestingly, with stock settings the GPU was often being pushed to relatively high clock rates by GPU Boost, so it looks like the GPU was not being worked extremely hard.

    Not a scientific test, but potentially bad news for people with 2GB and 3GB cards, as tweaking will not recover the difference. Interestingly, I noticed that the main system memory the game takes is not that large, and I wonder if the issues people are experiencing are possibly related to the way the game has been programmed and the unified memory model the PS4 and Xbox One use. On the consoles the distinction between "graphics" memory and "system" memory would not matter in the same way that it does in a gaming PC with a graphics card.
  • joeh4384 - Tuesday, November 25, 2014 - link

    Lol at needing freaking SLI 970s for 60+ fps at 1080p. Do you think patches can eventually make this playable on high-end single-card setups like a 290X on Ultra?
  • Lerianis - Sunday, November 30, 2014 - link

    Unity is a good game once you get past the glitchfest. No, it is not a revolution of the Assassin's Creed series, more an evolution of Assassin's Creed 4. It is one awesome game (I played it on a friend's console and another one's PC) once you get past those issues.
    The only thing I don't like about it is that it is VERY VERY hungry for graphics power even at 1080p settings.
    To the point where the latest 980Ms from NVIDIA struggle to push more than 30fps at that resolution on Ultra.
    I'm wondering (considering I do not see much additional graphics prettiness) whether that is a sign that the game was not properly optimized for PCs and notebook PCs. If it is, that is something that Ubisoft (and other game makers) are going to have to take note of and fix.
  • Ramon Zarat - Sunday, November 30, 2014 - link

    I'll only say this: Fuck Ubisoft, the new EA.
  • IUU - Tuesday, December 2, 2014 - link

    At last, a breath of fresh air. Instead of getting everyone excited about how well you can play Pac-Man at 10K, one company still serves as a reminder of the distance we still have to cross.

    Way to go, Ubisoft, and if you make a game that's hardly playable at 1280x720, I will make a donation to you and create a church for you. We have had enough of the mobile devolution, touting meaningless resolutions (3 megapixels on a tablet, oh my god). You will serve as a reminder that high resolution is good, but you have to have some real content to show on it.

    We need a new Crysis; in fact, not just one but several in succession.
  • wrayj - Tuesday, December 2, 2014 - link

    I've seen videos where dropping the resolution to 1600x900 is really the way to claw back performance.
  • is4u2p - Tuesday, December 9, 2014 - link

    I got way better performance than this with my i5-3570K and R9 290.
