Test System and Benchmarks

With that introduction out of the way, let's just get straight to the benchmarks, and then I'll follow up with a discussion of image quality and other aspects at the end. As usual, the test system is what I personally use, which is a relatively high-end Haswell configuration. Most of the hardware was purchased at retail over the past year or so, and that means I don't have access to every GPU configuration available, but I did just get a second ZOTAC GTX 970 so I can at least finally provide some SLI numbers (which I'll add to the previous Benchmarked articles in the near future).

Gaming Benchmarks Test Systems
CPU Intel Core i7-4770K (4x 3.5-3.9GHz, 8MB L3)
Overclocked to 4.1GHz

Underclocked to 3.5GHz with two cores ("i3-4330")
Motherboard Gigabyte G1.Sniper M5 Z87
Memory 2x8GB Corsair Vengeance Pro DDR3-1866 CL9
GPUs Desktop GPUs:
Sapphire Radeon R9 280
Sapphire Radeon R9 280X
Gigabyte Radeon R9 290X
EVGA GeForce GTX 770
EVGA GeForce GTX 780
Zotac GeForce GTX 970
Reference GeForce GTX 980

Laptop GPUs:
GeForce GTX 980M (MSI GT72 Dominator Pro)
GeForce GTX 880M (MSI GT70 Dominator Pro)
GeForce GTX 870M (MSI GS60 Ghost 3K Pro)
GeForce GTX 860M (MSI GE60 Apache Pro)
Storage Corsair Neutron GTX 480GB
Power Supply Rosewill Capstone 1000M
Case Corsair Obsidian 350D
Operating System Windows 7 64-bit

We're testing with NVIDIA's 344.65 drivers, which are "Game Ready" for Assassin's Creed: Unity. (I also ran a couple sanity checks with the latest 344.75 drivers and found no difference in performance.) On the AMD side, testing was done with the Catalyst 14.11.2 driver that was released to better support ACU. AMD also released a new beta driver for Far Cry 4 and Dragon Age: Inquisition (14.11.2B), but I have not had a chance to check performance with that yet. No mention is made of improvements for ACU with the driver, so it should be the same as the 14.11.2 driver we used.

One final note is that thanks to the unlocked nature of the i7-4770K and the Gigabyte motherboard BIOS, I'm able to at least mostly simulate lower performance Haswell CPUs. I didn't run a full suite of tests with a second "virtual" CPU, but I did configure the i7-4770K to run similarly to a Core i3-4330 (3.5GHz, 2C/4T) – the main difference being that the i7 still has 8MB of L3 cache where the i3-4330 only has 4MB. I tested just one GPU with the slower CPU configuration, the GeForce GTX 980, but this should be the best-case result for what you could get from a Core i3-4330.

Assassin's Creed: Unity 4K High

Assassin's Creed: Unity QHD Ultra

Assassin's Creed: Unity 1080p Ultra

Assassin's Creed: Unity 1080p High

Assassin's Creed: Unity 1080p Medium

Did I mention that Assassin's Creed: Unity is a beast to run? Yeah, OUCH! 4K gaming is basically out of the question on current hardware, and even QHD is too much at the default Ultra settings. Also notice how badly the GTX 770 does at the Ultra settings, which appears to be due to its 2GB of VRAM; I logged system usage for the GTX 770 at QHD Ultra and found that the game was trying to allocate nearly 3GB of VRAM, which on a 2GB card means there's going to be a lot of texture thrashing. (4K at High quality also uses around 3GB of VRAM, if you're wondering.) The asterisk in the chart is there because I couldn't actually run our standard benchmark sequence, so I used a "Synchronize" from the top of a tower instead, which is typically slightly less demanding than our actual benchmark run.
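Incidentally, if you want to spot-check VRAM allocation on your own system, here's a minimal sketch of that kind of logging. It assumes an NVIDIA card with the driver's bundled `nvidia-smi` command-line utility (not necessarily the tool I used for this article), which can report memory usage in a simple CSV format:

```python
# Minimal sketch: poll nvidia-smi for current VRAM use (in MB) per GPU.
# Assumes NVIDIA's nvidia-smi utility is on the PATH (it ships with the driver).
import subprocess

def parse_memory_used(csv_line):
    """Parse one line of output from:
    nvidia-smi --query-gpu=memory.used --format=csv,noheader,nounits
    Each line is just an integer number of megabytes."""
    return int(csv_line.strip())

def query_vram_mb():
    """Return a list of VRAM-used values (MB), one entry per GPU."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used",
         "--format=csv,noheader,nounits"],
        text=True)
    return [parse_memory_used(line) for line in out.splitlines() if line.strip()]

if __name__ == "__main__":
    # Run this in a loop while the game is running to watch allocation climb.
    print(query_vram_mb())
```

Polling that in a loop while alt-tabbed out of the game (or from a second machine via remote shell) is enough to see when a title is trying to allocate more memory than the card physically has.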

Anyway, all of the single GPUs are basically unplayable at QHD Ultra settings, and a big part of that looks to be the higher resolution textures. Dropping the texture quality to High can help, but really the game needs a ton of GPU horsepower to make QHD playable. GTX 970 SLI basically gets there, though again I'd suggest dropping the texture quality to High in order to keep minimum frame rates closer to 30. Even at 1080p, I'd suggest avoiding the Ultra setting – or at least Ultra texture quality – as there's just a lot of stutter. The GTX 980M and 880M both have 8GB of GDDR5, but sadly their performance at Ultra settings is too low to really be viable, though their minimum frame rates do hold up a bit better relative to the other GPUs.

As we continue down the charts, NVIDIA's GTX 780 and 970 (and faster) cards finally reach the point where performance is totally acceptable at 1080p High (and you can tweak a few settings like turning on HBAO+ and Soft Shadows without too much trouble). What's scary is that looking at the minimum frame rates along with the average FPS, the vast majority of GPUs are still struggling at 1080p High, and it's really only 1080p Medium where most midrange and above GPUs reach the point of playability.

There's a secondary aspect to the charts that you've probably noticed as well. Sadly, AMD's GPUs really don't do well right now with Assassin's Creed: Unity. Some of it is almost certainly drivers, and some of it may be due to the way things like GameWorks come into play. Whatever the cause, ACU is not going to be a great experience on any of the Radeon GPUs right now.

I did some testing of CrossFire R9 290X as well, and while the game ran, performance was no better than a single 290X – and minimum frame rates were actually lower – so CrossFire (without any attempt to create a custom profile) isn't viable yet. Also note that while SLI "works", there are rendering issues at times. Entering/exiting the menu/map – basically any time there's a full-screen post-processing filter – causes severe flicker (a good example is when you jump off a tower into a hay cart: you'll notice flicker around the edges of the screen as well as on Arno's clothing). I believe these issues happen on all multi-GPU rigs, so it might be more of a game issue than a driver issue.

I even went all the way down to 1600x900 Medium to see if that would help any of AMD's GPUs; average frame rates on the R9 290X basically top out at 48 FPS with minimums still at 25 or so. I did similar testing on NVIDIA hardware and found that with the overclocked i7-4770K, ACU maxes out at just over 75 FPS with minimums of 50+ FPS. We'll have to see if AMD and/or Ubisoft Montreal can get things working better on Radeon GPUs, but for now it's pretty rough. That's not to say the game is unplayable on an R9 290X, as you can certainly run 1080p High, but there are going to be occasional stutters. Anything less than the R9 290/290X and you'll basically want to use Low or Medium quality (with some tweaking).

Finally, I mentioned how 2GB GPUs are really going to have problems, especially at higher texture quality settings. The GeForce GTX 770 is a prime example of this; even at 1080p High, minimum frame rates are consistently dropping into the low teens and occasionally even single digits, and Medium quality still has very poor minimum frame rates. Interestingly, at 1600x900 Medium the minimum FPS basically triples compared to 1080p Medium, so if the game is using more than 2GB VRAM at 1080p Medium it's not by much. This also affects the GTX 860M (1366x768 Low is pretty much what you need to run on that GPU), and the 1GB R7 250X can't even handle that. And it probably goes without saying, but Intel's HD 4600 completely chokes with ACU – 3-7 FPS at 1366x768 is all it can manage.

What About the CPU?

I mentioned earlier that I also underclocked the Core i7-4770K and disabled a couple CPU cores to simulate a Core i3-4330. It's not a fully accurate simulation, but just by way of reference the multi-threaded Cinebench 11.5 score went from 8.08 down to 3.73, which looks about right give or take a few percent. I only tested the GTX 980 with the slower CPU, but this is basically the "best case" for what a Core i3 could do.

Looking at the above 1080p charts, you can see that with the slower CPU the GTX 980 takes quite the hit to performance. In fact, the GTX 980 with a "Core i3" Haswell CPU starts looking an awful lot like the R9 290X: it's playable in a pinch, but the minimum frame rates will definitely create some choppiness at times. I don't have an AMD rig handy to do any testing, unfortunately, but I'd be surprised if the APUs are much faster than the Core i3.

In short, not only do you need a fast GPU, but you also need a fast CPU. And the "just get a $300 console" argument doesn't really work either, as frame rates on the consoles aren't particularly stellar from what I've read. At least one site has found that both the PS4 and Xbox One fail to maintain a consistent 30FPS or higher frame rate.

Benchmarked - Assassin's Creed: Unity Image Quality and Settings


Comments

  • Jon Tseng - Thursday, November 20, 2014 - link

Jarred, I know you didn't test for it, but any thoughts on how system memory affects things? The minimum is 6GB with 8GB recommended – I wonder what impact this has?

(I've just gone from 4GB => 6GB to run this game; wondering if I need to replace the other two sticks too or whether the fact the swapfile will be on SSD is enough)
  • WatcherCK - Thursday, November 20, 2014 - link

I was looking forward to TC: The Division, but given Ubisoft's recent track record and inherent game factors (new engine, MMO, RPG aspects) I'm just not sure that it will be anything except a colossal ballsup?
  • Mondozai - Thursday, November 20, 2014 - link

    I agree with many other commenters about the strangely sanguine tone of this article, breezing past the massive performance bottlenecks and instead urging people to upgrade their hardware instead of pointing the finger where it belongs - Ubisoft - and attacking them for releasing what is essentially a botched game in terms of performance. You should be running 60+ at high 1080p settings with a 290/780. Instead you barely get to 45 frames with a 780.

The fact that even a 980(!) can't get over 60 fps at 1080p high means that the game needs to be canned, not the reader base's hardware. Do better, Jarred.
  • JarredWalton - Thursday, November 20, 2014 - link

    That's certainly not what I'm doing. Just because the last sentence says, "And if this is the shape of things to come, a lot of people might want a GPU upgrade this holiday season" doesn't mean I'm telling everyone to upgrade. What I am saying is that IF you want to run THIS game (and IF other games end up having similar requirements in the near future), then yes, a lot of people will need new hardware (or lower quality settings).

    When Crysis came out, nothing -- NOTHING -- could run it properly at maximum quality settings. People skewered Crytek and said they were lousy programmers, etc. and "the game doesn't even look that good". And yet, I don't really think that was the case -- they just decided to enable settings that pushed beyond what was currently available.

    Is Ubisoft intentionally doing that with their latest releases? Perhaps not in quite the same way (it is the holiday season after all), but the decision to drop support for older generation consoles in order to enable a higher quality experience certainly wasn't made to improve the sales of the game. Believe it or not, there are game developers that just really want to use the latest and greatest technologies, performance be damned.

    Fundamentally, we're not a sensationalist website. We're not in the market of pointing fingers, casting blame, etc. All I can say is how the game works right now on the hardware I tested, and it's up to the readers to draw conclusions. Was the game pushed out early? Almost certainly. Should they design all games so that 1080p High gets 60+ FPS? I'm not one to dictate whether that's the best thing to do or not, and I do like seeing companies push the performance envelope on occasion.

    It hurts when your high-end GPU can't run a game with settings you are accustomed to using, but I do have to say that their recreation of eighteenth century France is quite remarkable.
  • mcmilhouse - Friday, November 21, 2014 - link

^This. Plus the Nvidia 900 series is still 28nm. We haven't had a 20nm card; Apple took all the TSMC production lines.
  • piroroadkill - Saturday, November 22, 2014 - link

    Crysis absolutely blew everything else away, graphically.

    That moment when you're done coming through the first forest and you hit the rays coming through the trees, and you look down over the cliffs.

    I don't think many people said it was coded badly (although they probably did), but it was such an incredible step up visually that people really took notice.

    Assassin's Creed Unity may also be a fantastic game visually, and I will get it at some point, but the fact is, console hardware is a measly set of Jaguar cores and low to midrange previous generation Radeons.

    People are right to expect their massively more powerful machine could run the game at 60 FPS.
  • Milite777 - Thursday, November 20, 2014 - link

I've just a laptop with a 2nd-gen i7, 8GB RAM, and a 6770M (2GB VRAM). I know that this config is too poor for any serious gaming session... But I'd like to play ACU, like I did with the previous episodes... Could I get at least 25 fps with the lowest settings and a resolution of 1366x768? I don't need the best graphics, I just want to know the story... And of course I have to buy this game to try it... Need help guys :)
  • chizow - Thursday, November 20, 2014 - link

Interesting findings with the texture settings, Jarred; it looks like Santa (Ryan?) sent you some early X-mas presents too with the GPU upgrades. I would also be interested to see a kind of "feature expense" comparison, where you go through some of the main settings to give an idea of what kind of perf hit you take when enabling them at different settings.

    For example, I remember a time when setting textures to max was an automatic, but now it seems in this age with 2K and now 4K textures with next-gen console ports, that's no longer possible since those textures will fill VRAM in a heartbeat. Also, did you have any >4GB cards or high bandwidth cards to test to see if they helped with the texture situation at all? Like Titan Black?

    But lately I have seen textures and MSAA creating a much bigger perf hit than in the past due to the amount of VRAM they take up. There was a time where VRAM didn't make as much of a difference as shading power and you could just crank up the textures and use MSAA without the crazy hits to perf we see today.
  • iceveiled - Thursday, November 20, 2014 - link

I opted to turn everything up to max on my 970 (but with soft shadows turned off) and with a 30 fps locked frame rate (1080p). It plays butter smooth, but man, if any game benefits from a 60 fps frame rate it's Assassin's Creed, with its wonky input lag (I play it with a controller) and even wonkier world traversal / parkour.

    Takes a bit of getting used to, but at 30 fps it ain't all that bad and it's a damn nice looking game with the settings maxed out.
  • D. Lister - Friday, November 21, 2014 - link

Ubisoft is just forming a pattern here of poorly optimised software. They have some of the best artists, but apparently some of the worst software developers. Also, I don't believe them for a second when they try to offload their incompetence on a hardware manufacturer.
