Similar to the last game we looked at, Lords of the Fallen, Assassin's Creed: Unity has had a bit of a rocky start with bugs and other issues needing to be ironed out. It also happens to be a very demanding game to run – at maximum quality, it will basically chew up any GPU you throw at it and spit out crispy bits of silicon. And it's not just GPUs that get eaten, as CPU power can have a substantial impact as well. Finally, and this is not necessarily correlated with the other items in this list, Assassin's Creed: Unity (ACU) is an NVIDIA "The Way It's Meant To Be Played" title, and it's also one of the notable games for NVIDIA's GameWorks toolset – ACU includes support for HBAO+, TXAA, PCSS, Tessellation (coming in a future patch), and now MFAA (which we looked at yesterday).

There's an interesting corollary to the above items that's worth getting out of the way: reviews of Assassin's Creed: Unity have so far been rather lackluster, with an overall average Metacritic score currently sitting at 70%. That's not particularly good for a series that has otherwise had good reviews – e.g. the last game, Black Flag, has an average score of 84%. Perhaps more telling is that the current average user review at Metacritic is an abysmal 2.1. Looking at the comments and reviews makes it abundantly clear that ACU tends to run like a slug on a lot of systems.

I think part of the problem is the mistaken idea many gamers have that they should be able to max out the settings in most games. Assassin's Creed has never been a particularly light series in terms of requirements, though at lower detail settings it was usually playable on a wide selection of hardware. With ACU, the requirements have shot up, especially for the higher quality settings; at the same time, the rendering quality even at Low is still quite good, and Medium is enough that most users should be content with the way the game looks. But if you want to run at High, Very High, or Ultra quality, you'd better be packing some serious GPU heat. The other part of the problem is that the game was likely pushed out the door for the Christmas shopping season before it was fully baked, but that seems to happen every year.

There's another element to the Assassin's Creed: Unity launch worth pointing out: this is a multi-platform release, coming out simultaneously on PC, PS4, and Xbox One. By dropping support for the PS3 and Xbox 360, Ubisoft has opened the door to much higher quality settings, but the requirements may also be too high for a lot of PCs. With the new generation of consoles now sporting 8GB of RAM, we've seen a large jump in resource requirements, for textures in particular. I mentioned in the Lords of the Fallen article that GPUs with less than 4GB of VRAM may need to opt for lower quality settings; with ACU (at least in its current state with patch 1.2), you can drop the "may" from that statement and go in knowing full well that GPUs with 2GB of VRAM are going to struggle at times.
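
As a quick aside for anyone trying to work out where their own card lands: it's easy to check how much dedicated VRAM an adapter actually reports before committing to the higher texture settings. The following is just a minimal DXGI sketch (a hypothetical helper, not anything Ubisoft or the game provides, and the 3.5GB cutoff is simply my assumption based on the behavior described above):

    // vramcheck.cpp -- build with: cl vramcheck.cpp dxgi.lib
    #include <dxgi.h>
    #include <cstdio>

    int main() {
        // Create a DXGI factory and walk every adapter in the system.
        IDXGIFactory* factory = nullptr;
        if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
            return 1;

        IDXGIAdapter* adapter = nullptr;
        for (UINT i = 0; factory->EnumAdapters(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
            DXGI_ADAPTER_DESC desc;
            adapter->GetDesc(&desc);
            double vramGB = desc.DedicatedVideoMemory / (1024.0 * 1024.0 * 1024.0);
            // The 3.5GB threshold is an assumption, not an official requirement;
            // it simply reflects "4GB cards are comfortable, 2GB cards struggle".
            wprintf(L"%ls: %.1f GB dedicated VRAM -> %ls\n",
                    desc.Description, vramGB,
                    vramGB >= 3.5 ? L"High/Ultra textures plausible"
                                  : L"consider Medium textures");
            adapter->Release();
        }
        factory->Release();
        return 0;
    }

DedicatedVideoMemory is what the driver exposes (integrated GPUs will mostly report shared system memory instead), so treat the output as a guideline rather than a guarantee.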

  • FlushedBubblyJock - Thursday, November 20, 2014 - link

    Well, only nVidia stockholders, since AMD is the pit of Hades, deep in the red and hollowing out everyone's investment pockets.
  • Dribble - Thursday, November 20, 2014 - link

    Looks like it's basically CPU limited. The difference between Ultra and Medium is only a few FPS for something like a 970 at 1080p. It would be interesting to try it with a 6- or 8-core Intel processor and see how it scales with more cores.
  • JarredWalton - Thursday, November 20, 2014 - link

    On which setup are you seeing "only a few FPS"? 1080p Medium is 22% faster with 970 SLI on average FPS and 31% faster on minimums, and a single 970 is 49% faster average and minimum on Medium vs. Ultra. That's far more than a few FPS.

    The gap between Medium and High is much smaller, but then they both use High Quality textures and honestly they look very similar. There, performance is only about 10-30% higher (depending on the GPU), though minimums still favor cards with more than 2GB of VRAM by a large amount.
  • Dribble - Thursday, November 20, 2014 - link

    Well, I'd expect a bigger performance difference between Medium and Ultra. Looking at the CPUs, the 4-core pretty well doubles the 2-core's minimum frame rates, which shows the CPU is having a much bigger impact. If that's the case, what would 6 or 8 cores do?
  • JumpingJack - Thursday, November 20, 2014 - link

    Hahaha, we have a new king... "but can it run Assassin's Creed Unity?"
  • Calista - Thursday, November 20, 2014 - link

    If you have the time, I would like you to test further at even lower resolutions. There's not much point in knowing GPU X can do 18 FPS at 1080p, since it's much easier to adapt to a lower resolution than to a lower frame rate. Maybe you could take the slowest of the bunch and try 1600x900 and 1280x720 as well? If the system is still up and running, I guess it wouldn't take much more than a few hours.
  • JarredWalton - Thursday, November 20, 2014 - link

    I did run 768p Low on most of the GPUs... I don't want to make a graph because really, desktop users don't want to run games below 1080p IMO. But if you're wondering about the laptops and lower end hardware...

    Performance at 768p Low (Avg/Min):
    860M: 35/25
    870M: 45/32
    880M: 49/37
    980M: 56/40
    R7-250X: 25/12
    R9-280: 37/24
    R9-280X: 43/26
    R9-290X: 49/27
    Intel HD 4600: 6.6/3.2 (Hahaha...)

    Of those, I should note that only the 860M and 250X are unable to hit "playable" frame rates at 900p Medium.
  • huaxshin - Thursday, November 20, 2014 - link

    The CPU plays a big role in Assassin's Creed Unity, so the GTX 980M comparison against the desktop GPUs is skewed. The desktop GPUs are paired with 84W+ CPUs, while the GTX 980M is paired with a 47W soldered, lower-clocked CPU.

    I expect the GTX 980M would be closer to the GTX 780 if they ran at the same clocks. Something that would be interesting to see from AnandTech: a review of the GTX 980M against desktop cards where both have roughly the same CPU power.
    http://gamegpu.ru/images/remote/http--www.gamegpu....
  • JarredWalton - Thursday, November 20, 2014 - link

    The i3-4330 numbers are there for a look at where the CPU bottleneck would lie on lower end CPUs. I would guess that the mobile quad-core CPUs like the i7-4710HQ are generally keeping the GPU "filled" with work. 980M might be a bit faster with a higher clocked CPU, but I don't think it would come anywhere near the 780 or 970.

    I've got some numbers and basically across a large selection of games the 780 (with a desktop CPU compared to a mobile CPU) is around 25% faster than the 980M (and the 780 and 970 are basically tied in overall rankings -- like literally within 0.1% of each other).
  • anubis44 - Thursday, November 20, 2014 - link

    Jarred, I'd like to see these benchmarks on an AMD FX CPU as well. Forget the APUs, as they don't have level 3 cache, but the FX chips do.
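
A couple of the comments above ask how Unity scales past four cores. One way to probe that on a single machine, without swapping CPUs, is to pin the game's process to a subset of logical processors and re-run the same benchmark pass. The helper below is only a rough sketch of that approach using the standard Windows affinity API (it was not used for the numbers in this article):

    // affinity.cpp -- restrict a running process to its first N logical processors.
    // Usage: affinity <pid> <core_count>
    #include <windows.h>
    #include <cstdio>
    #include <cstdlib>

    int main(int argc, char** argv) {
        if (argc < 3) {
            printf("usage: affinity <pid> <core_count>\n");
            return 1;
        }
        DWORD pid = strtoul(argv[1], nullptr, 10);
        int cores = atoi(argv[2]);
        if (cores < 1 || cores > 63) {
            printf("core_count must be between 1 and 63\n");
            return 1;
        }
        // Mask with bits 0..cores-1 set. These are logical processors, so on a
        // Hyper-Threaded CPU "4 cores" here may mean 2 physical cores plus HT.
        DWORD_PTR mask = (DWORD_PTR(1) << cores) - 1;

        HANDLE process = OpenProcess(PROCESS_SET_INFORMATION | PROCESS_QUERY_INFORMATION,
                                     FALSE, pid);
        if (!process) {
            printf("OpenProcess failed: %lu\n", GetLastError());
            return 1;
        }
        if (!SetProcessAffinityMask(process, mask))
            printf("SetProcessAffinityMask failed: %lu\n", GetLastError());
        else
            printf("Pinned PID %lu to %d logical processors\n", pid, cores);
        CloseHandle(process);
        return 0;
    }

The same restriction can also be applied without any code via "start /affinity <hexmask>" from a command prompt; either way it only approximates a smaller CPU, since cache sizes and clocks obviously don't change.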
