Similar to the last game we looked at, Lords of the Fallen, Assassin's Creed: Unity has had a rocky start, with bugs and other issues still being ironed out. It also happens to be a very demanding game to run – at maximum quality, it will chew up just about any GPU you throw at it and spit out crispy bits of silicon. And it's not just GPUs that get eaten, as CPU power can have a substantial impact as well. Finally, and this is not necessarily correlated with the other items in this list, Assassin's Creed: Unity (ACU) is an NVIDIA "The Way It's Meant To Be Played" title and one of the showcase games for NVIDIA's GameWorks toolset – ACU includes support for HBAO+, TXAA, PCSS, tessellation (coming in a future patch), and now MFAA (which we looked at yesterday).

There's an interesting footnote to the above that's worth getting out of the way: reviews of Assassin's Creed: Unity have so far been rather lackluster, with the overall Metacritic average currently sitting at 70. That's not particularly good for a series that has otherwise reviewed well – the previous game, Black Flag, averages 84, for example. Perhaps more telling is that the current user average at Metacritic is an abysmal 2.1. Reading through the comments and reviews makes it abundantly clear that ACU tends to run like a slug on a lot of systems.

Part of the problem, I think, is the mistaken idea many gamers have that they should be able to max out the settings in most games. Assassin's Creed has never been a particularly light series in terms of requirements, though at lower detail settings it was usually playable on a wide selection of hardware. With ACU the requirements have shot up, especially for the higher quality settings; at the same time, the rendering quality even at Low is still quite good, and Medium is enough that most users should be content with the way the game looks. But if you want to run at High, Very High, or Ultra quality, you'd better be packing some serious GPU heat. The other part of the problem is that the game was likely pushed out the door for the Christmas shopping season before it was fully baked, but that seems to happen every year.

There's another element to the Assassin's Creed: Unity launch worth pointing out: this is a multi-platform release, coming out simultaneously on PC, PS4, and Xbox One. By dropping support for the PS3 and Xbox 360, Ubisoft has opened the door to much higher quality settings, but the requirements may also be too high for a lot of PCs. With the new generation of consoles sporting 8GB of RAM, we've seen a large jump in resource requirements, for textures in particular. I mentioned in the Lords of the Fallen article that GPUs with less than 4GB of VRAM may need to opt for lower quality settings; with ACU (at least as of patch 1.2), you can drop the "may" from that statement and go in knowing full well that 2GB cards are going to struggle at times.

Comments

  • RafaelHerschel - Monday, November 24, 2014 - link

    I don’t mind somebody saying: “this game is perfectly playable for me at 40 fps”. I do mind it if people say that there is no perceivable difference between 40 fps and 60 fps (as stated in the comments) or when people say “the game runs smooth as butter” when it doesn't. The article was fair, some of the comments weren't.

    For me, a game is not enjoyable at anything below 50 fps, and I much prefer to play with Vsync enabled.

    I would say that most people accept 60 fps as a reasonable goal at medium settings (whatever those may be) with a high-end GPU. Depending on personal taste (graphics settings) and budget, people can then choose to sacrifice fps for MSAA, AO, high-res textures and/or money.
    I strongly believe that studios should aim for 60 fps at medium settings with a high-end card, and 60 fps at low settings with a mid-range card (both at 1080p).

    With smart design choices and quality control that is certainly possible. As it stands, I’m disappointed with both Far Cry 4 and Unity.
  • HisDivineOrder - Monday, November 24, 2014 - link

    1) Wonder if an i5 vs i7 (hyperthreading) matters.
    2) Wonder why you guys don't borrow a Titan Black and test it to see if the extra VRAM improves things. Surely a contact at Asus, Gigabyte, NVIDIA, etc. has a Titan Black with 6GB of RAM to lend you – probably two, for SLI. I'm curious to see if the game can use the VRAM, because I'm hearing reports of Ultra taking 4GB and gobbling it up.
    3) Ultra settings preset includes MSAA. That's the first setting I'd turn off if my settings were taking a dive. It gobbles up memory AND processing like nobody's business. What happens if you turn it off?

    Seems like obvious questions to me. Until Batman: Arkham Knight arrives, this looks to be The Benchmark game in terms of crushing your system. Assuming they ever finish patching it.
  • RafaelHerschel - Monday, November 24, 2014 - link

    If the available VRAM makes a difference, then lowering texture quality and turning off all forms of AA will make a big difference.

    Unfortunately, Ubisoft games don't scale well when you lower the settings.
  • Evenload - Wednesday, November 26, 2014 - link

    VRAM clearly makes a very big difference in this game. To answer the question above, I maxed out the settings at 1080p on my GTX Titan (original) and just ran/jumped around Paris a bit while GPU-Z was set to data log. The file shows constantly high memory usage, maxing out at about 4.4GB. Interestingly, with stock settings GPU Boost was often pushing the GPU to relatively high clock rates, so it looks like the GPU itself was not being worked extremely hard.

    Not a scientific test, but potentially bad news for people with 2GB and 3GB cards, as tweaking will not recover the difference. Interestingly, I noticed that the game's main system memory footprint is not that large, and I wonder if the issues people are experiencing are related to the way the game has been programmed around the unified memory model the PS4 and Xbox One use. On the consoles, the distinction between "graphics" memory and "system" memory doesn't matter the way it does in a gaming PC with a discrete graphics card. (A minimal log-parsing sketch follows the comments below.)
  • joeh4384 - Tuesday, November 25, 2014 - link

    Lol at needing freaking SLI 970s for 60+ fps at 1080p. Do you think patches can eventually make this playable on Ultra for high-end single-card setups like a 290X?
  • Lerianis - Sunday, November 30, 2014 - link

    Unity is a good game once you get past the glitchfest. No, it is not a revolution for the Assassin's Creed series – it's more an evolution of Assassin's Creed 4 – but once you get past those issues it is one awesome game (I played it on a friend's console and another friend's PC).
    The only thing I don't like about it is that it is VERY VERY hungry for graphics power, even at 1080p.
    To the point where the latest 980Ms from NVIDIA struggle to push more than 30fps at that resolution on Ultra.
    I'm wondering (considering I do not see much additional graphical prettiness) whether that is a sign that the game was not properly optimized for PCs and notebooks. If it is, that is something Ubisoft (and other game makers) are going to have to take note of and fix.
  • Ramon Zarat - Sunday, November 30, 2014 - link

    I'll only say this: Fuck Ubisoft, the new E.A.
  • IUU - Tuesday, December 2, 2014 - link

    At last, a breath of fresh air. Instead of getting everyone excited about how well you can play Pac-Man at 10K, one company still serves as a reminder of the distance we have yet to cross.

    Way to go, Ubisoft – and if you make a game that's hardly playable at 1280x720, I will make a donation to you and build a church for you. We have had enough of the mobile devolution and its meaningless resolutions (3 megapixels on a tablet, oh my god). You will serve as a reminder that high resolution is good, but you have to have some real content to show on it.

    We need a new Crysis – and not just one, but several in succession.
  • wrayj - Tuesday, December 2, 2014 - link

    I've seen videos where dropping the resolution to 1600x900 is really the way to claw back performance.
  • is4u2p - Tuesday, December 9, 2014 - link

    I got way better performance than this with my i5-3570K and R9 290.
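
For anyone wanting to reproduce Evenload's measurement above: GPU-Z can write a sensor log to disk while you play, and a small script can pull the peak VRAM figure out of it afterwards. The sketch below is a minimal editorial illustration, not something from the article or the comments – it assumes GPU-Z's default comma-separated "GPU-Z Sensor Log.txt" with a header row containing a column whose name includes "Memory Used"; exact column names vary by GPU-Z version and GPU.

```python
# Minimal sketch: report peak VRAM usage from a GPU-Z sensor log.
# Assumptions (not confirmed by the article): the log is comma-separated
# with a header row, and a column whose name contains "Memory Used"
# reports dedicated VRAM usage in MB.
import csv

LOG_PATH = "GPU-Z Sensor Log.txt"  # GPU-Z's default log file name

with open(LOG_PATH, newline="", encoding="utf-8", errors="replace") as f:
    reader = csv.reader(f)
    header = [h.strip() for h in next(reader)]
    # Find the VRAM column by substring match, since exact names vary.
    mem_col = next(i for i, h in enumerate(header) if "Memory Used" in h)
    peak_mb = 0.0
    for row in reader:
        try:
            peak_mb = max(peak_mb, float(row[mem_col].strip()))
        except (IndexError, ValueError):
            continue  # skip truncated or non-numeric rows

print(f"Peak VRAM usage: {peak_mb:.0f} MB ({peak_mb / 1024:.1f} GB)")
```

Logging while roaming the game world, as Evenload describes, would surface the kind of ~4.4GB peak he reports; a longer session gives a more representative figure.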
