Similar to the last game we looked at, Lords of the Fallen, Assassin's Creed: Unity has had a rocky start, with bugs and other issues that still need to be ironed out. It also happens to be a very demanding game to run – at maximum quality, it will basically chew up any GPU you throw at it and spit out crispy bits of silicon. And it's not just GPUs that get eaten; CPU power can have a substantial impact as well. Finally, and this is not necessarily correlated with the other items in this list, Assassin's Creed: Unity (ACU) is an NVIDIA "The Way It's Meant To Be Played" title and one of the showcase games for NVIDIA's GameWorks toolset – ACU includes support for HBAO+, TXAA, PCSS, Tessellation (coming in a future patch), and now MFAA (which we looked at yesterday).

There's an interesting corollary to the above items that's worth getting out of the way: reviews of Assassin's Creed: Unity have so far been rather lackluster, with an overall average Metacritic score currently sitting at 70%. That's not particularly good for a series that has otherwise had good reviews – e.g. the last game, Black Flag, has an average score of 84%. Perhaps more telling is that the current average user review at Metacritic is an abysmal 2.1. Looking at the comments and reviews makes it abundantly clear that ACU tends to run like a slug on a lot of systems.

I think part of the problem is the mistaken idea many gamers have that they should be able to max out the settings in most games. Assassin's Creed has never been a particularly light series in terms of requirements, though at lower detail settings it was usually playable on a wide selection of hardware. With ACU the requirements have shot up, especially at the higher quality settings; at the same time, the rendering quality even at Low is still quite good, and Medium should be enough that most users are content with the way it looks. But if you want to run at High, Very High, or Ultra quality, you'd better be packing some serious GPU heat. The other part of the problem is that the game was likely pushed out the door for the Christmas shopping season before it was fully baked, but that seems to happen every year.

There's another element to the Assassin's Creed: Unity launch worth pointing out: this is a multi-platform release, coming out simultaneously on PC, PS4, and Xbox One. By dropping support for the PS3 and Xbox 360, Ubisoft has opened the door to much higher quality settings, but the requirements may also be too high for a lot of PCs. With the new generation of consoles sporting 8GB of RAM, we've seen a large jump in resource requirements, for textures in particular. I mentioned in the Lords of the Fallen article that GPUs with less than 4GB of VRAM may need to opt for lower quality settings; with ACU (at least in its current state with patch 1.2), you can drop the "may" from that statement and go in knowing full well that GPUs with 2GB of VRAM are going to struggle at times.
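
To put those VRAM numbers in rough perspective, here's a back-of-the-envelope sketch. The texture counts, formats, and render-target layout below are purely illustrative assumptions rather than figures from Ubisoft, but they show how quickly a dense, streamed city scene can blow past a 2GB card:

```cpp
// Back-of-the-envelope VRAM estimate. Every asset count and size below is an
// illustrative assumption, not a figure taken from Assassin's Creed: Unity.
#include <cstdio>

int main() {
    const double MiB = 1024.0 * 1024.0;

    // BC7-compressed color data: 1 byte per texel, ~1.33x extra for the mip chain.
    const double tex2k = 2048.0 * 2048.0 * 1.33 / MiB;  // ~5.3 MiB per 2K texture
    const double tex1k = 1024.0 * 1024.0 * 1.33 / MiB;  // ~1.3 MiB per 1K texture

    // Assume the streaming system keeps a mix of material sets resident,
    // each material being albedo + normal + roughness/spec (3 textures).
    const int heroMaterials = 100;   // "hero" materials at 2K
    const int baseMaterials = 400;   // everything else at 1K
    double textures = heroMaterials * 3 * tex2k + baseMaterials * 3 * tex1k;

    // Deferred-rendering targets at 1920x1080: assume four 32-bit G-buffer
    // targets, depth, an HDR color buffer, and a double-buffered backbuffer.
    const double rt32bpp = 1920.0 * 1080.0 * 4.0 / MiB;  // ~7.9 MiB per target
    double renderTargets = (4 + 1 + 1 + 2) * rt32bpp;

    // Geometry, shadow maps, and miscellaneous buffers: a lump-sum guess.
    double other = 600.0;

    double total = textures + renderTargets + other;
    std::printf("Textures:       %7.0f MiB\n", textures);
    std::printf("Render targets: %7.0f MiB\n", renderTargets);
    std::printf("Other buffers:  %7.0f MiB\n", other);
    std::printf("Total estimate: %7.0f MiB (~%.1f GiB)\n", total, total / 1024.0);
    return 0;
}
```

Even with fairly conservative guesses, the total lands well beyond 2GB and closer to what a 3-4GB card can hold resident, which would force a 2GB card to constantly swap textures over the PCIe bus.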

Comments

  • poohbear - Friday, November 21, 2014 - link

    Let's be honest, this is a poorly optimized game with an enormous number of bugs, so ridiculously messed up that it made the BBC news and Ubisoft's shares dropped 11%! It's a complete debacle.
  • dwade123 - Friday, November 21, 2014 - link

    Good thing I didn't buy a GTX 980 for $460. It can't run next-gen ports maxed out comfortably. Bring out the real next-gen GPUs!
  • maroon1 - Friday, November 21, 2014 - link

    A Core i3-4130 with a GTX 750 Ti runs this game as well as the console version.

    Eurogamer did a test matching the PC's graphics quality to the console version (running it at 900p with settings similar to the PS4), and the result was that the GTX 750 Ti plays it as well as the PS4, if not slightly better.
  • cmdrmonkey - Friday, November 21, 2014 - link

    When a game is barely playable on the most high-end video cards on the market at resolutions and settings PC gamers are accustomed to, you have utterly failed. Bravo Ubisoft. Bravo.
  • P39Airacobra - Friday, November 21, 2014 - link

    You can forget about Ubicrap fixing this! This is why Ubicrap gave the unreal PC requirements! They are getting money from GPU/CPU hardware makers to help market for them! And they don't care to spend more money on us scum customers anyway! So I say XXXXXXXXXXXX UBICRAP!!!!!
  • P39Airacobra - Friday, November 21, 2014 - link

    They should be arrested for doing this!
  • mr. president - Sunday, November 23, 2014 - link

    Any chance of testing CPU performance on AMD vs nvidia GPUs? I've seen a *ton* of recent games underperform on AMD GPUs due to what I think is their lack of support for deferred contexts aka 'multithreaded rendering'. It's particularly low-end CPUs that are affected.

    Unity pushes something like 50,000 draw calls each frame. Note the enormous disparity in minimum framerates between the two vendors at 1080p/Medium, where even slower nvidia GPUs get higher minimums than faster AMD GPUs. I think it's worth exploring, as even low-end FX CPUs can almost double their performance on high-end nvidia GPUs vs. high-end AMD GPUs.
  • FlushedBubblyJock - Tuesday, November 25, 2014 - link

    That last line you have tells me AMD is offloading boatloads of work to the CPU --- isn't that exactly why Mantle exists for low-end CPUs - it relieves AMD's gigantic, overburdening, cheaty normal driver that hammers the puny AMD CPUs.

    It's sad really - shortcuts and angles and scammy drivers that really only hurt everyone.
  • RafaelHerschel - Sunday, November 23, 2014 - link

    A few observations:

    60 frames per second isn’t some arbitrary value. With Vsync enabled and a refresh rate of 60Hz, dips below 60 fps are far more unpleasant. Adaptive Vsync addresses that but isn’t available to everybody. Disabling Vsync leads to screen tearing, which some people (me included) find extremely annoying.

    In a game every frame consists of discrete information. In a movie each frame is slightly blurred, or at least partially blurred, a natural effect of capturing moving objects. For a game to feel fluid at 24 or 30 fps it needs to add artificial motion blur.

    In movies every frame has the same duration. In games frame times vary, so even 60 fps can feel choppy.

    Different people have different sensitivities. I always notice a low frame rate and frame drops. A steady 60 fps with Vsync enabled works best for me. Anything below 50 fps (in a game) feels off to me, and above 60 I don’t notice much difference. Likewise, for gaming and movies I use screens with a fast response time, since ghosting really distracts me.

    I feel that with a decent system a 60 fps minimum should be attainable. What bugs me is that in some games lowering the quality settings has little impact on the minimum frame rate.

    I’m always surprised by blanket statements like “30 fps is perfectly playable”. Depending on the game, the settings, and the person playing, it’s often not. For me another factor is how close I am to the screen.
  • JarredWalton - Monday, November 24, 2014 - link

    FWIW, I've been playing games for about 35 years now (since I was 6 on a Magnavox Odyssey II), and when I say a game is "playable" at 40 FPS, what I'm saying is that someone with years of game playing behind them feels the game works fine at that frame rate. I've also played ACU for many hours at sub-60 FPS rates (without G-SYNC enabled) and didn't mind the experience. Of course I wasn't the one saying it was "perfectly playable" above, but it is most definitely playable and IMO acceptable in terms of performance. If you want *ideal*, which is completely different, then yes: 60+ FPS is what you want. But then there are those with 120Hz LCDs who would want even higher frame rates. YMMV.
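
On the deferred-context discussion in the comments above: in Direct3D 11, an engine can record draw calls on worker threads into deferred contexts and then replay the resulting command lists on the immediate context, and how much that helps depends heavily on driver support. The snippet below is a minimal, hypothetical sketch of that pattern only; error handling and resource setup are omitted, DrawChunk is a made-up stand-in for real scene submission, and none of it is taken from the game's engine:

```cpp
// Minimal sketch of D3D11 deferred contexts ("multithreaded rendering").
// Purely illustrative: no error handling, and DrawChunk() is a hypothetical
// stand-in for code that issues a slice of the frame's draw calls.
#include <windows.h>
#include <d3d11.h>
#include <thread>
#include <vector>

void DrawChunk(ID3D11DeviceContext* /*ctx*/, int /*chunk*/) { /* set state, issue draws... */ }

void RenderFrame(ID3D11Device* device, ID3D11DeviceContext* immediate, int workerCount)
{
    std::vector<ID3D11CommandList*> commandLists(workerCount, nullptr);
    std::vector<std::thread> workers;

    for (int i = 0; i < workerCount; ++i) {
        workers.emplace_back([&, i] {
            // Each worker records its share of the frame into a deferred context.
            // (Real engines create these once and reuse them every frame.)
            ID3D11DeviceContext* deferred = nullptr;
            device->CreateDeferredContext(0, &deferred);
            DrawChunk(deferred, i);
            deferred->FinishCommandList(FALSE, &commandLists[i]);
            deferred->Release();
        });
    }
    for (auto& t : workers) t.join();

    // The immediate context replays the recorded command lists. If the driver
    // doesn't accelerate this path, most of the submission cost still lands on
    // a single core, which is where a slow CPU becomes the bottleneck.
    for (ID3D11CommandList* cl : commandLists) {
        immediate->ExecuteCommandList(cl, FALSE);
        cl->Release();
    }
}
```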

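And on the Vsync point raised above: with a double-buffered 60Hz display and Vsync on, a finished frame waits for the next refresh, so render times are effectively rounded up to a multiple of the 16.7ms refresh interval. A small sketch of that quantization follows; the render times are made-up examples, not measurements from ACU:

```cpp
// How double-buffered Vsync at 60Hz quantizes frame delivery.
// The render times below are made-up examples, not measured from the game.
#include <cmath>
#include <cstdio>

int main() {
    const double refreshMs = 1000.0 / 60.0;                      // ~16.7 ms per refresh
    const double renderMs[] = { 15.0, 16.0, 18.0, 25.0, 40.0 };  // hypothetical GPU frame times

    for (double r : renderMs) {
        // A finished frame is held until the next refresh boundary.
        double displayedMs = std::ceil(r / refreshMs) * refreshMs;
        std::printf("render %5.1f ms -> displayed every %5.1f ms (%2.0f fps)\n",
                    r, displayedMs, 1000.0 / displayedMs);
    }
    // An 18 ms frame (nominally ~55 fps) ends up being shown at 30 fps, which is
    // why dips just below 60 fps feel so much worse with Vsync enabled.
    return 0;
}
```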