Similar to the last game we looked at, Lords of the Fallen, Assassin's Creed: Unity has had a bit of a rocky start with bugs and other issues needing to be ironed out. It also happens to be a very demanding game to run – at maximum quality, it will basically chew up any GPU you throw at it and spit out crispy bits of silicon. And it's not just GPUs that get eaten, as CPU power can have a substantial impact as well. Finally, and this is not necessarily correlated with the other items in this list, Assassin's Creed: Unity (ACU) is an NVIDIA "The Way It's Meant To Be Played" title, and it's also one of the notable games for NVIDIA's GameWorks toolset – ACU includes support for HBAO+, TXAA, PCSS, Tessellation (coming in a future patch), and now MFAA (which we looked at yesterday).

There's a related point to the above items that's worth getting out of the way: reviews of Assassin's Creed: Unity have so far been rather lackluster, with the Metacritic average currently sitting at 70 out of 100. That's not particularly good for a series that has otherwise reviewed well – the previous game, Black Flag, averages 84, for example. Perhaps more telling is that the current Metacritic user score is an abysmal 2.1 out of 10. Looking at the comments and reviews makes it abundantly clear that ACU tends to run like a slug on a lot of systems.

I think part of the problem is the mistaken idea many gamers have that they should be able to max out most settings in any game. Assassin's Creed has never been a particularly light series in terms of requirements, though at lower detail settings it was usually playable on a wide selection of hardware. With ACU the requirements have shot up, especially for the higher quality settings; at the same time, the rendering quality even at Low is still quite good, and Medium should leave most users content with the way the game looks. But if you want to run at High, Very High, or Ultra quality, you'd better be packing some serious GPU heat. The other part of the problem is that the game was likely pushed out the door for the Christmas shopping season before it was fully baked – but that seems to happen every year.

There's another element to the Assassin's Creed: Unity launch worth pointing out: this is a multi-platform release, coming out simultaneously on PC, PS4, and Xbox One. By dropping support for the PS3 and Xbox 360, Ubisoft has opened the door to much higher quality settings, but the requirements may also be too high for a lot of PCs. With the new generation of consoles sporting 8GB RAM, we've seen a large jump in resource requirements, for textures in particular. I mentioned in the Lords of the Fallen article that GPUs with less than 4GB VRAM may need to opt for lower quality settings; with ACU (at least in its current state with patch 1.2), you can drop the "may" from that statement and go in knowing full well that GPUs with 2GB VRAM are going to struggle at times.
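
As an aside, games commonly size their texture budget by querying the dedicated VRAM on the installed GPU at startup. ACU's actual logic isn't public, but here's a minimal C++ sketch of how such a check might look on Windows via DXGI; the preset names and thresholds are purely illustrative, loosely matching the point above that 2GB cards should steer clear of the highest texture settings.

    // Query dedicated VRAM via DXGI and suggest a texture preset.
    // Windows only; link against dxgi.lib.
    #include <dxgi.h>
    #include <cstdio>

    int main() {
        IDXGIFactory* factory = nullptr;
        if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
            return 1;

        // Enumerate the first (primary) adapter.
        IDXGIAdapter* adapter = nullptr;
        if (factory->EnumAdapters(0, &adapter) == DXGI_ERROR_NOT_FOUND) {
            factory->Release();
            return 1;
        }

        DXGI_ADAPTER_DESC desc = {};
        adapter->GetDesc(&desc);

        // DedicatedVideoMemory is reported in bytes.
        size_t vramMB = desc.DedicatedVideoMemory / (1024 * 1024);
        printf("Dedicated VRAM: %zu MB\n", vramMB);

        // Hypothetical thresholds for illustration only.
        const char* preset = (vramMB >= 4096) ? "High/Ultra textures"
                           : (vramMB >= 3072) ? "Medium/High textures"
                                              : "Low/Medium textures";
        printf("Suggested preset: %s\n", preset);

        adapter->Release();
        factory->Release();
        return 0;
    }

In practice an engine would also factor in resolution and render-target overhead, but dedicated VRAM is the number that matters most for texture quality.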

Test System and Benchmarks
Comments

  • JarredWalton - Thursday, November 20, 2014

    I'll bet you a dollar you're CPU limited at mid to high 70s when you're down on the streets. Anyway, I ran the Medium numbers as well at 1080p, which is basically FXAA with High textures and a few other items turned down a notch that don't really affect things that much. As to what's "an accurate representation of the kind of performance you can get", well, the numbers don't lie. If you want to run different settings, the numbers change, but there's a reason the developers don't just use FXAA as the default at all settings.
  • Carfax - Thursday, November 20, 2014

    I probably am CPU limited with V-sync off, but considering I'm above 60 FPS and how much is being rendered (the game is absolutely massive in scope and detail), I would say that the engine is still fairly optimized. When I'm playing the game, my CPU is usually around 50 to 60% loaded on all 12 threads with V-sync on. I haven't tested CPU usage with V-sync off though.

    The game definitely makes use of a hex-core processor, so that's probably why your frame rates are lower than mine.
  • mcmilhouse - Thursday, November 20, 2014

    I wonder, if Apple hadn't taken all of the 20nm production this year and AMD/NVIDIA had 20nm cards, whether we would have a $200-300 card that easily outputs 60fps at 1080p Ultra. We really should have been at 20nm this year.
  • Crazyeyeskillah - Thursday, November 20, 2014

    Why don't you turn off AA and show people what the game can actually run at? I don't know why AA is treated as a must-have when you can't get solid frame rates. If you ran all the same benches without any AA, I don't see why the results would be so abysmal. AA is a luxury, not mandatory.
  • JarredWalton - Thursday, November 20, 2014

    You mean like the 1080p Medium graph? That uses FXAA, which is nearly "free" to enable.
  • Crazyeyeskillah - Friday, November 21, 2014

    No, more on some of the high-end numbers where AA starts to get redundant, especially at 4K. I loved Crysis when it came out and it slapped my 7900 GTX SLI around, because I knew it was the start of something great to come. This game does have some nice touches, especially in the quantity of NPCs on screen, the use of AI, and the level of detail for such an expansive city, but it's nowhere close to heralding a new vision of what's to come in terms of textures and reach. Most people are going to set it to the highest textures, turn off AA, and get their playable fps at whatever resolution their card supports, so I have to admit this is the first time I've really felt a little leery at the state of a game presented on ANANDTECH. I've been reading the site since it launched, but this benchmark piece just didn't give me a sense of what performance is really going to be like across various setups.
  • FITCamaro - Thursday, November 20, 2014

    I don't understand how they can do a poorer job of porting the game to PC for AMD hardware than for NVIDIA when the consoles are using AMD GPUs. Unless they built it for PC with NVIDIA in mind and then did a crappy job of porting it to consoles. Of course, given the game's poor performance on consoles, that isn't hard to believe.

    Ubisoft is quickly becoming the new EA. I won't be buying this game this year. Probably in a year when it's down to $20 and they've maybe patched it to a reasonable state. I say maybe because Watch Dogs has been out for months and is still pretty bad.
  • FlushedBubblyJock - Thursday, November 20, 2014

    The bleeding edge has to be pushed, lest there be no need for more.
    The same thing was said about Crysis, and it wound up being the most famous performance-crushing game ever – and still is, until perhaps now.
    So getting down on leading-edge games that present a challenge to GPU designers is not in our best interest.
    It's also nice to see a "port" frustrate the highest-end elite desktops, with the whining for once not being about how cruddy ports are for any sort of gaming, but about "how slow my thousands of dollars are".
    I'm very glad to see it crushing the best of the best. We need more of this at a faster rate; then hopefully we won't hear so often that "the increase with the new core isn't worth it".

    Now the GPU makers must overcome; a challenge is a good thing.
  • Horza - Thursday, November 20, 2014

    This would be a reasonable sentiment if the game were in fact "bleeding edge" graphically. Crysis was a landmark visually (and still looks impressive), and I feel very safe wagering that Unity will not be remembered in anything close to the same way. Anyone can make a game that brings "elite" hardware to its knees; that's not an impressive feat on its own if the game doesn't deliver the experience to justify it.
  • Daggard - Thursday, November 20, 2014

    *shrug* runs fine on my PS4. I'd give it more of an 8.5 personally. Paris is the best playground yet for this series. Online features are still being ironed out, but the game is great :)
