Similar to the last game we looked at, Lords of the Fallen, Assassin's Creed: Unity has had a bit of a rocky start with bugs and other issues needing to be ironed out. It also happens to be a very demanding game to run – at maximum quality, it will basically chew up any GPU you throw at it and spit out crispy bits of silicon. And it's not just GPUs that get eaten, as CPU power can have a substantial impact as well. Finally, and this is not necessarily correlated with the other items in this list, Assassin's Creed: Unity (ACU) is an NVIDIA "The Way It's Meant To Be Played" title, and it's also one of the notable games for NVIDIA's GameWorks toolset – ACU includes support for HBAO+, TXAA, PCSS, Tessellation (coming in a future patch), and now MFAA (which we looked at yesterday).

There's an interesting corollary to the above items that's worth getting out of the way: reviews of Assassin's Creed: Unity have so far been rather lackluster, with an overall average Metacritic score currently sitting at 70%. That's not particularly good for a series that has otherwise had good reviews – e.g. the last game, Black Flag, has an average score of 84%. Perhaps more telling is that the current average user review at Metacritic is an abysmal 2.1. Looking at the comments and reviews makes it abundantly clear that ACU tends to run like a slug on a lot of systems.

I think part of the problem is the mistaken idea many gamers have that they should be able to max out most settings in a game. Assassin's Creed has never been a particularly light series in terms of requirements, though at lower detail settings it was usually playable on a wide selection of hardware. With ACU, the requirements have shot up, especially for the higher quality settings; at the same time, the rendering quality even at Low is still quite good, and Medium is enough that most users should be content with the way it looks. But if you want to run at High, Very High, or Ultra quality, you'd better be packing some serious GPU heat. The other part of the problem is that the game was likely pushed out the door for the Christmas shopping season before it was fully baked, but that seems to happen every year.

There's another element to the Assassin's Creed: Unity launch worth pointing out: this is a multi-platform release, coming out simultaneously on PC, PS4, and Xbox One. By dropping support for the PS3 and Xbox 360, Ubisoft has opened the doors to much higher quality settings, but the requirements may also be too high for a lot of PCs. With the new generation of consoles now sporting 8GB RAM, we've seen a large jump in resource requirements for textures in particular. I mentioned in the Lords of the Fallen article that GPUs with less than 4GB VRAM may need to opt for lower quality settings; with ACU (at least in its current state with patch 1.2), you can drop the "may" from that statement and go in knowing full well that GPUs with 2GB of VRAM are going to struggle at times.

122 Comments

  • Jon Tseng - Thursday, November 20, 2014 - link

    Jarred, I know you didn't test for it, but any thoughts on how system memory affects things? The minimum is 6GB with 8GB recommended; I wonder what impact this has?

    (I've just gone from 4GB => 6GB to run this game; wondering if I need to replace the other two sticks too, or whether the fact that the swapfile will be on an SSD is enough.)
  • WatcherCK - Thursday, November 20, 2014 - link

    I was looking forward to TC: The Division, but given Ubisoft's recent track record and inherent game factors (new engine, MMO, RPG aspects), I'm just not sure it will be anything except a colossal balls-up.
  • Mondozai - Thursday, November 20, 2014 - link

    I agree with many other commenters about the strangely sanguine tone of this article, which breezes past the massive performance bottlenecks and urges people to upgrade their hardware instead of pointing the finger where it belongs - at Ubisoft - for releasing what is essentially a botched game in terms of performance. You should be running 60+ FPS at 1080p High with a 290/780. Instead you barely get to 45 frames with a 780.

    The fact that even a 980(!) can't get over 60 fps at 1080p High means that it's the game that should be canned, not the reader base's hardware. Do better, Jarred.
  • JarredWalton - Thursday, November 20, 2014 - link

    That's certainly not what I'm doing. Just because the last sentence says, "And if this is the shape of things to come, a lot of people might want a GPU upgrade this holiday season" doesn't mean I'm telling everyone to upgrade. What I am saying is that IF you want to run THIS game (and IF other games end up having similar requirements in the near future), then yes, a lot of people will need new hardware (or lower quality settings).

    When Crysis came out, nothing -- NOTHING -- could run it properly at maximum quality settings. People skewered Crytek and said they were lousy programmers, etc. and "the game doesn't even look that good". And yet, I don't really think that was the case -- they just decided to enable settings that pushed beyond what was currently available.

    Is Ubisoft intentionally doing that with their latest releases? Perhaps not in quite the same way (it is the holiday season after all), but the decision to drop support for older generation consoles in order to enable a higher quality experience certainly wasn't made to improve the sales of the game. Believe it or not, there are game developers that just really want to use the latest and greatest technologies, performance be damned.

    Fundamentally, we're not a sensationalist website. We're not in the business of pointing fingers, casting blame, etc. All I can say is how the game works right now on the hardware I tested, and it's up to the readers to draw their own conclusions. Was the game pushed out early? Almost certainly. Should they design all games so that 1080p High gets 60+ FPS? I'm not one to dictate whether that's the best thing to do or not, and I do like seeing companies push the performance envelope on occasion.

    It hurts when your high-end GPU can't run a game with the settings you're accustomed to using, but I do have to say that their recreation of eighteenth-century France is quite remarkable.
  • mcmilhouse - Friday, November 21, 2014 - link

    ^This. Plus the Nvidia 900 series is still 28nm. We haven't had a 20nm card; Apple took all the TSMC production lines.
  • piroroadkill - Saturday, November 22, 2014 - link

    Crysis absolutely blew everything else away, graphically.

    That moment when you come out of the first forest and hit the rays streaming through the trees, and you look down over the cliffs.

    I don't think many people said it was coded badly (although they probably did), but it was such an incredible step up visually that people really took notice.

    Assassin's Creed Unity may also be a fantastic game visually, and I will get it at some point, but the fact is, console hardware is a measly set of Jaguar cores and low-to-midrange previous-generation Radeons.

    People are right to expect that their massively more powerful machines should be able to run the game at 60 FPS.
  • Milite777 - Thursday, November 20, 2014 - link

    I only have a laptop with a 2nd-gen i7, 8GB RAM, and a 6770M (2GB VRAM). I know this config is too weak for any serious gaming session... but I'd like to play ACU like I did the previous episodes. Could I get at least 25 fps at the lowest settings and a resolution of 1366x768? I don't need the best graphics, I just want to follow the story... And of course I'd have to buy the game to try it. Need help guys :)
  • chizow - Thursday, November 20, 2014 - link

    Interesting findings with the texture settings, Jarred; it looks like Santa (Ryan?) sent you some early X-mas presents too with the GPU upgrades. I would also be interested to see a kind of "feature expense" comparison, where you go through some of the main settings to give an idea of what kind of perf hit you take when enabling them at different levels.

    For example, I remember a time when setting textures to max was automatic, but in this age of 2K and now 4K textures with next-gen console ports, that's no longer possible since those textures will fill VRAM in a heartbeat. Also, did you have any >4GB or high-bandwidth cards to test, to see if they helped with the texture situation at all? Like a Titan Black?

    Lately I have seen textures and MSAA creating a much bigger perf hit than in the past due to the amount of VRAM they take up. There was a time when VRAM didn't make as much of a difference as shading power, and you could just crank up the textures and use MSAA without the crazy hits to perf we see today.
  • iceveiled - Thursday, November 20, 2014 - link

    I opted to turn everything up to max on my 970 (but with soft shadows turned off) and with a 30 fps locked frame rate (1080p). It plays butter smooth, but man, if any game benefits from a 60 fps frame rate it's Assassin's Creed, with its wonky input lag (I play it with a controller) and even wonkier world traversal / parkour.

    Takes a bit of getting used to, but at 30 fps it ain't all that bad and it's a damn nice looking game with the settings maxed out.
  • D. Lister - Friday, November 21, 2014 - link

    Ubisoft is just forming a pattern here of poorly optimised software. They have some of the best artists, but apparently some of the worst software developers. Also, I don't believe them for a second when they try to offload their incompetence on a hardware manufacturer.
