Test System and Benchmarks

With that introduction out of the way, let's just get straight to the benchmarks, and then I'll follow up with a discussion of image quality and other aspects at the end. As usual, the test system is what I personally use, which is a relatively high-end Haswell configuration. Most of the hardware was purchased at retail over the past year or so, and that means I don't have access to every GPU configuration available, but I did just get a second ZOTAC GTX 970 so I can at least finally provide some SLI numbers (which I'll add to the previous Benchmarked articles in the near future).

Gaming Benchmarks Test Systems
CPU: Intel Core i7-4770K (4x 3.5-3.9GHz, 8MB L3)
     - Overclocked to 4.1GHz
     - Underclocked to 3.5GHz with two cores ("i3-4330")
Motherboard: Gigabyte G1.Sniper M5 Z87
Memory: 2x8GB Corsair Vengeance Pro DDR3-1866 CL9
GPUs: Desktop GPUs:
     - Sapphire Radeon R9 280
     - Sapphire Radeon R9 280X
     - Gigabyte Radeon R9 290X
     - EVGA GeForce GTX 770
     - EVGA GeForce GTX 780
     - Zotac GeForce GTX 970
     - Reference GeForce GTX 980
     Laptops:
     - GeForce GTX 980M (MSI GT72 Dominator Pro)
     - GeForce GTX 880M (MSI GT70 Dominator Pro)
     - GeForce GTX 870M (MSI GS60 Ghost 3K Pro)
     - GeForce GTX 860M (MSI GE60 Apache Pro)
Storage: Corsair Neutron GTX 480GB
Power Supply: Rosewill Capstone 1000M
Case: Corsair Obsidian 350D
Operating System: Windows 7 64-bit

We're testing with NVIDIA's 344.65 drivers, which are "Game Ready" for Assassin's Creed: Unity. (I also ran a couple sanity checks with the latest 344.75 drivers and found no difference in performance.) On the AMD side, testing was done with the Catalyst 14.11.2 driver that was released to better support ACU. AMD also released a new beta driver for Far Cry 4 and Dragon Age: Inquisition (14.11.2B), but I have not had a chance to check performance with that yet. No mention is made of improvements for ACU with the driver, so it should be the same as the 14.11.2 driver we used.

One final note is that thanks to the unlocked nature of the i7-4770K and the Gigabyte motherboard BIOS, I'm able to at least mostly simulate lower performance Haswell CPUs. I didn't run a full suite of tests with a second "virtual" CPU, but I did configure the i7-4770K to run similarly to a Core i3-4330 (3.5GHz, 2C/4T) – the main difference being the CPU still has 8MB L3 cache where the i3-4330 only has 4MB L3. I tested just one GPU with the slower CPU configuration, the GeForce GTX 980, but this should be the best-case result for what you could get from a Core i3-4330.

Assassin's Creed: Unity 4K High

Assassin's Creed: Unity QHD Ultra

Assassin's Creed: Unity 1080p Ultra

Assassin's Creed: Unity 1080p High

Assassin's Creed: Unity 1080p Medium

Did I mention that Assassin's Creed: Unity is a beast to run? Yeah, OUCH! 4K gaming is basically out of the question on current hardware, and even QHD is too much at the default Ultra settings. Also notice how badly the GTX 770 does at the Ultra settings, which appears to be due to its 2GB of VRAM; I logged system usage for the GTX 770 at QHD Ultra and found that the game was trying to allocate nearly 3GB of VRAM, which on a 2GB card means there's going to be a lot of texture thrashing. (4K at High quality also uses around 3GB of VRAM, if you're wondering.) The asterisk is there because I couldn't actually run the benchmark, so I used a "Synchronize" from the top of a tower instead, which is typically slightly less demanding than our actual benchmark run.
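If you want to check VRAM use on your own NVIDIA card, here's a minimal sketch of the sort of logging I did. To be clear, this isn't the exact tool used for the article, but the nvidia-smi flags shown are standard:

```python
import subprocess

# One-shot query for used VRAM; these are standard nvidia-smi flags.
QUERY = ["nvidia-smi", "--query-gpu=memory.used", "--format=csv,noheader"]

def parse_mem_mib(field: str) -> int:
    # nvidia-smi reports fields like "2870 MiB"; keep just the number.
    return int(field.strip().split()[0])

def current_vram_mib() -> int:
    # Runs one query; requires an NVIDIA GPU and driver to be present.
    out = subprocess.check_output(QUERY, text=True)
    return parse_mem_mib(out.splitlines()[0])
```

Polling something like current_vram_mib() once a second while the game runs is enough to spot when allocations blow past a card's 2GB or 3GB.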

Anyway, all of the single GPUs are basically unplayable at QHD Ultra settings, and a big part of that looks to be the higher resolution textures. Dropping the texture quality to High can help, but really the game needs a ton of GPU horsepower to make QHD playable. GTX 970 SLI basically gets there, though again I'd suggest dropping the texture quality to High in order to keep minimum frame rates closer to 30. Even at 1080p, I'd suggest avoiding the Ultra setting – or at least Ultra texture quality – as there's just a lot of stutter. Sadly, while the GTX 980M and 880M both have 8GB of GDDR5, their performance at Ultra settings is too low to really be viable, though they do show slightly better minimums relative to the other GPUs.

As we continue down the charts, NVIDIA's GTX 780 and 970 (and faster) cards finally reach the point where performance is totally acceptable at 1080p High (and you can tweak a few settings like turning on HBAO+ and Soft Shadows without too much trouble). What's scary is that looking at the minimum frame rates along with the average FPS, the vast majority of GPUs are still struggling at 1080p High, and it's really only 1080p Medium where most midrange and above GPUs reach the point of playability.
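For reference, the usual way to get average and minimum FPS figures like these from a frametime log (as captured by FRAPS or a similar tool) looks like the following; the sample frame times are made up for illustration:

```python
def fps_stats(frametimes_ms):
    # Average FPS: total frames divided by total elapsed time.
    avg = 1000.0 * len(frametimes_ms) / sum(frametimes_ms)
    # Instantaneous minimum FPS: the single longest frame in the run.
    low = 1000.0 / max(frametimes_ms)
    return avg, low

# A made-up log: mostly ~60FPS frames with one 50ms stutter.
sample = [16.7, 16.7, 16.7, 50.0, 16.7, 16.7]
avg, low = fps_stats(sample)
```

This is why a single bad stutter drags the minimum down so visibly even when the average still looks healthy.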

There's a secondary aspect to the charts that you've probably noticed as well. Sadly, AMD's GPUs really don't do well right now with Assassin's Creed: Unity. Some of it is almost certainly drivers, and some of it may be due to the way things like GameWorks come into play. Whatever the cause, ACU is not going to be a great experience on any of the Radeon GPUs right now.

I did some testing of CrossFire R9 290X as well, and while the game ran, performance was no better than a single 290X – and minimum frame rates were down – so CrossFire (without any attempt to create a custom profile) isn't viable yet. Also note that while SLI "works", there are rendering issues at times. Entering/exiting the menu/map, or basically any time there's a full screen post-processing filter, you get severe flicker (a good example is when you jump off a tower into a hay cart; you'll notice flicker on the periphery as well as on Arno's clothing). I believe these issues happen on all the multi-GPU rigs, so it might be more of a game issue than a driver issue.

I even went all the way down to 1600x900 Medium to see if that would help any of AMD's GPUs; average frame rates on the R9 290X basically top out at 48FPS with minimums still at 25 or so. I did similar testing on NVIDIA and found that with the overclocked i7-4770K ACU maxes out at just over 75 FPS with minimums of 50+ FPS. We'll have to see if AMD and/or Ubisoft Montreal can get things working better on Radeon GPUs, but for now it's pretty rough. That's not to say the game is unplayable on an R9 290X, as you can certainly run 1080p High, but there are going to be occasional stutters. Anything less than the R9 290/290X and you'll basically want to use Low or Medium quality (with some tweaking).

Finally, I mentioned how 2GB GPUs are really going to have problems, especially at higher texture quality settings. The GeForce GTX 770 is a prime example of this; even at 1080p High, minimum frame rates are consistently dropping into the low teens and occasionally even single digits, and Medium quality still has very poor minimum frame rates. Interestingly, at 1600x900 Medium the minimum FPS basically triples compared to 1080p Medium, so if the game is using more than 2GB VRAM at 1080p Medium it's not by much. This also affects the GTX 860M (1366x768 Low is pretty much what you need to run on that GPU), and the 1GB R7 250X can't even handle that. And it probably goes without saying, but Intel's HD 4600 completely chokes with ACU – 3-7 FPS at 1366x768 is all it can manage.

What About the CPU?

I mentioned earlier that I also underclocked the Core i7-4770K and disabled a couple CPU cores to simulate a Core i3-4330. It's not a fully accurate simulation, but just by way of reference the multi-threaded Cinebench 11.5 score went from 8.08 down to 3.73, which looks about right give or take a few percent. I only tested the GTX 980 with the slower CPU, but this is basically the "best case" for what a Core i3 could do.
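As a back-of-the-envelope check on that Cinebench drop, naive core/clock scaling gets you in the right neighborhood. Note that the ~3.7GHz all-core clock below is my assumption, and this ignores the L3 cache and Hyper-Threading differences:

```python
def scaled_score(score, cores_from, cores_to, ghz_from, ghz_to):
    # Naive linear scaling with core count and clock speed; real results
    # deviate due to Hyper-Threading, turbo behavior, and cache size.
    return score * (cores_to / cores_from) * (ghz_to / ghz_from)

# Hypothetical: 4 cores at an assumed ~3.7GHz all-core clock
# down to 2 cores at 3.5GHz.
est = scaled_score(8.08, 4, 2, 3.7, 3.5)  # roughly 3.8
```

That lands within a few percent of the measured 3.73, which is why "about right give or take a few percent" is a fair description.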

Looking at the above 1080p charts, you can see that with the slower CPU the GTX 980 takes quite the hit to performance. In fact, the GTX 980 with a "Core i3" Haswell CPU starts looking an awful lot like the R9 290X: it's playable in a pinch, but the minimum frame rates will definitely create some choppiness at times. I don't have an AMD rig handy to do any testing, unfortunately, but I'd be surprised if the APUs are much faster than the Core i3.

In short, not only do you need a fast GPU, but you also need a fast CPU. And the "just get a $300 console" argument doesn't really work either, as frame rates on the consoles aren't particularly stellar from what I've read. At least one site has found that both the PS4 and Xbox One fail to maintain a consistent 30FPS or higher frame rate.

Benchmarked - Assassin's Creed: Unity Image Quality and Settings
122 Comments

  • silverblue - Thursday, November 20, 2014 - link

    There's been a rather intense furore following AC:U's launch. Essentially, Ubisoft have blamed AMD for poor performance, and then partially retracted said statement. Considering how closely Ubisoft works with NVIDIA, it sounds like they've only developed for GeForce cards... but some NVIDIA users are having issues as well. What's more, Far Cry 4 is also causing issues with AMD hardware. Both were developed by the same software house.

    All in all, it looks more likely that Ubisoft Montreal's internal testing and QA is not up to scratch. You can't simply blame one vendor's CPUs and GPUs for poor performance when you've not bothered to optimise your code for anybody barring NVIDIA. I've even heard that playing offline results in a HUGE performance boost across the board...
  • Friendly0Fire - Thursday, November 20, 2014 - link

    More like a yearly release schedule is untenable for a game of this scale. Corners had to be cut somewhere.
  • silverblue - Thursday, November 20, 2014 - link

    Logical, but even then, it's poor form for Ubisoft to slate AMD for what is most likely their own fault as opposed to poor drivers.
  • FlushedBubblyJock - Thursday, November 20, 2014 - link

    It's amazing how it's never AMD's fault, no matter what. No matter how poorly they do. No matter how many features they do not have, or only have as a ridiculed braggart's vaporware. No matter how long it takes them to continue failing and not delivering, it's not their fault.
    "Never AMD's Fault" should be their slogan, since "Never Settle" is the opposite of the truth with AMD.
  • ppi - Thursday, November 20, 2014 - link

    While I would agree with you that AMD has been relegated to an ultra-low-budget, inconsequential player on the CPU front, with respect to GPUs I am not certain where you have been living the last couple of years, whether on Mars or under some rock.

    Since the HD 4000 series, AMD has been running neck and neck with nVidia, sometimes kicking it soundly in the rearside, e.g. the Radeon 5870 vs. the rebadged GeForce 8800, sometimes being a bit behind, until the Maxwell 980 and 970 parts came out a couple of months ago. But even now, the fastest AMD offering is still at least on par with the 2nd fastest nVidia offering performance-wise (the issue is rather power consumption). And drivers-wise, there are lots of games coming out with very good graphical fidelity that have no issues on AMD cards.

    Those who failed big time here are Ubisoft's managers, who (probably wishing to please the shareholders) wanted to rush the games out before the December holiday season to get extra bucks, and allowed proper QA to be skipped. There is absolutely no excuse whatsoever for neglecting GPUs that still make up 1/3 of the market (and mind you, nVidia performance is reportedly far from perfect as well). If the AMD cards did not work, they either should not have released the game at all, or released it nVidia-only/AMD-beta-only.

    I do hope this backfires on Ubisoft in such a way that, instead of buying now, people buy these games a year later, in the 2015 Steam sale season.
  • D. Lister - Friday, November 21, 2014 - link

    "(the issue is rather power consumption)"

    Imagine what the nvidia hardware could do with the same power budget. And it isn't just power, but also temps and noise. How come AMD's default coolers are the worst on the market, yet nvidia's default coolers, esp. for the higher-end models, are some of the best? How come it took AMD more than a decade to address the multi-GPU micro-stutter issue in the drivers? And how about the alleged boost in CPU performance that AMD promised with Win 8, that never quite took off?

    AMD hires from the same talent pool as their competition, but ultimately, it is their consistent corner-cutting and false promises that hurt their business and relegates them to a lower tier.

    I apologise if I offended any AMD fans, but please understand this: you aren't on AMD's side and I'm not on nvidia/intel's side... it is us, the consumers, who are all on the same side, and unless we demand the quality that we are paying for, every now and then someone will try to get away with BSing us out of our hard-earned cash.
  • FragAU - Friday, November 21, 2014 - link

    You are kidding right? I have been fortunate enough to essentially own every top-end GPU since the days of 3DFX Voodoo (and before!). AMD has certainly released some absolute monster cards and has been responsible for keeping Nvidia in check since all other competition ceased to exist. Both companies have had their fair share of the performance crown.

    Currently I own 2x 290X and have since their launch – I run every single game (aside from the topic of this one) at Ultra settings without issue (both watercooled, so nice and silent too). Ubisoft is just plain rubbish these days; heck, look at the state of their cruddy GTA wannabe Watch Dogs – that game had issues on any PC. Tell me how Black Flag can run flawlessly and then this game runs like absolute crud? Sure, a 980 should be in front, but the 780 Ti/290X shouldn't be that far behind.

    I will freely admit that Nvidia usually does have more solid drivers in early releases, but nothing that has been a deal breaker. Having run SLI and CF since the early days, I can tell you that both have their share of issues. Anyway, all I can say is you had better hope that AMD keeps on the heels of Nvidia, or you will be paying $700 for the same GPU for 3 generations.
  • silverblue - Friday, November 21, 2014 - link

    CrossfireX was only introduced in September 2005. Granted, the time from then to a viable fix was about 8 years (which is still a very long time), but there are two factors to consider here – how long has it been a problem, and how long did it take AMD to acknowledge it themselves? The discrepancy between the two is what they really need to be judged upon, not how long it took for a solution to reach the user base. Promising fixes is one thing; burying your head in the sand and flat out ignoring your customers is quite another.

    FlushedBubblyJock mentioned it never being AMD's fault for this, that and the other. You'd have to be blinkered to subscribe to that particular theory. AMD's biggest problem is delivery – VCE support was a joke for a long time; some might say their DirectX drivers are in need of serious work; TrueAudio doesn't appear to be having any impact... to name a few. Team Green is in the ascendancy right now, and AMD can't release a counter quickly enough, so they look to have no answer for Maxwell. It's almost all down to delivery, and we can only hope they improve in this area. It's not about being a fanboy, but about bringing some objectivity to the discussion.
  • ppi - Friday, November 21, 2014 - link

    Yes, right. But my point was mainly that graphical glitches and poor performance in ONE PARTICULAR GAME, sponsored by AMD's competitor, should be blamed on Ubisoft's QA and their rushing to get the game out for Christmas, rather than on AMD.

    AMD do disappoint me though. Case example: when LCDs came out, I thought – great, now we will be able to get variable refresh rates. But lo and behold, 10 years passed and nothing, until nVidia came out with G-Sync. And then we learned AMD had done it; they had it RIGHT IN FRONT OF THEIR EYES, and they did not see the benefits, but instead tried to sell it as some crappy energy-saving thingy. *facepalm* It is clear their development lacks some people who would focus on improving the *game experience*.

    (btw, from my last 6 gfx cards, 3 were nVidia, 3 AMD/ATI)
  • D. Lister - Saturday, November 22, 2014 - link

    @ silverblue

    "CrossfireX was only introduced in September 2005..."

    I'm sorry, -almost- a decade then. Because it is really inconsequential how long a particular phase takes in the entire process of solving a problem - what matters is how long it took the end-users to get their money's worth.

    Secondly, the defence that they just didn't know any better, while the competition apparently did (to the point that the competition had to develop a tool, FCAT, for AMD to actually see or recognise the problem), merely suggests that if they weren't being deliberately callous, they were just outright incompetent. Either way, the point is that they need to step up their game, because their customers and fans deserve better than what they have been bringing forth, and because the free market needs good competition.
