Crysis 3

Still one of our most punishing benchmarks, Crysis 3 needs no introduction. With it, Crytek went back to trying to kill computers, and the game still holds the “most punishing shooter” title in our benchmark suite. Only a handful of setups can even run Crysis 3 at its highest (Very High) settings, and even then without AA. Crysis 1 was an excellent template for the kind of performance required to drive games for the next few years, and Crysis 3 looks to be much the same for 2015.

Crysis 3 - 3840x2160 - High Quality + FXAA

Crysis 3 - 3840x2160 - Low Quality + FXAA

Crysis 3 - 2560x1440 - High Quality + FXAA

Under Crysis 3 the R9 Fury once again has the lead, though the size of that lead varies noticeably with resolution. At 4K it’s 14% or so, but at 1440p it’s just 5%. This is consistent with the general trend for AMD and NVIDIA cards, namely that AMD sees better relative scaling at higher resolutions, and it’s a big part of the reason why AMD is pushing 4K for the R9 Fury X and R9 Fury. Still, on absolute framerates the R9 Fury is probably better suited for 1440p.

Meanwhile the R9 Fury cards once again consistently trail the R9 Fury X by no more than 7%. Crysis 3 is generally more sensitive to changes in shader throughput, so it’s interesting to see that the performance gap is as narrow as it is here. Results like these imply that the R9 Fury X’s last 512 stream processors aren’t being put to very good use, since most of the performance difference can be accounted for by the clockspeed difference alone.
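To put rough numbers on that, here is a minimal back-of-the-envelope sketch, purely for illustration. It assumes the cards’ reference specifications (4096 stream processors at 1050MHz for the R9 Fury X versus 3584 at 1000MHz for the R9 Fury) and the ~7% gap observed above; it is not measured scaling data.

```python
# Back-of-the-envelope check on the Fury X -> Fury gap in Crysis 3.
# Assumed reference specs (not taken from the benchmark data itself):
#   R9 Fury X: 4096 stream processors @ 1050MHz
#   R9 Fury:   3584 stream processors @ 1000MHz

fury_x_clock, fury_clock = 1050, 1000   # MHz, reference boost clocks
fury_x_sps, fury_sps = 4096, 3584       # stream processor counts

clock_gap = fury_x_clock / fury_clock - 1            # ~5.0% from clockspeed alone
shader_gap = fury_x_sps / fury_sps - 1                # ~14.3% more shaders on Fury X
ideal_gap = (1 + clock_gap) * (1 + shader_gap) - 1    # ~20% if both scaled perfectly

observed_gap = 0.07  # ~7% gap seen in the charts above

print(f"clock-only gap:      {clock_gap:.1%}")
print(f"perfect-scaling gap: {ideal_gap:.1%}")
print(f"observed gap:        {observed_gap:.1%}")
# With only ~7% observed, the 5% clock advantage covers most of the lead,
# leaving little that can be attributed to the extra 512 stream processors.
```

If Crysis 3 scaled perfectly with both clockspeed and shader count, the Fury X would lead by roughly 20%; the fact that it leads by closer to 7% is what suggests the last 512 stream processors are doing comparatively little here.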


288 Comments


  • Oxford Guy - Thursday, July 16, 2015 - link

    "What exactly is the logic there?"

    I really need to spell it out for you?

    The logic is that the 480 was a successful product despite having horrid performance per watt and a very inefficient (both in terms of noise and temps) cooler. It didn't get nearly the gnashing of teeth the recent AMD cards are getting and people routinely bragged about running more than one of them in SLI.
  • CiccioB - Thursday, July 16, 2015 - link

    No, it was not a successful product at all, though it was still the fastest card on the market.
    The successful card was the 460, launched a few months later, and certainly the 570/580 cards, which brought the corrections to the original GF100 that NVIDIA itself admitted was bugged.
    Here, instead, we have a card which uses a lot of power, is not on top of the charts, and has no fix on the horizon.
    The difference is that with GF100 NVIDIA messed up the implementation of the architecture, which was then fixed; here we are seeing the most advanced implementation of a not-so-good architecture that for 3 years has struggled to keep pace with the competition, which in the end decided to go with 1024 shaders + a 128-bit bus in a 220mm^2 die against 1792 shaders + a 256-bit bus in a 356mm^2 die, instead of trying to win the latest longest-fps-bar war.
    AMD, please, review your architecture completely or we are doomed at the next production process.
  • Oxford Guy - Tuesday, July 21, 2015 - link

    "No, it was not a successful product at all"

    It was successful. Enthusiasts bought them in significant numbers and review sites showed off their two and three card rigs. The only site that even showed their miserable performance per watt was TechPowerUp.
  • Count Vladimir - Thursday, July 16, 2015 - link

    So we are discussing 6 year old products now? Is that your version of logic? Yes, it was hot, and yes, it was buggy, but it was still the fastest video card of its era; that's why people bragged about SLI'ing it. Fury X isn't.
  • Oxford Guy - Tuesday, July 21, 2015 - link

    "So we are discussing 6 year old products now?" strawman
  • celebrevida - Thursday, July 16, 2015 - link

    Looks like Jason Evangelho of PCWorld has the matter settled. In his article:
    http://www.pcworld.com/article/2947547/components-...

    He shows that the R9 Fury x2 is on par with the GTX 980 Ti x2 and blows away the GTX 980 x2. Considering that the R9 Fury x2 is much cheaper than the GTX 980 Ti x2, and that the R9 Fury is also optimized for the upcoming DX12, it looks like the R9 Fury is the clear winner in cost/performance.
  • xplane - Saturday, October 17, 2015 - link

    So with this GPU I could use 5 monitors simultaneously? Right?
  • kakapoopoo - Wednesday, January 4, 2017 - link

    I got the Sapphire version up to 1150MHz stable using MSI Afterburner without changing anything else.
