Crysis 3

Still one of our most punishing benchmarks, Crysis 3 needs no introduction. With Crysis 3, Crytek went back to trying to kill computers, and the game still holds the “most punishing shooter” title in our benchmark suite. Only a handful of setups can even run Crysis 3 at its highest (Very High) settings, and that’s still without AA. Crysis 1 was an excellent template for the kind of performance required to drive games for the next few years, and Crysis 3 looks to be much the same for 2015.

Crysis 3 - 3840x2160 - High Quality + FXAA

Crysis 3 - 3840x2160 - Low Quality + FXAA

Crysis 3 - 2560x1440 - High Quality + FXAA

Under Crysis 3 the R9 Fury once again has the lead, though the size of that lead varies noticeably with resolution. At 4K it’s 14% or so, but at 1440p it’s just 5%. This is consistent with the general trend for AMD and NVIDIA cards, which is that AMD sees better performance scaling at higher resolutions, and it is a big part of the reason why AMD is pushing 4K for the R9 Fury X and R9 Fury. Still, based on absolute performance, the R9 Fury is probably better suited for 1440p.

Meanwhile the R9 Fury cards once again consistently trail the R9 Fury X by no more than 7%. Crysis 3 is generally more sensitive to changes in shader throughput, so it’s interesting to see that the performance gap is as narrow as it is here. These kinds of results imply that the R9 Fury X’s last 512 stream processors aren’t being put to very good use, since most of the performance difference can be accounted for in the clockspeed difference.
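As a back-of-the-envelope check on that claim, the published specifications (R9 Fury X: 4096 stream processors at 1050MHz; R9 Fury: 3584 stream processors at 1000MHz) let us compare the observed gap against what clockspeed alone would predict. This is a rough sketch, not a performance model:

```python
# Published specs: stream processor count and boost clock (MHz).
fury_x_sps, fury_x_clock = 4096, 1050
fury_sps,   fury_clock   = 3584, 1000

# If performance tracked clockspeed only, the Fury X's lead would be:
clock_ratio = fury_x_clock / fury_clock                               # 1.05, i.e. 5%

# If every stream processor were perfectly utilized, the lead would be:
throughput_ratio = (fury_x_sps * fury_x_clock) / (fury_sps * fury_clock)  # 1.20, i.e. 20%

print(f"Clock-only prediction:      +{(clock_ratio - 1) * 100:.0f}%")
print(f"Perfect-scaling prediction: +{(throughput_ratio - 1) * 100:.0f}%")
```

The observed gap of no more than 7% sits far closer to the 5% clock-only figure than to the 20% perfect-scaling figure, which is why most of the difference can be pinned on clockspeed rather than the Fury X's extra shaders.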


  • Oxford Guy - Saturday, July 11, 2015 - link

    2% cost difference is likely to be erased by sale pricing at various times.
  • darkfalz - Saturday, July 11, 2015 - link

    My 980 is overclocked about 15% from stock, and it's a poor overclocker despite running cool. These cards struggle to hit 10%. I also can't go back 6 months and buy an R9 Fury. And NVIDIA's next release is likely around the corner. I think they're approximately equal value - which is good for AMD fans, but it's been a long wait for them to have a card comparable to what NVIDIA enthusiasts have been enjoying for a year!
  • Flunk - Friday, July 10, 2015 - link

    It's nice to see AMD win a segment. I'm not sure that the Fury X matters that much in the grand scheme of things, seeing that it's the same price as the better performing Geforce 980 TI.

    The Fury seems to overclock to almost match the Fury X, making it a good enthusiast buy.
  • cmikeh2 - Friday, July 10, 2015 - link

    If you're willing to overclock though, you can get a good 15+ percent out of the 980 and pretty much bring it even with an OCed Fury for a little less money.
  • looncraz - Friday, July 10, 2015 - link

    But as soon as voltage control is unlocked the Fury will probably eke out at least another 100MHz or more, which will put it healthily out of reach of the 980. And once a few more driver issues (such as GTA V performance) are resolved, the Fury's performance will improve even more.

    HBM has a different performance profile, and AMD is still accommodating that. And, of course, if you turn the nVidia image quality up to AMD levels, nVidia loses a few extra percent of performance.

    The GTX 980 vs R9 Fury question is easy to answer (until a 980 price drop). The Fury X vs 980 Ti question is slightly more difficult (but the answer tends to go the other way, the AIO cooler being the Fury X's main draw).
  • D. Lister - Saturday, July 11, 2015 - link

    "if you turn the nVidia image quality up to AMD levels, nVidia loses a few extra percent of performance."

    Surely we have some proof to go along with that allegation... ?
  • silverblue - Saturday, July 11, 2015 - link

    I've heard the same thing, although I believe it was concerning the lack of anisotropic filtering on the NVIDIA side. However, anisotropic filtering is very cheap nowadays as far as I'm aware, so it's not really going to shake things up much whether it's on OR off, though image quality does improve noticeably.
  • D. Lister - Saturday, July 11, 2015 - link

    Err...

    http://international.download.nvidia.com/webassets...

    You mean to say that it doesn't work like it is supposed to?
  • silverblue - Monday, July 13, 2015 - link

    I'm not sure what you're getting at. In any case, I was trying to debunk the myth that turning off AF makes a real difference to performance.
  • FlushedBubblyJock - Wednesday, July 15, 2015 - link

    no, there's no proof, the proof of course is inside the raging gourd of the amd fanboy, never to be unlocked by mere sane mortal beings.
