Crysis 3

Still one of our most punishing benchmarks, Crysis 3 needs no introduction. With Crysis 3, Crytek has gone back to trying to kill computers, and the game still holds the "most punishing shooter" title in our benchmark suite. Only in a handful of setups can we even run Crysis 3 at its highest (Very High) settings, and that's still without AA. Crysis 1 was an excellent template for the kind of performance required to drive games for the next few years, and Crysis 3 looks to be much the same for 2014.

Crysis 3 - 1920x1080 - Medium Quality + FXAA

Crysis 3 - 1920x1080 - Low Quality + FXAA

Crysis 3 ends up being the other game that AMD's latest cards have some trouble with. The R7 260 simply doesn't have what it takes to catch the GTX 650 Ti, seemingly due to a memory bandwidth bottleneck. The R7 265, meanwhile, isn't going to catch the GTX 660, but with its full 256-bit memory bus and the higher memory clockspeeds Pitcairn parts enjoy on the 200 series versus the 7800 series, it has enough memory bandwidth to hold close to the GTX 660.

On that note, this is by far the biggest lead the R7 265 holds over the 7850. Between the higher core clockspeed and the even greater memory bandwidth increase, it pulls well ahead of its most direct predecessor, giving it enough performance to average better than 60fps even at Medium settings, which is quite the accomplishment for a sub-$150 card in Crysis 3.
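For readers who want to sanity-check the bandwidth argument, GDDR5 throughput falls straight out of bus width and effective data rate. Below is a minimal sketch; the per-card data rates and bus widths are the reference specifications as we understand them, so treat the exact figures as illustrative:

    # Minimal sketch: GDDR5 bandwidth (GB/s) = effective data rate (Gbps/pin) * bus width (bits) / 8
    # Data rates and bus widths are reference specs (illustrative); actual boards may vary.
    cards = {
        "R7 265":  (5.6, 256),
        "HD 7850": (4.8, 256),
        "GTX 660": (6.0, 192),
    }
    for name, (gbps, bus_bits) in cards.items():
        print(f"{name}: {gbps * bus_bits / 8:.1f} GB/s")

Run through the math and the R7 265 comes out at 179.2 GB/s versus 153.6 GB/s for the 7850 and roughly 144 GB/s for the GTX 660, which helps explain both the gap over its predecessor and its ability to hang close to NVIDIA's card here.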

52 Comments

  • just4U - Thursday, February 13, 2014 - link

While you may be right... AMD/ATI does like throwing popular configurations into the mix. The 265 reminds me a lot of the 4830, and while that card was fairly short-lived, it was a hot seller for them, as it straddled two performance areas but came in at a nicer price point.
  • jabber - Friday, February 14, 2014 - link

Indeed. I swapped from being a longtime Nvidia user to AMD back in 2009, as I got fed up with Nvidia regurgitating the old 8800 chips three times in a row for the mid-level.

Stuff doesn't have to change radically performance-wise, but it's nice to know new features are added and other things get revised and tweaked. A simple name change isn't enough, really.
  • MrSpadge - Thursday, February 13, 2014 - link

I'm actually happy they're finally making use of that last digit in their 3-number scheme. From my point of view they could have ditched the X altogether and made the R9-270X an R9-275 (or whatever is appropriate). And speaking of R9: they could have given the R7 265 the rating R9 265 to more closely connect it with the R9 270. Or just drop that prefix as well, if the numbers don't overlap anyway and the R9/7/3 is not related to features either!

    Speaking about the cards:
- boost clocks an additional 25 MHz again? I have no idea why these are there. Make it 100+ MHz or leave it.
    - 1.175 V for a mere 925 MHz? The chip should be able to do 1.0 GHz at ~1.0 V, maybe 1.10 V for guaranteed clocks
    - same for the R7 260 - that voltage is ridiculously high

    Anyway, the cards themselves are fine (just like the 7000 series) and the coolers really fit them.
  • silverblue - Thursday, February 13, 2014 - link

    The single GPU frame latency issue has been fixed for more than six months. I doubt it's going to become a problem again like with AMD's handling of 2D a while back.

There are remarks concerning the availability of the R9 270 series and the inability of these parts to keep to their RRP, neither of which would be present if this were some sort of fanboy review.
  • Spuke - Thursday, February 13, 2014 - link

    Has it been 6 months? I thought they recently fixed that problem.
  • silverblue - Thursday, February 13, 2014 - link

    It was fixed in Cat 13.8 Beta 1, dated 1st August.
  • silverblue - Thursday, February 13, 2014 - link

    My bad - that's when CrossFire had its first fix. Apparently, single-GPU was fixed beforehand, though I can't find which driver version it was.
  • Solid State Brain - Thursday, February 13, 2014 - link

Anandtech: it would be interesting if you tested idle power consumption in multi-monitor scenarios. I think you'll find some surprises.
  • creed3020 - Thursday, February 13, 2014 - link

    Excellent point!

I had a friend with a 6950, and he was furious that his video card would never idle down in GPU/memory frequencies when he had a second monitor connected.

    I personally have a 6850 and two 20" LCDs connected over DVI. I have not looked for the same behaviour but would not be surprised if it were the same.

Power efficiencies are out the window once the user chooses to go multi-monitor to be more productive.
  • Solid State Brain - Thursday, February 13, 2014 - link

I have the same issue with my HD7770 to a lesser extent, and my workaround is connecting my two secondary displays to the integrated Intel GPU. This saves a significant amount of power.
