Crysis: Warhead

Kicking things off as always is Crysis: Warhead. It’s no longer the toughest game in our benchmark suite, but it’s still a technically complex game that has proven to be a very consistent benchmark. Even four years after the release of the original Crysis, “but can it run Crysis?” remains an important question, and the answer continues to be “no.” While we’re closer than ever, full Enthusiast settings at 60fps are still beyond the grasp of a single-GPU card.

Crysis: Warhead - 2560x1600 - Frost Bench - Enthusiast Quality + 4xAA

Crysis: Warhead - 1920x1200 - Frost Bench - Enthusiast Quality + 4xAA

Crysis: Warhead - 1680x1050 - Frost Bench - E Shaders/G Quality + 4xAA

While Crysis was a strong game for the GTX 580, the same cannot be said of the GTX 680. NVIDIA is off to a very poor start here, with the Radeon HD 7970 easily outperforming the GTX 680, and even the 7950 tied or nearly tied with the GTX 680 depending on the resolution. On the bright side, the GTX 680 does manage to outperform the GTX 580, but only by a relatively meager 17%.

Despite the large gap in theoretical performance between the GTX 680 and GTX 580, it turns out we’ve run into one of the few areas where the GTX 680 doesn’t improve on the GTX 580: memory bandwidth. In our overclocking results we discovered that a core overclock had almost no impact on Crysis, whereas a memory overclock improved performance by 8%, almost exactly matching the size of the overclock itself. When it comes to the latest generation of cards it appears that Crysis loves memory bandwidth, and that is something the Radeon HD 7900 series has in spades but the GTX 680 does not. Thankfully for NVIDIA, not every game is like Crysis.
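The bandwidth gap is easy to quantify from the published specs (256-bit bus at 6GHz effective for the GTX 680, 384-bit at 5.5GHz for the HD 7970). A quick back-of-the-envelope sketch; the linear fps-scaling model is an illustrative assumption for a purely bandwidth-bound workload, not a claim about our benchmark methodology:

```python
def bandwidth_gbps(bus_width_bits: int, effective_clock_mhz: float) -> float:
    """Peak GDDR5 memory bandwidth in GB/s: bytes per transfer x transfer rate."""
    return bus_width_bits / 8 * effective_clock_mhz * 1e6 / 1e9

gtx680 = bandwidth_gbps(256, 6008)  # ~192 GB/s
hd7970 = bandwidth_gbps(384, 5500)  # ~264 GB/s

# If a game is purely bandwidth-bound, fps scales linearly with memory
# clock, so an 8% memory overclock should yield roughly 8% more fps.
base_fps = 40.0  # hypothetical baseline for illustration
oc_fps = base_fps * 1.08

print(f"GTX 680:  {gtx680:.1f} GB/s")
print(f"HD 7970:  {hd7970:.1f} GB/s")
print(f"Estimated fps after 8% memory OC: {oc_fps:.1f}")
```

By this arithmetic the 7970 has roughly 37% more peak memory bandwidth than the GTX 680, which lines up with the overclock-scaling behavior we saw.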

Crysis: Warhead - Minimum Frame Rate - 2560x1600

Crysis: Warhead - Minimum Frame Rate - 1920x1200

Crysis: Warhead - Minimum Frame Rate - 1680x1050

The minimum framerate situation is even worse for NVIDIA here, with the GTX 680 clearly falling behind the 7950 and improving on the GTX 580 by only 10%. At its worst, Crysis absolutely devours memory bandwidth, and that leaves the GTX 680 underprepared.


  • CeriseCogburn - Sunday, March 25, 2012 - link

    They get to show AMD “catching up,” so they like it. They get to try to knock Kepler’s 2GB of RAM and make AMD’s 3GB shine, so they “can’t resist” - and when frame rates fall below playable, “all of a sudden” they “don’t care,” even when the attempt fails. They haven’t been able to resist since the 1.5GB GTX 580 vs. the 2GB 6950/6970 - blaming low RAM was a great game for explaining any changes.
    Then they tested the 6950 in 1GB and 2GB versions, and the 2GB was slower... but so what.
    Now 2GB Kepler has put the RAM lie to rest even in triple-monitor gaming... but any lesser win or loss or slimming margin can still be blamed on it. It gets people buying the AMD card, and they get really frustrated here when they can’t figure out why NVIDIA is winning when they don’t believe it should be. It’s always expressed in the article how shocked they are. So RAM is a convenient scapegoat. It’s always used as a “future proofing” notion as well, though no evidence has ever surfaced for that.
  • _vor_ - Sunday, March 25, 2012 - link

    What's with all the nerdrage? Do you work for NVIDIA?
  • formulav8 - Sunday, March 25, 2012 - link

    Get over yourself already. NVIDIA doesn’t even like you. I can’t believe how people feel about a stinking stupid corporation.
  • CeriseCogburn - Tuesday, March 27, 2012 - link

    It’s not about a corporation, it’s about facts, guy. Facts mean my friends and my readers get the best they can for the buck they’re paying.
    Just because AMD is behind and lies are told doesn’t mean the truth shouldn’t shine through!
    The truth shall shine through!
  • AnnonymousCoward - Sunday, March 25, 2012 - link

    Personally, I don’t care if the card has 64kB of RAM. Or 8 million stream processors. Performance, cost, power, and noise are what matter.

    And back to my point: performance in the 20-50fps range at 2560x1600 4xAA is meaningless and not a criterion for judgment.
  • CeriseCogburn - Tuesday, March 27, 2012 - link

    I never disagreed with that point; I merely explained why some things are done in such and such a way while other things are ignored.
    It’s not difficult at all.
  • Zephyr66z0r - Sunday, March 25, 2012 - link

    Well, I understand ‘some’ of the tech behind the GTX 680, but one thing stands out: the 256-bit bus width. When you see that from NVIDIA, it’s along the lines of a GTX 560... so does that mean there’s going to be a 384-bit (mid-high) or 512-bit (high-enthusiast, 256-bit + 256-bit, dual-GPU) card coming out?

    I can’t wait. Anyone done SLI with it yet?
  • dmnwlv - Sunday, March 25, 2012 - link

    First off, I think NVIDIA has done a good job with the new GTX 680.

    However, I do not need a game that is already running at 100+ frames to be even faster.
    It needs to be fast where it counts - games that still run at 60fps and below.

    By that measure, of the three relevant games, NVIDIA is faster in just one. Experience (if you also remember) has shown that the results can be very different once certain settings/games drop below 60fps.

    Hence I cannot agree with all the fuss about the GTX 680 being so much faster.
    You guys are led by the heart (much like the ATI fanboys you used to call out) more than the brain.

    And all the other compute tests are irrelevant to me (and to the majority of you, to be honest).
  • gramboh - Monday, March 26, 2012 - link

    What about a little game (that several million people play) called Battlefield 3? NV has a massive lead with the GTX 680 over the 7970/7950. AT only benches single player, but the game is even more punishing in 64-player multiplayer. Having a smooth framerate at max detail with 4X AA/16X AF is a big competitive advantage and makes the game significantly more enjoyable.

    Kind of disappointed the card isn’t faster in The Witcher 2, which I think has the best graphics of any single-player game.
  • CeriseCogburn - Tuesday, March 27, 2012 - link

    Have all of you people repeating that FUD forgotten Shogun 2: Total War?
    It’s the hardest game in the benchmark suite according to AnandTech...
    How is it that THE HARDEST GAME, which NVIDIA swept top to bottom at every resolution, is suddenly and completely forgotten, while we hear these other FUD declarations?
    How does that work - just repeat what some other mistaken FUDder spewed?
