Crysis: Warhead

Kicking things off as always is Crysis: Warhead. It’s no longer the toughest game in our benchmark suite, but it’s still a technically complex game that has proven to be a very consistent benchmark. Thus, even four years after the release of the original Crysis, “but can it run Crysis?” is still an important question, and the answer continues to be “no.” While we’re closer than ever, full Enthusiast settings at 60fps is still beyond the grasp of a single-GPU card.

Crysis: Warhead - 2560x1600 - Frost Bench - Enthusiast Quality + 4xAA

Crysis: Warhead - 1920x1200 - Frost Bench - Enthusiast Quality + 4xAA

Crysis: Warhead - 1680x1050 - Frost Bench - Enthusiast Shaders/Gamer Quality + 4xAA

While Crysis was a strong game for the GTX 580, the same cannot be said of the GTX 680. NVIDIA is off to a very poor start here, with the Radeon HD 7970 easily outperforming the GTX 680, and even the 7950 tied or nearly tied with the GTX 680 depending on the resolution. On the bright side, the GTX 680 does manage to outperform the GTX 580, but only by a relatively meager 17%.

Given the large gap in theoretical performance between the GTX 680 and GTX 580, it turns out we’ve run into one of the few areas where the GTX 680 doesn’t improve on the GTX 580: memory bandwidth. In our overclocking results we discovered that a core overclock had almost no impact on Crysis, whereas a memory overclock improved performance by 8%, almost exactly the size of the memory overclock itself. When it comes to the latest generation of cards, Crysis clearly loves memory bandwidth, and that is something the Radeon HD 7900 series has in spades while the GTX 680 does not. Thankfully for NVIDIA, not every game is like Crysis.
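
As a rough sketch of the math behind that conclusion, the short Python snippet below recomputes each card’s peak memory bandwidth from its published reference memory specs and checks the overclock scaling; the 8% figures are the approximate round numbers quoted above rather than exact test data.

    # Peak GDDR5 bandwidth: (bus width in bytes) x (effective data rate)
    def gddr5_bandwidth_gbs(bus_width_bits, effective_clock_mhz):
        return (bus_width_bits / 8) * effective_clock_mhz / 1000  # GB/s

    cards = {
        "GTX 580": (384, 4008),   # 384-bit bus, ~4GHz effective GDDR5
        "GTX 680": (256, 6008),   # 256-bit bus, ~6GHz effective GDDR5
        "HD 7970": (384, 5500),   # 384-bit bus, 5.5GHz effective GDDR5
    }

    for name, (bus, clock) in cards.items():
        print(f"{name}: {gddr5_bandwidth_gbs(bus, clock):.1f} GB/s")
    # GTX 580: 192.4 GB/s; GTX 680: 192.3 GB/s; HD 7970: 264.0 GB/s

    # If an ~8% memory overclock buys ~8% more FPS, performance is tracking
    # bandwidth almost 1:1 -- the classic sign of a bandwidth bottleneck.
    mem_oc_pct = 8.0    # assumed round figure, for illustration only
    fps_gain_pct = 8.0
    print(f"scaling efficiency: {fps_gain_pct / mem_oc_pct:.2f}")  # ~1.00

The near-identical bandwidth of the GTX 680 and GTX 580, together with the roughly 1:1 scaling against memory clock, is what points to a bandwidth bottleneck rather than a shader one.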

Crysis: Warhead - Minimum Frame Rate - 2560x1600

Crysis: Warhead - Minimum Frame Rate - 1920x1200

Crysis: Warhead - Minimum Frame Rate - 1680x1050

The minimum frame rate situation is even worse for NVIDIA, with the GTX 680 clearly falling behind the 7950 and improving on the GTX 580 by only 10%. At its worst, Crysis is absolutely devouring memory bandwidth, and that leaves the GTX 680 underprepared.

Comments

  • sngbrdb - Friday, March 30, 2012 - link

    *from : P
  • Mombasa69 - Wednesday, April 4, 2012 - link

    This is just a rebadged mid-range card; the 680 has less memory bandwidth than GPUs brought out 4 years ago lol, what a rip-off. I can see the big fat directors at Nvidia laughing at all the mugs that have gone out and bought one, thinking this is the real big boy to replace the 580... muppets. lol.
  • N4v1N - Wednesday, April 4, 2012 - link

    Nvidia is the bestest! No AMD is the betterest!
    lol...
  • CeriseCogburn - Friday, April 6, 2012 - link

    Yes, Nvidia clocked the RAM over 6GHz because their memory controller is so rockin'.
    In any case, the 7970 is now being overclocked too; both are up to 7GHz RAM.
    Unfortunately the 7970 still winds up behind most of the time, even in 2560x1200 triple-screen gaming.
  • raghu78 - Saturday, April 7, 2012 - link

    In the reference Radeon HD 7970 and XFX Radeon HD 7970 reviews, the DirectX 11 compute shader fluid simulation performance is far higher than in this review.

    http://www.anandtech.com/show/5261/amd-radeon-hd-7...

    http://www.anandtech.com/show/5314/xfxs-radeon-hd-...

    http://images.anandtech.com/graphs/graph5314/43383...

    Reference HD 7970 - 133 and XFX HD 7970 - 145. In this review, the reference HD 7970 scores 115.5.

    What has changed between these reviews? Has performance actually decreased with the latest drivers?
  • oddnutz - Thursday, April 12, 2012 - link

    Well, I have been an ATI fanboi forever, so I am due a gfx upgrade, which would have already happened if ATI had priced their latest cards similarly to previous generations. I will watch ATI prices over the next few weeks, but it looks like I might be turning green soon.
  • blanarahul - Friday, April 13, 2012 - link

    Actually, the GTX 680 REFERENCE BOARD was designed for 375 watts of power.
    It has a total of two 6-pin connectors and one 8-pin connector on the board! I realized this after seeing the back of the board.
  • Commander Bubble - Thursday, April 19, 2012 - link

    I agree with some of the sensible posts littered in here that The Witcher 2 should be included as a comparison point, most notably with the ubersampling setting enabled.
    I run 2x GTX 580 SLI @ 1920 and I can't manage a minimum 60fps with that turned on. That would be a good test for current cards, as it absolutely hammers them.

    Also, I don't know whether CeriseCogburn is right or wrong, and I don't care, but I'm just sick of seeing his name in the comment list. Go outside and meet people; do something else. You are clearly spending way too much time on here...
  • beiker44 - Tuesday, April 24, 2012 - link

    I can't wait to get one... or wait for the bad ace dual-GPU 690!!! Decisions, decisions.
  • Oxford Guy - Thursday, July 5, 2012 - link

    "At the end of the day NVIDIA already had a strong architecture in Fermi, so with Kepler they’ve gone and done the most logical thing to improve their performance: they’ve simply doubled Fermi."

    Fermi Lite, you mean.

    "Now how does the GTX 680 fare in load noise? The answer depends on what you want to compare it to. Compared to the GTX 580, the GTX 680 is practically tied – no better and no worse – which reflects NVIDIA’s continued use of a conservative cooling strategy that favors noise over temperatures."

    No, the 680's cooling performance is inferior because it doesn't use a vapor chamber. Nvidia skimped on the cooling to save money, it seems.
