Crysis: Warhead

Kicking things off as always is Crysis: Warhead, still one of the toughest games in our benchmark suite. Even three years since the release of the original Crysis, “but can it run Crysis?” is still an important question, and the answer continues to be “no.” One of these years we’ll actually be able to run it with full Enthusiast settings…

Right off the bat we see the GTX 580 do well, which as the successor to what was already the fastest single-GPU card on the market is nothing less than what we expect. At 2560 it’s around 16% faster than the GTX 480, and at 1920 that drops to 12%. Bear in mind that the theoretical performance improvement for clock + shaders is 17%, so in reality it would be nearly impossible to get that close without the architectural improvements also playing a role.
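
For context, here is a minimal back-of-the-envelope sketch of where that ~17% figure comes from. It assumes the commonly cited reference specs of 772 MHz and 512 CUDA cores for the GTX 580 versus 700 MHz and 480 CUDA cores for the GTX 480; those numbers are not quoted on this page.

```python
# Rough sanity check of the ~17% theoretical clock + shader uplift.
# The specs below are assumed reference values, not figures taken from this page.
gtx480_core_mhz, gtx480_cuda_cores = 700, 480
gtx580_core_mhz, gtx580_cuda_cores = 772, 512

# Scale by the core clock ratio and the shader (CUDA core) count ratio.
uplift = (gtx580_core_mhz / gtx480_core_mhz) * (gtx580_cuda_cores / gtx480_cuda_cores) - 1
print(f"Theoretical clock + shader uplift: {uplift:.1%}")  # prints ~17.6%
```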

Meanwhile AMD’s double-GPU double-trouble lineup of the 5970 and 6870 CF outscore the GTX 580 by around 12% and 27% respectively. It shouldn’t come as a shock that they’re going to win most tests – ultimately they’re priced much more competitively, making them price-practical alternatives to the GTX 580.

And speaking of competition, the GTX 470 SLI is in much the same boat, handily surpassing the GTX 580. This will come full circle, however, when we look at power consumption.

Meanwhile, looking at minimum framerates, we have a different story. AMD’s memory management in CrossFire mode has long been an issue with Crysis at 2560, and it continues to show here with a minimum framerate that simply craters. At 2560 there’s a world of difference between NVIDIA and AMD, and it’s all in NVIDIA’s favor. 1920, however, roughly mirrors our earlier averages, with the GTX 580 taking a decent lead over the GTX 480 but falling to the multi-GPU cards.

Comments
  • knutjb - Tuesday, November 9, 2010 - link

    I agree guys it should be a 485 not a 580.

    The 6870 is a sore spot on an otherwise solid refinement. Curious to see its SLI performance. $559 on Newegg this am.
  • dtham - Tuesday, November 9, 2010 - link

    Anyone know if aftermarket cooling for the GTX 480 will work for the GTX 580? It would be great to be able to reuse a waterblock from a GTX 480 for the new 580s. Looking at the picture the layout looks similar.
  • mac2j - Tuesday, November 9, 2010 - link

    In Europe the GTX 580 was launched at 399 Euros and in response ATI has lowered the 5970 to 389 Euros (if you believe the rumors).

    This can only bode well for holiday prices of the 6970 vs 580.
  • samspqr - Tuesday, November 9, 2010 - link

    it's already listed and in stock at alternate.de, but the cheapest one is 480eur

    the only 5970 still in stock there is 540eur
  • yzkbug - Tuesday, November 9, 2010 - link

    I moved all my gaming to the living room on a big screen TV and HTPC (a next-next-gen console, in a sense). But Optimus would be the only way to use this card in an HTPC.
  • slatr - Tuesday, November 9, 2010 - link

    Ryan,

    Would you be able to test with Octane Renderer?

    I am interested to see if Octane gets throttled.

    Thanks
  • Andyburgos - Tuesday, November 9, 2010 - link

    Ryan:

    I hold you in the most absolute respect. Actually, in my first post a while ago I praised your work, and I think you're quite didactic and fun to read. On that note, thanks for the review.

    However, I need to ask you: W.T.F. is wrong with you? Aren't you pissed off by the fact that the GTX 480 was a half-baked chip (I wouldn't say the same about the GTX 460), and now that we get the real version they decided to call it the 580? Why isn't there a single complaint about that in the article?

    If, as I understand, you think that the new power / temperature / noise / performance balance has improved dramatically from the 480, I think you are smart enough to see that it was because the 480 was a very, very unpolished chip. This renaming treats us as stupid; it's even worse than what AMD did.

    /rant

    AT & staff, I think you have a duty to call out lousy tactics such as Barts being renamed 68x0, or the 8800 becoming the 9800 and then the GTS 250, as you always did. You have failed so badly to do that here that you look really biased. For me, a loyal Argentinian reader since 2001, that is absolutely impossible, but with the GTX 460 and this you are accomplishing it.

    +1 for this card deserving an indifferent thumbs up, as Ryan graciously said, not for the card itself (which is great) but for the NVIDIA tactics and the half-baked 480 they gave us. Remember the FX 5800 (as bad as or worse than the 480) becoming the 5900... gosh, I thought those days were over. Maybe that's why I stick with my 7300 GT, haha.

    I respectfully dissent from your opinion, but thanks for the great review.

    Best regards,
    Andy
  • ViRGE - Tuesday, November 9, 2010 - link

    Huh, are we reading the same article? See page 4.
  • chizow - Tuesday, November 9, 2010 - link

    I'd have to agree he probably didn't read the article thoroughly. Besides explicitly saying this is the 2nd-worst excuse for a new naming denomination, Ryan takes jabs at the 480 throughout, repeatedly hinting that the 580 is what Fermi should've been to begin with.

    Sounds like just another short-sighted rant about renaming that conveniently forgets all the renaming ATI has done in the past. See how many times ATI renamed their R200 and R300 designs; even R600 and RV670 fall into the exact same vein as the G92 renaming he bemoans...
  • Haydyn323 - Tuesday, November 9, 2010 - link

    NVIDIA has done nothing different from ATI as far as naming their new cards goes. They simply jumped on the naming bandwagon for marketing and competitive purposes since ATI had already done so... at least the 580 is actually faster than the 480. ATI releasing a 6870 that is far inferior to the 5870 is worse in my mind.

    It should indeed have been a 485, but since ATI calls their new card a 6870 when it really should be a 5860 or something, it only seems fair.
