Crysis: Warhead

Kicking things off as always is Crysis: Warhead, still one of the toughest games in our benchmark suite. Even three years after the release of the original Crysis, “but can it run Crysis?” is still an important question, and for those three years the answer was “no.” Dual-GPU halo cards can now play it at Enthusiast settings at high resolutions, but for everything else max settings remain beyond the grasp of a single card.

Crysis is often a bellwether for overall performance; if that’s the case here, then NVIDIA and the GTX 590 are not off to a good start at the all-important resolution of 2560x1600.

AMD gets some very good CrossFire scaling under Crysis, and as a result the 6990 has no problem taking the lead here. A roughly 10% deficit won’t make or break the game for NVIDIA, but given the two cards’ similar prices, NVIDIA can’t afford to lose many matchups by that margin.
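As a quick aside on how these comparisons are computed, here is a minimal Python sketch of the two figures we quote throughout: multi-GPU scaling efficiency and the percentage gap between cards. The framerates in it are placeholders for illustration, not our measured results.

```python
# Illustration of the two figures quoted throughout this review:
# multi-GPU scaling efficiency and the percentage gap between cards.
# The framerates below are placeholders, NOT measured results.

def scaling_efficiency(single_gpu_fps: float, dual_gpu_fps: float) -> float:
    """Fraction of the ideal 2x speedup a dual-GPU setup actually delivers."""
    return dual_gpu_fps / (2 * single_gpu_fps)

def percent_gap(slower_fps: float, faster_fps: float) -> float:
    """How far the slower card trails the faster one, as a fraction."""
    return 1 - slower_fps / faster_fps

single_gpu = 30.0   # placeholder: one GPU on its own
dual_6990 = 54.0    # placeholder: both GPUs working via CrossFire
gtx_590 = 48.6      # placeholder: roughly 10% behind the 6990

print(f"CrossFire scaling: {scaling_efficiency(single_gpu, dual_6990):.0%}")
print(f"GTX 590 deficit:   {percent_gap(gtx_590, dual_6990):.0%}")
# -> CrossFire scaling: 90%
# -> GTX 590 deficit:   10%
```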

Meanwhile, amongst NVIDIA’s own stable of cards, the stock GTX 590 ends up slightly underperforming the GTX 570 SLI. As we discussed in our look at the theoretical numbers, whether the GTX 590 is at an advantage or a disadvantage depends on what the game in question taxes the most. Crysis is normally shader and memory bandwidth heavy, which is why the GTX 590, with its memory bandwidth advantage, never falls too far behind. EVGA’s mild overclock is enough to close the gap, however, delivering performance identical to the GTX 570 SLI’s. A further overclock improves performance a bit more, but surprisingly not by much.
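For a sense of how much headroom that factory overclock actually offers, here is a back-of-the-envelope sketch using the clocks quoted for the two cards later in this review (607/853.5MHz core/memory reference, 630/864MHz for the EVGA Classified); even a game bound purely by the core clock can gain at most about 4%.

```python
# Back-of-the-envelope headroom from EVGA's factory overclock, using the
# core/memory clocks (MHz) cited in this review. Real-world gains depend
# on whether the game is bound by shaders, memory bandwidth, or neither.
REFERENCE = {"core": 607.0, "memory": 853.5}
EVGA_CLASSIFIED = {"core": 630.0, "memory": 864.0}

for domain in ("core", "memory"):
    gain = EVGA_CLASSIFIED[domain] / REFERENCE[domain] - 1
    print(f"{domain} clock gain: +{gain:.1%}")

# -> core clock gain: +3.8%
# -> memory clock gain: +1.2%
# Even a fully core-bound game gains under 4%, which is why the mild
# overclock only closes a small gap rather than opening a lead.
```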

The minimum framerates end up looking better for NVIDIA. The GTX 590 is still behind the 6990, but now only by about 5%, while the EVGA GTX 590 squeezes past by all of 0.1 frames per second.

Comments

  • valenti - Thursday, March 24, 2011 - link

    Ryan, I commented last week on the 550 review. Just to echo that comment here: how are you getting the "nodes per day" numbers? Have you considered switching to a points per day metric? Very few people can explain what nodes per day are, and they aren't a very good measure for real world folding performance.

    (also, it seems like you should double the number for this review, since I'm guessing it was just ignoring the second GPU)
  • Ryan Smith - Thursday, March 24, 2011 - link

    Last year NVIDIA worked with the F@H group to provide a special version of the client for benchmark purposes. Nodes per day is how the client reports its results. Since points are arbitrary, depending on how the F@H group scores things, I can't really make a conversion.
  • poohbear - Thursday, March 24, 2011 - link

    Good to see that a $700 card finally has a decent cooler! Why would somebody spend $700 and then have to spend another $40 on an aftermarket cooler? NVIDIA and AMD really need to just charge $750 and ship an ultra-quiet card; people in this price range aren't going to squabble over an extra $50, for Pete's sake! It makes no sense to skimp on the cooler at this price range. This is the top of the line, where money isn't the issue!
  • Guspaz - Thursday, March 24, 2011 - link

    Let's get this straight, nVidia. Slapping two of your existing GPUs together does not make this a "next-generation card". Saying that you've been working on it for two years is also misleading; I doubt it took two years just to lay out the PCB to get two GPUs on a single board.

    SLI and Crossfire still feel like kludges. Take Crysis 2 for example. The game comes out, and I try to play it on my 295. It runs, but only on one GPU. So I go looking online; it turns out that there's an SLI profile update for the game, but only for the latest beta drivers. If you install those drivers *and* the profile update, you'll get the speed boost, but also various graphical corruption issues involving flickering of certain types of effects (that seem universal rather than isolated).

    After two goes at SLI (first dual 285s, next a 295), I've come to the conclusion that SLI is just not worth the headache. You'll end up dealing with constant compatibility issues.
  • strikeback03 - Thursday, March 24, 2011 - link

    And that is why people still buy the 6970/580, rather than having 2 cheaper cards in SLI like so many recommend.
  • JarredWalton - Thursday, March 24, 2011 - link

    For the record, I've had three goes at CrossFire (2 x 3870, 4870X2, and now 2 x 5850). I'm equally disappointed with day-of-release gaming results. But, if you stick to titles that are 2-3 months old, it's a lot better. (Yeah, spend $600 on GPUs just so you can wait two months after a game release before buying....)
  • Guspaz - Friday, March 25, 2011 - link

    I don't know about that, the original Crysis still has a lot of issues with SLI.
  • Nentor - Thursday, March 24, 2011 - link

    "For the GTX 590 launch, NVIDIA once again sampled partner cards rather than sampling reference cards directly to the press. Even with this, all of the cards launching today are more-or-less reference with a few cosmetic changes, so everything we’re describing here applies to all other GTX 590 cards unless otherwise noted.

    With that out of the way, the card we were sampled is the EVGA GeForce GTX 590 Classified, a premium GTX 590 offering from EVGA. The important difference from the reference GTX 590 is that GTX 590 Classified ships at slightly higher clocks—630/864 vs. 607/853.5—and comes with a premium package, which we will get into later. The GTX 590 Classified also commands a premium price of $729."

    Are we calling overclocked cards "more-or-less reference" cards now? That's a nice way to put it; I'll use it the next time I get stopped by a police officer. Sir, I was going more or less 100 mph.

    Reference is ONE THING. It is the basis and does not waver. Anything that is not it is either overclocked or underclocked.
  • strikeback03 - Thursday, March 24, 2011 - link

    Bad example, as in the US at least your speedometer is only required to be accurate within 10%, meaning you can't get ticketed at less than 10% over the speed limit. This card is only overclocked by 4%. More importantly, they a) weren't sent a reference card, and b) included full tests at stock clocks. Would you rather they not review it since it isn't a reference card?
  • Nentor - Thursday, March 24, 2011 - link

    That is a good point actually, I didn't think of that.

    Maybe reject the card yes, but that is not going to happen. Nvidia is just showing who is boss by sending a non reference card. AT will have to swallow whatever Nvidia feeds them if they want to keep bringing the news.
