Crysis: Warhead

Kicking things off as always is Crysis: Warhead. It’s no longer the toughest game in our benchmark suite, but it’s still a technically complex game that has proven to be a very consistent benchmark. Thus even four years since the release of the original Crysis, “but can it run Crysis?” is still an important question, and the answer when it comes to setups using a pair of high-end 28nm GPUs is “you better damn well believe it.”

Crysis is not a game where Kepler improved greatly on the Fermi-based GTX 580. NVIDIA sees good SLI scaling here, but AMD’s single-GPU performance lead translates into an equally impressive lead with multiple GPUs; in spite of all of its capabilities, the GTX 690 trails the 7970CF by 18%. So long as AMD gets good CrossFire scaling, there’s simply no opening for Kepler to win, allowing AMD to handily trounce the GTX 690.

As for the intra-NVIDIA comparisons, the GTX 690 does well for itself here. Its performance relative to the GTX 680 SLI at 2560 is 98%, which in turn represents a 77% lead over a single GTX 680. Overall performance is quite solid; at 55.7fps we’re nearly at 60fps on Enthusiast quality at 2560 with 4x MSAA, the holy grail for a video card. Even 5760 is over 60fps, albeit at lower quality settings and without AA.
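To put those relative numbers in perspective, here’s a minimal back-of-the-envelope sketch of the implied arithmetic, assuming only the figures quoted above (98% of GTX 680 SLI, a 77% lead over a single GTX 680, and the GTX 690’s 55.7fps average); the derived values are estimates rather than measured results.

```python
# Back-of-the-envelope sketch: derive the implied GTX 680 SLI scaling and per-card
# framerates from the relative figures quoted above. These are estimates only.

gtx690_vs_sli = 0.98     # GTX 690 delivers ~98% of GTX 680 SLI performance at 2560
gtx690_vs_single = 1.77  # GTX 690 leads a single GTX 680 by 77%
gtx690_fps = 55.7        # GTX 690 average at 2560, Enthusiast quality + 4x MSAA

# If 690 = 0.98 * SLI and 690 = 1.77 * single, then SLI = (1.77 / 0.98) * single.
sli_scaling = gtx690_vs_single / gtx690_vs_sli   # ~1.81x, i.e. ~81% from the second GPU
gtx680_fps = gtx690_fps / gtx690_vs_single       # ~31.5fps implied for a single GTX 680
sli_fps = gtx690_fps / gtx690_vs_sli             # ~56.8fps implied for GTX 680 SLI

print(f"Implied SLI scaling: {sli_scaling:.2f}x over a single GTX 680")
print(f"Implied framerates -- GTX 680: {gtx680_fps:.1f}fps, GTX 680 SLI: {sli_fps:.1f}fps")
```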

It’s taken nearly four years, but we’re almost there: Crysis at maximum on a single video card.

Our minimum framerates are much the same story for NVIDIA. The GTX 690 once again just trails the GTX 680 SLI, while interestingly enough the dual-GPU NVIDIA solutions manage to erode AMD’s lead at a single point: 2560. Here they only trail by 8%, versus 20%+ at 5760 and 1920. Though at 1920 we also see another interesting outcome: the GTX 580 SLI beats the GTX 680 SLI and GTX 690 in minimum framerates. This would further support our theory that the GTX 680 is memory bandwidth starved in Crysis, especially at the lowest performance points.

Comments

  • JPForums - Thursday, May 3, 2012 - link

    "Sadly, it is a very uncommon resolution for new monitors. Almost every 22-24" monitor you buy today is 1080p instead of 1200p. :("


    Not mine. I'm running a 1920x1200 IPS.
    1920x1200 is more common in the higher end monitor market.
    A quick glance at Newegg shows 16 1920x1200 models at 24" alone (starting at $230).
    Besides, I can't imagine many people buy a $1000 video card and pair it with a single $200 display.

    It makes more sense to me to check 1920x1200 performance than 1920x1080 for several reasons:
    1) 1920x1200 splits the difference between 16x10 and 25x14 or 25x16 better than 1920x1080.
    1680x1050 = ~1.7MP
    1920x1080 = ~2MP
    1920x1200 = ~2.3MP
    2560x1440 = ~3.7MP
    2560x1600 = ~4MP

    2) People willing to spend $1000 for a video card are generally in a better position to get a nicer monitor. 1920x1200 monitors are more common at higher prices.

    3) They already have three of them around to run 5760x1200. Why go get another monitor?

    Opinionated Side Points:
    Movies transitioned to resolutions much wider than 1080p long ago. A little extra black space really makes no difference.
    1920x1200 is a perfectly valid resolution. If Nvidia is having trouble with it, I want to know. When particular resolutions don't scale properly, it's probable that there is either a bug or shenanigans at work in the more common resolutions.
    I prefer using 1920x1200 as a starting point for moving to triple-screen setups. I already think 1920x1080 looks squashed, so 5760x1080 looks downright flattened. Also, 3240x1920 just doesn't look very surround to me (3600x1920 seems borderline surround).
  • CeriseCogburn - Saturday, May 5, 2012 - link

    There are only 18 models available in all of Newegg with 1920x1200 resolution; only 6 of those are under $400, and they are all over $300.
    +
    There are 242 models available in 1920x1080, with nearly 150 models under $300.
    You people are literally a bad joke when it comes to even a tiny shred of honesty.
  • Lerianis - Sunday, May 6, 2012 - link

    I don't know about the 'sadly' there in all honesty. I personally like 1920x1080 better than 1920x1200, because nearly everything is done in the former resolution.
  • Stuka87 - Thursday, May 3, 2012 - link

    Who buys a GTX690 to play on a 1080P display? Even a 680 is overkill for 1080. You can save a lot of money with a 7870 and still run everything out there.
  • vladanandtechy - Thursday, May 3, 2012 - link

    Stuka I agree with you.....but when you buy such a card....you think in the future....5 maybe 6 years....and I can't guarantee that we will do gaming in 1080p then:)....
  • retrospooty - Thursday, May 3, 2012 - link

    "Stuka i agree with you.....but when you buy such a card....you think in the future....5 maybe 6 years....and i can't gurantee that we will do gaming in 1080p then:)...."

    I have to totally disagree with that. Anyone that pays $500+ for a video card is a certain "type" of buyer. That type of buyer will NEVER wait 5-6 years for an upgrade. That guy is getting the latest and greatest of every other generation, if not every generation of cards.
  • vladanandtechy - Thursday, May 3, 2012 - link

    You shouldn't "totally disagree".......meet me...."the exception"....i am the type of buyer who is looking for the "long run"....but i must confess....if i could....i would be the type of buyer you describe....cya
  • orionismud - Thursday, May 3, 2012 - link

    retrospooty, and I mean you no disrespect, but if you're spending $500 and buying for the "long run," you're doing it wrong.

    If you had spent $250, you could have 80% of the performance for 2.5 years, then spend another $250 and have 200% of the performance for the remaining 2.5 years.
  • von Krupp - Thursday, May 3, 2012 - link

    Don't say that.

    I bought two (2) HD 7970s on the premise that I'm not going to upgrade them for a good long while. At least four years, probably closer to six. I ran from 2005 to 2012 with a GeForce 7800GT just fine and my single core AMD CPU was actually the larger reason why I needed to move on.

    Now granted, I also purchased a snazzy U2711 just so the power of these cards wouldn't go to waste (though I'm quite CPU-bound by this i7-3820), but I don't consider dropping AA in future titles to maintain performance to be that big of a loss; I already only run with 8x AF because, frankly, I'm too busy killing things to notice otherwise. I intend to drive this rig for the same mileage. It costs less for me to buy the best of the best at the time of purchase for $1000 and play it into the ground than to keep buying $350 cards to barely keep up every two years, all over a seven-year duration. Since I now have this fancy 2560x1440 resolution and want to use it, the $250-$300 offerings don't cut it. And then, don't forget to adjust for inflation year over year.

    So yes, I'm going to be waiting between 4 and 6 years to upgrade. Under certain conditions, buying the really expensive stuff is as much of an economical move as it is a power grab. Not all of us who build $3000 computers do it on a regular basis.

    P.S. Thank you consoles for extending PC hardware life cycles. Makes it easier to make purchases.
  • Makaveli - Thursday, May 3, 2012 - link

    lol, agreed. Let's put a $500 video card with a $200 TN panel at 1920x1080... umm, ya, no!
