Crysis: Warhead

Kicking things off as always is Crysis: Warhead, still one of the toughest games in our benchmark suite. Even 2 years after the release of the original Crysis, “but can it run Crysis?” is still an important question, and the answer continues to be “no.” While we’re closer than ever, full Enthusiast settings at a playable framerate is still beyond the grasp of a single card.

Crysis ends up setting the stage for the rest of this article. As a GTX 480 replacement the GTX 570 is effectively tied with it at 2560 and 1680, and only at 1920 do we see the GTX 570 fall behind by all of 4%. Meanwhile compared to the GTX 470 it’s 20% faster (and 40% more expensive), while it falls to the GTX 460 1GB SLI by over 10%. Overall the GTX 570 is at near parity with the GTX 480, and should be equally capable of playing just about everything at 1920.

As for AMD’s cards, the 5870 (which was never too far behind the GTX 480) nips at the GTX 570’s heels – at times the GTX 570 is no more than 5% faster, and at best only 15% faster, showcasing why the 5870 is a value threat to the GTX 570. Meanwhile the 6850 CF is tops here at 1920 by a wide margin, for only around $20-$30 more than the GTX 570. As was the case with the GTX 580, a pair of lesser AMD cards is going to offer better gaming performance in exchange for the drawbacks of a multi-GPU setup.

Looking at our minimum framerates, the story is much the same. Outside of 2560, where the extra memory provides a stark advantage for the NVIDIA cards, the GTX 570 and GTX 480 are close together, except at 1920 where the 570 falls behind by a bit more than we’d expect. The 5870 isn’t nearly as threatening here as it is with average framerates, but the GTX 460 SLI and 6850 CF configurations are still well ahead.

Comments

  • ilkhan - Tuesday, December 7, 2010 - link

    I love my HDMI connection. It falls out of my monitor about once a month and I have to flip the screen around to plug it back in. Thanks TV industry!
  • Mr Perfect - Tuesday, December 7, 2010 - link

    It is somewhat disappointing. People with existing screens probably don't care, and the cheap TN screens still pimp the DVI interface, but all of the high end IPS panel displays include either HDMI, DP or both. Why wouldn't a high end video card have the matching outputs?
  • EnzoFX - Tuesday, December 7, 2010 - link

A high-end gaming card is probably for serious gamers, who should probably go with TN as they are the best against input lag =P.
  • Mr Perfect - Tuesday, December 7, 2010 - link

Input lag depends on the screen's controller; you're thinking of pixel response time. Yes, TN is certainly faster than IPS for that. I still wouldn't get a TN though, as IPS isn't far enough behind in response time to negate the picture quality improvement.
  • MrSpadge - Tuesday, December 7, 2010 - link

Agreed. The pixel response time of my eIPS is certainly good enough to be a complete non-factor. The image quality, on the other hand, is worth every cent.

    MrS
  • DanNeely - Tuesday, December 7, 2010 - link

Due to the rarity of HDMI 1.4 devices (needed to go above 1920x1200), replacing a DVI port with an HDMI port would result in a loss of capability. This is aggravated by the fact that, due to their sticker price, 30" monitors have a much longer lifetime than 1080p displays, and owners would be even more outraged at being told they had to replace their screens to use a new GPU. Mini-DVI isn't an option either, because it's single-link and has the same 1920x1200 cap as HDMI 1.3.

Unfortunately there isn't room for anything except a single mini-HDMI/mini-DP port to the side of 2 DVIs; installing it on the top half of a double-height card like ATI has done cuts into the card's exhaust airflow and hurts cooling. With the 5xx series still limited to 2 outputs that's not a good tradeoff, and HDMI is much more ubiquitous.

The fiasco with DP-DVI adapters on the 5xxx series cards doesn't exactly make them an appealing option for consumers either.
  • Mr Perfect - Wednesday, December 8, 2010 - link

That makes good sense too; you certainly wouldn't want to drop an existing port to add DP. I guess it really comes down to that cooling vs. port selection problem.

I wonder why ATI stacked the DVI ports? Those are the largest ports of the three and so block the most ventilation. If you could stack a mini-DP over the mini-HDMI, it would be a pretty small penalty. It might even be possible to mount the mini ports on edge instead of horizontally to keep them all on one slot.
  • BathroomFeeling - Tuesday, December 7, 2010 - link

    "...Whereas the GTX 580 took a two-tiered approach on raising the bar on GPU performance while simultaneously reducing power consumption, the GeForce GTX 470 takes a much more single-tracked approach. It is for all intents and purposes the new GTX 480, offering gaming performance..."
  • Lonyo - Tuesday, December 7, 2010 - link

Any comments on how many will be available? In the UK, sites are expecting cards on the 9th~11th December, so not a hard launch there.
    Newegg seems to only have limited stock.

    Not to mention an almost complete lack of UK availability of GTX580s, and minimal models and quantities on offer from US sites (Newegg).
  • Kef71 - Tuesday, December 7, 2010 - link

Or maybe they are an NVIDIA "feature" only?
