Crysis: Warhead

Up next is our legacy title for 2013, Crysis: Warhead. The stand-alone expansion to 2007's Crysis, Warhead is now over 4 years old and can still beat most systems down. Crysis was intended to be forward-looking as far as performance and visual quality go, and it has clearly achieved that: only now have single-GPU cards arrived that can hit 60fps at 1920 with 4xAA.

At 2560 we still have a bit of a distance to go before any single-GPU card can crack 60fps. Until that happens, Titan is the winner as expected. Leading the GTX 680 by 54%, this is Titan's single biggest win over its predecessor, actually exceeding the theoretical performance advantage based on the increase in functional units alone. For whatever reason the GTX 680 never gained much performance here versus the GTX 580, and while it's hard to argue that Titan has fully reversed that, it has at least corrected enough of the problem to push its lead past 50%.
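
As a rough illustration of why that 54% stands out, here is a quick back-of-the-envelope Python sketch of the theoretical FP32 throughput comparison, assuming each card's published CUDA core count and base clockspeed (boost clocks and memory bandwidth are ignored for simplicity; this is napkin math, not the review's methodology):

    # Back-of-the-envelope FP32 throughput: CUDA cores x clock x 2 FLOPs/clock (FMA)
    titan_cores, titan_clock_hz = 2688, 837e6      # GTX Titan (GK110), 837MHz base clock
    gtx680_cores, gtx680_clock_hz = 1536, 1006e6   # GTX 680 (GK104), 1006MHz base clock

    titan_tflops = titan_cores * titan_clock_hz * 2 / 1e12      # ~4.50 TFLOPS
    gtx680_tflops = gtx680_cores * gtx680_clock_hz * 2 / 1e12   # ~3.09 TFLOPS

    # Titan's on-paper shader advantage over the GTX 680
    print(f"Theoretical advantage: {titan_tflops / gtx680_tflops - 1:.0%}")  # ~46%

By that measure Titan's on-paper advantage is roughly 46%, which is what makes a 54% real-world lead exceptional.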

In the meantime, with the GTX 680's languid performance, this has been a game where the latest Radeon cards have regularly come out ahead. For whatever reason they're a good match for Crysis, meaning that even with all its brawn, Titan can only clear the 7970GE by 21%.

On the other hand, our multi-GPU cards are a mixed bag. Once more Titan loses to both, but the GTX 690 leads by only 15% thanks to GK104's aforementioned weak Crysis performance. Meanwhile the 7990 takes a larger lead at 33%.

I’d also note that we’ve thrown in a “bonus round” here just to see when Crysis will be playable at 1080p with its highest settings and with 4x SSAA for that picture-perfect experience. As it stands AMD multi-GPU cards can already cross 60fps, but for everything else we’re probably a generation off yet before Crysis is completely and utterly conquered.

Moving on, we once again have minimum framerates for Crysis.

When it comes to Titan, the relative improvement in minimum framerates over the GTX 680 is nothing short of obscene. Whatever it was that was holding back the GTX 680 is clearly having a hard time slowing down Titan, leading to Titan offering 71% better minimum framerates. There's clearly much more going on here than just an increase in functional units.

Meanwhile, though Titan's gains over the 7970GE aren't quite as high as they were against the GTX 680, its lead still grows a bit to 26%. As for our multi-GPU cards, this appears to be a case where SLI is struggling; the GTX 690 is barely faster than Titan here. The 7990, at 31% faster than Titan, doesn't seem to be faltering nearly as much.

Comments

  • vps - Thursday, February 21, 2013 - link

    For a compute benchmark you might want to take a look at FAHBench.
    FAHBench is the official Folding@Home GPU benchmark. It measures the compute performance of GPUs for Folding@Home.
    http://proteneer.com/blog/?page_id=1671

    Some reference scores are here:
    http://foldingforum.org/viewtopic.php?f=38&t=2...
  • Ryan Smith - Thursday, February 21, 2013 - link

    FAHBench is primarily an OpenCL benchmark (there's a CUDA path, but it's effectively on its way out). It's on our list, and is one of the things we couldn't run due to the fact that OpenCL is not currently working on Titan.
  • Hrel - Thursday, February 21, 2013 - link

    PowerDirector still uses CUDA
  • atlr - Thursday, February 21, 2013 - link

    Not sure if this helps. I found CLBenchmark results of a Titan versus a 7970 here.
    http://clbenchmark.com/compare.jsp?config_0=144702...
  • atlr - Thursday, February 21, 2013 - link

    Ville Timonen posted results running his own code on a Tesla K20 versus the usual suspects. Might be helpful to folks considering options for GPGPU computation.
    http://wili.cc/blog/gpgpu-faceoff.html
  • chizow - Thursday, February 21, 2013 - link

    That's the first thing that comes to mind now when I think of Nvidia, which is a shame because that name used to be synonymous with Awesome. That's gone, replaced with disdain for the ridiculous levels of usury in their last two high-end product launches. I'm not going to be disingenuous and claim I'm going AMD, because the fact of the matter is, Nvidia products are still in a class of their own for my usage needs, but I will certainly not be spending as much on Nvidia parts as I used to.

    Kepler has basically set Nvidia's product stack back by half a generation, but my price:performance metrics will stay the same. Nvidia has their ultra-premium "Xtreme Edition" GPU this round, but that only came about as a result of AMD's ridiculous pricing and the overall lackluster performance of the 7970 for a "flagship" card. Either way, I think it will be difficult for Nvidia to sustain this price point, as expectations just got higher at that $1K range.

    @Ryan: I'm disappointed you didn't write a harsher commentary on the fact that Nvidia is now charging 2x for the same class of GPU, a pricing structure that had held true since 2006. Even the G80 Ultra didn't approach this $1K mark. Given how many times Nvidia has had to backpedal and apologize for the Ultra's pricing, you would think they would learn from their mistakes. I guess not; I hope they are prepared to deal with the long-term ramifications, backlash, and loss of goodwill stemming from this pricing decision.
  • CeriseCogburn - Thursday, February 21, 2013 - link

    7.1 billion transistors and 6GB of RAM.

    I for one am sick of you people constantly whining.

    If we check the whine log from the ATI32 days you were doing it then, too.

    It's all you people do. Every time, all the time.
  • chizow - Friday, February 22, 2013 - link

    And all you do is post inane, barely intelligible nonsense in defense of Nvidia. If you check your "whine logs" you'll see I've done my fair share of defending Nvidia, but I can't and won't give them a pass for what they've done with Kepler. AMD started it for sure with the terribad price:performance of Tahiti, but Nvidia has taken it to a new level of greed.

    And for all the idiots who are going to reply "herr dueerr Nvidai need make money not a charity derrr", my pre-emptive reply is that Nvidia has made money in all but 2-3 quarters since 2006 without selling a single $1K desktop GPU. In fact, they enjoyed record profits, margins, and revenue on the back of a $250 GPU, the 8800GT, in 2007-2008.
  • CeriseCogburn - Saturday, February 23, 2013 - link

    Nope, you are the crying whining baby who says the same thing in 100 different posts here, and has no clue what "the economy" of "the world" has been doing for the past several years.

    Whatever.

    Please go cry about all the other computer part prices that are doing the same thing.
  • CeriseCogburn - Saturday, February 23, 2013 - link

    I mean you idiots want the same price for more perf over a DECADE. Meanwhile, the rest of the world's pricing has DOUBLED.

    Now, computer prices used to drop across the board, but they just aren't doing it anymore, and IDIOTS like yourself continue on your stupid frikkin rants, ignoring the world itself, not to mention the LACK of production in that world for the GPUs you whine about. It's really funny how stupid you are.
