Crysis: Warhead

Up next is our legacy title for 2013, Crysis: Warhead. A stand-alone expansion to 2007’s Crysis, Warhead is now over four years old and can still beat most systems down. Crysis was intended to be forward-looking as far as performance and visual quality go, and it has clearly achieved that. We’ve only just reached the point where single-GPU cards can hit 60fps at 1920 with 4xAA.

At 2560 we still have some distance to go before any single-GPU card can crack 60fps. Until then, Titan is the winner as expected. Leading the GTX 680 by 54%, this is Titan’s single biggest win over its predecessor, actually exceeding the theoretical performance advantage based on the increase in functional units alone. For whatever reason the GTX 680 never gained much performance here over the GTX 580, and while it’s hard to say that Titan has fully reversed that, it has at least corrected enough of the problem to push its lead past 50%.
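To put “exceeding the theoretical advantage” in perspective, a back-of-the-envelope scaling estimate from shader counts and clocks gives a smaller number than the 54% measured. This is only a sketch: it uses NVIDIA’s reference base clocks and ignores memory bandwidth, boost behavior, and architectural differences.

```python
# Rough shader-throughput scaling: CUDA core count x base clock (MHz).
# Reference base clocks are used for illustration; real cards boost higher.
def rel_throughput(cores_a, mhz_a, cores_b, mhz_b):
    """Ratio of raw shader throughput of card A to card B."""
    return (cores_a * mhz_a) / (cores_b * mhz_b)

# GTX Titan: 2688 cores @ 837MHz base; GTX 680: 1536 cores @ 1006MHz base.
titan_vs_680 = rel_throughput(2688, 837, 1536, 1006)
print(f"Theoretical lead: {(titan_vs_680 - 1) * 100:.0f}%")  # ~46%, vs. 54% measured
```

That Titan beats this naive estimate is what makes the Crysis result unusual; in most games the measured lead falls short of it.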

In the meantime, with GTX 680’s languid performance, this has been a game the latest Radeon cards have regularly cleared. For whatever reason they’re a good match for Crysis, meaning even with all its brawn, Titan can only clear the 7970GE by 21%.

On the other hand, our multi-GPU cards are a mixed bag. Once more Titan loses to both, but the GTX 690 only leads by 15% thanks to GK104’s aforementioned weak Crysis performance. Meanwhile the 7990 takes a larger lead at 33%.

I’d also note that we’ve thrown in a “bonus round” here just to see when Crysis will be playable at 1080p with its highest settings and with 4x SSAA for that picture-perfect experience. As it stands AMD multi-GPU cards can already cross 60fps, but for everything else we’re probably a generation off yet before Crysis is completely and utterly conquered.
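For a sense of why the 4x SSAA “bonus round” is so punishing: supersampling shades every covered sample, so 4x SSAA at 1080p costs roughly as much shading work as rendering natively at 3840x2160. A quick illustration of the pixel math:

```python
# 4x supersampling renders 2x resolution in each dimension, then downsamples,
# so the shaded-pixel count per frame quadruples.
def ssaa_internal_pixels(width, height, factor=4):
    """Pixels shaded per frame under SSAA (factor = total sample multiplier)."""
    return width * height * factor

native = 1920 * 1080                       # 2,073,600 pixels
ssaa4x = ssaa_internal_pixels(1920, 1080)  # 8,294,400 pixels, same as 3840x2160
print(ssaa4x / (2560 * 1440))              # ~2.25x the work of native 2560x1440
```

Which is why 1080p with 4x SSAA is, in practice, a heavier load than our 2560 test.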

Moving on, we once again have minimum framerates for Crysis.

When it comes to Titan, the relative improvement in minimum framerates over the GTX 680 is nothing short of obscene. Whatever was holding back the GTX 680 is clearly having a hard time slowing down Titan, leading to Titan offering 71% better minimum framerates. There’s clearly much more going on here than just an increase in functional units.

Meanwhile, though Titan’s gains here over the 7970GE aren’t quite as high as they were over the GTX 680, its lead over the 7970GE still grows a bit, to 26%. As for our multi-GPU cards, this appears to be a case where SLI is struggling; the GTX 690 is barely faster than Titan here. The 7990, on the other hand, at 31% faster than Titan, doesn’t seem to be faltering much.
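The relative leads quoted throughout (54%, 71%, 26%, and so on) are simple ratios of average or minimum framerates; for anyone double-checking against the charts, the math looks like this (the fps values below are illustrative placeholders, not the review’s data):

```python
def lead_pct(fps_a, fps_b):
    """Percentage by which card A leads card B in framerate."""
    return (fps_a / fps_b - 1) * 100

# Illustrative numbers only -- not taken from the charts.
print(f"{lead_pct(41, 24):.0f}%")  # e.g. 41 fps vs. 24 fps -> ~71%
```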

Comments

  • CeriseCogburn - Tuesday, March 12, 2013 - link

    ROFL another amd fanboy having a blowout. Mommie will be down to the basement with the bar of soap, don't wet your pants.
    When amd dies your drivers will still suck, badly.
  • trajan2448 - Saturday, March 16, 2013 - link

    Until you guys start showing latencies, these reviews based primarily on fps numbers don't tell the whole story. Titan is 4x faster than multi GPU solutions in real rendering.
  • IUU - Wednesday, March 20, 2013 - link

    Just a thought: if they price titan say at 700 or 500 (that was the old price point for flagship cards), how on earth will they market game consoles, and the brave "new" world of the mobile "revolution"?
    Like it or not, high-tech companies have found a convenient way to get away from the cutthroat competition of PC-land (hence their hate and slogans like "post-PC" and the rest) and get a breath of fresh (money) air!

    Whether this is also good for the consumer in the long run, remains to be seen, but the fact is, we will pay more to get less, unless something unexpected happens.
  • paul_59 - Saturday, June 15, 2013 - link

    I would appreciate any intelligent opinions on the merits of buying a 690 card versus a Titan, considering they retail for the same price
  • bravegag - Tuesday, August 13, 2013 - link

    I bought the EVGA nVidia GTX Titan, actually two of them, instead of the Tesla K20, thanks to the benchmark results posted in this article. However, the performance results I got are nowhere close to the ones shown here. Running DGEMM from CUDA 5.5 and the CUBLAS example matrixMulCUBLAS, my EVGA nVidia GTX Titan reaches no more than 220 GFlop/s, which is nowhere near 1 TFlop/s. My question is then: are the results presented here a total fake?

    I created the following project where some additional HPC benchmarks of the nVidia GTX Titan are included, the benchmark computing environment is also detailed there:
    https://github.com/bravegag/eigen-magma-benchmark
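    [Editor’s note: DGEMM throughput is conventionally reported as 2·N³ floating-point operations over wall-clock time, and GTX Titan only reaches its full FP64 rate with the "CUDA - Double precision" option enabled in the NVIDIA Control Panel; left at the default reduced rate, numbers in the ~220 GFlop/s range are expected. A minimal sketch of the arithmetic, with hypothetical timings:]

```python
def gemm_gflops(n, seconds):
    """GFLOP/s for an n x n x n matrix multiply (2*n^3 flops)."""
    return 2 * n**3 / seconds / 1e9

# Hypothetical wall-clock times for an 8192^3 DGEMM:
print(f"{gemm_gflops(8192, 5.0):.0f} GFLOP/s")   # ~220 GFLOP/s at 5.0 s
print(f"{gemm_gflops(8192, 0.85):.0f} GFLOP/s")  # ~1290 GFLOP/s at 0.85 s
```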
  • bravegag - Wednesday, August 14, 2013 - link

    Has anyone tried replicating the benchmark results shown here? How did it go?
  • Tunnah - Wednesday, March 18, 2015 - link

    It feels nVidia are just taking the pee out of us now. I was semi-miffed at the 970 controversy, I know for business reasons etc. it doesn't make sense to truly trounce the competition (and your own products) when you can instead hold something back and keep it tighter, and have something to release in case they surprise you.

    And I was semi-miffed when I heard it would be more like a 33% improvement over the current cream of the crop, instead of the closer to 50% increase the Titan was over the 680, because they have to worry about the 390x, and leave room for a Titan X White Y Grey SuperHappyTime version.

    But to still charge $1000 even though they are keeping the DP performance low, this is just too far. The whole reasoning for the high price tag was you were getting a card that was not only a beast of a gaming card, but it would hold its own as a workstation card too, as long as you didn't need the full Quadro service. Now it is nothing more than a high end card, a halo product...that isn't actually that good!

    When it comes down to it, you're paying 250% the cost for 33% more performance, and that is disgusting. Don't even bring RAM into it, it's not only super cheap and in no way a justification for the cost, but in fact is useless, because NO GAMER WILL EVER NEED THAT MUCH, IT WAS THE FLIM FLAMMING WORKSTATION CROWD WHO NEEDING THAT FLIM FLAMMING AMOUNT OF FLOOMING RAM YOU FLUPPERS!

    This feels like a big juicy gob of spit in our faces. I know most people bought these purely for the gaming option and didn't use the DP capability, but that's not the point - it was WORTH the $999 price tag. This simply is not, not in the slightest. $650, $750 tops because it's the best, after all..but $999 ? Not in this lifetime.

    I've not had an AMD card since way back in the days of ATi, I am well and truly part of the nVidia crowd, even when they had a better card I'd wait for the green team reply. But this is actually insulting to consumers.

    I was never gonna buy one of these, I was waiting on the 980Ti for the 384bit bus and the bumps that come along with it...but now I'm not only hoping the 390x is better than people say because then nVidia will have to make it extra good..I'm hoping it's better than they say so I can actually buy it.

    For shame nVidia, what you're doing with this card is unforgivable
