Crysis: Warhead

Up next is our legacy title for 2013, Crysis: Warhead. The stand-alone expansion to 2007’s Crysis, Warhead is now over four years old and can still beat most systems down. Crysis was intended to be forward-looking as far as performance and visual quality go, and it has clearly achieved that. We’ve only just reached the point where single-GPU cards can hit 60fps at 1920 with 4xAA.

At 2560 we still have a bit of a distance to go before any single-GPU card can crack 60fps. In the absence of that, Titan is the winner as expected. Leading the GTX 680 by 54%, this is Titan’s single biggest win over its predecessor, actually exceeding the theoretical performance advantage based on the increase in functional units alone. For some reason the GTX 680 never gained much performance here over the GTX 580, and while it’s hard to argue that Titan has fully reversed that, it has at least corrected enough of the problem to push its lead past 50%.
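As a back-of-the-envelope sketch of that "theoretical advantage" claim, the shader throughput ratio can be worked out from the cards' public base-clock specs (boost clocks and memory bandwidth are ignored here, so treat this as a rough bound rather than an exact figure):

```python
# Rough theoretical shader throughput comparison, Titan vs. GTX 680.
# Base-clock figures only; real-world boost behavior will shift this somewhat.
titan_cores, titan_mhz = 2688, 837      # GK110: 14 SMX x 192 CUDA cores
gtx680_cores, gtx680_mhz = 1536, 1006   # GK104: 8 SMX x 192 CUDA cores

titan_throughput = titan_cores * titan_mhz
gtx680_throughput = gtx680_cores * gtx680_mhz

advantage = titan_throughput / gtx680_throughput - 1
print(f"Theoretical shader advantage: {advantage:.0%}")  # ~46%
```

A 54% real-world lead over a roughly 46% paper advantage is what makes this result stand out: Titan is scaling better than its functional-unit count alone would predict.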

In the meantime, with the GTX 680’s languid performance, this has been a game the latest Radeon cards have regularly led in. For whatever reason they’re a good match for Crysis, meaning even with all its brawn, Titan can only clear the 7970GE by 21%.

On the other hand, our multi-GPU cards are a mixed bag. Once more Titan loses to both, but the GTX 690 only leads by 15% thanks to GK104’s aforementioned weak Crysis performance. Meanwhile the 7990 takes a larger lead at 33%.

I’d also note that we’ve thrown in a “bonus round” here just to see when Crysis will be playable at 1080p with its highest settings and with 4x SSAA for that picture-perfect experience. As it stands AMD multi-GPU cards can already cross 60fps, but for everything else we’re probably a generation off yet before Crysis is completely and utterly conquered.

Moving on, we once again have minimum framerates for Crysis.

When it comes to Titan, the relative improvement in minimum framerates over the GTX 680 is nothing short of obscene. Whatever it was that was holding back the GTX 680 is clearly having a hard time slowing down Titan, leading to Titan offering 71% better minimum framerates. There’s clearly much more going on here than just an increase in functional units.

Meanwhile, though Titan’s gains over the 7970GE aren’t quite as large as those over the GTX 680, its lead still grows a bit to 26%. As for our multi-GPU cards, this appears to be a case where SLI is struggling; the GTX 690 is barely faster than Titan here. The 7990, at 31% faster than Titan, doesn’t seem to be faltering much.


337 Comments


  • chizow - Thursday, February 21, 2013 - link

    You must not have followed the development of GPUs, and particularly flagship GPUs very closely in the last decade or so.

    G80, the first "Compute GPGPU" as Nvidia put it, was first and foremost a graphics part and a kickass one at that. Each flagship GPU after, GT200, GT200b, GF100, GF110 have continued in this vein...driven by the desktop graphics market first, Tesla/compute market second. Hell, the Tesla business did not even exist until the GeForceTesla200. Jensen Huang, Nvidia's CEO, even got on stage likening his GPUs to superheroes with day jobs as graphics cards while transforming into supercomputers at night.

    Now Nvidia flips the script, holds back the flagship GPU from the gaming market that *MADE IT POSSIBLE* and wants to charge you $1K because it's got "SuperComputer Guts"??? That's bait and switch, stab in the back, whatever you want to call it. So yes, if you were actually in this market before, Nvidia has screwed you over to the tune of $1K for something that used to cost $500-$650 max.
  • CeriseCogburn - Saturday, February 23, 2013 - link

    You only spend at max $360 for a video card as you stated, so this doesn't affect you and you haven't been screwed.

    Grow up, crybaby. A company may charge what it desires, and since you're never buying, who cares how many times you scream that they screwed everyone?
    NO ONE CARES, not even you, since you never even pony up $500, as you yourself stated in this long, continuous crybaby whine you made here, and have been making, since the 680 was released, or rather, since Charlie fried your brain with his propaganda.

    Go get your 98 cent a gallon gasoline while you're at it, you fool.
  • chizow - Saturday, February 23, 2013 - link

    Uh no, I've spent over $1K in a single GPU purchasing transaction, have you? I didn't think so.

    I'm just unwilling to spend *$2K* for what cost $1K in the past for less than the expected increase in performance. I spent $700 this round instead of the usual $1K because that's all I was willing to pay for a mid-range ASIC in GK104 and while it was still a significant upgrade to my last set of $1K worth of graphics cards, I wasn't going to plunk down $1K for a set of mid-range GK104 GTX 680s.

    It's obvious you have never bought in this range of GPUs in the past, otherwise you wouldn't be posting such retarded replies for what is clearly usurious pricing by Nvidia.

    Now go away, idiot.
  • CeriseCogburn - Tuesday, February 26, 2013 - link

    Wrong again, as usual.
    So what it boils down to is you're a cheapskate, still disgruntled, still believe in Charlie D's lie, and are angry you won't have the current top card at a price you demand.
    I saw your whole griping list in the other thread too, but none of what you purchase or don't purchase makes a single bit of difference when it comes to your insane tinfoil hat lies that you have used for your entire argument.

    Once again, pretending you aren't aware of production capacity leaves you right where your brainless rant started a long time ago.

    You cover your tracks whining about ATI's initial price, which wasn't out of line either, and ignore nVidia's immediate crushing of it when the 680 came out, as you still complained about the performance increase there. You're a crybaby, that's it.

    That's what you have done now for months on end, whined and whined and whined, and got caught over and over in exaggerations and lies, demanding a perfectly increasing price/perf line slanting upwards, for years on end, lying about its past, which I caught you on in the earlier reviews.

    Well dummy, that's not how performance/price increases work in any area of computer parts, anyway.
    Glad you're just another freaking parrot, as the reviewers have trained you fools to automaton levels.
  • Pontius - Thursday, February 21, 2013 - link

    My only interest at the moment is OpenCL compute performance. Sad to see it's not working at the moment, but once they get the kinks worked out, I would really love to see some benchmarks.

    Also, as any GPGPU programmer knows, the number one bottleneck for GPU computing is randomly accessing memory. If you are working only within the on-chip local memory, then yes, you get blazingly fast speeds on a GPU. However, the second you do something as simple as a += on a global memory location, your performance grinds to a screeching halt. I would really like to see the performance of these cards on random memory heavy OpenCL benchmarks. Thanks for the review!
  • codedivine - Thursday, February 21, 2013 - link

    We may do this in the future if I get some time off from univ work. Stay tuned :)
  • Pontius - Thursday, February 21, 2013 - link

    Thanks codedivine, I'll keep an eye out.
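The global-memory bottleneck Pontius describes above can be illustrated with a plain-Python sketch of the two kernel shapes. This is CPU-side pseudocode of the access pattern, not real OpenCL; the names `slow_kernel`, `fast_kernel`, and `global_mem` are illustrative stand-ins:

```python
# Illustrative sketch of the GPU memory-access pattern described above.
# On a real GPU, every "+=" on global memory is a read-modify-write that
# serializes threads (atomics, cache traffic); accumulating in private/local
# memory and writing back once avoids most of that cost.

def slow_kernel(global_mem, data):
    # Read-modify-write on "global" memory every iteration -- the pattern
    # that grinds to a halt on a real GPU.
    for x in data:
        global_mem[0] += x

def fast_kernel(global_mem, data):
    # Accumulate in a fast "private" variable, touch global memory once.
    local_acc = 0
    for x in data:
        local_acc += x
    global_mem[0] += local_acc

mem = [0]
fast_kernel(mem, range(10))
print(mem[0])  # 45
```

Both versions compute the same result; the difference on real hardware is entirely in how often the slow memory is touched, which is why benchmarks stressing random global accesses would be an interesting addition.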
  • Bat123Man - Thursday, February 21, 2013 - link

    The Titan is nothing more than a proof-of-concept; "Look what we can do! Whohoo! Souped up to the max!" Nvidia is not intending this card to be for everyone. They know it will be picked up by a few well-moneyed enthusiasts, but it is really just a science project so that when people think about "the fastest GPU on the market", they think Nvidia.

    How often do you guys buy the best of the best as soon as it is out the door anyway? $1000, $2000, it makes no difference, most of us wouldn't buy it even at 500 bucks. This is all about bragging rights, pure and simple.
  • Oxford Guy - Thursday, February 21, 2013 - link

    Not exactly. The chip isn't fully enabled.
