Crysis: Warhead

Up next is our legacy title for 2013, Crysis: Warhead. A stand-alone expansion to 2007's Crysis, Warhead is now over four years old and can still beat most systems down. Crysis was intended to be forward-looking as far as performance and visual quality go, and it has clearly achieved that. Only now have single-GPU cards arrived that can hit 60fps at 1920 with 4xAA.

At 2560 we still have some distance to go before any single-GPU card can crack 60fps. In the absence of that, Titan is the winner, as expected. Leading the GTX 680 by 54%, this is Titan's single biggest win over its predecessor, actually exceeding the theoretical performance advantage implied by the increase in functional units alone. For whatever reason the GTX 680 never gained much performance here over the GTX 580, and while it's hard to argue that Titan has fully reversed that, it has at least corrected enough of the problem to push its lead past 50%.
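That "exceeding the theoretical advantage" claim is easy to sanity-check with quick arithmetic from the cards' published core counts and base clocks (a rough sketch only; it ignores boost behavior, memory bandwidth, and per-card clock variation):

```python
# Rough theoretical shader throughput: CUDA cores x base clock.
# This first-order estimate ignores GPU Boost, bandwidth, and ROPs.
titan_cores, titan_base_mhz = 2688, 837
gtx680_cores, gtx680_base_mhz = 1536, 1006

titan_throughput = titan_cores * titan_base_mhz
gtx680_throughput = gtx680_cores * gtx680_base_mhz

theoretical_lead = titan_throughput / gtx680_throughput - 1
print(f"Theoretical lead: {theoretical_lead:.0%}")  # ~46%, below the 54% measured
```

On paper Titan's extra functional units buy roughly 46% more throughput at base clocks, so a measured 54% lead really does exceed the naive estimate.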

In the meantime, with GTX 680’s languid performance, this has been a game the latest Radeon cards have regularly cleared. For whatever reason they’re a good match for Crysis, meaning even with all its brawn, Titan can only clear the 7970GE by 21%.

On the other hand, our multi-GPU cards are a mixed bag. Once more Titan loses to both, but the GTX 690 only leads by 15% thanks to GK104’s aforementioned weak Crysis performance. Meanwhile the 7990 takes a larger lead at 33%.

I’d also note that we’ve thrown in a “bonus round” here just to see when Crysis will be playable at 1080p with its highest settings and with 4x SSAA for that picture-perfect experience. As it stands AMD multi-GPU cards can already cross 60fps, but for everything else we’re probably a generation off yet before Crysis is completely and utterly conquered.
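The reason 1080p with 4x SSAA is such a high bar is that supersampling shades every sample of the higher internal resolution before downsampling, so the arithmetic works out to native 4K-class workload:

```python
# Shading cost under SSAA scales with total sample count: the scene is
# effectively rendered at a higher internal resolution, then downsampled.
width, height, ssaa_factor = 1920, 1080, 4

samples = width * height * ssaa_factor
print(samples)  # 8294400 -- the same pixel count as a native 3840x2160 frame
```

In other words, "1080p with 4x SSAA" asks roughly the same shading work of the GPU as rendering at 3840x2160, which is why only multi-GPU setups clear 60fps today.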

Moving on, we once again have minimum framerates for Crysis.

When it comes to Titan, the relative improvement in minimum framerates over the GTX 680 is nothing short of obscene. Whatever was holding back the GTX 680 is clearly having a hard time slowing down Titan, leading to Titan offering 71% better minimum framerates. There's clearly much more going on here than just an increase in functional units.
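Minimum framerates like these are typically derived from per-frame render times rather than a per-second counter: the average comes from total frames over total time, while the minimum corresponds to the single longest frame. A minimal sketch (the frame times are made-up numbers, not our benchmark data):

```python
# Hypothetical frame-time capture in milliseconds; one long 33.4ms frame
# drags the minimum fps well below the average.
frame_times_ms = [16.7, 16.9, 17.1, 33.4, 16.8, 17.0]

total_seconds = sum(frame_times_ms) / 1000.0
avg_fps = len(frame_times_ms) / total_seconds   # frames per elapsed second
min_fps = 1000.0 / max(frame_times_ms)          # worst single frame

print(f"avg: {avg_fps:.1f} fps, min: {min_fps:.1f} fps")
```

This is also why minimums can diverge so sharply from averages: one stall per second is nearly invisible in the average but defines the minimum.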

Meanwhile, though Titan's gains here over the 7970GE aren't quite as high as they were against the GTX 680, its lead over the 7970GE still grows a bit, to 26%. As for our multi-GPU cards, this appears to be a case where SLI is struggling; the GTX 690 is barely faster than Titan here. The 7990, at 31% faster than Titan, doesn't seem to be faltering as much.


  • CeriseCogburn - Saturday, February 23, 2013 - link

    $800 or $900 is close enough to a grand that it seems silly.

    Two 7970's at the $579 release price that held for months is nearer $1200, and we have endless amd fanboy braggarts here claiming they did the right thing and went for it, or certainly would, since future proofing and value is supreme.

    Now not a single one has said in this entire near 20 pages of comments they'd love to see the FUTURE PROOF ! of the 6 GIGS of ram onboard...
    Shortly ago it was all we ever heard, the absolute reason the 79xx series MUST be purchased over the 600 nVidia series...

    ROFL - the bare naked biased stupidity is almost too much to bear.

    Now the "futureproof" amd cards the crybaby liars screeched must be purchased for the 3G of ram future wins, ARE LOSERS TO THIS NEW TITAN PERIOD, AND FOREVERMORE.

    I guess the "word" "futureproof" was banned from the techtard dictionary just before this article posted.
  • CeriseCogburn - Saturday, February 23, 2013 - link

    Thank you nVidia, 3 monitors, and a 4th, ultra rezz 5,760 x 1,080, and playable maxxed !

    ROFL -

    Thank you all the little angry loser fanboys who never brought this up over 22 pages of ranting whines.
  • CeriseCogburn - Saturday, February 23, 2013 - link

    " After a few hours of trial and error, we settled on a base of the boost curve of 980MHz, resulting in a peak boost clock of a mighty 1,123MHz; a 12 per cent increase over the maximum boost clock of the card at stock. "
    http://www.bit-tech.net/hardware/2013/02/21/nvidia...
    That's 27mhz according to here...

    LOL

    Love this place.
  • TheJian - Sunday, February 24, 2013 - link

    Here's why they don't test more of the games I mentioned previously and others:
    http://www.techpowerup.com/reviews/NVIDIA/GeForce_...
    Crysis 2, with DX11 & HIRES pack added @1920x1200 it beats 3 radeons...Note you have to go to a game where NV doesn't care (warhead) to show it so badly. C2 shows much closer to 2 or 3 radeons than warhead which I don't think NV has spent a second on in probably 4 years.

    Page 11 has Diablo 3 scores.

    Page 4 for AC3
    Assassins Creed 3, beats 1,2 or 3 Radeon 7970's at all tested resolutions...ROFL
    http://techreport.com/review/24381/nvidia-geforce-...
    Showing the same 20fps diff at 2560x1600, and showing the same CF crapout, losing to a single 7970 even on both websites. Clearly AMD has per-game problems. Which they allude to on page 16 of the review:
    "Just know that waiting for driver updates to fix problems has become a time-honored tradition for owners of CrossFire rigs."

    techpowerup.com titan review page 5
    Batman Arkham City, same story...You see this is why SLI/CF isn't all it's cracked up to be...Every game needs work, and if company X doesn't do the work, well, AC3/Bat AC etc is what happens...Crysis 2 seems almost the same also.

    techpowerup.com titan article page 8
    COD Black ops2, 2 titans handily beat 1/2/3 7970's.

    techpowerup page 13:
    F1 2012...ROFL, 1 titan to rule all cards...1/2/3 CF or SLI all beaten by ONE CARD. It seems they limit the games here for a reason at anandtech...Always pitching how fast two 7970's is in this article vs a titan, even though they half recommend ONE titan but 'not at these prices, dual cards will always win'.
    ...ummm, I beg to differ. It should win, if drivers are done correctly, but as shown not always.

    Note at anandtech, dirt showdown shows 3% for NV Titan vs. 7970ghz, but if you run the FAR better Dirt3:
    http://www.pcper.com/reviews/Graphics-Cards/NVIDIA...
    It's a ~20% win for Titan vs. 7970ghz. Crappy showdown game picked for a reason?

    Wait we're not done...
    techpowerup titan review page 15
    Max Payne3, 1 titan closing on 2 or 3 radeon 7970ghz's no matter the res...Not always good to get more cards I guess?

    techpowerup.com page 18 for starcraft 2
    Oh, snap...This is why they don't bench Starcraft 2...ROFL...1, 2 or 3 7970, all beat by 1 titan.
    But then, even a GTX 680 beats 3 7970's in all resolutions here...Hmmm...But then this is why you dropped it right? You found out a 680 beat 7970ghz way back here, even the 670 beat 7970ghz:
    http://www.anandtech.com/show/6025/radeon-hd-7970-...
    Totally explains why you came up with an excuse shortly after claiming a patch broke the benchmark. Most people would have just run with the patch version from a week earlier for the 660ti article. But as bad as 7970ghz lost to 670@1920x1200 it was clear the 660TI would beat it also...LOL. Haven't seen that benchmark since, just a comment it would be back in the future when patch allowed...NOPE. It must really suck for an AMD lover to have to cut out so many games from the new test suite.

    techpowerup.com titan review page 7
    Crap, someone benched Borderlands 2...LOL...Almost the same story, a titan looks good vs. 3 7970's (only loses in 5760x1080 which the single card isn't really for anyway).
    Again, proving adding more cards in some cases even goes backwards...LOL. It shouldn't, but you have to have the money to fix your drivers. Tough to do cutting 30% of your workforce & losing 1.18B.

    techpowerup page 20 titan article has WOW mists of pandaria.
    Dang those techpowerup guys, They had the nerve to bench the most popular game in the world. WOW Mists of Pandaria...Oh snap, 1 titan beats 3 7970's again, at all res. OUCH, even a SINGLE 680 does it...See why they don't bench other games here, and seem to act as though we all play pointless crap like warhead and Dirt3 showdown? Because if you bench a ton of today's hits (anything in 2012) except for a special few, you'll get results like techpowerup.

    That's ELEVEN, count them, 11 situations that kind of show a LOT MORE of the picture than they do here correct? I just wish I knew if they used mins or max at techpowerup (too lazy to ask for now), but either way it shows the weakness of multi-card setups without proper driver dev. It also shows why you need a wide range of TODAY's games tested for an accurate picture. Anandtech has really begun to drop the ball over the years since Ryan took over card reviews. These games just add to the missing latency discussion issues that affect all radeons and are still being fixed on a GAME BY GAME basis. The driver fix doesn't affect them all at once. The last driver fixed 3 games (helped anyway), and every other game seems to need its own fix. BUMMER. Ryan totally ignores this discussion. Techreport has done quite a few articles on it, and cover it in detail again in the titan review. PCper does also.

    Same Techpowerup article (since this site is puking on my links calling it spam) pg 19
    Skyrim, with all 3 radeon's at the bottom again. 1, 2 & 3 7970's beaten by ONE TITAN! So I guess that's 11 situations Ryan ignores. Does this make anyone take another look at the conclusions here on anandtech?
    PCper titan article shows the same in skyrim.
    http://www.anandtech.com/show/6025/radeon-hd-7970-...
    I kind of see why you dropped skyrim, even in your own tests at 1920x1200 670 was beating 7970ghz also, so even after 13.11 you'll still likely have a loss to 680 as shown at the other two links here, this was 4xmsaa too, which you complained about being a weakness in the 660ti article if memory serves...This kind of score short circuits comments like that huh? I mean 580/670 & 680 all pegged at 97.5fps clearly cpu bound not a memory issue I think, since all radeons are below 86. Well, shucks, can't have this benchmark in our next suite...ROFL. Anyone seeing a pattern here?

    Want more bias? Read the 660TI review's comments section where Ryan and I had a discussion about his conclusions in his article...ROFL. The fun starts about page 17 if memory serves (well not if you list all comments, diff page then). I only had to use HIS OWN benchmarks for the most part to prove his conclusions BS in that case. He stooped so low as to claim a 27in monitor (bought from ebay...ROFL, even amazon didn't have a seller then, which I linked to) was a reason why 660ti's bandwidth etc sucked. Enthusiasts buy these apparently (cheapest was $500 from korea, next stop was $700 or so). Of course this is why they leave out mins here, as they would hit TEENS or single digits in that article if he posted them. All of the games he tested in that article wouldn't hit 30fps at 2560x1600 on EITHER amd or nv on a 660 Ti. So why claim a victor?

    What about Crysis 3? Titan at or near top:
    http://www.guru3d.com/articles_pages/crysis_3_grap...
    Note he's telling you 40 min, and you really need 60 for smooth gameplay throughout, as he says he uses avg. Also note at 2560x1600 with everything on, 7970/680 won't be playable as he's only averaging 30. But see the point, only WARHEAD sucks on NV. But as shown before nobody plays it, as servers are empty. 7970 wins over 680 by 20% in Ryan's warhead tests. But as soon as you go Crysis 2 dx11/hires textures or Crysis 3 it's suddenly a tie or loss.
    Page 8 in the same article
    Note the comment about 2560x1600, dipping to 25 or so even on gtx 680, and only fastest cards on the planet handle it fine:
    "At 2560x1600 with Very High Quality settings only the most expensive cards on the globe can manage. Please do bear in mind that our tests are based on averages, so YES there will be times your FPS drops to 25 fps in big fire fights and explosions, even with say a GTX 680."
  • TheJian - Sunday, February 24, 2013 - link

    Sorry, this site pukes on a certain amount of links, so I had to change them all to just page refs for the most part: 2nd part here :)
    Ryans warhead comment from this article: "In the meantime, with GTX 680’s LANGUID performance, this has been a game the latest Radeon cards have regularly cleared. For whatever reason they’re a good match for Crysis, meaning even with all its brawn, Titan can only clear the 7970GE by 21%."
    No Ryan, just in this game...Not crysis 2 or 3...LOL. He gives yet another dig in the same page, because this 5yr old game is major important even though nobody plays it:
    "As it stands AMD multi-GPU cards can already cross 60fps, but for everything else we’re probably a generation off yet before Crysis is completely and utterly conquered."

    Jeez, if you'd just put down the 5yr old game and get with the times (crysis 2 or 3 will do Ryan, or any of the games above, what 11 of them I gave?), you'll find the only LANGUID performer is AMD. So Titan is a gen behind if you believe him on all CRYSIS games? If NV is a gen behind, how come nobody else shows this in Crysis 2 DX11/Hires pack, or Crysis 3? Oh, that's right, NV actually optimizes for games that are less than 5yrs old...ROFL. Honestly I don't think AMD has done anything on their driver for warhead for 5yrs either...They just happen to be faster in a 5yr old game. :) And NV doesn't care. Why would they with the non-existent player base shown above on servers? Is Cryengine 2 being used in a bunch of games I don't know about? Nope, just WARHEAD. I've never heard of the other 3 on the list, but crysis 1 is not quite the same engine and as shown above performs quite well on kepler (1fps difference on 680 vs 7970ghz @1920x1200), same for crysis 2 & 3. Only warhead sucks on kepler.
    Search wikipedia.org for CryEngine
    You can do this for any game people, to find out what is out now, and what is coming. Look up unreal 3 engine for instance and take a gander at the # of games running it vs. Warhead.
    search wikipedia.org for List of Unreal Engine games
    Complete list of u3 base games there.

    http://techreport.com/review/24381/nvidia-geforce-...
    Guild Wars 2, Titan beating single 7970 & 7970CF at 2560x1600 by a lot...Another ignored game with 3mil sold. Titan is beating CF 7970 by ~20%. OUCH.

    http://www.guru3d.com/articles_pages/crysis_3_grap...
    Reminder, crysis 3 2560x1600 680gtx (that languid card on warhead according to Ryan) TIES 7970ghz in guru3d's benchmarks. Mind you, neither can run there as it's 30fps for both. You'll dip to 10-20fps...ROFL. But point proven correct? RYAN is misrepresenting the facts. Unless you play 3 gen old warhead instead of crysis2 or crysis 3 (or even updated crysis 1 now on cryengine3 according to the site, probably why it does well on kepler too)? Who does that? You still play serious sam1 or far cry 1 too? Still playing doom1?

    Is that 14 games I'm up to now? That's a lot of crap you couldn't use in the new suite huh?

    http://www.anandtech.com/show/6159/the-geforce-gtx...
    The comments section for Ryan's 660ti article. Realizing what I said above, go back and read our conversation. Read as he attempts to defend the bias conclusions in that article, and read the data from his OWN article I used then to prove those cards were made for 1920x1200 and below, not 2560x1600 or 2560x1440 as Ryan defended. Look at the monitor he was pitching and me laying out how you had to EBAY it from KOREA to even make his statements make sense (I gave links, showed that ebay company in korea didn't even have an about page etc...ROFL). Not that you'd actually order a monitor direct from some DUDE in korea giving up your visa like that anyway, how risky is that for a $500 monitor? But it was humorous watching him and Jarred defend the opinions (Jarred basically called me a ahole and said I was uninformed...LOL). The links and the data said otherwise then, and above I just did it again. This hasn't changed much with dual cards or titan. You still need these to play above 1920x1200 at above 30fps and some games still bring the top cards to their knees at 2560x1600 etc. That's why they don't post minimums here. All of the arguments about bandwidth being an issue go out the window when you find out you'll be running 10-20fps to prove it's true. One of the pages in the 660TI article is titled ~"that darned memory bandwidth"...Really? I also pointed out the # of monitors selling @1920x1200 or less (68 if memory serves) and above it on newegg.com at the time. I pointed at that steampowered.com showed less than 2% market share above 1920x1200 (and almost all had dual cards according to their survey, NOT a 660TI or below). I doubt it's much higher now.

    Hopefully one day soon Anand will stop this junk. It's hard to believe this is the new game suite...I mean seriously? That's just sad. But then Anand himself ignored basically the entire freakin' earnings report for NVDA and didn't even respond to the only comment on his NON-informational post (mine...LOL).
    http://www.anandtech.com/show/6746/tegra-4-shipmen...
    I'm the only comment... $20 says Nobody from Anandtech addresses this post either... :) What can they say? The data doesn't lie. Don't believe me...I provided all the links to everything so you can judge them yourselves (and what they've said or done - or not done in all these cases). They didn't address last Q's financial/market share whipping NVDA gave AMD either. I love AMD myself. I currently run a 5850, and put off my 660ti purchase as I'm not really impressed with either side currently and can wait for now (had a black friday purchase planned but passed), but the BIAS here has to stop. Toms, Techreport, PCper etc is reporting heavily on latency problems on radeons (at least 1 other user already mentioned it in this comment section) and AMD is redoing their memory manager to fix it all! AMD released a driver just last month fixing 3 games for this (fixed borderlands2, guild wars2 and one other). PCper.com (Ryan Shrout) is still working on exactly how to accurately test it (others have already decided I guess but more will come out about this). He's calling it frame rating capture:
    http://www.pcper.com/reviews/Graphics-Cards/Frame-...
    Note his comment on situation:
    "This is the same graph with data gathered from our method that omits RUNT frames that only represent pixels under a certain threshold (to be discussed later). Removing the tiny slivers gives us a "perceived frame rate" that differs quite a bit - CrossFire doesn't look faster than a single card."
    AMD cheating here or what (they've both done tricks at some point in their history)? I look forward to seeing Ryan Shrout's data shortly. He used to run AMDMB.com so I'm pretty sure he's pro AMD :)
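    The runt-frame filtering PCPer describes can be sketched simply: drop any frame whose on-screen scanline count falls below some threshold, then recount what's left to get a "perceived" framerate. (The threshold and frame data below are made-up for illustration; PCPer hadn't published their actual cutoff at this point.)

```python
# "Perceived framerate" per PCPer's frame-rating idea: frames occupying
# only a sliver of the screen (runts) don't contribute to perceived
# smoothness, so they're excluded before counting fps.
RUNT_THRESHOLD_SCANLINES = 20  # assumed cutoff, not PCPer's official value

# Scanlines displayed by each captured frame over one second (hypothetical;
# the alternating runts mimic the CrossFire pattern PCPer shows).
frames = [540, 15, 530, 12, 545, 18, 535, 10]

full_frames = [f for f in frames if f >= RUNT_THRESHOLD_SCANLINES]

raw_fps = len(frames)             # what FRAPS-style counting reports
perceived_fps = len(full_frames)  # runts removed
print(raw_fps, perceived_fps)     # 8 vs 4: "CrossFire doesn't look faster"
```

    Filtering like this is exactly why the "perceived frame rate" in their graph drops so sharply versus the raw number.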
    http://www.tomshardware.com/reviews/geforce-gtx-ti...
    more latency stuff. Note AMD is working on a new memory manager for GCN supposedly to fix this. I wonder if this will lower their fps avg.

    I didn't form my opinion by making stuff up here. AMD has great stuff, but I provided a LOT of links above that say it's not quite like Anandtech would have you believe. I can find benchmarks where AMD wins, but that's not the point. Ryan always makes the claim AMD wins (check his 660TI article conclusions for example). At best you could call this even, at worst it looks pretty favorable to NV cards here IMHO. IF you toss out crap/old 2 games (warhead, dirt showdown) that nobody plays and add in the 14 above this is pretty grimm for AMD correct? Pretty grimm for Anandtech's opinion too IMHO. If you can argue with the data, feel free I'd like to see it. None of the benchmarks are what you'd buy either, they are all reference clocked cards which nobody in their right mind would purchase. Especially the 660TI's, who buys ref clocked 660TI's? Toms/anand/hardocp seem to love to use them even though it's not what we'd buy as the same price gets you another 100mhz easily OOBE.

    I'd apologize for the wall, but it's not an opinion, all of the benchmarks above are facts and NOT from me. You can call me crazy for saying this site has AMD bias, but that won't change the benchmarks, or the ones Anandtech decided to REMOVE from their test suite (skyrim, borderlands2, diablo3, starcraft2 - all have been in previous tests here, but removed at 660ti+ articles). Strange turn of events?
  • Ryan Smith - Monday, February 25, 2013 - link

    "I'm the only comment... $20 says Nobody from Anandtech addresses this post either... :) What can they say?"

    Indeed. What can we say?

    I want to respond to all user comments, but I am not going to walk into a hostile situation. You probably have some good points, but if we're going to be attacked right off the bat, how are we supposed to have a meaningful discussion?
  • TheJian - Monday, February 25, 2013 - link

    If that's what you call an attack, it has to be the most polite one I've seen. The worst I called you was BIASED.

    Please, feel free to defend the 14 missing games, and the choice of warhead (which doesn't show the same as crysis 1, 2 or 3 as shown) and dirt showdown. Also why Starcraft2 was in but now out, when a launch event for the next one is coming within the next few weeks. Not an important game? The rest above are all top sellers also. Please comment on skyrim, as with the hires pack that is OFFICIAL as I noted in response to CeriseCogburn (where right above his post you call it cpu limited, as his link and mine show it is NOT, and AMD was losing in his by 40fps! out of ~110; if that isn't GPU separation I don't know what is). Are you trying to say you have no idea what the HI-RES pack is for skyrim, out for over a year now? Doesn't the term HI-RES instantly mean more GPU taxing than before?

    Nice bail...I attacked your data and your credibility here, not YOU personally (I don't know you, don't care either way what you're like outside your reviews). Still waiting for you to attack my data. Still waiting for an explanation of the game choices and why all the ones I listed are left out for 2 games that sold 100,000 units or less (total failures) and one of them (warhead) from 5 years ago that doesn't represent Crysis 1, 2 or 3 benchmarks shown from all the titan articles (where all the keplers did very well with a lot of victories at 1920x1200, and some above, not just titan).

    This isn't, nor have any of my posts been, hostile. Is it hostile because I point out you misrepresenting the facts? Is it hostile because I backed it with a lot of links showing it NOT like you're saying (which enforces the misrepresentation of the facts comments)? It would be (perhaps) hostile if I insinuated you were an Ahole and have an "uninformed opinion" like Jarred Walton said about me in the 660ti comments section (which I never did to either of you), even after I provided boat loads of PROOF and information like I did here. So basically it appears that if I provide ample proof and in any way say you're not being factual, I'm labelled hostile. I was even polite in my response to Jarred after that :)

    How does one critique your data without being hostile? :)

    Never mind I don't want an answer to your distraction comment. Defend your data, and rebut mine. I'm thinking there are curious people after all I provided. It won't be meaningful until you defend your data and rebut the data from all the sites I provided (heck any, they all show the same, 14 games where NV does pretty well and not so good for radeons or CF, in some cases even SLI). I've done all the work for you, all you have to do is explain the results of said homework, or just change your "suite of benchmarks" for gaming. Clearly you're leaving out a lot of the story which heavily slants to NV if added. The ones in the links are the most popular games out today and in the last 15 months. Why are they missing? All show clear separation in scores (in same family of gpu's or out). These are great gpu test games as shown. So please, defend your data and game choices, then do some rebuttal of the evidence. IF someone said this much stuff about my data, and I thought I had a leg to stand on, I'd certainly take some time to rebut the person's comments. Politely just as all my comments were. Including this one. I can't think of a defense here, but if you can and it makes sense I'll acknowledge it on the spot. :)
  • CeriseCogburn - Tuesday, February 26, 2013 - link

    I appreciate that, and read all the words and viewed all the links and then some.

    I have noted extreme bias in many past articles in the wording that is just far too obvious and friends and I have just had a time rolling through it.
    I commented a few reviews back pointing a bit of it out yet there's plenty of comments that reek as well.
    I am disappointed yet this site is larger than just video cards so I roll with it so to speak.

    Now that you've utterly cracked open the factual can exposing the immense amd favored bias, and the benchmark suite is scheduled to change -lol- that's how the timing of things work and coincide so often it seems.

    Anyway, you not only probably have some points, you absolutely do have a lot of unassailable points but then people do have certain "job pressures" so I don't expect any changes at all but am very appreciative and do believe you have done everyone a world of good with your posts.
    The 4 benchmarks dropped was just such a huge nail, lol.

    It's all good, some people like to live in a fantasy type blissful fog and feel good and just the same when reality shines the light it's all good too and even better.

    I absolutely appreciate it, know that.
    You did not waste your time nor anyone else's.
  • thralloforcus - Monday, February 25, 2013 - link

    Please test folding@home and bitcoin mining performance! Those would be my main justifications for getting a new video card to replace the 570 Classified's I have in SLI.
  • Ryan Smith - Monday, February 25, 2013 - link

    As noted elsewhere, OpenCL is currently non-functional on Titan. Until it's fixed we can't run FAHBench. As for BitCoin, Jarred has already gone into some good reasons why it's not a very useful GPU benchmark, and why GPUs are becoming less than useful for it.
