Crysis: Warhead

Up next is our legacy title for 2014, Crysis: Warhead. A stand-alone expansion to 2007's Crysis, Warhead is now six years old and can still beat most systems down. Crysis was intended to be forward-looking as far as performance and visual quality go, and it has clearly achieved that: we've only now reached the point where single-GPU cards can hit 60fps at 1920 with 4xAA, never mind 2560 and beyond.

Crysis: Warhead - 3840x2160 - Gamer Quality

Crysis: Warhead - 2560x1440 - Enthusiast Quality + 4x MSAA

Crysis: Warhead - 1920x1080 - Enthusiast Quality + 4x MSAA

At the launch of the GTX 680, Crysis: Warhead was rather punishing of that card's decreased memory bandwidth versus the GTX 580: the GTX 680 was faster, but the gains weren't as great as what we saw elsewhere. For this reason, the fact that the GTX 980 can hold a 60% lead over the GTX 680 is particularly important, because it means that NVIDIA's 3rd generation delta color compression is working, and working well. Compression has allowed NVIDIA to overcome quite a bit of memory bandwidth bottlenecking in this game and push performance higher.
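For readers unfamiliar with the technique, delta color compression works by storing most pixels as small differences from their neighbors rather than as full color values; rendered images tend to be locally smooth, so those differences usually fit in far fewer bits. The following is a deliberately simplified sketch of the principle, not NVIDIA's actual tile-based hardware algorithm, which is proprietary:

```python
# Simplified illustration of delta color encoding on one scanline of
# 8-bit pixel values. Real hardware operates on tiles and falls back
# to uncompressed storage when the deltas are too large to help.

def delta_encode(scanline):
    """Store the first pixel verbatim, then per-pixel differences."""
    if not scanline:
        return []
    deltas = [scanline[0]]
    for prev, cur in zip(scanline, scanline[1:]):
        deltas.append(cur - prev)
    return deltas

def delta_decode(deltas):
    """Reverse the encoding by accumulating the differences."""
    out, acc = [], 0
    for d in deltas:
        acc += d
        out.append(acc)
    return out

# A smooth gradient produces tiny deltas that need only a few bits
# each, so far less data has to cross the memory bus per frame.
line = [100, 101, 101, 102, 104, 104, 105, 107]
encoded = delta_encode(line)
assert delta_decode(encoded) == line
print(encoded)  # [100, 1, 0, 1, 2, 0, 1, 2]
```

The key point is that when the deltas are small, less framebuffer data has to cross the memory bus per frame, which is exactly the resource Crysis: Warhead stresses.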

That said, since the GTX 780 Ti has a full 50% more memory bandwidth, it's telling that the GTX 780 Ti and GTX 980 are virtually tied in this benchmark. Crysis: Warhead will still gladly take whatever memory bandwidth it can get from NVIDIA's cards.
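The 50% figure follows straight from the published specifications: both cards run their GDDR5 at 7GHz effective, but the GTX 780 Ti pairs it with a 384-bit memory bus against the GTX 980's 256-bit bus. A quick back-of-the-envelope check:

```python
# Peak memory bandwidth = effective memory clock (GT/s) x bus width
# (bytes). Both figures below are the cards' published specs.

def bandwidth_gb_s(effective_clock_gtps, bus_width_bits):
    return effective_clock_gtps * (bus_width_bits / 8)

gtx_780_ti = bandwidth_gb_s(7.0, 384)  # 336.0 GB/s
gtx_980    = bandwidth_gb_s(7.0, 256)  # 224.0 GB/s

print(gtx_780_ti / gtx_980 - 1)  # 0.5 -> exactly 50% more bandwidth
```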

Against AMD's cards, meanwhile, this is the other game where the GTX 980 can't cleanly defeat the R9 290XU. The two cards are virtually tied, with AMD edging out NVIDIA in two of the three tests. Given their differing architectures I'm hesitant to say this is a memory bandwidth factor as well, but if it were, the R9 290XU would have a very big memory bandwidth advantage going into it.

Crysis: Warhead - Min. Frame Rate - 3840x2160 - Gamer Quality

Crysis: Warhead - Min. Frame Rate - 2560x1440 - Enthusiast Quality + 4x MSAA

Crysis: Warhead - Min. Frame Rate - 1920x1080 - Enthusiast Quality + 4x MSAA

When it comes to minimum framerates the story is much the same, with the GTX 980 and the R9 290XU trading places. It's interesting to note, though, that the GTX 980 does rather well against the GTX 680 here; that memory bandwidth advantage appears to be really paying off in minimum framerates.

Comments

  • Kutark - Sunday, September 21, 2014 - link

    I'd hold on to it. That's still a damn fine card. Honestly, you could probably find a used one on eBay for a decent price and SLI it up.

    IMO though, I'd splurge for a 970 and call it a day. I've got dual 760s right now, the first time I've done SLI in probably 10 years, and honestly the headaches just aren't worth it. Yeah, most games work, but some games will have weird graphical issues (BF4 near release was a big one, and DOTA 2 doesn't seem to like it), others don't utilize it well, etc. I kind of wish I'd just stuck with the single 760. Either way, my 2p.
  • SkyBill40 - Wednesday, September 24, 2014 - link

    @ Kutark:

    Yeah, I tried to buy a nice card at that time despite wanting something higher than a 660 Ti. But as my wallet was the one doing the dictating, it's what I ended up with, and I've been very happy. My only concern with a used one is just that: it's USED. Electronics are one of those "no go" zones for me when it comes to buying second hand, since you have no idea about the circumstances surrounding the device. And seeing as it's a video card and not a Blu-ray player or something, I'd like to know how long it's run, if it's been OC'd, and the like. I'd be fine with buying another one new, but not at the prices I'm seeing, which are right in line with a 970. That would be dumb.

    In the end, I'll probably wait it out a bit more and decide. I'm good for now and will probably buy a new 144Hz monitor instead.
  • Kutark - Sunday, September 21, 2014 - link

    Psshhhhh.... I still have my 3dfx Voodoo SLI card. Granted, it's just sitting on my desk, but still!!!

    In all seriousness though, my roommate, who is NOT a gamer, is still using an old 7800 GT card I had lying around, because the video card in his ancient computer decided to go out and he didn't feel like building a new one. Can't say I blame him; Core 2 Quads are juuust fine for browsing the web and such.
  • Kutark - Sunday, September 21, 2014 - link

    Voodoo 2, I meant; I realized I didn't type the 2.
  • justniz - Tuesday, December 9, 2014 - link

    >> the power bills are so ridiculous for the 8800 GTX!

    Sorry, but this is ridiculous. Do the math.

    The best info I can find says your card consumes 230W.
    Assuming you're paying 15¢/kWh, even gaming 12 hours a day, every day, for a whole month will cost you $12.59. Doing the same with a GTX 980 (165W) would cost $9.03/month.

    So you'd be paying maybe $580 to save $3.56 a month.
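    For anyone who wants to verify that arithmetic, here's a quick sketch assuming 12 hours/day of load, an average 30.4-day month, and the 15¢/kWh rate quoted above:

    ```python
    # Monthly electricity cost of running a GPU at load.
    HOURS_PER_DAY = 12
    DAYS_PER_MONTH = 30.4   # average month
    USD_PER_KWH = 0.15

    def monthly_cost(watts):
        kwh = watts / 1000 * HOURS_PER_DAY * DAYS_PER_MONTH
        return kwh * USD_PER_KWH

    print(round(monthly_cost(230), 2))  # 8800 GTX-class card: 12.59
    print(round(monthly_cost(165), 2))  # GTX 980: 9.03
    print(round(monthly_cost(230) - monthly_cost(165), 2))  # 3.56
    ```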
  • LaughingTarget - Friday, September 19, 2014 - link

    There is a major difference between market capitalization and available capital for investment. Market Cap is just a rote multiplication of the number of shares outstanding by the current share price. None of this is available for company use and is only an indirect measurement of how well a company is performing. Nvidia has $1.5 billion in cash and $2.5 billion in available treasury stock. Attempting to match Intel's process would put a significant dent into that with little indication it would justify the investment. Nvidia already took on a considerable chunk of debt going into this year as well, which would mean that future offerings would likely go for a higher cost of debt, making such an investment even harder to justify.

    While Nvidia is blowing out AMD 3:1 on R&D and capacity, Intel is blowing both of them away, combined, by a wide margin. Intel is dropping $10 billion a year on R&D, which is a full $3 billion beyond the entire asset base of Nvidia. It's just not possible to close the gap right now.
  • Silma - Saturday, September 20, 2014 - link

    I don't think you realize how many billions of dollars you need to spend to open a 14nm fab, not even counting R&D and yearly operating costs.
    It's humongous; there's a reason there are so few foundries in the world.
  • sp33d3r - Saturday, September 20, 2014 - link

    Well, if the NVIDIA/AMD CEOs are blind enough not to see it coming, then Intel is going to manufacture its next integrated graphics on a 10 or 8nm chip, and though immature, it will be tough competition in terms of power, efficiency, and even weight.

    Remember, PCs currently ship with Intel's integrated graphics as a must, and people add third-party graphics only because Intel's isn't good enough, literally adding the weight of two graphics solutions (Intel's and the third party's) to the product. It would be far more convenient if integrated graphics could outperform or at least challenge third-party GPUs; we would just throw NVIDIA away, and then it wouldn't merely stop being a monopoly, it would be wiped out completely.

    Besides, Intel's integrated graphics get more mature (and not just in die size) with every launch; just compare the HD 4000 with the 5000 series. It won't be long before they catch up.
  • wiyosaya - Friday, September 26, 2014 - link

    I have to agree that it isn't entirely about the verification cost breaking the bank. The more likely reason, I think, is that since the current node works, they will try to wring every penny out of it. Look at the prices for the Titan Z. If that is not an attempt to fleece the "gotta have it" buyer, I don't know what is.
  • Ushio01 - Thursday, September 18, 2014 - link

    Wouldn't paying to use the 22nm fabs be a better idea, as they're about to become underused and all the teething troubles have been fixed?
