Crysis: Warhead

Up next is our legacy title for 2013/2014, Crysis: Warhead. The standalone expansion to 2007's Crysis, Crysis: Warhead is now over five years old and can still beat most systems down. Crysis was intended to be forward-looking as far as performance and visual quality go, and it has clearly achieved that: only now have single-GPU cards arrived that can hit 60fps at 1920 with 4xAA, never mind 2560 and beyond.

[Chart: Crysis: Warhead - 1920x1080 - Enthusiast Quality + 4x MSAA]
[Chart: Crysis: Warhead - 1920x1080 - Enthusiast Shaders/Gamer Quality]
[Chart: Crysis: Warhead - Minimum Frame Rate - 1920x1080 - Enthusiast Quality + 4x MSAA]
[Chart: Crysis: Warhead - Minimum Frame Rate - 1920x1080 - Enthusiast Shaders/Gamer Quality]

Comments

  • kwrzesien - Tuesday, February 18, 2014 - link

    Seems coincidental that Apple is going to use TSMC for all production of the A8 chip with Samsung not ready yet; maybe Apple is getting priority on 20nm? Frankly, what nVidia is doing with 28nm is amazing, and if the yields are great on this mature process, maybe the price isn't so bad on a big die. Also keep in mind that the larger the die, the more surface area there is to dissipate heat; Haswell proved that moving to a very dense and small die can create even more thermal limitations.
  • DanNeely - Tuesday, February 18, 2014 - link

    Wouldn't surprise me if they are; all the fab companies other than Intel are wailing about the agonizingly high costs of new process transitions, and Apple has a history of throwing huge piles of its money into accelerating the build-up of supplier production lines in exchange for initial access to the output.
  • dylan522p - Tuesday, February 18, 2014 - link

    Many rumors point to Apple actually making a huge deal with Intel for 14nm on the A8.
  • Mondozai - Wednesday, February 19, 2014 - link

    Maybe 14nm for the iPhone to get even better power consumption and 20nm for the iPad? Or maybe give 14nm to the premium models of the iPad over the mini to differentiate further and slow/reverse cannibalization.
  • Stargrazer - Tuesday, February 18, 2014 - link

    So, what about Unified Virtual Memory?
    That was supposed to be a major new feature of Maxwell, right? Is it not implemented in the 750s (yet - waiting for drivers?), or is there currently a lack of information about it?
  • A5 - Tuesday, February 18, 2014 - link

    That seems to be a CUDA-focused feature, so they probably aren't talking about it for the 750. I'm guessing it'll come up when the higher-end parts come out.
  • Ryan Smith - Thursday, February 20, 2014 - link

    Bingo. This is purely a consumer product; the roadmaps we show are from NV's professional lineup, simply because NV doesn't produce a similar roadmap for their graphics lineup (despite the shared architecture).
  • dragonsqrrl - Tuesday, February 18, 2014 - link

    "Meet The Reference GTX 750 Ti & Zotac GTX 750 Series"

    "This is the cooler style that most partners will mimic, as the 60W TDP of the GTX 650 Ti does not require a particularly large cooler"

    You mean 750 Ti right?
  • chizow - Tuesday, February 18, 2014 - link

    The performance and efficiency of this chip and Maxwell are nothing short of spectacular given that this is still on 28nm. Can't wait to see the rest of the 20nm stack.

    Ryan, are you going to replace the "Architectural Analysis" at some point? Really looking forward to your deep-dive on that, or is it coming at a later date with the bigger chips?
  • dgingeri - Tuesday, February 18, 2014 - link

    In the conclusion, the writer talks about the advantages of the AMD cards, but after my experiences with my old 4870X2, I'd rather stick with Nvidia, and I know I'm not alone. Has AMD improved their driver quality to a decent level yet?
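
A quick aside on the Unified Virtual Memory question raised in the comments above: on the software side this corresponds to the Unified Memory API introduced with CUDA 6 around the Maxwell launch. The sketch below is only a minimal illustration of that programming model, not anything NVIDIA ships or documents for the GTX 750 series specifically; the kernel name, array size, and launch configuration are made up for the example, and error checking is omitted. The key point is that a single cudaMallocManaged allocation is visible to both the CPU and GPU, so the usual explicit cudaMemcpy calls go away.

    #include <cstdio>
    #include <cuda_runtime.h>

    // Multiply every element of the array by a factor on the GPU.
    __global__ void scale(float *data, int n, float factor)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n)
            data[i] *= factor;
    }

    int main()
    {
        const int n = 1 << 20;
        float *data;

        // One allocation, visible to both CPU and GPU; the runtime migrates
        // pages on demand instead of requiring explicit cudaMemcpy calls.
        cudaMallocManaged(&data, n * sizeof(float));

        for (int i = 0; i < n; ++i)   // CPU writes directly to the managed buffer
            data[i] = 1.0f;

        scale<<<(n + 255) / 256, 256>>>(data, n, 2.0f);
        cudaDeviceSynchronize();      // wait for the GPU before the CPU reads back

        printf("data[0] = %.1f\n", data[0]);  // prints 2.0, no copy-back needed
        cudaFree(data);
        return 0;
    }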
