Crysis Warhead

Up next is our legacy title for 2013, Crysis: Warhead. The stand-alone expansion to 2007's Crysis, Warhead is now over four years old and can still bring most systems to their knees. Crysis was intended to be forward-looking in both performance demands and visual quality, and it has clearly achieved that. We've only recently reached the point where high-end single-GPU cards can hit 60fps at 1920 with 4xAA, while low-end GPUs are just now reaching 60fps at lower quality settings and resolutions.

[Benchmark chart: Crysis: Warhead]

I can't believe it. An Intel integrated solution actually beats out an NVIDIA discrete GPU in a Crysis title. The 5200 does well here, outperforming the 650M by 12% in its highest TDP configuration. I couldn't run any of the AMD parts here, as Bulldozer-based parts seem to have a problem with our Crysis benchmark for some reason.

Crysis: Warhead is likely one of the simpler tests we have in our suite here, which helps explain Intel's performance a bit. It's also possible that older titles have been Intel optimization targets for longer.

[Benchmark chart: Crysis: Warhead]

Ramping up the resolution erases the gap between the highest-end Iris Pro and the GT 650M.

[Benchmark chart: Crysis: Warhead]

Moving to higher settings at a higher resolution gives NVIDIA the win once more. The margin of victory isn't huge, but the added special effects clearly stress whatever Intel's GPU architecture is lacking.

Comments

  • jasonelmore - Sunday, June 2, 2013 - link

    Looking at the prices, this will raise the price or lower the margins of the 13" Retina MacBook Pro by about $150 each.
  • mschira - Sunday, June 2, 2013 - link

    Yea, laptops benefit most - good for them.
    But what about the workstation?
    So Intel stopped being a CPU company and turned into a mediocre GPU company? (can't even beat last year's GT650M)
    I would applaud the rise in GPU performance if they had not completely forgotten the CPU.
    M.
  • n13L5 - Monday, June 3, 2013 - link

    You're exactly right.

    13" ultrabook buyers who need it the most get little to nothing out of this.

    And desktop users don't need or want GT3e, and it uses system RAM. You're better off buying a graphics card than upgrading to Haswell on the desktop.
  • glugglug - Tuesday, June 4, 2013 - link

    While I agree this misses "where it would benefit most", I disagree on just *where* that is.

    I guess Intel agrees with Microsoft's implicit decision that Media Center is dead. Real-time HQ Quick Sync would be perfect for transcoding anything extenders couldn't handle, and it would also make scanning for and skipping commercials incredibly efficient.
  • n13L5 - Tuesday, June 11, 2013 - link

    Core i5-4350U: Iris 5000, 15W, 1.5 GHz
    Core i7-4550U: Iris 5000, 15W, 1.5 GHz
    Core i7-4650U: Iris 5000, 15W, 1.7 GHz

    These should work. The 4650U is available in the Sony Duo 13 as we speak, though at a hefty price tag of $1,969.
  • Eric S - Monday, July 1, 2013 - link

    The last 13" looks like they were prepping it for a Fusion Drive and then changed their mind, leaving extra space in the enclosure. I think it's due for an internal redesign that could allow for a higher-wattage processor.

    I think the big deal is the OpenCL performance paired with ECC memory for the GPU. The NVIDIA discrete GPU uses non-ECC GDDR. This will be a big deal for users of Adobe products. Among other things, it solves the issue of running the Adobe Mercury Engine on non-ECC memory and the resulting single-bit errors in the output. Those errors are not a big deal for games, but they are not ideal for professional rendering output or scientific applications. This is basically a mobile AMD FireGL or NVIDIA Quadro card. Now we just need OpenCL support for the currently CUDA-based Mercury Engines in After Effects and Premiere. I have a feeling that is coming, or Adobe will also lose Mercury Engine compatibility with the new Mac Pro. (A minimal OpenCL device-enumeration sketch follows the comment thread below.)
  • tviceman - Saturday, June 1, 2013 - link

    Impressive iGPU performance, but I knew Intel was absolutely full of sh!t when claiming equal to or better than GT 650m performance. Not really even close, typically behind by 30-50% across the board.
  • Krysto - Saturday, June 1, 2013 - link

    When isn't Intel full of shit? Take the improvements they claim and cut them in half, and you'll be a lot closer to reality.
  • xtc-604 - Saturday, June 8, 2013 - link

    Lol...you think that's bad? Look at Apple's claims. "over 200 new improvements in Mountain Lion"
  • piroroadkill - Saturday, June 1, 2013 - link

    sh<exclamation point>t? What are we? 9?
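
On the OpenCL point raised in the comments: below is a minimal device-enumeration sketch, not from the article, assuming a standard OpenCL 1.x SDK (the headers, build line, and output format are illustrative). It shows roughly how an application can confirm that an integrated GPU such as Iris Pro 5200 is exposed as an OpenCL compute device, which is the prerequisite for an OpenCL rather than CUDA Mercury Engine path.

/* Build (assumption): cc opencl_probe.c -lOpenCL
 * On OS X: cc opencl_probe.c -framework OpenCL, with <OpenCL/opencl.h> instead. */
#include <stdio.h>
#include <CL/cl.h>

int main(void) {
    cl_platform_id platforms[8];
    cl_uint num_platforms = 0;

    /* Enumerate OpenCL platforms; each vendor runtime (Intel, NVIDIA, AMD) registers one. */
    if (clGetPlatformIDs(8, platforms, &num_platforms) != CL_SUCCESS || num_platforms == 0) {
        fprintf(stderr, "No OpenCL platforms found\n");
        return 1;
    }
    if (num_platforms > 8) num_platforms = 8;

    for (cl_uint p = 0; p < num_platforms; p++) {
        char platform_name[256];
        clGetPlatformInfo(platforms[p], CL_PLATFORM_NAME,
                          sizeof platform_name, platform_name, NULL);
        printf("Platform: %s\n", platform_name);

        /* Ask each platform for its GPU devices only. */
        cl_device_id devices[8];
        cl_uint num_devices = 0;
        if (clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_GPU,
                           8, devices, &num_devices) != CL_SUCCESS)
            continue;
        if (num_devices > 8) num_devices = 8;

        for (cl_uint d = 0; d < num_devices; d++) {
            char device_name[256];
            cl_uint compute_units = 0;
            clGetDeviceInfo(devices[d], CL_DEVICE_NAME,
                            sizeof device_name, device_name, NULL);
            clGetDeviceInfo(devices[d], CL_DEVICE_MAX_COMPUTE_UNITS,
                            sizeof compute_units, &compute_units, NULL);
            printf("  GPU: %s (%u compute units)\n", device_name, compute_units);
        }
    }
    return 0;
}

If Intel's OpenCL runtime is installed, the Iris Pro shows up here alongside any discrete GPU; applications that target OpenCL can then dispatch compute work to it instead of requiring a CUDA-capable card.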
