Crysis: Warhead

Up next is our legacy title for 2013, Crysis: Warhead. The stand-alone expansion to 2007’s Crysis, Warhead is now over four years old and can still beat most systems down. Crysis was intended to be forward-looking as far as performance and visual quality go, and it has clearly achieved that: we’ve only just reached the point where single-GPU cards can hit 60fps at 1920 with 4xAA.

Crysis: Warhead - 1920x1080 - Enthusiast Quality + 4x MSAA

Crysis: Warhead - 1920x1080 - Enthusiast Shaders / Gamer Quality

Crysis has been shown to favor raw ROP performance and memory bandwidth over shader performance, so it’s been a rough game for NVIDIA’s Kepler cards, which typically have less memory bandwidth than their AMD competition. In this case, at Gamer quality the GTX 650 Ti Boost still can’t crack 60fps, and it trails the 7850 by 12%. On the other hand, it has 50% more memory bandwidth than the 7790, so it will be interesting to see whether the 1GB variant can achieve results similar to the 2GB variant here.
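
For reference, that bandwidth gap follows directly from bus width and memory data rate. The quick Python sketch below uses the reference memory specs as we understand them (192-bit/6GHz for the GTX 650 Ti Boost, 128-bit/6GHz for the 7790, 256-bit/4.8GHz for the 7850); treat these as illustrative assumptions rather than figures measured in this review.

# Rough memory bandwidth comparison from assumed reference specs:
# bandwidth (GB/s) = (bus width in bits / 8) * effective data rate in GT/s
CARDS = {
    "GTX 650 Ti Boost": (192, 6.008),  # bus bits, GT/s (assumed reference values)
    "HD 7790":          (128, 6.000),
    "HD 7850":          (256, 4.800),
}

def bandwidth_gbps(bus_bits, data_rate_gtps):
    """Effective memory bandwidth in GB/s."""
    return bus_bits / 8 * data_rate_gtps

boost_bw = bandwidth_gbps(*CARDS["GTX 650 Ti Boost"])
for name, (bits, rate) in CARDS.items():
    bw = bandwidth_gbps(bits, rate)
    print(f"{name:18s} {bw:6.1f} GB/s ({boost_bw / bw - 1:+.0%} for the Boost)")

At those clocks the Boost works out to roughly 144GB/s against the 7790’s 96GB/s, which is where the 50% figure comes from, while the 7850’s wider 256-bit bus still gives it a small edge.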

It’s interesting to note, though, that because this game depends so heavily on ROP performance and memory bandwidth, this is as close as we’ll see the GTX 650 Ti Boost get to the GTX 660. Less than 5% separates them at Gamer quality; these cards have plenty of shading/texturing performance, but not enough ROP performance to satisfy Crysis.
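
To put the ROP point in concrete terms, here is a minimal sketch comparing peak pixel fillrate (ROPs × core clock), assuming both GK106 cards carry 24 ROPs and the same 1033MHz boost clock; these are illustrative reference numbers, not figures taken from this review.

# Peak pixel fillrate = ROP count * core clock. On paper the two GK106 cards
# are identical here, which is why a ROP/bandwidth-bound game like Crysis
# barely separates them despite the GTX 660's extra shader resources.
def pixel_fillrate_gpixels(rops, clock_mhz):
    """Peak pixel fillrate in GPixels/s."""
    return rops * clock_mhz / 1000

for name, rops, clock_mhz in [("GTX 650 Ti Boost", 24, 1033), ("GTX 660", 24, 1033)]:
    print(f"{name:18s} {pixel_fillrate_gpixels(rops, clock_mhz):.1f} GPixels/s")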

Crysis: Warhead - Min. Frame Rate - 1920x1080 - Enthusiast Quality + 4x MSAA

Crysis: Warhead - Min. Frame Rate - 1920x1080 - Enthusiast Shaders / Gamer Quality

Jumping to our minimum framerates, things do open up a bit as shader bottlenecking occurs in a few places. The result is that the GTX 650 Ti Boost falls well behind the 7850 in minimums, while the full GK106-based GTX 660 pulls ahead. Still, the difference between the GTX 650 Ti and its Boost variant is nothing short of staggering; the Boost card leads by 45% here, relying heavily on its ROP and memory bandwidth advantages.

78 Comments

  • Oxford Guy - Tuesday, March 26, 2013 - link

    Hopefully only to be laughed out of the market.
  • xdesire - Tuesday, March 26, 2013 - link

    Nvidia's last struggles with Kepler gen. :)
  • Eugene86 - Tuesday, March 26, 2013 - link

    So this card beats out the slightly cheaper new 7790 in every game as well as the slightly more expensive 7850 in half of the games?

    Looks like a pretty good deal to me. What reason do people have to buy AMD again?
  • Zstream - Tuesday, March 26, 2013 - link

    I think the only games it won were Shogun and BF3? I'm not sure about your statement, or whether you read the article or not.
  • Eugene86 - Tuesday, March 26, 2013 - link

    Did you read the article? Because the only games (one game) that the nvidia card lost in is Dirt.
  • aTonyAtlaw - Tuesday, March 26, 2013 - link

    I think you were looking at the GTX 660, friend. The 650 Ti Boost, the card under review, placed beneath the 7850 in nearly every test. They even talk about this on the conclusions page.
  • Eugene86 - Tuesday, March 26, 2013 - link

    I was talking about the 7790, that's why.
  • Warren21 - Tuesday, March 26, 2013 - link

    The comment you were replying to was challenging your statement of the 650 TiB beating the 7850 in "half". Two does not constitute half, it constitutes two.
  • just4U - Wednesday, March 27, 2013 - link

    Well Eugene, looking at your original statement you seem to be saying it beats the 7790 and the 7850.. (you added in the "as well as.." ) Anyway, no clue how the 2G 7790 does or how the 1G 650TIB does.. so it's all sorta moot. On paper if you ask me the 650 is the better card overall.
  • EzioAs - Tuesday, March 26, 2013 - link

    Bundled games? The HD7850 uses less power and overclocks better? AMD cuts the price of their cards way more, resulting in better performance-per-dollar cards before Nvidia actually releases one that can fight back?

    It's true the GTX650ti Boost does seem pretty good for a newly released card in terms of performance per dollar, but your question just shows a little bit of "fanboyism".
