Crysis: Warhead

Up next is our legacy title for 2013, Crysis: Warhead. A stand-alone expansion to 2007’s Crysis, Crysis: Warhead is now over four years old, yet it can still beat most systems down. Crysis was intended to be forward-looking in both performance and visual quality, and it has clearly achieved that: we’ve only finally reached the point where single-GPU cards have come out that can hit 60fps at 1920 with 4xAA.

At 2560 we still have some distance to go before any single-GPU card can crack 60fps. Until then, Titan is the winner, as expected. It leads the GTX 680 by 54%, Titan’s single biggest win over its predecessor, and one that actually exceeds its theoretical performance advantage based on the increase in functional units alone. For whatever reason the GTX 680 never gained much over the GTX 580 here, and while Titan hasn’t entirely reversed that trend, it has at least corrected enough of the problem to push out a better than 50% gain.
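As a rough sanity check on that claim, a back-of-the-envelope sketch (the core counts and base clocks below are NVIDIA’s published specs, not figures from this article): Titan carries 75% more shaders than the GTX 680, but runs them at a lower clock, so its on-paper shader throughput advantage works out closer to 46%, which the measured 54% does indeed exceed.

```python
# Back-of-the-envelope shader throughput comparison.
# Core counts and base clocks are NVIDIA's published spec-sheet figures,
# not measurements from this article.
titan_cores, titan_clock_mhz = 2688, 837      # GeForce GTX Titan
gtx680_cores, gtx680_clock_mhz = 1536, 1006   # GeForce GTX 680

titan_throughput = titan_cores * titan_clock_mhz
gtx680_throughput = gtx680_cores * gtx680_clock_mhz

advantage = titan_throughput / gtx680_throughput - 1
print(f"Theoretical shader throughput advantage: {advantage:.0%}")
# ~46% on paper, versus the 54% measured here - hence the observation
# that Titan exceeds its theoretical advantage in this game.
```

This ignores memory bandwidth, ROP throughput, and boost clocks, so it is only a first-order estimate, but it shows why a 54% real-world lead is unusually large.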

In the meantime, with the GTX 680’s languid performance, this has been a game the latest Radeon cards have regularly won. For whatever reason they’re a good match for Crysis, meaning that even with all its brawn, Titan only leads the 7970GE by 21%.

On the other hand, our multi-GPU cards are a mixed bag. Once more Titan loses to both, but the GTX 690 leads by only 15% thanks to GK104’s aforementioned weak Crysis performance, while the 7990 takes a larger lead at 33%.

I’d also note that we’ve thrown in a “bonus round” here just to see when Crysis will be playable at 1080p with its highest settings and 4x SSAA for that picture-perfect experience. As it stands, AMD multi-GPU cards can already cross 60fps, but for everything else we’re probably still a generation away from Crysis being completely and utterly conquered.

Moving on, we once again have minimum framerates for Crysis.

When it comes to Titan, the relative improvement in minimum framerates over the GTX 680 is nothing short of obscene. Whatever it was that was holding back the GTX 680 is clearly having a much harder time slowing down Titan, which offers 71% better minimum framerates. There’s clearly much more going on here than just an increase in functional units.

Meanwhile, though Titan’s gains here over the 7970GE aren’t quite as high as they were against the GTX 680, its lead over the 7970GE still grows a bit to 26%. As for our multi-GPU cards, this appears to be a case where SLI is struggling; the GTX 690 is barely faster than Titan here. At 31% faster than Titan, though, the 7990 doesn’t seem to be faltering much.


337 Comments


  • chizow - Friday, February 22, 2013 - link

    Idiot...has the top end card cost 2x as much every time? Of course not!!! Or we'd be paying $100K for GPUs!!!
  • CeriseCogburn - Saturday, February 23, 2013 - link

    Stop being an IDIOT.

    What is the cost of the 7970 now, vs what I paid for it at release, you insane gasbag ?
    You seem to have a brainfart embedded in your cranium, maybe you should go propose to Charlie D.
  • chizow - Saturday, February 23, 2013 - link

    It's even cheaper than it was at launch, $380 vs. $550, which is the natural progression....parts at a certain performance level get CHEAPER as new parts are introduced to the market. That's called progress. Otherwise there would be NO INCENTIVE to *upgrade* (look this word up please, it has meaning).

    You will not pay the same money for the same performance unless the part breaks down, and semiconductors under normal usage have proven to be extremely durable components. People expect progress, *more* performance at the same price points. People will not pay increasing prices for things that are not essential to life (like gas, food, shelter); this is called the price elasticity of demand.

    This is a basic lesson in business, marketing, and economics applied to the semiconductor/electronics industry. You obviously have no formal training in any of the above disciplines, so please stop commenting like a ranting and raving idiot about concepts you clearly do not understand.
  • CeriseCogburn - Saturday, February 23, 2013 - link

    They're ALREADY SOLD OUT STUPID IDIOT THEORIST.

    LOL

    The true loser, an idiot fool, wrong before he's done typing, the "education" is his brainwashed fried gourd Charlie D OWNZ.
  • chizow - Sunday, February 24, 2013 - link

    And? There's going to be some demand for this card just as there was demand for the 690, it's just going to be much lower based on the price tag than previous high-end cards. I never claimed anything otherwise.

    I outlined the expectations, economics, and buying decisions for the tech industry in general, and they hold true. Just look around and you'll get plenty of confirmation where people (like me) who previously bought 1, 2, or 3 of these $500-650 GPUs are opting to pass on a single Titanic at $1000.

    Nvidia's introduction of an "ultra-premium" range is an unsustainable business model because it assumes Nvidia will be able to sustain this massive performance lead over AMD. Not to mention they will have a harder time justifying the price if their own next-gen offering isn't convincingly faster.
  • CeriseCogburn - Tuesday, February 26, 2013 - link

    You're not the nVidia CEO nor their bean counter, you whacked out fool.

    You're the IDIOT that babbles out stupid concepts with words like "justifying", as you purport to be an nVidia marketing hired expert.

    You're not. You're a disgruntled indoctrinated crybaby who can't move on with the times, living in a false past, and waiting for a future not here yet.
  • Oxford Guy - Thursday, February 21, 2013 - link

    The article's first page has the word luxury appearing five times. The blurb, which I read prior to reading the article's first page, has luxury appearing twice.

    That is seven uses of the word in just a bit over one page.

    Let me guess... it's a luxury product?
  • CeriseCogburn - Tuesday, February 26, 2013 - link

    It's stupid if you ask me. But that's this place, not very nVidia friendly after their little "didn't get the new 98xx" fiasco, just like Tom's.

    A lot of these top tier cards are a luxury, not just the Titan, as one can get by with far less. The problem is, the $500 cards often fail at 1920x resolution, and this one can perhaps be said to have conquered just that. So here we have a "luxury product" that barely, maybe, does its job, and 1920x is not a luxury resolution.
    Turn OFF and down SOME in-game features, and that's the general case, not just the extreme one.

    People are fools though, almost all the time. Thus we have this crazed "reviews" outlook distortion, and certainly no such thing as Never Settle.
    We're ALWAYS settling when it comes to video card power.
  • araczynski - Thursday, February 21, 2013 - link

    Too bad there's not a single game benchmark in that whole article that I give 2 squirts about. Throw in some RPGs please, like The Witcher/Skyrim.
  • Ryan Smith - Thursday, February 21, 2013 - link

    We did test Skyrim only to ultimately pass on it for a benchmark. The problem with Skyrim (and RPGs in general) is that they're typically CPU limited. In this case our charts would be nothing but bar after bar at roughly 90fps, which wouldn't tell us anything meaningful about the GPU.
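Ryan's point about CPU-limited games can be sketched with a toy bottleneck model (the 90fps ceiling and the per-card numbers below are illustrative assumptions, not measurements): the observed framerate is capped by whichever of the CPU or GPU is slower, so every sufficiently fast GPU produces the same bar.

```python
# Toy bottleneck model of a CPU-limited benchmark. The observed fps is
# the lower of the CPU ceiling and the GPU's uncapped throughput.
# All numbers here are illustrative, not real benchmark results.
cpu_limit_fps = 90  # hypothetical CPU-bound ceiling, per the comment above

# Hypothetical uncapped GPU throughputs for three cards of different speeds
gpu_fps = {"Card A": 120, "Card B": 150, "Card C": 200}

observed = {card: min(cpu_limit_fps, fps) for card, fps in gpu_fps.items()}
print(observed)  # every card lands at the 90fps ceiling
```

Since all three hypothetical cards collapse to the same 90fps bar, the chart would reveal nothing about relative GPU performance, which is exactly why such a title makes a poor GPU benchmark.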
