GRID 2

The final game in our benchmark suite is also our racing entry, Codemasters’ GRID 2. Codemasters continues to set the bar for graphical fidelity in racing games, and with GRID 2 they’ve gone back to racing on the pavement, bringing to life cities and highways alike. Based on their in-house EGO engine, GRID 2 includes a DirectCompute based advanced lighting system in its highest quality settings, which incurs a significant performance penalty but does a good job of emulating more realistic lighting within the game world.

For as good looking as GRID 2 is, it continues to surprise us just how easy it is to run with everything cranked up, even the DirectCompute lighting system and MSAA (Forward Rendering for the win!). At 2560 the 290X holds a 9% performance advantage, but the comparison is somewhat academic since it's 80fps versus 74fps, placing both cards well above 60fps. 120Hz gamers may still find the gap of interest, though.

Moving up to 4K, we can still keep everything turned up, including the MSAA, while pulling off respectable single-GPU framerates and great multi-GPU framerates. To no surprise at this point, the 290X further extends its lead to 21% at 4K, but as is usually the case you really want two GPUs here to get the best framerates. In that case the 290X CF is the runaway winner, achieving a scaling factor of 96% at 4K versus NVIDIA's 47%, and 97% versus 57% at 2560. This means the GTX 780 SLI is going to fall just short of 60fps once more at 4K, leaving the 290X CF alone at 99fps.
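
For reference, the scaling factors quoted here are simply how much the second GPU adds on top of a single card's framerate. Below is a minimal sketch of that arithmetic in Python; the single-GPU figure in the example is only back-calculated from the 99fps CF result and 96% scaling above (99 / 1.96 ≈ 50.5fps), not a measured number:

```python
def scaling_factor(single_gpu_fps: float, multi_gpu_fps: float) -> float:
    """Multi-GPU gain over a single card, as a percentage.

    100% would mean a perfect doubling of the framerate."""
    return (multi_gpu_fps / single_gpu_fps - 1.0) * 100.0


# Illustrative only: the ~50.5fps single-GPU figure is implied by the 99fps
# CF result and the 96% scaling factor quoted in the text, not measured data.
print(f"{scaling_factor(50.5, 99.0):.0f}% scaling")  # -> 96% scaling
```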

Unfortunately for AMD, their drivers coupled with GRID 2 currently blow a gasket at 4K @ 60Hz: the game immediately crashes when trying to load with 4K/Eyefinity enabled. We can still test at 30Hz, but those stellar 4K framerates aren't going to be usable for gaming until AMD and Codemasters get that bug sorted out.

Finally, it's interesting to note that this is the game where the 290X gains the least on the 280X. The 290X's performance advantage here is just 20%, 5% lower than in any other game and 10% below its average. The framerates at 2560 are high enough that this isn't quite as important as in other games, but it does show that the 290X isn't always going to maintain its 30% lead over its predecessor.

Without any capturable 4K FCAT frame times, we're left with the delta percentages at 2560, which, more so than in any other game, are simply not in AMD's favor. The GTX 780 SLI is extremely consistent here, to the point of being almost absurdly so for a multi-GPU setup; 4% is the kind of variance we expect to find with a single-GPU setup, not something incorporating multiple GPUs. AMD on the other hand, though improving on the 280X by a few percent, is merely adequate at 17%. The low frame times will further reduce the real world impact of the difference between the GTX 780 SLI and the 290X CF, but this is another game where AMD could stand to make some improvements, even if it costs the 290X some of its very strong CF scaling factor.
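
For context, the delta percentages are a frame pacing metric: roughly, how much frame times swing from one frame to the next, expressed as a percentage of the average frame time, with lower meaning smoother delivery. Here is a minimal sketch assuming that common definition; the frame time samples below are hypothetical, not captured FCAT data:

```python
def delta_percentage(frame_times_ms: list[float]) -> float:
    """Mean absolute frame-to-frame change in frame time, as a percentage
    of the mean frame time. Lower means more consistent frame pacing."""
    deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    mean_delta = sum(deltas) / len(deltas)
    mean_frame_time = sum(frame_times_ms) / len(frame_times_ms)
    return mean_delta / mean_frame_time * 100.0


# Hypothetical frame times (ms): a steady run versus one with visible swings.
steady = [12.5, 12.9, 12.4, 13.0, 12.6, 12.8]
uneven = [10.0, 16.0, 11.0, 17.0, 10.5, 15.5]
print(f"steady: {delta_percentage(steady):.1f}%")   # low single digits
print(f"uneven: {delta_percentage(uneven):.1f}%")   # well into the double digits
```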

Comments

  • Spunjji - Friday, October 25, 2013 - link

    Word.
  • extide - Thursday, October 24, 2013 - link

    That doesn't mean that AMD can't come up with a solution that might even be compatible with G-Sync... Time will tell...
  • piroroadkill - Friday, October 25, 2013 - link

    That would not be in NVIDIA's best interests. If a lot of machines (AMD, Intel) won't support it, why would you buy a screen for a specific graphics card? Later down the line, maybe something like the R9 290X comes out, and you can save a TON of money on a high performing graphics card from another team.

    It doesn't make sense.

    For NVIDIA, their best bet at getting this out there and making the most money from it is licensing it.
  • Mstngs351 - Sunday, November 3, 2013 - link

    Well it depends on the buyer. I've bounced between AMD and Nvidia (to be upfront I've had more Nvidia cards) and I've been wanting to step up to a larger 1440 monitor. I will be sure that it supports Gsync as it looks to be one of the more exciting recent developments.

    So although you are correct that not a lot of folks will buy an extra monitor just for Gsync, there are a lot of us who have been waiting for an excuse. :P
  • nutingut - Saturday, October 26, 2013 - link

    Haha, that would be something for the cartel office then, I figure.
  • elajt_1 - Sunday, October 27, 2013 - link

    This doesn't prevent AMD from making something similar, if Nvidia decides not to make it open.
  • hoboville - Thursday, October 24, 2013 - link

    Gsync will require you to buy a new monitor. Dropping more money on graphics and smoothness will apply at the high end and for those with big wallets, but for the rest of us there's little point to jumping into Gsync.

    In 3-4 years when IPS 2560x1440 has matured to the point where it's both mainstream (cheap) and capable of delivering low-latency ghosting-free images, then Gsync will be a big deal, but right now only a small percentage of the population have invested in 1440p.

    The fact is, most people have been sitting on their 1080p screens for 3+ years and probably will for another 3 unless those same screens fail; $500+ for a desktop monitor is a lot to justify. Once the monitor upgrades start en masse, Gsync will be a market changer because AMD will not have anything to compete with.
  • misfit410 - Thursday, October 24, 2013 - link

    G-String didn't kill anything, I'm not about to give up my Dell Ultrasharp for another Proprietary Nvidia tool.
  • anubis44 - Tuesday, October 29, 2013 - link

    Agreed. G-sync is a stupid solution to a non-existent problem. If you have a fast enough frame rate, there's nothing to fix.
  • MADDER1 - Thursday, October 24, 2013 - link

    Mantle could be for either higher frame rate or more detail. Gsync sounds like just frame rate.
