Hitman: Absolution

The second-to-last game in our lineup is Hitman: Absolution. The latest game in Square Enix’s stealth-action series, Hitman: Absolution is a DirectX 11 based title that, though a bit heavy on the CPU, can give most GPUs a run for their money. Furthermore, it has a built-in benchmark, which gives it a level of standardization that fewer and fewer games possess.

Hitman is another game where the 290X shines, taking a 16% lead over the GTX 780. In fact we’re getting very close to being CPU limited here, which may be limiting just how far ahead the 290X can pull. However, this also means there’s plenty of GPU headroom for enabling MSAA, which we don’t use in this benchmark.

Moving on to 4K, the 290X once again extends its lead, this time to 30% over the GTX 780, among the largest such leads we’ve seen. That’s good enough for 43fps even at Ultra quality, but for anything better than that you’ll need multiple GPUs.

To that end we’re CPU limited at 2560, though for some reason the GTX 780 SLI fares a bit better regardless. Otherwise at 4K the GTX 780 SLI achieves better scaling than the 290X CF – 64% versus 56% – so while it can’t take the lead it does at least close the gap some. Enough of a gap remains, however, that the GTX 780 SLI will still come up a bit short of 60fps at 4K Ultra settings, which makes the 290X CF the only setup capable of achieving that goal.
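For clarity, the scaling figures quoted here are just the fractional framerate gain of two cards over one. A minimal sketch of that arithmetic, using hypothetical framerate numbers rather than our actual benchmark results:

```python
def mgpu_scaling(single_fps: float, dual_fps: float) -> float:
    """Return multi-GPU scaling as the fractional gain over one card.

    A result of 0.64 means the second GPU added 64% more performance;
    1.0 would be perfect (100%) scaling.
    """
    return dual_fps / single_fps - 1.0

# Hypothetical example: 50fps on one card, 82fps on two -> 64% scaling
print(f"{mgpu_scaling(50.0, 82.0):.0%}")
```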

When it comes to minimum framerates the 290X is able to build on its lead just a bit more here at both 2560 and 4K. In both cases the performance advantage over the GTX 780 grows by a further 3%.

Finally, for our delta percentages we can see that, unfortunately for AMD, they are regressing a bit here. The variance for the 290X CF at 2560 is 24%, which is greater than what the 280X CF was already seeing, and significantly greater than that of the GTX 780 SLI. Consequently Hitman is a good example of how, although AMD’s CF frame pacing situation is generally quite good, there are going to be games where they need to buckle down a bit more and get it under control, as evidenced by what NVIDIA has been able to achieve. It is interesting to note, though, that AMD’s frame pacing at 4K improves over 2560 by more than 8%. AMD would seem to have an easier time keeping frame times under control when they’re outright longer, which isn’t wholly surprising since it means there’s more absolute time to resolve the matter.
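A frame pacing variance metric of this kind can be computed from raw frame times. The sketch below shows one plausible definition, the mean change between consecutive frame times expressed as a percentage of the average frame time; it is an illustration of the idea, not necessarily the exact formula behind the numbers above:

```python
def delta_percentage(frame_times_ms: list[float]) -> float:
    """Mean absolute change between consecutive frame times,
    as a percentage of the average frame time.

    A perfectly even cadence scores 0; alternating fast/slow
    frames (poor frame pacing) score high.
    """
    deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    mean_frame_time = sum(frame_times_ms) / len(frame_times_ms)
    return 100.0 * (sum(deltas) / len(deltas)) / mean_frame_time

# A steady 60fps cadence (16.7ms per frame) has zero variance
assert delta_percentage([16.7, 16.7, 16.7, 16.7]) == 0.0
```

Note that longer frame times (as at 4K) leave more absolute time per frame for the driver to pace delivery, which is consistent with AMD’s improvement at that resolution.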


  • Spunjji - Friday, October 25, 2013 - link

    Word.
  • extide - Thursday, October 24, 2013 - link

    That doesn't mean that AMD can't come up with a solution that might even be compatible with G-Sync... Time will tell.
  • piroroadkill - Friday, October 25, 2013 - link

    That would not be in NVIDIA's best interests. If a lot of machines (AMD, Intel) won't support it, why would you buy a screen for a specific graphics card? Later down the line, maybe something like the R9 290X comes out, and you can save a TON of money on a high performing graphics card from another team.

    It doesn't make sense.

    For NVIDIA, their best bet at getting this out there and making the most money from it, is licencing it.
  • Mstngs351 - Sunday, November 3, 2013 - link

    Well it depends on the buyer. I've bounced between AMD and Nvidia (to be upfront I've had more Nvidia cards) and I've been wanting to step up to a larger 1440 monitor. I will be sure that it supports Gsync as it looks to be one of the more exciting recent developments.

    So although you are correct that not a lot of folks will buy an extra monitor just for Gsync, there are a lot of us who have been waiting for an excuse. :P
  • nutingut - Saturday, October 26, 2013 - link

    Haha, that would be something for the cartel office then, I figure.
  • elajt_1 - Sunday, October 27, 2013 - link

    This doesn't prevent AMD from making something similar, if Nvidia decides not to make it open.
  • hoboville - Thursday, October 24, 2013 - link

    Gsync will require you to buy a new monitor. Dropping more money on graphics and smoothness will apply at the high end and for those with big wallets, but for the rest of us there's little point to jumping into Gsync.

    In 3-4 years when IPS 2560x1440 has matured to the point where it's both mainstream (cheap) and capable of delivering low-latency ghosting-free images, then Gsync will be a big deal, but right now only a small percentage of the population have invested in 1440p.

    The fact is, most people have been sitting on their 1080p screens for 3+ years and probably will for another 3 unless those same screens fail--$500+ for a desktop monitor is a lot to justify. Once the monitor upgrades start en masse, then Gsync will be a market changer because AMD will not have anything to compete with.
  • misfit410 - Thursday, October 24, 2013 - link

    G-String didn't kill anything, I'm not about to give up my Dell Ultrasharp for another Proprietary Nvidia tool.
  • anubis44 - Tuesday, October 29, 2013 - link

    Agreed. G-sync is a stupid solution to a non-existent problem. If you have a fast enough frame rate, there's nothing to fix.
  • MADDER1 - Thursday, October 24, 2013 - link

    Mantle could be for either higher frame rate or more detail. Gsync sounds like just frame rate.
