Hitman

The final game in our 2016 benchmark suite is the 2016 edition of Hitman, the latest title in the stealth-action franchise. The game offers two rendering paths, DirectX 11 and DirectX 12, with the DirectX 12 path having been added after the fact. As with past Hitman games, the latest entry offers a good mix of scenery and high model counts to stress modern video cards.

[Chart: Hitman - 2560x1440 - Ultra Quality (Best API)]

[Chart: Hitman - 1920x1080 - Ultra Quality (Best API)]

While the full API results are broken out in Bench, for this article I’m only listing the performance with the faster API for each card, as at this point we’ve seen a very consistent pattern in Hitman: AMD cards do better under DX12, while NVIDIA cards do better under DX11.
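
For anyone curious how the "Best API" figures are assembled, below is a minimal sketch of the selection rule: for each card, report whichever of its DX11 and DX12 runs produced the higher average frame rate. The card names and frame rates in the sketch are hypothetical placeholders, not our measured data.

```python
# Minimal sketch of "Best API" selection: for each card, report the API
# (DX11 or DX12) that produced the higher average frame rate.
# The FPS values below are hypothetical placeholders, not measured results.

results = {
    "GTX 1060": {"DX11": 70.0, "DX12": 64.0},  # hypothetical FPS
    "RX 480":   {"DX11": 66.0, "DX12": 75.0},  # hypothetical FPS
}

for card, fps_by_api in results.items():
    best_api = max(fps_by_api, key=fps_by_api.get)
    print(f"{card}: {fps_by_api[best_api]:.1f} fps ({best_api})")
```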

In any case, this is the one and only game where we see the GTX 1060 lose. At both 1440p and 1080p, NVIDIA's mainstream card falls to AMD's own mainstream competitor by roughly 10%. This is despite the fact that the GTX 1060 enjoys a larger-than-normal 4-5% performance advantage over the GTX 980.

As for the generational comparisons, the GTX 1060 at least comes out well ahead of its predecessors. Compared to the GTX 960 it delivers over twice the performance of NVIDIA's former mainstream card, and nearly 3.5x the performance of the GTX 660.
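
To make the relative-performance figures above concrete, here is a small worked example of the arithmetic: a percentage advantage is (A / B - 1) × 100, while an "Nx the performance" multiplier is simply A / B. The frame rates used are hypothetical placeholders chosen only to illustrate the calculation, not measured results.

```python
# Worked example of the relative-performance figures quoted above.
# All FPS values are hypothetical placeholders, not measured results.

def percent_faster(a: float, b: float) -> float:
    """How much faster card A is than card B, in percent."""
    return (a / b - 1) * 100

def speedup(a: float, b: float) -> float:
    """Performance multiplier of card A relative to card B."""
    return a / b

gtx_1060, gtx_960, gtx_660 = 70.0, 33.0, 20.0  # hypothetical FPS

print(f"GTX 1060 vs GTX 960: {speedup(gtx_1060, gtx_960):.1f}x the performance")
print(f"GTX 1060 vs GTX 660: {speedup(gtx_1060, gtx_660):.1f}x the performance")
print(f"GTX 1060 vs GTX 960: {percent_faster(gtx_1060, gtx_960):.0f}% faster")
```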

Comments

  • DominionSeraph - Friday, August 5, 2016 - link

    Aw, look at the fanboy complain that it isn't "fair" because they didn't only present AMD's strengths and Nvidia's weaknesses, but instead used tests representative of the gaming landscape.
  • beck2050 - Monday, August 8, 2016 - link

    Nvidia has driver teams as well. Plus 22% with the latest DX12 Hitman. The 1060 will compete very well. Cooler, faster, less energy, and priced accordingly.
  • eddman - Saturday, August 6, 2016 - link

    So much misinformation still going around. GameWorks effects are either CPU-only, which have ZERO effect on the GPU no matter the brand, like WaveWorks in Just Cause 3, or are GPU-based, which can be DISABLED, like The Witcher 3's HairWorks or HBAO+, or RotTR's HBAO+ and VXAO.
  • Ryan Smith - Friday, August 5, 2016 - link

    The benchmark suite was finalized back in May, when the DX12 version of Tomb Raider was rubbish. I talk a bit more about this in another comment, but basically we only periodically update the benchmark suite due to the amount of work involved and the need to maintain a consistent dataset for Bench. The plan is to do another update in September.
  • Colin1497 - Saturday, August 6, 2016 - link

    I understood that situation. Last thing you needed was to change the games when you were running behind. Just commenting on the change in the landscape over 2 months. Doom and Vulcan is obviously another thing. Looking forward to what you do next.
  • Simplex - Sunday, August 7, 2016 - link

    It's "Vulkan", not "Vulcan".
  • MarkieGcolor - Friday, August 5, 2016 - link

    Please include 4K and CrossFire/SLI setups in your benchmarks. Otherwise I do not care about this late review.
  • xenol - Friday, August 5, 2016 - link

    4K seems kind of pointless; we all know it's going to be sub-40 FPS, and few people are going to recommend this card for 4K gaming.

    Also what's the point of SLI on a card that doesn't support it?
  • AnnonymousCoward - Friday, August 5, 2016 - link

    If 4K is pointless, then so are most of the 2560 tests, which use ultra settings and produce <60 fps.

    4K with medium settings, no AA would be much more interesting to me.
  • MarkieGcolor - Saturday, August 6, 2016 - link

    True. I'm just saying it would be interesting.

    I understand that if you want 1060 SLI you should just buy a 1080, but I feel Nvidia disabled SLI to keep the second-hand market at bay and sell more new, expensive cards.
