Rise of the Tomb Raider

Starting things off in our benchmark suite is the built-in benchmark for Rise of the Tomb Raider, the latest iteration of the long-running action-adventure series. One of the unique aspects of this benchmark is that it’s actually the average of 4 sub-benchmarks that fly through different environments, which keeps the overall result from being too heavily weighted towards a GPU’s performance characteristics in any one scene.
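
To make that composite explicit, here is a minimal sketch of how a multi-scene average like this works. The scene names and frame rates below are illustrative placeholders, not figures from the game or from our testing:

```python
# Hypothetical sub-benchmark results (average FPS per scene).
# These values are placeholders for illustration only.
sub_benchmarks = {
    "scene_1": 72.4,
    "scene_2": 65.1,
    "scene_3": 80.3,
    "scene_4": 61.9,
}

# The reported score is the plain average of the sub-tests, so no single
# environment dominates the overall result.
composite_fps = sum(sub_benchmarks.values()) / len(sub_benchmarks)
print(f"Composite: {composite_fps:.1f} fps")  # Composite: 69.9 fps
```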

Rise of the Tomb Raider - 3840x2160 - Very High Quality (DX11)

Rise of the Tomb Raider - 2560x1440 - Very High Quality (DX11)

As we’re looking at the highest of high-end cards here, the performance comparisons are pretty straightforward. There’s the GTX 1080 Ti versus the GTX 1080, and then for owners with older cards looking for an upgrade, there’s the GTX 1080 Ti versus the GTX 980 Ti and GTX 780 Ti. It goes without saying that the GTX 1080 Ti is the fastest card that is (or tomorrow will be) on the market, so the only outstanding question is just how much faster NVIDIA’s latest card really is.

As you’d expect, the GTX 1080 Ti’s performance lead depends in part on the resolution tested. The higher the resolution, the more GPU-bound a game becomes, and the more opportunity there is for the card to put its extra horsepower and 3GB VRAM advantage to work. In the case of Tomb Raider, the GTX 1080 Ti ends up being 33% faster than the GTX 1080 at 4K, and 26% faster at 1440p.

Against the 28nm GTX 980 Ti and GTX 780 Ti, meanwhile, the performance gains become very large very quickly. The GTX 1080 Ti holds a 70% lead over the GTX 980 Ti here at 4K, and it’s a full 2.6x faster than the GTX 780 Ti. The end result is that whereas the GTX 980 Ti was the first card to crack 30fps at 4K in Tomb Raider, the GTX 1080 Ti is the first card that can actually average 60fps or better.
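
For reference, the relative figures quoted above fall out of simple frame rate ratios. The sketch below uses placeholder 4K averages back-calculated from those ratios, not the measured numbers from our charts:

```python
def percent_faster(new_fps: float, old_fps: float) -> float:
    """How much faster (in %) a card running at new_fps is versus one at old_fps."""
    return (new_fps / old_fps - 1.0) * 100.0

# Placeholder 4K averages, chosen only so the ratios match the quoted figures.
gtx_1080_ti = 60.0
gtx_1080 = gtx_1080_ti / 1.33
gtx_980_ti = gtx_1080_ti / 1.70
gtx_780_ti = gtx_1080_ti / 2.60

print(f"vs GTX 1080:   {percent_faster(gtx_1080_ti, gtx_1080):.0f}% faster")    # ~33%
print(f"vs GTX 980 Ti: {percent_faster(gtx_1080_ti, gtx_980_ti):.0f}% faster")  # ~70%
print(f"vs GTX 780 Ti: {gtx_1080_ti / gtx_780_ti:.1f}x as fast")                # ~2.6x
```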

Comments

  • eddman - Friday, March 10, 2017 - link

    Adjusted for inflation: http://i.imgur.com/ZZnTS5V.png
  • Meteor2 - Friday, March 10, 2017 - link

    Great charts!
  • mapesdhs - Saturday, March 11, 2017 - link

    Except they exclude the Titans, Fury, etc.
  • eddman - Saturday, March 11, 2017 - link

    I, personally, made these charts.

    No titans because they are very niche cards for those gamers who cannot wait and/or have more money than sense. The following Ti variants perform almost as well as the titan cards anyway.

    No radeons because this is an nvidia-only chart. I should've titled it as such. I focused on nvidia because ATI/AMD usually don't price their cards so high.
  • mapesdhs - Saturday, March 11, 2017 - link

    Ok on the Radeon angle, shoulda realised it was NV only. :D

    However, your description of those who buy Titans cannot be anything other than your own opinion, and what you don't realise is that for many store owners it's these very top-tier cards which bring in the majority of their important profit margins. They make far less on the mainstream cards. The enthusiast market is extremely important, whether or not one individually regards the products as being relevant for one's own needs. You need to be objective here.
  • mapesdhs - Saturday, March 11, 2017 - link

    Sorry for the typos... am on a train, wobbly kybd. :D Is this site ever gonna get modern and allow editing??...
  • eddman - Saturday, March 11, 2017 - link

    I think I am being objective. Titan cards do not fit into the regular geforce range. They are like an early pass. Wait a few months and you can have a Ti that performs the same at a much lower price.

    If nvidia never released a similarly performing Ti card, I would've included them.
  • eddman - Saturday, March 11, 2017 - link

    Also I don't see how stores and their profits have anything to do with that.
  • Mr Perfect - Tuesday, March 14, 2017 - link

    Nice work on those charts.

    So there were a couple of years where top-tier cards were $400 or less. Inflation numbers normalize that price a fair bit, though.
  • Ryan Smith - Thursday, March 9, 2017 - link

    Those days are long-gone, and not just because of profit taking. 16/14nm FinFET GPUs are astonishingly expensive to design and fab. The masks are in the millions, and now everything has to be double-patterned.
