Rise of the Tomb Raider

One of the newest games in the gaming benchmark suite is Rise of the Tomb Raider (RoTR), developed by Crystal Dynamics, and the sequel to the popular Tomb Raider, which was well liked for its automated benchmark mode. But don’t let that fool you: the benchmark mode in RoTR is very different this time around.

Visually, the previous Tomb Raider pushed realism to the limits with features such as TressFX, and the new RoTR goes one stage further in graphics fidelity. This leads to an interesting set of hardware requirements: some sections of the game are typically GPU limited, whereas others with a lot of long-range physics can be CPU limited, depending on how well the driver can translate the DirectX 12 workload.

Where the old game had one benchmark scene, the new game has three, each with different requirements: Geothermal Valley (1-Valley), Prophet’s Tomb (2-Prophet) and Spine of the Mountain (3-Mountain) - and we test all three. All three scenes are taken from the game itself, but it has been noted that a scene such as 2-Prophet can represent the most CPU-limited part of its entire level, and the scene shown is only a small portion of that level. Because of this, we report the results for each scene on each graphics card separately.

Graphics options for RoTR are similar to other games of this type, offering a set of presets or allowing the user to configure texture quality, anisotropic filtering levels, shadow quality, soft shadows, occlusion, depth of field, tessellation, reflections, foliage, bloom, and features such as PureHair, the successor to TressFX from the previous game.

Again, we test at 1920x1080 and 4K using our native 4K displays. At 1080p we run the High preset, while at 4K we use the Medium preset which still takes a sizable hit in frame rate.

It is worth noting that RoTR is a little different from our other benchmarks in that it keeps its graphics settings in the registry rather than a standard ini file, and unlike the previous Tomb Raider game the benchmark cannot be called from the command line. Nonetheless, we scripted around these issues to run the benchmark four times and parse the results. From the frame time data, we report the averages, 99th percentiles, and our 'time under' analysis.
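The reduction from raw frame times to those reported numbers can be sketched roughly as follows. This is an illustrative assumption, not the review's actual tooling: the function name, the simple index-based percentile, and the 60 FPS threshold are all hypothetical.

```python
# Hypothetical sketch of reducing a benchmark's raw frame-time log to the
# metrics quoted above: average frame rate, 99th percentile frame time,
# and 'time under' a frame-rate threshold.

def summarize(frame_times_ms, threshold_fps=60.0):
    """Reduce per-frame render times (in ms) to summary metrics."""
    total_ms = sum(frame_times_ms)
    # Average FPS: frames rendered divided by total elapsed seconds.
    avg_fps = 1000.0 * len(frame_times_ms) / total_ms

    # 99th percentile frame time: the slow-frame figure that 99% of
    # frames stay at or below (simple index-based percentile).
    ordered = sorted(frame_times_ms)
    p99_ms = ordered[min(len(ordered) - 1, int(0.99 * len(ordered)))]

    # 'Time under' analysis: share of total run time spent on frames
    # slower than the threshold (i.e. rendering below threshold_fps).
    limit_ms = 1000.0 / threshold_fps
    time_under_pct = 100.0 * sum(t for t in frame_times_ms if t > limit_ms) / total_ms

    return avg_fps, p99_ms, time_under_pct
```

Running something like this over each of the four benchmark passes and averaging the results would give figures in the same shape as those reported.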

All of our benchmark results can also be found in our benchmark engine, Bench.

ASRock RX 580 Performance

Rise of the Tomb Raider (1080p, Ultra)


115 Comments


  • bigboxes - Tuesday, June 12, 2018 - link

    I ran to my 4790K to check for a HSF. Who would have thought it had one in the box. No one buys that processor to use the stock HSF.
  • mkaibear - Tuesday, June 12, 2018 - link

    I did. Worked fine.
  • Marlin1975 - Monday, June 11, 2018 - link

    My 3570k came with a heatsink.

    Also, in the AMD Ryzen reviews here it was pointed out that there was no heatsink included with the unlocked/higher chips. Yet in this review that went unmentioned, and they did not use a regular/more common heatsink, but a very costly and less-used water cooler.
  • npz - Monday, June 11, 2018 - link

    OK, fair enough, but the stock heatsink & fan Intel uses are crap, so I don't think reviewers should be using them for actual benchmarks anyway, as it will affect turbo speeds.
  • Marlin1975 - Monday, June 11, 2018 - link

    That's the point. You use what they give you to show its real performance. If it has a negative effect, that is on the maker, not the user/reviewer.

    That way, when you compare different CPUs, they all have the same standard cooler, so it's an apples-to-apples review.
  • Inteli - Monday, June 11, 2018 - link

    Surely if you want to compare CPUs apples to apples, you'd want to use the same cooler for all of them (across brands, not just within), so the CPU is what's actually being tested. Why would only a stock cooler give "real performance" anyways? Are you saying the CLC on my 4690k is giving me "fake performance"?

    Not that it matters, because Intel didn't include a stock heat sink with this CPU.

    I would rather see CPUs hooked up to an absolutely overkill cooling setup (maybe a water chiller? :^) ) on stock clocks so the CPU can perform its absolute best.
  • npz - Monday, June 11, 2018 - link

    Its real performance is with an aftermarket heatsink anyway, which is why Intel stopped providing them for the K series after Haswell. As long as you use a cooler that will not impede the performance of the CPU, the CPU benchmarks are all an apples-to-apples comparison. The stock cooler was so bad it would cause the CPUs to throttle under certain heavy loads, and you can forget about overclocking, which kills the point of the K series.

    It would be an absolute disservice if AnandTech benchmarked with the stock heatsink. The only exception is Ryzen, especially Ryzen 2's Wraith Max heatsink, which rivals high-end aftermarket cooling.
  • cmdrdredd - Monday, June 11, 2018 - link

    OK, but in the real world nobody is buying a K-series CPU and running it stock.
  • mkaibear - Tuesday, June 12, 2018 - link

    I do! (I intended to downclock it but it didn't downclock very well, so I just left it at defaults. Stock cooler too - 2500K then 4790K)
  • MDD1963 - Tuesday, June 26, 2018 - link

    Untrue; if the CPU is already at fairly high temps stock, going 10-15C higher in temps to gain 100 more MHz and 2 more FPS in a game seems ludicrous.
