Gaming Tests: Red Dead Redemption 2

It’s great to have another Rockstar benchmark in the mix, and the launch of Red Dead Redemption 2 (RDR2) on the PC gives us the chance to do just that. Building on the success of the original RDR, the sequel came to Steam in December 2019, having been released on consoles first. The PC version takes the open-world cowboy genre into the start of the modern age, with a wide array of impressive graphics and features that are eerily close to reality.

For RDR2, Rockstar kept the same benchmark philosophy as with Grand Theft Auto V: the benchmark consists of several cut scenes with different weather and lighting effects, ending with a final on-rails sequence, this time robbing a shop and escaping in a shootout on horseback before riding over a bridge into the great unknown. Luckily, most of the command line options from GTA V are present here, and the game also supports resolution scaling. We have the following tests:

  • 384p Minimum, 1440p Minimum, 8K Minimum, 1080p Max

For that 8K setting, I originally thought I had the settings file at 4K with 1.0x scaling, but it was actually set to 2.0x, giving that 8K result. For the sake of it, I decided to keep the 8K settings.
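To put numbers on that: the scale factor applies to each axis of the output resolution, so 2.0x on a 4K output doubles both dimensions. A quick Python sketch of the arithmetic (the variable names are purely illustrative):

    # Per-axis resolution scaling: a 2.0x factor on a 4K output
    # doubles both dimensions, yielding an 8K render target.
    base_width, base_height = 3840, 2160   # 4K UHD output resolution
    scale = 2.0                            # the factor left in the settings file
    render = (int(base_width * scale), int(base_height * scale))
    print(render)  # (7680, 4320), i.e. 8K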

For our results, we run through each resolution and setting configuration for a minimum of 10 minutes, before parsing the frame time data and averaging the results.
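For those curious how raw frame times become the two figures on our charts, below is a minimal Python sketch of that post-processing step. The log file name and the one-value-per-line format are assumptions for illustration, not our actual capture pipeline:

    import statistics

    def summarize(frame_times_ms):
        """Turn per-frame render times (ms) into average FPS
        and a 95th-percentile FPS figure."""
        avg_ms = statistics.mean(frame_times_ms)
        # The slowest 5% of frames sit at the top end of the sorted list.
        p95_ms = sorted(frame_times_ms)[int(len(frame_times_ms) * 0.95)]
        return 1000.0 / avg_ms, 1000.0 / p95_ms

    # Hypothetical capture: one frame time (in milliseconds) per line.
    with open("rdr2_1080p_max_frametimes.txt") as f:
        times = [float(line) for line in f if line.strip()]

    avg_fps, p95_fps = summarize(times)
    print(f"Average FPS: {avg_fps:.1f}, 95th percentile: {p95_fps:.1f}")

Note that the sketch inverts the 95th-percentile frame time to express it as an FPS figure; the exact percentile convention here is our assumption, following common practice.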

[Bench result charts: Low Resolution/Low Quality, Medium Resolution/Low Quality, High Resolution/Low Quality, and Medium Resolution/Max Quality, each reporting Average FPS and 95th Percentile figures.]

All of our benchmark results can also be found in our benchmark engine, Bench.

Comments

  • mitox0815 - Tuesday, April 13, 2021 - link

    "Just abandon"...those clocks you dream of might have been possible on certain CPUs, but definitely noton a broader line-up. The XPs ran hot enough as it was, screwing more out of them would've made no sense. THAT they tried with the 9590...and failed miserably. Not to mention people could OC the Northwoods too, beyond 3.6 or 3.7 Ghz in fact...negating that point entirely. As was said...Northwood, especially the FSB800 ones with HT were the top dogs until the A64 came around and showed them the door. Prescott was...ambitious, to put it nicely.
  • TheinsanegamerN - Wednesday, March 31, 2021 - link

    Netburst was built for both high clock speeds and predictable workloads, such as video editing, where it did quite well. Obviously it royally sucked for unpredictable workloads like gaming, but you could see where Intel was heading with the idea.
  • Oxford Guy - Wednesday, March 31, 2021 - link

    'you could see where intel was heading with the idea'

    Creating the phrase 'MHz myth' in the public consciousness.
  • GeoffreyA - Friday, April 2, 2021 - link

    "MHz myth in the public consciousness"

    And it largely worked, even in the K8 era, among the non-enthusiast public. Only when the Core 2 Duo dropped to lower clocks was it accepted overnight that, yes, lower clocks are now all right because Intel says so.
  • Prosthetic Head - Tuesday, March 30, 2021 - link

    Your point still stands; however, the P4 was also a VERY low bar to measure IPC improvements against.
  • Hifihedgehog - Tuesday, March 30, 2021 - link

    Well, Bulldozer was too, and look what AMD did with Ryzen...
  • Oxford Guy - Saturday, April 3, 2021 - link

    AMD had a long time. 2011 is stamped onto the spreader of Piledriver, and that was only a small incremental change from Bulldozer, which is even older.
  • Oxford Guy - Saturday, April 3, 2021 - link

    And, Bulldozer had worse IPC than Phenom. So, AMD had basically tech eternity to improve on the IPC of what it was offering. It made Zen 1 seem a lot more revolutionary.
  • GeoffreyA - Saturday, April 3, 2021 - link

    "It made Zen 1 seem a lot more revolutionary"

    You're right; and if one compares against Haswell or Skylake, one will see that the Intel and AMD designs are broadly the same from a bird's-eye point of view, except for AMD's split scheduler inherited from the Athlon. I think that goes to show there's pretty much only one way to make an efficient x86 CPU (notice that departures are disastrous: Netburst/Bulldozer). Having said that, I'm glad AMD went through the BD era: it taught them a great deal. It also forced them to start from scratch, which took their design further than revising K10 would have done.
