Gaming Performance

Civilization 6 (DX12)

Originally penned by Sid Meier and his team, the Civ series of turn-based strategy games is a cult classic, and has been many an excuse for an all-nighter spent trying to get Gandhi to declare war on you due to an integer overflow. Truth be told, I never actually played the first version, but every edition from the second to the sixth, including the fourth as voiced by the late Leonard Nimoy, is a game that is easy to pick up but hard to master.

Benchmarking Civilization has always been somewhat of an oxymoron – for a turn-based strategy game, the frame rate is not necessarily the important thing, and in the right mood something as low as 5 frames per second can be enough. With Civilization 6, however, Firaxis went hardcore on visual fidelity, trying to pull you into the game. As a result, Civilization can be taxing on graphics cards and CPUs as we crank up the details, especially in DirectX 12.

Perhaps a more poignant benchmark would be the late game, when in older versions of Civilization it could take 20 minutes to cycle through the AI players before the human regained control. The new version of Civilization has an integrated ‘AI Benchmark’, although it is not yet part of our benchmark portfolio, due to technical reasons which we are trying to solve. Instead, we run the graphics test, which provides an example of a mid-game setup at our settings.

[Graph: RX 5700 XT: Civilization 6 - Average FPS]
[Graph: RX 5700 XT: Civilization 6 - 99th Percentile]

Grand Theft Auto V

The highly anticipated iteration of the Grand Theft Auto franchise hit the shelves on April 14th 2015, with both AMD and NVIDIA in tow to help optimize the title. GTA doesn’t provide graphical presets, but opens up the options to users and extends the boundaries by pushing even the hardest systems to the limit using Rockstar’s Advanced Game Engine under DirectX 11. Whether the user is flying high in the mountains with long draw distances or dealing with assorted trash in the city, when cranked up to maximum it creates stunning visuals but hard work for both the CPU and the GPU.

For our test we have scripted a version of the in-game benchmark. The in-game benchmark consists of five scenarios: four short panning shots with varying lighting and weather effects, and a fifth action sequence that lasts around 90 seconds. We use only the final part of the benchmark, which combines a flight scene in a jet, followed by an inner-city drive-by through several intersections, and ends with ramming a tanker that explodes, causing other cars to explode as well. This is a mix of distance rendering followed by a detailed near-rendering action sequence, and the title thankfully spits out frame time data.
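With frame time data in hand, both of the metrics we chart – average FPS and 99th percentile FPS – fall out of some simple arithmetic. The sketch below (not our actual processing scripts, and the sample data is made up) shows how per-frame times in milliseconds map to the two numbers:

```python
# Sketch: turning a list of per-frame render times (in milliseconds)
# into average FPS and 99th-percentile FPS. The sample data below is
# synthetic, purely for illustration.

def summarize_frame_times(frame_times_ms):
    """Return (average FPS, 99th-percentile FPS) from frame times in ms."""
    if not frame_times_ms:
        raise ValueError("no frame time samples")
    n = len(frame_times_ms)
    # Average FPS = frames rendered / total seconds elapsed.
    avg_fps = n / (sum(frame_times_ms) / 1000.0)
    # The 99th-percentile FPS comes from the 99th-percentile frame
    # time: roughly 1% of frames took longer than this to render.
    slowest_last = sorted(frame_times_ms)
    idx = min(n - 1, int(round(0.99 * (n - 1))))
    p99_ms = slowest_last[idx]
    return avg_fps, 1000.0 / p99_ms

# A steady ~60 Hz cadence (16.7 ms) with occasional 40 ms hitches:
# the average barely moves, but the 99th percentile exposes the stutter.
samples = [16.7] * 985 + [40.0] * 15
avg, p99 = summarize_frame_times(samples)
```

This is why we report both numbers: a run full of brief hitches can post a healthy average while the 99th percentile reveals how choppy it actually felt.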

[Graph: RX 5700 XT: Grand Theft Auto V - Average FPS]
[Graph: RX 5700 XT: Grand Theft Auto V - 99th Percentile]

F1 2018

Aside from keeping up to date on the Formula One world, F1 2017 added HDR support, which F1 2018 has maintained; otherwise, we should see any newer versions of Codemasters' EGO engine find their way into F1. Graphically demanding in its own right, F1 2018 keeps a useful racing-type graphics workload in our benchmarks.

We use the in-game benchmark, set to run on the Montreal track in the wet, driving as Lewis Hamilton from last place on the grid. Data is taken over a one-lap race.

[Graph: RX 5700 XT: F1 2018 - Average FPS]
[Graph: RX 5700 XT: F1 2018 - 99th Percentile]

For our discrete gaming tests, we saw very little difference between all three speed settings. This is because, at the end of the day, the resolutions at which people who buy this kit are likely to play aren't going to be memory bound - performance is almost always GPU bound. What we really need here is an APU to see some differences.

54 Comments

  • Qasar - Tuesday, January 28, 2020 - link

    he wont.. he hates amd like no other.. will bash them till his fingers bleed.. but when called out on his fud and lies.. he just turns to insults and posts more bs.. he is and intel shill and still believes intels lies and bs.. he's all talk an no action.
  • Spunjji - Wednesday, January 29, 2020 - link

    It says in the intro that Intel CPUs can't use RAM this fast, and using a more powerful GPU would not alter the gaming benchmarks *because they are not sensitive to memory bandwidth*.

    Did you outsource your brain to a herring?
  • HikariWS - Tuesday, January 28, 2020 - link

    Awesome article, congrats!

    Indeed, this RAM isn't worth it. It just won't be the bottleneck for the rig of ppl who would be willing to pay for it.
  • catavalon21 - Tuesday, January 28, 2020 - link

    I figure it's a demonstration of what the hardware is capable of that doesn't really have a (practical) use case...yet. Not a new technology, just a stupid-fast result of binning higher-volume parts to show what they can do. I'll be interested to see where it leads.
  • 29a - Tuesday, January 28, 2020 - link

    I can't believe you didn't test the one thing that would show a big difference, gaming on the iGPU.
  • FreckledTrout - Tuesday, January 28, 2020 - link

    Yeah the iGPU in the Ryzen 3700X would have screamed I tell you. Oh wait, my someone is telling me they don't have iGPU's. If you are going to complain at least read the article.
  • 29a - Wednesday, January 29, 2020 - link

    You really don't think I was talking about a different CPU? Are you really that stupid or just intellectually dishonest? It's one or the other.
  • Korguz - Wednesday, January 29, 2020 - link

    instead of being rude, why not mention which igpu you are referring to ?? instead... you resort to insults...
  • Iketh - Saturday, February 1, 2020 - link

    lol why must he refer to one? ANY IGPU
  • Korguz - Sunday, February 2, 2020 - link

    maybe he has a specific one in mind..
    and there is this " You really don't think I was talking about a different CPU ?"
