Gaming Performance

Civilization 6 (DX12)

Originally penned by Sid Meier and his team, the Civ series of turn-based strategy games is a cult classic, and many an excuse for an all-nighter trying to get Gandhi to declare war on you due to an integer overflow. Truth be told, I never actually played the first version, but I have played every edition from the second to the sixth, including the fourth as voiced by the late Leonard Nimoy. It is a game that is easy to pick up, but hard to master.
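Whether or not the fabled "Gandhi bug" ever actually shipped (Sid Meier has since disputed it), the mechanism usually described is an unsigned 8-bit underflow: reducing an aggression value of 1 by 2 wraps around to 255 instead of clamping at 0. A minimal sketch of that behavior, with a purely hypothetical function name and values:

```python
# Hypothetical illustration of the oft-told "Gandhi bug" mechanism:
# an 8-bit unsigned aggression score that underflows when pushed below zero.
def adopt_democracy(aggression: int) -> int:
    # Democracy supposedly lowered aggression by 2; with unsigned 8-bit
    # wraparound, an aggression of 1 becomes 255 rather than clamping at 0.
    return (aggression - 2) % 256

print(adopt_democracy(1))   # 255: maximum aggression
print(adopt_democracy(10))  # 8: the intended behavior
```

The `% 256` stands in for what a `uint8_t` would do implicitly in C; the real fix, in any telling of the story, is to clamp at zero before subtracting.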

Benchmarking Civilization has always been somewhat of a contradiction – for a turn-based strategy game, the frame rate is not necessarily the important thing, and in the right mood something as low as 5 frames per second can be enough. With Civilization 6, however, Firaxis went hardcore on visual fidelity, trying to pull you into the game. As a result, Civilization can be taxing on both the GPU and the CPU as we crank up the details, especially in DirectX 12.

Perhaps a more poignant benchmark would be during the late game, when in the older versions of Civilization it could take 20 minutes to cycle around the AI players before the human regained control. The new version of Civilization has an integrated ‘AI Benchmark’, although it is not yet part of our benchmark portfolio, due to technical reasons we are trying to solve. Instead, we run the graphics test, which provides an example of a mid-game setup at our settings.

[Graphs: RX 5700 XT: Civilization 6 – Average FPS and 99th Percentile]

Grand Theft Auto V

The highly anticipated iteration of the Grand Theft Auto franchise hit the shelves on April 14th 2015, with both AMD and NVIDIA in tow to help optimize the title. GTA doesn’t provide graphical presets, but opens up the options to users and extends the boundaries by pushing even the hardest systems to the limit using Rockstar’s Advanced Game Engine under DirectX 11. Whether the user is flying high in the mountains with long draw distances or dealing with assorted trash in the city, when cranked up to maximum it creates stunning visuals but hard work for both the CPU and the GPU.

For our test we have scripted a version of the in-game benchmark. The in-game benchmark consists of five scenarios: four short panning shots with varying lighting and weather effects, and a fifth action sequence that lasts around 90 seconds. We use only the final part of the benchmark, which combines a flight scene in a jet followed by an inner city drive-by through several intersections followed by ramming a tanker that explodes, causing other cars to explode as well. This is a mix of distance rendering followed by a detailed near-rendering action sequence, and the title thankfully spits out frame time data.
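The two metrics reported in these charts, average FPS and 99th-percentile FPS, are both derived from that frame time data. A minimal sketch of the arithmetic (the function name and sample values are illustrative, not the actual test harness):

```python
# Turn a list of per-frame render times (in milliseconds) into the two
# metrics these charts report: average FPS and 99th-percentile FPS.
def summarize(frame_times_ms):
    # Average FPS: total frames divided by total time, not the mean of
    # per-frame FPS values (which would overweight fast frames).
    avg_fps = len(frame_times_ms) * 1000.0 / sum(frame_times_ms)
    # 99th-percentile FPS: the frame rate derived from the slowest ~1%
    # of frame times, a proxy for stutter that averages hide.
    slow = sorted(frame_times_ms)[int(0.99 * (len(frame_times_ms) - 1))]
    return avg_fps, 1000.0 / slow

# Four ~60 fps frames and one 33 ms hitch.
avg, p99 = summarize([16.7, 16.8, 16.5, 33.4, 16.6])
```

The distinction matters: a single long frame barely moves the average but drags the percentile figure down, which is why the charts show both.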

[Graphs: RX 5700 XT: Grand Theft Auto V – Average FPS and 99th Percentile]

F1 2018

Aside from keeping up-to-date on the Formula One world, F1 2017 added HDR support, which F1 2018 has maintained; otherwise, we should see any newer versions of Codemasters' EGO engine find their way into F1. Graphically demanding in its own right, F1 2018 keeps a useful racing-type graphics workload in our benchmarks.

We use the in-game benchmark, set to run on the Montreal track in the wet, driving as Lewis Hamilton from last place on the grid. Data is taken over a one-lap race.

[Graphs: RX 5700 XT: F1 2018 – Average FPS and 99th Percentile]

For our discrete gaming tests, we saw very little difference between all three speed settings. At the resolutions that buyers of this kit are likely to play at, performance is not memory bound – it is almost always GPU bound. What we would really need here is an APU to see some differences.

Comments

  • PeachNCream - Monday, January 27, 2020 - link

    Orly?

    https://www.anandtech.com/show/15414/ces-2020-zota...
  • Tunnah - Monday, January 27, 2020 - link

    I generally ignore these sort of halo products because most of the time they are just for willy wavers who want to own the very best... but normally even halo products have a tangible improvement in SOME scenario. These are just high score pieces, and I would definitely judge the crap out of anyone who bought them.
  • tamalero - Monday, January 27, 2020 - link

    Most of the time, these halo products are the door to newer, cheaper, and faster products for the masses in the next generation. As flamboyant and glamorous as these products are, they have their uses.
  • Spunjji - Wednesday, January 29, 2020 - link

    Agreed - this is just silly.
  • 5080 - Monday, January 27, 2020 - link

    Would be great memory for desktop Renoir once it becomes available. But then the price is certainly not in the same ballpark as the rest of the components. No one building an APU-based system would spend this much on memory alone.
  • PeachNCream - Monday, January 27, 2020 - link

    That's a whole lot of coin for a barely measurable improvement in a few benchmarks.
  • ahtoh - Monday, January 27, 2020 - link

    The most important graph is missing: performance per $$
  • PVG - Monday, January 27, 2020 - link

    Would be interesting to see this test with a 3950X, especially in workstation scenarios where the 16 cores start to become limited by the dual-channel controller.
  • Sivar - Monday, January 27, 2020 - link

    I don't understand. This is literally a performance memory preview, yet the memory heavy benchmark mode was declined in favor of a CPU-heavy benchmark?

    "We report the results as the ability to simulate the data as a fraction of real-time, so anything above a ‘one’ is suitable for real-time work. Out of the two modes, a ‘non-firing’ mode which is DRAM heavy and a ‘firing’ mode which has CPU work, we choose the latter. Despite this, the benchmark is still affected by DRAM speed a fair amount."
  • 29a - Tuesday, January 28, 2020 - link

    This article is full of all kinds of stupid. They should have tested iGPU performance too.
