Gaming: Grand Theft Auto V

The highly anticipated latest iteration of the Grand Theft Auto franchise hit the shelves on April 14th, 2015, with both AMD and NVIDIA in tow to help optimize the title. GTA doesn't provide graphical presets, but instead opens the options up to users and pushes even the most capable systems to their limits using Rockstar's Advanced Game Engine under DirectX 11. Whether the user is flying high over the mountains with long draw distances or dealing with assorted trash in the city, cranking the settings up to maximum produces stunning visuals but hard work for both the CPU and the GPU.

For our test we have scripted a version of the in-game benchmark. The in-game benchmark consists of five scenarios: four short panning shots with varying lighting and weather effects, and a fifth action sequence that lasts around 90 seconds. We use only the final part of the benchmark, which combines a flight scene in a jet, an inner-city drive-by through several intersections, and finally ramming a tanker that explodes, setting off other cars as well. This is a mix of distance rendering followed by a detailed close-up action sequence, and the title thankfully spits out frame time data.
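Because the benchmark outputs per-frame times, the metrics we chart (average FPS and 95th percentile) can be reproduced from that raw data with a short script. The sketch below is illustrative only: the file name and the one-frame-time-per-line format are assumptions, not the exact format Rockstar's tool writes out.

    # Reduce a run's frame time log to average FPS and a 95th percentile figure.
    # Assumes a plain text file with one frame time in milliseconds per line;
    # this format and file name are illustrative, not Rockstar's actual output.
    import statistics

    def summarize_frametimes(path):
        with open(path) as f:
            frame_ms = [float(line) for line in f if line.strip()]

        total_seconds = sum(frame_ms) / 1000.0
        average_fps = len(frame_ms) / total_seconds

        # 95th percentile of frame times, reported as the FPS seen during the
        # slowest 5% of frames (a common convention for this metric).
        slow_frame_ms = statistics.quantiles(frame_ms, n=100)[94]
        percentile_95_fps = 1000.0 / slow_frame_ms

        return average_fps, percentile_95_fps

    if __name__ == "__main__":
        avg, p95 = summarize_frametimes("gtav_frametimes.txt")
        print(f"Average FPS: {avg:.1f}, 95th percentile FPS: {p95:.1f}")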

There are no presets for the graphics options in GTA: the user adjusts some settings, such as population density and distance scaling, on sliders, while others, such as texture/shadow/shader/water quality, run from Low to Very High. Other options include MSAA, soft shadows, post effects, shadow resolution and extended draw distance. A handy readout at the top shows how much video memory the chosen options are expected to consume, with obvious repercussions if a user requests more video memory than the card has (although there is no obvious indication for the opposite case of a low-end GPU with lots of video memory, such as an R7 240 4GB).
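That video memory readout is also a useful reminder of why resolution and MSAA dominate the estimate. As a rough, hedged illustration only (the 4-byte colour format, the buffer count and the idea of counting only render targets are simplifying assumptions, not Rockstar's accounting):

    # Back-of-envelope look at how render target memory scales with resolution
    # and MSAA. Real usage also includes textures, shadow maps and geometry,
    # so treat these figures as a sketch of the scaling, not a prediction.
    def rendertarget_mb(width, height, msaa=1, bytes_per_pixel=4, buffers=3):
        pixels = width * height * msaa  # MSAA stores extra samples per pixel
        return pixels * bytes_per_pixel * buffers / (1024 ** 2)

    for (w, h), msaa in [((1920, 1080), 1), ((2560, 1440), 4), ((3840, 2160), 4)]:
        print(f"{w}x{h} MSAA x{msaa}: ~{rendertarget_mb(w, h, msaa):.0f} MB in render targets alone")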

AnandTech CPU Gaming 2019 Game List

Game                 Genre        Release Date   API    IGP         Low          Med               High
Grand Theft Auto V   Open World   Apr 2015       DX11   720p Low    1080p High   1440p Very High   4K Ultra
All of our benchmark results can also be found in our benchmark engine, Bench.

[Charts: Grand Theft Auto V average FPS and 95th percentile frame rates at the IGP, Low, Medium and High test settings]

At 720p and 1080p the Ryzen 5 2500X has the lead, while at 1440p the Core i3-8350K pulls ahead. At 4K, all of the chips perform essentially identically.
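The convergence at 4K follows from a simple bottleneck picture: frame time is roughly the longer of the CPU's submission time and the GPU's render time, and only the GPU cost grows with pixel count. The toy model below uses invented timings purely to show the shape of that effect; none of the numbers are measurements from this review.

    # Toy bottleneck model: FPS ~ 1000 / max(cpu_ms, gpu_ms), where only the
    # GPU cost scales with resolution. All timings are invented for illustration.
    def effective_fps(cpu_ms, gpu_ms_at_1080p, pixels, pixels_1080p=1920 * 1080):
        gpu_ms = gpu_ms_at_1080p * pixels / pixels_1080p
        return 1000.0 / max(cpu_ms, gpu_ms)

    resolutions = {"1080p": 1920 * 1080, "1440p": 2560 * 1440, "4K": 3840 * 2160}
    for name, px in resolutions.items():
        fast = effective_fps(cpu_ms=9.0, gpu_ms_at_1080p=6.0, pixels=px)
        slow = effective_fps(cpu_ms=12.0, gpu_ms_at_1080p=6.0, pixels=px)
        print(f"{name}: faster CPU ~{fast:.0f} FPS, slower CPU ~{slow:.0f} FPS")

In this sketch the gap between the two hypothetical CPUs is clear at 1080p, narrows at 1440p, and disappears once the GPU cost dominates at 4K, which matches the pattern in the charts above.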

Comments

  • Le Québécois - Monday, February 11, 2019 - link

    Ian, any reason why, more often than not, you seem to "skip" 1440p in your benchmarks? It's only present for a few games.

    Considering the GTX 1080, your best card, is always the bottleneck at 4K, as your numbers show, wouldn't it make more sense to focus more on 1440 instead?

    Especially considering it's the "best" resolution on the market if you are looking for high pixel density yet still want to run your games at playable levels of fps.
  • Ian Cutress - Monday, February 11, 2019 - link

    Some benchmarks are run at 1440p. Some go up to 8K. It's a mix. There's what, 10 games there? Not all of them have to conform to the same testing settings.
  • Le Québécois - Tuesday, February 12, 2019 - link

    Sorry for the confusion. I can clearly see we've got very different settings in that mix. I guess a more direct question would be: why do it this way and not with a more standardized series of tests?

    A followup question would also be: why 8K? You are already GPU limited at 4K, so your 8K results are not going to give any relevant information about those CPUs.

    Sorry, I don't mean to criticize, I simply wish to understand your thought process.
  • MrSpadge - Monday, February 11, 2019 - link

    What exactly do you want to see there that you can't see at 1080p? Differences between CPUs are going to be muddied due to approaching the GPU limit, and that's it.
  • Le Québécois - Tuesday, February 12, 2019 - link

    Well, at 1080p, you can definitely see the difference between them, and exactly like you said, at 4K it's all the same because of the GPU limitations. 1440p seems more relevant than 4K considering this. This is, after all, a CPU review, and most of the 4K results could be summed up by "they all perform within a few %".
  • neblogai - Monday, February 11, 2019 - link

    End of page 19: R5 2600 is really 65W TDP, not 95W.
  • Ian Cutress - Monday, February 11, 2019 - link

    Doh, a typo in all my graphs too. Should be updated.
  • imaheadcase - Monday, February 11, 2019 - link

    I'm on my phone on AT and can truly see how terrible the ads are now. AT is straight up letting scam ads be served because it's desperate for revenue. 😂
  • PeachNCream - Monday, February 11, 2019 - link

    Is there a point in even mentioning that, given how little control they now have over advertising? Just fire up the ad blocker or visit another site and let the new owners figure it out the hard way.
  • StevoLincolnite - Tuesday, February 12, 2019 - link

    AnandTech had malware/viruses infect its userbase years ago via crappy adverts.

    That was the moment I got Ad-Block, and I will never turn it off again.
