Gaming: Grand Theft Auto V

The highly anticipated PC iteration of the Grand Theft Auto franchise hit the shelves on April 14th, 2015, with both AMD and NVIDIA in tow to help optimize the title. GTA provides no graphical presets; instead it opens every option up to the user, and Rockstar's Advanced Game Engine under DirectX 11 will push even the hardiest systems to their limit. Whether the player is flying high over the mountains with long draw distances or dealing with assorted trash in the city, everything cranked up to maximum creates stunning visuals but hard work for both the CPU and the GPU.

For our test we have scripted a version of the in-game benchmark. The in-game benchmark consists of five scenarios: four short panning shots with varying lighting and weather effects, and a fifth action sequence that lasts around 90 seconds. We use only the final part of the benchmark, which combines a flight scene in a jet, an inner-city drive-by through several intersections, and the ramming of a tanker that explodes, setting off other cars in turn. This mixes long-distance rendering with a detailed near-field action sequence, and the title thankfully spits out frame time data.
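As a rough illustration of how a frame time log becomes the numbers we report, the minimal sketch below (our own standalone example, not the actual processing scripts, and the sample frame times are made up) computes an average FPS and a 95th-percentile FPS figure from per-frame render times in milliseconds:

    // Minimal sketch: derive average FPS and a 95th-percentile FPS figure
    // from a list of per-frame render times in milliseconds.
    #include <algorithm>
    #include <iostream>
    #include <numeric>
    #include <vector>

    int main() {
        // Hypothetical frame time samples (ms); a real log has thousands.
        std::vector<double> frameTimesMs = {16.7, 18.2, 15.9, 33.4,
                                            17.1, 16.4, 21.0, 16.8};

        // Average FPS: frames rendered divided by total elapsed time.
        double totalMs = std::accumulate(frameTimesMs.begin(),
                                         frameTimesMs.end(), 0.0);
        double avgFps = 1000.0 * frameTimesMs.size() / totalMs;

        // 95th percentile: sort the frame times, take the value that 95%
        // of frames come in under (one common indexing convention), then
        // convert that frame time back into an FPS figure.
        std::sort(frameTimesMs.begin(), frameTimesMs.end());
        size_t idx = static_cast<size_t>(0.95 * (frameTimesMs.size() - 1));
        double p95Fps = 1000.0 / frameTimesMs[idx];

        std::cout << "Average FPS: " << avgFps << "\n"
                  << "95th percentile FPS: " << p95Fps << "\n";
    }

The 95th-percentile figure ends up lower than the average, since it reflects the slower frames that determine how smooth the run actually feels.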


There are no presets for the graphics options in GTA: the user adjusts options such as population density and distance scaling on sliders, while others such as texture/shadow/shader/water quality run on a scale from Low to Very High. Other options include MSAA, soft shadows, post effects, shadow resolution, and extended draw distance. There is a handy readout at the top which shows how much video memory the chosen options are expected to consume, with obvious repercussions if a user requests more video memory than the card has (although there is no warning for the opposite case: a low-end GPU with lots of video memory, like an R7 240 4GB).
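For reference, the capacity side of that comparison can be read straight from the graphics API. The sketch below (a hypothetical standalone example on Windows, not GTA's own settings code) queries the dedicated video memory of the primary adapter through DXGI, which is the figure a DirectX 11 title would weigh against its own estimate of what the selected options will consume:

    // Sketch: query the primary adapter's dedicated video memory via DXGI.
    // Requires Windows; link against dxgi.lib (MSVC pragma shown here).
    #include <dxgi.h>
    #include <iostream>
    #pragma comment(lib, "dxgi.lib")

    int main() {
        IDXGIFactory* factory = nullptr;
        if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory),
                                     (void**)&factory)))
            return 1;

        IDXGIAdapter* adapter = nullptr;
        if (SUCCEEDED(factory->EnumAdapters(0, &adapter))) {
            DXGI_ADAPTER_DESC desc;
            adapter->GetDesc(&desc);
            std::cout << "Dedicated video memory: "
                      << desc.DedicatedVideoMemory / (1024 * 1024)
                      << " MB\n";
            adapter->Release();
        }
        factory->Release();
        return 0;
    }

Note that this reports capacity alone; it says nothing about whether the GPU behind it is fast enough to make use of it, which is exactly the R7 240 4GB case above.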


All of our benchmark results can also be found in our benchmark engine, Bench.

[Charts: GTA V results at IGP and Low settings, Average FPS and 95th Percentile]
Comments

  • catavalon21 - Wednesday, May 20, 2020 - link

    +1
  • Lord of the Bored - Friday, May 22, 2020 - link

    The nostalgia is strong these days.
  • Bidz - Wednesday, May 20, 2020 - link

    So... where is the temperature chart? Given the power usage and the tier level of the product, I would say many users want to know how practical it is to use.
  • LawRecords - Wednesday, May 20, 2020 - link

    Agreed. It's odd that thermals are missing given the high power draw.
  • shabby - Wednesday, May 20, 2020 - link

    I'd imagine it would be pegged at 90C since the CPU is constantly clocking itself as high as it can.
  • DannyH246 - Wednesday, May 20, 2020 - link

    It's not odd at all. It's to make Intel look better; we all know this.
  • shady28 - Wednesday, May 20, 2020 - link

    LTT has a video on thermals. The thermals for the gen 10 are better than gen 9, despite the higher clocks and core counts. Intel redesigned the conductive layer between the die and the lid. It worked.
  • Spunjji - Tuesday, May 26, 2020 - link

    Seriously? The thermals are better despite the higher power draw?

    I'm guessing this is a case of being able to get the heat out more easily *if you have a cooling system capable of subsequently dealing with the heat being pulled out*. That would make sense given the changes involved, but it involves the assumption that people are prepared to go for 280mm+ radiators.
  • mrvco - Wednesday, May 20, 2020 - link

    I get that this is a CPU review and not a GPU or system review, but it would be helpful to also include gaming resolutions w/ quality settings that people actually use for gaming rather than just benchmarking... especially when building a gaming system and making decisions on how to allocate budget between CPU (+p/s +cooling) and GPU.
  • TheUnhandledException - Wednesday, May 20, 2020 - link

    I agree. Yes, the results will show nearly identical performance from a 10900 down to a Ryzen 3600, but that is kinda the point. You don't really need an ultra high end CPU for gaming at high resolution. Even if it was just one game, it would be nice to see how CPU performance scales at 1080p, 1080p high quality, 1440p, and 4K.
