Gaming: Grand Theft Auto V

The highly anticipated iteration of the Grand Theft Auto franchise hit the shelves on April 14th, 2015, with both AMD and NVIDIA in tow to help optimize the title. GTA doesn’t provide graphical presets, but instead opens up the options to users and extends the boundaries, pushing even the most capable systems to their limit using Rockstar’s Advanced Game Engine under DirectX 11. Whether the user is flying high over the mountains with long draw distances or dealing with assorted trash in the city, cranking everything up to maximum creates stunning visuals but hard work for both the CPU and the GPU.

For our test we have scripted a version of the in-game benchmark. The in-game benchmark consists of five scenarios: four short panning shots with varying lighting and weather effects, and a fifth action sequence that lasts around 90 seconds. We use only that final part of the benchmark, which combines a flight scene in a jet, an inner-city drive-by through several intersections, and finally ramming a tanker that explodes, setting off a chain of other exploding cars. This is a mix of distance rendering followed by a detailed near-rendering action sequence, and the title thankfully spits out frame time data.
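The frame time log can be reduced to the two summary metrics reported below, average FPS and 95th percentile FPS. As a minimal sketch (the actual format of the file the benchmark writes out may differ, and the `summarize` helper is our own illustration, not Rockstar's tooling):

```python
def summarize(frame_times_ms):
    """Reduce per-frame render times (in milliseconds) to
    (average FPS, 95th-percentile FPS)."""
    if not frame_times_ms:
        raise ValueError("no frames recorded")
    # Average FPS is total frames over total elapsed time,
    # not the mean of per-frame instantaneous FPS values.
    total_seconds = sum(frame_times_ms) / 1000.0
    avg_fps = len(frame_times_ms) / total_seconds
    # The 95th-percentile frame time corresponds to the frame rate
    # that 95% of frames meet or exceed.
    ordered = sorted(frame_times_ms)
    idx = min(len(ordered) - 1, int(0.95 * len(ordered)))
    p95_fps = 1000.0 / ordered[idx]
    return avg_fps, p95_fps
```

Note that averaging per-frame FPS values directly would overweight fast frames; dividing total frames by total time avoids that bias.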

There are no presets for the graphics options in GTA: the user adjusts some options, such as population density and distance scaling, on sliders, while others, such as texture/shadow/shader/water quality, are set on a scale from Low to Very High. Other options include MSAA, soft shadows, post effects, shadow resolution, and extended draw distance. There is a handy readout at the top which shows how much video memory the chosen options are expected to consume, with obvious repercussions if a user requests more video memory than is present on the card (although there’s no obvious indication if you have a low-end GPU with lots of video memory, like an R7 240 4GB).

All of our benchmark results can also be found in our benchmark engine, Bench.

[Charts: GTA V Average FPS and 95th Percentile results at IGP, Low, Medium, and High settings]

245 Comments

  • schujj07 - Tuesday, November 26, 2019 - link

    Where I work we now have 4x Dual 32 Core Epyc 7502s and 2x Dual 24 Core Epyc 7401s. We cannot move to Server 2016/2019 due to the per core licensing. However, for our VMware environment it is amazing how many VMs just 1 of those hosts can run.
  • Supercell99 - Tuesday, November 26, 2019 - link

    Is VMware stable on the new Epycs? I have some older Dell R630s (2697 x2) running ESXi 6.0 that I need to upgrade. A bit nervous about jumping to AMD for production on VMware.
  • schujj07 - Tuesday, November 26, 2019 - link

    They are perfectly stable. We are running them for production work. 2nd Gen Epyc is only supported on 6.7 U3.
  • Foeketijn - Tuesday, November 26, 2019 - link

    On Epyc. Not TR. I would think.
  • twtech - Monday, November 25, 2019 - link

    Speaking of which, why does this review have so many gaming benchmarks, and say, no compiler benchmarks? I'd have liked to see the 32-core TR vs. the 3175x or 3275 compiling a large C++ project.
  • eek2121 - Monday, November 25, 2019 - link

    Not only that, but Anandtech is still doing gaming benchmarks on a Geforce 1080. Gamers Nexus has a much more production oriented review, but still no compiler benchmarks, etc.
  • Slash3 - Tuesday, November 26, 2019 - link

    I've never understood why AT has kept the GTX 1080. For purposes of benchmarking, it acts as an immediate bottleneck on faster CPUs and adds no value to a processor evaluation except in extreme cases such as the 2970WX/2990WX where performance impacts are made more readily evident. Even then, one or two simple tests would be enough to paint the picture, unless it called for further testing.

    It's simply a waste of benchmark time and continues to baffle me with its inclusion. The only reason I can think to keep it in reviews is to pad the Bench database, or that the tests can be completed quickly and it's simply spare time. I love AT, but sometimes they just make me scratch my head.
  • imaheadcase - Tuesday, November 26, 2019 - link

    1080p is fine..they are using it for CPU benchmarks to bottleneck, not gpu.
  • peevee - Tuesday, November 26, 2019 - link

    It is GTX1080, not 1080p.
  • DannyH246 - Monday, November 25, 2019 - link

    Because Inteltech takes Intel's $$$ and it's one of the few areas where Intel doesn't get smashed. I agree with you: the main uses for these kinds of CPUs are proper work, not gaming. And definitely not gaming at 1080p. It's a joke.
