Gaming: Grand Theft Auto V

The highly anticipated iteration of the Grand Theft Auto franchise hit the shelves on April 14th, 2015, with both AMD and NVIDIA in tow to help optimize the title. GTA V doesn't provide graphical presets, but instead opens the options up to users and pushes even the hardest systems to the limit with Rockstar's Advanced Game Engine under DirectX 11. Whether the user is flying high in the mountains with long draw distances or dealing with assorted trash in the city, cranking the settings to maximum produces stunning visuals but hard work for both the CPU and the GPU.

For our test we have scripted a version of the in-game benchmark. The in-game benchmark consists of five scenarios: four short panning shots with varying lighting and weather effects, and a fifth action sequence that lasts around 90 seconds. We use only the final part of the benchmark, which combines a flight scene in a jet, an inner-city drive-by through several intersections, and finally ramming a tanker that explodes, causing other cars to explode as well. This mixes long-distance rendering with a detailed near-field action sequence, and the title thankfully spits out frame time data.
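
For readers who want to reproduce the two figures we chart from similar data, the Python sketch below turns a per-frame frame time log into an average FPS value and a 95th percentile figure. It is a minimal sketch under stated assumptions: the file name and the one-value-per-line format are placeholders for illustration, not the game's actual output.

    # Minimal sketch: derive average FPS and a 95th-percentile figure from
    # per-frame frame times in milliseconds. The file name and one-value-per-line
    # format are illustrative assumptions, not GTA V's actual log layout.
    import numpy as np

    def summarize_frametimes(path: str):
        frame_times_ms = np.loadtxt(path)           # one frame time (ms) per line
        avg_fps = 1000.0 / frame_times_ms.mean()    # average FPS over the whole run
        # The 95th percentile figure is the frame rate corresponding to the
        # 95th-percentile (slowest 5%) frame time, i.e. the rate sustained
        # for 95% of rendered frames.
        p95_fps = 1000.0 / np.percentile(frame_times_ms, 95)
        return avg_fps, p95_fps

    if __name__ == "__main__":
        avg, p95 = summarize_frametimes("gtav_frametimes.txt")
        print(f"Average FPS: {avg:.1f}, 95th percentile: {p95:.1f} FPS")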

There are no presets for the graphics options in GTA V. Instead, the user adjusts some options, such as population density and distance scaling, on sliders, while others, such as texture/shadow/shader/water quality, range from Low to Very High. Other options include MSAA, soft shadows, post effects, shadow resolution and extended draw distance. There is a handy readout at the top showing how much video memory the chosen options are expected to consume, with obvious repercussions if a user requests more video memory than is present on the card (although there's no obvious indication if you have a low-end GPU with lots of video memory, like an R7 240 4GB).
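
As a rough illustration of the bookkeeping behind that video memory readout, here is a hypothetical Python sketch that sums an assumed per-option VRAM cost and compares it with the card's memory. The option names and costs are invented for the example and are not Rockstar's figures; note that, as with the R7 240 4GB case above, fitting in VRAM says nothing about whether the GPU is fast enough.

    # Hypothetical sketch of a settings-screen VRAM estimate: add up an assumed
    # cost for each selected option and compare it with the card's video memory.
    # The cost table below is invented for illustration, not Rockstar's numbers.
    VRAM_COST_MB = {
        ("texture_quality", "Very High"): 2048,
        ("texture_quality", "High"): 1024,
        ("shadow_quality", "Very High"): 512,
        ("msaa", "4x"): 768,
        ("msaa", "2x"): 384,
    }

    def estimated_vram_mb(settings: dict) -> int:
        """Sum the assumed VRAM cost of each selected option; unknown options cost 0."""
        return sum(VRAM_COST_MB.get(item, 0) for item in settings.items())

    def fits_in_vram(settings: dict, card_vram_mb: int) -> bool:
        # Only checks capacity; a slow GPU with plenty of memory still passes.
        return estimated_vram_mb(settings) <= card_vram_mb

    if __name__ == "__main__":
        chosen = {"texture_quality": "Very High", "shadow_quality": "Very High", "msaa": "4x"}
        total = estimated_vram_mb(chosen)
        print(f"{total} MB estimated:", "fits" if fits_in_vram(chosen, 4096) else "exceeds a 4GB card")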

All of our benchmark results can also be found in our benchmark engine, Bench.

[Charts: Grand Theft Auto V at the AnandTech IGP and Low settings, showing Average FPS and 95th Percentile results]
Comments

  • yankeeDDL - Wednesday, May 20, 2020 - link

    I think the main idea was to show if the CPU was getting in the way when the GPU is definitely not the bottleneck.
  • mrvco - Wednesday, May 20, 2020 - link

    That's difficult to discern without all the relevant data, i.e. diminishing returns as the bottleneck transitions from the CPU to the GPU at typical resolutions and quality settings. I think better of the typical AnandTech reader, but I would hate to think that someone reads this review and extrapolates 720p / medium quality FPS relative performance to 1440p or 2160p at high or ultra settings and blows their build budget on a $400+ CPU and associated components required to power and cool that CPU with little or no improvement in actual gaming performance.
  • dullard - Wednesday, May 20, 2020 - link

    Do we really need this same comment with every CPU review ever? In every single CPU review for years (decades?), people make that exact same comment. That is why the reviews already test several different resolutions.

    AnandTech did 2 to 4 resolutions with each game. Isn't that enough? Can't you interpolate or extrapolate as needed to whatever specific resolution you use? Or did you miss that there are scroll-over graphs of other resolutions in the review?
  • schujj07 - Wednesday, May 20, 2020 - link

    “There are two types of people in this world: 1.) Those who can extrapolate from incomplete data.”
  • diediealldie - Thursday, May 21, 2020 - link

    LMAO, you're a genius
  • DrKlahn - Wednesday, May 20, 2020 - link

    In some cases they do higher than 1080p and in some they don't. I do wish they would include higher resolutions in all tests, and that the "gaming lead" statements came with the caveat that it's largely only going to be beneficial for those seeking low resolution with very high frame rates. Someone with a 1080p 60Hz monitor likely isn't going to benefit from the Intel platform, nor is someone with a high-resolution monitor with eye candy enabled. But the conclusion doesn't really spell that out well for the less educated. And it's certainly not just AnandTech doing this; it seems to be the norm. But you see people parroting "Intel is better for gaming" when in their setup it may not bring any benefit, while incurring more cost and being more difficult to cool due to the substantial power use.
  • Spunjji - Tuesday, May 26, 2020 - link

    It's almost like their access is partially contingent on following at least a few of the guidelines about how to position the product. :/
  • mrvco - Wednesday, May 20, 2020 - link

    Granted, 720p and 1080p resolutions are highly CPU dependent when using a modern GPU, but I'm not seeing 1440p results at high or ultra quality, which is where things do transition to being more GPU dependent, and which is a more realistic real-world scenario for anyone paying up for a mid-range to high-end gaming PC.
  • Meteor2 - Wednesday, July 15, 2020 - link

    Spend as much as you can on the GPU and pair it with a $200 CPU. It’s actually pretty simple.
  • yankeeDDL - Wednesday, May 20, 2020 - link

    I have to say that this fared better than I expected.
    I would definitely not buy one, but kudos to Intel.
    Can't imagine what it means to have a 250W CPU + 200W GPU in a PC next to you while you're playing. Must sound like an airplane.
