Gaming: Grand Theft Auto V

The highly anticipated PC release of Grand Theft Auto V hit the shelves on April 14th, 2015, with both AMD and NVIDIA in tow to help optimize the title. GTA V doesn't provide graphical presets; instead it opens all of the options up to the user, and it can push even the strongest systems to the limit under Rockstar's Advanced Game Engine on DirectX 11. Whether the player is flying high over the mountains with long draw distances or dealing with assorted trash in the city, cranking the settings to maximum creates stunning visuals but hard work for both the CPU and the GPU.

For our test we have scripted a version of the in-game benchmark. The in-game benchmark consists of five scenarios: four short panning shots with varying lighting and weather effects, and a fifth action sequence that lasts around 90 seconds. We use only that final part, which combines a flight scene in a jet, an inner-city drive-by through several intersections, and finally ramming a tanker that explodes, setting off other cars in turn. This mixes long-distance rendering with a detailed close-up action sequence, and the title helpfully outputs frame time data.
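From a frame time log like the one the benchmark exports, the two headline numbers we report can be derived directly. The sketch below assumes a simple list of per-frame render times in milliseconds (the actual export format, and the exact percentile convention used in Bench, may differ):

```python
# Sketch: deriving Average FPS and a 95th-percentile frame time from a
# frame time log. The single-column millisecond format is an assumption
# for illustration; the benchmark's real output may be structured differently.

def summarize(frametimes_ms):
    """Return (average FPS, 95th-percentile frame time in ms)."""
    total_s = sum(frametimes_ms) / 1000.0
    avg_fps = len(frametimes_ms) / total_s
    ordered = sorted(frametimes_ms)
    # 95th percentile: the frame time that 95% of frames beat or match,
    # so one slow frame (a stutter) shows up here but barely moves the average
    idx = min(len(ordered) - 1, int(0.95 * len(ordered)))
    p95_ms = ordered[idx]
    return avg_fps, p95_ms

# Eight illustrative frame times, including one 33 ms stutter
frametimes = [16.7, 15.9, 18.2, 33.4, 16.1, 17.0, 16.5, 21.3]
avg_fps, p95 = summarize(frametimes)
print(f"Average FPS: {avg_fps:.1f}, 95th percentile frame time: {p95:.1f} ms")
```

This is why percentile figures matter alongside averages: the single 33 ms frame above barely dents the average FPS, but it dominates the 95th-percentile figure, which is what a player perceives as stutter.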


Since there are no presets, the user adjusts everything manually: some options, such as population density and distance scaling, sit on sliders, while others, such as texture/shadow/shader/water quality, run from Low to Very High. Other options include MSAA, soft shadows, post effects, shadow resolution, and extended draw distance. There is a handy readout at the top showing how much video memory the selected options are expected to consume, with obvious repercussions if a user requests more video memory than is present on the card (although there's no warning the other way, if you have a low-end GPU with lots of video memory, like an R7 240 4GB).
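The VRAM readout amounts to a simple budget check against the card's memory. A minimal sketch of the idea follows; the per-setting costs here are invented for illustration (the real engine derives them from texture sizes, shadow resolution, and so on):

```python
# Sketch of the kind of VRAM-budget check the settings screen performs.
# All numbers below are hypothetical placeholders, not Rockstar's actual costs.

CARD_VRAM_MB = 4096  # e.g. an R7 240 4GB

setting_costs_mb = {
    "textures_very_high": 2048,
    "shadows_very_high": 512,
    "msaa_4x_1080p": 700,
    "extended_draw_distance": 400,
}

requested = sum(setting_costs_mb.values())
print(f"Requested {requested} MB of {CARD_VRAM_MB} MB available")
if requested > CARD_VRAM_MB:
    print("Warning: settings exceed available video memory")
```

Note the check only compares capacity, which is exactly why a slow GPU with plenty of memory (the R7 240 4GB case) passes the test while still being far too weak to run those settings.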


All of our benchmark results can also be found in our benchmark engine, Bench.

[Charts: GTA V at AnandTech IGP and Low settings — Average FPS and 95th Percentile]
220 Comments

  • yeeeeman - Wednesday, May 20, 2020 - link

The CPU won't consume anywhere near 250W during gaming. 250W is valid only for short all-core scenarios; otherwise it will stay within its 130W TDP. Go and read other reviews and you will see I am right.
  • yankeeDDL - Thursday, May 21, 2020 - link

    According to this (https://images.anandtech.com/doci/15785/10900K%20y... it stays at 230W for almost 4min.
In any case, you can read my sentence again and use 130W instead of 250W, and it doesn't change anything.
  • arashi - Saturday, May 23, 2020 - link

    You can't blame him, he's on Intel payroll and has to act the idiot.
  • dirkdigles - Wednesday, May 20, 2020 - link

Ian, I think the pricing on the charts is a bit misleading. The $488 price for the 10900K is the 1000-unit bulk pricing, and the $499 price on the 3900X hasn't been seen since January 2020... it's currently $409 on Amazon. This skews the reader's ability to make a fair comparison.

    I know MSRP is a good metric, but street price is more important. What can I buy these chips for, today? If I'm a consumer, I likely can't get that $488 bulk per chip price for the 10900K, and the 3900X is not going to cost me anywhere near $409. Please update.
  • dirkdigles - Wednesday, May 20, 2020 - link

    *anywhere near $499. Typo.
  • WaltC - Wednesday, May 20, 2020 - link

Yes, I paid ~$409 for my 3900X, and on top of that AMZN offered me six months same-as-cash, which I was more than happy to accept...;) Good times!
  • AnarchoPrimitiv - Wednesday, May 20, 2020 - link

    Exactly, the 3900x is over $100 cheaper and is nowhere "around the same price"
  • yeeeeman - Wednesday, May 20, 2020 - link

Well, Intel has the 10900F at $400: locked, with no iGPU, and almost the same frequencies. That is a better buy than the 10900K.
  • Spunjji - Tuesday, May 26, 2020 - link

    Right - the 10900F is likely a better deal, but the comparison was with the 10900K.
  • Irata - Wednesday, May 20, 2020 - link

    Waiting for comments on how the two small fans on the mainboard make this an unacceptable option. If I remember correctly, that applied to X570 boards.
