Grand Theft Auto V

The latest iteration of the highly anticipated Grand Theft Auto franchise finally hit the shelves on April 14th, 2015, with both AMD and NVIDIA in tow to help optimize the title. GTA V doesn't provide graphical presets, but instead opens the options up to users, and Rockstar's Advanced Game Engine can push even the toughest systems to their limit. Whether the player is flying high over the mountains with long draw distances or dealing with assorted trash in the city, cranking everything to maximum creates stunning visuals but hard work for both the CPU and the GPU.

For our test we have scripted a version of the in-game benchmark, relying only on the final part, which combines a flight scene with an in-city drive-by followed by a tanker explosion. For low-end systems we test at 720p on the lowest settings, whereas mid- and high-end graphics cards run at 1080p with very high settings across the board. We record both the average frame rate and the percentage of frames slower than 60 FPS (i.e. frame times over 16.6 ms).
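The two metrics above can be computed from a list of per-frame render times. The sketch below is illustrative only (the function name and sample data are our own, not part of the actual test harness), assuming frame times are captured in milliseconds:

```python
# Minimal sketch: derive average FPS and the percentage of "slow" frames
# (frame time > 1000/60 ms, i.e. below 60 FPS) from recorded frame times.
# Names and sample data are hypothetical, not the actual benchmark script.

def summarize_frame_times(frame_times_ms):
    """Return (average FPS, percentage of frames slower than 60 FPS)."""
    total_seconds = sum(frame_times_ms) / 1000.0
    avg_fps = len(frame_times_ms) / total_seconds
    threshold_ms = 1000.0 / 60.0  # ~16.67 ms per frame at 60 FPS
    slow = sum(1 for t in frame_times_ms if t > threshold_ms)
    pct_slow = 100.0 * slow / len(frame_times_ms)
    return avg_fps, pct_slow

# Example: three smooth ~60 FPS frames and one 33 ms stutter
avg, pct = summarize_frame_times([16.0, 16.0, 33.0, 16.0])
```

Note that averaging frame times and then inverting gives a different (and more honest) number than averaging instantaneous per-frame FPS values, which is why the computation works in the time domain.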

Grand Theft Auto V on ASUS GTX 980 Strix 4GB ($560)

Grand Theft Auto V on MSI R9 290X Gaming LE 4GB ($380)

Grand Theft Auto V on MSI GTX 770 Lightning 2GB ($245)

Grand Theft Auto V on MSI R9 285 Gaming 2GB ($240)

Grand Theft Auto V on ASUS R7 240 DDR3 2GB ($70)

Grand Theft Auto V on Integrated Graphics

The older Core i7-2600K ekes out a small ~5 FPS advantage over the Core i3 when running a GTX 980 at 1080p maximum settings, but with all other GPUs the differences are minimal. With integrated graphics, the Core i3 shows it can pummel the older IGP into the ground.


186 Comments


  • Ian Cutress - Friday, February 3, 2017 - link

    There are some minimum frame rate numbers in Bench, however they're not all regular (they're based on pure min, not 99%). The goal is to have some nicer numbers in our testbed update. Soon. When I find time to finish the script ... :D
  • fanofanand - Friday, February 3, 2017 - link

    "This RGB fad that apparently sells like hot cakes"

    I love you Ian! In a totally hetero way.....

    Seriously though great article, this should silence all the crybabies who whine about the lack of "Anandtech style in-depth analysis". You are still the best CPU reviewer in the biz!
  • jgarcows - Friday, February 3, 2017 - link

    I'm still running an i5-2400 at default speeds that I paid $205 for when it first came out. It is insane how slow the improvement of Intel chips has been. You'd think by now an i3 would be an upgrade.
  • crashtech - Friday, February 3, 2017 - link

    Frame times would be what hurts the i3 in games if anything. The averages may not be telling the whole story.
  • djscrew - Friday, February 3, 2017 - link

    Don't count on this being the new norm. Even though Intel just invalidated a long-standing policy and the perception that these are inferior chips with this change, I don't think it will last. The next process shrink will likely bring with it a die size change, leaving the i3 people who want to upgrade a few years down SOL. They could simply roll back this "feature" and we're back to status quo.
  • fanofanand - Friday, February 3, 2017 - link

    Your comment doesn't make sense. The next node will require a new chipset and ANYONE with today's mobos will need to upgrade, EVERYONE will be SOL.
  • jaydee - Friday, February 3, 2017 - link

    Conclusions page: "A good example of this is Agisoft: the Core i5-7400 (which costs $14 more, quad core, 3.4-3.8 GHz) completes the work ~10% quicker."

    Do you mean the i5-7400 @ 3.0-3.5 GHz, or the i5-7500 @ 3.4-3.8 GHz?
  • Ian Cutress - Friday, February 3, 2017 - link

    Ah yes, I meant the 7400. I had 2600K numbers in my head at the time. :)
  • name99 - Friday, February 3, 2017 - link

    "and goes in line with the fact that Intel has officially stated that one of the key features of the new 14+ process is that the transistors are more ‘relaxed’ and there’s no decrease in density."

    Remember those days when Intel was slagging TSMC for no transistor scaling? Ah good times.
    [img] https://www.extremetech.com/wp-content/uploads/201... [/img]

    I guess TSMC just decided to "relax" their transistors...
  • name99 - Friday, February 3, 2017 - link

    "The latest memory technology to hit prime time is Intel and Micron’s 3D XPoint. "

    Where "hit prime time" means "may ship some time in 2019"?
    No-one cares about 3D XPoint in SSDs; and the DRAM version seems utterly MIA since the initial enthusiastic Intel claims. (Come to think of it, much like Intel 10nm ...)
