Gaming Benchmarks: High End

At the top of the line we take two of the best GPUs on the market as of May 2015: an AMD R9 290X and an NVIDIA GTX 980.

Alien: Isolation

If first-person survival mixed with horror is your sort of thing, then Alien: Isolation, based on the Alien franchise, should be an interesting title. Developed by The Creative Assembly and released in October 2014, Alien: Isolation has won numerous awards, from Game of the Year to several top 10/25 placements and Best Horror titles, racking up over a million sales by February 2015. Alien: Isolation uses a custom-built engine which includes dynamic sound effects and should be fully multi-core enabled.

For low-end graphics we test at 720p with Ultra settings, whereas for mid- and high-range graphics we bump this up to 1080p, running a scripted version of the built-in benchmark and taking the average frame rate as our metric.
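
As a rough illustration of how this kind of scripted run can be automated (a generic sketch, not our actual harness; the executable path, command-line flags and results file below are hypothetical stand-ins), the flow is simply launch, wait, parse:

```python
import csv
import subprocess

# Hypothetical paths and flags -- stand-ins for whatever a given game's
# benchmark mode actually accepts, not Alien: Isolation's real CLI.
BENCHMARK_EXE = r"C:\Games\ExampleGame\Benchmark.exe"
RESULTS_CSV = r"C:\Games\ExampleGame\results\frametimes.csv"

def run_scripted_benchmark(resolution: str, preset: str) -> float:
    """Launch one benchmark pass and return the average frame rate."""
    # Kick off the benchmark and block until it finishes.
    subprocess.run(
        [BENCHMARK_EXE, "-resolution", resolution, "-preset", preset],
        check=True,
    )

    # Assume the run writes one frame time (in milliseconds) per row.
    with open(RESULTS_CSV, newline="") as f:
        frame_times_ms = [float(row[0]) for row in csv.reader(f) if row]

    total_seconds = sum(frame_times_ms) / 1000.0
    return len(frame_times_ms) / total_seconds  # average FPS

if __name__ == "__main__":
    print(f"Average: {run_scripted_benchmark('1920x1080', 'Ultra'):.1f} FPS")
```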

Alien: Isolation on MSI R9 290X Gaming LE 4GB ($380)

Alien: Isolation on ASUS GTX 980 Strix 4GB ($560)

Total War: Attila

The Total War franchise moves on to Attila, another title from The Creative Assembly. It is a standalone strategy game set in 395 AD, in which the main campaign puts the player in control of the leader of the Huns with the goal of conquering parts of the world. Graphically, the game can render hundreds or thousands of units on screen at once, each with its own animations, and it can put even the big cards to work.

For low-end graphics we test at 720p with the performance settings, recording the average frame rate. With mid- and high-range graphics, we test at 1080p with the quality setting. In both cases, unlimited video memory is enabled and the in-game scripted benchmark is used.

Total War: Attila on MSI R9 290X Gaming LE 4GB ($380)

Total War: Attila on ASUS GTX 980 Strix 4GB ($560)

Grand Theft Auto V

The highly anticipated latest iteration of the Grand Theft Auto franchise finally hit the shelves on April 14th, 2015, with both AMD and NVIDIA in tow to help optimize the title. GTA V doesn't provide graphical presets; instead it opens up the individual options to the user and pushes even the most capable systems to their limits through Rockstar's Advanced Game Engine. Whether the player is flying high in the mountains with long draw distances or dealing with assorted trash in the city, cranking everything up to maximum produces stunning visuals but hard work for both the CPU and the GPU.

For our test we have scripted a version of the in-game benchmark, using only the final part, which combines a flight scene with an in-city drive-by followed by a tanker explosion. For low-end systems we test at 720p on the lowest settings, whereas mid- and high-end graphics play at 1080p with Very High settings across the board. We record both the average frame rate and the percentage of frames under 60 FPS (frame times above 16.6 ms).
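
For reference, turning a raw frame-time log into those two numbers is simple arithmetic; the sketch below assumes nothing more than a list of per-frame render times in milliseconds (the 16.6 ms cutoff is just 1000/60):

```python
def summarize_frame_times(frame_times_ms):
    """Return (average FPS, percentage of frames slower than 60 FPS)."""
    total_seconds = sum(frame_times_ms) / 1000.0
    avg_fps = len(frame_times_ms) / total_seconds

    # A frame "under 60 FPS" is one that took longer than 1000/60 ~= 16.6 ms.
    cutoff_ms = 1000.0 / 60.0
    slow_frames = sum(1 for t in frame_times_ms if t > cutoff_ms)
    pct_under_60 = 100.0 * slow_frames / len(frame_times_ms)

    return avg_fps, pct_under_60

# Example: three frames comfortably above 60 FPS and one 25 ms hitch.
print(summarize_frame_times([14.0, 15.5, 25.0, 16.0]))
```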

Grand Theft Auto V on MSI R9 290X Gaming LE 4GB ($380)

Grand Theft Auto V on ASUS GTX 980 Strix 4GB ($560)

GRID: Autosport

No round of graphics testing is complete without some input from Codemasters and the EGO engine, which means this time we point towards GRID: Autosport, the latest iteration in the GRID racing franchise. As with our previous racing tests, each update to the engine aims to add effects, reflections, detail and realism, with Codemasters making ‘authenticity’ a main focal point for this version.

GRID’s benchmark mode is very flexible, so we created a test race using a shortened version of the Red Bull Ring with twelve cars doing two laps. The car in focus starts last and is quite fast, but usually finishes second or third. For low-end graphics we test at 1080p medium settings, whereas mid- and high-end graphics get the full 1080p maximum. Both the average and minimum frame rates are recorded.
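
On the minimum figure: from the same kind of frame-time data, the slowest single frame sets the floor, as in this small generic sketch (not GRID's own output format):

```python
def min_fps_from_frame_times(frame_times_ms):
    """The minimum instantaneous frame rate is set by the slowest frame."""
    return 1000.0 / max(frame_times_ms)

# A single 40 ms hitch in an otherwise smooth run caps the minimum at 25 FPS.
print(min_fps_from_frame_times([16.0, 15.2, 40.0, 16.5]))  # -> 25.0
```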

GRID: Autosport on MSI R9 290X Gaming LE 4GB ($380)

GRID: Autosport on ASUS GTX 980 Strix 4GB ($560)

Middle-earth: Shadow of Mordor

The final title in our testing is another exercise in system performance: the open-world action-adventure Shadow of Mordor. Produced by Monolith using the LithTech Jupiter EX engine and numerous detail add-ons, SoM goes for detail and complexity to a large extent, despite having been cut down from the original plans. The main story was written by the same writer as Red Dead Redemption, and the game took Zero Punctuation's Game of the Year for 2014.

For testing purposes, SoM offers a dynamic screen resolution setting, allowing us to render at high resolutions that are then scaled down to the monitor. As a result, we run several tests using the in-game benchmark. For low-end graphics we test at 720p with low settings, whereas mid- and high-end graphics get 1080p Ultra. The top graphics test is repeated at 3840x2160 (4K) with Ultra settings, and we also test two cards at 4K where possible.
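
To put the 4K numbers in perspective, rendering internally at 3840x2160 and downscaling to a 1080p panel means shading four times as many pixels per frame; a quick back-of-the-envelope check:

```python
# Rough pixel-count comparison for the dynamic resolution option.
native_1080p = 1920 * 1080   # pixels shown on the monitor
internal_4k = 3840 * 2160    # pixels rendered before downscaling

print(internal_4k / native_1080p)  # 4.0 -> four times the pixel work per frame
```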

Shadow of Mordor (1080p Ultra) on MSI R9 290X Gaming LE 4GB ($380)

Shadow of Mordor (4K Ultra) on MSI R9 290X Gaming LE 4GB ($380)

Shadow of Mordor (4K Ultra) on 2x MSI R9 290X Gaming LE 4GB ($380)

Shadow of Mordor (1080p Ultra) on ASUS GTX 980 Strix 4GB ($560)

Shadow of Mordor (4K Ultra) on ASUS GTX 980 Strix 4GB ($560)

Shadow of Mordor (4K Ultra) on 2x ASUS GTX 980 Strix 4GB ($560)

Comments

  • TheinsanegamerN - Monday, August 3, 2015 - link

    Quite nice comparison.

    Unfortunately, it seems that, while Broadwell does have the best IPC of the bunch, the overclock is pathetic. 1.325 V to hit 4.2 GHz? My Ivy Bridge 3570K does the same clock at 1.075 V. Now, I've been told I have an exceptionally good chip, but it strikes me as odd that Broadwell, being on a smaller 14nm process, can't match what Ivy Bridge could do two years ago. And since Sandy Bridge can be OC'ed to 4.7 GHz+ with ease, and Ivy can hit 4.5, it seems there is still no reason to upgrade to Broadwell, as any IPC gains are cancelled out by the lower clock rate. Unless you need to do lots of Dolphin emulation and refuse to overclock at all, the ancient Sandy Bridge still seems to do the best.
  • K_Space - Monday, August 3, 2015 - link

    TheinsanegamerN agreed. Those who held onto their Sandy made a very wise investment, just like those good ol' 920s back in the X58 era.
  • Dupl3xxx - Monday, August 3, 2015 - link

    Ah, yes, the 920 was a lovely beast. Started overclocking at 3.6: it booted, tried 3.8, booted, tried 4.0, failed. 3.8 was literally done in less than an hour as my second ever attempt at overclocking, my first being the Intel E6600. And when a dying PSU wounded it, I got a 3930K. It does 4.0 GHz, and I've yet to find any situation where it's a bottleneck, besides things like rendering and benchmarks. I considered upgrading to the 59xx series, but when I learned that only the 5960X would be an 8-core, I quickly decided against it.

    It'll be interesting to watch Skylake and Zen fight it out in a year or so.
  • Impulses - Monday, August 3, 2015 - link

    I'm surprised Intel isn't banking on nostalgic memories of the Q6600 to hype the 6600K & 6700K... Surely marketing had a hand in the simplified naming reminiscent of the old C2Q.
  • augiem - Monday, August 3, 2015 - link

    I'm still on an i7-920 from mid-2009. It's been running at 3.6 GHz the entire time, still as rock solid as the day I bought it. I still can't believe I've been using one PC for this long. Before the i7, I would upgrade every 1.5-2 years tops. This thing is nuts.
  • mkozakewich - Tuesday, August 4, 2015 - link

    We've reached the end of that exponential advancement, so you can expect things to advance at roughly this rate for a while, at least until we also reach "small enough".
  • close - Tuesday, August 4, 2015 - link

    That's logarithmic advancement :). It keeps slowing down year after year.
  • Cryio - Tuesday, August 4, 2015 - link

    Technically with Sandy Bridge they reached the end. SB was quite a jump over Nehalem.
  • Harry Lloyd - Tuesday, August 4, 2015 - link

    There is no end. Intel just do not care, as they have no competition. Why would they waste money on increasing performance, when they can focus on efficiency for mobile? They can get away with selling basically the same CPUs every year on desktop, as they are still the fastest.
  • Badelhas - Tuesday, August 4, 2015 - link

    I also blame AMD. If they had good high-end CPUs, Intel would be forced to improve the ones they've been selling for the last 5 years or so.
