Generational Tests on the i7-6700K: Gaming Benchmarks on High End GPUs

Alien: Isolation

If first-person survival mixed with horror is your sort of thing, then Alien: Isolation, based on the Alien franchise, should be an interesting title. Developed by The Creative Assembly and released in October 2014, Alien: Isolation has won numerous awards, from Game of the Year to several top 10/25 lists and Best Horror titles, racking up over a million sales by February 2015. Alien: Isolation uses a custom-built engine which includes dynamic sound effects and should be fully multi-core enabled.

For low end graphics, we test at 720p with Ultra settings, whereas for mid and high range graphics we bump this up to 1080p, taking the average frame rate as our marker with a scripted version of the built-in benchmark.
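As a rough illustration (not the actual benchmark script, and the function name is our own), the average frame rate marker can be derived from logged per-frame render times like this:

```python
def average_fps(frame_times_ms):
    """Average frame rate over a run: total frames divided by total render time."""
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

# Four frames at 20 ms each is 80 ms of work, i.e. 50 FPS on average
print(average_fps([20.0, 20.0, 20.0, 20.0]))  # 50.0
```

Averaging frame times rather than per-frame FPS values avoids skewing the result towards the fastest frames.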

Alien Isolation on MSI R9 290X Gaming LE 4GB ($380)

Alien Isolation on ASUS GTX 980 Strix 4GB ($560)

Total War: Attila

The Total War franchise moves on to Attila, another title developed by The Creative Assembly. It is a stand-alone strategy game set in AD 395, where the main story line lets the gamer take control of the leader of the Huns in order to conquer parts of the world. Graphically, the game can render hundreds or thousands of units on screen at once, each with its own individual actions, and can put some of the big cards to task.

For low end graphics, we test at 720p with performance settings, recording the average frame rate. With mid and high range graphics, we test at 1080p with the quality setting. In both circumstances, unlimited video memory is enabled and the in-game scripted benchmark is used.

Total War: Attila on MSI R9 290X Gaming LE 4GB ($380)

Total War: Attila on ASUS GTX 980 Strix 4GB ($560)

Grand Theft Auto V

The highly anticipated iteration of the Grand Theft Auto franchise finally hit the shelves on April 14th 2015, with both AMD and NVIDIA in tow to help optimize the title. GTA doesn't provide graphical presets, but opens up the options to users and extends the boundaries by pushing even the toughest systems to their limits using Rockstar's Advanced Game Engine. Whether the user is flying high in the mountains with long draw distances or dealing with assorted trash in the city, when cranked up to maximum the game creates stunning visuals but hard work for both the CPU and the GPU.

For our test we have scripted a version of the in-game benchmark, using only the final part, which combines a flight scene with an in-city drive-by followed by a tanker explosion. For low end systems we test at 720p on the lowest settings, whereas mid and high end graphics play at 1080p with very high settings across the board. We record both the average frame rate and the percentage of frames under 60 FPS (i.e., frame times over 16.6 ms).
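The under-60-FPS metric counts how many frames blew their 16.6 ms budget. A minimal sketch of that calculation (an illustration, not our actual tooling):

```python
def percent_under_60fps(frame_times_ms, target_fps=60.0):
    """Share of frames slower than the target rate (frame time above ~16.6 ms)."""
    budget_ms = 1000.0 / target_fps
    slow = sum(1 for t in frame_times_ms if t > budget_ms)
    return 100.0 * slow / len(frame_times_ms)

# Two of these four frames exceed the 16.6 ms budget -> 50% under 60 FPS
print(percent_under_60fps([10.0, 33.3, 12.0, 25.0]))  # 50.0
```

Unlike a plain average, this metric exposes stutter: a run can average well over 60 FPS while still delivering a noticeable fraction of slow frames.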

Grand Theft Auto V on MSI R9 290X Gaming LE 4GB ($380)

Grand Theft Auto V on ASUS GTX 980 Strix 4GB ($560)

GRID: Autosport

No graphics tests are complete without some input from Codemasters and the EGO engine, which means for this round of testing we point towards GRID: Autosport, the next iteration in the GRID and racing genre. As with our previous racing testing, each update to the engine aims to add in effects, reflections, detail and realism, with Codemasters making ‘authenticity’ a main focal point for this version.

GRID’s benchmark mode is very flexible, and as a result we created a test race using a shortened version of the Red Bull Ring with twelve cars doing two laps. The car in focus starts last and is quite fast, so it usually finishes second or third. For low end graphics we test at 1080p with medium settings, whereas mid and high end graphics get the full 1080p maximum. Both the average and minimum frame rates are recorded.
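The minimum frame rate falls out of the same frame-time log as the average: it is set by the single slowest frame in the run. A sketch (illustrative only):

```python
def min_fps(frame_times_ms):
    """Minimum instantaneous frame rate, determined by the slowest single frame."""
    return 1000.0 / max(frame_times_ms)

# Slowest frame took 25 ms, so the run dipped to 40 FPS
print(min_fps([12.0, 25.0, 16.0]))  # 40.0
```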

GRID: Autosport on MSI R9 290X Gaming LE 4GB ($380)

GRID: Autosport on ASUS GTX 980 Strix 4GB ($560)

Middle-Earth: Shadow of Mordor

The final title in our testing is another test of system performance with the open-world action-adventure title Shadow of Mordor. Produced by Monolith using the LithTech Jupiter EX engine and numerous detail add-ons, SoM aims for a large degree of detail and complexity, despite having been cut down from the original plans. The main story was written by the same writer as Red Dead Redemption, and the game received Zero Punctuation's Game of the Year award for 2014.

For testing purposes, SoM offers a dynamic screen resolution setting, allowing us to render at high resolutions that are then scaled down to the monitor. As a result, we run several tests using the in-game benchmark. For low end graphics we test at 720p with low settings, whereas mid and high end graphics get 1080p Ultra. The top graphics test is also run at 3840x2160 with Ultra settings, and we test two cards at 4K where possible.
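Dynamic resolution of this kind is just supersampling: the engine renders to an internal target larger than the monitor by some scale factor, then downsamples. A minimal sketch of the arithmetic (function name and per-axis scale convention are our own assumptions, not the game's API):

```python
def render_resolution(monitor_w, monitor_h, scale):
    """Internal render target for a given per-axis resolution scale.
    The engine renders at this size, then downsamples to the monitor."""
    return (round(monitor_w * scale), round(monitor_h * scale))

# A 1080p monitor with a 2.0x per-axis scale renders internally at 4K
print(render_resolution(1920, 1080, 2.0))  # (3840, 2160)
```

This is why the setting is useful for GPU testing: it decouples the rendering load from the attached display.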

Shadow of Mordor on MSI R9 290X Gaming LE 4GB ($380)

Shadow of Mordor on 2x MSI R9 290X Gaming LE 4GB ($380)

Shadow of Mordor on ASUS GTX 980 Strix 4GB ($560)

Shadow of Mordor on 2x ASUS GTX 980 Strix 4GB ($560)


477 Comments


  • halcyon - Thursday, August 6, 2015 - link

    Remember the time when you could OC your CPU to a 50% or even 100% higher frequency?

    Or the time when every ~1.6-year cycle would bring you at least a 50% performance increase?

    Those times are over.

    I'm still running 4-core(8HT)/tri-channel/SATA3/USB3 platform from 2009.

    I see very little reason to upgrade, other than tinkering and spending time.

    The money is burning in my hand. Intel is refusing to come out with something that is really worthwhile.

    Ah well, maybe Skylake-E or maybe Cannonlake-E, or maybe...

    I've been waiting to upgrade for a long time (and have upgraded my GPU several times, as there at least I get some bang for my buck).
  • Bambooz - Friday, August 7, 2015 - link

    I remember my Core 2 Duo E4300 (1.8GHz) that ran at 3.2GHz (400MHz FSB *8) for most of its life till I replaced it with a Core 2 Quad Q6600 (2.4GHz) that ran at 3.6GHz (400MHz FSB *9). Those were the days when OCing was actually a fun thing to do.. till Intel fucked everyone over by making you pay extra for CPUs that can be OCed, or not allowing any meaningful OC at all.
  • watzupken - Thursday, August 6, 2015 - link

    Come to think of it, the comparison of the i7 processors actually puts the Skylake i7-6700K in an unfavorable position. Honestly, I am more keen to get a Broadwell i7-5775C if I am looking to upgrade. The significantly better graphics, lower TDP and L4 cache seem like a better deal to me. Clock speed is no big deal, since the i7-5775C is unlocked for overclocking if I am not mistaken.

    I think Intel just messed themselves up by launching the desktop variant of Broadwell, with its Crystal Well eDRAM graphics, just ahead of Skylake.
  • yhselp - Thursday, August 6, 2015 - link

    Thank you for the review. You place a lot of emphasis on gaming and overclocking, value for money, and upgrading from SB, and rightly so; however, there are no gaming benchmarks with an overclocked Skylake CPU to illustrate that point -- it would be particularly interesting to see how an i5-6600K @ 4.5GHz (assuming it can reach that) fares, as that would be most representative of this market.

    All that Sandy Bridge gamers see now is a 0.9fps to 5.8fps (excluding GRID) improvement which can't possibly justify the price of a new CPU, motherboard and memory.
  • MiSt77 - Thursday, August 6, 2015 - link

    I would like to argue that it's a little bit too early for the authors' assertion that Skylake only gives meagre performance improvements over Broadwell (and earlier processors).

    Considering that Skylake does in fact feature one big, if not ground breaking, improvement in that (with AVX-512) programmers now have thirty-two¹ 512-bit registers at their disposal, I think Skylake-optimized software should - at least for algorithms amenable to SIMD optimizations - be able to deliver considerable performance improvements.

    In this regard it would be interesting to know, if any of the tested software already comes with proper AVX-512 support (or not, as I suppose) - regrettably, there is not even a single mention of this in the benchmark section ...

    ¹ That is, twice as many as before, which also should help optimizing compilers, as it reduces register pressure.
  • SuperVeloce - Thursday, August 6, 2015 - link

    Um, no. As far as we know, only xeons will get avx-512 enabled.
  • MiSt77 - Thursday, August 6, 2015 - link

    Thank you for this factual correction!

    To my dismay I have to admit that you are right: AVX-512 will only be offered on (some?) Xeon models. As I'm one of those who were waiting for Skylake specifically because of AVX-512, I find this very disappointing: thanks for ruining my day, Intel.

    On the other hand, with the E3-1535M v5 and E3-1505M v5 two (mobile) Xeons were announced recently; maybe my dream of an (AVX-512 enabled) notebook can still become a reality ...
  • Da W - Thursday, August 6, 2015 - link

    Yeee! I'll be able to keep my i7 4770k for at least a decade without upgrading!
    Time to spend that money on a surface pro 4, a new phone, hololens, whatever else, but my desktop will remain usable!!
    Big change from the 90s :)
  • TheGame21x - Thursday, August 6, 2015 - link

    My Core i5 2500k is still chugging along nicely at 4.2 GHz so I think this'll be another generation I'll be skipping. The cost of buying a new motherboard and RAM alongside the new CPU is a bit more than I'm willing to spend as of now given the performance increase.
  • sweeper765 - Thursday, August 6, 2015 - link

    Another Sandy Bridge owner here (2500k @ 4.6GHz 1.27v load).
    I've been waiting for a worthwhile upgrade. This is close but i'm not yet convinced.

    The ipc improvements and new instructions are attractive but there are some negative points too:
    - hotter, always hotter. Why no temperature analysis? I see from other reviews temps in the high 80s with water cooling setups. Is this what passes as normal these days?
    - need to buy DDR4 besides the CPU and motherboard. I see there is a strong push towards this, despite it having no real benefit relative to DDR3 (same bandwidth at the same frequency). Basically it's a kind of high-frequency DDR3L. Please don't say it saves power; it's only a few watts less than DDR3 at best.
    - why is stock voltage so high for 14nm technology? It's higher than my OC voltage on a 32nm, 4-year-old CPU. Power consumption under load is also pretty high.
    - I was hoping for AVX3
    - cannot install Win7 from USB? Are you kidding me? I don't have an optical drive and have done fine without one for years
    - price goes up from generation to generation: the 2600K launched at $317, the 6700K at $350
    - connectivity-wise I would have liked to see more USB ports on the back panel and more internal SATA, especially with SATA Express consuming 2 standard ports
