Generational Tests on the i7-6700K: Gaming Benchmarks on High End GPUs

Alien: Isolation

If first person survival mixed with horror is your sort of thing, then Alien: Isolation, based on the Alien franchise, should be an interesting title. Developed by The Creative Assembly and released in October 2014, Alien: Isolation has won numerous awards, from Game of the Year nods to Best Horror titles and places in several top 10/25 lists, racking up over a million sales by February 2015. Alien: Isolation uses a custom-built engine which includes dynamic sound effects and should be fully multi-core enabled.

For low end graphics, we test at 720p with Ultra settings, whereas for mid and high range graphics we bump this up to 1080p, taking the average frame rate as our marker with a scripted version of the built-in benchmark.
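For readers who want to replicate this kind of scripted run, the sketch below shows one way to automate repeated passes of a built-in benchmark from Python. Everything game-specific in it is a hypothetical placeholder (the executable path, the -benchmark flag, and the log format); it is not Alien: Isolation's actual command-line interface, just the general shape of the automation.

```python
# Minimal sketch of automating a built-in benchmark, assuming a game that
# accepts a "run benchmark" flag and writes an average-FPS line to a log.
# The paths, the -benchmark flag, and the log format are hypothetical.
import re
import subprocess
from pathlib import Path

GAME_EXE = Path(r"C:\Games\Example\game.exe")   # hypothetical install path
LOG_FILE = Path(r"C:\Games\Example\bench.log")  # hypothetical log location

def run_benchmark(runs: int = 3) -> list[float]:
    """Run the built-in benchmark several times and collect average FPS."""
    results = []
    for _ in range(runs):
        subprocess.run([str(GAME_EXE), "-benchmark"], check=True)
        text = LOG_FILE.read_text()
        match = re.search(r"Average FPS:\s*([\d.]+)", text)
        if match:
            results.append(float(match.group(1)))
    return results

if __name__ == "__main__":
    fps = run_benchmark()
    print(f"Runs: {fps}, mean of runs: {sum(fps) / len(fps):.1f} FPS")
```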

Alien: Isolation on MSI R9 290X Gaming LE 4GB ($380)

Alien: Isolation on ASUS GTX 980 Strix 4GB ($560)

Total War: Attila

The Total War franchise moves on to Attila, another title from The Creative Assembly, a stand-alone strategy game set in 395 AD where the main story line puts the gamer in control of the leader of the Huns in a quest to conquer parts of the world. Graphically the game can render hundreds or even thousands of units on screen at once, each with its own individual actions, and it can put even the big cards to work.

For low end graphics, we test at 720p with performance settings, recording the average frame rate. With mid and high range graphics, we test at 1080p with the quality setting. In both circumstances, unlimited video memory is enabled and the in-game scripted benchmark is used.

Total War: Attila on MSI R9 290X Gaming LE 4GB ($380)

Total War: Attila on ASUS GTX 980 Strix 4GB ($560)

Grand Theft Auto V

The highly anticipated iteration of the Grand Theft Auto franchise finally hit the shelves on April 14th, 2015, with both AMD and NVIDIA in tow to help optimize the title. GTA doesn't provide graphical presets; instead it opens the options up to users and pushes even the toughest systems to their limits using Rockstar's Advanced Game Engine. Whether the user is flying high over the mountains with long draw distances or dealing with assorted trash in the city, cranking the settings up to maximum creates stunning visuals but hard work for both the CPU and the GPU.

For our test we have scripted a version of the in-game benchmark, relying only on the final part, which combines a flight scene with an in-city drive-by followed by a tanker explosion. For low end systems we test at 720p on the lowest settings, whereas mid and high end graphics play at 1080p with Very High settings across the board. We record both the average frame rate and the percentage of frames under 60 FPS (i.e., frame times above 16.6 ms).
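For the curious, here is how those two numbers fall out of raw per-frame data: a minimal sketch, assuming the benchmark can dump one frame time in milliseconds per line (an assumption for illustration, not GTA V's actual output format).

```python
# Sketch: derive average FPS and the share of slow frames from frame times.
# Assumes one frame time in milliseconds per entry; 16.6 ms ~= 60 FPS.
def summarize(frame_times_ms: list[float]) -> tuple[float, float]:
    total_ms = sum(frame_times_ms)
    avg_fps = 1000.0 * len(frame_times_ms) / total_ms
    # A frame "under 60 FPS" is one that took longer than 1000/60 ms to render.
    slow = sum(1 for t in frame_times_ms if t > 1000.0 / 60.0)
    pct_under_60 = 100.0 * slow / len(frame_times_ms)
    return avg_fps, pct_under_60

# Example: three fast frames and one 25 ms stutter frame.
avg, pct = summarize([10.0, 12.0, 14.0, 25.0])
print(f"{avg:.1f} FPS average, {pct:.1f}% of frames under 60 FPS")
```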

Grand Theft Auto V on MSI R9 290X Gaming LE 4GB ($380)

Grand Theft Auto V on ASUS GTX 980 Strix 4GB ($560)

GRID: Autosport

No graphics test suite is complete without some input from Codemasters and the EGO engine, which means for this round of testing we turn to GRID: Autosport, the latest iteration in the GRID series of racing titles. As with our previous racing tests, each update to the engine aims to add effects, reflections, detail and realism, with Codemasters making 'authenticity' a main focal point for this version.

GRID's benchmark mode is very flexible, so we created a test race using a shortened version of the Red Bull Ring with twelve cars doing two laps. The car in focus starts last and is quite fast, but usually finishes second or third. For low end graphics we test at 1080p with medium settings, whereas mid and high end graphics get the full 1080p maximum. Both the average and minimum frame rates are recorded.
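A note on the minimum figure: a single hitch can set it, which makes it noisier than the average. The sketch below uses invented frame times to show the raw minimum alongside a windowed minimum, one common way to damp one-off spikes; neither the numbers nor the windowing choice reflect our actual pipeline.

```python
# Sketch: raw minimum FPS vs. a windowed minimum that smooths single hitches.
# Frame times are in milliseconds; the values below are invented.
def min_fps(frame_times_ms: list[float], window: int = 1) -> float:
    """FPS implied by the worst average frame time over any `window` frames."""
    worst_ms = max(
        sum(frame_times_ms[i:i + window]) / window
        for i in range(len(frame_times_ms) - window + 1)
    )
    return 1000.0 / worst_ms

times = [12.0, 13.0, 40.0, 12.5, 13.5]  # one 40 ms hitch in the run
print(f"raw minimum: {min_fps(times):.1f} FPS")        # dominated by the hitch
print(f"5-frame window: {min_fps(times, 5):.1f} FPS")  # smoothed minimum
```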

GRID: Autosport on MSI R9 290X Gaming LE 4GB ($380)

GRID: Autosport on ASUS GTX 980 Strix 4GB ($560)

Middle-Earth: Shadow of Mordor

The final title in our testing is another trial of system performance, the open world action-adventure title Middle-Earth: Shadow of Mordor. Produced by Monolith using the LithTech Jupiter EX engine and numerous detail add-ons, SoM goes for detail and complexity to a large extent, despite having been cut down from the original plans. The main story was written by the same writer as Red Dead Redemption, and the game took Zero Punctuation's Game of the Year for 2014.

For testing purposes, SoM offers a dynamic screen resolution setting, allowing us to render at high resolutions that are then scaled down to the monitor, so we run several tests using the in-game benchmark. For low end graphics we test at 720p with low settings, whereas mid and high end graphics get 1080p Ultra. The top graphics test is repeated at 3840x2160, again with Ultra settings, and we also test dual-card configurations at 4K where possible.
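To put rough numbers on what that scaling costs, here is a quick sketch of the pixel arithmetic, using the resolutions from our test settings:

```python
# Sketch: pixel-count arithmetic behind dynamic resolution scaling.
# Rendering above the panel's native resolution and scaling down means each
# displayed pixel is built from more than one rendered pixel.
def shading_cost_ratio(render, display):
    """Rendered pixels per displayed pixel: >1 is supersampling, <1 upscaling."""
    return (render[0] * render[1]) / (display[0] * display[1])

print(shading_cost_ratio((3840, 2160), (1920, 1080)))  # 4.0x the shading work
print(shading_cost_ratio((1280, 720), (1920, 1080)))   # ~0.44x (720p upscaled)
```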

Shadow of Mordor on MSI R9 290X Gaming LE 4GB ($380)

Shadow of Mordor on 2x MSI R9 290X Gaming LE 4GB ($380)

Shadow of Mordor on ASUS GTX 980 Strix 4GB ($560)

Shadow of Mordor on 2x ASUS GTX 980 Strix 4GB ($560)


476 Comments


  • Visual - Wednesday, August 12, 2015 - link

I kinda don't like how you keep repeating the generic benchmark descriptions before each graph. I'd prefer it hidden by default, visible on hover or toggled by clicking an info button or similar, or at the very least formatted a bit differently than the actual article text.

I'd also like it if you had some comments on the actual results, at least where there are peculiarities in them.

    Case in point: Why is the 5775C IGP so much better in some games?
  • mapesdhs - Wednesday, August 12, 2015 - link

Agree re comments on results, e.g. why does the 2600K dip so badly for Shadow of Mordor at 4K with the GTX 770? It doesn't happen with the 980, but if the dip were a VRAM issue at 4K then the 3770K shouldn't be so close to the other CPUs. Weird...
  • wyssin - Wednesday, August 12, 2015 - link

Has anyone published a review comparing the i7-6700K with other CPUs all overclocked to, say, 4.5 GHz? For those who typically run an overclocked system, it's not an apples-to-apples comparison to put the new entry up against the older ones all at stock settings.
    So to make the best-informed decision, it would be very useful to see head-to-head trials at both (1) stock settings and (2) overclocked to a speed they can all reasonably manage (apparently around 4.4 or 4.5 GHz).

    I have the same problems with the Guru3D review and the Gamestar.de review that were mentioned in earlier comments.
  • Oxford Guy - Thursday, August 13, 2015 - link

The key is to pick a speed that the worst overclocking examples would be able to reach with reasonable voltage. That takes the luck of the draw out of the scenario.
  • beatsbyden - Thursday, August 13, 2015 - link

Not much improvement. Only worth the money if you're coming from an i5.
  • Darkvengence - Thursday, August 13, 2015 - link

This lack of CPU power needed in gaming is only temporary. Once you have photorealistic graphics at 4K, you're going to need crazy powerful GPUs, which need feeding by beastly CPUs; our current technology will seem like a dinosaur CPU in comparison. That is of course a fair few years away, but still, one day it will happen. I'm glad current CPUs are not being taxed by today's games, even less so with DX12. Gives my gen 1 MSI Nightblade more life with its 4790K, as you can't change the motherboard; it's all custom front panel connectors and stuff. I used to have an i7 920 and I've got to say that is still a good CPU, especially for single GPU systems. I really like Sandy Bridge though, very impressive for its age. But older CPUs lose out mainly by being tied to older chipsets, so you lose new connectors and bus speeds for hardware.
  • gasparmx - Thursday, November 19, 2015 - link

I think you're kinda wrong; the point of DX12 is to depend less on the CPU. NVIDIA says that in the future you're probably not going to need a beast CPU to play 4K games.
  • djscrew - Friday, August 14, 2015 - link

I'm so disappointed in SB/DDR4. After all this wait, the IPC gains with discrete graphics are negative? WTF, Intel. I guess my Nehalem system will survive another generation, or maybe three? No compelling reason to upgrade. It's such a shame, because I was really looking forward to building a $3k rig. I think I'll shop for a nice 4K panel instead.
  • Ninjawithagun - Friday, August 14, 2015 - link

And just when I was about to purchase the 6600K and a Z170 mini-ITX motherboard as an upgrade to my 4690K and Z97i Plus motherboard... man, am I glad I ran across this article. Saved myself about $600 on a useless upgrade!
  • ES_Revenge - Friday, August 14, 2015 - link

Umm, what the heck happened to the power consumption? In particular with the i7-6700K. It's not really shown thoroughly in this review, but the Broadwell CPUs seem to be more power-efficient. While the 6700K has a half-GHz faster clock speed, it also has a much weaker GPU. To begin with, both the i5 and i7 Skylake parts have higher TDPs than the Broadwell desktop parts, and then the 6700K can actually draw over 100W when loaded. That is above its TDP and also significantly more than its 6600K counterpart, which runs only a few hundred MHz slower. Odd.

I mean, I think we were all waiting for a desktop CPU that didn't have the power constraints the Broadwell CPUs did, but I don't think this is exactly what anyone was expecting. It's like these Skylake CPUs don't just take more power, they do so...for no reason at all. Sure, they're faster, but not hugely so; and, again, their iGPUs are significantly slower than Broadwell's. So their slight speed advantage came at the price of markedly increased power consumption over the previous gen.

That only leads me to the question--WTF? lol. What happened here with the power consumption? And losing the IVR didn't seem to help anything, eh? Skylake is fast and all, but TBH I was more impressed *overall* with Broadwell (and those CPUs you can't even find for sale anywhere, the last time I checked--a few weeks ago). Granted, as we saw in the second part of the Broadwell review, it's not a stellar OCer, but still, overall it seems better to me than Skylake.

It's kind of funny, because when Broadwell DT launched I was thinking about how "Intel is mainly focusing on power consumption these days", meaning I thought they weren't focused enough on the performance of desktop CPUs. But it seems they've just thrown that out the window, and yet the performance isn't anything *spectacular* from these CPUs, so it just seems like a step backwards. It's like with Broadwell they were showing just how much performance they could get from both CPU and iGPU with a minimum of power consumption--and the result was impressive. Here it's like they just forgot about that and said "It's Skylake...it's new and better! Everyone buy it!" Not really that impressive.
