Comparing the Quad Cores: CPU Tests

As a straight-up comparison of Intel's quad-core offerings, here is an analysis of all the results for the Core i7-2600K at stock, the 2600K overclocked, and Intel's final quad-core desktop chip with HyperThreading, the Core i7-7700K.

On our CPU tests, the Core i7-2600K, when overclocked to a 4.7 GHz all-core frequency (and paired with DDR3-2400 memory), offers anywhere from a 10-24% increase in performance over stock settings with Intel's maximum supported memory frequency. Users liked the 2600K because of this – there were sizable gains to be had, and Intel's immediate replacements for the 2600K didn't offer the same level of boost or difference in performance.

However, when compared to the Core i7-7700K, Intel's final quad-core processor with HyperThreading, users were able to get another 8-29% performance on top of that. Depending on the CPU workload, it is easy to see how a user could justify getting the latest quad-core processor and feel the benefits in more modern workloads, such as rendering or encoding, especially given how the gaming market has shifted towards a streaming culture. For more traditional workflows, such as PCMark or our legacy tests, only gains of 5-12% are seen, as these older tests are less sensitive to the improvements made in newer hardware.

As for the Core i7-9700K, which has eight full cores and now sits in the spot of Intel's best Core i7 processor, performance gains are much more tangible, almost doubling in many cases against an overclocked Core i7-2600K (and more than doubling against one at stock).

The CPU case is clear: Intel's last quad core with HyperThreading is an obvious upgrade for a 2600K user, even before overclocking it, and the 9700K, which launched at almost the same price, is definitely an easy sell. The gaming side of the equation isn't so rosy, though.

Comparing the Quad Cores: GPU Tests

Modern games run at higher resolutions and quality settings than when the Core i7-2600K first launched, and bring new physics features, new APIs, and new game engines that can take advantage of the latest advances in CPU instructions as well as CPU-to-GPU connectivity. For our gaming benchmarks, we test four settings configurations on each game (720p, 1080p, 1440p-4K, and 4K+) using a GTX 1080, one of the last generation's high-end gaming cards, and something that a number of Core i7 users might own for high-end gaming.

When the Core i7-2600K was launched, 1080p gaming was all the rage. I don't think I purchased a monitor bigger than 1080p until 2012, and before then I was clan gaming on screens that could have been as low as 1366x768. The point here is that with modern games at older resolutions like 1080p, we do see a sizeable gain when the 2600K is overclocked. A 22% gain in frame rates from a 34% overclock sounds more than reasonable to any high-end focused gamer. Intel only managed to improve on that by 12% over the next few years to the Core i7-7700K, relying mostly on frequency gains. It's not until the 9700K, with more cores and games that actually know what to do with them, that we see another jump in performance.
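As a sanity check on that arithmetic, the figures can be expressed as simple ratios. The 3.5 GHz all-core stock baseline below is an assumption on our part for illustration; the 22% frame-rate uplift is the measured figure quoted above:

```python
# Rough scaling arithmetic for the 2600K overclock figures quoted above.
# The 3.5 GHz all-core turbo baseline is an assumed stock reference point.
stock_ghz = 3.5
oc_ghz = 4.7

overclock_gain = oc_ghz / stock_ghz - 1   # frequency uplift, ~34%
frame_rate_gain = 0.22                    # measured 1080p uplift

# Scaling efficiency: how much of the clock uplift shows up as frames.
scaling = frame_rate_gain / overclock_gain

print(f"Overclock: +{overclock_gain:.0%}")
print(f"Scaling efficiency: {scaling:.0%}")
```

In other words, roughly two thirds of the extra clock speed translated into extra frames at 1080p, which is a healthy return for a GPU-assisted workload.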

However, all those gains are muted at higher resolution settings, such as 1440p. Going from an overclocked 2600K to a brand new 9700K only gives a 9% increase in frame rates in modern games. At an enthusiast 4K setting, the results across the board are almost equal. As resolutions get higher, even with modern physics, instructions, and APIs, the bulk of the workload is still on the GPU, and even the Core i7-2600K is powerful enough for it. There is the odd title where having the newer chip helps a lot more, but it's in the minority.
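One way to picture why the CPU gains shrink as the resolution rises: each frame is gated by whichever of the CPU or GPU takes longer to do its part. A toy model with hypothetical per-frame costs (all millisecond figures below are illustrative, not measured data) reproduces the shape of our results:

```python
# Toy model: frame rate is limited by the slower of CPU and GPU per frame.
# All millisecond figures are illustrative, not measured data.
def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

# Hypothetical GPU cost per frame rises with resolution.
gpu_ms = {"1080p": 8.0, "1440p": 11.0, "4K": 28.0}
# Hypothetical CPU cost per frame for each chip.
cpu_ms = {"2600K": 12.0, "9700K": 8.5}

for res, g in gpu_ms.items():
    old = fps(cpu_ms["2600K"], g)
    new = fps(cpu_ms["9700K"], g)
    print(f"{res}: 2600K {old:.0f} fps, 9700K {new:.0f} fps "
          f"(+{new / old - 1:.0%})")
```

With these numbers the model gives a large uplift at 1080p (CPU-bound on the old chip), a single-digit gain at 1440p, and nothing at 4K, where both chips wait on the GPU.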

That is, at least on average frame rates. Modern games and modern testing methods now test percentile frame rates, and the results are a little different.

Here the results look a little worse for the Core i7-2600K and a bit better for the Core i7-9700K, but on the whole the broad picture is the same for percentile results as it is for average frame rates. In the individual results, we see some odd outliers, such as Ashes of the Singularity, where a stock 2600K was 15% down on percentiles at 4K, yet the 9700K was only 6% ahead of an overclocked 2600K. As with the average frame rates, though, it is really title dependent.
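For readers unfamiliar with the metric, a percentile result is computed from the per-frame time log rather than the overall average, which is why it exposes stutter that an average hides. A minimal sketch, using made-up frame times:

```python
# Minimal sketch of average vs 95th-percentile frame-rate reporting.
# The frame times (in ms) are made up for illustration.
frame_times = [16.0] * 95 + [40.0] * 5   # mostly smooth, a few stutters

# Average fps: total frames over total time.
avg_fps = 1000.0 / (sum(frame_times) / len(frame_times))

# 95th-percentile frame time: the frame time that 95% of frames beat
# (simple sort-and-index method, adequate for a large sample).
worst_5pct = sorted(frame_times)[int(len(frame_times) * 0.95)]
percentile_fps = 1000.0 / worst_5pct

print(f"Average: {avg_fps:.1f} fps")
print(f"95th percentile: {percentile_fps:.1f} fps")
```

The average here still looks close to 60 fps, while the percentile figure drops to 25 fps, reflecting the stutters the player actually feels.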

Comments

  • XXxPro_bowler420xXx - Saturday, May 11, 2019 - link

    I am running a 3570 as my computer here at school, with a $50 1050 Ti and 16 GB of RAM.
  • godrilla - Friday, May 10, 2019 - link

    I would love to see a 6-core i7-980XE overclocked to 4.3 GHz with 12 GB of 2 GHz triple-channel memory vs all these quad cores. < my rig. Playing all games at max settings, for example Shadow of the Tomb Raider at max settings at 3440x1440 getting 60 fps; G-Sync helps with frame variance smoothness. Metro Exodus at extreme settings plus tessellation, PhysX and HairWorks, getting an average 60 fps at the same resolution with a 1080 Ti FTW3.
  • Ratman6161 - Friday, May 10, 2019 - link

    "there is only one or two reasons to stick to that old system, even when overclocked. The obvious reason is cost"

    I have to disagree with that statement. My reason for keeping my trusty 2600K running is that it's a wonderful "hand-me-down" system. I was running my 2600K as my primary system right up until I went Ryzen. At that point, my old system became my wife's new system. I toned down the overclock to 4.2 GHz so I could slap a cheap but quiet cooler on it, and for her uses (MS Office, email, web browsing, etc.) it is a great system and plenty fast enough. My old Samsung 850 EVO SSD went along with it, since in my newer system I've got a 960 EVO, but other than gaining that SSD along the way, it's had no significant upgrades since 2011.

    For someone who could easily get by on something like an i3-8100 or i5-7xxx, the 2600K hand-me-down is a great option.
  • WJMazepas - Friday, May 10, 2019 - link

    My main PC still has an i5-760, so I believe it's time to upgrade.
  • xrror - Friday, May 10, 2019 - link

    lol indeed!
  • HStewart - Friday, May 10, 2019 - link

    Personally I have not owned or cared for a desktop since my dual Xeon 5150; it's 12 years old, and for a while, until the later i7s came out, it was the fastest machine around. Back then I was into 3D rendering and even built a render farm - also serious into games with the latest NVidia graphics cards.

    But since then I went mobile, with less graphics, and I try to play fewer games, but I still like to get out Command & Conquer and Company of Heroes - never much a first-person shooter. So for me a higher-end laptop does me fine - for the longest time the Lenovo Y50 was good, but Lenovo for me had build issues... When the Dell XPS 13 2-in-1 came out it was great for some things; the portability was great, and I still use it because it is nice to travel with documents and such. But I wanted a faster machine, so when the Dell XPS 15 2-in-1 was announced, I jumped on the bandwagon almost fully loaded. The 4K screen is probably a waste on it because I am getting older; graphics is slightly better than the 3-year-old Y50, but the CPU is extremely fast compared to the Lenovo. Some older games have trouble with the GPU, and professional graphics apps like Vue 2016 have trouble with the GPU too.

    But I will be 60 in a couple of years and need to grow up from games.

    I think my next computer is going to be something different: I want a portable, always-online cellular device. I thought about an iPad with cellular, but I think I am going to wait for a Lakefield device - a small device with long battery life, always connected. My experience with iOS and Android over time is always the same thing - great when first starting out, but later the battery drains and performance drops with OS upgrades - which, if you think about it, is no different than with Windows. Even though I am a technical person, I was never a Linux person - it just does not fit with me, even when I try it.
  • eva02langley - Friday, May 10, 2019 - link

    GTA V is 5 years old... your game suite is horrible. At this point, I would just do a 3DMark benchmark.
  • Qasar - Saturday, May 11, 2019 - link

    eva02... the games they test... I don't even play them...
  • eastcoast_pete - Friday, May 10, 2019 - link

    Thanks Ian! The most disappointing aspect of the newer Intel i7s vs. Sandy Bridge is the underwhelming progress on performance/Wh. How much more efficiency did the multiple changes in manufacturing and design really gain? Judging by the numbers, not that much. The amazing thing about Sandy Bridge was that it did boost performance, and did so at significantly improved perf/Wh. At this moment, we seem to be back to Athlon vs. P4 days: the progress is most noticeable with the chips that say "AMD" on them.
  • Qwertilot - Friday, May 10, 2019 - link

    In general, I think they did gain a lot of perf/Wh, just not at the very top end. They've been pushing the clocks on the recent i7s incredibly hard.
