Comparing the Quad Cores: CPU Tests

As a straight-up comparison of what Intel offered in terms of quad cores, here’s an analysis of all the results for the 2600K, the 2600K overclocked, and Intel’s final quad-core with HyperThreading chip for the desktop, the 7700K.

On our CPU tests, the Core i7-2600K when overclocked to a 4.7 GHz all-core frequency (and with DDR3-2400 memory) offers anywhere from a 10-24% increase in performance over stock settings with Intel’s maximum supported memory frequency. Users liked the 2600K because of this – there were sizable gains to be had, and Intel’s immediate replacements for the 2600K didn’t offer the same level of boost or difference in performance.

However, when compared to the Core i7-7700K, Intel’s final quad-core processor with HyperThreading, users were able to get another 8-29% performance on top of that. Depending on the CPU workload, it is easy to see how a user could justify getting the latest quad-core processor and feel the benefits in more modern workloads, such as rendering or encoding, especially given how the gaming market has shifted toward a streaming culture. For the more traditional workflows, such as PCMark or our legacy tests, gains of only 5-12% are seen, as these older tests are no longer so relevant to modern hardware.
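
One subtlety worth spelling out: stacked percentage gains like these compound multiplicatively, not additively. A minimal sketch in Python, using the upper bounds of the ranges quoted above as illustrative inputs rather than results from any single benchmark:

```python
# Relative performance gains compound multiplicatively, not additively.
# Inputs are the upper bounds of the ranges quoted above (illustrative only).

stock_to_oc = 1.24    # 2600K stock -> 4.7 GHz overclock (+24%)
oc_to_7700k = 1.29    # overclocked 2600K -> 7700K (+29%)

total = stock_to_oc * oc_to_7700k
print(f"2600K stock -> 7700K: +{(total - 1):.0%}")  # +60%, not +53%
```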

As for the Core i7-9700K, which has eight full cores and now sits in the spot of Intel’s best Core i7 processor, performance gains are much more tangible: almost double in a lot of cases against an overclocked Core i7-2600K (and more than double against one at stock).

The CPU case is clear: Intel’s last quad core with HyperThreading is an obvious upgrade for a 2600K user, even before it is overclocked, and the 9700K, which launched at almost the same price, is definitely an easy sell. The gaming side of the equation isn’t so rosy, though.

Comparing the Quad Cores: GPU Tests

Modern games run at higher resolutions and quality settings than when the Core i7-2600K was first launched, and they bring new physics features, new APIs, and new engines that can take advantage of the latest advances in CPU instructions as well as CPU-to-GPU connectivity. For our gaming benchmarks, we test four sets of settings on each game (720p, 1080p, 1440p-4K, and 4K+) using a GTX 1080, one of last generation’s high-end gaming cards, and something a number of Core i7 users might own for high-end gaming.

When the Core i7-2600K was launched, 1080p gaming was all the rage. I don’t think I purchased a monitor bigger than 1080p until 2012, and before then I was clan gaming on screens that could have been as low as 1366x768. The point here is that with modern games at older resolutions like 1080p, we do see a sizeable gain when the 2600K is overclocked. A 22% gain in frame rates from a 34% overclock sounds more than reasonable to any high-end focused gamer. Intel only managed to improve on that by 12% over the next few years to the Core i7-7700K, relying mostly on frequency gains. It’s not until the 9700K, with more cores and running games that actually know what to do with them, that we see another jump up in performance.
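
For reference on that 34% figure, it follows from the all-core frequencies, and dividing the frame-rate gain by the clock gain gives a rough scaling efficiency. A quick sketch, assuming a 3.5 GHz stock all-core turbo for the 2600K (that stock figure is an assumption here):

```python
# Rough frequency-scaling check: how much of the extra clock shows up
# as frames. The 3.5 GHz stock all-core turbo is an assumption.

stock_ghz, oc_ghz = 3.5, 4.7
clock_gain = oc_ghz / stock_ghz - 1   # ~0.34, the 34% overclock
fps_gain = 0.22                       # the 22% 1080p gain quoted above

print(f"clock gain: {clock_gain:.0%}")                     # 34%
print(f"scaling efficiency: {fps_gain / clock_gain:.0%}")  # ~64%
```

In other words, roughly two thirds of the extra clock speed showed up as extra frames at 1080p.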

However, all those gains are muted at higher resolution settings, such as 1440p. Going from an overclocked 2600K to a brand new 9700K only gives a 9% increase in frame rates in modern games. At an enthusiast 4K setting, the results across the board are almost equal. As resolutions get higher, even with modern physics, instructions, and APIs, the bulk of the workload still falls on the GPU, and even the Core i7-2600K is powerful enough for it. There is the odd title where having the newer chip helps a lot more, but it’s in the minority.

That is, at least on average frame rates. Modern games and modern testing methods now measure percentile frame rates as well, and those results are a little different.
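
For readers unfamiliar with the metric: percentile figures are usually derived from per-frame render times rather than averaged FPS, so they catch stutter that an average hides. A minimal sketch of how a 95th-percentile frame rate might be computed (the function and sample data here are hypothetical, not our actual test harness):

```python
# Sketch: turning a log of per-frame render times (milliseconds) into
# an average frame rate and a 95th-percentile frame rate.
# The sample frame times below are made up for illustration.

def percentile_fps(frame_times_ms, pct=95):
    """Frame rate at the given frame-time percentile (the slow tail)."""
    times = sorted(frame_times_ms)
    idx = min(len(times) - 1, int(len(times) * pct / 100))
    return 1000.0 / times[idx]

frame_times = [16.7, 16.9, 17.1, 16.5, 33.4, 16.8, 17.0, 16.6]
avg_fps = 1000.0 / (sum(frame_times) / len(frame_times))
print(f"average: {avg_fps:.1f} fps")                              # ~53 fps
print(f"95th percentile: {percentile_fps(frame_times):.1f} fps")  # ~30 fps
```

The single 33 ms frame barely moves the average, but the percentile figure exposes it, which is exactly why percentile results can reorder the standings.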

Here the results look a little worse for the Core i7-2600K and a bit better for the Core i7-9700K, but on the whole the broad picture is the same for percentile results as it is for average frame rates. The individual results show some odd outliers: in Ashes of the Singularity, a stock 2600K was 15% down on percentiles at 4K, yet the 9700K was only 6% ahead of an overclocked 2600K. As with the average frame rates, it is really title dependent.

Comments

  • Fallen Kell - Saturday, May 11, 2019

    Yeah. In many cases it is very sad when you look at this article. It has effectively taken a decade to finally get to the point that there is a worthwhile upgrade in CPU performance. Prior to this, we were seeing CPU performance double every couple of years. A case in point is to look at an article from 2015 that did a comparison of CPUs over the preceding decade (i.e. ~2005 - 2015): over that timeframe you saw a 6x increase in memory bandwidth and an 8x - 10x CPU computational increase. But looking from 2011 to 2019 we barely see a doubling in performance (and then only in select use cases), while at the same time the price of said CPU is 25% more. It is no wonder people have not been upgrading. Why spend $1000 on a new CPU, motherboard, and RAM to only gain 25-40% performance? We are only now hitting the point where people start to consider it worth that price.

    That all being said, it would have been nice to have included at least 1 AMD CPU in these benchmarks for comparison. Sure, we can go to the review bench to get it, but having it here for some easy comparison would have been nice, especially given how Intel seems to have decided to stop innovating and purposely take a dive (almost as if they feared regulatory action from the USA/EU for effectively being a "monopoly", and to avoid such action decided to simply stop releasing anything really competitive until AMD was able to get their act together again and have a competitive CPU...).
  • Zoomer - Thursday, June 13, 2019

    Funny thing is, the last time this happened, Intel needed AMD to give it a kick in the nuts. Maybe this time too?
  • mode_13h - Saturday, May 11, 2019

    I figured I'd wait for PCIe 4.0 to upgrade. With Zen 2, I guess my chance is here.
  • Wardrop - Saturday, May 11, 2019

    Yep, same. Hoping to replace my 3770K with Zen 2. Looking to downsize my chassis too with a Sliger case. Hopefully Zen 2 doesn't disappoint.
  • Marlin1975 - Friday, May 10, 2019

    Still running my 3770, as I have not seen a large enough difference to justify upgrading. But Zen+ had me itching, and Zen 2 is what will finally replace my 3770/Z77 system.

    That, and it's not just about the CPU but also the upgrades in chipset/USB/etc. parts.
  • gambiting - Friday, May 10, 2019

    Still have a 2600 (not even the K model) running in a living room PC, paired with a GTX 1050 Ti and an SSD – it runs everything without any issues. I've been playing Sekiro and The Division 2 on it without any problems, locked at 1080p@60fps. Progress is all good and fine, but these "old" CPUs have loads of life in them still.
  • Potatooo - Wednesday, May 15, 2019

    Me too. I haven't had much time for video games in the last couple of years to justify the $$$, but putting a 1050 Ti in an old i7-2600 office PC has kept me happy for the last 18 months or so (e.g. 55-ish fps in Far Cry 5 ND at medium/1080p, 70+ fps in Forza 7/FH4 at high/1080p). I'm about to try a second-hand RX 580, which will probably be a bridge too far, but at least I'll get FreeSync.
  • GNUminex_l_cowsay - Friday, May 10, 2019

    Dare I look at the Civ 6 benchmarks, knowing they are pointless? What sort of idiot tests CPU performance in Civ 6 using FPS rather than turn times? I don't know who specifically, but they write for AnandTech.
  • RealBeast - Friday, May 10, 2019

    Certainly not a Civ 6 player. ;)
  • Targon - Monday, May 13, 2019

    I made a similar comment. Civ 6 also added a new benchmark with Gathering Storm, one that is even more resource intensive. Turn length will show what your CPU can do, without GPU issues getting in the way.
