Upgrading from an Intel Core i7-2600K: Yes

Back in 2010-2011, life was simple. We were reveling in benchmarks like CineBench R10 and SuperPi, and no-one had even thought of trying to transcode video on any sort of scale. In 2019, the landscape has changed: gamers gonna stream, designers gonna design, scientists gonna simulate, and emulators gonna emulate. The way that software is designed has changed substantially as well, with more care taken over memory allocations, multiple cores and threads, and fast storage. Compilers are smarter too, and all the optimizations for the older platforms are already in those code bases.

We regularly speak to CPU architects that describe how they build new processors for the next generation: by analyzing modern workload requirements. In a future of machine learning, for example, we’re now seeing hardware on mobile processors dedicated to accelerating neural networks for things like smartphone photography. (It’s interesting that smartphone SoCs today, in day-to-day use, are arguably more diverse than desktops in that regard.)

Ultimately, benchmarks have changed too. What we tested back in 2011 in our Core i7-2600K review was indicative of how people were using their computers then, and in 2019 we are testing how people use their computers today. One expects that the balance of compute, storage, and resources has shifted since then, and as a result older parts may perform better or worse than expected.

For this review, I wanted to compare an eternal idol of enthusiast desktop computing with its more modern counterparts. The Sandy Bridge Core i7-2600K, released in 2011, was an enthusiast's dream: significantly faster than the previous generation, priced right, and capable of a substantial performance boost when overclocked. That overclocking headroom was the crux of its staying power: if users were seeing 20-40%+ extra performance from an overclock and some fast memory, then Intel's subsequent years of baseline 3-8% generational gains were scoffed at, and users did not upgrade.


It's a Core i7 Family Photo

The Core i7-2600K was a quad-core processor with Hyper-Threading. Intel launched five more families of Core i7 that were also quad core with Hyper-Threading: the Core i7-3770K, i7-4770K, i7-5775C, i7-6700K, and i7-7700K, before it moved up to six cores (with HT) in the Core i7-8700K and eight cores (without HT) in the Core i7-9700K. Each of those quad-core generations offered slightly more frequency, sometimes new instructions, sometimes better transistor density, sometimes better graphics, and sometimes a better platform.

Features like new instructions, better integrated graphics, or a better platform are valid reasons to push an upgrade, even if the raw performance gain in most tasks is minor. Moving to PCIe 3.0 for graphics, moving to DDR4 to access higher capacity memory modules, or shifting to NVMe storage with more diverse chipset support all helped users who bypassed the popular 2600K.

In this review, we tested the Core i7-2600K at Intel’s recommended release settings (known as ‘stock’), and an overclocked Core i7-2600K, pushing up from 3.5 GHz all-core to 4.7 GHz all-core, and with faster memory. For comparison to newer CPUs, we chose the Core i7-7700K, Intel’s final Core i7 quad-core for the desktop, representing the best Intel has offered in a quad-core with HT package, and the Core i7-9700K, the latest high-end Core i7 processor.

The results from our testing paint an interesting picture, and as a result so do our conclusions. Our CPU testing was quite clear: in almost every test, overclocking the 2600K only halved the deficit between the stock 2600K and the stock 7700K. Whenever the overclock gave 20% extra performance, the 7700K was another 20% ahead. The only outliers were the AVX2-capable benchmarks, where the 7700K had a massive lead thanks to its AVX2 support, which Sandy Bridge lacks. In all our CPU tests, the Core i7-9700K by comparison blew them all out of the water.
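The "halving the deficit" relationship is easy to sketch with some illustrative numbers (these are toy values for the arithmetic, not measured scores):

```python
# Toy, normalized scores to illustrate the scaling, not measured results.
# Assume the stock 7700K is ~40% ahead of the stock 2600K at stock settings.
stock_2600k = 100                  # normalized baseline score
stock_7700k = 140                  # hypothetical 40% lead at stock
oc_2600k = stock_2600k * 1.20      # a ~20% overclock gain -> 120

deficit_stock = stock_7700k - stock_2600k  # gap at stock: 40 points
deficit_oc = stock_7700k - oc_2600k        # gap after overclock: 20 points

# The overclock closes only half of the original gap...
print(deficit_oc / deficit_stock)          # -> 0.5

# ...and the 7700K remains ~17% ahead even of the overclocked 2600K.
print(stock_7700k / oc_2600k)
```

With these toy numbers, a 20% overclock leaves the 7700K still well ahead, which matches the pattern we saw across the CPU suite.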

For anyone still using a Core i7-2600K for CPU-intensive work, even when overclocked, it's time to feel the benefits of an upgrade.


The GPU testing had a different result. From 2011 to 2019, enthusiast gamers have moved on from 1080p in one of two directions: higher resolutions or higher frame rates. Which direction depends on the type of game played, and modern game engines are geared up to cater to both, optimized for the latest hardware with the latest APIs.

For users going up in resolution, to 4K and beyond, the overclocked i7-2600K performs just as well as the latest Core i7-9700K. The stock 2600K is a little behind, but not noticeably so unless you drill down into specific titles. The overclocked Core i7-2600K is still a great chip for high-resolution 60 FPS gaming.

For users staying at 1080p (or 1440p) but looking at high frame rates to drive higher refresh rate displays, there is more of a tangible benefit here. Newer games on modern APIs can use more threads, and the higher number of draw calls required per frame (and for more frames) can be driven better with the latest Core i7 hardware. The Core i7-7700K gives a good boost, which can be bettered with the full eight cores of the Core i7-9700K. Both of these chips can be overclocked too, which we’ve not covered here.

The Bottom Line

Back during 2011 and 2012, I was a competitive overclocker, and my results were built around using the Core i7-2600K as the base for pushing my CPUs and GPUs to the limits. The day-to-day performance gains in my CPU and GPU tests were tangible, not only for work but also for gaming at 1080p.

Fast forward to 2019, and there are only one or two reasons to stick with that old system, even overclocked. The obvious one is cost: if you can't afford an upgrade, that's a very legitimate reason not to, and I hope you're still having fun with it. The second reason not to upgrade is if the only thing you do, as an enthusiast gamer with a modern graphics card, is game at 4K.

There are a million other reasons to upgrade, even to the Core i7-7700K: anything CPU related, memory support (capacity and speed), storage support, newer chipsets, newer connectivity standards, AVX2, PCIe 3.0, multi-tasking, gaming and streaming, NVMe. Or if you’re that way inclined, the RGB LED fad of modern components.

Back in my day, we installed games from DVDs and used cold cathodes for RGB.


Picture from 2006? – Battlefield 2 on a CRT, running an ATI X1900XTX on an AMD Athlon 3400+.


  • Fallen Kell - Saturday, May 11, 2019 - link

    Yeah. In many cases it is very sad when you look at this article. It has effectively taken a decade to finally get to the point that there is a worthwhile upgrade in CPU performance. Prior to this, we were seeing CPU performance double every couple of years. A case in point is to look at an article from 2015 that did a comparison of CPUs over the last decade (i.e. ~2005 - 2015): over that timeframe you saw a 6x increase in memory bandwidth and an 8x - 10x increase in CPU computation. But looking from 2011 to 2019 we barely see a doubling in performance (and then only in select use cases), while at the same time the price of said CPU is 25% more. It is no wonder why people have not been upgrading. Why spend $1000 for a new CPU, motherboard, and RAM to only gain 25-40% performance? We are just finally hitting the point now where people start to consider it worth that price.

    That all being said, it would have been nice to have included at least one AMD CPU in these benchmarks for comparison. Sure, we can go to the review bench to get it, but having it here for some easy comparison would have been nice, especially given how Intel seems to have decided to stop innovating and purposely take a dive (almost as if they feared regulatory action from the USA/EU for effectively being a "monopoly", and to avoid such action decided to simply stop releasing anything really competitive until AMD was able to get their act together again and have a competitive CPU...).
  • Zoomer - Thursday, June 13, 2019 - link

    Funny thing is, last time it happened, Intel needed AMD to give it a kick in the nuts. Maybe this time too?
  • mode_13h - Saturday, May 11, 2019 - link

    I figured I'd wait for PCIe 4.0 to upgrade. With Zen 2, I guess my chance is here.
  • Wardrop - Saturday, May 11, 2019 - link

    Yep, same. Hoping to replace my 3770k with Zen 2. Looking to down-size my chassis too with a Sliger case. Hopefully Zen 2 doesn't disappoint.
  • Marlin1975 - Friday, May 10, 2019 - link

    Still running my 3770 as I have not seen a large enough difference to upgrade. But Zen+ had me itching, and Zen 2 is what will finally replace my 3770/Z77 system.

    That and it's not just about the CPU, but also the upgrades in chipset/USB/etc... parts.
  • gambiting - Friday, May 10, 2019 - link

    Still have a 2600 (not even the K model) running in a living room PC, paired with a GTX 1050 Ti and an SSD - runs everything without any issues. Been playing Sekiro and Division 2 on it without any problems, locked 1080p@60fps. Progress is all good and fine, but these "old" CPUs have loads of life in them still.
  • Potatooo - Wednesday, May 15, 2019 - link

    Me too. I haven't had much time for video games the last couple of years to justify $$$, but putting a 1050 Ti in an old i7-2600 office PC has kept me happy the last 18 months or so (e.g. 55-ish fps in Far Cry 5 ND at medium/1080p, 70+ fps in Forza 7/FH4 at high/1080p). I'm about to try a S/H RX 580, which will probably be a bridge too far, but at least I'll get FreeSync.
  • GNUminex_l_cowsay - Friday, May 10, 2019 - link

    Dare I look at the Civ 6 benchmarks, knowing they are pointless? What sort of idiot tests CPU performance in Civ 6 using FPS rather than turn times? I don't know who specifically, but they write for AnandTech.
  • RealBeast - Friday, May 10, 2019 - link

    Certainly not a Civ 6 player. ;)
  • Targon - Monday, May 13, 2019 - link

    I made a similar comment. Civ 6 also added a new benchmark with Gathering Storm that is even more resource intensive. Turn length will show what your CPU can do without GPU issues getting in the way.
