Upgrading from an Intel Core i7-2600K: Yes

Back in 2010-2011, life was simple. We were relishing benchmarks like CineBench R10 and SuperPI, and no one had even thought of trying to transcode video on any sort of scale. In 2019, the landscape has changed: gamers gonna stream, designers gonna design, scientists gonna simulate, and emulators gonna emulate. The way that software is designed has changed substantially as well, with more care taken for memory allocations, multiple cores and threads, and with fast storage in mind. Compilers are smarter too, and all the optimizations for the older platforms are in those code bases.

We regularly speak to CPU architects that describe how they build new processors for the next generation: by analyzing modern workload requirements. In a future of machine learning, for example, we’re now seeing hardware on mobile processors dedicated to accelerating neural networks for things like smartphone photography. (It’s interesting that smartphone SoCs today, in day-to-day use, are arguably more diverse than desktops in that regard.)

Ultimately, benchmarks have changed too. What we tested back in 2011 in our Core i7-2600K review was indicative of the way people were using their computers then, and in 2019 we are testing how people are using their computers today. The balance of compute, storage, and memory resources that made sense back then has shifted since, and as a result, older parts may perform better or worse than expected.

For this review, I wanted to compare an eternal idol of enthusiast desktop computing with its more modern counterparts. The Sandy Bridge Core i7-2600K, released in 2011, was an enthusiast's dream: significantly faster than the previous generation, priced right, and capable of a substantial performance boost when overclocked. That overclocking headroom was the crux of its staying power: if users were seeing 20-40%+ extra performance from an overclock and some fast memory, then Intel's subsequent years of baseline 3-8% generational gains were easy to scoff at, and users did not upgrade.


It's a Core i7 Family Photo

The Core i7-2600K was a quad-core processor with Hyper-Threading. Intel launched five more families of Core i7 that were also quad core with Hyper-Threading: the Core i7-3770K, i7-4770K, i7-5775C, i7-6700K, and i7-7700K, before it moved up to six cores (with HT) in the i7-8700K and eight cores (no HT) in the i7-9700K. Each of those quad-core generations offered slightly more frequency, sometimes new instructions, sometimes better transistor density, sometimes better graphics, and sometimes a better platform.

Features like new instructions, better integrated graphics, or the platform are valid reasons to push an upgrade, even if the raw performance gain in most tasks is minor. Moving to PCIe 3.0 for graphics, or moving to DDR4 to access higher capacity memory modules, or shifting to NVMe storage with more diverse chipset support all helped users that bypassed the popular 2600K.

In this review, we tested the Core i7-2600K at Intel’s recommended release settings (known as ‘stock’), and an overclocked Core i7-2600K, pushing up from 3.5 GHz all-core to 4.7 GHz all-core, and with faster memory. For comparison to newer CPUs, we chose the Core i7-7700K, Intel’s final Core i7 quad-core for the desktop, representing the best Intel has offered in a quad-core with HT package, and the Core i7-9700K, the latest high-end Core i7 processor.

The results from our testing paint an interesting picture, and as a result so do our conclusions. Our CPU testing was quite clear – in almost every test, overclocking the 2600K only halved the deficit between it and the 7700K measured with both at stock. Wherever the overclock gave 20% extra performance, the 7700K was another 20% ahead. The only exceptions were the AVX2-capable benchmarks, where the 7700K had a massive lead simply because Sandy Bridge has no AVX2 support. In all our CPU tests, the Core i7-9700K by comparison blew them all out of the water.

For anyone still using a Core i7-2600K for CPU testing, even when overclocked, it’s time to feel the benefits of an upgrade.


The GPU testing had a different result. From 2011 to 2019, enthusiast gamers have moved on from 1080p in one of two directions: higher resolutions or higher framerates. Which direction depends on the type of game played, and modern game engines are geared up to cater for both, optimized for the latest hardware with the latest APIs.

For users going up in resolution, to 4K and beyond, the i7-2600K when overclocked performs just as well as the latest Core i7-9700K. The stock 2600K is a little behind, but not overly noticeable unless you drill down into specific titles. But the overclocked Core i7-2600K is still a great chip for high resolution 60 FPS gaming.

For users staying at 1080p (or 1440p) but looking at high frame rates to drive higher refresh rate displays, there is more of a tangible benefit here. Newer games on modern APIs can use more threads, and the higher number of draw calls required per frame (and for more frames) can be driven better with the latest Core i7 hardware. The Core i7-7700K gives a good boost, which can be bettered with the full eight cores of the Core i7-9700K. Both of these chips can be overclocked too, which we’ve not covered here.

The Bottom Line

Back during 2011 and 2012, I was a competitive overclocker, and my results were focused on using the Core i7-2600K as the base for pushing my CPUs and GPUs to the limits. The day-to-day performance gains for any of my CPU or GPU tests were tangible, not only for work but also for gaming at 1080p.

Fast forward to 2019, and there are only one or two reasons to stick with that old system, even when overclocked. The obvious one is cost: if you can't afford an upgrade, then that's a very legitimate reason not to, and I hope you're still having fun with it. The second reason not to upgrade is if the only thing you do, as an enthusiast gamer with a modern graphics card, is game at 4K.

There are a million other reasons to upgrade, even to the Core i7-7700K: anything CPU related, memory support (capacity and speed), storage support, newer chipsets, newer connectivity standards, AVX2, PCIe 3.0, multi-tasking, gaming and streaming, NVMe. Or if you’re that way inclined, the RGB LED fad of modern components.

Back in my day, we installed games from DVDs and used cold cathodes for RGB.


Picture from 2006? – Battlefield 2 on a CRT.
Running an ATI X1900XTX on an AMD Athlon 3400+

Analyzing the Results: Impressive and Depressing?

213 Comments


  • kgardas - Friday, May 10, 2019 - link

    Indeed, it's sad that it took ~8 years to roughly double performance, while in the '90s we got that every 2-3 years. And look at the office tests: we're not there yet, and we probably never will be, as single-thread perf increases are basically dead. The Chromium compile suggests it makes sense to upgrade at all -- for developers; for office users it's nonsense if you consider just the CPU itself.
  • chekk - Friday, May 10, 2019 - link

    Thanks for the article, Ian. I like your summation: impressive and depressing.
    I'll be waiting to see what Zen 2 offers before upgrading my 2500K.
  • AshlayW - Friday, May 10, 2019 - link

    Such great innovation and progress and cost-effectiveness advances from Intel between 2011 and 2017. /s

    Yes, AMD didn't do much here either, but it wasn't for lack of trying. Intel deliberately stagnated the market to bleed consumers of every single cent, and then Ryzen turns up and you get the 6 and now 8 core mainstream CPUs.

    Would have liked to see the 2600K versus Ryzen, honestly. Ryzen 1st gen is around Ivy Bridge/Haswell performance per core in most games, and second gen is Haswell/Broadwell. But as more games get more threaded, Ryzen's advantage will only increase.

    I owned a 2600K and it was the last product from Intel that I ever owned that I truly felt was worth its price. Even now I just can't justify spending £350-400 quid on a hexa core or octa with HT disabled when the competition has unlocked 16 threads for less money.
  • 29a - Friday, May 10, 2019 - link

    "Yes AMD didn't do much here either"

    I really don't understand that statement at all.
  • thesavvymage - Friday, May 10, 2019 - link

    They're saying AMD didn't do much to push the price/performance envelope between 2011 and 2017. Which they didn't, since their architecture until Zen was terrible.
  • eva02langley - Friday, May 10, 2019 - link

    Yeah, you are right... it is AMD's fault and not Intel's, who wanted to make a dime off your back selling you quad cores for life.
  • wilsonkf - Friday, May 10, 2019 - link

    Would be more interesting to add the 8150/8350 to the benchmarks. I ran my 8350 at 4.7 GHz for five years. It's a great room heater.
  • MDD1963 - Saturday, May 11, 2019 - link

    I don't think AMD would have sold as many of the 8350s and 9590s as they did had people known that i3's and i5's outperformed them in pretty much all games, and at lower clock speeds, no less. Many people probably bought the FX8350 because it 'sounded faster' at 4.7 GHz than did the 2600K at 'only' 3.8 GHz, or so I speculate, anyway... (sort of like the Florida Broward county votes in 2000!)
  • Targon - Tuesday, May 14, 2019 - link

    Not everyone looks at games as the primary use of a computer. The AMD FX chips were not great when it came to IPC, in the same way that the Pentium 4 was terrible from an IPC basis. Still, the 8350 was a lot faster than the Phenom II processors, that's for sure.
  • artk2219 - Wednesday, May 15, 2019 - link

    I got my FX 8320 because I preferred threads over single core performance. I was much more likely to notice a lack of computing resources and multi tasking ability vs how long something took to open or run. The funny part is that even though people shit all over them, they were, and honestly still are valid chips for certain use cases. They'll still game, they can be small cheap vhosts, nas servers, you name it. The biggest problem recently is finding a decent AM3+ board to put them in.
