Upgrading from an Intel Core i7-2600K: Yes

Back in 2010-2011, life was simple. We were reveling in benchmarks like Cinebench R10 and SuperPi, and no one had even thought of trying to transcode video on any sort of scale. In 2019, the landscape has changed: gamers gonna stream, designers gonna design, scientists gonna simulate, and emulators gonna emulate. The way software is designed has changed substantially as well, with more care taken over memory allocations, multiple cores and threads, and fast storage. Compilers are smarter too, and all the optimizations for the older platforms are already baked into those code bases.

We regularly speak to CPU architects who describe how they build processors for the next generation: by analyzing modern workload requirements. In a future of machine learning, for example, we're now seeing hardware on mobile processors dedicated to accelerating neural networks for things like smartphone photography. (It's interesting that smartphone SoCs today, in day-to-day use, are arguably more diverse than desktops in that regard.)

Ultimately, benchmarks have changed too. What we tested back in 2011 in our Core i7-2600K review was indicative of how people used their computers then, and in 2019 we are testing how people use their computers today. The balance of compute, storage, and other resources has shifted in the intervening years, and as a result, older parts may perform better or worse than expected.

For this review, I wanted to compare an eternal idol of enthusiast desktop computing with its more modern counterparts. The Sandy Bridge Core i7-2600K, released in 2011, was an enthusiast's dream: significantly faster than the previous generation, priced right, and capable of a substantial performance boost when overclocked. That overclocking headroom was the crux of its staying power: if users were seeing 20-40%+ extra performance from an overclock and some fast memory, then the several subsequent years of Intel offering baseline 3-8% generational gains were scoffed at, and users did not upgrade.

It's a Core i7 Family Photo

The Core i7-2600K was a quad-core processor with hyperthreading. Intel launched five more families of Core i7 that were also quad core with hyperthreading: the Core i7-3770K, i7-4770K, i7-5775C, i7-6700K, and i7-7700K, before it moved up to six cores (with HT) in the Core i7-8700K and eight cores (without HT) in the Core i7-9700K. Each of those generations of quad cores offered slightly more frequency, sometimes new instructions, sometimes better transistor density, sometimes better graphics, and sometimes a better platform.

Features like new instructions, better integrated graphics, or the platform are valid reasons to push an upgrade, even if the raw performance gain in most tasks is minor. Moving to PCIe 3.0 for graphics, or moving to DDR4 to access higher capacity memory modules, or shifting to NVMe storage with more diverse chipset support all helped users that bypassed the popular 2600K.

In this review, we tested the Core i7-2600K at Intel’s recommended release settings (known as ‘stock’), and an overclocked Core i7-2600K, pushing up from 3.5 GHz all-core to 4.7 GHz all-core, and with faster memory. For comparison to newer CPUs, we chose the Core i7-7700K, Intel’s final Core i7 quad-core for the desktop, representing the best Intel has offered in a quad-core with HT package, and the Core i7-9700K, the latest high-end Core i7 processor.

The results from our testing paint an interesting picture, and as a result so do our conclusions. Our CPU testing was quite clear: in almost every test, overclocking the 2600K only halved the deficit between it and a stock 7700K. Whenever the overclock gave 20% extra performance, the 7700K was another 20% ahead. The only outliers were the AVX2-capable benchmarks, where the 7700K had a massive lead simply because Sandy Bridge does not support AVX2. The Core i7-9700K, by comparison, blew them all out of the water.
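To illustrate why AVX2 support makes such a difference, here is a minimal sketch in C of the kind of loop an AVX2-capable benchmark can vectorize. This is an illustration, not code from our benchmark suite; it assumes a GCC/Clang-style compiler on x86 (for `immintrin.h`, the `target` attribute, and `__builtin_cpu_supports`), and the `saxpy` kernel is a hypothetical example.

```c
#include <immintrin.h>  /* AVX2 intrinsics */

/* Wide path: eight single-precision multiply-adds per loop iteration on
 * Haswell and later (e.g. the 7700K). Sandy Bridge (the 2600K) lacks
 * AVX2, so an AVX2-aware binary must give it the scalar path instead. */
__attribute__((target("avx2")))
static void saxpy_avx2(float a, const float *x, float *y, int n) {
    const __m256 va = _mm256_set1_ps(a);
    int i = 0;
    for (; i + 8 <= n; i += 8) {   /* 8 floats per 256-bit register */
        __m256 vx = _mm256_loadu_ps(x + i);
        __m256 vy = _mm256_loadu_ps(y + i);
        _mm256_storeu_ps(y + i, _mm256_add_ps(_mm256_mul_ps(va, vx), vy));
    }
    for (; i < n; i++)             /* scalar tail for leftover elements */
        y[i] = a * x[i] + y[i];
}

/* Fallback: one element at a time, the best a pre-AVX2 chip can do
 * without dropping to narrower 128-bit SSE vectors. */
static void saxpy_scalar(float a, const float *x, float *y, int n) {
    for (int i = 0; i < n; i++)
        y[i] = a * x[i] + y[i];
}

/* Runtime dispatch: take the wide path only on CPUs that report AVX2. */
void saxpy(float a, const float *x, float *y, int n) {
    if (__builtin_cpu_supports("avx2"))
        saxpy_avx2(a, x, y, n);
    else
        saxpy_scalar(a, x, y, n);
}
```

On a 7700K the dispatcher selects the eight-wide path; on a 2600K it falls back to the scalar loop, which is roughly the gap our AVX2-enabled benchmarks exposed.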

For anyone still using a Core i7-2600K for CPU-limited workloads, even overclocked, it's time to feel the benefits of an upgrade.


The GPU testing had a different result. From 2011 to 2019, enthusiast gamers have moved on from 1080p in one of two directions: higher resolutions or higher framerates. Which direction depends on the type of game played, and modern engines are geared up to cater for both, having been optimized for the latest hardware and the latest APIs.

For users going up in resolution, to 4K and beyond, the overclocked i7-2600K performs just as well as the latest Core i7-9700K. The stock 2600K is a little behind, but not noticeably so unless you drill down into specific titles. The overclocked Core i7-2600K is still a great chip for high-resolution 60 FPS gaming.

For users staying at 1080p (or 1440p) but looking at high frame rates to drive higher refresh rate displays, there is more of a tangible benefit here. Newer games on modern APIs can use more threads, and the higher number of draw calls required per frame (and for more frames) can be driven better with the latest Core i7 hardware. The Core i7-7700K gives a good boost, which can be bettered with the full eight cores of the Core i7-9700K. Both of these chips can be overclocked too, which we’ve not covered here.

The Bottom Line

Back in 2011 and 2012, I was a competitive overclocker, and my results were built around using the Core i7-2600K as the base for pushing my CPUs and GPUs to the limits. The day-to-day performance gains for any of my CPU or GPU tests were tangible, not only for work but also for gaming at 1080p.

Fast forward to 2019, and there are only one or two reasons to stick with that old system, even overclocked. The obvious one is cost: if you can't afford an upgrade, that's a perfectly legitimate reason not to, and I hope you're still having fun with it. The second reason not to upgrade is if the only thing you do, as an enthusiast gamer with a modern graphics card, is game at 4K.

There are a million other reasons to upgrade, even to the Core i7-7700K: anything CPU related, memory support (capacity and speed), storage support, newer chipsets, newer connectivity standards, AVX2, PCIe 3.0, multi-tasking, gaming and streaming, NVMe. Or if you’re that way inclined, the RGB LED fad of modern components.

Back in my day, we installed games from DVDs and used cold cathodes for RGB.

Picture from 2006? – Battlefield 2 on a CRT.
Running an ATI X1900XTX on an AMD Athlon 3400+



Comments

  • versesuvius - Saturday, May 11, 2019 - link

    There is a point in every Windows OS user's computing endeavors when they start playing fewer and fewer games, and at about the same time start foregoing upgrades to their CPU. They keep adding RAM and hard disk space and maybe a new graphics card after a couple of years. The only reason that such a person, who by now has completely stopped playing games, may upgrade to a new CPU and motherboard is the maximum amount of RAM that can be installed on their motherboard. And with that really comes the final PC that such a person may have in a long, long time. Kids get the latest CPU and will soon realize the law of diminishing returns, which by now is gradually approaching "no return", much faster than their parents. So, in perhaps ten years there will be no more "Tick", or "Tock", or cadence, or Moore's law. There will be computers, barring the possibility that dumb terminals have replaced PCs, that everybody knows what they can expect from. No serendipity there for certain.
  • Targon - Tuesday, May 14, 2019 - link

    The fact that you don't see really interesting games showing up all that often is why many people stopped playing games in the first place. Many people enjoyed the old adventure games with puzzles, and while action appeals to younger players, being more strategic and needing to come up with different approaches in how you play has largely died. Interplay is gone, Bullfrog, Lionhead....On occasion something will come out, but few and far between.

    Games for adults (and not just adult-age children who want to play soldier on the computer) are not all that common. I blame EA for much of the decline in the industry.
  • skirmash - Saturday, May 11, 2019 - link

    I still have an i7-2600 in an old Dell based upon an H67 chipset. I was thinking about using it as a server and updating the board to get updated connectivity. Z77 chipset would seem to be the way to go although getting a new board with this chipset seems expensive unless I go used. Anyone any thoughts on this - whether its worthwhile etc or a cost effective way to do it?
  • skirmash - Saturday, May 11, 2019 - link

    Sorry for the typos but I hope you get the sentiment.
  • Tunnah - Saturday, May 11, 2019 - link

    Oh wow this is insane timing, I'm actually upgrading from one of these and have had a hard time figuring out what sort of performance upgrade I'd be getting. Much appreciated!
  • Tunnah - Saturday, May 11, 2019 - link

    I feel like I can chip in a perspective re: gaming. While your benchmarks show solid average FPS and all that, they don't show the quality of life that you lose by having an underpowered CPU. I game at 4K, 2700k (4.6ghz for heat&noise reasons), 1080Ti, and regularly can't get 60fps no matter the settings, or have constant frame blips and dips. This is in comparison to a friend who has the same card but a Ryzen 1700X.

    Newer games like Division 2, Assassin's Creed Odyssey, and as shown here, Shadow of the Tomb Raider, all severely limit your performance if you have an older CPU, to the point where getting a constant 60fps is a real struggle, and benchmarks aside, that's the only benchmark the average user is aiming for.

    I also have 1333mhz RAM, which is just a whole other pain! As more and more games move into giant open world games and texture streaming and loading is happening in game rather than on loading screens, having slow RAM really affects your enjoyment.

    I'm incredibly grateful for this piece btw, I'm actually moving to Zen2 when it comes out, and I gotta say, I've not been this excited since..well, Sandy Bridge.
  • Death666Angel - Saturday, May 11, 2019 - link

    "I don’t think I purchased a monitor bigger than 1080p until 2012."
    Wow, really? So you were a CRT guy before that? How could you work on those low res screens all the time?! :D I got myself a 1200p 24" monitor once they became affordable in early 2008 (W2408hH). Had a 1280x1024 19" before that and it was night and day, sooo much better.
  • PeachNCream - Sunday, May 12, 2019 - link

    Still running 1366x768 on my two non-Windows laptops (HP Steam 11 and Dell Latitude e6320) and it okay. My latest, far less uses Windows gaming system has a 14 inch panel running 1600x900. Its a slight improvement, but I could live without it. The old Latitude does all my video production work so though I could use a few more pixels, it isn't the end of the world as is. The laptop my office issued is a HP Probook 640 G3 so it has a 14 inch 1080p panel which to have to scale at 125% to actually use so the resolution is pretty much pointless.
  • PeachNCream - Sunday, May 12, 2019 - link

    Ugh, phone auto correct...I really need to look over anything I type on a phone more closely. I feel like I'm reading comment by a non-native English speaker, but its me. How depressing.
  • Death666Angel - Sunday, May 12, 2019 - link

    I've done some horrendous posts when I used my phone to make a comment somewhere. Mostly because my phone is trained to my German texting habits and not my English commenting habits. And trying to mix them leads to sub par results in both areas, so I mostly stick to using my phone for texting and my PC and laptop for commenting. But sometimes I have to write something via my phone and it makes a beautiful mess if I'm not careful.
