Upgrading from an Intel Core i7-2600K: Yes

Back in 2010-2011, life was simple. We were reveling in benchmarks like CineBench R10 and SuperPI, and no-one had even thought of trying to transcode video on any sort of scale. In 2019, the landscape has changed: gamers gonna stream, designers gonna design, scientists gonna simulate, and emulators gonna emulate. The way software is designed has changed substantially as well, with more care taken over memory allocation, multiple cores and threads, and fast storage. Compilers are smarter too, and the optimizations for older platforms are already baked into those code bases.

We regularly speak to CPU architects that describe how they build new processors for the next generation: by analyzing modern workload requirements. In a future of machine learning, for example, we’re now seeing hardware on mobile processors dedicated to accelerating neural networks for things like smartphone photography. (It’s interesting that smartphone SoCs today, in day-to-day use, are arguably more diverse than desktops in that regard.)

Ultimately, benchmarks have changed too. What we tested back in 2011 in our Core i7-2600K review was indicative of the way people were using their computers then, and in 2019 we are testing how people are using their computers today. On some level, one expects that the balance of compute, storage, and resources that made sense back then has since shifted, and as a result, older parts may perform better or worse than expected.

For this review, I wanted to compare an eternal idol for enthusiast desktop computing with its more modern counterparts. The Sandy Bridge Core i7-2600K that was released in 2011 was an enthusiast's dream: significantly faster than the previous generation, priced right, and capable of a substantial performance boost when overclocked. That overclocking headroom was the crux of its staying power: if users were seeing 20-40% extra performance from an overclock and some fast memory, then Intel's subsequent years of baseline 3-8% generational gains were scoffed at, and users did not upgrade.

It's a Core i7 Family Photo

The Core i7-2600K was a quad core processor with hyperthreading. Intel launched five more families of Core i7 that were also quad core with hyperthreading: the Core i7-3770K, i7-4770K, i7-5775C, i7-6700K, and i7-7700K, before it moved up to six cores (HT) with the 8700K and eight cores (no HT) with the 9700K. Each of those generations of quad cores offered slightly more frequency, sometimes new instructions, sometimes better transistor density, sometimes better graphics, and sometimes a better platform.

Features like new instructions, better integrated graphics, or the platform are valid reasons to push an upgrade, even if the raw performance gain in most tasks is minor. Moving to PCIe 3.0 for graphics, or moving to DDR4 to access higher capacity memory modules, or shifting to NVMe storage with more diverse chipset support all helped users that bypassed the popular 2600K.

In this review, we tested the Core i7-2600K at Intel’s recommended release settings (known as ‘stock’), and an overclocked Core i7-2600K, pushing up from 3.5 GHz all-core to 4.7 GHz all-core, and with faster memory. For comparison to newer CPUs, we chose the Core i7-7700K, Intel’s final Core i7 quad-core for the desktop, representing the best Intel has offered in a quad-core with HT package, and the Core i7-9700K, the latest high-end Core i7 processor.

The results from our testing paint an interesting picture, and so do our conclusions. Our CPU testing was quite clear: in almost every test, overclocking the 2600K only halved the deficit to a stock 7700K. Wherever the overclock gave 20% extra performance, the 7700K was another 20% ahead. The only outliers were the AVX2-capable benchmarks, where the 7700K had a massive lead simply because Sandy Bridge lacks AVX2 support. And across all our CPU tests, the Core i7-9700K blew them all out of the water.

For anyone still using a Core i7-2600K for CPU testing, even when overclocked, it’s time to feel the benefits of an upgrade.


The GPU testing had a different result. From 2011 to 2019, enthusiast gamers have moved from 1080p in one of two directions: higher resolutions or higher framerates. The direction chosen depends on the type of game played, and modern game engines are geared up to cater for both, having been optimized for the latest hardware with the latest APIs.

For users going up in resolution, to 4K and beyond, the overclocked i7-2600K performs just as well as the latest Core i7-9700K. The stock 2600K is a little behind, but not noticeably so unless you drill down into specific titles. The overclocked Core i7-2600K is still a great chip for high-resolution 60 FPS gaming.

For users staying at 1080p (or 1440p) but looking at high frame rates to drive higher refresh rate displays, there is more of a tangible benefit here. Newer games on modern APIs can use more threads, and the higher number of draw calls required per frame (and for more frames) can be driven better with the latest Core i7 hardware. The Core i7-7700K gives a good boost, which can be bettered with the full eight cores of the Core i7-9700K. Both of these chips can be overclocked too, which we’ve not covered here.

The Bottom Line

Back during 2011 and 2012, I was a competitive overclocker, and my results focused on using the Core i7-2600K as the base for pushing my CPUs and GPUs to the limits. The day-to-day performance gains for any of my CPU or GPU tests were tangible, not only for work but also for gaming at 1080p.

Fast forward to 2019, and there are only one or two reasons to stick with that old system, even when overclocked. The obvious one is cost: if you can't afford an upgrade, that's a very legitimate reason not to, and I hope you're still having fun with it. The second reason not to upgrade is if the only thing you do, as an enthusiast gamer with a modern graphics card, is game at 4K.

There are a million other reasons to upgrade, even to the Core i7-7700K: anything CPU related, memory support (capacity and speed), storage support, newer chipsets, newer connectivity standards, AVX2, PCIe 3.0, multi-tasking, gaming and streaming, NVMe. Or if you’re that way inclined, the RGB LED fad of modern components.

Back in my day, we installed games from DVDs and used cold cathodes for RGB.

Picture from 2006? – Battlefield 2 on a CRT, running an ATI X1900XTX on an AMD Athlon 3400+

Comments

  • MxClood - Saturday, May 18, 2019 - link

In most tests here it's around 100% or more increase in perf, I don't see where it's 40%.

Also, when you increase the graphics settings/resolution in gaming, the FPS are the same because the GPU becomes the bottleneck. You could put in any futuristic CPU and the FPS would be the same.
So why is it an argument about disappointing/abysmal performance?
  • Beaver M. - Wednesday, May 22, 2019 - link

After so many decades of being wrong you guys still claim CPU power doesn't matter much in games.
You're wrong. Again. The common bottleneck today in games is the CPU, especially because GPU advancement has been very slow.
  • Spunjji - Wednesday, May 22, 2019 - link

GPU advancement slowing down *makes the CPU less relevant, not more*. The CPU is only relevant to performance when it can't meet the bare minimum requirements to serve the GPU fast enough. If the GPU is your limit, no amount of CPU power increase will help.
  • LoneWolf15 - Friday, May 17, 2019 - link

    Is it abysmal because of the CPU though, or because of the software?

Lots of software isn't written to take advantage of more than four cores tops, aside from the heavy hitters, and to an extent, we've hit a ceiling with clock speeds for a while, with 5GHz being (not exactly, but a fair representation of) the ceiling.
    AMD has caught up in a big way, and for server apps and rendering, it's an awesome value and a great CPU. Even with that, it still doesn't match up with a 9700K in games, all other things being equal, unless a game is dependent on GPU alone.
    I think most mainstream software isn't optimized beyond a certain point for any of our current great CPUs, largely because until recently, CPU development and growth has stagnated. I'm really hoping real competition drives improved software.
    Note also that it hasn't been like the 90s in some time, where we were doubling CPU performance every 16 months. Some of that is because there's too many limitations to achieving that doubling, both software and hardware.

    I'm finding considerable speed boosts over my i7-4790K that was running at 4.4GHz (going to an i9-9900K running constantly at 4.7GHz on all cores) in regular apps and gaming (at 1900x1200 with two GTX 1070 cards in SLI), and I got a deal on the CPU, so I'm perfectly happy with my first mainboard/CPU upgrade in five years (my first board was a 386DX back in `93).
  • peevee - Tuesday, May 14, 2019 - link

    Same here. i7-2600k from may 2011, with the same OCZ Vertex 3.
    8 years, twice the cores, not even twice the performance in real world. Just essentially overclocked to the max from the factory.

Remember when real life performance more than doubled every 2 years? On the same 1 core, in all apps, not just heavily multithreaded? Good thing AMD at least forced Intel to go from 4 to 6 to 8 cores in 2 years. Now they need to double their memory controllers; it's been the same 128 bits since what, Pentium Pro?
  • Mr Perfect - Friday, May 10, 2019 - link

    Same here. Over the years I've stuffed it full of RAM and SSD and been pleased with the performance. I'm thinking it's time for it to go though.

In 2016 I put a 1060 in the machine and was mildly disappointed in the random framerate drops in games (at 1200p). Assuming it was the GPU's fault, I upgraded further in 2018 to a 1070 Ti some bitcoin miner was selling for cheap when the market crashed. The average framerates went up, but all of the lows are just as low as they ever were. So either Fallout 4 runs like absolute garbage in certain areas, or the CPU was choking up both GPUs.

    When something that isn't PCIe 3 comes out I suppose I can try again and see.
  • ImOnMy116 - Friday, May 10, 2019 - link

For whatever it's worth, in my experience Fallout 4 (and Skyrim/Skyrim SE/maybe all Bethesda titles) are poorly optimized. It seems their engine is highly dependent on IPC, but even in spite of running an overclocked 6700K/1080 Ti, I get frame drops in certain parts of the map. I think it's likely at least partially dependent on where your character is facing at any given point in time. There can be long draw distances or lots of NPCs nearby taxing the CPU (i.e. Diamond City).
  • Mr Perfect - Friday, May 10, 2019 - link

Yeah, that makes sense. F4's drops are definitely dependent on location and where the character is facing for me too.

The countryside, building interiors and winding city streets you can't see very far down are just fine. Even Diamond City is okay. It's when I stand at an intersection of one of the roads that runs arrow straight through Boston, or get up on rooftops with a view over the city, that framerates die. If the engine wants pure CPU grunt for that, then the 2600 just isn't up to it.

    Strangely, Skyrim SE has been fine. The world is pretty sparse compared to F4 though.
  • Vayra - Monday, May 13, 2019 - link

Fallout 4 is simply a game of asset overload. That happens especially in the urban areas. It shows us that the engine is past its expiry date and unable to keep up with the game's demands. The game needs all those assets to at least look somewhat bearable, and it's not efficient about it at all; a big part of all those little items also needs to be fully interactive objects.

So it's not 'strange' at all, really. More objects = more CPU load, and none of them can be 'cooked' beforehand. They are literally placed in the world as you move around in it.
  • Vayra - Monday, May 13, 2019 - link

This is also part of the reason why the engine has trouble with anything over 60 fps, and why you can sometimes see objects falling from the sky as you zone in.
