Power Consumption

One of the risk factors in overclocking is driving the processor beyond its ideal balance of power and performance. Processors are typically manufactured with a particular sweet spot in mind: peak efficiency occurs at one specific combination of voltage and frequency, and any deviation from that mark results in expending extra energy (usually in exchange for better performance).
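
As a rough rule of thumb, dynamic CPU power scales with capacitance × voltage² × frequency, which is why the voltage bump needed to sustain a higher clock compounds quickly. The sketch below illustrates that scaling with hypothetical voltage and frequency operating points, chosen purely for illustration and not measured from any chip in this review:

```python
# Approximate dynamic power scaling: P ~ C * V^2 * f.
# The voltages and clocks below are illustrative assumptions,
# not measurements from any CPU tested here.

def relative_power(v_old, f_old, v_new, f_new):
    """Ratio of new dynamic power to old, holding capacitance constant."""
    return (v_new ** 2 * f_new) / (v_old ** 2 * f_old)

# Hypothetical example: 3.5 GHz at 1.10 V pushed to 4.7 GHz at 1.25 V.
ratio = relative_power(v_old=1.10, f_old=3.5, v_new=1.25, f_new=4.7)
print(f"~{ratio:.2f}x dynamic power for a {4.7 / 3.5:.2f}x clock increase")
# -> ~1.73x dynamic power for a 1.34x clock increase
```

Leakage and static power are ignored here, so real-world scaling is typically even less favorable than this simple model suggests.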

When Intel first introduced the Skylake family, this efficiency point was a key element of its product portfolio. Some CPUs would test and detect the best efficiency point at POST, making sure that when the system was idle, the least power was drawn. When the CPU is actually running code, however, the system raises the frequency and voltage to offer performance away from that peak efficiency point. If a user pushes the frequency a lot higher still, the voltage needs to increase and power consumption rises.

So when overclocking a processor, whether one of the newer parts or an older one, the user ends up expending more energy for the same workload, albeit completing that workload faster as well. For our power testing, we took the peak power consumption values during an all-thread run of POV-Ray, using the CPU's internal metrics to record full SoC power.
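
For readers who want to replicate this style of measurement, a minimal sketch is below. It assumes a Linux system exposing Intel's RAPL energy counters through the powercap interface (the `/sys/class/powercap/intel-rapl:0` package domain, which usually requires root and varies by platform); this is not the exact tooling used for the numbers in this review:

```python
import time

# Sketch: sample the CPU package energy counter exposed by Intel RAPL
# via the Linux powercap interface. intel-rapl:0 is usually the package
# domain, but the layout varies by system (an assumption, not the exact
# method behind the figures in this article).
RAPL_ENERGY = "/sys/class/powercap/intel-rapl:0/energy_uj"

def read_energy_uj() -> int:
    # Cumulative energy in microjoules; typically root-readable only.
    with open(RAPL_ENERGY) as f:
        return int(f.read())

def average_package_watts(seconds: float = 5.0) -> float:
    """Average package power over an interval, in watts."""
    start = read_energy_uj()
    time.sleep(seconds)
    end = read_energy_uj()
    # The counter wraps at max_energy_range_uj; wrap handling is
    # omitted here for brevity.
    return (end - start) / 1e6 / seconds

if __name__ == "__main__":
    # Start the workload (e.g. an all-thread POV-Ray render) first,
    # then sample while the CPU is under full load.
    print(f"Package power: {average_package_watts():.1f} W")
```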

[Graph: Power (Package), Full Load]

The Core i7-2600K was built on Intel’s 32nm process, while the i7-7700K and i7-9700K were built on variants of Intel’s 14nm process family. As shown in the benchmarks in this review, the latter two have considerable performance advantages due to the microarchitectural, platform, and frequency improvements that the more efficient process node offers. They also support AVX2 instructions, which draw a lot of power in our power test.

In our peak power results, we see the Core i7-2600K at stock (3.5 GHz all-core) hitting only 88 W, while the Core i7-7700K at stock (4.3 GHz all-core) draws 95 W. These results are both respectable; however, adding the overclock to the 2600K, to hit 4.7 GHz all-core, shows how much extra power is needed. At 116 W, the 34% overclock consumes 31% more power (for 24% more performance) compared to the 2600K at stock.
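
Those percentages follow directly from the raw figures quoted above; as a quick sanity check:

```python
# Quick check of the i7-2600K overclocking percentages quoted above.
stock_ghz, oc_ghz = 3.5, 4.7   # all-core clocks, stock vs. overclocked
stock_w, oc_w = 88, 116        # peak package power in watts

oc_pct = (oc_ghz / stock_ghz - 1) * 100     # clock increase
power_pct = (oc_w / stock_w - 1) * 100      # extra power drawn

print(f"+{oc_pct:.0f}% clock for +{power_pct:.1f}% power")
# -> +34% clock for +31.8% power
```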

The Core i7-9700K, with eight full cores, goes above and beyond this, drawing 124W at stock. While Intel’s power policy didn’t change between the generations, the way it ended up being interpreted did, as explained in our article here:

Why Intel Processors Draw More Power Than Expected: TDP and Turbo Explained

You can also learn about power control on Intel’s latest CPUs in our original Skylake review:

The Intel Skylake Mobile and Desktop Launch, with Architecture Analysis

Comments

  • kgardas - Friday, May 10, 2019

    Indeed, it's sad that it took ~8 years to roughly double performance, while in the '90s we got that every 2-3 years. And look at the office tests: we're not there yet, and we probably never will be, as single-thread performance increases are basically dead. The Chromium compile suggests that an upgrade makes sense at all -- for developers, that is -- but for office users it's nonsense if you consider just the CPU itself.
  • chekk - Friday, May 10, 2019

    Thanks for the article, Ian. I like your summation: impressive and depressing.
    I'll be waiting to see what Zen 2 offers before upgrading my 2500K.
  • AshlayW - Friday, May 10, 2019

    Such great innovation and progress and cost-effectiveness advances from Intel between 2011 and 2017. /s

    Yes, AMD didn't do much here either, but it wasn't for lack of trying. Intel deliberately stagnated the market to bleed consumers of every single cent, and then Ryzen turned up and you got 6 and now 8 core mainstream CPUs.

    Would have liked to see the 2600K versus Ryzen, honestly. Ryzen 1st gen is around Ivy Bridge/Haswell performance per core in most games, and second gen is Haswell/Broadwell. But as more games get more threaded, Ryzen's advantage will only increase.

    I owned a 2600K and it was the last product from Intel that I truly felt was worth its price. Even now I just can't justify spending £350-400 on a hexa-core, or an octa-core with HT disabled, when the competition has unlocked 16 threads for less money.
  • 29a - Friday, May 10, 2019

    "Yes AMD didn't do much here either"

    I really don't understand that statement at all.
  • thesavvymage - Friday, May 10, 2019

    They're saying AMD didn't do much to push the price/performance envelope between 2011 and 2017. Which they didn't, since their architecture until Zen was terrible.
  • eva02langley - Friday, May 10, 2019

    Yeah, you are right... it is AMD's fault, and not Intel's, who wanted to make a dime on your back by selling you quad-cores for life.
  • wilsonkf - Friday, May 10, 2019

    It would be more interesting to add the 8150/8350 to the benchmark. I ran my 8350 at 4.7 GHz for five years. It's a great room heater.
  • MDD1963 - Saturday, May 11, 2019

    I don't think AMD would have sold as many of the 8350s and 9590s as they did had people known that i3s and i5s outperformed them in pretty much all games, and at lower clock speeds, no less. Many people probably bought the FX 8350 because it 'sounded faster' at 4.7 GHz than the 2600K at 'only' 3.8 GHz, or so I speculate, anyway... (sort of like the Florida Broward County votes in 2000!)
  • Targon - Tuesday, May 14, 2019

    Not everyone looks at games as the primary use of a computer. The AMD FX chips were not great when it came to IPC, in the same way that the Pentium 4 was terrible on an IPC basis. Still, the 8350 was a lot faster than the Phenom II processors, that's for sure.
  • artk2219 - Wednesday, May 15, 2019

    I got my FX 8320 because I preferred threads over single-core performance. I was much more likely to notice a lack of computing resources and multitasking ability than how long something took to open or run. The funny part is that even though people shit all over them, they were, and honestly still are, valid chips for certain use cases. They'll still game, they can be small cheap VM hosts, NAS servers, you name it. The biggest problem recently is finding a decent AM3+ board to put them in.
