Power Consumption

One of the risk factors in overclocking is driving the processor beyond its ideal point of power and performance. Processors are typically manufactured with a particular sweet spot in mind: the peak efficiency of a processor will be at a particular voltage and particular frequency combination, and any deviation from that mark will result in expending extra energy (usually for better performance).
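The reason power rises faster than frequency is captured by the classic dynamic-power approximation, P ≈ C·V²·f: pushing frequency higher usually requires raising voltage too, and voltage enters the equation squared. A minimal sketch of this scaling — the capacitance and voltage figures below are purely hypothetical, not measured values for any of the chips in this review:

```python
# Dynamic CPU power scales roughly as C * V^2 * f.
# Raising frequency typically requires raising voltage as well,
# so power grows much faster than clock speed alone.

def dynamic_power(c_eff, voltage, freq_ghz):
    """Approximate dynamic power in watts (c_eff in farads, freq in GHz)."""
    return c_eff * voltage ** 2 * freq_ghz * 1e9

C_EFF = 2.0e-9  # hypothetical effective switched capacitance, farads

stock = dynamic_power(C_EFF, 1.20, 3.5)  # illustrative stock V/f point
oc    = dynamic_power(C_EFF, 1.35, 4.7)  # illustrative overclocked V/f point

print(f"stock: {stock:.1f} W, overclocked: {oc:.1f} W")
# At these hypothetical voltages, a 34% clock increase
# raises dynamic power by roughly 70%:
print(f"frequency up {4.7 / 3.5 - 1:.0%}, power up {oc / stock - 1:.0%}")
```

The exact numbers are invented, but the shape of the result is the point: because voltage is squared, overclocks cost disproportionately more power than they gain in frequency.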

When Intel first introduced the Skylake family, this efficiency point was a key element of its product portfolio. Some CPUs would detect their best efficiency point during POST, ensuring that the least power was drawn when the system was idle. When the CPU is actually running code, however, the system raises the frequency and voltage to offer performance beyond that peak efficiency point. If a user pushes the frequency much higher, the voltage needs to increase and power consumption rises.

So when overclocking a processor, whether one of the newer parts or an older chip, the user ends up expending more energy for the same workload, albeit completing that workload faster as well. For our power testing, we took the peak power consumption values during an all-thread run of POV-Ray, using the CPU's internal metrics to record full SoC power.

Power (Package), Full Load

The Core i7-2600K was built on Intel’s 32nm process, while the i7-7700K and i7-9700K were built on variants of Intel’s 14nm process family. These latter two, as shown in the benchmarks in this review, have considerable performance advantages due to the microarchitectural, platform, and frequency improvements that the more efficient process node offers. They also support AVX2, which draws a lot of power in our power test.

In our peak power results graph, we see the Core i7-2600K at stock (3.5 GHz all-core) drawing only 88 W, while the Core i7-7700K at stock (4.3 GHz all-core) draws 95 W. Both results are respectable; however, overclocking the 2600K to 4.7 GHz all-core shows how much extra power is needed. At 116 W, the 34% overclock consumes 31% more power (for 24% more performance) compared to the 2600K at stock.
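As a quick sanity check, the percentages above follow directly from the stock and overclocked figures quoted in this paragraph:

```python
# Verify the overclocking percentages quoted for the Core i7-2600K.
stock_freq, oc_freq = 3.5, 4.7      # GHz, all-core turbo
stock_power, oc_power = 88, 116     # watts, peak package power

overclock_pct = (oc_freq / stock_freq - 1) * 100
power_pct = (oc_power / stock_power - 1) * 100

print(f"overclock: {overclock_pct:.1f}%")    # ~34% higher frequency
print(f"extra power: {power_pct:.1f}%")      # ~32% more package power
```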

The Core i7-9700K, with eight full cores, goes above and beyond this, drawing 124W at stock. While Intel’s power policy didn’t change between the generations, the way it ended up being interpreted did, as explained in our article here:

Why Intel Processors Draw More Power Than Expected: TDP and Turbo Explained

You can also learn about power control on Intel’s latest CPUs in our original Skylake review:

The Intel Skylake Mobile and Desktop Launch, with Architecture Analysis


213 Comments


  • djayjp - Friday, May 10, 2019 - link

    Hey, I know! Let's benchmark a CPU at 4K+ using a mid-range GPU! Brilliant....
  • Ian Cutress - Friday, May 10, 2019 - link

    Guess what, there are gaming benchmarks at a wide range of resolutions!
  • eva02langley - Friday, May 10, 2019 - link

    I am not sure what the goal of this is. Is it to say that Sandy Bridge is still relevant, that Intel's IPC is bad, or that game developers are lazy?

    One thing for sure, it is time to move on from GTA V. You cannot get anything from those numbers.

    Time to have games that are from 2018 and 2019 only. You cannot just bench old games so your database can be built upon. It doesn't represent the consumer reality.
  • BushLin - Saturday, May 11, 2019 - link

    Yeah, why benchmark a game where the results can be compared against all GPUs and CPUs from the last decade.
  • StevoLincolnite - Sunday, May 12, 2019 - link

    GTA 5 is still demanding.
    Millions of gamers still play GTA 5.

    It is one of the most popular games of all time.

    Ergo... It is entirely relevant having GTA 5 benchies.
  • djayjp - Friday, May 10, 2019 - link

    Then the GPU is still totally relevant.
  • MDD1963 - Saturday, May 11, 2019 - link

    Of course it is....; no one plays at 720P anymore....
  • PeachNCream - Sunday, May 12, 2019 - link

    I'd argue that hardly anyone ever played PC games at that resolution. 720p is 1280x720. Computer screens went from 4:3 resolutions to 16:10, and when that was the case, the lower resolution panels were most commonly 1280x800. When 16:9 ended up taking over, the most common lower resolution was 1366x768. Very few PC monitors ever actually hit 720p. Even most of the cheap low-res TVs out there were 1366x768 or 1360x768.
  • Zoomer - Friday, June 14, 2019 - link

    Doesn't matter, the performance will be similar.
  • fep_coder - Friday, May 10, 2019 - link

    My threshold for a CPU upgrade has always been 2x performance increase. It's sad that it took this many generations of CPUs to get near that point. Almost all of the systems in my upgrade chain (friends and family) are Sandy Bridge based. I guess that it's finally time to start spending money again.
