Power Consumption

TDP or not the TDP, That is the Question

Notice: When we initially posted this page, we ran the numbers with an ASRock Z370 board. We have since discovered that the voltage applied by that board was significantly higher than normal expectations. We have re-run the numbers using the MSI MPG Z390 Gaming Edge AC motherboard, which does not have this issue.

As shown above, Intel has given each of these processors a Thermal Design Power of 95 Watts. As mainstream processors have grown over the last two years, this magic value has been at the center of complaints from a number of irate users.

By Intel’s own definitions, the TDP is an indicator of the cooling performance required for a processor to maintain its base frequency. In this case, if a user can only cool 95W, they can expect to realistically get only 3.6 GHz on a shiny new Core i9-9900K. That magic TDP value does not take into account any turbo values, even if the all-core turbo (such as 4.7 GHz in this case) is way above that 95W rating.

In order to make sense of this, Intel uses a series of variables called Power Levels: PL1, PL2, and PL3.

That slide is a bit dense, so we should focus on the graph on the right. This is a graph of power against time.

Here we have four horizontal lines, from bottom to top: the cooling limit (PL1), sustained power delivery (PL2), the battery limit (PL3), and the power delivery limit.

The bottom line, the cooling limit, is effectively the TDP value. Here the power (and frequency) is limited by the cooling at hand: it is the highest power (and thus frequency) that the cooling can sustain indefinitely, so for the most part TDP = PL1. This is our ‘95W’ value.

The PL2 value, or sustained power delivery, is what amounts to the turbo. This is the maximum sustainable power that the processor can take until we start to hit thermal issues. When a chip goes into a turbo mode, sometimes briefly, this is the limit that is relied upon. The value of PL2 can be set by the system manufacturer; however, Intel has its own recommended PL2 values.

In this case, for the new 9th Generation Core processors, Intel has set the PL2 value to 210W. This is essentially the power required to hit the peak turbo on all cores, such as 4.7 GHz on the eight-core Core i9-9900K. So users can completely forget the 95W TDP when it comes to cooling. If a user wants those peak frequencies, it’s time to invest in something capable and serious.
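Intel's exact turbo-budgeting algorithm is not public in full, but the PL1/PL2 interplay described above can be sketched as a moving-average power budget: the chip may draw up to PL2 while a running average of package power stays under PL1, after which it is clamped to PL1. The model below is an illustrative assumption (the EWMA mechanism, the time constant, and the numbers are ours), not Intel's implementation:

```python
# Toy model of Intel-style power limits (illustrative, not Intel's exact algorithm).
# The CPU may draw up to PL2 while a running average of power stays under PL1;
# once the average reaches PL1, sustained draw is clamped to PL1 (the "TDP").

def simulate(pl1, pl2, tau, demand, dt=0.1, t_end=60.0):
    """Return (times, powers) for a load demanding `demand` watts."""
    times, powers = [], []
    avg = 0.0  # exponentially weighted moving average of package power
    t = 0.0
    while t < t_end:
        budget = pl2 if avg < pl1 else pl1
        p = min(demand, budget)
        # EWMA with time constant tau (seconds): avg drifts toward p
        avg += (dt / tau) * (p - avg)
        times.append(t)
        powers.append(p)
        t += dt
    return times, powers

times, powers = simulate(pl1=95.0, pl2=210.0, tau=28.0, demand=170.0)
print(f"first sample: {powers[0]:.0f} W, last sample: {powers[-1]:.0f} W")
# -> first sample: 170 W, last sample: 95 W
```

With a 170W load, the model bursts at full demand for the first twenty seconds or so, then settles at the 95W sustained limit: exactly the "turbo then throttle" behaviour the slide describes.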

Luckily, we can confirm all this in our power testing.

For our testing, we use POV-Ray as our load generator then take the register values for CPU power. This software method, for most platforms, includes the power split between the cores, the DRAM, and the package power. Most users cite this method as not being fully accurate, however compared to system testing it provides a good number without losses, and it forms the basis of the power values used inside the processor for its various functions.

Starting with the easy one, maximum CPU power draw.

Power (Package), Full Load

Focusing on the new Intel CPUs we have tested, both of them go beyond the TDP value, but do not hit PL2. At this level, the CPU is running all cores and threads at the all-core turbo frequency. Both 168.48W for the i9-9900K and 124.27W for the i7-9700K are far above that ‘TDP’ rating noted above.

Should users be interested, in our testing at 4C/4T and 3.0 GHz, the Core i9-9900K drew only 23W. Doubling the cores and raising the frequency by more than 50% causes an almost 7x increase in power consumption. When Intel starts pushing those frequencies, the chip needs a lot of juice.
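As a sanity check on that arithmetic, a first-order dynamic power model predicts a similar jump: P ∝ cores × f × V², and if voltage rises roughly linearly with frequency in this range (a common rule-of-thumb assumption, not something we measured), power scales with the cube of frequency:

```python
# Back-of-envelope check of the ~7x jump, assuming dynamic power scales as
# cores * frequency * voltage^2, with voltage roughly proportional to
# frequency in this range (a first-order assumption, not measured data).

def scaling_factor(cores0, f0, cores1, f1):
    """Predicted power ratio if P ~ N * f * V^2 and V ~ f."""
    return (cores1 / cores0) * (f1 / f0) ** 3

predicted = scaling_factor(cores0=4, f0=3.0, cores1=8, f1=4.7)
measured = 168.48 / 23.0
print(f"predicted ~{predicted:.1f}x, measured ~{measured:.1f}x")
# -> predicted ~7.7x, measured ~7.3x
```

The crude model lands within a few percent of the observed ratio, which is a reminder that those last few hundred MHz are disproportionately expensive in watts.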

If we break out the 9900K into how much power is consumed as we load up the threads, the results look very linear.

This is as we load two threads onto one core at a time. The processor slowly adds power to the cores when threads are assigned.

Comparing to the other two ‘95W’ processors, we can see that the Core i9-9900K pushes more power as more cores are loaded. Despite Intel officially giving all three the same TDP at 95W, and the same PL2 at 210W, there are clear differences due to the fixed turbo tables embedded in each BIOS.
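A straight-line fit is an easy way to split a per-thread loading plot like this into a per-core cost plus a fixed uncore baseline. The sketch below uses made-up wattage numbers purely to illustrate the method; they are not our measured data:

```python
# Illustrative least-squares fit of package power vs. loaded cores.
# The wattage readings below are hypothetical placeholders, not measured
# data; the point is that a linear fit separates the per-core power cost
# from the fixed "uncore" baseline (memory controller, cache, I/O).

def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

cores = [1, 2, 3, 4, 5, 6, 7, 8]
watts = [38, 57, 76, 95, 114, 133, 152, 171]   # hypothetical readings
per_core, baseline = fit_line(cores, watts)
print(f"~{per_core:.0f} W per loaded core on top of a ~{baseline:.0f} W baseline")
# -> ~19 W per loaded core on top of a ~19 W baseline
```

The same fit applied to each of the three ‘95W’ chips would make the differences in their embedded turbo tables immediately visible as different slopes.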

So is TDP Pointless? Yes, But There is a Solution

If you believe that TDP is the peak power draw of the processor under default scenarios, then yes, TDP is pointless, and technically it has been for generations. However, under the miasma of a decade of quad-core processors, most parts didn’t even reach the TDP rating under full load – it wasn’t until we started getting higher core count parts, at the same or higher frequencies, that it started becoming an issue.

But fear not, there is a solution. Or at least I want to offer one to both Intel and AMD, to see if they will take me up on the offer. The solution here is to offer two TDP ratings: a TDP and a TDP-Peak. In Intel lingo, this is PL1 and PL2, but basically the TDP-Peak takes into account the ‘all-core’ turbo. It doesn’t have to be covered under warranty (because as of right now, turbo is not), but it should be an indication for the nature of the cooling that a user needs to purchase if they want the best performance. Otherwise it’s a case of fumbling in the dark.

Comments
  • mapesdhs - Sunday, October 21, 2018 - link

    The funny part is that, for productivity, one can pick up used top-end older hw for a pittance and have the best of both worlds. I was building an oc'd 3930K setup for someone (back when RAM prices were still sensible, a 32GB DDR3/2400 kit only cost me 115 UKP), replaced the chip with a 10-core Xeon E5-2680 v2 which was cheap, works great and is way better for productivity. Lower single-threaded speed of course, but still respectable, and in most cases it doesn't matter. Also far better heat, noise and power consumption behaviour.

    Intel is already competing with both itself (7820X) and AMD with the 9K series; add in used options and Intel's new stuff (like NVIDIA) is even less appealing. I bagged a used 1080 Ti for 450 UKP, very happy. :)
  • vanilla_gorilla - Friday, October 19, 2018 - link

    So the "Best Gaming CPU" really only has an advantage when gaming at 1080p or less? Who spends this much money on a CPU to game at 1080p? What is the point of this thing?
  • TEAMSWITCHER - Friday, October 19, 2018 - link

    Many benchmarks show the 9900k coming "oh so close" to the 10-core 7900X. I'm thinking that the "Best Gaming CPU" is Intel's wishful thinking for Enthusiasts to spend hundreds more for their X299 platform.
  • HStewart - Friday, October 19, 2018 - link

    Of course at higher resolution it depends on GPU - but from the list of games only Ashes is one stated not top of class for 4k.

    If you look at the conclusion in the article you will notice that most games got "Best CPU or near top in all". 4K CIV 6 was interesting with "Best CPU at IGP, a bit behind at 4K, top class at 8K/16K", which tells me that even though 4K was not so great, it was even better at 8K/16K.
  • vanilla_gorilla - Friday, October 19, 2018 - link

    At 4K every CPU performs at almost the exact same frame rate. Within 1fps. Why would anyone pay this much for a "gaming CPU" that has no advantage compared to CPUs half the price over 1080p? This is insanity.

    If you are a gamer, save your money, buy a two year old intel or Ryzen CPU and spend the rest on a 4K monitor!
  • CPUGuy - Friday, October 19, 2018 - link

    This CPU is going to be amazing at 10nm.
  • eastcoast_pete - Friday, October 19, 2018 - link

    Yes, a fast chip, but those thermals?! This is the silicon equivalent of boosting an engine's performance with nitrous: you'll get the power, but at what cost? I agree with Ian and others here that this is the chip to get if a. bragging rights (fastest gaming CPU) are really important and b. money is no object. For its intended use, I'd strongly suggest budgeting at least $2500-3000, including a custom liquid-cooling solution for both the 9900K and the graphics card, presumably a 2080.
    In the meantime, the rest of us can hope that AMD will keep Intel's prices for the i7 9700 in check.
  • Arbie - Friday, October 19, 2018 - link

    In the meantime, the rest of us can buy AMD, as anyone should do who doesn't require a chip like this for some professional need.
  • eastcoast_pete - Friday, October 19, 2018 - link

    @Arbie: I agree. If I were putting together a system right now, I would give first consideration to a Ryzen Threadripper 1920X. The MoBos are a bit pricey, but Amazon, Newegg and others have the 1920X on sale at around $470 or so, and its 12 cores/24 threads are enough for even very demanding applications. To me, the only reason to still look at Intel (i7 8700) is the superior AVX performance that Intel still offers vs. AMD. For some video editing programs, it can make a sizable difference. For general productivity though, a 1920X system at current discounts is the ruling mid/high-end desktop value king.
  • mapesdhs - Sunday, October 21, 2018 - link

    The exception is Premiere which is still horribly optimised.
