Power Consumption

TDP or not the TDP, That is The Question

Notice: When we initially posted this page, we ran numbers with an ASRock Z370 board. We have since discovered that the voltage applied by the board was super high, beyond normal expectations. We have since re-run the numbers using the MSI MPG Z390 Gaming Edge AC motherboard, which does not have this issue.

As shown above, Intel has given each of these processors a Thermal Design Power of 95 Watts. As mainstream processors have grown over the last two years, this magic value has been at the center of a number of heated user complaints.

By Intel’s own definitions, the TDP is an indicator of the cooling performance required for a processor to maintain its base frequency. In this case, if a user can only cool 95W, they can expect to realistically get only 3.6 GHz on a shiny new Core i9-9900K. That magic TDP value does not take into account any turbo values, even if the all-core turbo (such as 4.7 GHz in this case) is way above that 95W rating.

In order to make sense of this, Intel uses a series of variables called Power Levels: PL1, PL2, and PL3.

That slide is a bit dense, so we should focus on the graph on the right. This is a graph of power against time.

Here we have four horizontal lines from bottom to top: cooling limit (PL1), sustained power delivery (PL2), battery limit (PL3), and power delivery limit.

The bottom line, the cooling limit, is effectively the TDP value. Here the power (and frequency) is limited by the cooling at hand: it represents the highest power, and therefore frequency, that the cooling can sustain indefinitely, so for the most part TDP = PL1. This is our ‘95W’ value.

The PL2 value, or sustained power delivery, is what amounts to the turbo. This is the maximum power that the processor can take until we start to hit thermal issues. When a chip goes into a turbo mode, even briefly, this is the limit that applies. The value of PL2 can be set by the system manufacturer; however, Intel has its own recommended PL2 values.

In this case, for the new 9th Generation Core processors, Intel has set the PL2 value to 210W. This is essentially the power required to hit the peak turbo on all cores, such as 4.7 GHz on the eight-core Core i9-9900K. So users can completely forget the 95W TDP when it comes to cooling. If a user wants those peak frequencies, it’s time to invest in something capable and serious.
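As a rough illustration, the PL1/PL2 interplay can be sketched as a moving-average power budget: bursts up to PL2 are allowed as long as the average power stays at or below PL1. The toy `step` function, the time constant, and the clamping behavior below are simplified assumptions for illustration, not Intel's actual algorithm (which uses an exponentially weighted window with a board-configurable tau):

```python
# Toy model of Intel-style power limiting (assumed, simplified):
# bursts up to PL2 are granted while the running average stays under PL1.
PL1 = 95.0    # watts: sustained limit, the 'TDP'
PL2 = 210.0   # watts: short-term turbo limit
TAU = 28.0    # seconds: averaging time constant (a common default, varies by board)

def step(avg_power, requested, dt=1.0):
    """Return (granted_power, new_average) for one time step."""
    granted = min(requested, PL2)            # never exceed PL2
    alpha = dt / TAU
    new_avg = (1 - alpha) * avg_power + alpha * granted
    if new_avg > PL1:                        # average budget exhausted:
        granted = PL1                        # clamp back to the sustained limit
        new_avg = (1 - alpha) * avg_power + alpha * granted
    return granted, new_avg

# A heavy all-core load requesting 210W: turbo holds until the average catches up.
avg = 0.0
trace = []
for _ in range(120):
    granted, avg = step(avg, requested=210.0)
    trace.append(granted)

print(trace[0], trace[-1])  # starts at PL2, settles at PL1
```

Run against a sustained 210W request, the model turbos at PL2 for the first dozen or so seconds, then falls back to the 95W PL1 line, which is exactly the shape of the graph in Intel's slide.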

Luckily, we can confirm all this in our power testing.

For our testing, we use POV-Ray as our load generator, then take the register values for CPU power. This software method, on most platforms, includes the power split between the cores, the DRAM, and the package power. Some users cite this method as not being fully accurate; however, compared to full-system testing it provides a good number without power delivery losses, and it forms the basis of the power values used inside the processor for its various functions.
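For the curious, software power tools read these same counters via the RAPL (Running Average Power Limit) interface, which Linux exposes through powercap sysfs. A minimal sketch is below; the sysfs path is an assumption that varies by system, reading it typically requires root, and `measure_package_watts` only works on hardware that exposes the counter:

```python
# Sketch of a software package-power reading via Linux powercap/RAPL.
# Assumed path; on many Intel systems the package domain is intel-rapl:0.
import time

PKG_ENERGY = "/sys/class/powercap/intel-rapl:0/energy_uj"  # cumulative microjoules

def read_energy_uj(path=PKG_ENERGY):
    with open(path) as f:
        return int(f.read())

def watts_from_counters(e0_uj, e1_uj, interval_s):
    """Convert two cumulative energy readings into average watts."""
    # A production tool would also handle counter wraparound using
    # max_energy_range_uj; omitted here for brevity.
    return (e1_uj - e0_uj) / 1e6 / interval_s

def measure_package_watts(interval_s=1.0):
    e0 = read_energy_uj()
    time.sleep(interval_s)
    e1 = read_energy_uj()
    return watts_from_counters(e0, e1, interval_s)

# Pure-math demonstration, no hardware needed: 95 joules consumed in one
# second is exactly the 95W 'TDP' figure.
print(watts_from_counters(0, 95_000_000, 1.0))
```

The key point is that the counter is cumulative energy, not instantaneous power, so every reading is an average over the sampling interval.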

Starting with the easy one, maximum CPU power draw.

Power (Package), Full Load

Focusing on the new Intel CPUs we have tested, both of them go beyond the TDP value, but do not hit PL2. At this level, the CPU is running all cores and threads at the all-core turbo frequency. Both 168.48W for the i9-9900K and 124.27W for the i7-9700K are far above that ‘TDP’ rating noted above.

Should users be interested, in our testing at 4C/4T and 3.0 GHz, the Core i9-9900K drew only 23W. Doubling the cores and adding another 50%+ to the frequency causes an almost 7x increase in power consumption. When Intel starts pushing those frequencies, the chip needs a lot of juice.
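That near-7x jump is consistent with the textbook rule that dynamic CPU power scales roughly as cores × frequency × voltage². The voltages below are assumed, illustrative values (not measured ones), chosen to show how quickly the voltage-squared term dominates:

```python
# Back-of-the-envelope check of the ~7x power jump. Dynamic power scales
# roughly as cores * frequency * voltage^2. Voltages here are assumed,
# illustrative figures, not measurements.
def scaled_power(base_w, core_ratio, freq_ratio, volt_ratio):
    return base_w * core_ratio * freq_ratio * volt_ratio ** 2

# 4C/4T at 3.0 GHz drew ~23W. Going to 8 cores at 4.7 GHz, and assuming
# the voltage must rise from ~0.85 V to ~1.30 V to hold that frequency:
est = scaled_power(23.0, core_ratio=8 / 4, freq_ratio=4.7 / 3.0,
                   volt_ratio=1.30 / 0.85)
print(f"{est:.0f} W")  # lands in the same ballpark as the measured 168W
```

Doubling the cores and raising the frequency together only account for about 3x; it is the extra voltage needed to hold 4.7 GHz, squared, that supplies the rest.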

If we break out the 9900K into how much power is consumed as we load up the threads, the results look very linear.

This is as we load two threads onto one core at a time. The processor slowly adds power to the cores when threads are assigned.

Comparing to the other two ‘95W’ processors, we can see that the Core i9-9900K pushes more power as more cores are loaded. Despite Intel officially giving all three the same TDP at 95W, and the same PL2 at 210W, there are clear differences due to the fixed turbo tables embedded in each BIOS.

So is TDP Pointless? Yes, But There is a Solution

If you believe that TDP is the peak power draw of the processor under default scenarios, then yes, TDP is pointless, and technically it has been for generations. However, through the decade of quad-core processors, most parts didn't even reach the TDP rating under full load – it wasn't until we started getting higher core count parts, at the same or higher frequencies, that it started becoming an issue.

But fear not, there is a solution. Or at least I want to offer one to both Intel and AMD, to see if they will take me up on the offer. The solution here is to offer two TDP ratings: a TDP and a TDP-Peak. In Intel lingo, this is PL1 and PL2, but basically the TDP-Peak takes into account the ‘all-core’ turbo. It doesn’t have to be covered under warranty (because as of right now, turbo is not), but it should be an indication of the cooling that a user needs to purchase if they want the best performance. Otherwise it’s a case of fumbling in the dark.


276 Comments


  • Targon - Friday, October 19, 2018 - link

    TSMC will do the job for AMD, and in March/April, we should be seeing AMD release the 3700X and/or 3800X that will be hitting the same clock speeds as the 9900K, but with a better IPC.
  • BurntMyBacon - Friday, October 19, 2018 - link

    I am certainly happy that AMD regained competitiveness. I grabbed an R7 1700X early on for thread heavy tasks while retaining use of my i7-6700K in a gaming PC. That said, I can't credit them with everything good that comes out of Intel. To say that Intel would not have released an 8 core processor without AMD is probably inaccurate. They haven't released a new architecture since Skylake and they are still on a 14nm class process. They had to come up with some reason for customers to buy new processors rather than sit on older models. Clock speeds kinda worked for Kaby Lake, but they need more for Coffee Lake. Small, fixed function add-ons that only affect a small portion of the market probably weren't enough. A six core chip on the mainstream platform may have been inevitable. Going yet another round without a major architecture update or new process node, it is entirely possible that the 8-core processor on the mainstream platform was also inevitable. I give AMD credit for speeding up the release schedule, though.

    As to claims that the GF manufacturing is responsible for the entire 1GHz+ frequency deficit, that is only partially true. It is very likely that some inferior characteristics of the node are reducing the potential maximum frequency achievable. However, much of the limitations on frequency also depends on how AMD laid out the nodes. More capacitance on a node makes switching slower. More logic between flip-flops requires more switches to resolve before the final result is presented to the flip-flops. There is a trade-off between the number of buffers you can put on a transmission line as reducing input to output capacitance ratios will speed up individual switch speeds, but they will also increase the number of switches that need to occur. Adding more flip-flops increases the depth of the pipeline (think Pentium 4) and increases the penalty for branch misses as well as making clock distribution more complicated. These are just a few of the most basic design considerations that can affect maximum attainable frequency that AMD can control.

    Consequently, there is no guarantee that AMD will be able to match Intel's clock speeds even on TSMC's 7nm process. Also, given that AMD's current IPC is more similar to Haswell and still behind Skylake, it is not certain that their next processors will have better IPC than Intel either. I very much hope one or the other ends up true, but unrealistic expectations won't help the situation. I'd rather be pleasantly surprised than disappointed. As such, I expect that AMD will remain competitive. I expect that they will close the gaming performance gap until Intel releases a new architecture. Regardless of how AMD's 7nm processors stack up against Intel's best performance-wise, I expect that AMD will likely bring better value at least until Intel gets their 10nm node fully online.
  • Spunjji - Monday, October 22, 2018 - link

    "To say that Intel would not have released an 8 core processor without AMD is probably inaccurate."
    It's technically inaccurate to say they would have never made any kind of 8-core processor, sure, but nobody's saying that. That's a straw man. What they are saying is that Intel showed no signs whatsoever of being willing to do it until Ryzen landed at their doorstep.

    To be clear, the evidence is years of Intel making physically smaller and smaller quad-core chips for the mainstream market and pocketing the profit margins, followed by a sudden and hastily-rescheduled grab for the "HEDT" desktop market the second Ryzen came out, followed by a rapid succession of "new" CPU lines with ever-increasing core counts.

    You're also wrong about AMD's IPC, which is very clearly ahead of Haswell. The evidence is here in this very article where you can see the difference in performance between AMD and Intel is mostly a function of the clock speeds they attain. Ryzen was already above Haswell for the 1000 series (more like Broadwell) and the 2000 series brought surprisingly significant steps.
  • khanikun - Tuesday, October 23, 2018 - link

    " What they are saying is that Intel showed no signs whatsoever of being willing to do it until Ryzen landed at their doorstep."

    Intel released an 8 core what? 3 years before Ryzen. Sure, it was one of their super expensive Extreme procs, but they still did it. They were slowly ramping up cores for the HEDT market, while slowly bringing them to more normal consumer prices. 3 years before Ryzen, you could get a 6 core i7 for $400 or less. A year before that it was like $550-600. 1-2 years before that, a 6 core would be $1000+. 8 cores were slowly coming.

    What Ryzen did was speed up Intel's timeframe. They would have come anyway, and come at a price point that normal consumers would be purchasing them at. If I had to guess, we're probably 2-3 years ahead of what Intel wanted to do.

    Now would Ryzen exist, if not for Intel? Core for core, AMD has nothing that can compete with Intel. So...ramp up the core count. We really don't see Intel going away from a unified die design, so that's the best way AMD has to fight Intel. I'm personally surprised AMD didn't push their MCM design years ago. Maybe they didn't want to cannibalize Opteron sales, bad yields, I don't know. Must have been some reason.
  • Cooe - Friday, October 19, 2018 - link

    Rofl, delusional poster is delusional. And anyone who bought a 2700X sure as shit doesn't need to do anything to "defend their purchase" to themselves hahaha.
  • evernessince - Saturday, October 20, 2018 - link

    Got on my level newb. The 9900K is a pittance compared to my Xeon 8176. I hope you realized that was sarcasm and how stupid it is to put people down for wanting value.
  • JoeyJoJo123 - Friday, October 19, 2018 - link

    >I think far too much emphasis has been placed on 'value'.

    Then buy the most expensive thing. There's no real need to read reviews at that point either. You just want the best, money is no object to you, and you don't care, cool. Just go down the line and put the most expensive part for each part of the PC build as you browse through Newegg/Amazon/whatever, and you'll have the best of the best.

    For everyone else, where money is a fixed and limited resource, reading reviews MATTERS because we can't afford to buy into something that doesn't perform adequately for the cost investment.

    So yes, Anandtech, keep making reviews to be value-oriented. The fools will be departed with their money either way, value-oriented review or not.
  • Arbie - Friday, October 19, 2018 - link

    They'll be parted, yes - and we can hope for departed.
  • GreenReaper - Saturday, October 20, 2018 - link

    Don't be *too* harsh. They're paying the premium to cover lower-level chips which may be barely making back the cost of manufacturing, thus making them a good deal. (Of course, that also helps preserve the monopoly/duopoly by making it harder for others to break in...)
  • Spunjji - Monday, October 22, 2018 - link

    Yeah, to be honest the negatives of idiots buying overpriced "prestige" products tend to outweigh the "trickle down" positives for everyone else. See the product history of nVidia for the past 5 years for reference :/
