Power Consumption: TDP Doesn't Matter

Regular readers may have come across a recent article I wrote about the state of power consumption and the magic 'TDP' numbers that Intel prints on the side of its processors. In that piece, I argued that the single number is often both misleading and irrelevant, especially for the new Core i9 parts sitting at the top of Intel's product stack. These parts, labeled 95W, can easily go beyond 160W, and motherboard manufacturers don't adhere to Intel's official specifications on turbo duration. Users without appropriate cooling could hit thermal throttling very quickly.

Well, I'm here to tell you that the TDP numbers for the G5400 and 200GE are similarly misleading and irrelevant, but in the opposite direction.

On the official specification lists, the Athlon 200GE is rated at 35W - all of AMD's GE processors are rated at this value. The Pentium G5400 situation is a bit more complex, as it carries one of two values: 54W or 58W, depending on whether the processor comes from a native dual-core design (54W) or a cut-down quad-core design (58W). There's no real way to tell which one you have without taking the heatspreader off and seeing how big the silicon is.

For our power tests, we probe the internal power registers during a heavy load (in this case, POV-Ray) and see what numbers they spit out. Both Intel and AMD have been fairly good in recent memory at keeping these registers open, exposing package, core, and other power values. TDP relates to the full CPU package, so here's what we see with a full load on both chips:

[Chart: Power (Package), Full Load]

That was fairly anticlimactic. Both CPUs have power consumption numbers well below the rated number on the box - AMD at about half, and Intel below half. So when I said those numbers were misleading and irrelevant, this is what I meant.
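For anyone who wants to poke at these registers themselves, below is a minimal sketch of sampling package power on Linux through the RAPL powercap interface. This is not the exact tooling used for this review, and the sysfs path assumes the intel_rapl driver is loaded (recent kernels expose AMD package energy through a similar interface, though support varies by CPU and kernel version):

```python
# Minimal sketch: estimate average CPU package power from the Linux RAPL
# powercap counters. Run it while a heavy all-core load (e.g. POV-Ray) is
# active. The path and driver support are assumptions; they vary by platform.
import time

RAPL_ENERGY = "/sys/class/powercap/intel-rapl:0/energy_uj"  # package domain

def read_energy_uj() -> int:
    """Read the cumulative package energy counter, in microjoules."""
    with open(RAPL_ENERGY) as f:
        return int(f.read().strip())

def average_package_power(interval_s: float = 1.0) -> float:
    """Average package power in watts over one sampling interval."""
    e0 = read_energy_uj()
    time.sleep(interval_s)
    e1 = read_energy_uj()
    # The counter wraps around eventually; wrap handling is omitted here.
    return (e1 - e0) / 1e6 / interval_s

if __name__ == "__main__":
    print(f"Package power: {average_package_power():.1f} W")
```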

Truth be told, we can look at this analytically. AMD's big chips, with eight cores and simultaneous multithreading, carry a box number of 105W and a tested result of 117W. That's at high frequency (4.3 GHz) across all cores, so if we cut that down to two cores at the same frequency, we get about 29W, which is already under the 200GE TDP. Scale the frequency back, as well as the voltage, and remember that the relationship is non-linear, and it's quite clear to see where the 18W peak power of the 200GE comes from. The Intel chip is similar.
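As a back-of-the-envelope check, here is that scaling written out, assuming dynamic power goes roughly with cores, frequency, and voltage squared. The voltage figures below are illustrative guesses rather than measured values:

```python
# Back-of-the-envelope scaling from the tested 8-core result down to a
# 2-core part, assuming power ~ cores * frequency * voltage^2.
# The voltages are illustrative assumptions, not measured values.
measured_8c_watts = 117.0                # tested package power, 8-core chip
cores_big, cores_small = 8, 2
freq_big, freq_small = 4.3, 3.2          # GHz: all-core turbo vs 200GE base
volt_big, volt_small = 1.35, 1.10        # V: assumed operating voltages

# Step 1: same frequency and voltage, just fewer cores (~29 W)
two_cores_full_speed = measured_8c_watts * cores_small / cores_big

# Step 2: scale frequency linearly and voltage quadratically (~14.5 W)
two_cores_scaled = (two_cores_full_speed
                    * (freq_small / freq_big)
                    * (volt_small / volt_big) ** 2)

print(f"Two cores at 4.3 GHz:         {two_cores_full_speed:.1f} W")
print(f"Two cores at 3.2 GHz, ~1.1 V: {two_cores_scaled:.1f} W")
# Add a few watts of uncore/SoC power and the measured ~18 W peak of the
# 200GE looks entirely plausible.
```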

So why even rate it that high?

Several reasons. Firstly, vendors will argue that TDP is a measure of cooling capacity, not power draw (technically true), so a 35W or 54W cooler is overkill for these chips, helping keep them cool and viable for longer (especially as they might already be rejected silicon). Rating close to the actual power consumption might also give motherboard vendors more reason to cheap out on power delivery for the cheapest products. Then there's the argument that some chips - the ones that barely make the grade - might actually hit that power value at load, so the rating has to cover all scenarios. There's also perhaps a bit of market expectation: if you call it an 18W processor, people might not take it seriously.

It barely makes sense, but there we are. This is why we test.

Comments

  • kkilobyte - Monday, January 14, 2019 - link

    s/i3/Pentium. Obviously :)
  • freedom4556 - Monday, January 14, 2019 - link

    I think you messed up your charts for Civ 6's IGP testing. That or why are you testing the IGP at 1080p Ultra when all the other IGP tests are at 720p Low?
  • freedom4556 - Monday, January 14, 2019 - link

    Also, the 8k and 16k tests are pointless wastes of time. Especially in this review, but also in the others. Your low/med/high/ultra should be 720p/1080p/1440p/4k if you want to actually represent the displays people are purchasing.
  • nevcairiel - Monday, January 14, 2019 - link

    The Civ6 tests are like that because that's when it really starts to scale like the other games. Look at its IGP vs Low, which is 1080p vs 4K. The values are almost identical (and still pretty solid). Only when you move to 8K and then 16K do you see the usual performance degradation you would see with other games.
  • AnnoyedGrunt - Tuesday, January 15, 2019 - link

    I second this motion. Please have settings to cover the various common monitor choices. 1080P is an obvious choice, but 1440P should be there too, along with 4K. I don't think you need to run two 4K versions, or two 1080P versions, or whatever. I have a 1440P monitor so it would be nice to see where I become GPU limited as opposed to CPU limited. Maybe Civ6 could use some extra high resolutions in the name of science, but to be useful, you should at least include the 1440P on all games.

    Thanks.

    -AG
  • eddieobscurant - Monday, January 14, 2019 - link

    Another pro intel article from Ian, who hopes that someday intel will hire him
  • PeachNCream - Monday, January 14, 2019 - link

    The numbers in the chart speak for themselves. You don't have to acknowledge the conclusion text. It's only a recommendation anyway. Even though I'd personally purchase a 200GE if I were in the market, I don't think there is any sort of individual bias coming into play. Where the 200GE is relevant, gaming on the IGP, Ian recommended it. In other cases the G5400 did come out ahead by enough of a margin to make it worth consideration. The only flaw I could tease out of this is the fact that the recommendation is based on MSRP and, as others have noted, the G5400 is significantly above MSRP right now. It may have been good to acknowledge that more strongly in the intro and conclusion, but leaning on current street prices means the article may not stand up as well to the test of time for someone finding it via Google six months later while searching for advice on these CPUs.
  • kkilobyte - Monday, January 14, 2019 - link

    Acknowledge "in a stronger manner"? Well, it is actually not acknowledged in the conclusion at all!

    The title of the article is: "The $60 CPU question". One of those CPUs is clearly not being sold at $60 on average, but is priced significantly higher. I think the article should have compared CPUs that are really available at (around) $60.

    So maybe there is no personal bias - but there is clearly ignorance of the market state. And that's surprising, since the G5400's price has been above its MSRP for several months already; how could a professional journalist in the field ignore that?

    I guess it could be objected that "MSRP was always used in the past as the reference price". Granted - but it made sense while the MSRP was close to the real market price. It doesn't anymore once the gap gets big, which is the case for the G5400. Nobody gives a damn about the theoretical price if it is applied nowhere on the market.

    And the 'numbers in the chart' don't 'speak for themselves' - they are basically comparing CPUs whose retail prices, depending on where you get them, show a 20-40% gap. What's the point? Why isn't there a price/performance graph, as there was in past reviews? The graphs could just as well include high-end CPUs, and they would be just as useless.

    If I want to invest ~$60 in a CPU, I'm not interested in knowing how a ~$90 one performs!
  • sonny73n - Tuesday, January 15, 2019 - link

    +1

    I couldn’t have said it better myself.
  • cheshirster - Wednesday, January 23, 2019 - link

    Yes, the 5400 is priced nowhere near $60 and the reviewer definitely knows it, but fails to mention this in the conclusion.
