Power Consumption: TDP Doesn't Matter

Regular readers may have come across a recent article I wrote about the state of power consumption and the magic 'TDP' numbers that Intel writes on the side of its processors. In that piece, I argued that the single number is often both misleading and irrelevant, especially for the new Core i9 parts sitting at the top of Intel's offerings. These parts, labeled 95W, can easily exceed 160W, and motherboard manufacturers don't adhere to Intel's official specifications on turbo time. Users without appropriate cooling can hit thermal throttling very quickly.

Well, I'm here to tell you that the TDP numbers for the G5400 and 200GE are similarly misleading and irrelevant, but in the opposite direction.

On the official specification lists, the Athlon 200GE is rated at 35W - all of AMD's GE processors are rated at this value. The Pentium G5400 situation is a bit more complex, as Intel lists two values: 54W or 58W, depending on whether the processor comes from a native dual-core design (54W) or a cut-down quad-core design (58W). There's no real way to tell which one you have without removing the heatspreader and seeing how big the silicon is.

For our power tests, we probe the internal power registers during a heavy load (in this case, POV-Ray) and see what numbers come out. Both Intel and AMD have been fairly good in recent generations about keeping these registers accessible, exposing package, core, and other power values. TDP relates to the full CPU package, so here's what we see with a full load on both chips:

[Graph: Power (Package), Full Load]

That was fairly anticlimactic. Both CPUs draw well below the rated number on the box - AMD at about half, and Intel below half. So when I said those numbers were misleading and irrelevant, this is what I meant.
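
For anyone who wants to sanity-check their own chip, these counters are not hard to read: on Linux, the package-level energy data is exposed through the powercap (RAPL) sysfs interface, which recent kernels also support on AMD's Zen parts. The sketch below polls the package domain once a second and converts the energy delta into watts; it is an illustration of the method rather than our actual test harness, and it assumes the relevant RAPL driver is loaded and /sys/class/powercap is readable (typically as root).

```python
#!/usr/bin/env python3
# Minimal package-power poller using the Linux powercap (RAPL) sysfs interface.
import time
from pathlib import Path

DOMAIN = Path("/sys/class/powercap/intel-rapl:0")  # package-0 power domain
INTERVAL = 1.0                                     # seconds between samples

def read_energy_uj() -> int:
    """Cumulative package energy in microjoules."""
    return int((DOMAIN / "energy_uj").read_text())

if __name__ == "__main__":
    print("Domain:", (DOMAIN / "name").read_text().strip())
    wrap = int((DOMAIN / "max_energy_range_uj").read_text())  # counter wrap point
    last = read_energy_uj()
    while True:
        time.sleep(INTERVAL)
        now = read_energy_uj()
        delta_uj = (now - last) % wrap            # handles counter wrap-around
        print(f"Package power: {delta_uj / INTERVAL / 1e6:6.2f} W")
        last = now
```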

Truth be told, we can look at this analytically. AMD's big eight-core chip with simultaneous multithreading has a box number of 105W and a tested result of 117W. That's with all cores at a high frequency (4.3 GHz), so cutting that down to two cores at the same frequency gives about 29W, which is already under the 200GE's 35W TDP. Scale back the frequency and the voltage, remember that power scales non-linearly with voltage, and it's quite clear where the 18W peak power of the 200GE comes from. The Intel chip is similar.
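
To put rough numbers on that scaling: dynamic power goes approximately as frequency times voltage squared, so cutting the core count and then pulling back frequency and voltage compounds quickly. The sketch below uses the measured 117W eight-core figure and the 200GE's 3.2 GHz clock; the operating voltages are illustrative assumptions, not measured values.

```python
# Back-of-the-envelope scaling from the measured eight-core result.
eight_core_watts = 117.0                  # measured package power, eight cores at ~4.3 GHz
two_core_watts = eight_core_watts * 2 / 8
print(f"Two cores at 4.3 GHz: ~{two_core_watts:.0f} W")          # ~29 W, already under 35 W

# Dynamic power scales roughly with f * V^2; the voltages here are assumptions.
f_big, v_big = 4.3, 1.35                  # GHz, volts (assumed all-core operating point)
f_ge,  v_ge  = 3.2, 1.10                  # GHz, volts (200GE clock; voltage assumed)
core_watts = two_core_watts * (f_ge / f_big) * (v_ge / v_big) ** 2
print(f"Scaled to 200GE clocks and volts: ~{core_watts:.0f} W")  # ~14-15 W for the cores
# Add a few watts of SoC/uncore power and the ~18 W package figure falls out.
```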

So why even rate it that high?

Several reasons. First, vendors will argue that TDP is a measure of cooling capacity, not power (technically true), so pairing these chips with a 35W or 54W cooler is overkill, helping keep them cool and viable for longer (as they might already be rejected silicon). Rating the chip close to its actual power consumption might also give motherboard vendors more reason to cheap out on power delivery on the cheapest products. Then there's the argument that some chips, the ones that barely make the grade, might actually hit that power value at load, so the rating has to cover all scenarios. There's also perhaps a bit of market expectation: if you say it's an 18W processor, people might not take it seriously.

It barely makes sense, but there we are. This is why we test.

Comments

  • perdomot - Saturday, January 19, 2019

    How does the author of this article not know that the price of the G5400 is in the $120+ range? At that price, the 1300X would be the appropriate comparison, and it clearly smokes the Intel CPU in the benches. The author needs a reprimand for this poor work.
  • mito0815 - Thursday, January 24, 2019

    Oh ffs. Been a while since I was around, and OH WOULD YOU LOOK AT THAT, the AMD shilling and fanboyism in the comments has become just as unbearable as I'd imagined. People, he set up two budget CPUs on a comparable level (AMD strong in GPU, Intel a tad stronger in CPU performance & clocks) against each other... nothing more, nothing less. Store prices for Intel CPUs being so inflated isn't really Intel's fault now, is it? The intended stock prices are still very much comparable. By your logic, AMD wouldn't have been quite the price/performance god you all worship during the mining GPU price explosion, would it?

    But no, all you guys want is an article with some AMD CPU coming out on top, no matter how it's done. Get over yourselves. By the looks of it, while GPU is still a sore point for AMD, Ryzen 2 looks good so far. Wait for that and don't go all rampant now.
  • kkilobyte - Saturday, January 26, 2019

    With the article title starting with "The $60 CPU question", it is not unreasonable 'fanboi-ism' to expect that the article compares CPUs costing, well, around $60.

    And the issue is not about whether or not Intel is guilty of the current high prices.

    The problem is that the article draws conclusions that simply don't match reality, precisely because it doesn't address the current discrepancy between street prices and the manufacturer's suggested ones. It would have taken a single paragraph to explain that.

    My issue with the article is that, contrary to what you are writing, it doesn't compare CPUs of a similar (price) level. What it does is compare CPUs of similar *theoretical* price levels, but it draws conclusions as if those were the commonly seen street prices. This is dishonest and misleading.
  • watersb - Saturday, February 9, 2019

    Thanks for this review. I usually build low-end systems (PCs for family members), buy off-lease enterprise stuff (test servers), or pick up used Apple or Lenovo gear (rebuilds and workstation projects).

    Budget gaming gear for the kids, then helping them upgrade the graphics card later, seems to be the one remaining path into the "gaming enthusiast" hobby.

    Everyone else gets a Chromebook. And a Raspberry Pi.
  • Dr Hasan - Tuesday, November 26, 2019

    Why are all the products old, and the prices too? The Athlon 3000G is $50 and the Ryzen 2200G is less than $100.
