The New PowerTune: Adding Further States

In 2010 AMD introduced their PowerTune technology alongside their Cayman GPU. PowerTune was a new, advanced method of managing GPU voltages and clockspeeds, with the goal of offering better control over power consumption at all times so that AMD could be more aggressive with their clockspeeds. PowerTune’s primary task was to rein in programs like FurMark – power viruses, as AMD calls them – so that these programs would not push a card past its thermal/electrical limits. Consequently, with PowerTune in place AMD would not need to set their maximum GPU clocks as conservatively merely to handle the power virus scenario.

This technology was carried forward to the entire Southern Islands family of GPUs and remained virtually unchanged. PowerTune as implemented on SI cards without Boost had 3 states – idle, intermediate (low-3D), and high (full-3D). When for whatever reason PowerTune needed to clamp down on power usage to stay within the designated limits, it could either jump states or merely turn down the clockspeed, depending on how far over the limit the card was trying to go. In practice state jumps were rare – there’s a big gap between high and intermediate – so for non-boost cards PowerTune would simply turn down the GPU clockspeed until power consumption was where it needed to be.

Modulating clockspeeds in such a manner is relatively easy to implement, but it’s not without its drawbacks, the chief one being that semiconductor power consumption scales at a far greater rate with voltage than it does with clockspeed – dynamic power rises roughly linearly with frequency, but with the square of voltage. So although turning down clockspeeds does reduce power consumption, it doesn’t do so by a large degree. If you want big power savings, you need to turn down the voltage too.
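To put rough numbers on that scaling, here is a minimal sketch using the textbook dynamic power approximation (P ≈ C·V²·f); the capacitance, voltage, and clockspeed figures are made up for illustration and are not AMD’s actual values.

```python
# Illustrative only: the standard dynamic-power approximation P ~ C * V^2 * f,
# with made-up capacitance/voltage/clock numbers, not AMD's actual values.

def dynamic_power(c_eff, voltage, freq_mhz):
    """Approximate switching power: P = C_eff * V^2 * f."""
    return c_eff * voltage ** 2 * freq_mhz

baseline = dynamic_power(c_eff=1.0, voltage=1.20, freq_mhz=1000)

# Clock-only throttle (the old inferred states): -15% frequency -> ~15% less power.
clock_only = dynamic_power(1.0, 1.20, 850)

# Dropping to a lower state that also trims voltage: -15% frequency and -10% voltage.
clock_and_volt = dynamic_power(1.0, 1.08, 850)

print(f"clock-only throttle:      {100 * (1 - clock_only / baseline):.0f}% power saved")
print(f"clock + voltage throttle: {100 * (1 - clock_and_volt / baseline):.0f}% power saved")
```

Even with a modest 10% voltage cut alongside the same clockspeed reduction, the power savings roughly double in this simplified model, which is the whole motivation for moving to full states.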

Starting with the 7790 and Bonaire, this is exactly what AMD is doing. Gone is pure clockspeed modulation – inferred states in AMD’s nomenclature – and in its place AMD is moving to a larger number of full states. GCN 1.1 has 8 states altogether, with no inferred states between them. With this change, when PowerTune needs to reduce clockspeeds it can drop to a nearby state, reducing power consumption through both clockspeed and voltage reductions at the same time.
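As a rough mental model, the new arrangement amounts to a small table of discrete clock/voltage pairs that PowerTune steps through. The sketch below uses hypothetical values – AMD does not publish the per-state clocks and voltages – purely to show the idea of dropping to an adjacent state.

```python
# Hypothetical DPM state table; the clock/voltage pairs are illustrative only,
# not Bonaire's real values.
DPM_STATES = [  # (clock in MHz, voltage in V), lowest to highest
    (300, 0.95), (400, 0.98), (500, 1.00), (600, 1.05),
    (700, 1.08), (800, 1.12), (900, 1.16), (1000, 1.20),
]

def drop_to_lower_state(index):
    """When over the power limit, step down to the adjacent state,
    cutting clockspeed and voltage together."""
    return max(index - 1, 0)

state = len(DPM_STATES) - 1            # start in the highest state
state = drop_to_lower_state(state)     # throttle one state down
clock, volt = DPM_STATES[state]
print(f"now running at {clock}MHz / {volt:.2f}V")
```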

With this change state jumping will also be a far more frequent occurrence. The lack of intermediate states and the limited granularity (8 states over 700MHz is not fine-grained) effectively make fast state jumping a requirement, as there’s a very good chance that dropping down a state will leave some power/performance on the table. So when it’s throttling, the 7790 will be able to jump states as quickly as every 10ms (that’s 100 jumps a second), typically bouncing between two or more states in order to keep the card within its limits.
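To see why rapid jumping recovers the performance a single lower state would leave on the table, consider a card alternating between two adjacent states at every evaluation interval; the per-state wattage figures and power budget below are invented numbers, not measurements.

```python
# Illustrative duty-cycling between two adjacent states; the per-state power
# draws and the budget are invented numbers.
HIGH = (1000, 28.0)   # (clock in MHz, watts drawn in this state)
LOW  = (900, 24.0)

def high_state_fraction(power_budget_w):
    """Fraction of 10ms intervals to spend in the high state so that the
    time-averaged power lands on the budget."""
    frac = (power_budget_w - LOW[1]) / (HIGH[1] - LOW[1])
    return min(max(frac, 0.0), 1.0)

budget = 26.0  # watts (illustrative)
frac = high_state_fraction(budget)
avg_clock = frac * HIGH[0] + (1 - frac) * LOW[0]
print(f"{frac:.0%} of intervals in the high state -> ~{avg_clock:.0f}MHz average")
```

Spending half the intervals in each state in this example yields an effective clockspeed halfway between the two, rather than being stuck at the lower state outright.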

At the same time, AMD’s formula for picking states on non-boost cards has changed. In a move similar to what AMD has done with Richland, AMD’s temperature-agnostic state selection system has been ditched in favor of one that factors temperature into the calculation, making it a system that is based on power, temperature, and load. There are some minor benefits to being temperature-agnostic that AMD is giving up – mainly that performance is now going to vary a bit with temperature – but at the end of the day this allows AMD to better min-max their GPUs and hit higher frequencies more often. This also brings them to parity with Intel and NVIDIA, who have long taken temperature into account.
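AMD has not published the actual selection algorithm, but a toy version of a power/temperature/load-aware heuristic might look like the following; the 85W and 95C limits and the simple one-state back-off are assumptions for illustration only.

```python
# Toy state-selection heuristic based on power, temperature, and load.
# All thresholds are invented; AMD's real algorithm is not public.
NUM_STATES = 8

def pick_state(power_w, temp_c, load_pct, power_limit_w=85.0, temp_limit_c=95.0):
    """Return a state index from 0 (lowest) to NUM_STATES - 1 (highest)."""
    target = round(load_pct / 100 * (NUM_STATES - 1))   # load alone picks the state
    if power_w > power_limit_w or temp_c > temp_limit_c:
        target -= 1                                      # back off if either limit is hit
    return min(max(target, 0), NUM_STATES - 1)

print(pick_state(power_w=80.0, temp_c=70.0, load_pct=100))  # 7: full load, within limits
print(pick_state(power_w=90.0, temp_c=70.0, load_pct=100))  # 6: over the power limit
print(pick_state(power_w=80.0, temp_c=98.0, load_pct=100))  # 6: over the temperature limit
```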

The fact that this is a very boost-like system is not lost on us, and with these changes the line between PowerTune with and without boost starts to become foggy. Both are ultimately going to be doing the same thing – switching states based on power and temperature considerations – with the only difference being whether a card adjusts down, or adjusts both up and down. In practice we rarely see cards adjust down outside of FurMark, so while PowerTune doesn’t dictate a clockspeed floor, base clocks are still base clocks. In that case the practical difference between whether an AMD card has boost or not comes down to whether it can access some higher voltage, higher clockspeed states that it may not be able to maintain for long periods of time across all workloads. The 7790 isn’t a boost part, of course, but AMD’s own presentation neatly lays out where boost would fit in, so if we do see future GCN 1.1 products with boost we have a good idea of what to expect.

Moving on, with the changes to PowerTune will also come changes to AMD’s API for 3rd party utilities, and what information is reported. First and foremost, due to the frequency of state changes with the new PowerTune, AMD will no longer be reporting the instantaneous state. Instead they will be reporting an average of the states used. We don’t know how big the averaging window is – we suspect it’s no more than 2 seconds – but the end result will be that MSI Afterburner, GPU-Z, and other utilities will now see those averages reported as the clockspeed. This will give most users a better idea of what the effective clockspeed (and thereby effective performance) is, but it does mean that it’s going to be virtually impossible to infer the clockspeeds/voltages of AMD’s new states.
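In practical terms, a monitoring utility would be shown something like a rolling average of recent state samples rather than the instantaneous clock. The sketch below assumes a 2-second window of 10ms samples, per our guess above; the sample clocks are invented.

```python
from collections import deque

# Rolling-average clock reporting; window size and sample values are assumptions.
class AverageClockReporter:
    def __init__(self, window_samples=200):      # e.g. 200 x 10ms = ~2s of history
        self.samples = deque(maxlen=window_samples)

    def record(self, clock_mhz):
        self.samples.append(clock_mhz)

    def reported_clock(self):
        return sum(self.samples) / len(self.samples) if self.samples else 0.0

reporter = AverageClockReporter()
for clock in (1000, 1000, 900, 1000, 900, 900, 1000):   # card bouncing between states
    reporter.record(clock)
print(f"a utility would read ~{reporter.reported_clock():.0f}MHz")
```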

The other change is that with the new PowerTune AMD will be exposing new tweaking options to 3rd parties. The current PowerTune (TDP) setting is going to be joined by a separate setting for adjusting a limit called Total Design Current (TDC), which, as the name implies, is how much current is allowed to be passed into the GPU. AMD limits cards by both TDP and TDC to keep total power, temperatures, and total currents in check, so this will open up the latter to tweakers. Unfortunately utilities with TDC controls were not ready in time for our 7790 review, so we can’t really comment on TDC at this time. With AMD’s changes to PowerTune, however (and their insistence on calling TDP thermal management), TDP may be turning into a temperature control while TDC becomes the new power control.
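Conceptually the card is now policed by two independent knobs, with a throttle decision kicking in if either ceiling is exceeded. A minimal sketch of that dual check follows; the limit values are placeholders, not the 7790’s actual TDP/TDC settings.

```python
# Dual-limit check: throttle if either the power (TDP) or current (TDC)
# ceiling is exceeded. Limit values are placeholders.
def over_limits(power_w, current_a, tdp_limit_w, tdc_limit_a):
    return power_w > tdp_limit_w or current_a > tdc_limit_a

print(over_limits(power_w=88.0, current_a=70.0, tdp_limit_w=85.0, tdc_limit_a=75.0))  # True
print(over_limits(power_w=88.0, current_a=70.0, tdp_limit_w=95.0, tdc_limit_a=75.0))  # False
```

Raising either limit independently is exactly the kind of knob partners and tweakers would use to buy headroom for factory overclocks.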

Finally, since these controls are going to be user-accessible, this will spill over to AMD’s partners. Partners will be able to set their own TDP and TDC limits if they wish, which will help them fine-tune their factory overclocked cards. This will give partners more headroom for such cards, as opposed to being stuck shipping cards at AMD’s reference limits, but it also means that different cards from different vendors may have different base TDP and TDC limits, along with different clockspeeds. This in turn means that in the future, equalizing clockspeeds may not be enough to equalize two cards.

Comments

  • Sabresiberian - Monday, March 25, 2013 - link

    A roadmap is nothing but a projection of what is PLANNED for the future, not some kind of "promise" or "guarantee". Calling AMD people liars because the released product didn't match the projection is childish at best.

    And before you slap the "fanboy" label on me, I prefer Nvidia generally speaking (but I'm not going to cut off my proverbial nose to spite my face in order to be brand loyal; if AMD has the current best solution for my purposes, I'm going to buy AMD).
  • CeriseCogburn - Saturday, March 23, 2013 - link

    128 bit bus is great, the HD5770 proved that.
    BWHAHAHHAAA
  • dishayu - Friday, March 22, 2013 - link

Good eye. But then they mention HD7790 as Pitcairn LE in that infographic. What they have launched as HD7790 now is Bonaire.
  • ShieTar - Friday, March 22, 2013 - link

    Maybe they surprised themselves by getting GDDR5 to run at 6GHz, and realized that they can stick with 128bit at that speed?
  • Lonyo - Friday, March 22, 2013 - link

They were going to use a cut-down Pitcairn (the 7870/7850 GPU), cutting down the GPU core to make use of excess chips that couldn't make the cut as 7870/7850s.
    They might have gone with 256-bit to simplify the product for AIB partners who could just re-use their HD7850 designs, rather than needing a new design for a smaller run product.

    The 7790 now is a new GPU designed to be cheaper to produce (as it's smaller) than Pitcairn, and the fact the memory can run at 6GHz is probably due in part to the fact it's a new GPU rather than a cut down Pitcairn.
  • CeriseCogburn - Friday, March 22, 2013 - link

    I don't see a launch date in the whole article, it's NOT available. I guess that's another mystery freebie for AMD's products here.
    Didn't see port config either, so what cabling do we have to buy to run 3 monitors when Asus 650ti runs 4 out of the box, 3 with dvi and vga only ?
    Not impressed with the huge AMD biased game line up either, so expect your mileage to be less than shown.
    No overclock talk really either - so it must blow at that.
    Other sites are reporting amd's beta driver, so maybe they won't even have a release driver for this card when they release it, as AMD is often known to do, for like a year sometimes or forever in terms of any sort of quality-LOL.
    Civ5 has only 1 bench rez, it must have crashed in others.
    Crossfire ? Article didn't say.
    Multi-monitor - no talk of that anymore since nVidia SPANKS amd to death on that now.
    Hopefully you've fooled the internet tards again, because amd is bankrupt, for good reason.
  • Spoelie - Friday, March 22, 2013 - link

    Let's feed the troll.

    Did you even read the article?
-Launch date is mentioned on page 1, in one and a half weeks
    -Ports are clearly visible and standard, 2 DVI + HDMI + DisplayPort
    -Lineup is consistent with every other review on Anandtech.
    -There's an entire page on the new PowerTune and how it impacts overclocking, single sample OC investigation is irrelevant and best left for a dedicated vendor comparison.
    -... really?
    Who's the real tard here?
  • Spunjji - Friday, March 22, 2013 - link

    Oh for a down-vote button. We expect no less than mindless bollocks from Cerise, but failing to read the article entirely is a new low.
  • CeriseCogburn - Saturday, March 23, 2013 - link

    No, that's what you do all the time. But thanks for the compliment, since you know I always read the articles completely, yet you think I didn't this time, WRONG.
I've made a lot of money this past short week without a lot of rest, so I'll give you and dipsy doodle a point on the svengali launch date the article writer for the first time EVER declares "solid" before it even occurs, oh wait, he always does that when it's AMD, but if it's nVidia he says we'll have to wait and see as they are probably lying...
    ROFL
    Who cares, the card sucks, amd is dying, the drivers blow beta chunks, and amd is way late to the party.
  • ppeterka - Thursday, July 18, 2013 - link

    Just a question: And how much will your favored brand of GPUs cost, if AMD really dies? 10 times? 100 times? An arm, a leg, and both kidneys? Grow up, and understand how an ecosystem works for us all.

BTW. I don't have GPU preferences, just grab what gives best bang for bucks. If it has EasternElbonianVideoPigs GPU on it - be it...
