PowerTune, Cont

PowerTune’s functionality is accomplished in a two-step process. The first step is defining the desired TDP of a product. Notably (and unlike NVIDIA) AMD is not using power monitoring hardware here, citing the costs of such chips and the additional design complexities they create. Instead AMD is profiling the performance of their GPUs to determine what the power consumption behavior is for each functional block. This behavior is used to assign a weighted score to each functional block, which in turn is used to establish a rough equation to find the power consumption of the GPU based on each block’s usage.

AMD doesn’t provide the precise equations used, but you can envision it looking something like this:

Power Consumption = ((shaderUsage * shaderWeight) + (ropUsage * ropWeight) + (memoryUsage * memoryWeight)) * clockspeed

In the case of the Radeon HD 6970, the TDP is 250W, while the default clockspeed is 880MHz.
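AMD's real equation isn't public, but the weighted-sum idea can be sketched in a few lines of Python. The block names, weights, usage figures, and the `scale` constant below are all illustrative assumptions, not AMD's profiling data:

```python
# Hypothetical sketch of PowerTune-style power estimation. The block
# names, weights, usage figures, and 'scale' constant are illustrative
# assumptions, not AMD's actual profiling data.

# Per-block weights, which AMD derives by profiling the measured power
# behavior of each functional block.
WEIGHTS = {"shader": 0.55, "rop": 0.25, "memory": 0.20}  # assumed values

def estimated_power(usage, clockspeed_mhz, scale=0.35):
    """Estimate GPU power draw (watts) from per-block activity (0..1).

    'scale' is an assumed constant folding voltage and process factors
    into a single coefficient; AMD's real equation is not public.
    """
    weighted = sum(usage[block] * WEIGHTS[block] for block in WEIGHTS)
    return weighted * clockspeed_mhz * scale

# Estimate for a demanding workload on a 6970-like part at its
# default 880MHz core clock:
heavy = {"shader": 0.95, "rop": 0.80, "memory": 0.70}
watts = estimated_power(heavy, 880)
```

With weights that sum to 1, a fully loaded GPU at the default clock lands at a scale-determined maximum, which suggests how a profiled equation could be tuned so that the worst-case workload sits right at the 250W TDP.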

With a power equation established, AMD can then adjust GPU performance on the fly to keep power consumption under the TDP. This is accomplished by dynamically adjusting just the core clock based on GPU usage a few times a second. So long as power consumption stays under 250W the 6970 stays at 880MHz, and if power consumption exceeds 250W then the core clock will be brought down to keep power usage in check.

It’s worth noting that in practice the core clock and power usage do not have a linear relationship, so PowerTune may have to drop the core clock by quite a bit in order to maintain its power target. The memory clock and even the core voltage remain unchanged (these are only set with PowerPlay states), so PowerTune only has the core clock to work with.
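As a rough illustration, one iteration of such a negative-feedback loop might look like the sketch below. The step size, clock floor, and update policy are assumptions; only the behavior described above (clamp under the TDP, adjust only the core clock, never exceed the default clock) is taken from AMD's description:

```python
# Illustrative sketch of a PowerTune-style negative-feedback loop.
# Only the core clock changes; memory clock and voltage stay fixed by
# PowerPlay states. Step size and clock floor are assumed values.

TDP_WATTS = 250.0     # Radeon HD 6970 TDP
DEFAULT_CLOCK = 880   # MHz, 6970 default core clock
MIN_CLOCK = 500       # MHz, assumed throttle floor
STEP = 10             # MHz per iteration, assumed

def adjust_clock(core_clock_mhz, estimated_watts):
    """One iteration of the loop, run a few times per second in hardware."""
    if estimated_watts > TDP_WATTS:
        # Over budget: throttle. Power isn't linear in clock, so it may
        # take several iterations (or a large drop) to get back under.
        return max(MIN_CLOCK, core_clock_mhz - STEP)
    # Under budget: recover toward, but never past, the default clock;
    # PowerTune only takes clockspeed away, it never adds any.
    return min(DEFAULT_CLOCK, core_clock_mhz + STEP)
```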

Ultimately PowerTune is going to fundamentally change how we measure and classify AMD’s GPUs. With PowerTune the TDP really is the TDP; as a completely game/application agnostic way of measuring and containing power consumption, it’s simply not possible to exceed the TDP. The power consumption of the average game is still below the TDP – sometimes well below – so there’s still an average case and a worst case scenario to discuss, but the range between them just got much smaller.

Furthermore, as a result, real world performance is going to differ from theoretical performance that much more. Just as with CPUs, the performance you get is the performance you get: teraFLOPs, cache bandwidth, and clocks alone won’t tell you everything about the performance of a product. The TDP and whether the card regularly crosses it will factor into performance, just as cooling factors into CPU performance by allowing or prohibiting higher turbo modes. At least for AMD’s GPUs, we’re now going to be talking about how much performance you can get for any given TDP instead of specific clockspeeds, bringing performance per watt to the forefront of importance.

So by now you’re no doubt wondering what the impact of PowerTune is, and the short answer is that there’s virtually no impact. We’ve gone ahead and compiled a list of all the games and applications in our test suite, and whether they triggered PowerTune throttling. Of the fifteen tests, only three triggered PowerTune: FurMark and 3DMark Vantage as expected, and Metro 2033. Furthermore, as you can see, there was a significant difference between the average clockspeeds of our 6970 in the Metro and FurMark cases.

AMD Radeon HD 6970 PowerTune Throttling
Game/Application        Throttled?
Crysis: Warhead         No
BattleForge             No
Metro                   Yes (850MHz)
HAWX                    No
Civilization V          No
Bad Company 2           No
STALKER                 No
DiRT 2                  No
Mass Effect 2           No
Wolfenstein             No
3DMark Vantage          Yes
MediaEspresso 6         No
Unigine Heaven          No
FurMark                 Yes (600MHz)
Distributed.net Client  No

In the case of Metro the average clockspeed was 850MHz; Metro spent 95% of the time running at 880MHz, and only at a couple of points did the core clock drop to around 700MHz. Conversely FurMark, a known outlier, drove the average core clock down to 600MHz for a 30% reduction in the core clock. So while PowerTune definitely had an impact on FurMark performance it did almost nothing to Metro, never mind any other game/application. To illustrate the point, here are our Metro numbers with and without PowerTune.

Radeon HD 6970: Metro 2033 Performance (fps)
Resolution   PowerTune 250W   PowerTune 300W
2560x1600    25.5             26.0
1920x1200    39.0             39.5
1680x1050    64.5             65.0

The difference is no more than 0.5 fps on average, which may as well be within our experimental error range for this benchmark. For everything we’ve tested on the 6970 and the 6950, the default PowerTune settings do not have a meaningful performance impact on any game or application we test. Thus at this point we’re confident that there are no immediate drawbacks to PowerTune for desktop use.

Ultimately this is a negative feedback mechanism, unlike Turbo which is a positive feedback mechanism. Without overclocking the best a 6970 will run at is 880MHz, whereas Turbo would increase clockspeeds when conditions allow. Neither one is absolutely the right way to do things, but there’s a very different perception when performance is taken away, versus when performance is “added” for free. I absolutely like where this is going – both as a hardware reviewer and as a gamer – but I’d be surprised if this didn’t generate at least some level of controversy.

Finally, while we’ve looked at PowerTune in the scope of desktop usage, we’ve largely ignored other cases so far. AMD will be the first to tell you that PowerTune is more important for mobile use than it is for desktop use, and mobile use is all the more important as the balance between desktops and laptops sold continues to slide towards laptops. In the mobile space not only does PowerTune mean that AMD will absolutely hit their TDPs, but it should allow them to produce mobile GPUs that come with higher stock core clocks, comfortable in the knowledge that PowerTune will keep power usage in check for the heaviest games and applications. The real story for PowerTune doesn’t even begin until 2011 – as far as the 6900 series is concerned, this may as well be a sneak peek.

Even then there’s one possible exception we’re waiting to see: the 6990 (Antilles). The Radeon HD 5970 put us in an interesting spot: it was and still is the fastest card around, but unless you can take advantage of CrossFire it’s slower than a single 5870, a byproduct of the fact that AMD had to use lower core and memory clocks to make their 300W TDP. This is in stark contrast to the 4870X2, which really was two 4870s glued together, with the same single-GPU performance. With PowerTune AMD doesn’t necessarily need to repeat the 5970’s castrated clocks; they could make a 6970X2 and let PowerTune clip performance as necessary to keep it under 300W. If an application is running without CrossFire, for example, then there’s no reason not to run the one active GPU at full speed. It would be the best of both worlds.

In the meantime we’re not done with PowerTune quite yet. PowerTune isn’t just something AMD can set – it’s adjustable in the Overdrive control panel too.

167 Comments

  • Remon - Wednesday, December 15, 2010 - link

    Seriously, are you using 10.10? It's not like the 10.11 drivers have been out for a while. Oh, wait...

    They've been out for almost a month now. I'm not expecting you to use the 10.12, as those were released just 2 days ago, but there's no excuse for not using month-old drivers. Testing overclocked Nvidia cards against newly released cards, and now using older drivers. This site gets more biased with each release.
  • cyrusfox - Wednesday, December 15, 2010 - link

    I could be wrong, but 10.11 didn't work with the 6800 series, so I would imagine 10.11 wasn't meant for the 6900 either. If that is the case, it makes total sense why they used 10.10 (because it was the most up-to-date driver available when they reviewed).

    I am still using 10.10e, thinking about updating to 10.12, but why bother, things are working great at the moment. I'll probably wait for 11. or 11.2.
  • Remon - Wednesday, December 15, 2010 - link

    Never mind, that's what you get when you read reviews early in the morning. The 10.10e was for the older AMD cards. Still, I can't understand the difference between this review and HardOCP's.
  • flyck - Wednesday, December 15, 2010 - link

    it doesn't. Anand has the same result for 25.. resolutions with max details, AA and FSAA.

    Presentation on Anand, however, is more focused on 16x..10.. resolutions (last graph). If you look in the first graph you'll notice the 6970/6950 perform like HardOCP's results: the higher the quality, the smaller the gap becomes between the 6950 and 570, and between the 6970 and 580; the lower the quality, the more the 580 runs away while the 6970/6950 trail the 570.
  • Gonemad - Wednesday, December 15, 2010 - link

    Oookay, new card from the red competitor. Welcome aboard.

    But, all of this time, I had to ask: why is Crysis so punitive on graphics cards? I mean, it was released eons ago, and still can't be run with everything cranked up on a single card, if you want 60fps...

    Is it sloppy coding? Does the game *really* look better with all the eye candy? Or did they build an "FPS bug" on purpose, some method of coding that was sure to torture any hardware built in the 18 months after release?

    I will get slammed for this, but for instance, the water effects on Half Life 2 look great even on lower spec cards, once you turn all the eye-candy on, and the FPS doesn't drop that much. The same for some subtle HDR effects.

    I guess I should see this game by myself and shut up about things I don't know. Yes, I enjoy some smooth gaming, but I wouldn't like to wait 2 years after release to run a game smoothly with everything cranked up.

    Another one is Dirt 2. I played it with all the eye candy turned up to the top, and my 5870 dropped to 50-ish FPS (as per benchmarks); it could be noticed eventually. I turned one or two things off, checked that they were not missed after another run, and the in-game FPS meter jumped to 70. Yay.
  • BrightCandle - Wednesday, December 15, 2010 - link

    Crysis really does have some fabulous graphics. The amount of foliage in the forests is very high. Crysis kills cards because it really does push current hardware.

    I've got Dirt 2 and it's not close in the level of detail. It's a decent looking game at times, but it's not a scratch on Crysis for the amount of stuff on screen. Half-Life 2 is also not bad looking, but it still doesn't have the same amount of detail. The water might look good, but it's not as good as a PC game can look.

    You should buy Crysis, it's £9.99 on Steam. It's not a good game IMO, but it sure is pretty.
  • fausto412 - Wednesday, December 15, 2010 - link

    yes... it's not much of a fun game, but damn, it is pretty
  • AnnihilatorX - Wednesday, December 15, 2010 - link

    Well, the original Crysis did push things too far and could have used some optimization. Crysis Warhead is much better optimized while giving practically identical visuals.
  • fausto412 - Wednesday, December 15, 2010 - link

    "I guess I should see this game by myself and shut up about things I don't know. Yes, I enjoy some smooth gaming, but I wouldn't like to wait 2 years after release to run a game smoothly with everything cranked up."

    that's probably a good idea. Crysis was made with future hardware in mind. It's like a freaking tech demo. Ahead of its time and beaaaaaautiful. Check it out on max settings... then come back and tell us what you think.
  • TimoKyyro - Wednesday, December 15, 2010 - link

    Thank you for the SmallLuxGPU test. That really made me decide to get this card. I make 3D animations with Blender in Ubuntu, so the only thing holding me back is the driver support. Do these cards work in Ubuntu? Is it possible for you to test whether the Linux drivers work at this time?
