Tweaking PowerTune

While the primary purpose of PowerTune is to keep the power consumption of a video card within its TDP in all cases, AMD has realized that PowerTune isn’t necessarily something everyone wants, and so they’re making it adjustable in the Overdrive control panel. With Overdrive you’ll be able to adjust the PowerTune limits both up and down by up to 20% to suit your needs.

We’ll start with the case of increasing the PowerTune limits. While AMD does not allow users to completely turn off PowerTune, they’re offering the next best thing by allowing you to increase the PowerTune limits. Acknowledging that not everyone wants to keep their cards at their initial PowerTune limits, AMD has included a slider with the Overdrive control panel that allows +/- 20% adjustment to the PowerTune limit. In the case of the 6970 this means the PowerTune limit can be adjusted to anywhere between 200W and 300W, the latter being the ATX spec maximum.
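
The slider math itself is trivial; here is a quick sketch of the range it gives you on the 6970 (the constants come from the text above, while the function and variable names are our own and are not part of AMD’s Overdrive API):

```python
# Compute the PowerTune adjustment range exposed by the Overdrive slider.
STOCK_POWERTUNE_LIMIT_W = 250       # Radeon HD 6970 stock PowerTune limit
SLIDER_RANGE = 0.20                 # +/- 20% adjustment allowed on reference cards

def powertune_range(stock_limit_w, adjustment=SLIDER_RANGE):
    """Return the (minimum, maximum) PowerTune limit for a given slider range."""
    return stock_limit_w * (1 - adjustment), stock_limit_w * (1 + adjustment)

low_w, high_w = powertune_range(STOCK_POWERTUNE_LIMIT_W)
print(f"PowerTune limit adjustable between {low_w:.0f} W and {high_w:.0f} W")  # 200 W to 300 W
```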

Ultimately the purpose of raising the PowerTune limit depends on just how far you raise it. A slight increase can bring a slight performance advantage in any game/application that is held back by PowerTune, while going the whole nine yards to 20% is for all practical purposes disabling PowerTune at stock clocks and voltages.

We’ve already established that at the stock PowerTune limit of 250W only FurMark and Metro 2033 are PowerTune limited, with only the former limited in any meaningful way. So with that in mind we increased our PowerTune limit to 300W and re-ran our power/temperature/noise tests to look at the full impact of using the 300W limit.

Radeon HD 6970: PowerTune Performance
                          PowerTune 250W    PowerTune 300W
Crysis Temperature        78°C              79°C
FurMark Temperature       83°C              90°C
Crysis Power (at wall)    340W              355W
FurMark Power (at wall)   361W              422W

As expected, power and temperature both increase with FurMark once the PowerTune limit is raised to 300W. At this point FurMark is no longer constrained by PowerTune and our 6970 runs at 880MHz throughout the test. Overall our power consumption measured at the wall increased by roughly 60W, while the core clock for FurMark is 46.6% faster. It was under this scenario that we also “uncapped” PowerTune for Metro, where we found that even though Metro was being throttled at times, the performance impact was imperceptibly small.
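
As a quick sanity check on those FurMark numbers, the throttled clock can be backed out from the figures in the text; note that it is derived from the 46.6% figure rather than measured directly, and the variable names below are our own:

```python
# Back out the implied throttled FurMark clock at the stock 250W PowerTune limit.
# 880MHz and the 46.6% clock increase come from the text; the rest is arithmetic.
UNCAPPED_CLOCK_MHZ = 880          # core clock with the 300W limit (no throttling)
CLOCK_GAIN = 0.466                # reported clock increase once throttling is removed

throttled_clock_mhz = UNCAPPED_CLOCK_MHZ / (1 + CLOCK_GAIN)
print(f"Implied throttled FurMark clock at 250W: ~{throttled_clock_mhz:.0f} MHz")  # ~600 MHz

# Extra wall power once the cap is lifted, straight from the table above.
furmark_250w, furmark_300w = 361, 422
print(f"Additional power at the wall with the 300W limit: {furmark_300w - furmark_250w} W")
```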

Meanwhile we found something interesting when running Crysis. Even though Crysis is not impacted by PowerTune, Crysis’ power consumption still crept up by 15W. Performance is exactly the same, and yet here we are with slightly higher power consumption. We don’t have a good explanation for this at this point – PowerTune only affects the core clock (and not the core voltage), and we never measured Crysis taking a hit at 250W or 300W, so we’re not sure just what is going on. However we’ve already established that FurMark is the only program realistically impacted by the 250W limit, so at stock clocks there’s little reason to increase the PowerTune limit.

This does bring up overclocking, however. Due to the limited amount of time we had with the 6900 series we have not been able to do a serious overclocking investigation, but as clockspeed is a factor in the power equation, PowerTune is going to impact overclocking. You’re going to want to raise the PowerTune limit when overclocking, otherwise PowerTune is liable to bring your clocks right back down to keep power consumption below 250W. The good news for hardcore overclockers is that while AMD set a 20% limit on our reference cards, partners will be free to set their own tweaking limits – we’d expect high-end cards like the Gigabyte SOC, MSI Lightning, and Asus Matrix lines to all feature higher limits to keep PowerTune from throttling extreme overclocks.
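
To illustrate why, here is a rough first-order sketch of how clock speed and voltage feed into board power. The P ≈ C·V²·f scaling relation is a generic CMOS approximation, and the stock voltage and the example overclock below are assumptions of ours, not AMD figures:

```python
# Estimate board power at an overclocked operating point using P ~ V^2 * f scaling.
# Baseline: the 6970's stock 880MHz clock and 250W PowerTune limit from the text.
# The 1.17V stock voltage and the 950MHz / 1.20V overclock are assumed example values.
def scaled_power(base_power_w, base_clock_mhz, base_volt, new_clock_mhz, new_volt):
    """Rough dynamic-power estimate: power scales with frequency and voltage squared."""
    return base_power_w * (new_clock_mhz / base_clock_mhz) * (new_volt / base_volt) ** 2

stock_power_w, stock_clock_mhz, stock_volt = 250, 880, 1.17
oc_power_w = scaled_power(stock_power_w, stock_clock_mhz, stock_volt, 950, 1.20)
print(f"Estimated board power at 950MHz / 1.20V: ~{oc_power_w:.0f} W")
# ~284W - already over the stock 250W limit, so PowerTune would throttle the card
# back down unless the limit is raised.
```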

Meanwhile there’s a second scenario AMD has thrown at us for PowerTune: tuning down. Although we generally live by the “more is better” mantra, there is some logic to this. Going back to our dynamic range example, by shrinking the dynamic power range, power hogs at the top of the spectrum get pushed down, but thanks to AMD’s ability to use higher default core clocks, the power consumption of low-impact games and applications goes up. In essence, power consumption gets just a bit worse in exchange for improved performance.

Traditionally V-sync has been used as the preferred method of limiting power consumption by limiting a card’s performance, but V-sync introduces additional input lag and the potential for skipped frames when triple-buffering is not available, making it a suboptimal solution in some cases. Thus if you wanted to keep a card at a lower performance/power level for a given game/application but did not want to use V-sync, you were out of luck unless you wanted to start playing with core clocks and voltages manually. By being able to turn down the PowerTune limit, however, you can now constrain power consumption and performance on a simpler basis.
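
For a sense of what that kind of power-limit governor does, here is a toy sketch in the spirit of what the text describes: step the core clock down whenever estimated power exceeds the user-set cap, and let it recover otherwise. This is purely illustrative and is not AMD’s algorithm; all of the numbers and names are ours:

```python
# Toy power-limit governor: throttle the core clock when estimated power exceeds
# the cap, otherwise step back toward the stock clock. Illustrative only.
def govern_clock(estimated_power_w, current_clock_mhz, power_cap_w,
                 max_clock_mhz=880, step_mhz=10):
    """Return the next core clock given an estimated power draw and a power cap."""
    if estimated_power_w > power_cap_w:
        return max(current_clock_mhz - step_mhz, 0)           # over the cap: throttle down
    return min(current_clock_mhz + step_mhz, max_clock_mhz)   # under the cap: recover

clock_mhz = 880
for est_power_w in (230, 215, 205, 198, 192):   # hypothetical per-interval power estimates
    clock_mhz = govern_clock(est_power_w, clock_mhz, power_cap_w=200)
    print(f"estimated {est_power_w} W -> next core clock {clock_mhz} MHz")
```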

As with the 300W PowerTune limit, we ran our power/temperature/noise tests with the 200W limit to see what the impact would be.

Radeon HD 6970: PowerTune Performance
                          PowerTune 250W    PowerTune 200W
Crysis Temperature        78°C              71°C
FurMark Temperature       83°C              71°C
Crysis Power (at wall)    340W              292W
FurMark Power (at wall)   361W              292W

Right off the bat everything is lower. FurMark is now at 292W, and quite surprisingly Crysis is also at 292W. This plays off the fact that most games don’t cause a card to approach its limit in the first place, so bringing the ceiling down brings the power consumption of more power-hungry games and applications down to the same levels as lesser games/applications.

Although not whisper quiet, our 6970 is definitely quieter at the 200W limit than the default 250W limit thanks to the lower power consumption. However the 200W limit also impacts practically every game and application we test, so performance is definitely going to go down for everything if you do reduce the PowerTune limit by the full 20%.

Radeon HD 6970: PowerTune Crysis Performance (frames per second)
                          PowerTune 250W    PowerTune 200W
2560x1600                 36.6              28.0
1920x1200                 51.5              43.3
1680x1050                 63.3              52.0

At 200W, you’re looking at roughly 75%-85% of the card’s stock performance in Crysis. The exact hit will depend on just how heavy a load the specific game/application presented in the first place.
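
Those ratios come straight from the Crysis table above; a quick computation (the frame rates are from the table, the script is ours):

```python
# Performance at the 200W PowerTune limit relative to the stock 250W limit,
# computed from the Crysis frame rates in the table above.
crysis_fps = {                     # resolution: (250W limit, 200W limit)
    "2560x1600": (36.6, 28.0),
    "1920x1200": (51.5, 43.3),
    "1680x1050": (63.3, 52.0),
}

for resolution, (stock_fps, capped_fps) in crysis_fps.items():
    print(f"{resolution}: {capped_fps / stock_fps:.0%} of stock performance")
# -> roughly 77%, 84%, and 82% respectively
```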

Comments

  • Remon - Wednesday, December 15, 2010 - link

    Seriously, are you using 10.10? It's not like 10.11 has been out for a while. Oh, wait...

    They've been out for almost a month now. I'm not expecting you to use the 10.12, as those were released just 2 days ago, but you can't have an excuse for not using month-old drivers. Testing overclocked Nvidia cards against newly released cards, and now using older drivers. This site gets more biased with each release.
  • cyrusfox - Wednesday, December 15, 2010 - link

    I could be wrong, but 10.11 didn't work with the 6800 series, so I would imagine 10.11 wasn't meant for the 6900 either. If that is the case, it makes total sense why they used 10.10 (because it was the most up-to-date driver available when they reviewed).

    I am still using 10.10e and thinking about updating to 10.12, but why bother? Things are working great at the moment. I'll probably wait for 11. or 11.2.
  • Remon - Wednesday, December 15, 2010 - link

    Never mind, that's what you get when you read reviews early in the morning. The 10.10e was for the older AMD cards. Still, I can't understand the difference between this review and HardOCP's.
  • flyck - Wednesday, December 15, 2010 - link

    it doesn't. Anand has the same result for 25.. resolutions with max details AA and FSAA.

    The presentation on Anand, however, is more focused on 16x..10.. resolutions (last graph). If you look at the first graph you'll notice the 6970/6950 perform like they do at HardOCP: the higher the quality, the smaller the gap becomes between the 6950 and 570 and between the 6970 and 580; the lower the quality, the more the 580 runs away while the 6970/6950 trail the 570.
  • Gonemad - Wednesday, December 15, 2010 - link

    Oookay, new card from the red competitor. Welcome aboard.

    But, all this time, I had to ask: why is Crysis so punitive on graphics cards? I mean, it was released eons ago, and it still can't be run with everything cranked up on a single card if you want 60fps...

    Is it sloppy coding? Does the game *really* look better with all the eye candy? Or did they build an "FPS bug" on purpose, some method of coding that was sure to torture any hardware built in the 18 months after release?

    I will get slammed for this, but for instance, the water effects on Half Life 2 look great even on lower spec cards, once you turn all the eye-candy on, and the FPS doesn't drop that much. The same for some subtle HDR effects.

    I guess I should see this game by myself and shut up about things I don't know. Yes, I enjoy some smooth gaming, but I wouldn't like to wait 2 years after release to run a game smoothly with everything cranked up.

    Another one is Dirt 2. I played it with all the eye candy turned up to the top, my 5870 dropped to 50-ish FPS (as per benchmarks), and it could be noticed eventually. I turned one or two things off, checked that they weren't missed after another run, and the in-game FPS meter jumped to 70. Yay.
  • BrightCandle - Wednesday, December 15, 2010 - link

    Crysis really does have some fabulous graphics. The amount of foliage in the forests is very high. Crysis kills cards because it really does push current hardware.

    I've got Dirt 2 and it's not close in the level of detail. It's a decent-looking game at times but it's not a scratch on Crysis for the amount of stuff on screen. Half-Life 2 is also not bad looking, but it still doesn't have the same amount of detail. The water might look good, but it's not as good as a PC game can look.

    You should buy Crysis, it's £9.99 on Steam. It's not a good game IMO, but it sure is pretty.
  • fausto412 - Wednesday, December 15, 2010 - link

    yes...it's not much of a fun game but damn it is pretty
  • AnnihilatorX - Wednesday, December 15, 2010 - link

    Well, the original Crysis did push things too far and could have used some optimization. Crysis Warhead is much better optimized while giving pretty much identical visuals.
  • fausto412 - Wednesday, December 15, 2010 - link

    "I guess I should see this game by myself and shut up about things I don't know. Yes, I enjoy some smooth gaming, but I wouldn't like to wait 2 years after release to run a game smoothly with everything cranked up."

    That's probably a good idea. Crysis was made with future hardware in mind. It's like a freaking tech demo. Ahead of its time and beaaaaaautiful. Check it out on max settings... then come back and tell us what you think.
  • TimoKyyro - Wednesday, December 15, 2010 - link

    Thank you for the SmallLuxGPU test. That really made me decide to get this card. I make 3D animations with Blender in Ubuntu, so the only thing holding me back is the driver support. Do these cards work in Ubuntu? Is it possible for you to test whether the Linux drivers work at this time?
