A Closer Look at Clock Speeds and Power

Wrapping things up, while we've shown that BatteryBoost can certainly improve battery life, there's still the question of what exactly NVIDIA is doing behind the scenes. We know they're playing with the maximum FPS of course, but a frame rate cap alone isn't (always) able to match what BatteryBoost can deliver. To try and shed some additional light on what's going on internally, I logged performance data while running our three BatteryBoost gaming tests. This time, however, the goal was not to fully drain the battery but rather to try and find out what's going on in terms of clock speeds and power draw at a lower level; that means the tests were shorter and there may be more variance, but the numbers are generally in agreement.

There are four tests for each game where I logged data: AC power is the baseline, then I tested DC power without BatteryBoost, with BatteryBoost and a 60FPS target, and finally with BatteryBoost and a 30FPS target. I also tested all four settings with and without VSYNC. I won't guarantee the numbers are 100% accurate, as I have to rely on a utility to report clock speeds and other items, so I won't create any potentially misleading charts; nonetheless, the results are rather interesting to discuss.

First, under AC power the CPU is basically left to run free, and in most cases it will run near its maximum Turbo Boost clocks (3.2-3.3GHz); it also consumes quite a bit of power (25-35W) when VSYNC is off. The GTX 980M meanwhile is running basically full tilt (1100MHz plus or minus ~25MHz on the Core thanks to GPU Boost 2.0, and 5000MHz RAM). Turning VSYNC on gives us a taste of things to come, however: average CPU clocks are typically much lower (1800-2000MHz, with spikes up to 3400MHz and lows of 800MHz) and average CPU package power is likewise substantially lower (10-30W). The GPU clocks don't change much, but GPU utilization drops from close to 100% (95-99%, depending on the game) to 32-55%. Switch to battery power and things start to get a bit interesting.

Let's discuss the three games I tested in turn, starting with Tomb Raider. The CPU clock speeds and power tend to vary substantially based on the game, and the GPU varies a bit as well though not as much as the CPU. Even without BatteryBoost, CPU clocks are often at their lowest level (800MHz), and turning on VSYNC actually resulted in higher average CPU clocks but lower average CPU power – the logging data may not be capturing fully accurate CPU clocks, though I suspect the power figures are pretty accurate. GPU clocks show some similarly odd behavior: without VSYNC the average GPU clock was 479MHz with 3200MHz GDDR5, but utilization is at 97%; with VSYNC the average GPU clocks are a bit higher (~950/3200 core/RAM) but utilization is just under 52%.

Enabling BatteryBoost with 60FPS and 30FPS targets continues to generate somewhat unexpected results. At 60FPS, the CPU is generally close to the base 800MHz, but it does average slightly higher when VSYNC is on; power draw from the CPU is pretty consistent at around 6.1-6.5W for the package. Average GPU clocks meanwhile make a bit more sense (they're slightly lower with VSYNC enabled), while average GPU utilization is slightly higher with VSYNC. Overall, however, system power use is much lower with BatteryBoost than without, which is what we'd expect from our earlier battery testing results. It looks like in Tomb Raider the GPU (plus the rest of the system except for the CPU) draws around 60-65W without BatteryBoost, and that drops to 50-55W with BatteryBoost at 60FPS. Our 30FPS BatteryBoost numbers meanwhile don't show a significant change in CPU clocks (still close to the minimum 800MHz), but with the lower FPS the CPU doesn't have to work as hard so CPU package power is now down to around 4.6-4.7W. On the GPU front, the core clocks are around 670-700MHz with close to 50% utilization, but the GDDR5 memory is now running at 1620MHz, so there are some definite power savings there. Average power draw from the GPU and system (again, minus the CPU) is around 35-40W.
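As an aside, the "GPU plus the rest of the system" figures here are derived rather than measured directly: take the total system draw (estimated from how quickly the battery drains) and subtract the logged CPU package power. A rough sketch of that bookkeeping, with placeholder numbers in the same ballpark as the Tomb Raider 30FPS run (the runtime value is hypothetical):

```python
# Back-of-the-envelope split of system power: everything not attributed to the
# CPU package gets lumped into "GPU + rest of system". The inputs below are
# illustrative placeholders, not measured values from this article.
BATTERY_WH = 87.0          # GT72 battery capacity
runtime_hours = 2.0        # hypothetical measured runtime for the run
avg_cpu_package_w = 4.7    # average CPU package power from the logs

total_system_w = BATTERY_WH / runtime_hours           # ~43.5W average draw
gpu_and_rest_w = total_system_w - avg_cpu_package_w   # ~38.8W for GPU, RAM, LCD, etc.

print(f"Total system draw: {total_system_w:.1f}W")
print(f"GPU + rest of system: {gpu_and_rest_w:.1f}W")
```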

Borderlands: The Pre-Sequel behaves quite differently on battery power. The AC results are about the same (CPU and GPU basically running as fast as they can), but now on DC power without BatteryBoost the CPU continues to run at relatively high clocks (3.0-3.4GHz), and as you'd expect power draw remains pretty high as well (20-25W). With BatteryBoost at 60FPS, enabling VSYNC actually resulted in substantially higher CPU clocks (and CPU power use: 14.6W with VSYNC compared to 11.2W without, though the test didn't last as long so there's more chance for variance), but at 30FPS things start to look a lot more like Tomb Raider: the CPU runs at 800-1500MHz, with a 1.0GHz average with VSYNC and a 1.125GHz average without; CPU power is 6-7W as well (slightly lower with VSYNC). As for the GPU, things aren't all that different; there's a hard cap of 3.2GHz on the GDDR5 when running off the battery, and while the 980M is frequently at that mark when striving for 60FPS, it's mostly at 1620MHz on the 30FPS setting. The GPU (and system other than CPU) draws close to 50W at 60FPS and 35W at 30FPS, while running without BatteryBoost puts things closer to 60W.

With GRID Autosport, the results on AC power and on DC without BatteryBoost are broadly similar to the other two games, though the CPU apparently isn't working as hard as in Borderlands. On AC power it uses 35W, and that drops to 23W with VSYNC; on DC without BatteryBoost the CPU is drawing 25W, or 15W with VSYNC. The GPU plus other system components meanwhile look to be drawing around 66W without BatteryBoost and 56W with VSYNC enabled. Turn on BatteryBoost and again at 60FPS we see higher CPU clocks (and higher CPU power use) when VSYNC is enabled, but we're talking about 10.7W without VSYNC and 13.7W with VSYNC, and apparently other factors can make up for the difference. The GPU and other components draw around 42W without VSYNC and 39W with VSYNC, so it balances out. Last but not least, at 30FPS the CPU package power averages ~7.3W without VSYNC and ~7.8W with VSYNC, while the GPU and remaining components use 35.7W without VSYNC and 31.8W with VSYNC.

Based on our testing of three different games, it appears BatteryBoost is most effective in games that don't hit the CPU as hard, though with caveats. Tomb Raider for example is known to be pretty easy on the CPU (i.e. a slower AMD APU would likely get close to the same frame rates as a fast Core i7 when paired with the same GPU). However, the type of calculations each game uses (including AI) means that in some cases a game that doesn't appear to be very CPU intensive may still draw a fair amount of power from the CPU. In general, it looks like the GTX 980M under most gaming workloads will draw at least 25-30W of power (and another 5W or so for the motherboard, RAM, LCD, etc.), which means the lower the CPU load the better. In some cases it should be possible to get the entire GT72 notebook close to 35W while gaming, which would mean the 87Wh battery might last nearly 2.5 hours; more realistically, I'd expect most games will pull 40-45W even at the 30FPS target with BatteryBoost, which equates to 1.9 to 2.2 hours at most. Obviously if you have a game that's more taxing (e.g. Metro: Last Light), you'll get even less battery life.
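The runtime estimates above are simple division: battery capacity in Wh over average system draw in W. A quick sanity check of the figures quoted:

```python
# Simple runtime estimate: hours = battery capacity (Wh) / average draw (W).
BATTERY_WH = 87.0

for draw_w in (35, 40, 45):
    hours = BATTERY_WH / draw_w
    print(f"{draw_w}W average draw -> {hours:.1f} hours")
# 35W -> ~2.5 hours; 40W -> ~2.2 hours; 45W -> ~1.9 hours
```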

With that said, one other interesting piece of information is that in our Light battery test (Internet surfing) using the same Balanced power profile, with the GTX 980M enabled the GT72 manages around 220 minutes of mobility. (Our Heavy battery test drops that to 165 minutes, if you're wondering.) While two hours of gaming isn't going to be enough for a LAN party, it's still quite impressive to see the GTX 980M effectively drawing about as much power as a GT 750M when BatteryBoost is enabled – though in most cases it's also providing roughly the same level of performance as a GT 750M (under AC power).
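Working backwards from those runtimes gives the implied average system draw, a rough approximation since it ignores conversion losses and any reserve capacity left when the OS shuts down:

```python
# Implied average system draw from measured runtimes on the 87Wh battery.
# Treat the results as approximations; losses and reserve capacity are ignored.
BATTERY_WH = 87.0

tests = {"Light (Internet)": 220, "Heavy": 165}  # runtime in minutes
for name, minutes in tests.items():
    draw_w = BATTERY_WH / (minutes / 60)
    print(f"{name}: ~{draw_w:.0f}W average draw")
# Light: ~24W; Heavy: ~32W
```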

Comments

  • inighthawki - Friday, October 24, 2014 - link

    Yeah, at such high framerates, it wouldn't be uncommon to not always be at the max queue depth, so you'll get the illusion that it's always continuously rendering. But in this case you're really just rendering frames ahead. One nice advantage is that if you hit the queue depth, you'll actually get more consistently smooth motion, since the frame rate is more consistent. Having the game wake up consistently every vblank and rendering one frame provides a more fixed timestep for things like animation, compared to having a variable rate by rendering as fast as you can. Most people will likely never notice though.

    It's unfortunate that Windows forces the games into that model, since sometimes I'd love the triple buffering model instead. I like the lower latency of that mode, while also removing screen tearing. Given that I run a GTX780 plugged into my wall socket, I'm not too concerned about the power savings - especially considering I usually disable vsync anyway, so I'm not really wasting any more than normal.
  • HiTechObsessed - Friday, October 24, 2014 - link

    If what you're saying is true, battery life would decrease when turning on VSync... Looking at the results here, with BatteryBoost off, turning VSync on increases battery life.
  • thepaleobiker - Thursday, October 23, 2014 - link

    Yes, please read the article good sir.
  • limitedaccess - Thursday, October 23, 2014 - link

    Is there any actual difference in terms of thermal performance? Either lower temps and/or fan speed (fan noise)? I would assume if the GPU itself is consuming significantly less power, its average heat output should be lower as well, with less stress placed upon the cooling system.

    As an extension of this are you able to ask Nvidia to comment on whether or not it is technically possible to extend a variation of this to desktop GPUs and if there is any plan to? This would enable the flexibility of building a system that is extremely low noise (or even passive) for certain gaming workloads yet still have performance on demand.
  • nevertell - Thursday, October 23, 2014 - link

    As there is less energy consumed, there is less energy dissipated. Ultimately, all energy that is used by any computer that isn't then used to power LEDs or displays will be turned into heat.
  • limitedaccess - Thursday, October 23, 2014 - link

    Yes I'm aware of the theory. However I am curious as to what the actual tested impact would be in this case and how significant (or insignificant) the difference might be.
  • Brett Howse - Thursday, October 23, 2014 - link

    When I tested the Razer Blade, I noticed a significant decrease in temperatures and of course noise when playing with Battery Boost enabled, which is what you would expect since it is working far less.
  • JarredWalton - Thursday, October 23, 2014 - link

    Yup. Running the GPU at lower clocks and reducing power consumed means the fans don't have to work as hard to keep the system cool. Targeting 30FPS, the GT72 is pretty quiet -- not silent, but not loud at all. I didn't take measurements (I'll try that for the final full review), but there's nothing too shocking: lower performance => less heat => less noise.
  • CrazyElf - Thursday, October 23, 2014 - link

    All in all, this new Battery Boost feature seems to provide a modest incremental improvement in battery life. It's not as good as, say, the leap in performance per watt that Maxwell gave, but it's welcome nonetheless.

    The issue has always been that there's a tradeoff between size, mobility, and battery life, especially for a large hungry gaming GPU.

    Jarred, by any chance, are you aware that there are going to be variants of the GT72 with an IPS monitor coming out in the coming months? It's already up for pre-order at many of the laptop sellers. Downside is there's a pretty big price premium.
  • sonicmerlin - Thursday, October 23, 2014 - link

    Don't these laptops have nvidia Optimus and Haswell processors? Why is their non-gaming runtime so low despite their large batteries?
