Battery Life and Charge Time

Battery life takes a back seat in gaming laptops, especially since G-SYNC requires the dGPU to be connected directly to the display, which normally makes Optimus unavailable. Update: Some readers have let me know that Acer offers an option in the BIOS and in PredatorSense to choose between G-SYNC and Optimus, meaning there is a MUX installed that lets the GPU drive the display directly, or route it through the iGPU for better battery life. The battery tests were re-run with Optimus enabled. Running a massive GPU to do desktop tasks takes a lot of power, full stop. Acer has fitted an 84 Wh battery to compensate, and the optional Optimus mode means you can claw back some battery life by disabling G-SYNC and letting the Intel iGPU handle light loads.
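To put that trade-off in perspective, here is a minimal back-of-the-envelope sketch of how battery capacity and average platform draw translate into runtime. The power-draw figures are hypothetical placeholders for illustration, not measurements from this review.

```python
# Rough runtime estimate: hours = battery capacity (Wh) / average platform draw (W).
# The draw figures below are hypothetical placeholders, not measured values.

BATTERY_WH = 84  # Triton 500 battery capacity from the spec sheet

def runtime_hours(avg_draw_w: float, capacity_wh: float = BATTERY_WH) -> float:
    """Idealized runtime, ignoring conversion losses and battery reserve."""
    return capacity_wh / avg_draw_w

for label, draw_w in [("dGPU driving the display (assumed)", 25.0),
                      ("Optimus / iGPU path (assumed)", 12.0)]:
    print(f"{label}: ~{runtime_hours(draw_w):.1f} h at {draw_w:.0f} W average")
```

Even with a large 84 Wh pack, a couple of dozen watts of idle draw eats through the battery quickly, which is why routing the display through the iGPU matters so much.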

2013 Battery Life Light

[Graph: Battery Life 2013 - Light]

Our lightest test just cycles through four web pages per minute, which isn't much work for a modern system. With Optimus enabled, the battery life is very reasonable for a gaming laptop. The base power drain is still quite high, but compared to gaming systems that don't offer Optimus, the Acer is far ahead.

2016 Web

[Graph: Battery Life 2016 - Web]

Our newer web test is much more demanding of the processor, and generally results in quite a drop in runtime compared to the light test, but gaming laptops are a different beast, and the high base power draw generally masks any such changes in CPU usage. In fact, the more demanding test actually provided slightly longer runtime, and with Optimus enabled the battery life is downright reasonable.

Normalized Results

[Graph: Battery Life 2013 - Light - Normalized]

[Graph: Battery Life 2016 - Web - Normalized]

Removing the battery size from the results gives us our normalized results, where we can see the efficiency of the various platforms. The Acer Predator Triton 500 is actually a fairly efficient gaming laptop, even though it's still not great in absolute terms. The industry is making progress here, just far more slowly than on the Ultrabook side. The Acer's multiplexer lets the dGPU be taken out of the display path, and with Optimus enabled the efficiency is significantly higher. The downside is that switching requires a reboot, and you lose G-SYNC until you re-enable it and reboot again, but if there are scenarios where the extra battery life is needed, the Acer offers the best of both worlds.
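For clarity, the normalization is just runtime divided by battery capacity, yielding minutes of runtime per watt-hour. The sketch below shows the calculation; the runtimes and capacities are hypothetical placeholders rather than the measured results charted above.

```python
# Normalized battery life: minutes of runtime delivered per watt-hour of battery,
# which removes battery size from the comparison and isolates platform efficiency.
# All values below are hypothetical placeholders, not the review's measurements.

def minutes_per_wh(runtime_min: float, capacity_wh: float) -> float:
    return runtime_min / capacity_wh

systems = {
    "Hypothetical gaming laptop, dGPU always on": (180, 84),  # (minutes, Wh)
    "Hypothetical gaming laptop, Optimus enabled": (420, 84),
    "Hypothetical Ultrabook": (600, 52),
}

for name, (minutes, wh) in systems.items():
    print(f"{name}: {minutes_per_wh(minutes, wh):.1f} min/Wh")
```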

Movie Playback

[Graph: Battery Life - Movie Playback]

Once again, battery life with Optimus enabled is reasonable here, and the option Acer provides to choose between Optimus and G-SYNC is a fantastic one. The base power draw is still quite high compared to an Ultrabook, but the overall runtime is a lot better than on a gaming laptop that forces media playback onto the dGPU.

Tesseract

[Graph: Battery Life - Tesseract]

Dividing movie playback time by the length of a long movie gives us our Tesseract score: you can play back The Avengers about 1.5 times before this laptop shuts off with G-SYNC enabled, and about two times with Optimus enabled.
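As a small worked example of that calculation, the sketch below divides a playback runtime by the movie's length. The playback runtimes here are assumptions chosen only to illustrate the arithmetic, not the measured figures.

```python
# Tesseract score: movie playback runtime divided by the length of one long movie.
# The playback runtimes below are assumed values for illustration only.

MOVIE_MINUTES = 143  # roughly the length of The Avengers

def tesseract_score(playback_min: float, movie_min: float = MOVIE_MINUTES) -> float:
    """How many back-to-back viewings fit in one battery charge."""
    return playback_min / movie_min

print(f"G-SYNC mode (assumed 215 min of playback): {tesseract_score(215):.2f} movies")
print(f"Optimus mode (assumed 290 min of playback): {tesseract_score(290):.2f} movies")
```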

Battery Life Conclusion

Acer is one of only a few manufacturers offering a multiplexer on the dGPU, allowing the end user to choose between Optimus and G-SYNC, and the results are worth it. If you are doing desktop work and need some extra battery life, the Triton 500 delivers far more than most gaming laptops. It can't keep up with a low-powered Ultrabook here, but it still easily outclasses the competition that does not offer the MUX.

Charge Time

Acer ships the Triton 500 with a 180-Watt AC adapter using a barrel connector. For those wishing for USB-C charging, the maximum power under USB-C Power Delivery is 100 Watts, which is not enough for a gaming laptop, and that is why these machines still rely on proprietary power connectors.

[Graph: Battery Charge Time]

You can go from zero to a full charge in just over two hours with this laptop, which is pretty good considering the size of the battery. Most likely, though, the battery will mainly act as a mini-UPS for moments when the power goes out, since this laptop is meant to be plugged in most of the time.
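As a rough sanity check on that figure, a back-of-the-envelope estimate divides battery capacity by the power going into the battery. The charging power used below is purely an assumption (the adapter also has to feed the rest of the system), chosen so the estimate lands near the measured two-hour mark.

```python
# Back-of-the-envelope charge-time estimate: capacity / charging power,
# ignoring the charge taper near 100% and assuming the laptop sits idle.

BATTERY_WH = 84        # Triton 500 battery capacity
ADAPTER_W = 180        # rated output of the bundled AC adapter
ASSUMED_CHARGE_W = 40  # assumed power actually flowing into the battery

hours = BATTERY_WH / ASSUMED_CHARGE_W
print(f"Estimated 0-100% charge: ~{hours:.1f} h "
      f"({ADAPTER_W} W adapter, assumed {ASSUMED_CHARGE_W} W into the battery)")
```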

Comments

  • shabby - Thursday, April 25, 2019

    Is it really a 2080 when the base clock is cut in half?
  • Daeros - Thursday, April 25, 2019

    No, but that's nvidia's game - they can honestly say it's the same chip, even though performance is a few steps down in the hierarchy. Just like that 8750H's TDP is nowhere near 45w - probably closer to 120w under load.
  • Opencg - Thursday, April 25, 2019

    you can see in the cpu benchmarks that draw real power for a significant portion of time that it loses a good deal of performance. all in all it's about where it should be for a laptop this thin. i would be surprised if it is really designed to handle more than 45w. personally i would bet it can start to throttle on sustained 45w loads
  • philehidiot - Thursday, April 25, 2019

    I saw the 2080 and then the screen resolution. In reality, you'd probably want 4K + gsync for a shortish lifespan machine or 1440P for one with a good few years on it. 1080P says the performance is compromised and they HAD to drop it down. You'd never ever run a desktop on a 2080 with 1080P. I bought a gaming laptop once when I had a real need for it, back in the P4 days. The thing had about 6 fans and chucked out 50C hot air. I required it at the time but I'd never buy one now unless I absolutely needed it. That had 1050 lines, so 1080 isn't really a step up, it's a marketing ploy ("FULL HD!")

    This GPU cannot be considered alongside a real 2080, and whilst I appreciate the screen size means resolutions greater than 1440P would be silly (and arguably even that, but you must remember you're usually closer to a laptop screen and even a 6" mobile can benefit from the upgrade from 1080 to 1440), to me a gaming laptop generally is 17" anyway. If you go down this path you're rarely looking for real portability but more because you (in my experience) live in two or three different places and want to take a full gaming PC with you with your suitcase and so on.
  • wintermute000 - Thursday, April 25, 2019

    Exactly, a 2060 would have been perfect for 1080p 144hz and then maybe the cooling would have coped.
    Must be a marketing decision to shove the biggest number into the spec sheet....
  • PeachNCream - Friday, May 3, 2019

    I would not mind in the slightest pushing 1080p resolutions with a 2080 GPU, but not with this particular laptop given the network adapter selection. It just isn't worth messing with Killer NICs at all when there are other options out there.
  • wintermute000 - Thursday, April 25, 2019

    a.) The TDP as defined by intel (i.e. base clock) IS 45W.
    b.) Power under boost is much higher for sure, but 120W is a total exaggeration. I can get it to run steady on 3.6Ghz (thermal throttling is a different question LOL) on around 60W with an undervolt.
    c.) It would take a mean cooler and power delivery / VRMs on a laptop chassis to let it boost anywhere near its paper specs for long durations. I haven't looked at the built-like-a-tank laptops in depth but none of the mainstream designs have managed it so far.
  • wintermute000 - Thursday, April 25, 2019

    by it I mean the i7-8750H (in an XPS 9570 if that matters)
  • Retycint - Saturday, April 27, 2019

    Intel's definition of TDP is very much meaningless because they can change the base clock to fit in the TDP envelope. The i7-8750H maintained the 45W TDP despite having 2 more cores than the 7700HQ, not because the former has had a huge leap in efficiency, but rather because Intel dropped the base clock from 2.8 to 2.2GHz.

    In other words, Intel could theoretically claim that the 9750H has a 10W TDP at a base clock of 0.8 GHz, for instance. Which is why TDP numbers are bull
  • jordanclock - Thursday, April 25, 2019

    Welcome to Max Q! Where the models are made up and the clocks don't matter!
