Battery Life and Charge Time

On most notebooks this is one of the most important sections, but on a machine like this, portability takes a back seat to performance. There is still an expectation of some battery life, but with the understanding that all of the components that make the Phobos 8716 so fast come at a cost in power consumption.

That is certainly the case when the machine is being stressed. With a CPU carrying a 91-Watt TDP, and a GPU with an undisclosed (but under 180-Watt) TDP, the 82 Wh battery can be exhausted very quickly. But if you do need to use the notebook away from the mains, being able to watch a movie or surf the net for a bit would be nice.
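As a quick back-of-the-envelope sketch (in Python for illustration), assuming both chips could somehow sustain their full TDPs on battery, which the firmware would almost certainly throttle long before allowing:

```python
# Worst-case drain estimate: battery energy divided by combined TDP.
# The 180 W GPU figure is the upper bound noted above, not a disclosed
# TDP, so treat this as a rough lower-bound estimate only.
CPU_TDP_W = 91
GPU_TDP_W = 180
BATTERY_WH = 82

minutes = BATTERY_WH / (CPU_TDP_W + GPU_TDP_W) * 60
print(f"~{minutes:.0f} minutes at sustained full load")  # ~18 minutes
```

In other words, a fully loaded run on battery would be measured in minutes, not hours, which is why the testing below focuses on lighter workloads.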

To test battery life, we have two main tests. Our older 2013 light battery life test loads four web pages every minute, and continues until the laptop shuts down. The newer 2016 battery life test leverages the same workload we use on mobile, which is much more stressful. There is no perfect way to measure battery life, since everyone's use case is different, but by testing consistently with the displays set to 200 nits, we can at least get a good comparison across devices on a common usage scenario.
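Conceptually, the light test boils down to a timed page-load loop. Here is a minimal sketch of the idea (this is not our actual harness; the URLs and 15-second cadence are placeholder assumptions):

```python
# Sketch of a 2013-style "light" web test: cycle four pages per minute
# and persist the elapsed runtime until the battery gives out.
# A real harness would reuse a single browser tab and automate the
# brightness setting; this only illustrates the cadence.
import time
import webbrowser

PAGES = [
    "https://example.com/news",
    "https://example.com/reviews",
    "https://example.com/forums",
    "https://example.com/deals",
]

start = time.time()
while True:  # runs until the OS powers off from battery exhaustion
    for url in PAGES:
        webbrowser.open(url)
        time.sleep(15)  # four page loads per minute
    # Write elapsed minutes each cycle so the figure survives shutdown.
    with open("runtime_minutes.log", "w") as f:
        f.write(f"{(time.time() - start) / 60:.1f}\n")
```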

One note about this laptop: despite being set not to change the display brightness on battery, when the charge hits 7% the brightness drops to zero, meaning this laptop will score a few more minutes than it should.

2013 Light

[Graph: Battery Life 2013 - Light]

On our older battery life test, the P870DM2 / Phobos 8716 does surprisingly well. It achieves a result of just under three hours, which is terrible compared to an Ultrabook, but a pretty significant jump over the previous Clevo DTR. The battery capacity is the same as that of the older P750ZM, although that model did have a UHD display, which would certainly have hurt its result. Still, it's a reasonable result.

2016 Web

[Graph: Battery Life 2016 - Web]

With the new web test, which is much more stressful on the CPU, the Phobos 8716 actually scores a few minutes higher than on the older test; the average power consumption doesn't change much despite the increased workload. That isn't a big surprise with high-power components, since their baseline power draw is already far higher than something designed for long battery life, such as a Cherry Trail Atom, where every milliwatt matters.

Normalized

To put an actual value on efficiency, battery capacity is factored out of the battery life to provide a minutes-per-Wh result.

[Graph: Battery Life 2013 - Light (Normalized)]

[Graph: Battery Life 2016 - Web (Normalized)]

Unsurprisingly, with desktop components stuffed inside, the Clevo P870DM2 is one of the least efficient notebooks we have tested, with only the Clevo P750ZM being worse. More efficiency is always better, but the target use case for this machine is not an all-day battery-powered notebook, so it's not as big of an issue as it would be on a smaller laptop.
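The arithmetic behind those normalized figures is just runtime divided by capacity. A minimal sketch, where the ~175-minute input is read off the "just under three hours" 2013 result above rather than being an exact published number:

```python
BATTERY_WH = 82.0

def normalized(minutes: float, capacity_wh: float = BATTERY_WH) -> float:
    """Battery efficiency in minutes per watt-hour."""
    return minutes / capacity_wh

print(f"{normalized(175):.2f} min/Wh")  # ~2.13 min/Wh on the 2013 test
```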

Additional Battery Life Testing

OK, so we've already determined that the Clevo P870DM2 is not the world's best notebook in terms of battery life, but there are a couple of other scenarios that warranted testing. Since it's a gaming notebook, just how long can you actually game on battery? To test this, Rise of the Tomb Raider was fired up, configured to use the Battery Boost settings in NVIDIA's GeForce Experience (GFE) software.

[Graph: Battery Life - Rise of the Tomb Raider]

The settings in GFE capped the frame rate at exactly 60 frames per second: the minimum was 60.0 FPS, the average was 60.0 FPS, and the maximum was 60.0 FPS. The result was being able to play Rise of the Tomb Raider for just over an hour. With a bit more tweaking of the GFE settings, maybe a few more minutes could be eked out, but a runtime of one hour of gaming is pretty decent. Plus, you get the added benefit of the fans barely spooling up, since the notebook isn't even working hard.

The other potential reason to need battery life is when watching a movie. Perhaps you are on a road trip and you have your 12 lb laptop in your lap. Can you get through one movie? Two? Let’s find out.

[Graph: Battery Life - Movie Playback]

The result when playing a movie is not much better than when surfing the web. On Ultrabooks, this task is offloaded to fixed-function decode hardware, which can increase battery life significantly, but the idle power draw of the Clevo P870DM2 is just too high for that offload to make much of a difference.

To put this time into perspective, we’ve come up with a new movie battery life rating which we have deemed the Tesseract. Each Tesseract equals 143 minutes, or the length of The Avengers movie.

[Graph: Tesseract]

You can easily get through one run of The Avengers, but only 40% of the way through a second run, so unless you love cliffhangers, you may want to find somewhere to plug in.
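For those keeping score, the Tesseract conversion is simple division. A minimal sketch, where the ~200-minute playback time is inferred from the 1.4-Tesseract result above rather than being a separately quoted number:

```python
TESSERACT_MIN = 143  # runtime of The Avengers

def tesseracts(runtime_minutes: float) -> float:
    """Convert a measured runtime into Tesseracts (Avengers viewings)."""
    return runtime_minutes / TESSERACT_MIN

print(f"{tesseracts(200):.1f} Tesseracts")  # ~1.4: one viewing plus 40%
```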

61 Comments

  • BrokenCrayons - Thursday, October 27, 2016 - link

    Minor details...the MYTH Control Center shows an image of a different laptop. It struck me right away because of the pre-Pentium MMX Compaq Presario-esque style hinge design.

    As for Pascal, the performance is nice, but I continue to be disappointed by the cooling and power requirements. The number of heat pipes affixed to the GPU, the fact that it's still reaching thermal limits with such cooling, and the absurd PSU requirements for SLI make it pretty obvious the whole desktop-class GPU in a laptop isn't a consumer-friendly move on NV's part. Sure it cuts back on engineering, manufacturing, and part inventory costs and results in a leaner organization, but it's hardly nice to people who want a little more than iGPU performance, but aren't interested in running up to the other extreme end of the spectrum. It's interesting to see NV approach the cost-cutting measure of eliminating mobile GPU variants and turning it into a selling point. Kudos to them for keeping the wool up on that aspect at least.

    The Killer NIC is something I think is a poor decision. An Intel adapter would probably have been a better choice for the end user since the benefits of having one have yet to be proven AND the downsides of poor software support and no driver flexibility outweigh the dubious claims from Killer's manufacturer.
  • ImSpartacus - Thursday, October 27, 2016 - link

    Nvidia just named their mobile GPUs differently.

    Fundamentally, very little has changed.

    A couple generations ago, we had a 780mx that was based on an underclocked gk104. Nvidia could've branded it as the "laptop" 770 because it was effectively an underclocked 770, just like the laptop 1080 is an underclocked 1080.

    But the laptop variants are surely binned separately, and they are generally implemented on the MXM form factor, so there aren't any logistical improvements just from naming the laptop GPUs differently.
  • The_Assimilator - Thursday, October 27, 2016 - link

    "The number of heat pipes affixed to the GPU, the fact that it's still reaching thermal limits with such cooling, and the absurd PSU requirements for SLI make it pretty obvious the whole desktop-class GPU in a laptop isn't a consumer-friendly move on NV's part."

    nVIDIA is doing crazy things with perf/watt and all you can do is complain that it's not good enough? The fact that they can shoehorn not just one, but TWO of the highest-end consumer desktop GPUs you can buy into a bloody LAPTOP, is massively impressive and literally unthinkable until now. (I'd love to see AMD try to pull that off.) Volta is only going to be better.

    And it's not like you can't go for a lower-end discrete GPU if you want to consume less power, the article mentioned GTX 1070 and I'm sure the GTX 1060 and 1050 will eventually find their way into laptops. But this isn't just an ordinary laptop, it's 5.5kg of desktop replacement, and if you're in the market for one of these I very much doubt that you're looking at anything except the highest of the high-end.
  • BrokenCrayons - Thursday, October 27, 2016 - link

    Please calm down. I realize I'm not in the target market for this particular computer or the GPU it uses. I'm also not displaying disappointment in order to cleverly hide some sort of fangirl obsession for AMD's graphics processors either. What I'm pointing out are two things:

    1.) The GPU is forced to back off from its highest speeds due to thermal limitations despite the ample (almost excessive) cooling solution.

    2.) While performance per watt is great, NV elected to put all the gains realized from moving to a newer, more efficient process into higher performance (in some ways increasing TDP between Maxwell/Kepler/etc. and Pascal in the same price brackets, such as the 750 Ti @ 60W vs the 1050 Ti @ 75W), and my personal preference is that they would have backed off a bit from such an aggressive performance approach to slightly reduce power consumption in the same price/performance categories, even if it cost in framerates.

    It's a different perspective than a lot of computer enthusiasts might take, but I much prefer gaining less performance while reaping the benefits of reduced heat and power requirements. I realize that my thoughts on the matter aren't widely shared, so I have no delusion of pressing them on others; I'm fully aware I don't represent the majority of people.

    I guess in a lot of ways, the polarization of computer graphics into basically two distinct categories that consist of "iGPU - can't" and "dGPU - can" along with the associated power and heat issues that's brought to light has really spoiled the fun I used to find in it as a hobby. The middle ground has eroded away somewhat in recent years (or so it seems from my observations of industry trends) and when combined with excessive data mining across the board, I more or less want to just crawl in a hole and play board games after dumping my gaming computer off at the local thrift store's donation box. Too bad I'm screen addicted and can't escape just yet, but I'm working on it. :3
  • bji - Thursday, October 27, 2016 - link

    "Please calm down" is an insulting way to begin your response. Just saying.
  • BrokenCrayons - Thursday, October 27, 2016 - link

    I acknowledge your reply as an expression of your opinion. ;)
  • The_Assimilator - Friday, October 28, 2016 - link

    Yeah, but my response wasn't exactly calm and measured either, so it's all fair.
  • BrokenCrayons - Friday, October 28, 2016 - link

    "...so it's all fair."

    It's also important to point out that I was a bit inflammatory in my opening post. It wasn't directed at anyone in particular, but was/is more an expression of frustration with what I think is the industry's unintentional marginalization of the lower- and mid-tiers of home computer performance. Still, being generally grumpy about something in a comments box is unavoidably going to draw a little ire from other people so, in essence, I started it and it's my fault to begin with.
  • bvoigt - Thursday, October 27, 2016 - link

    "my personal preference is that they would have backed off a bit from such an aggressive performance approach to slightly reduce power consumption in the same price/performance categories even if it cost in framerates."

    They did one better, they now give you same performance with reduced power consumption, and at a lower price (980 Ti -> 1070). Or if you prefer the combination of improved performance and slightly reduced power consumption, you can find that also, again at a reduced price (980 Ti -> 1080 or 980 -> 1070).

    Your only complaint seems to be that the price and category labelling (xx80) followed the power consumption. Which is true, but getting hung up on that is stupid because all the power&performance migration paths you wanted do exist, just with a different model number than you'd prefer.
  • BrokenCrayons - Thursday, October 27, 2016 - link

    You know, I never thought about it like that. Good point! Here's to hoping there's a nice, performance boost realized from a hypothetical GT 1030 GPU lurking in the product stack someplace. Though I can't see them giving us a 128-bit GDDR5 memory bus and sticking to the ~25W TDP of the GT 730. We'll probably end up stuck with a 64-bit memory interface with this generation.
