Mobility Marathon

Cranking out the fastest performance in applications or games is one measure of a good notebook, but honestly it's not the standard by which many users select their laptops. For many, size and weight along with battery life are the more important considerations when purchasing a laptop. Hopefully it's already obvious that the Gateway P-series FX laptops are not going to be great candidates when it comes to size, and battery life is also generally disappointing. Some of the lower-end models ship with integrated graphics and can offer reasonably good battery life, but they still use a very large chassis that you have to lug around. What's more, the battery protrudes about an inch and a half out the rear of the chassis, making the laptop seem even larger. You will definitely want to pay attention to your choice of carrying case if you plan to travel with a P-series; we have a couple of 17" notebook bags that can't hold these notebooks (even with the battery removed).

Returning to the question we want to answer: does the switch to DDR3 and a P8400 - both of which lower power requirements - have a noticeable impact on battery life and power draw? Of course, the lower power CPU and memory may be offset by a more power-hungry GPU in some cases, but as long as you're not running a 3D application we expect the P-7811 to surpass the previously tested FX notebooks.

Power Requirements

System Power Requirements

Power draw (measured at the outlet) does indeed drop with the P-7811 relative to the P-6831. The P-171XL really wasn't in the running, given the second hard drive and X7900 CPU. Particularly in the 100% CPU load test, we see a massive benefit from the P8400 and DDR3; the P-7811 uses 13W less than the P-6831 (a 23% difference). In the maximum load test, where we tax both the CPU and GPU, the P-7811 does use 9W more than the P-6831; that's somewhat expected, however, as you can't generate a maximum GPU load without a faster CPU. In gaming power draw (not shown) the two systems are pretty much tied.
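As a quick sanity check on those figures, the 13W delta and 23% relative difference together pin down the underlying chart values. A minimal Python sketch (the reconstructed wattages are implied by the text, not read off the charts):

```python
def power_savings(baseline_w: float, new_w: float) -> tuple[float, float]:
    """Return the absolute (W) and relative (%) power savings vs. a baseline."""
    delta = baseline_w - new_w
    return delta, 100.0 * delta / baseline_w

# A 13 W delta that is a 23% difference implies a P-6831 draw of
# roughly 13 / 0.23 ≈ 56.5 W at 100% CPU load (reconstruction, not a
# measured chart value).
p6831_w = 13 / 0.23
p7811_w = p6831_w - 13

delta, pct = power_savings(p6831_w, p7811_w)
print(f"{delta:.0f} W less ({pct:.0f}% difference)")  # → 13 W less (23% difference)
```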

Battery Life

Of course, power requirements when a laptop is plugged in don't necessarily reflect power requirements when a laptop is on battery power. Hardware can provide better performance when plugged in and better battery life when in power saving mode, and the 9800M and P8400 should both provide such an advantage. For our battery life testing, we have now switched to running all laptops at around 100 nits brightness. Differences between displays and brightness adjustments mean we are not always at exactly 100 nits, but the range is 90-110 nits in all cases. If you choose to run your LCD at maximum brightness, you may lose anywhere from 10 to 60 minutes depending on the laptop - the latter is mostly for ultra-mobile options while the former is for gaming notebooks.

Battery Life - Idle

In terms of battery life, the P-7811 shows some significant improvements over the P-6831 and P-171XL. Clearly, the DDR3 and 25W TDP processor are doing their job, with the P-7811's battery life improving by up to 58% over the lower performing P-6831. Our three test scenarios cover DVD playback, web surfing (using the wireless adapter), and a best-case idle benchmark where we simply unplug the laptop and let it sit. Keep in mind that even light use of the laptop will reduce battery life relative to our idle scenario, so it is purely a high water mark.

With the P8400 (and Centrino 2) offering improved deep sleep states over the previous Santa Rosa refresh, the largest improvements are found in the idle test. DVD playback improves by an equally impressive 50%, and Internet surfing improves by 34%. Two and a half hours of battery life for movies or surfing is certainly nothing exceptional compared to some laptops, but for a gaming laptop it's actually one of the best results we've seen.
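For readers trying to extrapolate these results to other configurations, battery life under light load can be roughly estimated as pack capacity divided by average system draw. A minimal sketch, with both figures assumed purely for illustration (the review doesn't list the P-7811's pack capacity or its DC draw on battery):

```python
def runtime_hours(capacity_wh: float, avg_draw_w: float) -> float:
    """Ideal runtime estimate: battery capacity divided by average system draw."""
    return capacity_wh / avg_draw_w

# Both values below are assumptions for illustration; real-world runtime
# also loses time to conversion losses and the low-battery cutoff.
capacity_wh = 78.0   # assumed capacity of a large (~9-cell) pack
avg_draw_w = 31.0    # assumed average draw during DVD playback
print(f"~{runtime_hours(capacity_wh, avg_draw_w):.1f} hours")  # → ~2.5 hours
```

With those assumed inputs the estimate lands near the two and a half hours described above, which is why this back-of-the-envelope formula is a reasonable first pass when comparing battery options.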

Comments

  • JarredWalton - Friday, August 15, 2008 - link

    9800M GT has 64 SPs; GTS has 96 SPs (like the GTX), and the 9800M GTX has 112 SPs. There's some debate about whether there's rebranding or if there are actual differences; judging by the performance, I'd bet on there being some changes. I believe, for example, that 9800M has the VP3 video processing engine and it is also fabbed on 55nm instead of 65nm... but I might be wrong.
  • JarredWalton - Friday, August 15, 2008 - link

    Suck... I screwed that up. I don't know why NVIDIA switches GT/GTS meanings all the time. 8800 GTS 320/640 < 8800 GT < 8800 GTS 512. Now we have 8800M GTS < 8800M GT. Stupid. Also worth noting is that NVIDIA has released no specific details on the core/RAM clock speeds for the 9800M series.
  • fabarati - Friday, August 15, 2008 - link

    I was basing my information upon what Clevo resellers were saying in the Notebook Review forums. There was this huge fight about this, due to nVidia posting the wrong specs on their webpage. When the NDA was lifted, they could come out and say that they were the same card.

    But yea, nVIDIA is being really annoying with the suffixes. ATI has a pretty clear lineup, for now.
  • JarredWalton - Friday, August 15, 2008 - link

    Okay, updated with the clock speed info from nTune (as well as NVIDIA's specs pages). It looks like all of the shaders are 1250MHz, while the RAM speed on all the units I've seen so far is 800MHz (1600MHz DDR3). I don't know for sure what the clocks are on the 9800M GT/GTX, as I haven't seen a laptop with that GPU yet. So in order of performance, and assuming 600MHz GPU clocks on all the 9800 cores, we have:

    8800M GTS
    9800M GTS (up to ~20% faster than 8800M GTS)
    8800M GTX (up to ~50% faster than 8800M GTS)
    9800M GT (up to ~80% faster than 8800M GTS)
    9800M GTX (up to ~110% faster than 8800M GTS)

    Now, the maximum performance increase relative to the 8800M GTS is based on the game being purely shader processing limited. Many games depend on GPU memory bandwidth and fill rate as well, in which case the difference will be much smaller.
  • fabarati - Friday, August 15, 2008 - link

    Oh, and a 1440x900 resolution is a WXGA+ resolution, not SXGA+.
