Lighting the Flame

Back in March, we reviewed what has to be regarded as one of the most well-balanced and affordable gaming notebooks we have ever seen. The Gateway P-6831 FX offered very good gaming performance, keeping pace with some of the heavy-hitting boutique laptop vendors. The truly impressive aspect was that Gateway managed to ship all of this in a notebook that cost only $1300.

Their approach was to mass produce a notebook that offered one of the slower Core 2 Duo processors paired with one of the fastest mobile GPUs. With most games still bottlenecked by graphics performance - particularly on laptops - this was a great move. Sure, a little more CPU power would have been nice (and the follow-up P-6860 did increase the CPU from the T5450 to the T5550), but otherwise the P-6831 FX was an excellent design. Besides, if you really wanted CPU performance, you could always go out and purchase your own T8300 and still come out with a total cost much lower than the competition. The result was that we gave the P-6831 our Gold Editors' Choice award.

If there was one serious problem with the P-6831, it was availability. That particular model was only available through Best Buy, and while there appeared to be a reasonable number of laptops at launch, the favorable press and amazing price quickly made it difficult to find any in stock. One alternative was to simply shop online and purchase a similarly configured notebook from Gateway; although the price was a few hundred dollars more, you also got some upgrades. Other online retailers also carry many Gateway notebooks, including the P-173X FX for $1350, which bumps the processor up to a T7500.

Six months later, Gateway and Best Buy are teaming up again with an upgraded version of the P-6831. We are still working on a larger laptop roundup, but we felt it would be beneficial to alert our readers to the availability of this amazing value sooner rather than later. We will have additional details as part of the roundup; for now, we present some initial benchmark results and an overview of the upgrades.

Features and Specifications

  • JarredWalton - Friday, August 15, 2008 - link

    9800M GT has 64 SPs; GTS has 96 SPs (like the GTX), and the 9800M GTX has 112 SPs. There's some debate about whether there's rebranding or if there are actual differences; judging by the performance, I'd bet on there being some changes. I believe, for example, that 9800M has the VP3 video processing engine and it is also fabbed on 55nm instead of 65nm... but I might be wrong.
  • JarredWalton - Friday, August 15, 2008 - link

    Suck... I screwed that up. I don't know why NVIDIA switches GT/GTS meanings all the time. 8800 GTS 320/640 < 8800 GT < 8800 GTS 512. Now we have 8800M GTS < 8800M GT. Stupid. Also worth noting is that NVIDIA has released no specific details on the core/RAM clock speeds for the 9800M series.
  • fabarati - Friday, August 15, 2008 - link

    I was basing my information on what Clevo resellers were saying in the Notebook Review forums. There was a huge fight about it, due to nVidia posting the wrong specs on their webpage. When the NDA was lifted, the resellers could come out and say that they were the same card.

    But yea, nVIDIA is being really annoying with the suffixes. ATI has a pretty clear lineup, for now.
  • JarredWalton - Friday, August 15, 2008 - link

    Okay, updated with the clock speed info from nTune (as well as NVIDIA's specs pages). It looks like all of the shaders are 1250MHz, while the RAM speed on all the units I've seen so far is 800MHz (1600MHz DDR3). I don't know for sure what the clocks are on the 9800M GT/GTX, as I haven't seen a laptop with that GPU yet. So in order of performance, and assuming 600MHz GPU clocks on all the 9800 cores, we have:

    8800M GTS
    9800M GTS (up to ~20% faster than 8800M GTS)
    8800M GTX (up to ~50% faster than 8800M GTS)
    9800M GT (up to ~80% faster than 8800M GTS)
    9800M GTX (up to ~110% faster than 8800M GTS)

    Now, the maximum performance increase relative to the 8800M GTS is based on the game being purely shader processing limited. Many games depend on GPU memory bandwidth and fill rate as well, in which case the difference will be much smaller.
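    As a rough sanity check on one of those ceilings, here is a minimal sketch (not from the original comment) of the simple scaling model being described: shader throughput proportional to SP count times shader clock. Using the commonly quoted 64 SPs for the 8800M GTS and 96 SPs for the 8800M GTX, both at the 1250MHz shader clock mentioned above, the model reproduces the "up to ~50%" figure in the list:

    ```python
    # Rough shader-throughput model: SP count x shader clock.
    # This is a simplification: real games are also limited by fill rate
    # and memory bandwidth, so actual gains will usually be smaller.
    def shader_throughput(sp_count, shader_clock_mhz):
        return sp_count * shader_clock_mhz

    # Commonly quoted SP counts; 1250MHz shader clock per the comments above.
    gts_8800m = shader_throughput(64, 1250)
    gtx_8800m = shader_throughput(96, 1250)

    speedup = gtx_8800m / gts_8800m - 1
    print(f"8800M GTX over 8800M GTS: up to {speedup:.0%}")  # up to 50%
    ```

    The other entries in the list don't fall out of SP counts alone, which suggests clock differences (or other changes) between the 8800M and 9800M parts.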
  • fabarati - Friday, August 15, 2008 - link

    Oh, and a 1440x900 resolution is a WXGA+ resolution, not SXGA+.
