Out with the Old, In with the New: 8800 GT vs. 7950 GT and 1950 XT

Many gamers are likely still rocking either GeForce 7 or Radeon X1K based hardware. We understand that gamers don't have $250 set aside to upgrade their graphics card whenever something new comes out. A good many of us have been waiting (and not so patiently) for a DX10 class graphics card in the $200 - $250 range. The 8800 GTS 320MB has been a great option for those who could afford it, but the 8600 GTS and 2600 XT really haven't delivered anything close to the kind of performance we wanted for the price.

While we don't expect many people to "upgrade" to an 8800 GT from an 8800 GTS 320MB, we do expect those who spent $250 or more on a previous generation DX9 class card to be interested in moving up to a current generation product. In order to paint a good picture of what gamers with older hardware can expect, we decided to pick only a couple of reference points. We could have tested everything out there, but we felt that looking at the absolute fastest DX9 class card available (the Radeon X1950 XTX) and a card that offered good performance between $250 and $300 (the GeForce 7950 GT) would give us a fairly complete picture of what to expect.

The reason this really makes sense, as we will show in a second, is that the 8800 GT absolutely blows away every DX9 class part out there. The only thing we really need to show is what kind of performance improvement you can expect depending on the type of hardware you own. If you own the best possible previous generation card, you get a very good performance improvement at most resolutions. If you own a previous generation card from the same price segment, you can expect a huge improvement in performance across the board. That said, feast your eyes on what everyone who hasn't upgraded yet can look forward to (in addition to all the added features of the GeForce 8 Series).

Comments

  • defter - Monday, October 29, 2007 - link

    Yes, it has the VP2 processor for video decoding. But why would you need a fast gaming card for an HTPC? Wouldn't an 8400/8600 be a cheaper/cooler solution?
  • Hulk - Monday, October 29, 2007 - link

    Thanks for the reply.
    This card looks to be pretty cool-running, and when it's not running 3D-intensive apps I'm sure power consumption and noise are really low.
    So it might be nice to be able to play a little on a 52" LCD!
  • DerekWilson - Monday, October 29, 2007 - link

    also, if you go with a less powerful card for HD HTPC you'll want at minimum the 8600 GTS -- which is not a good card. The 8800 GT does offer a lot more bang for the buck, and Sparkle is offering a silent version.
  • spittledip - Monday, October 29, 2007 - link

    Nothing like cherry-picking the games... I don't understand why games like Stalker and Prey weren't tested, as the 2900 XT has superior performance on those titles, as well as other titles. Seems like a biased test.
  • AssBall - Monday, October 29, 2007 - link

    They didn't test The Sims 2 or Deer Hunter either...
  • DerekWilson - Monday, October 29, 2007 - link

    lol ... stalker and prey?

    we tested quake wars, which is effectively updated prey (id's engine).

    and stalker runs better on nvidia hardware -- when tested properly (many people use demo flybys that point up at the sky way too much rather than fraps run-throughs).
  • abe88 - Monday, October 29, 2007 - link

    Hmmm I thought ATI's RV630 and RV610 chips both support PCI-E 2.0?
  • Wirmish - Monday, October 29, 2007 - link

    Yeah, but it's not worth mentioning because these GPUs are not from NVIDIA.
  • defter - Monday, October 29, 2007 - link

    quote:

    The G92 is fabbed on a 65nm process, and even though it has fewer SPs, less texturing power


    G92 has the same number of SPs and MORE texturing power (twice as many addressing units) as G80. However, the 8800 GT card has some SPs and texture units disabled.

  • DerekWilson - Monday, October 29, 2007 - link

    well, first, if G92 has those units disabled, then it can't claim them.

    second, NVIDIA would not confirm that the G92 as implemented on the 8800 GT has units disabled, but it is fair to speculate that this configuration was chosen to work out yields on their first 65nm part.
