Introduction

A little over a month and a half after the initial GeForce 7800 GTX launch, we are now looking at the next member of the 7800 family: the 7800 GT. Launching today at QuakeCon, this slightly slower, slightly cheaper incarnation of the G70 will also usher in a restructuring of GeForce 6 Series prices. The new card will spend some time finding its price niche, which should settle somewhere around $400 (although the NVIDIA MSRP is $449). Given its price point and promised performance, we expect the GT to shake up the market rather more than the ultra-high-end incarnation of the G70 did.



With the 6800 Ultra still a little overpriced at between four and five hundred USD, the introduction of the 7800 GT will either push the 6800 Ultra way down in price or out of the market entirely. The G70 line doesn't offer a fundamental feature set expansion over the previous 6 Series parts: the move from the GeForce 4 to the FX series first introduced DX9 support, and the 6 Series brought SM 3.0, real programmable shader performance, and native PCI Express. Without extremely compelling new features, the spotlight shifts to pure performance, cost, and power/heat. Rounding out the high end, the 7800 GT fills the performance gap between the 7800 GTX and the 6800 Ultra.

In our minds, there is really no reason for NVIDIA to release any more consumer desktop parts based on G70 when the NV4x series covers the rest of the line-up very well. Perhaps a faster passively cooled card, based on a very low clock speed part with more pipelines than the current low end, could serve the Home Theater PC (HTPC) crowd, who demand silence alongside performance. Other than that, the mobile space is the only segment that we see really yearning for G70 power. Until performance is increased beyond the 7800 GTX, it will be hard for us to see a reason for a new desktop 7 Series part.

Time will tell if our prediction is correct. For now, we are interested in finding out if the 7800 GT is worth the money. Is the performance of the 7800 GTX enough to warrant the price difference, or should we all just be looking at the GT instead? Will the 6800 Ultra be cannibalized by the 7800 GT?


77 Comments


  • Quiksel - Thursday, August 11, 2005 - link

    Like I mentioned in one of the other articles:

    "(1) I understand that taking new tech and reviewing it on launch day, etc., is important. (2) Then comes the mass production of the tech by different manufacturers, so there's a need for the readers to be informed on the differences between the different products. (3) Then there's the difference between the interim releases after the initial launch of the new tech that also need reviewing and explanation. From those three different times of a piece of new tech, I would typically expect 3 articles or so for each piece of said new tech. From my initial post, I have just been surprised that what seems to be happening are lots of reviews centered around the second phase of your review cycle, and so that's why I was asking whether this is really what readers want to see on AT all the time (i.e., $500 graphics cards to ogle and wish a relative would die so that we could afford one)."

    "Can't tell you how weird I felt last night to read the new article about the $3000 desk. I guess it helps to have some off-the-wall review about such a nice piece of desk. But is that really what the readers want to see? More hardware that they can't afford? One poster above me here mentioned that you've lost touch with your readers, and sometimes, I wonder whether you're really just trying to fill a niche that no one else is really pursuing in an effort to either drive the industry in that direction or just cater to a crowd that may or may not even visit here. Who knows. I sure got confused with such an article. These 7800GTX articles have done the same for me."

    "I don't know what to tell ya to do, because I'm not in your position. But I certainly don't feel as at home on this site as I used to. Am I getting too old to appreciate all this nice shiny new expensive hardware?? :)"

    4 out of the last 5 articles on AT are all this high-end tech! Where's the sweet spot? The budget? ANYTHING ELSE BUT THE HIGH-END??

    flame away, thanks :)
  • coldpower27 - Thursday, August 11, 2005 - link

    What else is there to review? I mean it's not like NVIDIA has released the 7600 Series yet??? Neither is RV530 anywhere to be found. And typically a high-end piece of hardware is new, and remember Anandtech did review the Athlon 64 X2 3800+. Though I would like to see a review of the recently announced Sempron 3400+. I would also like to see how the new Celeron D 351 stacks up as well.

    I am not sure it's all that interesting to review the same video card over and over again like reference 6600 GT vs a new one with a new more advanced heatsink, then a new one with a better bundle of software etc...

  • JarredWalton - Friday, August 12, 2005 - link

    I have my doubts as to whether a 7600-type card will even *BE* launched in the next six months. Think about it: why piss off all the owners of 6800GT cards by releasing a new card that isn't SLI compatible? From the customer support standpoint, it's better to keep the older SLI-capable cards in production and simply move them to the mid-range and value segments. Which is exactly what NVIDIA did with the 6800 and 6800GT with this launch. Now if the 6800U would just drop to $350, everything would be about right.
  • jkostans - Thursday, August 11, 2005 - link

    The 7800GT is slightly slower than a 6800 Ultra SLI setup and the GTX is on par or faster. The GT AND GTX cost less than the additional 6800 Ultra needed to upgrade to SLI, so SLI is rather useless. Why opt for an extra power-hungry 6800 Ultra when you can just swap for a lower-power 7800 GT, or a better-performing and lower-power GTX, for less money? This will happen with the 7800 GTX SLI setup too. SLI should only be a consideration as an initial buy (for rich gamers who want the absolute best), not as an upgrade path for later. Gotta love nVIDIA "rendering" their own technology useless, lol!
  • JNo - Thursday, August 11, 2005 - link

    Hear, hear! Good point, well made, and I think intelligent people realised this from the off. Let me think: 2x 6800U dustbusters causing a racket, or 1 new 7800GT(X)...
  • Anemone - Thursday, August 11, 2005 - link

    Hi there

    I'd like to suggest maybe using 1920x1200 for high res tests. The popularity of widescreen gaming (where possible) is growing, and this provides a more commonly used "extreme resolution" than the 2048x1536, thus, imo a bit more relevant.

    Just my $.02

    Thanks
  • JNo - Thursday, August 11, 2005 - link

    I second this motion for 1920x1200!! Why test at 2048x1536 when most people who could afford these monitors (albeit CRTs) would likely go for widescreen instead? Slightly fewer pixels but better visual impact... (nb: love watching other CS players not spotting an enemy on the periphery of my screen, presumably cos their monitors are not widescreen!)
  • adonn78 - Thursday, August 11, 2005 - link

    First off, no gamer plays videogames at resolutions above 1600x1200! Most of us stick to 1024x768 so that we can get high framerates, enable all the features, and play the game on the highest settings. In addition, you did not show how the GT and GTX stacked up against the previous generation, such as the 6800 Ultra, GT, and the 5950 Ultra. And where is the AGP version? My computer is 2 years old and I am upgrading my graphics card soon. I guess I'll wait to see if ATI makes AGP cards for their next generation. And where the heck is the R520? ATI is really lagging this time around. Hopefully we will get some AGP love. AGP still has a good 2 years of life left in it.
  • DerekWilson - Thursday, August 11, 2005 - link

    I play games at 1920x1080 and 1920x1200 depending on what room I'm in ... and I like to have at least 8xAF on and 4xAA if I can.

    When I'm not playing at those resolutions, I'm playing at 1600x1200 with 4xAA 8xAF period. Any lower than that and I feel like I'm back in 1996.

    But that may just be me :-)

    If I ran benchmarks at 1024x768, no matter the settings, all these cards would give me the same number (barring EverQuest 2 on extreme quality, which would probably still be slow).

    I also play with vsync on so I don't get tearing ... but we test with it off so we can remove the limits and see the cards' potential.
  • neogodless - Thursday, August 11, 2005 - link

    Hey, that's good to know about the vsync... back when I played Doom III, I noticed some of that, but didn't know much about it. I just felt "robbed" because my GeForce 6800GT was giving me tearing... thought maybe it couldn't keep up with the game. But everywhere I went I saw people saying "Vsync off! Two legs good!"
