The Card

The GeForce 8800 GT, whose heart is a G92 GPU, is quite a sleek card. The heatsink shroud covers the entire length of the card, so no capacitors are exposed. Thanks to the 65nm G92, the card's thermal envelope is low enough to require only a single-slot cooling solution. Here's a look at the card itself:



The card makes use of two dual-link DVI outputs and a third output for analog HD and other applications. We see a single SLI connector on top of the card, and a single 6-pin PCIe power connector on the back of the card. NVIDIA reports the maximum dissipated power as 105W, which falls within the 150W power envelope provided by the combination of one PCIe power connector and the PCIe x16 slot itself.
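As a sanity check on those figures, here is a quick sketch of the power-budget arithmetic, assuming the standard PCIe allotments of 75W from the x16 slot and 75W per 6-pin auxiliary connector (the helper name is our own):

```python
# Power budget available to a PCIe graphics card (assumed values:
# 75W from the x16 slot, 75W per 6-pin auxiliary connector).
SLOT_W = 75
SIX_PIN_W = 75

def power_headroom(card_max_w, six_pin_connectors=1):
    """Return the watts of headroom left in the card's power budget."""
    budget = SLOT_W + six_pin_connectors * SIX_PIN_W
    return budget - card_max_w

# 8800 GT: 150W budget vs. NVIDIA's reported 105W maximum draw.
print(power_headroom(105))  # 45W to spare
```

With 45W of margin under the 150W ceiling, there's no need for a second power connector, which helps keep the board simple.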

The move to 65nm has prompted at least one vendor to attempt an 8800 GT with a passive cooler. While the 8800 GT does use less power than other cards in its class, we will have to wait and see whether passive cooling remains stable through the most rigorous tests we can put it through.

Earlier this summer we reviewed NVIDIA's VP2 hardware in the form of the 8600 GTS. The 8800 GTX and GTS both lacked the faster video decode hardware of the lower end 8 Series parts, but the 8800 GT changes all that. We now have a very fast GPU that includes full H.264 decode offload capability. Most of the VC-1 pipeline is also offloaded to the GPU, but the entropy decode stage of VC-1 is not hardware accelerated by NVIDIA hardware. This matters less for VC-1, as its decode process is much less strenuous. To recap the pipeline, here is a comparison of different video decode hardware:



NVIDIA's VP2 hardware matches the bottom line for H.264, and the line above for VC-1 and MPEG-2. This includes the 8800 GT.

We aren't including any new tests here, as we can expect performance on the same level as the 8600 GTS. This means a score of 100 under HD HQV, and very low CPU utilization even on lower end dual core processors.

Let's take a look at how this card stacks up against the rest of the lineup:

Form Factor                   8800 GTX     8800 GTS       8800 GT        8600 GTS
Stream Processors             128          96             112            32
Texture Address / Filtering   32 / 64      24 / 48        56 / 56        16 / 16
ROPs                          24           20             16             8
Core Clock                    575MHz       500MHz         600MHz         675MHz
Shader Clock                  1.35GHz      1.2GHz         1.5GHz         1.45GHz
Memory Clock                  1.8GHz       1.6GHz         1.8GHz         2.0GHz
Memory Bus Width              384-bit      320-bit        256-bit        128-bit
Frame Buffer                  768MB        640MB / 320MB  512MB / 256MB  256MB
Transistor Count              681M         681M           754M           289M
Manufacturing Process         TSMC 90nm    TSMC 90nm      TSMC 65nm      TSMC 80nm
Price Point                   $500 - $600  $270 - $450    $199 - $249    $140 - $199


On paper, the 8800 GT makes the 8800 GTS look redundant. The 8800 GT has more shader processing power and can address and filter more textures per clock, falling short only in the number of pixels it can write out to memory per clock and in overall memory bandwidth. Even then, the 8800 GTS's memory bandwidth advantage isn't that great (64GB/s vs. 57.6GB/s), amounting to only 11% thanks to the 8800 GT's slightly higher memory clock. If the 8800 GT ends up performing as well as, if not better than, the 8800 GTS, NVIDIA will have truly thrown down an amazing hand.
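Those bandwidth figures fall straight out of the bus width and effective memory clock in the table above; a minimal sketch (the helper name is our own):

```python
# Peak memory bandwidth in GB/s:
# (bus width in bytes) x (effective DDR memory clock in GHz).
def mem_bandwidth_gb_s(bus_width_bits, effective_clock_ghz):
    return (bus_width_bits / 8) * effective_clock_ghz

# Figures from the spec table above.
print(mem_bandwidth_gb_s(320, 1.6))  # 8800 GTS: 64.0 GB/s
print(mem_bandwidth_gb_s(256, 1.8))  # 8800 GT: ~57.6 GB/s, an ~11% gap
```

Note how the 8800 GT's faster 1.8GHz memory claws back most of what it loses to the GTS's wider 320-bit bus.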

You see, the GeForce 8800 GTS 640MB was an incredible performer upon its release, but it was still priced too high for the mainstream. NVIDIA turned up the heat with a 320MB version, which you'll remember performed virtually identically to the 640MB while bringing the price down to $300. With the 320MB GTS, NVIDIA gave us the performance of its $400 card for $300, and now with the 8800 GT, NVIDIA looks like it's going to give us that same performance for $200. And all this without a significant threat from AMD.

Before we get too far ahead of ourselves, we'll need to see how the 8800 GT and 8800 GTS 320MB really stack up. On paper the decision is clear, but we need some numbers to be sure. And we can't get to the numbers until we cover a couple more bases. The only other physical point of interest about the 8800 GT is that it takes advantage of the PCIe 2.0 specification. Let's take a look at what that really means.

90 Comments

  • vijay333 - Monday, October 29, 2007 - link

    just activated the step-up on my current 8800GTS 320MB -- after shipping costs and discounting the MIR from back then, I actually get the 8800GT 512MB for -$12 :)
  • bespoke - Monday, October 29, 2007 - link

    Lucky bastard! :)
  • vijay333 - Monday, October 29, 2007 - link

    hehe...great timing too. only had 5 days remaining before the 90day limit for the step-up program expired :)
  • clockerspiel - Monday, October 29, 2007 - link

    Generally, Anandtech does an excellent job with its reviews and uses robust benchmarking methodology. Any ideas why the Tech Report's results are so different?

    http://www.techreport.com/articles.x/13479


  • Frumious1 - Monday, October 29, 2007 - link

    Simply put? TechReport is doing some funny stuff (like HardOCP often does) with their benchmarking on this one. I have a great idea: let's find the WORST CASE SCENARIO for the 8800 GT vs. the 8800 GTS 640 and then ONLY show those resolutions! 2560x1600 4xAA/16xAF? Ignoring the fact that 16xAF isn't noticeably different from 8xAF - and that 4xAA is hardly necessary at 2560x1600 - there are just too many questions left by the TR review. They generally come to the same conclusion that this is a great card, but it's almost like they're struggling to find ANY situation where the 8800 GT might not be as good as the 8800 GTS 640.

    For a different, more comprehensive look at the 8800 GT, why not try the FiringSquad review (http://www.firingsquad.com/hardware/nvidia_geforce...)? They test at a variety of resolutions with a decent selection of GPUs and games. Out of all of their results, the only situation where the 8800 GTS 640 comes out ahead of the 8800 GT is in Crysis at 2xAA/8xAF at 1920x1200. Granted, they don't have 2560x1600 resolutions in their results, but how many midrange people use 30" LCDs? For that matter, how many highend gamers use 30" LCDs? I'm sure they're nice, but for $1300+ I have a lot of other stuff I'd be interested in purchasing!

    There are a lot of things that we don't know about testing methodology with all of the reviews. What exact detail settings are used, for example, and more importantly how realistic are those settings? Remember Doom 3's High Quality and Ultra Quality? Running everything with uncompressed textures to artificially help 512MB cards appear better than 256MB cards is stupid. Side by side screenshots showed virtually no difference. I don't know what the texture settings are in the Crysis demo, but I wouldn't be surprised if a bunch of people are maxing everything out and then crying about performance. Being a next gen title, I bet Crysis has the ability to stress the 1GB cards - whether or not it really results in an improved visual experience.

    Maybe we can get some image quality comparisons when the game actually launches, though - because admittedly I could be totally wrong and the Crysis settings might be reasonable.
  • Parafan - Monday, October 29, 2007 - link

    I just dont like being fed by the same site to tell 2 totally different things when picking my new GPU card.
  • Parafan - Monday, October 29, 2007 - link

    I've been following anandtech test results very carefully since the UT3 demo was released. What I can find comparing these results to the others in UT3 just doesn't make any sense:

    1.st

    Looking at: http://www.anandtech.com/video/showdoc.aspx?i=3140...
    shows the new 8800GT card beating the 2900XT by almost 120fps vs 105fps or so, in 1280*1024 @ UT3.

    2.nd
    Looking at the first & second GPU test: http://www.anandtech.com/video/showdoc.aspx?i=3128...
    Shows the 2900XT being on top with about 108.5fps, vs 8800 ULTRA, GTX and GTS with 104.2, 98.3 and 97.2 @ 1280 * 1024.
    Pretty close numbers, you see.

    3.rd
    Looking at the new test again, 8800GT VS 8800GTS: http://www.anandtech.com/video/showdoc.aspx?i=3140...
    Shows the 8800GT beating the 8800GTS @ 1280 * 1024: close to 120fps vs 105fps. The GTS is still over 100, after being below 100 in the previous test.
    But the huge difference is @ 1600 * 1200. The 8800GT right above 100fps, when the GTS is around 90? On the previous test the GTS showed results as low as 77fps, c'mon, something smells weird.

    See where im going?

    http://www.anandtech.com/video/showdoc.aspx?i=3140...
    just showed the 8600GTS performing a lot worse in this new test compared to the old one, @ all resolutions.

    and again

    http://www.anandtech.com/video/showdoc.aspx?i=3140...
    8800GT and 8800GTX performing about the same, at the highest almost 120fps. Compared to the previous test that's like 20fps better than the GTX performed last time. Why don't these tests correspond at all to the one just made?

    Seems like all the 8800GT, GTX, ULTRA cards just got a whole freaking lot better, making the 2900XT look worse. WHICH I FIND DOUBTFUL.. Someone bring the facts to the table.

    Don't tell me 2 extra GB of RAM made the NVIDIA cards play a lot better, and the ATI card a lot worse!

  • DerekWilson - Monday, October 29, 2007 - link

    We used a different driver version this time -- in fact, we've gone through two driver revisions from NVIDIA here.

    The AMD card didn't slip significantly in performance at all (differences were all within 3%).

    We did rerun the numbers, and we really think it's a driver issue -- the new NV driver improved performance.
  • Parafan - Wednesday, November 7, 2007 - link

    Well clearly a driver issue this must be. But I read NVIDIA's 169.xx drivers were made to optimize performance while lowering graphics quality.
    This seemed proven when the water in Crysis looked worse with 169.04 and 169.01 than with their previous 163.xx drivers.
