Final Words

It's really not often that we have the pleasure of reviewing a product so impressively positioned. The 8800 GT is a terrific part, and it is hitting the street at a terrific price (provided NVIDIA's history of accurately projecting street prices continues). Its performance advantage at this price upends our perception of the GPU landscape. We liked the value of the 8800 GTS 320, and we were impressed when NVIDIA decided to go that route, providing such a high-performance card for so little money. Upping the ante even further this time around really caught us off guard.

This launch has the potential to introduce a card that leaves the same lasting impression on the computer industry that the Ti4200 left all those years ago. This kind of inflection point doesn't come along every year, or even every generation. But when architecture, process enhancements, and design decisions line up just right, the potential for a revolutionary product is high. Maybe our expectations were lowered by the lackluster performance of the 8600 and 2600 series, as well as the lack of true midrange cards priced between $200 and $250. But even setting aside the sad state of the low end and the missing midrange, the 8800 GT is a great option.

What we expect going forward is for NVIDIA to fill in their now mostly devastated product line (we only count the 8400, 8500, 8800 GT, and 8800 GTX/Ultra as viable offerings from NVIDIA) with a new range of 65nm parts. As soon as their process is up to speed and validated by a strong run of G92 hardware, it will only be logical to move all (or most) other GPUs over, since smaller die sizes translate directly into monetary savings. There is a cost associated with moving a design to a new process, but the 8800 GT could have been built with this in mind. It could be that the 8800 GT is simply a way to ramp up 65nm production for the rest of the lineup. NVIDIA could even have hidden some extra transistors in the design, allowing them to simply switch on a higher-end part once yields get high enough. Alternatively, we could see another line of low end cards based on the 65nm process.
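To illustrate why a smaller die translates into savings, here is a back-of-the-envelope dies-per-wafer estimate. The die areas below are rough public figures for G80 (90nm) and G92 (65nm), used purely as assumptions for this sketch, and defect yield is ignored:

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """First-order estimate of whole dies on a round wafer:
    wafer area / die area, minus an edge-loss correction.
    Ignores scribe lines and defect density."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

# Illustrative, approximate die sizes (not figures from this review):
for name, area_mm2 in [("G80-class (~484 mm^2)", 484),
                       ("G92-class (~330 mm^2)", 330)]:
    print(f"{name}: ~{dies_per_wafer(area_mm2)} dies per 300mm wafer")
```

More candidates per wafer at roughly the same wafer cost is exactly the lever that makes moving the whole lineup to 65nm attractive once the process is validated.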

Whatever the reason for the 8800 GT, we're glad it exists. This is truly the part to beat in terms of value.

Comments

  • DukeN - Monday, October 29, 2007 - link

This is unreal price-to-performance - knock on wood; playing Oblivion at 1920x1200 on a $250 GPU.

Could we have a benchmark based on the Crysis demo please, showing how one or two cards would do?

    Also, the power page pics do not show up for some reason (may be the firewall cached it incorrectly here at work).

    Thank you.
  • Xtasy26 - Monday, October 29, 2007 - link

    Hey Guys,

    If you want to see Crysis benchmarks, check out this link:

    http://www.theinquirer.net/gb/inquirer/news/2007/1...">http://www.theinquirer.net/gb/inquirer/.../2007/10...

    The benches are:

    1280 x 1024: ~37 fps
    1680 x 1050: 25 fps
    1920 x 1080: ~21 fps

    This is on a test bed:

    Intel Core 2 Extreme QX6800 @2.93 GHz
    Asetek VapoChill Micro cooler
    EVGA 680i motherboard
    2GB Corsair Dominator PC2-9136C5D
    Nvidia GeForce 8800GT 512MB/Zotac 8800GTX AMP!/XFX 8800Ultra/ATI Radeon HD2900XT
    250GB Seagate Barracuda 7200.10 16MB cache
    Sony BWU-100A Blu-ray burner
    Hiper 880W Type-R Power Supply
    Toshiba's external HD-DVD box (Xbox 360 HD-DVD drive)
    Dell 2407WFP-HC
    Logitech G15 Keyboard, MX-518 rat
  • Xtasy26 - Monday, October 29, 2007 - link

    This game seems really demanding. If it's getting 37 fps at 1280 x 1024, imagine what the frame rate will be with 4X FSAA enabled combined with 8X anisotropic filtering. I think I'll wait till NVIDIA releases their 9800/9600 GT/GTS and combine that with Intel's 45nm Penryn CPU. I want to play this beautiful game in all its glory! :)
  • Spuke - Monday, October 29, 2007 - link

    Impressive!!!! I read the article but I saw no mention of a release date. When's this thing available?
  • Spuke - Monday, October 29, 2007 - link

    Ummm..... When can I BUY it? That's what I mean.
  • EODetroit - Monday, October 29, 2007 - link

    Now.

    http://www.newegg.com/Product/ProductList.aspx?Sub...">http://www.newegg.com/Product/ProductLi...18+10696...
  • poohbear - Wednesday, October 31, 2007 - link

    When do you guys think it's gonna be $250? The cheapest I see is $270, but I understand the prices are jacked up a bit when it's first released.
  • EateryOfPiza - Monday, October 29, 2007 - link

    I second the request for Crysis benchmarks; that is the game that taxes everything at the moment.
  • DerekWilson - Monday, October 29, 2007 - link

    We actually tested Crysis ...

    but there were issues ... not with the game; we just shot ourselves in the foot on this one and weren't able to do as much as we wanted. We had to retest a bunch of stuff, and we didn't get to Crysis.
  • yyrkoon - Monday, October 29, 2007 - link

    Yes, I am glad that instead of purchasing a video card, I changed motherboard/CPU from AMD to Intel. I still like my AM2 Opteron system a lot, but the performance numbers and the effortless 1GHz OC on the ABIT IP35-E (at $90 USD!) were just too much to overlook.

    I can definitely understand your 'praise' now that NVIDIA is lowering their prices, but this is where these prices should have always been. NVIDIA and ATI/AMD have been ripping us consumers off for the last 1.5 years or so, so you'll excuse me if I don't show too much enthusiasm when they finally lower prices to where they should be. I don't consider this much different from the memory industry overcharging and the consumer getting the shaft (as per your article).

    I am happy though . . .
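As an aside on the Crysis figures quoted earlier in the thread: at GPU-limited settings, frame rate tends to scale roughly with the inverse of pixel count, which makes numbers like those easy to sanity-check. A quick sketch; the inverse-pixel-count model is our simplification, and the baseline fps and resolutions are simply the ones quoted above:

```python
# Rough GPU-limited scaling model: fps is inversely proportional to pixel count.
# Baseline taken from the figures quoted in the thread above.
base_fps = 37.0
base_pixels = 1280 * 1024

for w, h in [(1680, 1050), (1920, 1080)]:
    est = base_fps * base_pixels / (w * h)
    print(f"{w}x{h}: ~{est:.0f} fps estimated")
```

The model lands within a couple of fps of the quoted 25 and ~21 fps figures, so the numbers in the thread are at least self-consistent.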
