Here we are, a year after the launch of G80, and we are seeing what amounts to the first real "refresh" part. Normally, a new or revamped version of hardware arrives about six months after its introduction, but this time NVIDIA rolled its latest architecture out over a longer span. First the high end hardware hit; the low end parts emerged only later, with previous generation hardware left to serve the low end in the interim. We haven't seen a true midrange part over the past year, which has disappointed many.

Rather than actually create a midrange part based on G80, NVIDIA opted to tweak the core, shrink it to a 65nm process, integrate the display engine, and come out with hardware (G92) that performs somewhere between the high end 8800 GTS and GTX. While this, in itself, isn't remarkable, the fact that NVIDIA is pricing this card between $200 and $250 is. Essentially, we've been given a revised high end part at midrange prices. The resulting card, the 8800 GT, cannibalizes a large chunk of NVIDIA's own DX10 class hardware lineup. Needless to say, it also further puts AMD's 2900 XT to shame.



We will certainly provide data to back up all these ridiculous claims (I actually think NVIDIA may have invented the question mark as well), but until then, let's check out what we are working with. We've got a lot to cover, so let's get right to it.

G92: Funky Naming for a G80 Derivative
90 Comments

  • AggressorPrime - Monday, October 29, 2007 - link

    I made a typo. Let us hope they are not on the same level.
  • ninjit - Monday, October 29, 2007 - link

    This page has me very confused:
    http://www.anandtech.com/video/showdoc.aspx?i=3140...

    The text of the article goes on as if the GT doesn't really compare to the GTX, except on price/performance:

    quote:

    We would be out of our minds to expect the 8800 GT to even remotely compete with the GTX, but the real question is - how much more performance do you get from the extra money you spent on the GTX over the GT?


    quote:

    But back to the real story, in spite of the fact that the 8800 GT doesn't touch the GTX, two of them will certainly beat it for either equal or less money.



    Yet all the graphs show the GT performing pretty much on par with the GTX, with at most a 5-10fps difference at the highest resolution.

    I didn't understand that last sentence I quoted above at all.
  • archcommus - Monday, October 29, 2007 - link

    This is obviously an amazing card, and I hope it sets a new trend for getting good gaming performance in the latest titles for around $200 like it used to be, unlike the recent trend of having to spend $350+ for high end (not even ultra high end). However, I don't get why a GT part is higher performing than a GTS; isn't that going against their normal naming scheme a bit? I thought it was typically: Ultra -> GTX -> GTS -> GT -> GS, or something like that.
  • mac2j - Monday, October 29, 2007 - link

    I've been hearing rumors about an Nvidia 9800 card being released in the coming months .... is that the same card with an outdated/incorrect naming convention or a new architecture beyond G92?

    I guess if Nvidia had a next-gen architecture coming, it would explain why they don't mind wiping some of their old products off the board with the 8800 GT, which seems as though it will be a dominant part for the remaining lifetime of this generation of parts.
  • MFK - Monday, October 29, 2007 - link

    After lurking on Anandtech for two layout/design revisions, I have finally decided to post a comment. :D
    First of all hi all!

    Second of all, is it okay that nVidia decided not to introduce a proper next gen part in favour of this mid range offering? Okay, so it's good and whatnot, but what I'm wondering, and something the article does not talk about, is what the future value of this card is. Can I expect it to play some upcoming games (Alan Wake?) at 1600 x 1200? I know it's hard to predict, but industry analysts like you guys should have some idea. Also, how long can I expect this card to keep playing games at acceptable framerates? Any idea, anyone?
    Thanks.
  • DerekWilson - Monday, October 29, 2007 - link

    that's a tough call ....

    but really, it's up to the developers.

    UT3 looks great in DX9, and Bioshock looks great in DX10. Crysis looks amazing, but it's a demo, not final code, and it does run very slowly.

    The bottom line is that developers need to balance the amazing effects they show off with playability -- it's up to them. They know what hardware you've got, and they choose whether to push the envelope or not.

    I know that's not an answer, sorry :-( ... it is just nearly impossible to say what will happen.
  • crimson117 - Monday, October 29, 2007 - link

    How much RAM was on the 8800 GT used in testing? Was it 256 or 512?
  • NoBull6 - Monday, October 29, 2007 - link

    From context, I'm thinking 512. Since 512MB cards are the only ones available in the channel, and Derek was hypothesizing about the pricing of a 256MB version, I think you can be confident this was a 512MB test card.
  • DerekWilson - Monday, October 29, 2007 - link

    correct.

    256MB cards do not exist outside NVIDIA at this point.
  • ninjit - Monday, October 29, 2007 - link

    I was just wondering about that too.

    I thought I had missed it in the article, but I didn't see it in another run-through.

    I see I'm not the only one who was curious.
