G92: Funky Naming for a G80 Derivative

If we expected G9x to represent a new architecture powering a GeForce 9 series, we would be wrong. Despite the fact that part of NVIDIA's stated reason for moving away from NVxx code names was to bring code name and product name closer to parity (G7x is GeForce 7, G8x is GeForce 8), the company has broken this rule rather early on. Code names are automatically generated, but why we ended up with only three different G8x parts before hitting G9x is certainly a mystery, and one NVIDIA didn't feel like enlightening us on, as it no doubt has to do with unannounced products.

While not a new architecture, the GPU behind the 8800 GT has certainly been massaged quite a bit from the G80. The G92 is fabbed on a 65nm process, and even though it has fewer SPs, less texturing power, and fewer ROPs than the G80, it is built from more transistors (754M vs. 681M). This is partly because the G92 integrates the updated video processing engine (VP2) and the display engine that previously resided off chip. All of the display logic, including the TMDS hardware, is now integrated onto the GPU itself.

In addition to the new features, there are enhancements to the architecture that likely added a few million transistors here and there as well. While we were unable to get firm details, we were told that lossless compression ratios were increased in order to enable better performance at higher resolutions over the narrower memory bus attached to the G92 on the 8800 GT. We also know that the ratio of texture address units to texture filtering units has increased to 1:1 (as on the 8600 GTS, but in a context where we can actually expect decent performance). This should also improve memory bandwidth usage and texturing power in general.

Because NVIDIA was touting the addition of hardware double precision IEEE 754 floating point to its workstation hardware sometime before the end of the year, we suspected that G92 might include this functionality. It seems, however, that the hardware behind that advancement has been pushed back for some reason: G92 does not support double precision floating point in hardware. Double precision is only really useful for workstation and GPU computing applications at the moment, but because NVIDIA designs one GPU for both consumer and workstation products, it will be interesting to see if they do anything at all with double precision on the desktop.

With every generation, we can expect buffers and on-chip memory to be tweaked based on experience with the previous iteration of the hardware, which could also account for additional transistors. Whatever the reason, this GPU packs quite a number of features into a very small area. Integrating these features into one ASIC is economically feasible because of the 65nm process: even though there are more transistors, the physical die takes up much less space than the G80's.

Comments

  • Spacecomber - Monday, October 29, 2007 - link

    It's hard to tell what you are getting when you compare the results from one article to those of another article. Ideally, you would like to be able to assume that the testing was done in an identical manner, but this isn't typically the case. As was already pointed out, look at the drivers being used. The earlier tests used nvidia's 163.75 drivers while the tests in this article used nvidia's 169.10 drivers.

    Also, not enough was said about how Unreal 3 was being tested to know, but I wonder if they benchmarked the game differently for the different articles. For example, were they using the same map demo? Were they using the game's built-in fly-bys or were they using FRAPS? These kinds of differences could make direct comparisons between articles difficult.
  • spinportal - Monday, October 29, 2007 - link

    Have you checked the driver versions? Over time drivers do improve performance, perhaps?
  • Parafan - Monday, October 29, 2007 - link

    Well, the 'new' drivers made the GF 8600 GTS perform a lot worse, but the higher-ranked cards better. I don't know how likely that is.
  • Regs - Monday, October 29, 2007 - link

    To blacken. I am a big AMD fan, but right now it's almost laughable how they're getting stepped and kicked on by the competition.

    AMD's ideas are great for the long run, and their 65nm process was just a mistake since 45nm is right around the corner. They simply do not know how to compete when the heat is on. AMD is still traveling in 1st gear.
  • yacoub - Monday, October 29, 2007 - link

    "NVIDIA Demolishes... NVIDIA? 8800 GT vs. 8600 GTS"

    Well the 8600GTS was a mistake that never should have seen the light of day: over-priced, under-featured from the start. The 8800 GT is the card we were expecting back in the Spring when NVidia launched that 8600 GTS turd instead.
  • yacoub - Monday, October 29, 2007 - link

    First vendor to put a quieter/larger cooling hsf on it gets my $250.
  • gamephile - Monday, October 29, 2007 - link

    Dih. Toh.
  • CrystalBay - Monday, October 29, 2007 - link

    Hi Derek, how are the temps under load? I've seen some results with the GPU pushing past 88°C with that anemic stock cooler.
  • Spacecomber - Monday, October 29, 2007 - link

    I may be a bit misinformed on this, but I'm getting the impression that Crysis represents the first game that makes major use of DX10 features, and as a consequence, it takes a major bite out of the performance that existing PC hardware can provide. When the 8800GT is used in a heavy DX10 game context does the performance that results fall into a hardware class that we typically would expect from a $200 part? In other words, making use of the Ti-4200 comparison, is the playable performance only acceptable at moderate resolutions and medium settings?

    We've seen something like this before: when DX8 hardware first became available and people were still playing DX7 games on it, performance was very good. Once true DX8 games started to show up, the hardware that first supported DX8 features (like the Ti-4200) struggled to actually run those features.

    Basically, I'm wondering whether Crysis (and other DX10 games that presumably will follow) places the 8800GT's $200 price point into a larger context that makes sense.
  • Zak - Monday, November 5, 2007 - link

    I ran Vista for about a month before switching back to XP due to Quake Wars crashing a lot (no more crashes under XP). I ran a bunch of demos during that month, including Crysis and Bioshock, and I swear I didn't see much visual difference between DX10 on Vista and DX9 on XP. Same for TimeShift (does it use DX10?). And all the games ran faster on XP. I really see no compelling reason to go back to Vista just for DX10.

    Zak
