Under the Hood of G84

So the quick and dirty summary of the changes is that G84 is a reduced-width G80 with a higher proportion of texture to shader hardware and a reworked PureVideo processing engine (dubbed VP2, as opposed to G80's VP1). Because there are fewer ROPs, fill rate and antialiasing capability are also reduced relative to G80. That matters less on a budget card, where shader power won't be able to keep up with huge resolutions anyway.

We expect the target audience of the 8600 series to be running 1280x1024 panels. Of course, some people will be running larger displays, and we will test some higher resolutions to see what kind of capabilities the hardware has, but tests above 1600x1200 are somewhat academic. As 1080p TVs become more popular in the coming years, however, we may start putting pressure on graphics makers to target 1920x1200 as the standard resolution for mainstream parts, even if the average computer monitor weighs in with fewer pixels.
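
To put those resolution targets in perspective, a quick back-of-the-envelope pixel count is useful. The sketch below (plain Python, assuming nothing beyond the resolutions themselves) shows how much more per-frame work the larger panels demand, since shader and fill requirements scale roughly with the number of pixels rendered:

```python
# Pixel counts for the resolutions discussed above; per-frame pixel work
# scales roughly with resolution, so the ratios hint at relative GPU load.
base = 1280 * 1024

for w, h in [(1280, 1024), (1600, 1200), (1920, 1200)]:
    pixels = w * h
    print(f"{w}x{h}: {pixels:>9,} pixels ({pixels / base:.2f}x the 1280x1024 load)")

# 1280x1024: 1,310,720 pixels (1.00x the 1280x1024 load)
# 1600x1200: 1,920,000 pixels (1.46x the 1280x1024 load)
# 1920x1200: 2,304,000 pixels (1.76x the 1280x1024 load)
```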

In order to achieve playable performance at 1280x1024 with good quality settings, NVIDIA has gone with 32 shaders, 16 texture address units, and 8 ROPs. Here's the full breakdown:

GeForce 8600/8500 Hardware

                              GeForce 8600 GTS    GeForce 8600 GT    GeForce 8500
Stream Processors             32                  32                 16
Texture Address / Filtering   16 / 16             16 / 16            8 / 8
ROPs                          8                   8                  4
Core Clock                    675 MHz             540 MHz            450 MHz
Shader Clock                  1.45 GHz            1.19 GHz           900 MHz
Memory Clock (Data Rate)      2 GHz               1.4 GHz            800 MHz
Memory Bus Width              128-bit             128-bit            128-bit
Frame Buffer                  256 MB              256 MB             256 MB / 512 MB
Outputs                       2x dual-link DVI    2x dual-link DVI   ?
Transistor Count              289M                289M               ?
Price                         $200 - $230         $150 - $160        $90 - $130

We'll tackle the 8500 in more depth when we have hardware; for now, its data is included for reference. As for the 8600, right out of the gate, 32 SPs mean one third the clock-for-clock shader power of the 8800 GTS. At the same time, NVIDIA has increased the ratio of texture address units to SPs from 1:4 to 1:2, and we also see a 1:1 ratio of texture address to filter units. These changes prompted NVIDIA to further optimize its scheduling algorithms.
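
To put a number on that, here's a minimal sketch of peak shader throughput. The 8800 GTS figures (96 SPs at 1.2GHz) are NVIDIA's published specs; assuming one MAD (two flops) per SP per clock is a simplification that ignores the efficiency improvements discussed below:

```python
# Rough peak shader throughput, assuming each SP retires one MAD (2 flops)
# per shader clock -- a simplification that ignores scheduling efficiency.
def peak_gflops(num_sps, shader_clock_ghz, flops_per_clock=2):
    return num_sps * shader_clock_ghz * flops_per_clock

gf8600gts = peak_gflops(32, 1.45)  # GeForce 8600 GTS
gf8800gts = peak_gflops(96, 1.20)  # GeForce 8800 GTS (published specs)

print(f"8600 GTS: {gf8600gts:.0f} GFLOPS")       # ~93 GFLOPS
print(f"8800 GTS: {gf8800gts:.0f} GFLOPS")       # ~230 GFLOPS
print(f"ratio:    {gf8600gts / gf8800gts:.2f}")  # ~0.40
```

Clock for clock, the ratio is exactly 32/96, or one third; the 8600 GTS's higher shader clock pulls its raw peak up to roughly 40% of the 8800 GTS before any efficiency gains are counted.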

The combination of greater resource availability and improved scheduling allows for increased efficiency. In other words, clock for clock, G84 SPs are more efficient than G80 SPs, which makes it harder to compare performance based on specifications alone. Stencil culling performance has apparently been improved as well, which should help accelerate techniques like the Doom 3 engine's shadowing algorithm. NVIDIA didn't give us any detail on how stencil culling was improved, only indicating that this, among other things, was tweaked in the new hardware.

Top this off with the fact that G84 has also been designed for higher clock speeds than G80, and we can expect much more work to be done by each SP per second than on 8800 hardware. Exactly how much is not something we have an easy way of measuring, as the efficiency gains will vary with the algorithms running on the hardware.

With 256MB of memory on a 128-bit bus, we can expect a little more memory pressure than on the 8800 series. The two 64-bit channels provide 40% of the bus width of an 8800 GTS. This isn't as severe a cut as the number of SPs; remember that the texture address units have only been reduced from 24 on the 8800 GTS to 16 on the 8600 series. Certainly the reduction from 20 ROPs to 8 will help cut down on memory traffic, but the bandwidth demand of that extra texturing power won't be insignificant. While we don't have quantitative measurements, our impression is that memory bandwidth is more important in NVIDIA's more finely grained unified architecture than it was with the GeForce 7 series' pipelined architecture. Sticking with a 128-bit memory interface for the mainstream part might work this time around, but depending on what we see from game developers over the next six months, that could easily change.
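
For reference, the peak bandwidth numbers work out as follows (a simple sketch; the 8800 GTS figures of a 320-bit bus at a 1.6GHz effective data rate are its published specs):

```python
# Peak memory bandwidth: (bus width in bits / 8) bytes per transfer
# multiplied by the effective data rate in GT/s.
def peak_bandwidth_gbs(bus_width_bits, data_rate_gtps):
    return bus_width_bits / 8 * data_rate_gtps

print(f"8600 GTS: {peak_bandwidth_gbs(128, 2.0):.0f} GB/s")  # 32 GB/s
print(f"8800 GTS: {peak_bandwidth_gbs(320, 1.6):.0f} GB/s")  # 64 GB/s (published specs)
print(f"bus width ratio: {128 / 320:.0%}")                   # the 40% noted above
```

Note that the 8600 GTS's faster memory means it keeps half of the 8800 GTS's bandwidth despite having only 40% of its bus width.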

Let's round out our architectural discussion with a nice block diagram for the 8600 series:

[Block diagram: the GeForce 8600 series (G84)]

We can see very clearly that this is a cut-down G80. As we have discussed, many of these blocks have been tweaked and enhanced to provide more efficient processing, but the fundamental function of each block remains the same, and the inside of each SP is unchanged as well. The feature set also matches G80. For 8500 hardware, based on G86, we drop from two blocks each of shaders and ROPs down to one of each.
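
To make that scaling concrete, here's a small sketch of the block counts; the per-block figures are inferred from the spec table above, so treat them as our reading of the diagram rather than a confirmed breakdown:

```python
# Block scaling implied by the diagram: G84 carries two shader blocks and
# two ROP partitions, G86 one of each. Per-block figures are inferred from
# the spec table above (an assumption, not a confirmed breakdown).
PER_SHADER_BLOCK = {"stream_processors": 16, "tex_address": 8, "tex_filter": 8}
ROPS_PER_PARTITION = 4

def chip_config(shader_blocks, rop_partitions):
    cfg = {name: n * shader_blocks for name, n in PER_SHADER_BLOCK.items()}
    cfg["rops"] = ROPS_PER_PARTITION * rop_partitions
    return cfg

print("G84 (8600):", chip_config(2, 2))  # 32 SPs, 16/16 tex, 8 ROPs
print("G86 (8500):", chip_config(1, 1))  # 16 SPs,  8/8 tex, 4 ROPs
```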

Two full dual-link DVI ports on a $150 card is a very nice addition. With the move from analog to digital displays, seeing a reduction in maximum resolution on budget parts because of single-link bandwidth limitations, while not devastating, isn't desirable. There are tradeoffs in moving from analog to digital display hardware, and now one more of those issues has a resolution. Now we just need display makers to crank up pixel density and improve color space without reducing response time, and this old Sony GDM-F520 can finally rest in peace.

On the video output front, G84 makes a major improvement over every other graphics card on the market: G84-based hardware supporting HDCP will be capable of HDCP over dual-link connections. This is a major feature, as a handful of larger widescreen monitors, like Dell's 30" display, only support 1920x1080 over a dual-link connection. Unless both links are protected with HDCP, software players will refuse to play AACS protected HD content. NVIDIA has found a way around the problem by using one key ROM but sending the key over both links. The monitor is then able to handle HDCP on both links and display the video properly at the right resolution.
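
In rough pseudocode terms, the arrangement NVIDIA describes looks something like the sketch below. This is purely conceptual: the real HDCP handshake (KSV exchange, shared-secret derivation, periodic link verification) is far more involved, and the names here are ours, not NVIDIA's:

```python
# Conceptual sketch only: one HDCP key ROM on the card, with both TMDS links
# of a dual-link DVI connection authenticated from it. The actual HDCP
# handshake is not modeled here.
class HDCPKeyROM:
    """Stands in for the single key ROM shared by both links."""
    def __init__(self, keys: str):
        self.keys = keys

def authenticate_link(link: str, rom: HDCPKeyROM) -> bool:
    # Hypothetical placeholder for the per-link handshake with the display.
    print(f"{link}: HDCP session established using the shared key ROM")
    return True

rom = HDCPKeyROM(keys="...")  # one ROM, used for both links
both_protected = all(authenticate_link(l, rom) for l in ("TMDS link 0", "TMDS link 1"))

if both_protected:
    print("Both links protected: AACS players will allow full-resolution playback")
```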

As for manufacturing, G84 is still an 80nm part. While G80 is impressively huge at 681M transistors, G84 is "only" 289M transistors, putting it at nearly the same transistor count as G71 (7900 GTX). While the performance of the 8600 series doesn't quite compare to the 7900 GTX, the 80nm process makes smaller die sizes (and lower prices) possible.

In addition to all this, PureVideo has received a significant boost this time around.

Comments

  • shabby - Tuesday, April 17, 2007 - link

    3dmark vista edition of course!
  • munky - Tuesday, April 17, 2007 - link

    Why did you not include the x1950xt in the test lineup? It can also be had for about $200 now, like the 7950gt. You didn't want to make the 8600 series look worse than they already do, or what?
  • DerekWilson - Tuesday, April 17, 2007 - link

    Wow, sorry for the omission -- I was trying to include specific comparison points -- 3 from AMD and 3 from NVIDIA -- but this one just slipped through the cracks. It will be included in my performance update.
  • Elwe - Tuesday, April 17, 2007 - link

    Now, now guys. True that these cards are not going to be what many of you want (there are some good reasons to stay with what you have, considering the performance differential of several of the last generation cards). And it is clear that these cards will not touch the 8800 cards (from what I can tell, the only thing these do better is 100% PureVideo HD on the card, which I guess is because these might be paired with not-so-great CPUs).

    But for some of us, they might work. I recently bought a Dell 390 workstation. I packed it with fast drives, a QX6700 CPU, and 4GB of RAM. There were very few BTO graphics choices, and most centered around the Pro market (this is a "workstation" after all). This is a new machine, and quite powerful! I want to work and play on this box. Because of the relatively weak power supply (rated at 375 watts or something like that) and because I need both available non-graphics PCIe slots (if you put in an 8800 GTS, even if you changed the power supply, this type of dual-slot card would cover one of those slots), I have been waiting for something reasonably powerful to come along (again, I am not going to just work on this box; I would like to play UT2k7, too :). Since I run Linux, I was trying to stick with the Nvidia line (my experience is that they have better drivers for this platform, but perhaps ATI has stepped up in the last half year or so). I could have gone with the 79xx line (single slot), but I wanted to see what the new generation would bring. Depending on what you need/want, I think either a slightly-used 7950 GT OC or an 8600 GTS would work just fine for me. It does not seem unreasonable to me that in some things the older high-end card is faster than the newer mid-range card, and vice versa. But I did not see any benchmark where the 79xx line whooped the 8600 GTS thoroughly (like what happened in several benchmarks comparing the 8800 and 8600).

    I would say that the only immediate problem I might have with using the 8600 GTS is for gaming at high resolutions. I have a Dell 2407, and Anandtech's benchmarks make it clear I should not be gaming at that high a resolution. Bummer. The 7950 GT OC might very well be the better option here.

    In an ideal world, I really would like the power of an 8800 (and, fortunately, I can pay for it). But I really need the PCIe slot more, and changing out the power supply would add even more cost. I could have gotten another Dell model (like the XPS 710 or the Precision 490)--and I am thinking about just that. But I got the 390 for what I considered good reasons (a damned sight cheaper than the 490, and I have no need of another cpu socket when I can have 4 cores in one socket), and the XPS 710 did not have BTO storage options that I wanted (not sure why they could not design that thing to have more than two internal drives--the thing is big enough; maybe most games do not need it, as that is what the machine was designed for). I bet I am not the only one.
  • GhandiInstinct - Tuesday, April 17, 2007 - link

    Masses would include AGP cards...

    I see no AGP DX10 cards...
  • aka1nas - Tuesday, April 17, 2007 - link

    The "masses" don't build their own computers, and thus have long since stopped purchasing machines with AGP slots.
  • GhandiInstinct - Tuesday, April 17, 2007 - link

    The "masses" also don't go hunting for DX10 cards to add FPS to their hardcore Dell and Gateway gaming rigs.

    Be honest with yourself, the people going for these cards are custom riggers.

    AGP DX10 please. There are hundreds of thousands of people with 3.4GHz Pentium 4 Northwoods who know their processors will run BioShock well, but they need DX10 without paying for a new motherboard, DDR2, and everything else, including Vista!!!
  • JarredWalton - Tuesday, April 17, 2007 - link

    Actually, I don't think anyone "knows" whether any current system will run BioShock well. Let's wait for the game to appear at least. We're still at least four months away (assuming they hit the current release date).

    While I can understand people complaining about the lack of AGP cards, let's be honest: why should either company invest a lot of money in an old platform? It takes time to make the AGP cards and more time to make sure the drivers all work right. At some point, the old tech has to be left behind. The cost to transition from an AGP setup to a PCIe setup is often under $100, so if the AGP cards had a $50 price premium you'd only save yourself $50 and still be stuck with the older platform.

    I figure AMD/ATI and NVIDIA basically ignored the complaints with X1900/7900 class hardware (the best was several notches below what was available on PCIe), and at this point I think they're done. I'd even go so far as to say we're probably now at the point where an AGP platform would start to be a bottleneck with current hardware - maybe not midrange stuff, but certainly the high-end offerings.

    Let's put it another way: why can't I get something faster than a single-core 2.4GHz 1MB cache Athlon 64 3700+ for Socket 754? Why can't I get a Pentium D or Core 2 Duo for Socket 478? Why did we need new motherboards for Core 2 Duo when socket 775 was already in use with 915/925? Intel and AMD have forced transitions on users for years, and after a long run it's tough to say that AGP hasn't fulfilled its purpose. Such is the life of PC hardware.
  • GhandiInstinct - Tuesday, April 17, 2007 - link

    Good points, I agree with them all.

    Basically, I feel that my 3.2GHz Northwood with 2GB of RAM is worth salvaging for BioShock and Hellgate, obviously not Crysis, but conveniently that one won't be released until '08.

    I figure I can hold out 8 more months, save up during this time, and switch to quad and DDR3.

    I service hundreds of clients a week in tech support who have AGP setups, and I don't think Nvidia and ATi will abandon AGP with DX10, especially since there is speculation that they will be releasing these cards in the future: http://www.theinquirer.net/default.aspx?article=37...

    :)
  • LoneWolf15 - Tuesday, April 17, 2007 - link

    "While it would be nice to have this hardware in NVIDIA's higher end offerings, this technology arguably makes more sense in mainstream parts. High end, expensive graphics cards are usually paired with high end expensive CPUs and lots of RAM. The decode assistance that these higher end cards offer is more than enough to enable a high end CPU to handle the hardest hitting HD videos. With mainstream graphics hardware providing a huge amount of decode assistance, the lower end CPUs that people pair with this hardware will benefit greatly."

    IMO, this is absolute bollocks.

    If I'm paying for nVidia's high-end stuff, I expect high-end everything. And this is at least the second time nVidia has only improved video on their second-round or midrange parts (anybody remember NV40/45?).

    I game some, and I want good performance for that. But I also have a 1920x1200 display, and I want the best video playback experience I can get on it. I also want lower CPU usage so I can play back video while my system handles other processor-intensive tasks in the background.

    Once again, nVidia has really disappointed me in this area. In comparison, ATI seems to be much better at making sure their full range of cards supports their best video technologies. This (along with nVidia's driver development) continues to make the G80 seem like a "rushed-out-the-door" product.
