The Cards and The Test

Both of our cards, the 8600 GT and the 8600 GTS, feature two DVI ports and a 7-pin analog video (TV-out) port. The GTS requires a 6-pin PCIe power connector, while the GT can run on the power provided by the PCIe slot alone. Each card is a single-slot solution, and there isn't anything particularly surprising about the hardware. Here's a look at what we're working with:

In testing the 8600 cards, we used the 158.16 drivers. Because we tested under Windows XP, we had to use the 93 series driver for our GeForce 7 series parts, the 97 series driver for our 8800 parts, and the 158.16 beta driver for our new 8600 hardware. While Vista drivers are unified and the 8800 drivers were recently updated, GeForce 7 series owners running Windows XP (the vast majority of NVIDIA's customers) have been stuck with the same driver revision since early November of last year. We are certainly hoping that NVIDIA will release a new unified Windows XP driver soon; testing with three different drivers from one hardware manufacturer is less than optimal.

We haven't done any Windows Vista testing this time around, as we still care about maximum performance and about testing in the environment in which most people will actually be using their hardware. This is not to say that we are ignoring Vista: we will be looking into DX10 benchmarks in the very near future. Right now, there is just no compelling reason to move our testing to a new platform.

Here's our test setup:

System Test Configuration
CPU: Intel Core 2 Extreme X6800 (2.93GHz/4MB)
Motherboard: EVGA nForce 680i SLI
Chipset: NVIDIA nForce 680i SLI
Chipset Drivers: NVIDIA nForce 9.35
Hard Disk: Seagate 7200.7 160GB SATA
Memory: Corsair XMS2 DDR2-800 4-4-4-12 (1GB x 2)
Video Card: Various
Video Drivers: ATI Catalyst 7.3
NVIDIA ForceWare 93.71 (G70)
NVIDIA ForceWare 97.94 (G80)
NVIDIA ForceWare 158.16 (8600)
Desktop Resolution: 1280 x 800 - 32-bit @ 60Hz
OS: Windows XP Professional SP2

The latest 100 series drivers expose an issue with BF2 that enables 16xCSAA when 4xMSAA is selected in-game. To work around this, we used the driver control panel to select 4xAA under the "enhance the application setting" option.

All of our games were tested using the highest selectable in-game quality options, with the exception of Rainbow Six: Vegas. Our 8600 hardware had a hard time keeping up with hardware skinning enabled, even at 1024x768. In light of this, we tested with hardware skinning off and medium blur. We will be doing a follow-up performance article including more games; we are looking at newer titles like Supreme Commander, S.T.A.L.K.E.R., and Command & Conquer 3. We will also follow up with video decode performance.

For the comparisons that follow, the 8600 GTS is priced similarly to AMD's X1950 Pro, while the 8600 GT competes with the X1950 GT.

Comments

  • kilkennycat - Tuesday, April 17, 2007

    (As of 8AM Pacific Time, April 17)

    See:-

    http://www.zipzoomfly.com/jsp/ProductDetail.jsp?Pr...

    http://www.zipzoomfly.com/jsp/ProductDetail.jsp?Pr...
  • Chadder007 - Tuesday, April 17, 2007

    That's really not too bad for a DX10 part. I just wish we actually had some DX10 games to see how it performs though....
  • bob4432 - Tuesday, April 17, 2007

    that performance is horrible. everyone here is pretty dead on - this is strictly for marketing to the non-educated gamer. too bad they will be disappointed and probably return such a piece of sh!t item. what a joke.

    come on ati, this kind of performance should be in the low-end cards, this is not a mid-range card. maybe if nvidia sold them for $100-$140 they may end up in somebody's htpc but that is about all they are good for.

    glad i have a 360 to ride out this phase of cards while my x1800xt still works fine for my duties.

    if i were the upper management at nvidia, people would be fired over this horrible performance, but sadly the upper management is more than likely the cause of this joke of a release.
  • AdamK47 - Tuesday, April 17, 2007

    nVidia needs to have people with actual product knowledge dictate what the specifications of future products will be. This disappointing lineup has marketing written all over it. They need to wise up or they will end up like Intel and their failed, marketing-derived NetBurst architecture.
  • wingless - Tuesday, April 17, 2007

    In the article they talk about the Pure Video features as if they are brand new. Does this mean they ARE NOT implemented in the 8800 series? The article talked about how 100% of the video decoding process is on the GPU but it did not mention the 8800 core which worries the heck outta me. Also does the G84 have CUDA capabilities?
  • DerekWilson - Tuesday, April 17, 2007

    CUDA is supported. (A quick way to verify this from code is sketched after the comments below.)
  • DerekWilson - Tuesday, April 17, 2007

    The 8800 series supports PureVideo HD the same way the GeForce 7 series does -- through VP1 hardware.

    The 8600 and below support PureVideo HD through VP2 hardware, the BSP, and other enhancements which allow 100% offload of decode.

    While the 8800 is able to offload much of the process, it's not 100% like the 8600/8500. Both support PureVideo HD, but G84 does it with lower CPU usage.
  • wingless - Tuesday, April 17, 2007

    I just checked NVIDIA's website and it appears only the 8600 and 8500 series support Pure Video HD which sucks balls. I want 8800GTS performance with Pure Video HD support. Guess I'll have to wait a few more months, or go ATI but ATI's future isn't stable these days.
  • defter - Tuesday, April 17, 2007

    Why do you want 8800GTS performance with improved PureVideo HD support? Are you going to pair an 8800GTS with a $40 Celeron? The 8800GTS has more than enough power to decode H.264 at HD resolutions as long as you pair it with a modern CPU: http://www.anandtech.com/printarticle.aspx?i=2886

    This improved PureVideo HD is aimed at low-end systems that are using a low-end CPU. That's why this feature is important for low/mid-range GPUs.

  • wingless - Tuesday, April 17, 2007

    If I'm going to spend this kind of money for an 8800 series card then I want VP2 100% hardware decoding. Is that too much to ask? I want all the extra bells and whistles. Damn, I may have to go ATI for the first time since 1987 when I had that EGA Wonder.
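
Following up on the CUDA question above: here is a minimal sketch of how one could confirm CUDA support and query compute capability from code, assuming a CUDA toolkit is installed and the program is built with nvcc. This is illustrative rather than NVIDIA sample code; a G84-based 8600 should report compute capability 1.1, and the exact multiprocessor count depends on the card and driver.

// Minimal sketch: enumerate CUDA devices and print their compute capability.
// On a G84-based 8600, the runtime should report compute capability 1.1.
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int count = 0;
    if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0) {
        std::printf("No CUDA-capable device found.\n");
        return 1;
    }
    for (int i = 0; i < count; ++i) {
        cudaDeviceProp prop;
        if (cudaGetDeviceProperties(&prop, i) != cudaSuccess) {
            continue;  // skip devices we cannot query
        }
        std::printf("Device %d: %s, compute capability %d.%d, %d multiprocessors\n",
                    i, prop.name, prop.major, prop.minor, prop.multiProcessorCount);
    }
    return 0;
}

Run the resulting binary on the system with the card installed; the runtime enumerates every CUDA-capable GPU it can see and prints one line per device.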
