Intel did a very good job of drumming up support for PCI Express over the past two years.  Look around and note that all of the motherboard manufacturers have quite a few PCI Express-based motherboard designs.  Then look at the latest GPU launches from ATI and NVIDIA: all of the exciting products appear to be launched first (or primarily) as PCI Express designs.  While everyone industry-wide has done a great job of supporting PCI Express, there's one little problem - no one seems to be interested in buying PCI Express solutions just yet. 

The OEM markets have no problem shipping PCI Express motherboards and graphics cards in their systems; after all, they want to sell the idea of buying an entirely new PC in order to get access to brand new technologies like PCI Express.  However, in the channel and upgrade markets, PCI Express solutions aren't selling well at all.  Most enthusiast users appear to be sticking with their AGP platforms; while they would consider a GPU upgrade, they are not willing to upgrade their motherboard (and sometimes CPU and memory) just to get a faster graphics card. 

There was a huge debate early on about whose PCI Express design would prove to be the best for performance.  ATI chose to produce separate PCI Express and AGP enabled GPUs, offering a native solution for both interfaces; meanwhile, NVIDIA chose to keep manufacturing their AGP GPUs and use a bridge chip to interface with PCI Express.  While ATI argued that NVIDIA's solution offered less performance, NVIDIA said that ATI's approach was far too costly.  The problem with ATI's approach was that their production was inherently split between AGP and PCI Express chips, and predicting market demand for an appropriate ratio between the chips is quite difficult.  If you overproduce PCI Express chips, then there will be a shortage of AGP cards, and vice versa.  ATI's initial approach of producing only native PCI Express or AGP designs is part of the reason why their latest mainstream GPUs (e.g. the X700) are still only available as PCI Express designs.

Although NVIDIA has since moved to manufacturing native PCI Express GPUs (e.g. the GeForce 6600GT), they also have a working chip to bridge back down to an AGP interface, which is what makes today's launch possible.  Thanks to NVIDIA's PCI Express-to-AGP bridge chip, NVIDIA is able to not only launch but also begin selling an AGP version of the GeForce 6600GT today.  NVIDIA tells us that cards should be available for sale today, continuing a very recent trend of announcing availability alongside a product launch, which we greatly applaud. 

NVIDIA's PCI Express to AGP bridge

ATI is working on a PCI Express-to-AGP bridge of their own, but it will not be ready until later this year - meaning that ATI will not have an AGP version of the X700 until early next year.

The GeForce 6600GT AGP runs at the same core clock speed as the PCI Express version (500MHz) but has a slightly lower memory clock (900MHz vs. 1GHz on the PCI Express version).  By lowering the memory clock, NVIDIA helps to offset the additional cost of the PCI Express-to-AGP bridge.  The combined performance impact of the reduced memory clock and the on-board bridge is between 0 and 5%.  For example, in Doom 3 at 1024 x 768 (High Quality), the PCI Express version of the GeForce 6600GT is 3.5% faster than the AGP version.  There is a performance difference, but it does not appear to be huge. The AGP version of the 6600GT obviously lacks SLI support, given that a motherboard can only have a single AGP slot.
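For perspective, the memory clock reduction works out to a 10% deficit in peak theoretical bandwidth, yet the observed performance gap is only 0 - 5% - a hint that the 6600GT is not purely bandwidth-bound at these settings. A quick back-of-the-envelope sketch (assuming the 6600GT's 128-bit memory bus, as on the PCI Express version we reviewed):

```python
# Rough peak memory bandwidth comparison for the 6600GT AGP vs. PCI Express.
# Assumes a 128-bit memory bus; clocks are the effective (DDR) rates.
BUS_WIDTH_BITS = 128

def bandwidth_gb_s(effective_clock_mhz: float) -> float:
    """Peak theoretical bandwidth in GB/s: bytes per transfer x transfers per second."""
    return (BUS_WIDTH_BITS / 8) * effective_clock_mhz * 1e6 / 1e9

pcie = bandwidth_gb_s(1000)  # PCI Express version: 1GHz effective
agp = bandwidth_gb_s(900)    # AGP version: 900MHz effective

print(f"PCIe: {pcie:.1f} GB/s, AGP: {agp:.1f} GB/s")
print(f"Bandwidth deficit: {(1 - agp / pcie) * 100:.0f}%")
```

That 10% paper deficit translating into at most a 5% real-world loss suggests the core, not memory bandwidth, is often the limiter.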

The latest AGP specification calls for a maximum of around 45W of power to be delivered via the AGP slot itself, while a PCI Express x16 slot can supply up to 75W.  Because of the reduction in power that can be delivered via the slot interface, the GeForce 6600GT AGP requires the use of a 4-pin molex connector on the board itself to deliver extra power to the GPU.  You may remember that the PCI Express version of the 6600GT does not require a separate power connector. 

This 4-pin molex connector is only present on the AGP version of the 6600GT

As of now, NVIDIA is only releasing the 6600GT in an AGP flavor; the regular non-GT 6600 will remain PCI Express only. The 6600GT AGP will retail for between $200 and $250. If you are interested in learning more about the architecture of the 6600GT, feel free to read our review of the PCI Express version for greater detail.

The Cards


Comments

  • Pete - Tuesday, November 16, 2004 - link

    Great article, Anand. Are you sure about your 9700P numbers for Far Cry, though? They seem awfully low, especially in relation to a 5900XT.
  • SlinkyDink - Tuesday, November 16, 2004 - link

    /*The AGP version of the 6600GT obviously lacks SLI support given that you can only have a single AGP slot on a motherboard.*/

    Actually I believe that the AGP 3.0 spec allows up to two AGP slots (and both could be used at once), but nobody ever decided to implement it :P

  • Anand Lal Shimpi - Tuesday, November 16, 2004 - link

    I am not treating NVIDIA's Video Processor as a feature of any NV4x GPU until NVIDIA provides a working driver and commits to a public release date. The 6600GT AGP supposedly has the same video processor that the PCI Express version has (since they are the same GPU), but to this date NVIDIA has failed to deliver a working driver set to take advantage of it.

    Take care,
  • slurmsmackenzie - Tuesday, November 16, 2004 - link

    Remember, the point is that ATI didn't have a bridge in the works at the release of the X700, so now that it has become apparent that AGP is still the front-running solution, they're behind in its AGP-equivalent releases. So, as far as the AGP interface is concerned, the closest ATI comparison is the 9800.
  • vailr - Tuesday, November 16, 2004 - link

    Any comments on comparing the hardware video decoding of the 6600 vs. the (reportedly faulty) 6800, and the overall video quality in comparison with ATI's offerings?
    For those people interested in the best cost-to-performance video solution for Home Theater PC use.
  • Cybercat - Tuesday, November 16, 2004 - link

    They couldn't have been using the NF4 reference motherboard, these are all AGP cards. Also, why is it that the 9800 Pro does 63% better than the 9700 Pro in FarCry? At most that card is around 30% better. Did you guys really rerun the tests with the 9700 Pro using the latest drivers, or did you merely recycle some of the numbers?
  • marcnakm - Tuesday, November 16, 2004 - link

    The card I was waiting for.
    Good review, just missing the comparison with the regular 6800 which is very important.
  • Regs - Tuesday, November 16, 2004 - link

    This review shows a lot of things. One of them was how the FX series was a horrible failure.
  • draazeejs - Tuesday, November 16, 2004 - link

    Did nVidia pay for this article? Is it really fair to put this card up against a 2-year-old card like the R9800 Pro? As far as I understood, the X700 should be the real competitor for the 6600GT, because the X700 is supposed to be in the same price category, no? There have been numerous reviews of the X700 on the net, why not include it here???
  • Anand Lal Shimpi - Tuesday, November 16, 2004 - link

    The impact of the bridge, as I mentioned in the review, is negligible. The bridge + slower memory results in a 0 - 5% performance difference between the PCI Express and AGP versions of the 6600GT (the 5% figure being because of the additional memory bandwidth courtesy of the 500/1000 clock vs. 500/900).

    Just so you guys know, I went out and picked up a vanilla 6800 for inclusion in my upcoming Half Life 2 GPU comparison. Know that your voice has been heard :)

    Take care,
