Intel did a very good job of drumming up support for PCI Express over the past two years.  Look around and you'll note that all of the motherboard manufacturers have quite a few PCI Express based motherboard designs.  Then look at the latest GPU launches from ATI and NVIDIA: all of the exciting products appear to be launched first (or primarily) as PCI Express designs.  While everyone industry-wide has done a great job of supporting PCI Express, there's one little problem - no one seems to be interested in buying PCI Express solutions just yet.

The OEM markets have no problem shipping PCI Express motherboards and graphics cards in their systems; after all, they want to sell the idea of buying an entirely new PC in order to get access to brand new technologies like PCI Express.  However, in the channel and upgrade markets, PCI Express solutions aren't selling well at all.  Most enthusiast users appear to be sticking with their AGP platforms, and while they would consider a GPU upgrade, they are not willing to upgrade their motherboard (and sometimes CPU and memory) just to get a faster graphics card.

There was a huge debate early on about whose PCI Express design would prove to be the best for performance.  ATI chose to produce separate PCI Express and AGP enabled GPUs, offering a native solution for both interfaces; meanwhile, NVIDIA chose to keep manufacturing their AGP GPUs and use a bridge chip to interface with PCI Express.  While ATI argued that NVIDIA's solution offered less performance, NVIDIA said that ATI's approach was far too costly.  The problem with ATI's approach was that production was inherently split between AGP and PCI Express chips, and predicting market demand for an appropriate ratio between the two is quite difficult.  Overproduce PCI Express chips and there will be a shortage of AGP cards, and vice versa.  ATI's initial decision to produce only native PCI Express or AGP designs is part of the reason why their latest mainstream GPUs (e.g. the X700) are still only available as PCI Express cards.

Even though NVIDIA has moved to manufacturing native PCI Express GPUs (e.g. the GeForce 6600GT), they already have a working chip to bridge back down to an AGP interface, which is what makes today's launch possible.  Thanks to that PCI Express-to-AGP bridge chip, NVIDIA is able to not only launch but also begin selling an AGP version of the GeForce 6600GT today.  NVIDIA tells us that cards should be available for sale today, continuing a very recent trend of announcing availability alongside a product launch, which we greatly applaud.


NVIDIA's PCI Express-to-AGP bridge

ATI is working on a PCI Express-to-AGP bridge of their own, but it will not be ready until later this year - meaning that ATI will not have an AGP version of their PCI Express X700 until early next year.

The GeForce 6600GT AGP runs at the same core clock speed as the PCI Express version (500MHz) but has a slightly lower memory clock (900MHz vs. 1GHz on the PCI Express version).  By lowering the memory clock, NVIDIA helps to offset the additional cost of the PCI Express-to-AGP bridge.  The performance impact of the reduced memory clock, combined with the on-board bridge, is between 0% and 5%.  For example, in Doom 3 at 1024 x 768 (High Quality), the PCI Express version of the GeForce 6600GT is 3.5% faster than the AGP version.  There is a performance difference, but it does not appear to be huge.  The AGP version of the 6600GT obviously lacks SLI support, given that you can only have a single AGP slot on a motherboard.
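
To put the memory clock reduction in perspective, here is a rough back-of-the-envelope calculation (assuming the 6600GT's 128-bit memory interface and treating the quoted clocks as effective data rates):

128 bits / 8 x 1.0GHz = 16.0GB/s peak memory bandwidth (PCI Express version)
128 bits / 8 x 0.9GHz = 14.4GB/s peak memory bandwidth (AGP version)

In other words, a 10% cut in peak memory bandwidth translates into a 0 - 5% real-world difference, which makes sense given that the card is not memory bandwidth limited in every scenario.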

The latest AGP specification calls for a maximum of around 45W of power to be delivered via the AGP slot itself, while a PCI Express x16 slot can supply up to 75W.  Because of the reduction in power that can be delivered via the slot interface, the GeForce 6600GT AGP requires the use of a 4-pin molex connector on the board itself to deliver extra power to the GPU.  You may remember that the PCI Express version of the 6600GT does not require a separate power connector. 


This 4-pin molex connector is only present on the AGP version of the 6600GT

As of now, NVIDIA is only bringing the 6600GT to AGP; the regular non-GT 6600 will remain PCI Express only.  The 6600GT AGP will retail for between $200 and $250.  If you are interested in learning more about the architecture of the 6600GT, feel free to read our review of the PCI Express version for greater detail.


66 Comments


  • Visual - Wednesday, November 17, 2004

    I think it's important to compare the 6600GT with the normal 6800 and even the OEM 6800LEs that can be found around... also, any non-GT 6600s? Because all these cards are around the best performance/value ratio, you should try to find/show which one is the best buy. You should also consider the 6800's chances of modding...
  • nserra - Wednesday, November 17, 2004

    #54
    But if the NVIDIA AGP numbers are equal to the PCIe ones, wouldn't ATI's be too? Is it because the card doesn't exist?

    Why do you keep talking so much about the NVIDIA VP, when ATI has had this feature since 2002? Is it because it doesn't work?

  • DerekWilson - Wednesday, November 17, 2004

    #53

    There will be no X700 AGP before year's end. If we had one to test, we would love to have included numbers for it.

    -- Derek Wilson
  • nserra - Wednesday, November 17, 2004

    #7 Decoder
    They can be solved; the funny thing is that ATI has had decoding features since 07/18/02 and didn't need lots of transistors or processors to do that, and no one says that! It's like it's an NV-only feature. http://www.ati.com/vortal/videoinnovations/flash/i...

    #28 draazeejs
    There is no X700 AGP, but even so the 9800 does a great job for a 2-year-old card.
    But there should be an X700 PCIe in the test to see how fast it is compared to the 6600.

    #29 Regs
    You are absolutely right; someone should pay for having recommended NVIDIA cards over ATI in some “past” reviews.

    #31 Cybercat
    9700 cards aren’t that bad...

    #32 vailr
    See my #7 reply.

    #34 Anand Lal Shimpi
    Yeah, ATI has had encoding and decoding since 07/18/02 and I don't remember anyone talking about it. The new ATI driver has a WMV acceleration option too, so why hasn't it been tested yet!!! Just because of NVIDIA's fancy “Video processor” name? Yeah right, they are playing catch-up with ATI; they were 2 years late!

    #38 ChronoReverse
    You are relying so much on SM3.0; it is a new shader model, which allows new shader code. Are you sure it will be that BIG? Remember that Microsoft called the PS1.1->1.4 step DirectX 8.1, yet calls the SM2.0->SM3.0 step DirectX 9.0c; if it were that important, don't you think the right name would be DirectX 9.1 or 9.5?

    #39 Read AnandTech's own conclusion from 04/11/04:
    "If all of the cards in this review actually stick to their MSRPs, then the clear suggestion would be the $149 ATI Radeon X700. In every single game outside of Doom 3, the X700 does extremely well, putting even the GeForce 6600 to shame." I don't think leaving the X700 out of this test was a good idea! Since the 6600 AGP and PCIe are at the same level, shouldn't the X700 AGP be too?

    #40 Pete
    See the post above! And it's also an AnandTech conclusion!
  • R3MF - Wednesday, November 17, 2004

    Good article, cheers.

    I'll look forward to the vanilla 6800 benchies, as I'm building two PCs in short order and I need the info. :)
  • Calin - Wednesday, November 17, 2004

    Edit to #49: The Radeon 9800 Pro should be considered tied with the 6600GT in Wolfenstein - Enemy Territory, as the speed difference is at best very hard to see (the biggest performance difference is in the 6% range, just like the difference between the 5900XT and the 6600GT).

    Calin
  • danyel - Wednesday, November 17, 2004

    Does anyone know if there is a plan for an AGP version of the plain GeForce 6600? I don't see it mentioned anywhere.
  • Calin - Wednesday, November 17, 2004

    Wolfenstein - Enemy Territory is a tie, with the speed difference at most 6%. This is hardly a difference (however, the minimum frame rate has a bigger influence on gameplay than the maximum frame rate).

    Calin
  • Samus - Wednesday, November 17, 2004

    I'm gonna buy one ASAP, it's exactly what I've been waiting for... my Radeon 9600 Pro chugs in Doom 3, and I can't even run it at my LCD's native res of 1280x1024...

    I've been holding off playing Doom 3 until a card like this came out.
  • jay75 - Wednesday, November 17, 2004

    The ATI 9800 Pro beats the 6600GT when anti-aliasing (AA) is enabled in Far Cry. It doesn't in Counter-Strike, but the frame rate is above 75 at 1280x1024 (4x AA) anyway. This is something to be noted.
