Special thanks to Newegg for supplying hardware for this comparison test.


Although adoption among end users has been slow, PCI Express platforms are getting out there, and the two graphics giants are wasting no time in shifting the competition for king of the hill over to the PCI Express realm.

ATI and NVIDIA have both traded shots in the mid-range with the release of the Radeon X700 and GeForce 6600. Today, the battle continues in the entry-level space with NVIDIA's latest launch - the GeForce 6200.



The GeForce 6 series is now composed of three GPUs: the high-end 6800, the mid-range 6600 and now the entry-level 6200. True to NVIDIA's promise of one common feature set, all three of these GPUs boast full DirectX 9 compliance and can thus run the same games, just at different speeds.

What has NVIDIA done to make the 6200 slower than the 6600 and 6800?

For starters, the 6200 features half the pixel pipelines of the 6600 and one quarter those of the 6800. Next, the 6200 will be available in two versions: one with a 128-bit memory bus like the 6600 and one with a 64-bit memory bus, which effectively cuts memory bandwidth in half. Finally, NVIDIA dropped the 6200's core clock to 300MHz as a last guarantee that it would not cannibalize sales of the company's more expensive cards.
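
To put those cuts in perspective, here is a rough back-of-the-envelope sketch (our own illustration, not an NVIDIA figure) of the 6200's theoretical peak pixel fill rate and memory bandwidth, using the 300MHz core clock and 275MHz (550MHz effective) DDR memory spec discussed in this article. Real-world throughput will of course be lower.

```python
# Rough peak numbers for the 6200 configurations described above.
# Assumes one pixel per pipeline per clock and the spec'd memory data rate;
# these are theoretical ceilings, not measured performance.

def pixel_fill_rate_mpix(pipes: int, core_mhz: int) -> int:
    """Peak pixel fill rate in megapixels per second."""
    return pipes * core_mhz

def memory_bandwidth_gbs(bus_bits: int, effective_mhz: int) -> float:
    """Peak memory bandwidth in GB/s (bus width in bytes x effective data rate)."""
    return (bus_bits / 8) * effective_mhz * 1e6 / 1e9

print(f"6200 fill rate:         {pixel_fill_rate_mpix(4, 300)} Mpixels/s")    # 1200
print(f"6200 128-bit bandwidth: {memory_bandwidth_gbs(128, 550):.1f} GB/s")   # 8.8
print(f"6200 64-bit bandwidth:  {memory_bandwidth_gbs(64, 550):.1f} GB/s")    # 4.4
```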

The 6200 is an NV43 derivative, meaning that it is built on the same 0.11-micron (110nm) process as the 6600. In fact, the two chips are virtually identical, with the 6200 having only 4 active pixel pipelines on its die. There is one other architectural difference between the 6200 and the rest of the GeForce 6 family: the lack of any color or Z compression support in the memory controller. Color and Z compression are excellent ways of reducing the memory bandwidth overhead of features such as anti-aliasing, so without them, we can expect the 6200 to take a bigger hit when turning on AA and anisotropic filtering. The mitigating factor is that the 6200 doesn't have the fill rate or the memory bandwidth to run most games at higher resolutions in the first place, so those who buy the 6200 won't be playing at resolutions where the lack of color and Z compression would really matter with AA enabled. We'll investigate this a bit more in our performance tests.
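
As a rough illustration of why the missing compression matters once AA is turned on, the simplified model below (our own sketch, not a description of NVIDIA's actual memory controller) estimates uncompressed color and Z framebuffer traffic per frame. With 4x multisampling, each pixel stores four color and four depth samples, so uncompressed traffic scales roughly with the sample count, which is exactly the overhead that color and Z compression on the other GeForce 6 parts helps hide.

```python
# Simplified framebuffer-traffic model: one 32-bit color sample and one 32-bit
# depth sample per AA sample, written once per pixel per frame, no compression.
# This only shows the scaling, not actual memory controller behavior.

def framebuffer_traffic_mb(width: int, height: int, aa_samples: int,
                           bytes_color: int = 4, bytes_z: int = 4) -> float:
    """Uncompressed color + Z bytes written per frame, in MB."""
    return width * height * aa_samples * (bytes_color + bytes_z) / 1e6

for aa in (1, 2, 4):
    print(f"1024x768 at {aa}x AA: {framebuffer_traffic_mb(1024, 768, aa):.1f} MB per frame")
# Without color/Z compression, 4x AA roughly quadruples this traffic;
# the 6600 and 6800 can compress much of it away, the 6200 cannot.
```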



Here's a quick table summarizing what the 6200 is and how it compares to the rest of the GeForce 6 family:

| GPU | Manufacturing Process | Vertex Engines | Pixel Pipelines | Memory Bus Width |
|---|---|---|---|---|
| GeForce 6200 | 0.11-micron | 3 | 4 | 64/128-bit |
| GeForce 6600 | 0.11-micron | 3 | 8 | 128-bit |
| GeForce 6800 | 0.13-micron | 6 | 16 | 256-bit |

The first thing to notice here is that the 6200 supports either a 64-bit or a 128-bit memory bus, and as far as NVIDIA is concerned, there will be no distinction between cards equipped with one configuration or the other. While NVIDIA insists that it cannot force its board partners to label the two configurations differently, we're more inclined to believe that NVIDIA simply wants all 6200 based cards to be known as a GeForce 6200, regardless of whether or not they have half the memory bandwidth. NVIDIA does "suggest" to its card partners that they add a 64-bit or 128-bit designation somewhere on their boxes, model numbers or websites, but the suggestion goes no further than that.

The next source of variability is clock speed. NVIDIA has "put a stake in the ground" at 300MHz as the desired clock speed for 6200 GPUs regardless of configuration, and add-in board vendors would seem to have no reason to clock their 6200s any differently, since they are all paying for a 300MHz part. The real variability comes with memory speeds. The 6200 only supports DDR1 memory and is spec'd to run it at 275MHz (effectively 550MHz). However, as we've seen in the past, this is only a suggestion - it is up to the manufacturers whether or not they use cheaper (and slower) memory.

NVIDIA is also releasing the 6200 only as a PCI Express product - there will be no AGP variant at this point in time. The reason is that the 6200 is a much-improved architecture compared to NVIDIA's current entry-level card (the FX 5200), yet the 5200 is still selling quite well, since it is not really purchased as a hardcore gaming card. To avoid cannibalizing AGP FX 5200 sales, the 6200 is kept out of that competition by being a strictly PCI Express product. While there is a PCI Express version of the FX 5200, its hold on the market is not nearly as strong as that of the AGP version, so losing some sales to the 6200 isn't as big of a deal.

While we're on the topic of AGP versions of recently released cards, NVIDIA has given us an update on the status of the AGP version of the highly anticipated GeForce 6600GT. We should have samples by the end of this month, and NVIDIA is looking to have them available for purchase before the end of November. There are currently no plans for retail availability of the PCI Express GeForce 6800 Ultras - those are mostly going to tier 1 OEMs.

The 6200 will be shipping in November and what's interesting is that some of the very first 6200 cards to hit the street will most likely be bundled with PCI Express motherboards. It seems like ATI and NVIDIA are doing a better job of selling 925X motherboards than Intel these days.

The expected street price of the GeForce 6200 is between $129 and $149 for the 128-bit, 128MB version. That range sits just under the vanilla ATI X700 and the regular (non-GT) GeForce 6600, both of which are included in our performance comparison - so for the 6200 to be truly competitive, its street price will have to land closer to the $99 mark.

The direct competition to the 6200 from ATI comes from the PCI Express X300 and X300SE (128-bit and 64-bit versions, respectively). ATI is at a bit of a disadvantage here because the X300 and X300SE are still based on the old Radeon 9600 architecture rather than being derivatives of the X800 and X700. ATI is undoubtedly working on a 4-pipe version of the X800, but for this review, the architectural advantage definitely belongs to NVIDIA.

Comments

  • PrinceGaz - Tuesday, October 12, 2004 - link

    I'm assuming the 6200 you tested was a 128-bit version? You don't seem to mention it at all in the review, but I doubt nVidia would send you a 64-bit model unless they wanted it to do badly in the benchmarks :)

    I don't think the X700 has appeared on an AT review before, only the X700 XT. Did you underclock your XT, or have you got hold of a standard X700? I trust those X700 results aren't from the X700 XT at full speed! :)

    As #11 and #12 mentioned, with the exception of Doom 3, the X600 Pro is faster than the 6200:

    Doom 3 - 39.3 60.1 (-35%)
    HL2 Stress Test - 91 76 (+20%)
    SW Battlefront - 45 33 (+36%)
    Sims 2 - 33.9 32.2 (+5%)
    UT2004 (1024x768) - 46.3 37 (+25%) [they were CPU limited at lower resolutions]
    BF Vietnam - 81 77 (+5%)
    Halo - 45.2 44 (+3%)
    Far Cry - 74.7 60.6 (+23%)

    So the X600 Pro is slower than the 6200 (128-bit) in Doom 3 by a significant amount, but it's marginally faster in three games, and it's significantly faster than the 6200 in the other three games and the HL2 Stress Test. That makes the X600 Pro the better card.

    The X700 absolutely thrashed even the 6600, let alone the 6200, in every game except of course Doom 3 where the 6600 was faster, and Halo where the X700 was a bit faster than the 6600 but not by such a large amount.

    Given the prices of the ATI cards (X300SE at $75, X300 at $100, X600 Pro at $130, X700 at an MSRP of $149), the 6600 is going to have to be priced under its $149 MSRP because of the far superior X700 at the same price point. Let's say a maximum of $130 for the 6600.

    If that's the case, I can't see how the 6200 could have a street price of $149 (128-bit) and $129 (64-bit). How can the 6200 (128-bit) even have the same price as the faster 6600 anyway? It's also outperformed by the $130 X600 Pro, which makes a $149 price ridiculous. I think the 6200 will have to be priced more like the X300 and X300SE ($100 and $75 for the 128-bit and 64-bit versions respectively) if they are to be successful.

    Maybe most 6200s will end up being cheap 64-bit cards sold to people who aren't really bothered about gaming, or who mistakenly believe the amount of memory is the most important factor. You just have to look at how many 64-bit FX5200s are sold.
  • Shinei - Tuesday, October 12, 2004 - link

    The P.T. Barnum theory, wilburpan. There's a sucker born every minute, and if they're willing to drop $60 on a 64-bit version of a card when they could have had a 128-bit version, so much the better for profits. The FX5200 continues to be one of the best-selling AGP cards on the market, despite the fact that it's worse than a Ti4200 at playing games, let alone DX9 games.
  • wilburpan - Tuesday, October 12, 2004 - link

    "The first thing to notice here is that the 6200 supports either a 64-bit or 128-bit memory bus, and as far as NVIDIA is concerned, they are not going to be distinguishing cards equipped with either a 64-bit or 128-bit memory configuration."

    This really bothers me a lot. If I knew there were two versions of this card, I definitely would want to know which version I was buying.

    What would be the rationale for such a policy?
  • nserra - Tuesday, October 12, 2004 - link

    Why do you all keep talking about buying the GeForce 6600 cards when the X700 was the clear winner?
    Do you all want to buy the worse-performing card? I don't understand.

    Why doesn't AnandTech use 3DMark05?

    No doubt my 9700 was a magnificent buy almost two years ago. What a rip-off the GeForce FX line of cards turned out to be...
    Why didn't they include one (a 5600/5700) just to see...

    Even 4-pipeline ATI cards can keep up with 8-pipe NVIDIA parts, gee what a mess... old tech, yeah right.
  • coldpower27 - Tuesday, October 12, 2004 - link

    I am very happy you included Sims 2 in your benchmark suite :)

    I think this game likes the number of vertex processors on the X700, plus its advantage in fill rate and memory bandwidth. Could you please test The Sims 2 on the high-end cards from both vendors when you can? :P
  • jediknight - Tuesday, October 12, 2004 - link

    What I'm wondering is: how do previous-generation top-of-the-line cards stack up against current-gen mainstream cards?
  • AnonymouseUser - Tuesday, October 12, 2004 - link

    Saist, you are an idiot.

    "OpenGl was never really big on ATi's list of supported API's... However, adding in Doom3, and the requirement of OGL on non-Windows-based systems, and OGL is at least as important to ATi now as DirectX."

    Quake 3, RtCW, HL, CS, CoD, SW:KotOR, Serious Sam (1&2), Painkiller, etc, etc, etc, etc, are OpenGL games. Why would they ONLY NOW want to optimize for OpenGL?
  • Avalon - Monday, October 11, 2004 - link

    Nice review on the budget sector. It's good to see a review from you again, Anand :)
  • Bonesdad - Monday, October 11, 2004 - link

    Affordable gaming??? Not until the 6600GT AGP cards come out... affordable does not mean replacing your mobo, CPU, and video card...
