Final Words

As we have seen from our tests, the X700 XT packs a lot of punch into a small package. Most of the time it's not quite enough to keep up with the NVIDIA 6600 GT, but the X700 XT proves its worth in the Source Engine Video Stress Test, Far Cry, and Unreal Tournament 2004. Most of the X700 XT's power shines through when anisotropic filtering and antialiasing are disabled (the exception being UT2K4). Current and previous generation OpenGL titles do show the Radeon X700 XT lagging behind the NVIDIA GeForce 6600 GT. NVIDIA has traditionally been stronger in OpenGL performance than ATI, so this is not really a surprise.

What is interesting is that we usually see ATI cards push past NVIDIA cards when anisotropic filtering and antialiasing are enabled, but we are seeing the reverse this time around. This could be because NVIDIA finally has a part with the same number of pixel pipelines at a higher core clock speed than a comparably priced ATI product. The NVIDIA part also seems to maintain performance a little better when its limits are pushed (i.e., at 1600x1200). This could indicate that NVIDIA is making more effective use of memory bandwidth, as the 6600 GT is actually running at a very slightly slower memory data rate.

It is possible that we are only seeing the X700 XT pull further ahead at lower resolutions and when AA and AF are disabled because geometry processing is a larger percentage of overall performance in those cases, but we don't have a very sound way of testing this at the moment. We have also not explored the impact of coupling these GPUs with a lower performing CPU. The added geometry processing power of the X700 XT may or may not help alleviate the strain on slower systems, though it doesn't seem likely that this would be a significant advantage.

In addition to its other advantages, the NVIDIA GeForce 6600 GT is capable of being used in an SLI configuration. Whether or not this will prove to be worth the investment is still up in the air (we still don't have that bridge connector from NVIDIA), but it certainly offers a potential that ATI can't match right now.

In the end, the GeForce 6600 GT is a more versatile solution than the Radeon X700 XT, and one that can deliver higher performance at more demanding settings. The X700 XT is certainly not a bad card, but street prices remain to be seen. At publication, we found a 6600 GT available for $209 on Pricewatch, though street prices for the X700 are not yet available. Unless the X700 XT is priced comfortably below its $199 MSRP, or you need the 256MB of the X700 Pro, the 6600 GT is the way to go for midrange cards.


40 Comments

  • PrinceGaz - Wednesday, September 22, 2004 - link

    #27 ThePlagiarmaster:

    Do you know what a paragraph is?

    Anyway, 1280x960 or 1280x1024 is becoming the more common resolution among people with fairly recent systems, even if it's only because 1280x1024 is the native resolution of their LCD display, so anything else looks inferior.

    How fast someone's CPU is really only determines the maximum framerate that can be achieved in any given game sequence regardless of resolution. The CPU itself won't churn out frames more quickly just because the graphics-card is rendering at a lower resolution. That answers the first half or so of your post.

    As the X700 series are upper mid-range cards, they are intended to be used at quite high resolutions, not 1024x768 or less. The tests showed the X700 XT was easily capable of producing a more than satisfactory framerate at 1280x1024 in every game tried, including Doom 3, so why run more tests at 1024x768? Only if it were a slower card which could manage only 30-40fps or less at 1280x960 would tests at lower resolutions be worthwhile.
  • kmmatney - Wednesday, September 22, 2004 - link

    Since these are now "low-end" cards, it would be great to see how they perform with slower cpus. I still have a lowly XP 2400+ thoroughbred...and I'd rather spend money on my Video card than another MB/CPU, if it can perform (at 1024 x 768).
  • Chuckles - Wednesday, September 22, 2004 - link

    I don't know about you, #27, but I think 10x7 is tunnel vision. Decent sized monitors are not all that expensive, and they allow you to do so much more with the space.
  • ThePlagiarmaster - Wednesday, September 22, 2004 - link

    What I want to know is how everything performs at 1024x768, with and without 4xAA/8xAF. Let's face it: 95% of the people running these games are NOT doing it at anything higher. To cut this res out of everything but Doom 3 (an oddball engine to begin with) is ridiculous. Sure, the higher resolutions show us that bandwidth becomes a big issue. But for most people running at 1024x768 (where most of us have CPUs that can keep a decent fps), does bandwidth really matter at all? Is a 9700 Pro still good at this res?

    You have to test 1024x768, because all you're doing here is showing one side of the coin. People who have the FASTEST FREAKING CPUs (eh, most don't - raise your hand if you have an Athlon FX-53 or A64 3400+ or better, or even a P4 of 3.4GHz or faster - I suspect most hands are DOWN now) go with the fastest GPUs. Most people cut one or the other. So you need to show how a card does at a "NORMAL" res.

    I usually can't even tell the difference between 1024x768 and 1600x1200. At the frenetic pace you get in an FPS, you don't even see the little details. Most of us don't hop around in different resolutions for every different game either. Most of my customers couldn't even tell you what resolution IS! No, I'm not kidding. They take it home in the res I put it in and leave it there forever (1024x768). If you're like me, you pick a res all games run in without tanking the fps, which for me is 1024x768. I don't have to care what game I run, I just run it. No drops during heated action.

    I hope you re-bench with the res most people use so people can really see: is it worth the money or not at the res the world uses? Why pay $200-400 for a new card if the 9700 Pro still rocks at 1024x768, and that expensive card only gets you another couple fps at this res? I know it gets tons better at much higher resolutions, but at the normal person's res does it show its value or not? In Doom it seems to matter, but then this game is a graphical demo. No other engine is quite this punishing on cards.

    A good 70% or so of my customers still buy 17-inchers! Granted, some games have multi-res interfaces, but some get really small at larger resolutions on a 17in. This article is the complete opposite of running CPU tests at 640x480, but yields the same results. If nobody runs at 640x480, how real-world is it? If "almost" nobody runs at 1600x1200, shouldn't we spend more time looking at 1024x768, where 90% or so run? That's more real world, right? 1600x1200 is for the fastest machines on the planet. Which is NOT many people I know, and I sell PCs...LOL.
  • AtaStrumf - Wednesday, September 22, 2004 - link

    At the very least have a look at the Far Cry and Halo results. They really seem to be upside down.

    I don't know who's making the mistake here, but it's something that needs looking into.
  • AtaStrumf - Wednesday, September 22, 2004 - link

    Derek, I think your GPU scores urgently need updating. We need to be able to compare new cards to old ones, and we just can't do that reliably right now. Have a look at xbitlabs' test results.

    http://www.xbitlabs.com/articles/video/display/ati...

    Relative positions between the 9800 XT and X700 XT are more often than not different from your results.

    In their results, it seems the R9800 XT fares much better relative to the X700 XT. We might be drawing the wrong conclusions based on your scores.
  • Da3dalus - Tuesday, September 21, 2004 - link

    Quite clearly a win for nVidia in this match :)

    Hey Derek, are you gonna do a big Fall 2004 Video Card Roundup like you did last year? That would be really nice :)
  • jm0ris0n - Tuesday, September 21, 2004 - link

    #17 My thoughts exactly ! :)
  • DerekWilson - Tuesday, September 21, 2004 - link

    #8:

    ATI has stated that they will be bridging the RV410 back to AGP from PCIe -- they will not be producing separate silicon. They didn't have any kind of date they could give us, but they did indicate that it should be available before the end of the year. It's just hard to trust having such distant dates thrown around when both ATI and NVIDIA have shown that they have problems filling the channel with currently announced products.


    #18:

    This is likely a result of the fact that only the X700 XT, 6600 GT, and X600 XT were run with the most recent drivers -- the 6800 series cards are still running on 61.xx while the 6600 GT was powered by the 65.xx drivers. We are looking into a driver regression test, and we will take another look at performance with the latest drivers as the dust starts to settle.
  • Aquila76 - Tuesday, September 21, 2004 - link

    OK, I phrased the first part of my post VERY badly. In my defense, I had not yet had any coffee. ;)
    What I was trying to get across was that ATI does OK competing with NVIDIA in DX games, but still gets killed in OpenGL. They used to smoke NVIDIA in DX, but now NVIDIA has fixed whatever issues they had there and is making a card that competes very well with ATI's offering. The 6600 GT is clearly the better card here, for either the D3 or HL2 engines.
