Introduction

The past couple of months have been very exciting, entertaining, and interesting as far as consumer-level graphics cards go. Everyone was blown away when NVIDIA essentially doubled the performance of its previous generation with the new top-of-the-line 6800 Ultra. Shortly thereafter, we had the pleasure of discovering that this generation of graphics cards would see the two GPU makers on equal footing for the first time since hardware acceleration was introduced. Each IHV has strong points and weak points, but the overall picture is surprisingly balanced.

Ever since our initial X800 review went to press, we've been promising a look at NVIDIA's "6850", which actually turned out to be the 6800 Ultra Extreme. So, why did it take so long? NVIDIA's reference 6800 Ultra Extreme was essentially DOA: drivers wouldn't install on the card, though it would boot into standard VGA. It's been a long road, but since then, we've gotten hold of eVGA's 6800 Ultra Extreme Edition.

Though non-Ultra 6800s have been in the press's hands for a while, some might wonder why we haven't yet covered one. Our NVIDIA reference card wouldn't even boot, but luckily, LeadTek was gracious enough to lend us a 6800 for the purposes of this review. We haven't had any problems with our vendor cards, and we don't think there's any reason to worry about NVIDIA cards because of this one issue. Nevertheless, it's still worth mentioning.

So, now we have three real flavors of NV40, plus one overclocked (less available) version, and two flavors of R420, plus one overclocked version (which is supposed to be as available as the XT, but we'll have to wait and see what happens there).

It's difficult to tell whether the sheer number of options available will be helpful to consumers or just plain confusing. And pricing is still going to be a sticking point. The real factor in deciding whether or not to buy a card will be price/performance. Though prices right now may be highly dependent on availability, we'll review some price/performance numbers in our conclusion to see where things stand. Hopefully, this review will help to show the full picture of the AGP 8x playing field and take some of the pain out of the decision-making process.

But first, we need to take a look at the two cards we will be using.

The Cards

46 Comments


  • TrogdorJW - Friday, July 9, 2004 - link

    My final comment (for now):

    On the Warcraft III page, you had this to say: "Even at 16x12, this benchmark is very CPU limited, and yes, vsync was disabled. Oddly, when AA/AF is enabled, the FX 5950U actually outperforms the X800 XT PE. This is an atypical situation, and we will try to look into the matter further."

    My thought on this is that the likely reason has to do with optimizations. In most benchmarks, the 6800 series of cards outperforms their X800 equivalents when running at standard settings. Enabling 4xAA and 8xAF often diminishes the gap or shifts the benchmark into ATI's favor. However, you don't really do a full suite of benchmarks, so it's difficult to say why the shift takes place. Having looked at other sites, I'd say the shift seems to be related almost entirely to the anisotropic filtering. Turning AA on or off seems to have very little impact on the placing of the cards when you're not CPU limited, while turning on AF can put a much larger burden on the Nvidia cards, especially cards of the FX era.

    So what does this have to do with Warcraft III? Well, I won't bother arguing which of the two AF methods, Nvidia's or ATI's, actually looks better; they seem to be roughly equivalent in quality. However, ATI seems to get more AF performance out of their hardware. Basically, the ATI algorithm simply appears to be faster.

    So again, what does this have to do with Warcraft III and the GeForce FX? One word: perspective. WCIII uses an overhead perspective, so much of the screen is filled with polygons (the ground) that sit nearly perpendicular to the viewing direction, facing the camera head-on. If I recall correctly from my graphics programming classes, there is very little extra work for the AF algorithm to do in this scenario; polygons viewed head-on hardly need any anisotropic samples in the first place. (Or maybe it's just that Nvidia has better optimizations on the FX architecture in this instance?) The end result is that the GPU doesn't have to do a whole lot of extra work, so in this particular case, the FX architecture does not suffer nearly as much when enabling AF (a rough sketch of this idea follows this comment). Not that any of us would actually go out and buy an FX5950 these days....

    Honestly, though, the benchmarking methodology for WCIII (playback of a demo at 8X speed) seems pretty much worthless, i.e. on the level of 3DMark usefulness. It's a DX7 game that will run well even on old Pentium III systems with GeForce 2 cards, and anything more recent than a GeForce 4 Ti with a 2 GHz CPU will have no difficulty whatsoever with the game. Running a demo playback at 8X might not work well, but either way, that's not actually playing the game. I'm sure there are plenty of WCIII fans who think this is a meaningful performance measurement, but there are probably people out there who still play the original Quake and think that it gives meaningful results. :)
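To make the footprint argument above concrete, here is a rough Python sketch of how the degree of anisotropy is commonly derived from a pixel's texture-space gradients. It loosely follows the way the OpenGL anisotropic filtering extension describes the calculation; the gradient values are invented purely for illustration and are not taken from either IHV's actual hardware.

```python
import math

def anisotropy(dudx, dvdx, dudy, dvdy, max_aniso=8.0):
    """Estimate how elongated a pixel's footprint is in texture space.

    (du/dx, dv/dx) and (du/dy, dv/dy) are the texture-coordinate
    gradients along screen x and y.  The longer the footprint is
    relative to its width, the more anisotropic samples the hardware
    takes, up to the driver's limit (8x assumed here).
    """
    px = math.hypot(dudx, dvdx)        # footprint extent along screen x
    py = math.hypot(dudy, dvdy)        # footprint extent along screen y
    p_max = max(px, py)
    p_min = max(min(px, py), 1e-8)     # avoid division by zero
    return min(p_max / p_min, max_aniso)

# Ground tile seen almost head-on (overhead RTS camera): the footprint is
# nearly square, so AF degenerates to roughly one tap and costs almost nothing.
print(anisotropy(0.50, 0.00, 0.00, 0.52))   # ~1.04

# Floor receding toward the horizon (typical FPS view): the footprint is
# long and thin, so AF needs many more taps and real fill rate.
print(anisotropy(0.50, 0.00, 0.00, 6.00))   # clamped to 8.0
```

With an overhead camera, the ground's footprint stays close to square across most of the screen, so the ratio sits near 1 and AF adds almost no work, which is consistent with the FX 5950U holding up unusually well in this one test.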
  • TrogdorJW - Friday, July 9, 2004 - link

    A few other comments from the article:

    "The 9700 Pro may be a good value for many games, but it just won't deliver the frame rates in current and future titles, at the resolutions to which people are going to want to push their systems."

    I really have to disagree with that opinion. These tests were done exclusively at 1280x1024 and 1600x1200, as well as with 4xAA and 8xAF. Only the extreme fringe of gamers actually has a desire to push their systems that far. Well, I suppose we would all *want* to, but most of us simply cannot afford to. First, you would need a much better monitor than the typical PC is equipped with; a 19" CRT or 17" LCD would be the minimum. You would also need to run at 4xAA and 8xAF at the maximum resolution your display supports in several of the games. Finally, you would need to max out all the graphics settings in each game. While some people certainly feel this is "necessary", I'm pretty sure they're in the minority.

    My opinion? The difference between 800x600 and 800x600+2xAA is rather noticeable; the difference between 800x600+2xAA and 800x600+4xAA is much less so. I also think that 800x600+4xAA is roughly equivalent to 1024x768+2xAA or 1280x1024 without any AA (a rough sample-count comparison follows this comment). Personally, I would prefer higher resolutions up to a point (beyond 1280x1024, it's not nearly as important). For graphical quality, there's a pretty major improvement from bilinear to trilinear filtering, but you don't notice the bump to anisotropic filtering nearly as much. There is also a very drastic change in quality when going from low detail to medium detail, and generally a noticeable change when going from medium to high detail. Beyond that (going to very high or ultra high, assuming the game permits), there is usually very little qualitative difference, while performance generally suffers a lot.

    But hey, it's just one man's opinion against another's. I point this out not as a rebuke of your opinion, but as disagreement with your presenting that opinion as something more. Often, writers don't like wishy-washy conclusions, but a more moderate stance is probably warranted at many of the hardware sites. The fastest hardware comes with a major price increase that most people are simply unwilling to pay. The use of a logarithmic scale is also part of this problem, as most people would be more than happy to pay half as much for 75% of the performance.
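As a crude way to see why those combinations might feel comparable, the snippet below just counts total subsamples per frame. This is only a back-of-the-envelope illustration; resolution and AA samples are not perceptually interchangeable, and the equivalence above is the commenter's opinion, not a measured result.

```python
# Total subsamples per frame for the combinations called "roughly equivalent" above.
modes = [
    ("800x600 + 4xAA",   800,  600, 4),
    ("1024x768 + 2xAA",  1024, 768, 2),
    ("1280x1024, no AA", 1280, 1024, 1),
]
for name, width, height, aa_samples in modes:
    print(f"{name:18s} -> {width * height * aa_samples:,} subsamples")

# 800x600 + 4xAA     -> 1,920,000 subsamples
# 1024x768 + 2xAA    -> 1,572,864 subsamples
# 1280x1024, no AA   -> 1,310,720 subsamples
```

The counts land in the same ballpark, which is at least consistent with the "roughly equivalent" impression, even though it says nothing about which kind of sample your eye values more.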
  • TrogdorJW - Friday, July 9, 2004 - link

    #24 - I'm amazed that you're the only other person that even wondered about that. Basically, using the Log of the performance/price makes everything a lot closer. There is a reason for this, of course: if you take the straight performance/price (multiplied by 10 or 100 if you want to get the numbers into a more reasonable range), it makes all the expensive cards look really, really bad.

    However, the reality is that while an X800 Pro or 6800 GT might cost over twice as much as the 9700 Pro, there is a real incentive to purchase the faster cards. Minimum frame rates on a 9700 Pro would often be completely unacceptable at these resolutions. The use of a logarithmic chart makes large differences in price and/or performance less of a deal killer.

    For example, let's look at Warcraft III at 1600x1200 without AA/AF. The cards range from 58.2 to 61.1 FPS, but the prices range from $300 to $600. In this particular instance, the $300 6800 would be almost twice as "desirable" as the 6800UE or X800XTPE by a straight performance-per-dollar measure. Apply their log-based calculation to it, though, and the 6800 is now only 30% more desirable than the $600 cards (a rough sketch of the arithmetic follows this comment).

    What it amounts to, though, is their statement at the beginning: every person has a different set of criteria for rating overall "value". In Anandtech's case, they like performance and are willing to pay a lot of extra money for it. (Which of course flies in the face of their comments about the $10 difference in price between the 6800GT and X800 Pro, but that's a different story. As someone already pointed out, if the GT leads in performance, costs a little less, and also has more features, what numbskull wouldn't choose it over the X800 Pro?!? Of course, there are instances where the X800 Pro still wins, so if you value those specific situations more, then you might want the X800 instead of the GT.)
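For what it's worth, here is a quick sketch of the effect being described. The article's exact log-based formula isn't quoted in the comment, so the "log" variant below, which divides by log10 of the price instead of the price itself, is only an assumed stand-in; the point is simply that any such compression pulls the value scores together. The FPS and price figures are the ones quoted above.

```python
import math

# Value metrics for the Warcraft III 1600x1200 figures quoted above.
# The log-scaled variant is a hypothetical stand-in, not the article's formula.
cards = {
    "6800 ($300)":            (58.2, 300),
    "6800UE/X800XTPE ($600)": (61.1, 600),
}

def linear_value(fps, price):
    # Straight frames per dollar, scaled by 100 for readability.
    return 100 * fps / price

def log_value(fps, price):
    # One plausible log-compressed metric: divide by log10(price),
    # so doubling the price no longer halves the score.
    return fps / math.log10(price)

for name, (fps, price) in cards.items():
    print(f"{name}: linear {linear_value(fps, price):.2f}, log {log_value(fps, price):.2f}")

# linear: 19.40 vs 10.18 (the $300 card looks nearly twice as good per dollar)
# log:    23.50 vs 21.99 (the gap mostly disappears under this particular scaling)
```

The exact percentage depends on which log-scaled formula the article actually used, so these numbers won't reproduce the 30% figure; the takeaway is only that a logarithmic scale makes a 2x price difference look much smaller than it is.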
  • Leuf - Friday, July 9, 2004 - link

    How can you leave out the 9800 Pro when talking about value, especially when the video card guide right under this article says the 9800 Pro is the best price/performance buy right now?

    One thing you don't take into account is that someone buying a lower-end card probably doesn't have the same CPU as someone buying a top-end card. While it wouldn't make sense to test each card with a different CPU for this article, it's worth mentioning. I'd actually like to see a performance plot of a couple of value cards tested across the gamut of CPUs. Looking at video card value and CPU value completely separately from each other isn't necessarily going to lead to the best choices.
  • Neekotin - Friday, July 9, 2004 - link

    araczynski, I used an Asetek WaterChill v2, dedicated only to the GPU, with a custom coolant (my own recipe)... I haven't tried it with a CPU; my 3400 is barely overclockable.
  • Marsumane - Friday, July 9, 2004 - link

    Yes, that is a good deal, but it doesn't represent the actual price of the card without promotions. The GT was $300 from Best Buy. That's not the going price, though, just a pricing mistake. You can't count those.
  • snikrep - Friday, July 9, 2004 - link

    Did you guys notice the Far Cry results? Those are some pretty huge numbers... the X800 XT is beating the 6800 Ultra by about 15 FPS from what I could tell.

    That's huge!!

    And those "actual" pricing numbers seem way off... I picked up a retail X800 Pro from Best Buy two weeks ago for $399, so I don't see why we'd include price-gouging vendors.

    And the X800 XT Platinum Edition is below $499 at most places. I personally have it on order from Gateway for $390, which makes it the best deal by FAR (of course, I'll get it sometime in August with my luck, but who cares, it's cheap).
  • nserra - Friday, July 9, 2004 - link

    So many rich people here, discussing the price of $600 cards and worrying about their little price differences ($10). Funny.

    After reading this review, I would go for a 9700 Pro or 9800 Pro, or even better, a softmodded 9500/9800 SE.
    I always play (if the game permits) at 1024x768 with 2xAA and 4xAF. It's more than enough.

    The review doesn't take into account that most (CRT) monitors only do 60 Hz at 1280x1024 and 1600x1200.
  • araczynski - Friday, July 9, 2004 - link

    Neekotin: What hardware are you using for your liquid cooling setup? I've been thinking about possibly incorporating it into my next build.
  • Drayvn - Friday, July 9, 2004 - link

    Sorry to post again, but the cheapest 6800 Ultra (we don't even have the Extreme yet) is...
    $621

    So the difference is still about $100, so in my opinion I would still buy the XT-PE, but prices could go down when the UE comes out, dunno...
