NVIDIA's Last Minute Effort

On the eve of ATI's R420 GPU launch, NVIDIA has brought out a new card called the GeForce 6850 Ultra. This card will be sold as an OEM overclocked part (à la the "Golden Sample" and other such beasts) and will run at 450MHz+ core and 1.1GHz+ memory clock speeds. It is very clear to us that getting this board out right now was a bit of a rush for NVIDIA, and it would seem that they didn't expect the kind of performance ATI's X800 series can deliver. We were unable to get drivers installed and running on our 6850 Ultra until about two hours ago, but we will follow this article up with updated data as soon as we are able to benchmark the card. The 6850 Ultra looks exactly the same as the 6800 Ultra (it really is the same card with an overclock), so we'll forego the pictures.

The other part NVIDIA is launching today is its $399 price point card, the GeForce 6800 GT. This card won't be shipping for a while (mid-June), and NVIDIA cites not wanting to announce the card too far ahead of availability as the reason for the timing of this announcement.

The 6800 GT is a 16x1 architecture card that runs at 350MHz core and 1GHz memory clocks. As we can see from the beautiful picture, NVIDIA is bringing out a single-slot card based on NV40 with one molex power connector. Even if it's not the fastest thing we'll see today, it is still good news. We will definitely be trying our hand at a little overclocking in the future. Power requirements are much lower than those of the 6800 Ultra, with something like a 300W PSU being perfectly fine to run this card.

Along with this new card release, NVIDIA has pushed out new beta drivers (61.11), whose image quality we are still evaluating. We haven't seen any filtering differences, but we are currently exploring some shader-based image quality tests.

The Test

The key factor in the ongoing battle is DirectX 9 performance. We will be taking the hardest look at games that exploit PS 2.0, but since that's not all people play (and there aren't very many such titles on the market yet), we have included some past favorites as well.

Our test system is:

FIC K8T800 Motherboard
AMD Athlon 64 3400+
Seagate 120GB PATA HDD
510W PC Power & Cooling PSU

The drivers we used in testing are:

NVIDIA Beta ForceWare 60.72
NVIDIA Beta ForceWare 61.11
ATI CATALYST BETA (version unknown)

We didn't observe any performance difference when moving from CATALYST 4.4 to the beta CATALYST on the 9800 XT, so we chose to forgo retesting everything on the new drivers. The 61.11 driver, however, does show a slight increase in performance on NVIDIA cards, so we retested previously benchmarked hardware with the 61.11 drivers. Old numbers are left in the benchmarks for completeness' sake.

As mentioned earlier, we will be updating our tests later today with numbers collected from the GeForce 6850 Ultra (and we'll throw in a few other surprises as well).

The Cards' Pixel Shader Performance Tests


Comments

  • TrogdorJW - Tuesday, May 4, 2004 - link

    Nice matchup we've got here! Just what we were all hoping for. Unfortunately, there are some disappointing trends I see developing....

    In ShaderMark 2.0, we see many instances where the R420 is about 25% faster than the NV40. Let's see... 520 MHz vs 400 MHz. 'Nuf said, I think. Too bad for Nvidia that they have 222 million transistors, so they're not likely to be able to reach 500 MHz any time soon. (Or if they can, then ATI can likely reach 600+ MHz.)

    How about the more moderately priced card matchup? The X800 Pro isn't looking that attractive at $400. 25% more money gets you 33% more pipelines, which will probably help out in games that process a lot of pixels. And the Pro also has 4 vertex pipelines compared to 6? The optimizations make it better than a 9800XT, but not by a huge margin. The X800 SE with 8 pipelines is likely going to be about 20% faster than a 9800XT. Hopefully, it will come in at a $200 price point, but I'm not counting on that for at least six months. (Which is why I recently purchased a $200 9800 Pro 128.)

    The Nvidia lineup is currently looking a little nicer. The 6800 Ultra, Ultra Special, and GT all come with 16 pipelines, and there's talk of a lower clocked card for the future. If we can get a 16 pipeline card (with 6 vertex pipelines) for under $250, that would be pretty awesome. That would be a lot like the 5900 XT cards. Anyone else notice how fast the 9800 Pro prices dropped when Nvidia released the 5900 XT/SE? Hopefully, we'll see more of that in the future.

    Bottom line has to be that for most people, ATI is still the choice. (OpenGL gamers, Linux users, and professional 3D types would still be better with Nvidia, of course.) After all, the primary benefit of NV40 over R420 - Shader Model 3.0 - won't likely come into play for at least six months to a year. Not in any meaningful way, at least. By then, the fall refresh and/or next spring will be here, and ATI could be looking at SM3.0 support. Of course, adding SM3 might just knock the transistor counts of ATI's chips up into the 220 million range, which would kill their clock speed advantage.

    All told, it's a nice matchup. I figure my new 9800 Pro will easily last me until the next generation cards come out, though. By then I can look at getting an X800 Pro/XT for under $200. :)
  • NullSubroutine - Tuesday, May 4, 2004 - link

    I forgot to ask if anyone else noticed a huge difference (almost double) between AnandTech's Unreal Tournament 2003 scores and those of Tom's Hardware?

    (It's not the CPU difference, because the A64 3200+ had a baseline score of ~278 and the 3.2GHz P4 had ~247 in a previous section.)

    So what gives?
  • NullSubroutine - Tuesday, May 4, 2004 - link

    To the guy talking about the 400MHz and the 550MHz, I have this to say.

    I agree with the other guy about the transistor count.

    Don't forget that ATi's cards used to be more powerful per clock compared to Nvidia's a generation or two ago. So don't be babbling fanboy stuff.

    I would agree with that one guy (the # escapes me) about the fanboy stuff, but I said it first! On this thread anyways.
  • wassup4u2 - Tuesday, May 4, 2004 - link

    #30 & 38, I believe that while the ATI line is fabbed at TSMC, NVidia is using IBM for NV40. I've also heard that yields at IBM aren't so good... which might not bode well for NVidia.
  • quanta - Tuesday, May 4, 2004 - link

    > #14, I think it has more to do with the fact those OpenGL benchmarks are based on a single engine that was never fast on ATI hardware to begin with.

    Not true. ATI's FireGL X2 and Quadro FX 1100 were evenly matched in workstation OpenGL tests[1], which do not use Quake engines. Considering the FireGL X2 is based on the Radeon 9800XT and the Quadro FX 1100 is based on the GeForce FX 5700 Ultra, such a result is unacceptable. If I were an ATI boss, I would have made sure the OpenGL driver team did not make such a blunder, especially when R420 still sucks in most OpenGL games compared to GeForce 6800 Ultra cards.

    [1] http://www.tomshardware.com/graphic/20040323/index...
  • AlexWade - Tuesday, May 4, 2004 - link

    From my standpoint the message is clear: nVidia is no longer THE standard in graphics cards. Why do I say that? It's half the size, it requires less power, it has fewer transistors, and the performance is about the same. Even if the performance were slightly less, ATI would still be the winner. Anyway, whatever, it's not like these benchmarks will deter the hardcore gotta-have-it-now fanboys.

    It's not like I'm going to buy either. Maybe this will lower the prices of all the other video cards. $Dreams$
  • rsaville - Tuesday, May 4, 2004 - link

    If any 6800 users are wondering how to make their 6800 run the same shadows as the 5950 in the benchmark, see this post:

    Also, if you want to make your GeForceFX run the same shadows as the rest of the PS 2.0-capable cards, then find a file called driverConfig.lua in the homeworld2\bin directory and remove line 101, which disables fragment programs.
  • raskren - Tuesday, May 4, 2004 - link

    I wonder if this last line of AGP cards will ever completely saturate the AGP 8X bus. It would be interesting to see a true PCI-Express card compared to the same AGP 8X counterpart.

    Remember when Nvidia introduced the MX440 (or was it 460?) with an 8X AGP connector...what a joke.
  • sisq0kidd - Tuesday, May 4, 2004 - link

    That was the cheesiest line, #46, but very true...
  • sandorski - Tuesday, May 4, 2004 - link

    There is only 1 clear winner here, the Consumer!

    ATI and NVidia are running neck-and-neck.
