NVIDIA's Last Minute Effort

On the verge of ATI's R420 GPU launch, NVIDIA brought out a new card called the GeForce 6850 Ultra. This new card is to be sold as an OEM overclocked part (à la the "Golden Sample" and other such beasts), and will be able to run at 450MHz+ core and 1.1GHz+ memory clock speeds. It is very clear to us that getting this board out the door right now was a bit of a rush for NVIDIA; it would seem that they didn't expect to see the kind of performance ATI's X800 series can deliver. We were unable to get drivers installed and running on our 6850 Ultra card until about two hours ago, but we will follow this article up with updated data as soon as we are able to benchmark the card. The 6850 Ultra looks exactly the same as the 6800 Ultra (it really is the same card with an overclock), so we'll forgo the pictures.

The other part NVIDIA is launching today is their $399 price point card, the GeForce 6800 GT. This card won't be shipping for a while (mid-June), and NVIDIA cites not wanting to announce the card too far ahead of availability as the reason for the timing of this announcement.

The 6800 GT is a 16x1 architecture card that runs at 350MHz core and 1GHz memory clocks. As we can see from the beautiful picture, NVIDIA is bringing out a single-slot card with one molex power connector based on NV40. Even if it's not the fastest thing we'll see today, it is still good news. We will definitely be trying our hand at a little overclocking in the future. Power requirements are much lower than the 6800 Ultra's, with something like a 300W PSU being perfectly fine to run this card.

Along with this new card release, NVIDIA has pushed out new beta drivers (61.11), whose image quality we are still evaluating. We haven't seen any filtering differences, but we are currently exploring some shader-based image quality tests.

The Test

The key factor in the ongoing battle is DirectX 9 performance. We will be taking the hardest look at games that exploit PS 2.0, but since that's not all people play (and there aren't very many such titles on the market yet), we have included some past favorites as well.

Our test system is:

FIC K8T800 Motherboard
AMD Athlon 64 3400+
1GB OCZ PC3400 RAM
Seagate 120GB PATA HDD
510W PC Power & Cooling PSU

The drivers we used in testing are:

NVIDIA Beta ForceWare 60.72
NVIDIA Beta ForceWare 61.11
ATI CATALYST BETA (version unknown)
ATI CATALYST 4.4

We didn't observe any performance difference when moving from CATALYST 4.4 to the beta CATALYST on the 9800 XT, so we chose to forgo retesting everything on the new drivers. The 61.11 driver, however, does show a slight increase in performance on NVIDIA cards, so we retested previously benchmarked hardware with the 61.11 drivers. Old numbers will be left in the benchmarks for completeness' sake.

As mentioned earlier, we will be updating our tests later today with numbers collected from the GeForce 6850 Ultra (and we'll throw in a few other surprises as well).

95 Comments

  • rms - Tuesday, May 4, 2004 - link

    "the near-to-be-released goodlooking PS 3.0 Far Cry update "

    When is that patch scheduled for? I recall seeing some rumour it was due in September...

    rms
  • Fr0zeN - Tuesday, May 4, 2004 - link

    Yeah I agree, the GT looks like it's gonna give the x800P a run for its money. On a side note, the differences between P and XT versions seem to be greater than r9800's, hmm.

    In the end it's the most overclockable $200 card that'll end up in my comp. There's no way I'm paying $500 for something that I can compensate for by turning the rez down to 10x7... Raw benchmarks mean nothing if it doesn't oc well!
  • Doop - Tuesday, May 4, 2004 - link

    The cards seem very close. I tend to favor nVidia now since they have superior multi-monitor and professional 3D drivers, and I regret buying my Fire GL X1.

    It's strange ATi didn't announce a 16 pipeline card originally; it will be interesting to see in a month or two who actually ends up delivering cards.

    I mean, if they're being made in significant quantities they'll be at your local store with a reduced 'street' price, but if it's just a paper launch they'll only be at Alienware or Dell (with a new PC only), or $500 if you can find one.
  • jensend - Tuesday, May 4, 2004 - link

    #17, the Serious Engine has nothing to do with the Q3 engine; Nvidia's superior OpenGL performance is not dependent on any handful of engines' particular quirks.

    Zobar is right; contra Jibbo, the increased flexibility of PS3 means that for many 2.0 shader programs a PS3 version can achieve equivalent results with a lesser performance hit.

    As far as power goes, I'm surprised NV made such a big deal out of PSU requirements, as its new cards (except the 6800U Extremely Short Production Run Edition/6850U/Whatever they end up calling that part) compare favorably wattage-wise to the 5950U and don't pull all that much more power than the 9800XT. Both companies have made a big performance per watt leap, and it'll be interesting to see how the mid-range and value cards compare in this respect.
  • blitz - Tuesday, May 4, 2004 - link

    "Of course, we will have to wait and see what happens in that area, but depending on what the test results for our 6850 Ultra end up looking like, we may end up recommending that NVIDIA push their prices down slightly (or shift around a few specs) in order to keep the market balanced."

    It sounds as if you would be giving nvidia advice on their pricing strategy; somehow I don't think they would listen to or be influenced by your opinion. It could be better phrased as advising consumers to wait for prices to drop or look elsewhere for a better price/performance ratio.
  • Cygni - Tuesday, May 4, 2004 - link

    Hmmmm, interesting. I really don't see where anyone can draw the conclusion that the x800 Pro is CLEARLY the winner. The 6800 GT and x800 Pro traded game wins back and forth. There doesn't seem to be any clear-cut winner to me. Wolf, JediA, X2, F1C, and AQ3 all went clearly to the GT... this isn't open and shut. A lot of the other tests were split depending on resolution/AA. On the other hand, I don't think you can say that the GT is clearly better than the x800 Pro either.

    Personally, I will buy whichever one hits a reasonable price point first. $150-200. Both seem to be pretty equal, and to me, price matters far more.
  • kherman - Tuesday, May 4, 2004 - link

    BRING ON DOOM 3!!!!!!

    We all know inside that this is what ID was waiting for!
  • Diesel - Tuesday, May 4, 2004 - link

    ------------------
    I think it is strange that the tested X800XT is clocked at 520 Mhz, while the 6800U, that is manufactured by the same taiwanese company and also has 16 pipelines, is set at 400 Mhz.
    ------------------

    This could be because NV40 has 222M transistors vs. R420 at 160M transistors. I think the amount of power required and heat generated is proportional to transistor count and clock speed.
  • edub82 - Tuesday, May 4, 2004 - link

    I know this is an ATI article, but that 6800 GT is looking very attractive. It beats the x800 Pro on a fairly regular basis, is a single-slot card with one molex connector, and starts at $400; hopefully it will go down a few dollars ;) in 6 months when I want to upgrade.
  • Slaanesh - Tuesday, May 4, 2004 - link

    "Clearly a developer can have much nicer quality and exotic effects if he/she exploits these, but how many gamers will have a PS3.0 card that will run these extremely complex shaders at high resolutions and AA/AF without crawling to single-digit fps? It's my guess that it will be *at least* a year until games show serious quality differentiation between PS2.0 and PS3.0. But I have been wrong in the past..."
    --------

    I dunno.. When Morrowind got released, only the few GF3 cards on the market were able to show the cool pixel shader water effects, and they did it well; at that time I was really pissed that I had gone for the cheaper GeForce2 Ultra, even though it had some better benchmarks at a much lower price. I don't think I want to make that mistake again and pay the same amount of money for a card that doesn't support the latest technology..
