Anisotropic, Trilinear, and Antialiasing


There was a great deal of controversy last year over some of the "optimizations" NVIDIA included in some of their drivers. We have visited this issue before, and we won't reopen the debate here, but it is worth noting that NVIDIA isn't taking any chances on being called a cheater in the future.

NVIDIA's new driver defaults to the same adaptive anisotropic filtering and trilinear filtering optimizations they are currently using in the 50 series drivers, but users are now able to disable these features. Trilinear filtering optimizations can be turned off (forcing full trilinear filtering all the time), and a new "High Quality" rendering mode turns off adaptive anisotropic filtering. What this means is that anyone who wants (or needs) accurate trilinear and anisotropic filtering can have it. The option to disable trilinear optimizations is already available in the 56.72 driver.
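For those curious about what the optimization actually changes, here is a minimal sketch of the general idea, assuming a symmetric blend band around each mip transition (the thresholds NVIDIA's driver actually uses are not public, so blend_band is a hypothetical parameter):

```python
def trilinear_weight(lod_frac, blend_band=1.0):
    """Blend weight toward the lower-resolution mip for a fractional LOD.

    blend_band=1.0 approximates full trilinear: the two nearest mip
    levels are blended across the entire transition. A smaller value
    mimics the optimization: pure bilinear (weight pinned at 0 or 1)
    near each mip level, with blending confined to a narrow band.
    """
    lo = 0.5 - blend_band / 2.0
    hi = 0.5 + blend_band / 2.0
    if lod_frac <= lo:
        return 0.0  # sample only the higher-resolution mip
    if lod_frac >= hi:
        return 1.0  # sample only the lower-resolution mip
    return (lod_frac - lo) / (hi - lo)

# Optimization off (band = 1.0) blends across the whole transition;
# on (narrow band), most of it collapses to cheaper bilinear sampling.
for frac in (0.1, 0.3, 0.5, 0.7, 0.9):
    print(frac, trilinear_weight(frac, 1.0), trilinear_weight(frac, 0.4))
```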

Unfortunately, it seems that NVIDIA will be switching to a method of calculating anisotropic filtering based on a weighted Manhattan distance calculation. We appreciated the fact that NVIDIA's previous implementation of anisotropic filtering employed a Euclidean distance calculation, which is less sensitive to the orientation of a surface than a weighted Manhattan calculation.
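To illustrate why the choice of metric matters, here is a short sketch comparing the two calculations; the 0.5 weight is purely illustrative, as the coefficients actual hardware uses are not disclosed:

```python
import math

def euclidean(dx, dy):
    # Rotation-invariant: depends only on the length of the gradient.
    return math.hypot(dx, dy)

def weighted_manhattan(dx, dy, w=0.5):
    # w is a made-up weight; real hardware coefficients are not public.
    a, b = abs(dx), abs(dy)
    return max(a, b) + w * min(a, b)

# Sweep a unit-length texture gradient through 45 degrees of rotation.
# The Euclidean result is constant; the weighted Manhattan result
# drifts with angle, which is why anisotropic filtering quality
# fluctuates as surfaces rotate on screen.
for deg in (0, 15, 30, 45):
    r = math.radians(deg)
    dx, dy = math.cos(r), math.sin(r)
    print(f"{deg:2d} deg  euclidean={euclidean(dx, dy):.3f}  "
          f"w-manhattan={weighted_manhattan(dx, dy):.3f}")
```

The cheaper metric saves transistors, and its angle dependence is exactly what the filtering test screenshots below visualize.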


This is how NVIDIA used to do anisotropic filtering.


This is anisotropic filtering under the 60.72 driver.


This is how ATI does anisotropic filtering.


The advantage is that NVIDIA now takes a lower performance hit when enabling anisotropic filtering, and we will also be making a more apples-to-apples comparison when it comes to anisotropic filtering (ATI also uses a weighted Manhattan scheme for its distance calculations). In games where angled, textured surfaces rotate around the z-axis (the axis that comes "out" of the monitor) in a 3D world, both ATI and NVIDIA will show the same fluctuations in anisotropic rendering quality. We would have liked to see ATI alter their implementation rather than NVIDIA, but there is something to be said for both companies doing the same thing.

We had a little time to play with the D3D AF Tester that we used in last year's image quality article. We can confirm that turning off the trilinear filtering optimizations results in full trilinear being performed all the time. Previously, neither ATI nor NVIDIA did this much trilinear filtering, as the screenshots below show.


Trilinear optimizations enabled.


Trilinear optimizations disabled.


When comparing "Quality" mode to "High Quality" mode, we didn't observe any difference in anisotropic rendering fidelity. Of course, this is still a beta driver, so everything might not be working as intended yet. We'll definitely keep checking this as the driver matures. For now, take a look.


Quality Mode.


High Quality Mode.


On a very positive note, NVIDIA has finally adopted a rotated grid antialiasing scheme. Here we can take a glimpse at what the new method does for their rendering quality in Jedi Knight: Jedi Academy.


Jedi Knight without AA.


Jedi Knight with 4x AA.


It's nice to finally see such smooth near-vertical and near-horizontal lines from a graphics company other than ATI. Of course, ATI has yet to throw its offering into the ring, and it is very possible that they've raised their own bar for filtering quality.
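As a rough sketch of why a rotated grid smooths near-vertical and near-horizontal edges, count the distinct sample positions each pattern offers along a single axis; the offsets below are illustrative choices, not NV40's actual (undisclosed) sample positions:

```python
# Per-pixel sample offsets, as fractions of a pixel from its center.
ORDERED_4X = [(-0.25, -0.25), (0.25, -0.25), (-0.25, 0.25), (0.25, 0.25)]
ROTATED_4X = [(-0.375, -0.125), (0.125, -0.375),
              (0.375, 0.125), (-0.125, 0.375)]

def distinct_offsets(samples, axis):
    # A near-vertical edge sweeping across a pixel can only produce
    # n + 1 coverage fractions, where n is the number of distinct
    # x offsets (axis=0); likewise for near-horizontal edges and y.
    return len({s[axis] for s in samples})

for name, grid in (("ordered", ORDERED_4X), ("rotated", ROTATED_4X)):
    print(name, "x:", distinct_offsets(grid, 0),
          "y:", distinct_offsets(grid, 1))
# ordered -> 2 per axis (only 3 gradient steps on near-axis edges);
# rotated -> 4 per axis (5 steps), hence visibly smoother edges.
```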
Comments (77)

  • Reliant - Wednesday, April 14, 2004 - link

    Any ideas how the non-Ultra version will perform?
  • segagenesis - Wednesday, April 14, 2004 - link

    I can't agree with #45 more. People rush to judgement when it's no secret that ATI will be coming out with their goods very soon also. "Wow look this card is really fast!!! I can't believe it!" Well, this sounds like almost every other graphics card release from ATI or nVidia in the past. To me, nVidia had better have come out with something good after their lackluster GeForce FX 5800 wasn't anything terribly special. I used to like nVidia a lot (heh, my ti4600 still runs fine) but when it comes to looking for a new card, I'll pick whichever one is faster *and* has the features I want. If it wasn't for such turnoffs like the 2-slot design and now even 2 power connections required, I'm not sure I am ready to spend $500 just yet...

    Sorry if I'm obtuse, but if ATI comes out with a part that's either equal (note the key term there) in performance or maybe even slightly slower... I'd go for ATI and the better IQ that the Radeon 9700 series so impressed me with and made me wish for more out of my ti4600. That and a single slot/single power type design would probably put me in their boat.

    Fanboy ATI opinion? I've owned nVidia from the Riva TNT to the ti4600 and many in-between.
  • Lonyo - Wednesday, April 14, 2004 - link

    #42, the jump from the Ti4600 to the 9700Pro wasn't good for you? I would have thought finally playable AA/AF was quite a jump.
    Personally, it seems less of a jump than the 4600 -> 9700.


    And I will reserve judgement on how much of an accomplishment nVidia have made until I see what ATi release.
    If it's of similar power, but maybe has 1 molex, or is a single slot solution, they will have accomplished more.
    It's not just raw performance, we'll have to see how it all stacks up, and how long it takes to release the things!
  • ChronoReverse - Wednesday, April 14, 2004 - link

    Some site tested the 6800U on a 350W supply and it worked just fine.


    Myself, I think my Enermax 350W with its enhanced 12V rail will take it just fine as well.
  • Regs - Wednesday, April 14, 2004 - link

    Yeah, Nvidia did make one hell of an accomplishment. They just earned a lot of respect back from both fan clubs. You have to respect the development and research that went into this card and the end result turns out to be just as we anticipated if not more.

    I really don't know how anybody could pick a "club" when seeing hardware like this perform so well.

    I'm hoping to see the same results from ATI.

    Just too bad they are some costly pieces of hardware ;)
  • araczynski - Wednesday, April 14, 2004 - link

    nice to FINALLY see a universally quantifiable performance increase from one generation to the next.

    but the important thing is how it competes with the x800 from ati, not against older cards.

    as for the power supply, i think the hardcore crowd that these are geared at already have more than enough power, and quite frankly i would be surprised if these wouldn't work fine on a solid 350W from a reputable source (i.e. not your 350W ps for $10 from some 'special' sale).

    They're being conservative knowing that many of the people have crappy power supplies and don't know better.
  • klah - Wednesday, April 14, 2004 - link

    "Anyone know when it ships to retail stores?"

    http://www.eetimes.com/semi/news/showArticle.jhtml...

    "GeForce 6800 Ultra and GeForce 6800 models, are currently shipping to add-in-card partners, OEMs, system builders and game developers. Retail graphics boards based on the GeForce 6800 models are scheduled for release in the next 45 days."
  • Jeff7181 - Wednesday, April 14, 2004 - link

    This has me a bit curious... maybe I didn't read close enough... but is this the 6800 or the 6800 Ultra?
  • saechaka - Wednesday, April 14, 2004 - link

    wow impressive. i really want one. wonder if it will run ok with my 380w power supply
  • Cygni - Wednesday, April 14, 2004 - link

    Personally, I'm very impressed, and I haven't had an Nvidia product in my main gaming rig since my GeForce 256. The card may be huge, power hungry, hot, and loud (maybe), but that is some SERIOUS performance.

    How long has it been since Nvidia has had a top end card that DOUBLED the performance of the last top end card? Pretty awesome, I think. I don't have the money to pick one up, but hopefully the mid/low end gets some love from both ATI and Nvidia as well. The 9200/9600/5200/5600 don't really appeal to me... not enough of a performance leap over a $20 8500!
