Anisotropic, Trilinear, and Antialiasing


There was a great deal of controversy last year over some of the "optimizations" NVIDIA included in some of their drivers. We have visited this issue before, and we won't reopen the debate here, but it is important to note that NVIDIA isn't taking any chances on being called a cheater in the future.

NVIDIA's new driver defaults to the same adaptive anisotropic filtering and trilinear filtering optimizations they are currently using in the 50 series drivers, but users are now able to disable these features. Trilinear filtering optimizations can be turned off (forcing full trilinear all the time), and a new "High Quality" rendering mode turns off adaptive anisotropic filtering. What this means is that anyone who wants (or needs) accurate trilinear and anisotropic filtering can have it. The option to disable trilinear optimizations is already available in the 56.72 drivers.

Unfortunately, it seems that NVIDIA will be switching to a method of calculating anisotropic filtering based on a weighted Manhattan distance calculation. We appreciated the fact that NVIDIA's previous implementation employed a Euclidean distance calculation, which is less sensitive to the orientation of a surface than a weighted Manhattan calculation is.
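
To make the distinction concrete, here is a minimal C sketch of the two distance metrics. The per-axis weights are purely illustrative; the weights the actual hardware uses are not public.

    #include <math.h>

    /* Euclidean distance: gives the same result for a vector of a given
       length no matter which way it points. */
    float dist_euclidean(float dx, float dy)
    {
        return sqrtf(dx * dx + dy * dy);
    }

    /* Weighted Manhattan distance: no square root, so it is cheaper in
       hardware, but the result depends on how the vector is oriented
       relative to the axes. The weights wx and wy are hypothetical. */
    float dist_weighted_manhattan(float dx, float dy, float wx, float wy)
    {
        return wx * fabsf(dx) + wy * fabsf(dy);
    }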


This is how NVIDIA used to do anisotropic filtering.


This is anisotropic filtering under the 60.72 driver.


This is how ATI does anisotropic filtering.


The advantage is that enabling anisotropic filtering now has a lower performance impact on NVIDIA hardware, and we will also have a more apples-to-apples comparison when it comes to anisotropic filtering (ATI also uses a weighted Manhattan scheme for its distance calculations). In games where angled, textured surfaces rotate around the z-axis (the axis that comes "out" of the monitor) in a 3D world, both ATI and NVIDIA will show the same fluctuations in anisotropic rendering quality. We would rather have seen ATI alter its implementation than NVIDIA, but there is something to be said for both companies doing the same thing.
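
A quick way to see those fluctuations is to rotate a unit-length direction through 90 degrees and evaluate both metrics. This little C sketch (using an unweighted Manhattan distance for simplicity) prints a constant Euclidean length, but a Manhattan length that swells to about 1.414 at 45 degrees:

    #include <math.h>
    #include <stdio.h>

    int main(void)
    {
        const float pi = 3.14159265f;
        for (int deg = 0; deg <= 90; deg += 15) {
            float a  = (float)deg * pi / 180.0f;
            float dx = cosf(a), dy = sinf(a);
            float euclid = sqrtf(dx * dx + dy * dy); /* always 1.0 */
            float manhat = fabsf(dx) + fabsf(dy);    /* peaks at 45 degrees */
            printf("%2d deg: euclidean=%.3f  manhattan=%.3f\n",
                   deg, euclid, manhat);
        }
        return 0;
    }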

We had a little time to play with the D3D AF Tester that we used in last year's image quality article. We can confirm that turning off the trilinear filtering optimizations results in full trilinear being performed all the time. Previously, neither ATI nor NVIDIA did this much trilinear filtering; check out the screenshots.


Trilinear optimizations enabled.


Trilinear optimizations disabled.
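
The difference between the two screenshots comes down to how far the blend between adjacent mip levels extends. Here is a rough C sketch of the idea, assuming a and b are the bilinear samples from mip levels N and N+1 and frac is the fractional LOD; the half_band parameter is illustrative, as the driver's real threshold is not documented:

    /* Full trilinear: blend the two mip levels across the entire
       transition range. */
    float trilinear_full(float a, float b, float frac)
    {
        return a + frac * (b - a);
    }

    /* The optimization (often called "brilinear"): only blend inside a
       narrow band around the transition; everywhere else a single mip
       level is returned, which is just bilinear filtering. */
    float trilinear_optimized(float a, float b, float frac, float half_band)
    {
        float t = (frac - (0.5f - half_band)) / (2.0f * half_band);
        if (t <= 0.0f) return a;
        if (t >= 1.0f) return b;
        return a + t * (b - a);
    }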


When comparing "Quality" mode to "High Quality" mode, we didn't observe any difference in anisotropic rendering fidelity. Of course, this is still a beta driver, so everything might not yet be doing what it's supposed to do. We'll definitely keep checking this as the driver matures. For now, take a look.


Quality Mode.


High Quality Mode.


On a very positive note, NVIDIA has finally adopted a rotated grid antialiasing scheme. Here we can take a glimpse at what the new method does for their rendering quality in Jedi Knight: Jedi Academy.


Jedi Knight without AA.


Jedi Knight with 4x AA.


It's nice to finally see such smooth near-vertical and near-horizontal lines from a graphics company other than ATI. Of course, ATI has yet to throw its offering into the ring, and it is very possible that they've raised their own bar for filtering quality.
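
The reason a rotated grid helps on exactly those edges is visible in the sample positions. A near-vertical edge sweeps across a pixel horizontally, so what matters is how many distinct x-offsets the samples cover: an ordered 2x2 grid has only two, while a rotated grid gives every sample a unique x and a unique y. The offsets below are an illustrative rotated pattern, not NVIDIA's actual (undisclosed) positions:

    /* Per-pixel sample offsets for 4x AA (the pixel spans [0,1] in x and y). */

    /* Ordered grid: two distinct x-offsets and two distinct y-offsets, so
       near-vertical and near-horizontal edges get only two intensity
       steps per pixel. */
    static const float ordered4x[4][2] = {
        {0.25f, 0.25f}, {0.75f, 0.25f},
        {0.25f, 0.75f}, {0.75f, 0.75f},
    };

    /* Rotated grid: every sample has a unique x and a unique y offset,
       yielding four intensity steps on those same edges. */
    static const float rotated4x[4][2] = {
        {0.375f, 0.125f}, {0.875f, 0.375f},
        {0.125f, 0.625f}, {0.625f, 0.875f},
    };
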
Comments

  • Pete - Monday, April 19, 2004 - link

    Shinei,

    I did not know that. </Johnny Carson>

    Derek,

    I think it'd be very helpful if you listed the game version (you know, what patches have been applied) and the map tested, for easier reference. I don't think you even mentioned the driver version used on each card, which is quite important given the constant updates and fixes.

    Something to think about ahead of the X800 deadline. :)
  • zakath - Friday, April 16, 2004 - link

    I've seen a lot of comments on the cost of these next-gen cards. This shouldn't surprise anyone...it has always been this way. The market for these new parts is small to begin with. The best thing the next gen does for the vast majority of us non-fanbois-who-have-to-have-the-bleeding-edge-part is that it brings *today's* cutting-edge parts into the realm of affordability.
  • Serp86 - Friday, April 16, 2004 - link

    Bah! My almost 2-year-old 9700 Pro is good enough for me now. I think I'll wait for NV50/R500....

    Also, a better investment for me is a new monitor, since the 17" one I have only supports 1280x1024, and I never turn it up that high anyway; the 60Hz refresh rate makes me go crazy.
  • Wwhat - Friday, April 16, 2004 - link

    That was to brickster; I neglected to mention that.
  • Wwhat - Friday, April 16, 2004 - link

    Yes you are alone
  • ChronoReverse - Thursday, April 15, 2004 - link

    Ahem, this card has been tested by some people with a high-quality 350W power supply, and it was just fine.

    Considering that anyone who could afford a 6800U would have a good power supply (Thermaltake, Antec, or Enermax), it really doesn't matter.

    The 6800NU uses only one molex.
  • deathwalker - Thursday, April 15, 2004 - link

    Oh my god...$400 and you can't even put it in 75% of the systems on people's desks today without buying a new power supply at a cost of nearly another $100 for a quality PS...I think this just about has to push all the fanatics out there over the limit...no way in hell you're going to notice the performance improvement in a multiplayer game over a network...when does this madness stop?
  • Justsomeguy21 - Monday, November 29, 2021 - link

    LOL, this was too funny to read. Complaining about a bleeding-edge graphics card costing $400 is utterly ridiculous in the year 2021 (almost 2022). You can barely get a midrange card for that price, and that's assuming you're paying MSRP and not scalper prices. 2004 was a great year for PC gaming; granted, today's smartphones can run circles around a GeForce 6800 Ultra, but for the time, PC hardware was being pushed to its limits, and games like Doom 3, Far Cry, and Half-Life 2 felt so next-gen that console games wouldn't catch up for a few years.
  • Shinei - Thursday, April 15, 2004 - link

    Pete, MP2 DOES use DX9 effects; mirrors are disabled unless you have a PS2.0-capable card. I'm not sure why, since AvP1 (a DX7 game) had mirrors, but it does nonetheless. I should know, since my Ti4200 (DX8.1 compatible) doesn't render mirrors as reflective even though I checked the box in the options menu to enable them...
    Besides, it does have some nice graphics that can bog a card down at higher resolutions/AA settings. I'd love to see what the game looks like at 2048x1536 with 4xAA and maxed AF with a triple buffer... Or even a more comfortable 1600x1200 with the same graphical settings. :D
