Anisotropic, Trilinear, and Antialiasing


There was a great deal of controversy last year over some of the "optimizations" NVIDIA included in some of their drivers. We have visited this issue before, and we won't reopen the debate here, but it is important to note that NVIDIA isn't taking any chances on being called a cheater in the future.

NVIDIA's new driver defaults to the same adaptive anisotropic filtering and trilinear filtering optimizations used in the current 50 series drivers, but users are now able to disable these features. Trilinear filtering optimizations can be turned off (forcing full trilinear filtering all the time), and a new "High Quality" rendering mode turns off adaptive anisotropic filtering. What this means is that anyone who wants (or needs) accurate trilinear and anisotropic filtering can have it. The ability to disable trilinear optimizations is already available in the 56.72 driver.

Unfortunately, it seems that NVIDIA will be switching to a method of calculating anisotropic filtering based on a weighted Manhattan distance calculation. We appreciated that NVIDIA's previous implementation employed a Euclidean distance calculation, which is less sensitive to the orientation of a surface than a weighted Manhattan calculation.
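The difference between the two metrics can be sketched in a few lines of Python. This is an illustrative approximation only: the actual formulas and the weight used in hardware are not public, so the `w=0.5` weight here is an assumption.

```python
import math

def euclidean(dx, dy):
    # Isotropic: gives the same result for any orientation of (dx, dy).
    return math.sqrt(dx * dx + dy * dy)

def weighted_manhattan(dx, dy, w=0.5):
    # Cheaper approximation: the larger component plus a weighted smaller
    # component. The result varies with the angle of (dx, dy), which is why
    # anisotropic quality fluctuates as surfaces rotate on screen.
    big = max(abs(dx), abs(dy))
    small = min(abs(dx), abs(dy))
    return big + w * small

# Rotate a unit-length texture gradient and compare the two metrics.
for deg in (0.0, 22.5, 45.0):
    rad = math.radians(deg)
    dx, dy = math.cos(rad), math.sin(rad)
    print(f"{deg:4.1f} deg  euclidean={euclidean(dx, dy):.3f}  "
          f"manhattan={weighted_manhattan(dx, dy):.3f}")
```

The Euclidean length of a unit vector is 1.0 at every angle, while the weighted Manhattan estimate drifts above 1.0 between the axes, so the amount of filtering applied depends on how the surface is oriented.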


This is how NVIDIA used to do Anisotropic filtering


This is Anisotropic under the 60.72 driver.


This is how ATI does Anisotropic Filtering.


The advantage is that NVIDIA now takes a smaller performance hit when enabling anisotropic filtering, and we will also have a more apples-to-apples comparison when it comes to anisotropic filtering (ATI also uses a weighted Manhattan scheme for its distance calculations). In games where angled, textured surfaces rotate around the z-axis (the axis that comes "out" of the monitor) in a 3D world, both ATI and NVIDIA will show the same fluctuations in anisotropic rendering quality. We would have liked to see ATI alter its implementation rather than NVIDIA, but there is something to be said for both companies doing the same thing.

We had a little time to play with the D3D AF Tester that we used in last year's image quality article. We can confirm that turning off the trilinear filtering optimizations results in full trilinear filtering being performed all the time. Previously, neither ATI nor NVIDIA did this much trilinear filtering; check out the screenshots.


Trilinear optimizations enabled.


Trilinear optimizations disabled.
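The difference the screenshots show can be sketched roughly as follows. The `band` width and the blend formula are assumptions for illustration, not NVIDIA's actual implementation; the point is that the optimization collapses the mip-level blend to pure bilinear except near mip transitions.

```python
def trilinear_weight(lod):
    # Full trilinear: the blend fraction between the two nearest mip levels
    # is simply the fractional part of the LOD, everywhere.
    return lod - int(lod)

def optimized_weight(lod, band=0.3):
    # "Brilinear"-style optimization (illustrative only): snap to pure
    # bilinear except within a narrow band around each mip transition,
    # where a compressed, steeper blend is performed.
    frac = lod - int(lod)
    lo, hi = 0.5 - band / 2, 0.5 + band / 2
    if frac < lo:
        return 0.0               # pure bilinear on the nearer mip level
    if frac > hi:
        return 1.0               # pure bilinear on the farther mip level
    return (frac - lo) / band    # steep blend inside the transition band

for lod in (2.1, 2.4, 2.5, 2.6, 2.9):
    print(lod, round(trilinear_weight(lod), 2), round(optimized_weight(lod), 2))
```

With full trilinear the blend weight ramps smoothly across the whole mip range, which is what the "optimizations disabled" screenshot shows; the optimized path leaves most of each mip level unblended, producing the visible banding.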


When comparing "Quality" mode to "High Quality" mode we didn't observe any difference in the anisotropic rendering fidelity. Of course, this is still a beta driver, so everything might not be doing what it's supposed to be doing yet. We'll definitely keep on checking this as the driver matures. For now, take a look.


Quality Mode.


High Quality Mode.


On a very positive note, NVIDIA has finally adopted a rotated grid antialiasing scheme. Here we can take a glimpse at what the new method does for rendering quality in Jedi Knight: Jedi Academy.


Jedi Knight without AA


Jedi Knight with 4x AA
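A quick sketch of why a rotated grid helps on the near-vertical and near-horizontal edges visible above. The sample positions here are illustrative, not the actual hardware locations; the point is how many distinct coverage steps each pattern yields on an axis-aligned edge.

```python
# 4x sample offsets within a pixel (fractions of pixel width/height).
# Illustrative positions only; real hardware sample locations differ.
ordered_4x = [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]
rotated_4x = [(0.375, 0.125), (0.875, 0.375), (0.625, 0.875), (0.125, 0.625)]

def distinct_columns(samples):
    # As a near-vertical edge sweeps across a pixel, each distinct x
    # position adds one intermediate coverage (shading) step.
    return len({x for x, _ in samples})

print("ordered grid steps:", distinct_columns(ordered_4x))  # 2 steps
print("rotated grid steps:", distinct_columns(rotated_4x))  # 4 steps
```

An ordered 2x2 grid wastes half its samples on near-vertical (or near-horizontal) edges because pairs of samples share the same x (or y) position; rotating the grid gives all four samples distinct positions on both axes, doubling the gradient steps exactly where aliasing is most visible.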


It's nice to finally see such smooth near-vertical and near-horizontal lines from a graphics company other than ATI. Of course, ATI has yet to throw its offering into the ring, and it is very possible that they've raised their own bar for filtering quality.
77 Comments

  • Da3dalus - Thursday, April 15, 2004 - link

    I'd like to see benchmarks of Painkiller in the upcoming NV40 vs R420 tests...
  • Brickster - Thursday, April 15, 2004 - link

    Am I the only one who thinks Nvidia's Nalu is the MOST bone-able cartoon out there?

    Oy, get the KY!
  • Warder45 - Thursday, April 15, 2004 - link

    Did any reviews try and overclock the card? Is it not possible with the test card?
  • DonB - Thursday, April 15, 2004 - link

    Would have been better if it had a coax cable TV input + TV tuner. For $500, I would expect a graphic card to include EVERYTHING imaginable.
  • Pete - Thursday, April 15, 2004 - link

    Shinei #37,

    "Speaking of DX9/PS2.0, what about a Max Payne 2 benchmark?"

MP2 doesn't use DX9 effects. The game requires DX9 compatibility, but only DX8 compliance for full effects.

    Xbit-Labs has a ton of benches of next-gen titles as well, and is worth checking out. NV40 certainly redeems itself in the HL2 leak. :)
  • Wwhat - Thursday, April 15, 2004 - link

    Anybody happen to know if it's possible to use a second (old) PSU to run it, you can pick up cheap 235 watt PSU's and would be helped with both extra connectors and power.
    I'm not sure it won't cause 'sync' problems though as a small difference between the rails of 2 PSU's would cause one to drain the other if the card's connectors aren't decoupled enough from the AGP port.



  • Pumpkinierre - Thursday, April 15, 2004 - link

Agree with you Trog #59 on the venting. Also, with DX9.0c having fp32 as spec, does this mean that FX series cards redeem themselves? (As the earlier DX9 spec was fp24, which wasn't present on the FX GPUs, causing a juggling act between fp16 and fp32 to match performance and IQ). Still, full fp32 on the FX cards might be too slow.
  • mrprotagonist - Thursday, April 15, 2004 - link

    What's with all the cheesy comments before the benchmarks? Anyone?
  • Cygni - Thursday, April 15, 2004 - link

    "what mobo and mobo drivers were used? i hear that the nforce2 provides an unfair performance advantage for nvidia"

    The test was on an Athlon 64 3400+ system, so i doubt it was using an Nforce2. But ya, i agree, the system specs were short. More details are required.
  • Brickster - Wednesday, April 14, 2004 - link

    Derek, what was that Monitor you used?

    Thanks!
