Texture Filtering Image Quality

Texture filtering is always a hot topic when a new GPU is introduced. For the past few years, every new architecture has had a new take on where and how to optimize texture filtering. The community is also very polarized, and people can get really fired up over whether this or that company is performing an optimization that degrades the user's experience.

The problem is that all of 3D graphics is an optimization problem. If GPUs were built to render every detail of every scene without any optimization, we would be looking at seconds per frame rather than frames per second. Despite this, the highest quality texture filtering available is a great place from which to start working our way down to what most people will actually use.

The good news is that G80 completely eliminates angle dependent anisotropic filtering. Finally, we have a return to GeForce FX quality anisotropic filtering. When stacked up against R580 High Quality AF with no optimizations enabled on either side (High Quality mode for NVIDIA, Catalyst AI Disabled for ATI), G80 definitely shines. On the left, we can see that at 8xAF NVIDIA's new architecture is able to more accurately filter textures based on distance from and angle to the viewer. On the right, we see ATI's angle independent 16xAF degrade in quality to the point where different texture stages start bleeding into one another in undesirable ways.



[Image comparison: ATI vs. G80 anisotropic filtering quality]
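
To illustrate what angle dependence means in this context, here is a toy sketch of how a filtering pipeline could cut corners based on surface orientation. This is an illustrative assumption about the general technique, not how NVIDIA's or ATI's hardware actually chooses sample counts; the function names and the cos(4θ) falloff are invented for the example.

```c
#include <math.h>

/* Toy model: angle dependent AF grants full anisotropy only to
 * surfaces near the "friendly" axes, producing the familiar
 * rose-shaped quality pattern. Purely illustrative. */
int af_samples_angle_dependent(float surface_angle_rad, int max_aniso)
{
    /* cos(4*theta) peaks at 0, 45, and 90 degrees and dips between
     * them, so off-axis surfaces get fewer samples. */
    float axis_alignment = fabsf(cosf(4.0f * surface_angle_rad));
    int samples = (int)(max_aniso * (0.5f + 0.5f * axis_alignment));
    return samples < 2 ? 2 : samples;
}

/* Angle independent AF, as on G80: orientation no longer matters. */
int af_samples_angle_independent(float surface_angle_rad, int max_aniso)
{
    (void)surface_angle_rad;
    return max_aniso;  /* e.g. 16 for 16xAF */
}
```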

Oddly enough, ATI's 16xAF is more likely to cause shimmering with the High Quality AF box checked than without it. Even when looking at an object like a flat floor, we can see the issue pop up in the D3DAFTester. NVIDIA has been battling shimmering issues due to some of their optimizations over the past year or so, but those issues could be avoided through driver settings. There isn't really a way to "fix" ATI's 16x high quality AF issue.



[Image comparison: ATI Normal Quality AF vs. ATI High Quality AF]

But we would rather have angle independent AF than not, so for the rest of this review we will enable High Quality AF on ATI hardware. This gives us a fairer comparison to G80, even if we still aren't quite comparing apples to apples. G70 is not able to enable angle independent AF, so we'll be stuck with the rose pattern we've become so familiar with over the past few years.

There is still the question of how much impact optimizations have on texture filtering. With G70, disabling optimizations resulted in more trilinear filtering being done, and thus a potential performance decrease. The visual difference is minimal in most cases, as trilinear filtering is only really necessary to smooth the transition between mipmap levels on a surface.
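
Since the purpose of trilinear filtering comes up repeatedly here, a minimal sketch may help: trilinear is simply a linear blend between bilinear samples taken from the two nearest mipmap levels, which hides the seam where one level hands off to the next. The bilinear_sample stub below is a hypothetical stand-in for a real texel fetch.

```c
#include <math.h>

/* Hypothetical stand-in for a bilinear fetch from one mipmap level;
 * a real implementation would read and blend four texels. */
static float bilinear_sample(int mip_level, float u, float v)
{
    (void)u; (void)v;
    return (float)mip_level;  /* placeholder so the sketch runs */
}

/* Trilinear filtering: blend the two mipmap levels that bracket the
 * computed level of detail, so the transition between them is a
 * smooth gradient rather than a visible line on the surface. */
static float trilinear_sample(float lod, float u, float v)
{
    int   mip0 = (int)floorf(lod);   /* the more detailed level */
    float frac = lod - (float)mip0;  /* how far toward the next level */

    float a = bilinear_sample(mip0,     u, v);
    float b = bilinear_sample(mip0 + 1, u, v);
    return a + frac * (b - a);       /* linear blend across the seam */
}
```

Turning optimizations off amounts to doing this full two-fetch blend everywhere, which is exactly where the potential performance cost comes from.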



[Image comparison: G70 Normal Quality AF vs. G70 High Quality AF]

On G80, we see a similar effect when comparing default quality to high quality. Of course, with angle independent anisotropic filtering, shimmering is much less of a worry in the first place, so optimizations shouldn't cause any issues here. Default quality does show a difference in the amount of trilinear filtering being applied, but this does not negatively impact visual quality in practice.



[Image comparison: G80 Normal Quality AF vs. G80 High Quality AF]
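
As a rough sketch of how this class of trilinear optimization tends to work (our assumption about the general technique, not G80's actual logic), the driver can shrink the window of LOD fractions over which the full two-level blend happens, so most pixels fall back to a single, cheaper bilinear fetch.

```c
/* Remap the LOD fraction so only a narrow band around the mip
 * transition gets the full two-level blend ("brilinear" style).
 * blend_width = 1.0 is full trilinear; smaller values snap more
 * of the surface to a single mip level. Illustrative only. */
float optimized_lod_fraction(float frac, float blend_width)
{
    float lo = 0.5f - 0.5f * blend_width;
    float hi = 0.5f + 0.5f * blend_width;

    if (hi <= lo)   return frac < 0.5f ? 0.0f : 1.0f;
    if (frac <= lo) return 0.0f;  /* stay on the detailed level */
    if (frac >= hi) return 1.0f;  /* stay on the coarser level */
    return (frac - lo) / (hi - lo);
}
```

Wherever the remapped fraction lands at exactly 0 or 1, only one bilinear fetch is needed; that is the source of the performance savings, and, when the band is made too narrow, of visible mip transitions.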

Comments

  • dwalton - Thursday, November 9, 2006 - link

    When using older cards, sacrificing IQ for performance is typically acceptable. Who needs AA when running F.E.A.R. on a 9700 Pro?

    However, on a just-launched high-end card, why would anyone feel the need to sacrifice IQ for performance? Some may say resolution over AA, but I find it hard to believe that there are a lot of gaming enthusiasts with deep pockets who play at insane resolutions yet use no AA.
  • JarredWalton - Thursday, November 9, 2006 - link

    If I look for jaggies, I see them. On most games, however, they don't bother me much at all. Running at native resolution on LCDs or at a really high resolution on CRTs, I'd take that over a lower res with 4xAA. If you have the power to enable 4xAA, great, but I'm certainly not one to suggest it's required. I'd rather be able to enable vsync without a massive performance hit (i.e. stay above 60 FPS) than worry about jaggies. Personal preference.
  • munim - Wednesday, November 8, 2006 - link

    "With the latest 1.09 patch, F.E.A.R. has gained multi-core support,"

    Where is this?
  • JarredWalton - Wednesday, November 8, 2006 - link

    I wrote that, but it may be incorrect. I'm trying to get in contact with Gary to find out if I'm just being delusional about Quad Core support. Maybe it's NDA still? Hmmm.... nothing to see here!
  • JarredWalton - Wednesday, November 8, 2006 - link

    Okay, it's the 1.08 patch, and that is what was tested. Since we didn't use a quad core CPU I don't know if it will actually help or not -- something to look at in the future.
  • Nelsieus - Wednesday, November 8, 2006 - link

    I haven't even finished reading it yet, but so far, this is the most comprehensive, in-depth review I've seen on G80 and I just wanted to mention that beforehand.

    :)
  • GhandiInstinct - Wednesday, November 8, 2006 - link

    What upcoming games will be the first to be fully made on DX10 structure? And does the G80 have full support of DX10?
  • timmiser - Thursday, November 9, 2006 - link

    Microsoft Flight Simulator X will be DX10 compliant via a planned patch once Vista comes out.
  • JarredWalton - Wednesday, November 8, 2006 - link

    All DX10 hardware will be full DX10 (see pages 2-4). As for games that will be DX10 ready, Halo 2 for Vista will be for sure. Beyond that... I don't know for sure. As we've explained a bit, DX10 will require Vista, so anything launching before Vista will likely not be DX10 compliant.
  • shabby - Wednesday, November 8, 2006 - link

    They're redoing a DX8 game in DX10? You gotta be kidding me, what's the point? You can't polish a turd.
