Turning on Antialiasing

Quite possibly the biggest difference between Half Life 2 and Doom 3 (other than the fact that Half Life 2 is infinitely brighter) is that aliasing is far more pronounced in Half Life 2. The aliasing isn't overwhelming, and at resolutions of 1280 x 1024 and above it is kept reasonably well under control, but it's still something we'd like to get rid of.

Enabling 2X AA helped somewhat, but not as much as we would have liked, so we turned our attention to enabling 4X AA on the cards compared here today. We haven't included any screenshots inline because they would have to be scaled down to fit on this page; instead, we are offering a zip file of all of the screenshots discussed here.

Our first test was 1024 x 768 with 4X AA enabled. While that combination gave us some very solid image quality, we preferred playing at 1280 x 1024 without AA; on most cards, 1280 x 1024 was only slightly slower than 1024 x 768 with 4X AA.

Next we looked at 1280 x 1024 with 4X AA enabled. Here we found it to be a good alternative to 1600 x 1200, although on most cards 1600 x 1200 actually ran faster than 1280 x 1024 with 4X AA. In the end the choice comes down to whether or not your monitor supports 1600 x 1200: if it does, then by all means run at 1600 x 1200; otherwise, 1280 x 1024 with 4X AA is a good alternative.
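
To put rough numbers on these tradeoffs, here is a back-of-envelope Python sketch (our own illustration, not a model of any particular card) comparing how many pixels get shaded and how many samples get stored in each of the modes discussed in this section. With 4X multisample AA, shading work still scales with the pixel count, but framebuffer storage and bandwidth scale with the sample count:

    # Rough pixel/sample counts for each resolution + AA mode.
    # Shading cost tracks pixels; multisample storage and bandwidth
    # track samples (4 per pixel with 4X MSAA).
    modes = [
        ("1024 x 768 + 4X AA",  1024, 768,  4),
        ("1280 x 1024, no AA",  1280, 1024, 1),
        ("1280 x 1024 + 4X AA", 1280, 1024, 4),
        ("1600 x 1200, no AA",  1600, 1200, 1),
        ("1600 x 1200 + 4X AA", 1600, 1200, 4),
    ]

    for name, w, h, samples in modes:
        pixels = w * h
        print(f"{name:>20}: {pixels:>9,} pixels shaded, "
              f"{pixels * samples:>10,} samples stored")

Note that 1280 x 1024 shades more pixels than 1024 x 768 with 4X AA, which fits the slightly lower frame rates we saw, while 1600 x 1200 stores far fewer samples than 1280 x 1024 with 4X AA (1.92 million vs. 5.24 million), which helps explain why the higher resolution often ran faster.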

Finally we have 1600 x 1200 with 4X AA enabled. This is truly a beautiful setup, and while you can definitely play this way on even a GeForce 6800, it is best paired with a GeForce 6800 Ultra or Radeon X800 XT, or better yet, two GeForce 6800 Ultras. You don't get a much better looking game than Half Life 2 at 1600 x 1200 with 4X AA enabled.

So interestingly enough, although Half Life 2 definitely benefits from antialiasing, in reality the performance hit is just not worth the resulting gains in image quality, especially when compared to simply cranking up the resolution and leaving AA disabled. For those of you who are interested in enabling AA anyway, we have provided some AA benchmarks on the next pages. But before we get to the benchmarks, let's have a look at AA image quality.

First let's look at both ATI and NVIDIA with everything disabled:


Antialiasing Disabled on ATI


Antialiasing Disabled on NVIDIA

So far, so good: both ATI and NVIDIA look identical (except for the birds flying around in the background; regardless of how many breadcrumbs we left out, they would not stay still).
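
If you grab the screenshot zip and want to check for yourself, a difference image makes any mismatch obvious. Here is a minimal sketch using the Pillow imaging library; the file names are placeholders for whichever pair of shots you compare:

    from PIL import Image, ImageChops

    ati = Image.open("ati_noaa.png").convert("RGB")
    nv = Image.open("nvidia_noaa.png").convert("RGB")

    # Per-pixel absolute difference between the two screenshots.
    diff = ImageChops.difference(ati, nv)
    bbox = diff.getbbox()  # None when the difference is zero everywhere

    if bbox is None:
        print("The screenshots are pixel-identical")
    else:
        print("The screenshots differ inside region", bbox)  # e.g. the birds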

Now let's turn on 4X AA:


4X AA Enabled on ATI


4X AA Enabled on NVIDIA

You can immediately see the benefit of having AA turned on in Half Life 2 (these screenshots were taken at 1024 x 768), but let's zoom in for a closer look to see if either card is doing a better job:

ATI (4X AA - 200% Zoom)
NVIDIA (4X AA - 200% Zoom)

From the screenshots above it is tough to tell the difference between the two competitors. NVIDIA's AA implementation may be slightly blurrier than ATI's, but the two are very hard to tell apart.
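
For anyone who wants to make the same comparison with their own screenshots, here is a small Pillow sketch of how crops like the ones above can be produced; the file name and crop coordinates are placeholders. Nearest-neighbor scaling is the important detail, since a smoothing filter would add blur of its own and mask the difference between the two AA implementations:

    from PIL import Image

    shot = Image.open("ati_4xaa.png")
    # Crop box is (left, top, right, bottom); pick a region with hard edges.
    crop = shot.crop((400, 300, 600, 450))
    # Nearest-neighbor keeps the original pixels intact at 200%.
    zoom = crop.resize((crop.width * 2, crop.height * 2), Image.NEAREST)
    zoom.save("ati_4xaa_200pct.png")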

79 Comments


  • Anand Lal Shimpi - Wednesday, November 17, 2004 - link

    Thanks for all of the comments guys. Just so you know, I started on Part 2 the minute the first article was done. I'm hoping to be done with testing by sometime tomorrow and then I've just got to write the article. Here's a list of the new cards being tested:

    9600XT, 9550, 9700, X300, GF 6200, GF 5900XT, GF4 Ti 4600, GF4 MX440

    I'm doing both DX9 and DX8 comparisons, including image quality.

    After Part 2 I think I'll go ahead and do the CPU comparison, although I've been thinking about doing a more investigative type of article into Half Life 2 performance in trying to figure out where its performance limitations exist, so things may get shuffled around a bit.

    We used the PCI Express 6600GT for our tests, but the AGP version should perform quite similarly.

    The one issue I'm struggling with right now is the fact that the X700 XT is still not available in retail, while the X700 Pro (256MB) is. If I have the time I may go back and run some X700 Pro numbers to make this a more realistic present-day comparison.

    Any other requests?

    Take care,
    Anand
  • Cybercat - Wednesday, November 17, 2004 - link

    You guys made my day comparing the X700XT, 6800, and 6600GT together. One question though (and I apologize if this was mentioned in the article and I missed it), did you guys use the PCIe or AGP version of the 6600GT?
  • Houdani - Wednesday, November 17, 2004 - link

    18: Many users rely on hardware review sites to get a feel for what technology is worth upgrading and when.

    Most of us have financial constraints which preclude us from upgrading to the best hardware, therefore we are more interested in knowing how the mainstream hardware performs.

    You are correct that it would not be an efficient use of resources to have AT repeat the tests on hardware that is two or three generations old ... but sampling the previous generation seems appropriate. Fortunately, that's where part 2 will come in handy.

    I expect that part 2 will be sufficient in showing whether or not the previous generation's hardware will be a bottleneck. The results will be invaluable for helping me establish my minimum level of satisfaction for today's applications.
  • stelleg151 - Wednesday, November 17, 2004 - link

    Forget what I said in 34...
  • pio!pio! - Wednesday, November 17, 2004 - link

    So how do you softmod a 6800NU to a 6800GT???
    or unlock the extra stuff....
  • stelleg151 - Wednesday, November 17, 2004 - link

    What drivers were being used here, 4.12 + 67.02??
  • Akira1224 - Wednesday, November 17, 2004 - link

    Jedi

    lol I should have seen that one coming!
  • nastyemu25 - Wednesday, November 17, 2004 - link

    I bought a 9600XT because it came boxed with a free coupon for HL2, and now I can't even see how it matches up :(
  • coldpower27 - Wednesday, November 17, 2004 - link

    These benchmarks are more in line with what I was predicting: the X800 Pro should be equal to the 6800 GT due to similar pixel shader fillrate, while the X800 XT should have an advantage at higher resolutions due to its higher fillrate from being clocked higher.

    Unlike DriverATIheaven:P.

    This is great; I am happy knowing Nvidia's current generation of hardware is very competitive in performance in all aspects at equal amounts of fillrate.
  • Da3dalus - Wednesday, November 17, 2004 - link

    In the 67.02 Forceware driver there's a new option called "Negative LOD bias"; if I understand what I've read correctly, it's supposed to reduce shimmering.

    What was that option set to in the tests? And how did it affect performance, image quality and shimmering?
