Turning on Antialiasing

Quite possibly the biggest difference between Half Life 2 and Doom 3 (other than the fact that Half Life 2 is infinitely brighter) is that aliasing is far more pronounced in Half Life 2 than in Doom 3.  The aliasing isn't overwhelming, and at resolutions of 1280 x 1024 and above it is kept in check reasonably well, but it's still something we'd like to get rid of.

Enabling 2X AA helped somewhat, but not as much as we would have liked, so we turned our attention to enabling 4X AA on the cards compared here today. We haven't included any screenshots inline because they would have to be scaled down to fit on this page; instead, we are offering a zip file of all of the screenshots we discuss here.

Our first test was 1024 x 768 with 4X AA enabled. While this setting gave us some very solid image quality, we preferred playing at 1280 x 1024 without AA; on most cards, 1280 x 1024 was only slightly slower than 1024 x 768 with 4X AA.

Next, we looked at 1280 x 1024 with 4X AA enabled. Here, we found it to be a good alternative to 1600 x 1200, although on most cards 1600 x 1200 actually ran faster than 1280 x 1024 with 4X AA. In the end, the choice comes down to whether or not your monitor supports 1600 x 1200: if it does, then by all means, 1600 x 1200 is the resolution to run at; otherwise, 1280 x 1024 with 4X AA is a good alternative.

Finally, we have 1600 x 1200 with 4X AA enabled. This is truly a beautiful setup, and while you can definitely play it on even a GeForce 6800, it is best paired with a GeForce 6800 Ultra or Radeon X800 XT, or, better yet, two GeForce 6800 Ultras.  You don't get a much better looking game than Half Life 2 at 1600 x 1200 with 4X AA enabled.

Interestingly enough, although Half Life 2 definitely appreciates having antialiasing enabled, in reality the performance hit is just not worth the resulting gains in image quality, especially when compared to simply cranking up the resolution and leaving AA disabled.  For those of you who are interested in enabling AA anyway, we have provided some AA benchmarks on the next pages. But before we get to the benchmarks, let's take a look at AA image quality.
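For readers who want to experiment on their own systems, AA can also be toggled outside the in-game video options. The fragment below is a minimal sketch, assuming your build of the game exposes the standard Source material-system console variables (`mat_antialias` in particular); the exact set of accepted values depends on your card and drivers:

```
// autoexec.cfg -- sketch using Source engine convars
// (values here are assumptions; adjust for your card)
mat_antialias 4      // 0 = AA off, 2/4/6/8 = MSAA sample count, if supported
mat_vsync 0          // leave vsync off while benchmarking
```

Note that a `mat_antialias` change may require the video mode (or the game) to be restarted before it takes effect.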

First let's look at both ATI and NVIDIA with everything disabled:


Antialiasing Disabled on ATI


Antialiasing Disabled on NVIDIA

So far, so good: both ATI and NVIDIA look identical (except for the birds flying around in the background; no matter how many breadcrumbs we left out, they would not stay still).

Now let's turn on 4X AA:


4X AA Enabled on ATI


4X AA Enabled on NVIDIA

You can immediately see the benefit of having AA turned on in Half Life 2 (these screenshots were taken at 1024 x 768), but let's zoom in for a closer look to see if either card is doing a better job:

ATI (4X AA - 200% Zoom)
NVIDIA (4X AA - 200% Zoom)

From the screenshots above, it is tough to tell the two competitors apart. NVIDIA's AA implementation may be slightly blurrier than ATI's, but the difference is marginal at best.


Comments

  • ballero - Wednesday, November 17, 2004 - link

    It'd be nice to see a comparison between CPUs.
  • Jalf - Wednesday, November 17, 2004 - link

    To those wanting benchmarks on older hardware, remember that this is a hardware site, not a games review site.

    Their focus is on the hardware, and honestly, few hardware enthusiasts can get excited about an 800MHz CPU or a GeForce 3. ;)

    For AT, HL2 is a tool to compare new *interesting* hardware. It's not the other way around.
  • CU - Wednesday, November 17, 2004 - link

    I would also like to see slower CPUs and 512MB systems tested. It seems that all recent cards can run it fine, so it would be nice to see how other things affect HL2.
  • CU - Wednesday, November 17, 2004 - link

    Based on the 6800nu vs. 6600GT results, I would say that HL2 is being limited by fillrate and not bandwidth. I say this since they both have about the same fillrate, but the 6800nu has around 40% more bandwidth than the 6600GT. So, unlocking extra pipes and overclocking the GPU should give the biggest increase in fps. Anyone want to test this?
  • Jeff7181 - Wednesday, November 17, 2004 - link

    ... in addition... this is a case where minimum frame rates would be very useful to know.
  • Jeff7181 - Wednesday, November 17, 2004 - link

    Those numbers are about what I expected. I suspect that triple buffering isn't working with the 66.93 drivers and HL2, because with vsync enabled, the frame rate seems to be either 85 or 42.

    I also suspected that anisotropic filtering wasn't particularly necessary... I'll have to try it without and see how it looks... although with 4X AA and 8X AF, I'm still getting acceptable frame rates.
  • nserra - Wednesday, November 17, 2004 - link

    #8 I've never heard of the 6800's extra pipes being unlocked; where did you see that? Aren't you confusing it with the ATI 9500 cards?
  • MAME - Wednesday, November 17, 2004 - link

    Make some budget video card benchmarks (Ti4200, plus or minus) and possibly a slower CPU or less RAM, so that people will know if they have to upgrade.
  • Akira1224 - Wednesday, November 17, 2004 - link

    #8 That's not a fair comparison. Yes, at the moment it would seem the 6800nu is a better buy. However, if you go to Gameve, you will find the XFX 6600GT (clocked at PCIe speeds) for $218. That's a much better deal than your example using Newegg. You talk about a $5 difference; if you are a smart shopper, you can get upwards of a $50 difference.

    THAT makes the 6600GT the better buy, especially when you consider that the market this card is aimed at is not the same market that will softmod their cards to unlock pipes. Either way you go, you will get great performance.

    I digress off topic.... sorry.
  • nserra - Wednesday, November 17, 2004 - link

    You didn't use overclocked NVIDIA cards like HardOCP did. Kyle even has the nerve to say he used stock clocks, but those BFG OC cards are overclocked from the factory. Just 25MHz, but it's something.

    Very good review!!! Better than the NVIDIA GeForce 6600GT AGP review, where something was missing.
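The fillrate-versus-bandwidth reasoning in CU's comment above is easy to sanity-check. The sketch below uses commonly cited reference specs for the two cards (core clocks, pipe counts, and memory clocks are assumptions here, not figures taken from this review's test samples):

```python
# Back-of-the-envelope check of the fillrate-vs-bandwidth claim for the
# GeForce 6600GT and GeForce 6800 (non-Ultra), using assumed reference specs.

def fillrate_mpix(core_mhz, pixel_pipes):
    """Peak pixel fillrate in Mpixels/s: core clock x pixel pipelines."""
    return core_mhz * pixel_pipes

def bandwidth_gbs(mem_mhz_effective, bus_width_bits):
    """Peak memory bandwidth in GB/s: effective clock x bus width / 8 bits."""
    return mem_mhz_effective * bus_width_bits / 8 / 1000

# 6600GT: 500 MHz core, 8 pipes, 1000 MHz effective memory on a 128-bit bus
gt6600_fill = fillrate_mpix(500, 8)       # 4000 Mpix/s
gt6600_bw   = bandwidth_gbs(1000, 128)    # 16.0 GB/s

# 6800nu: 325 MHz core, 12 pipes, 700 MHz effective memory on a 256-bit bus
nu6800_fill = fillrate_mpix(325, 12)      # 3900 Mpix/s
nu6800_bw   = bandwidth_gbs(700, 256)     # 22.4 GB/s

print(f"fillrate:  {gt6600_fill} vs {nu6800_fill} Mpix/s (about the same)")
print(f"bandwidth: {gt6600_bw:.1f} vs {nu6800_bw:.1f} GB/s "
      f"({nu6800_bw / gt6600_bw - 1:.0%} more on the 6800nu)")
```

With these assumed specs, the two cards land within a few percent on fillrate while the 6800nu carries roughly 40% more bandwidth, which is consistent with the comment's reasoning.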
