Turning on Antialiasing

Quite possibly the biggest difference between Half Life 2 and Doom 3 (other than the fact that Half Life 2 is infinitely brighter) is that aliasing is far more pronounced in Half Life 2.  The aliasing isn't overwhelming, and at resolutions of 1280 x 1024 and above it is minimized relatively well, but it's still something we'd like to get rid of.

Enabling 2X AA helped somewhat, but not as much as we would have liked, so we turned our attention to enabling 4X AA on the cards compared here today.  We haven't included any screenshots inline because they would have to be scaled down to fit on this page; instead, we are offering a zip file of all of the screenshots discussed here.
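As an aside for readers who want to experiment on their own, AA can also be toggled from Source's developer console. The fragment below is a hedged sketch: mat_antialias is the convar we believe controls multisampling in the Source engine, and the file name and values are illustrative rather than a record of how we configured our test systems.

```
// Hypothetical autoexec.cfg fragment. Assumption: the mat_antialias
// convar controls multisampling in the Source engine (0 = off,
// 2 = 2X, 4 = 4X). Driver control panel settings can still override
// whatever the game requests.
mat_antialias 4
```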

Our first test was 1024 x 768 with 4X AA enabled.  While that combination gave us some very solid image quality, we preferred playing at 1280 x 1024 without AA; most cards gave up only a little performance running at 1280 x 1024 versus 1024 x 768 with 4X AA.

Next we looked at 1280 x 1024 with 4X AA enabled.  Here we found it to be a good alternative to 1600 x 1200, although on most cards 1600 x 1200 actually ran faster than 1280 x 1024 with 4X AA.  In the end, the choice comes down to whether or not your monitor supports 1600 x 1200: if it does, then by all means 1600 x 1200 is the resolution to run at; otherwise, 1280 x 1024 with 4X AA is a good alternative.

Finally, we have 1600 x 1200 with 4X AA enabled.  This is truly a beautiful setup, and while you can definitely play it on even a GeForce 6800, it is best paired with a GeForce 6800 Ultra, a Radeon X800 XT or, better yet, two GeForce 6800 Ultras.  You don't get a much better looking game than Half Life 2 at 1600 x 1200 with 4X AA enabled.

Interestingly enough, although Half Life 2 definitely appreciates having antialiasing enabled, in reality the performance hit is just not worth the resulting gains in image quality, especially when compared to simply cranking up the resolution and leaving AA disabled.  For those of you who are interested in enabling AA anyway, we have provided some AA benchmarks on the next pages.  But before we get to the benchmarks, let's have a look at AA image quality.
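To make the tradeoff concrete, here is a small Python sketch of ordered-grid supersampling, a simplified stand-in for the hardware's 4X multisampling (the half-plane "scene" and the coverage function are our own illustration, not anything from the Source engine). Pixels straddling an edge get intermediate shades instead of jaggies, but every antialiased pixel is computed from four sub-samples instead of one, which is where the performance hit comes from:

```python
# Minimal sketch of 4X ordered-grid supersampling along a diagonal edge.
# Illustrative assumption: the "scene" is the half-plane x > y, rendered
# white (1.0) on black (0.0). Real MSAA hardware is cleverer about shading
# cost, but the averaging idea is the same.

def coverage(px, py, samples_per_axis):
    """Fraction of sub-samples inside the half-plane x > y for pixel (px, py)."""
    n = samples_per_axis
    hits = 0
    for sy in range(n):
        for sx in range(n):
            # sample positions at sub-pixel centres
            x = px + (sx + 0.5) / n
            y = py + (sy + 0.5) / n
            if x > y:
                hits += 1
    return hits / (n * n)

# 1 sample per pixel (no AA): each pixel is fully on or off, a hard jaggy step
no_aa = [coverage(x, 2, 1) for x in range(5)]   # [0.0, 0.0, 0.0, 1.0, 1.0]

# 4X AA (2x2 grid): the pixel straddling the edge gets an intermediate shade,
# at the cost of 4x as many samples per pixel
aa_4x = [coverage(x, 2, 2) for x in range(5)]   # [0.0, 0.0, 0.25, 1.0, 1.0]
```

Cranking up the resolution instead simply shrinks each pixel, which is why 1600 x 1200 without AA can look (and run) better than a lower resolution with the extra per-pixel sampling work.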

First let's look at both ATI and NVIDIA with everything disabled:


Antialiasing Disabled on ATI


Antialiasing Disabled on NVIDIA

So far, so good: the ATI and NVIDIA shots look identical (except for the birds flying around in the background, but regardless of how many breadcrumbs we left out, they would not stay still).

Now let's turn on 4X AA:


4X AA Enabled on ATI


4X AA Enabled on NVIDIA

You can immediately see the benefit of having AA turned on in Half Life 2 (these screenshots were taken at 1024 x 768), but let's zoom in for a closer look to see if either card is doing a better job:

ATI (4X AA - 200% Zoom)
NVIDIA (4X AA - 200% Zoom)

From the screenshots above, it is tough to tell the two competitors apart.  NVIDIA's AA implementation may be slightly blurrier than ATI's, but the difference is marginal at best.
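For anyone inspecting the zipped screenshots themselves, the 200% crops above are best reproduced with nearest-neighbour scaling, which duplicates pixels rather than resampling them, so the AA gradients can be examined exactly as rendered. A minimal pure-Python sketch, operating on a toy greyscale grid rather than the actual screenshot data:

```python
# Nearest-neighbour upscaling by an integer factor: every pixel is
# duplicated factor x factor times, with no filtering or resampling,
# so edge gradients survive the zoom untouched.

def zoom_nearest(pixels, factor):
    """Upscale a row-major 2-D grid by an integer factor."""
    out = []
    for row in pixels:
        scaled_row = [p for p in row for _ in range(factor)]
        # each source row becomes `factor` identical output rows
        out.extend([scaled_row[:] for _ in range(factor)])
    return out

# A tiny antialiased edge: black on the left, white on the right,
# with intermediate shades where AA has blended the boundary.
edge = [[0, 64, 255],
        [0, 128, 255]]

big = zoom_nearest(edge, 2)   # 200% zoom: 2x3 grid becomes 4x6
```

A bilinear or bicubic zoom (the default in most image viewers) would smear the very gradients being compared, which is why the crops should be made this way.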

79 Comments

  • Nuke Waste - Thursday, December 16, 2004 - link

    Would it be possible for AT to update the timedemos to Source Engine 7? Steam "graciously" updated my HL2 platform, and now none of my timedemos work!
  • The Internal - Friday, December 3, 2004 - link

    Which x700 XT card was used? How much RAM did it have?
  • VortigernRed - Tuesday, November 23, 2004 - link

    "Remember that we used the highest detail settings with the exception of anisotropic filtering and antialiasing, "

    That is not what you are showing on the SS on page 2. You are showing there that you have the water details set to "reflect world" not "reflect all".

    I would be interested to see how that affects the performance in your benchmarks with water in them, as some sites are showing larger wins for ATI and it seems possible that this setting may be the difference.

    It certainly looks much better in game with "reflect all" but does affect the performance.

    PS, sorry for the empty post above, trying to guess my username and password!
  • VortigernRed - Tuesday, November 23, 2004 - link

  • Warder45 - Sunday, November 21, 2004 - link

    I'd like to know what you guys think about X0bit's and other reviews that have ATI way ahead in numbers due to turning on Reflect All and not just Reflect World.

    http://www.chaoticdreams.org/ce/jb/ReflectAll.jpg
    http://www.chaoticdreams.org/ce/jb/ReflectWorld.jp...

    Some SS.
  • Counterspeller - Friday, November 19, 2004 - link

    I forgot about my specs : P4 3.0 3HD 8, 16, 60Gb, MB P4P800-E Deluxe, Samtron 96BDF Screen.
  • Counterspeller - Friday, November 19, 2004 - link

    I don't understand... I have a GeForce 256 DDR, and the ONLY game that I have not been able to play is DOOM 3, only because it asks for 64Mb of VRAM, and I only have 32. I'd like to play HL2, but I don't have it. Perhaps it'll be like D3... not enough VRAM, and in that case, the 2nd game I can't play with that board. What I don't understand is this : how can anyone be complaining because x game or y game «only» gives us 200 fps... Can YOU see 200 fps ? we're happy with 24fps on TV, 25fps in the theaters, and we're bitchin' about some game that only gives us 56.7 fps instead of the «behold perfection» 67.5. I know there is a difference, and yes, we can see that difference, but is it useful, in terms of gameplay ? Will you be fragged because of a 1 or 2 or even 3 fps difference between you and your opponent ? Stupidity gets us fragged, not fps. I believe that anything below 30/40 fps is nice, but unplayable, when it comes to action games. I'm happy with 60. Anything above it is extra. I have played with this very board many demanding games, and I can say that yes, some parts are demanding on the board. But I never lost because of it. Resuming : I don't understand this war between ATI lovers and NVIDIA lovers. I've been using the same board for years, and I never needed to change it. Unless it crumbles, I'll stick with it.
  • TheRealSkywolf - Friday, November 19, 2004 - link

    I have an FX 5950, I have turned on the DX9 path and things run great. First of all, the graphics don't look much better; you see slight differences on the water and in some bumpmapping, but minor things.
    So I guess it's time for ATI fans to shut up, both the FX and the 9800 cards run the game great.
    Man, Doom 3 showed all the whistles and bells, why wouldn't HL2? I think it's very unprofessional of Valve to do what they did.
  • SLI - Friday, November 19, 2004 - link

    Umm, why was the Radeon P.E. not tested?
