Introduction

We have had our hands full lately testing and playing as many of the new game releases as we can. Starting with games like Battlefield 2, we have seen some big advancements in game graphics even within the past few months. Black and White 2, in particular, recently impressed us with its gorgeous water and overall environments. We are always excited about a game that pairs beautiful graphics with rich gameplay, and it seems to be happening more often lately, much to our delight. The Call of Duty 2 demo also has us giddy; it looks and plays great, even if it is frustratingly short.

Some other games that have us waiting in anticipation are Quake 4 and Age of Empires 3. We wish that we had good demos of these games, but unfortunately we have to wait for the release date like everyone else. It seems like each new game raises the graphics bar so high that video card manufacturers might have trouble keeping up, and this past Tuesday, with the release of FEAR, the bar was raised another significant notch. Yes, FEAR is out, and it is beautiful.

We recently sat down and tested FEAR with the 1.01 patch that came out the day the game was released. We also tested with the absolute latest drivers from ATI (press sample 8.183.1017, which should be available in Catalyst soon) and NVIDIA (81.85, available on NZone now), both of which offer increased performance in FEAR. Our results were interesting to say the least, and we'll give you the details on how this game performs on a wide range of boards, including ATI's new X1000 line.

While the single-player and multiplayer demos of this game have been available for quite some time, we had the (quite correct) understanding that final performance would not look anything like what the demos showed. Today, readers can rest assured that the numbers we have collected are an accurate reflection of FEAR performance on modern hardware.

Comments

  • Le Québécois - Thursday, October 20, 2005 - link

    Like many people said, it would have been nice to see older-generation hardware...especially on ATI's side of things, since most of the cards tested here are nowhere to be found on the market.

    Seeing performance with the X800XL and the X850XT would have been nice.

    I also hope you'll do some CPU testing in the future, since I doubt you'll see many people out there with an AMD FX-55...especially paired up with the likes of an X1300... :)
  • Kogan - Thursday, October 20, 2005 - link

    Since the max upgrade for AGP users on the ATI side is an X800xt/x850xt, it would have been nice to have seen one of them included.
  • ballero - Thursday, October 20, 2005 - link

    I'm looking forward to the SLI numbers
  • Abecedaria - Thursday, October 20, 2005 - link

    It is a significant error that SLI numbers were left out of the article since it seems to be about how fast current video card technologies can play the game:
    "Those who want to play FEAR at the highest resolution and settings with AA enabled (without soft shadows) will basically have to use the 7800 GTX, as no other card available gets playable framerates at those settings, and the 7800 GTX does just barely (if uncomfortably)." ...unless you have an SLI setup, I assume. Does Anandtech feel that SLI is not a viable graphics technology or am I missing something?

    And then there's Crossfire... while it STILL isn't available yet, it would have been interesting to see some performance numbers along with SLI tests.

    It would be nice if you could update the article with dual-card framerates.

    abc
  • Abecedaria - Thursday, October 20, 2005 - link

    Oh wait!!!!

    PC Perspective has already beaten Anandtech to the punch on this subject, and the results show that SLI has a SIGNIFICANT impact on playability, even without any driver optimizations...

    http://www.pcper.com/article.php?aid=175&type=...

    abc
  • Ender17 - Thursday, October 20, 2005 - link

    I agree. Can we get some SLI benchmarks?
  • Kyanzes - Thursday, October 20, 2005 - link

    ...to see a card performing at the top when it's not even available...
  • 9nails - Saturday, October 22, 2005 - link

    Exactly! I love this Land of Make Believe. It's a good thing that I have an AMD Athlon 64 FX-55 2.6 GHz processor in my desktop, laptop, and PDA. And I'm loving it, because after an unreal CPU like that, I would still have hundreds of dollars left to burn on make-believe GPUs. Because, if I were only a regular Joe Anand-reader with a middle-tier Pentium 4 and an old-school AGP graphics port, I would be quite upset that the author is targeting his reviews at the well-connected Beverly Hills posh.

    Just who is Josh writing his articles for anyway?! I'm going back to surfing pr0n, because I have a far better chance at dating a porn* than owning a system like the one that he's showing scores on.
  • yacoub - Saturday, October 22, 2005 - link

    Well, thanks for supporting the thread I started in the Video forum section last week addressing that very issue. All the idiots came out of the woodwork to do their best to misinterpret and misread the post, and very few actually bothered to support my suggestion that a test be done with a REAL WORLD system most of us own, not an FX-55 setup with a 7800 GTX that few people own.

    I'd LOVE to see how modern games perform on a system I'm actually thinking of buying, not an imaginary supersystem.
  • deathwalker - Thursday, October 20, 2005 - link

    You know, it's simply come to the point where I don't know how the average gamer can keep up. If you are not willing to spend $300-$500 every 6-12 months or so, you just cannot keep up with the demands that games are putting on computer hardware. This is stupid. I mean, who the hell is dragging this industry along? Do they develop new and more powerful hardware so that more demanding software can be created, or do they develop more demanding software, making it a necessity to develop more powerful hardware? Is all this crap really needed to have a decent gaming experience? I guess I'm just gonna have to starve the cat for a couple of months so I can toss out my POS 6800 GT and get some new whizbang graphics card the industry wants me to buy. This has become a never-ending process that is wearing thin on me.
