Introduction

We have been excited about the flood of new game releases and have had our hands full testing and playing as many as we can. Starting with games like Battlefield 2, we've seen some big advancements in game graphics even within the past few months. Black & White 2, in particular, impressed us recently with its amazing water and overall environments. We are always excited about a game that pairs beautiful graphics with rich gameplay, and it seems to be happening more often lately, much to our delight. The Call of Duty 2 demo also has us giddy; it looks and plays great, even if it is frustratingly short.

Some other games that have us waiting in anticipation are Quake 4 and Age of Empires 3. We wish we had good demos of these games, but unfortunately we have to wait for the release date like everyone else. The graphical bar is being raised so high with each new game that video card manufacturers may have trouble keeping up, and this past Tuesday, with the release of FEAR, the bar was raised a very significant notch. Yes, FEAR is out, and it is beautiful.

We recently sat down and tested FEAR with the 1.01 patch that came out the day the game was released. We also tested with the absolute latest drivers from ATI (press sample 8.183.1017, which should be available in Catalyst soon) and NVIDIA (81.85, available on nZone now), both of which offer increased performance in FEAR. Our results were interesting, to say the least, and we'll give you the details on how this game performs on a wide range of boards, including ATI's new X1000 line.

While the single-player and multiplayer demos of this game have been available for quite some time, we had the (quite correct) understanding that final performance would not look anything like what the demo showed. Today, readers can rest assured that the numbers we have collected are an accurate reflection of FEAR performance on modern hardware.

Comments

  • Regs - Friday, October 21, 2005 - link

    This is one of the reasons why I don't think the 7800 GTX or X1000s are worth buying. I feel sorry for the people who paid over 600 dollars for them when they can't even play FEAR @ 1280x1024 with AA.
  • fogeyman - Friday, October 21, 2005 - link

    FEAR is quite clearly optimized poorly. However, claiming that people pay over $600 for a GTX without being able to play at 1280x1024 with AA is totally wrong. As the review shows, it is easily playable on a card costing less than $600 (at least for the GTX). Not to mention, you can even kick the resolution up to 1600x1200 and get only slightly unusable FPS.

    Specifically, at 1280x1024 with all settings on max except soft shadows, the GTX gets a playable 39 fps. ATI is off the mark, but NVIDIA is okay. As for the cost of the 7800 GTX, it is (as of now, off Newegg) priced in $20 steps from $460 to $500 (the $500 version includes BF2), plus one $580 version. Clearly, you can get the GTX for over $100 less than your "$600" price. And no, exaggerating by over $100 is not negligible, not at all.

    Note: by "slightly unusable" I don't mean that it is slightly problematic, but that it is in fact unplayable and misses the playability mark by only a little.
  • LoneWolf15 - Friday, October 21, 2005 - link

    quote:

    This is one of the reasons why I don't think the 7800 GTX or X1000s are worth buying. I feel sorry for the people who paid over 600 dollars for them when they can't even play FEAR @ 1280x1024 with AA.


    I would argue that, if anything, poor optimization of F.E.A.R. is the more likely culprit. I've seen screenshots, and so far, I'm not impressed enough to put down the money. The graphics don't seem to be anywhere near as good as the hype has stated (previous Half-Life 2 shots look far better, IMO; perhaps I have to play it to see). Add to that the fact that there's already a 1.01 patch on the day of the game's release, and I think that's a symptom of a game that needs more under-the-hood work. I'll wait to see the results of testing with more games; one is not enough.

    P.S. To all who said this review should have had more ATI cards, you were right on the money. This review has the GeForce 6600GT and 6800GT, and doesn't even include comparable ATI counterparts to them (read: X800XL, X850XT)? That's poor.
  • Jackyl - Friday, October 21, 2005 - link

    I really do think developers have either reached the limit in optimizing their code, or they are too lazy to do so. Or perhaps it's a conspiracy between ATI/NVIDIA and developers? The fact is, you shouldn't NEED a $600 video card to run some of the games coming out today. The sheer lack of performance shown here on a high-dollar card shows us that something is wrong in the industry.

    Anyone notice a trend in the industry? Supposedly, the GPU power of the cards keeps increasing. The X800 claims to be two times faster than an "old" 9800 Pro. Yet the game engines being written today can't crank out more than 40 fps at a measly resolution of 1280x1024? Something is wrong in the industry. As someone else said in another post, something has got to give.
  • Le Québécois - Friday, October 21, 2005 - link

    The problem is simple: PC game developers have no limits to speak of. They know there is always something new coming that will run their game perfectly. That's not the case with the console market. Since console developers are going to be "stuck" with the same HW for 4-5 years, they HAVE to optimize their code. That's why you see games on the same system (GameCube, for example) with graphics twice as beautiful as older games running on the SAME HW.
    Take RE4, for example: nobody thought that level of graphics could be achieved on a GC, but it was.
  • g33k - Friday, October 21, 2005 - link

    I can't really complain, as the 6800GT was included in the article. Good read; I enjoyed it.
  • PrinceGaz - Friday, October 21, 2005 - link

    I'd say this was a fairly good performance review except for the choice of graphics-cards.

    The current nVidia cards are an excellent choice, including both 7800 models and popular GF6 cards (6800GT and 6600GT) from which the performance of other 6800/6600 variants can be extrapolated. Given the use of a PCIe platform, the only cards I would add are a standard 6200 (not TC) and a PCX5900; the PCX5900 would give FX5900 owners a good idea of how their card would perform and serve as a guide to general GF5-series performance. A 7800GTX SLI setup is also needed to show what benefit it offers, but I wouldn't bother testing anything slower in SLI, as it is not a viable upgrade.

    The ATI X1000 series cards included were also excellent, but using only an X800GT from the previous generation was woefully inadequate. Ideally, an X850XT, X800XL, and X700Pro would also be added to give more complete information. For the generation before that, just as a PCX5900 could stand in for nVidia, an X600Pro/XT could stand in for ATI, as it is equivalent to a 9600Pro/XT. It's a pity there isn't a PCIe version of the 9800Pro, but a 9600Pro/XT would be the next best thing. Until you can set up a Crossfire X1800XT, there is no point including any Crossfire tests.

    So my recommended gfx-card selection is: nVidia 7800GTX SLI, 7800GTX, 7800GT, 6800GT, 6600GT, 6200, PCX5900; ATI X1800XT, X1800XL, X1600XT, X1300Pro, X850XT, X800XL, X800GT, X700Pro, X600Pro/XT. That may seem a daunting list, but it is a total of only 16 cards instead of the 10 tested, so it is not overwhelming. All the cards are PCIe, so you only need the one box, and it includes a good selection of old and new cards.

    The only other thing I'd change is the test system. The FX-55 processor is fine, though an FX-57 would be even better; people who suggest using something slower when testing slower video cards are missing the point of a video-card review. I would up the memory to 2GB (2x 1GB), though, just to keep possible stuttering from affecting the results, even if that means slowing the timings slightly to 2-3-2.
  • Le Québécois - Friday, October 21, 2005 - link

    Oh, and your selection of video cards seems pretty good to me :P since people with a 9800 Pro will see performance close to the X700 Pro.
  • Le Québécois - Friday, October 21, 2005 - link

    The fastest CPU is good if you want to know exactly how well a GPU does in a game, but that still doesn't reflect the majority of people who will run the game; that's why a slower CPU could be nice. If the idea behind this review was to show people how well their HW will do in this game, only using the best of the best is not the best way to achieve that goal.
  • PrinceGaz - Friday, October 21, 2005 - link

    The aim of video-card reviews is to show, as best as possible, what the video card is capable of when all other variables (such as CPU limitations) are removed from the equation. That's why testing even an AGP GeForce 2 GTS with a high-end FX-57 processor would be preferable, as performance would then be determined entirely by the graphics card.

    If you use slower CPUs with slower graphics cards, it is difficult to say for sure whether the CPU or the graphics card is the limiting factor. All a review that mixes and matches CPUs and graphics cards can say is, "this combination went this fast, but we have no idea whether the CPU or the graphics card was the limiting factor, so we don't know if you should buy a faster CPU or a faster graphics card."
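
    A rough way to picture that argument: each frame takes as long as the slower of the CPU's and the GPU's work for that frame, so the slower component caps the frame rate. The Python sketch below (all timings hypothetical, purely to illustrate the point) shows how a slow CPU can hide what a fast card could otherwise do:

        # Illustrative model of benchmark bottlenecking; all numbers are hypothetical.
        def observed_fps(cpu_ms_per_frame, gpu_ms_per_frame):
            # The frame rate is capped by whichever stage takes longer per frame.
            return 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)

        print(observed_fps(8.0, 25.0))   # fast CPU, slow card -> 40.0 fps (GPU-bound)
        print(observed_fps(20.0, 25.0))  # slow CPU, same card -> still 40.0 fps
        print(observed_fps(20.0, 10.0))  # slow CPU, fast card -> 50.0 fps (CPU-bound:
                                         # the card's real headroom never shows up)

    Pairing every card with the fastest available CPU keeps the benchmark in the GPU-bound regime, so the numbers isolate the card itself.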
