F.E.A.R. Performance

F.E.A.R. has a built-in performance test that we make use of in this analysis. The test flies through a scene of action as people shoot each other and things blow up. F.E.A.R. is very demanding on graphics hardware, and we enable most of the high-end settings for our test.

During our testing of F.E.A.R., we noted that the "soft shadows" don't really look soft. They come across as multiple transparent copies of the hard shadow layered on top of one another and jittered to appear soft. Unfortunately, this costs a lot in performance, and not nearly enough shadow layers are used to make the effect look realistic. Thus, we disable soft shadows in our test even though it's one of the larger performance drains on the system.
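
For readers curious why the cost scales the way it does: as we understand the technique, each shaded point averages several jittered hard-shadow lookups, so every extra layer is another shadow test per pixel, and too few layers leaves the individual copies visible. The Python below is purely illustrative (the function names, sample counts, and point-occlusion test are our own, not F.E.A.R.'s shader code):

```python
import random

def soft_shadow_factor(occluded, x, y, taps=4, jitter=1.5):
    """Fake a soft shadow by averaging several jittered hard-shadow tests.

    occluded(x, y) -> bool stands in for a hard shadow-map lookup;
    taps and jitter trade image quality against fill-rate cost.
    """
    lit = 0
    for _ in range(taps):
        # Each tap is effectively a transparent copy of the hard shadow,
        # shifted by a small random offset.
        ox = x + random.uniform(-jitter, jitter)
        oy = y + random.uniform(-jitter, jitter)
        if not occluded(ox, oy):
            lit += 1
    # 0.0 = fully shadowed, 1.0 = fully lit; fractional values only show
    # up near shadow edges, which is what reads as "softness".
    return lit / taps

# Example: an occluder covering x < 0 gives a partially lit result
# right at the edge instead of a hard on/off transition.
print(soft_shadow_factor(lambda px, py: px < 0.0, x=0.5, y=0.0, taps=16))
```

With only a handful of taps the separate layers are easy to spot, which matches what we saw in game.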

Again, we tested with anisotropic filtering at 8x and all other options at their highest quality settings (with the exception of soft shadows, which were disabled). Frame rates in F.E.A.R. can get pretty low for a first-person shooter, but the game does a good job of staying playable down to about 25 fps.

[Graph: F.E.A.R. Performance]

The X1900 GT maintains an advantage throughout this DirectX 9 based title. While the overclocked XFX GeForce 7900 GS 480M Extreme is just able to catch up at 1920x1440, the cost savings and the promise of SLI offered by the NVIDIA based cards are only enough to call this a toss-up in the value category.

F.E.A.R. - No AA (average frames per second)

Card                                  800x600   1024x768  1280x1024  1600x1200  1920x1440
ATI Radeon X800 GTO                        77         52         36         24         15
ATI Radeon X1600 XT                        76         53         37         26         17
ATI Radeon X1800 GTO                      108         75         52         36         23
ATI Radeon X1900 GT                       137        104         77         54         38
ATI Radeon X1900 XT 256MB                 156        127        100         72         52
ATI Radeon X1900 XT                       169        135        104         76         56
NVIDIA GeForce 6600 GT                     64         46         33         23         12
NVIDIA GeForce 6800 GS                     84         61         43         31         17
NVIDIA GeForce 7600 GT                    108         78         56         39         28
NVIDIA GeForce 7800 GT                    122         88         63         44         33
NVIDIA GeForce 7900 GS                    126         92         68         48         36
XFX GeForce 7900 GS 480M Extreme          132         98         71         50         38
NVIDIA GeForce 7900 GT                    143        105         77         54         40

Comments

  • munky - Wednesday, September 6, 2006 - link

    quote:

    In spite of the fact that F.E.A.R. is an OpenGL game, the X1900 GT maintains the advantage.

    FEAR is a DX9 game, not OpenGL...
  • DerekWilson - Wednesday, September 6, 2006 - link

    I'm looking into this at the moment but having trouble finding documentation on it.

    I suppose, as I was recently testing quad sli and saw huge performance increases, I assumed the game must be using the 4 frame afr mode only possible in opengl (dx is limited to rendering 3 frames ahead). I'll keep looking for confirmation on this ...
  • MemberSince97 - Wednesday, September 6, 2006 - link

    Jupiter EX is a DX9 rendering engine...
  • DerekWilson - Wednesday, September 6, 2006 - link

    corrected, thanks ... now I have to figure out why FEAR likes quad sli so much ...
  • MemberSince97 - Wednesday, September 6, 2006 - link

    Nice writeup DW, I really like the mouseover performance % graphs...
  • PrinceGaz - Thursday, September 7, 2006 - link

    So do I, but there is one error
    quote:

    With equivalent stock clock speeds and potential 14% and 20% advantages in vertex and pixel processing respectively...

    That should be 14% and 25% advantages

    The 7900GS has 20 PS while the 7900GT has 24 PS. That makes the 7900GS 20% slower than the 7900GT, but it makes the 7900GT 25% faster than the 7900GS. It's important to remember which one you're comparing it against when quoting percentages (a quick worked example appears after the comments).

    Hopefully the percentage performance difference in the graph itself was calculated correctly, or at least consistently.
  • PrinceGaz - Thursday, September 7, 2006 - link

    Ooops sorry, please ignore my post. For some reason I thought for a moment the 7900GS had 16 PS and the 7900GT had 20 PS (despite writing the correct values in my comment). The article is correct, I was just getting confused.

    PS. an edit function would be nice.
  • Frackal - Wednesday, September 6, 2006 - link

    There is no way an X1900xt gets 75fps at 1600x1200 4xAA; at that same resolution and AA setting I get well over 120-130fps average with an X1900xtx. Most sites show it hitting at least 100+.
  • DerekWilson - Wednesday, September 6, 2006 - link

    if you use the built-in demo features to run a timedemo with DICE's own calculations you will get a very wrong (skewed upward) number. DICE themselves say that results over 100 fps aren't reliable.

    the problem is that they benchmark the load screen, and generally one card or the other will get better load screen performance -- for instance, the x1900 gt may get 300+fps while the 7900 gt may only get 200fps. (I just picked those numbers, but framerates for the load screen are well over 100 fps in most cases and drastically different between manufacturers).

    not only does no one care about this difference on a load screen, but it significantly interferes with benchmark numbers.

    the timedemo feature can be used to output a file with frametimes and instantaneous frames per second. we have a script that opens this file, removes the frame data for the load screen, and calculates a more accurate framerate average using only frame data for scenes rendered during the benchmark run (a rough sketch of this kind of filtering appears after the comments).

    this will decrease overall scores.

    we also benchmark in operation clean sweep, which has a lot of fog and water. we use a benchmark with lots of smoke and explosions, and we test for some amount of time in or near most vehicles.
  • splines - Wednesday, September 6, 2006 - link

    Ownage approved.
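
On the percentage question PrinceGaz raises above: the size of a gap depends on which card you treat as the baseline. A quick worked example in Python (pipeline counts as discussed in the comments, with 7 vs 8 vertex units assumed for the 7900 GS and 7900 GT; this is plain arithmetic, not benchmark data):

```python
def percent_faster(a, b):
    """How much faster a is than b, expressed as a percentage of b."""
    return (a - b) / b * 100

def percent_slower(a, b):
    """How much slower a is than b, expressed as a percentage of b."""
    return (b - a) / b * 100

gs_pixel, gt_pixel = 20, 24    # 7900 GS vs 7900 GT pixel shaders
gs_vertex, gt_vertex = 7, 8    # vertex shaders (assumed counts)

# Measured against the 7900 GS, the 7900 GT has 20% more pixel and about
# 14% more vertex hardware -- the figures quoted in the article.
print(percent_faster(gt_pixel, gs_pixel))     # 20.0
print(percent_faster(gt_vertex, gs_vertex))   # ~14.3

# Flip the baseline and the same gap reads differently:
print(percent_slower(gs_pixel, gt_pixel))     # ~16.7, not 20
```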

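For those curious about the timedemo cleanup Derek describes above, the gist is: read the per-frame output, drop the frames that belong to the load screen, and average only what remains. A rough Python sketch under assumptions of our own (the one-frametime-per-line file format and the fixed load-screen window are made up for illustration; this is not the actual AnandTech script):

```python
def average_fps(frametime_file, load_screen_seconds=5.0):
    """Average FPS over a timedemo run, ignoring load-screen frames.

    Assumes one frame time in milliseconds per line -- a hypothetical
    format standing in for whatever the timedemo actually writes out.
    Frames recorded during the first `load_screen_seconds` are treated
    as load-screen frames and discarded.
    """
    with open(frametime_file) as f:
        frame_ms = [float(line) for line in f if line.strip()]

    kept = []
    elapsed = 0.0
    for ms in frame_ms:
        elapsed += ms / 1000.0
        # Skip the artificially fast load-screen frames at the start;
        # they inflate the average without saying anything about gameplay.
        if elapsed <= load_screen_seconds:
            continue
        kept.append(ms)

    if not kept:
        return 0.0
    # Average FPS = frames kept / total time spent rendering them.
    return len(kept) / (sum(kept) / 1000.0)

# Example (hypothetical file name):
# print(average_fps("timedemo_frametimes.txt"))
```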