World in Conflict Performance

Version: 1.005

Settings: Medium quality plus Heat Haze, Debris Physics, and DX10

We tested this game using its built-in benchmark, which in our experience does a good job of covering the different graphical scenarios that can be encountered in actual gameplay.

From the data, World in Conflict looks as though we had Vsync enabled, the system was severely CPU limited, or a framerate cap was in place. None of these was the case: minimum and maximum framerates ranged from roughly 30 to 160 fps, so some other factor must be responsible for the averages looking so flat across resolutions.

The 9600 GT SLI was able to break past this barrier and post average framerates higher than 60 fps. The only major difference is that we had to use a different driver just for the 9600 GT. Given what we experienced in our recent Dell XPS M1730 article, the 170 series drivers help significantly in World in Conflict and Crysis; unfortunately, no official beta or other 170 series driver with 8800 Ultra support is available. We are investigating further and waiting for driver updates from NVIDIA.

World in Conflict Multi-GPU Scaling over Resolution


It seems clear that there is some limit on performance scaling here with our test platform. We expect future driver updates to significantly help both SLI and CrossFireX.

Pushing resolution higher is the way to get more value out of multi-GPU here. Increasing quality settings may also help, and we will go back and look at higher settings with these configurations in the future. Running at Very High is still not a viable option, but there is room for customization to end up with a workable stress test for current high-end systems.

World in Conflict Performance (average fps)

                                1280x1024   1600x1200   1920x1200   2560x1600
NVIDIA GeForce 9600 GT SLI          73          68          63          51
NVIDIA GeForce 8800 Ultra SLI       58          56          54          52
NVIDIA GeForce 8800 Ultra           58          56          53          44
NVIDIA GeForce 9600 GT              60          50          44          30
AMD Radeon HD 3870X2 (x 2)          63          60          60          58
AMD Radeon HD 3870X2 + 3870         63          60          60          57
AMD Radeon HD 3870X2                63          60          59          52
AMD Radeon HD 3870                  57          51          47          34

With nearly all of the data compressed under 60 fps, it is hard to get a clear picture of what's going on. In the interest of reporting what we actually saw, the table above shows our results.
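
To put numbers on the resolution trend, the quick Python sketch below (not part of the original article) divides each multi-GPU configuration's average framerate by a single-GPU baseline, using the figures from the table above; the pairings are our own choice for illustration. At 2560x1600 the 9600 GT SLI works out to roughly 1.7x a single 9600 GT, while at 1280x1024 the scaling factors sit near 1.0x to 1.2x, which matches the flat-looking averages described above.

```python
# Rough multi-GPU scaling factors computed from the average fps in the table above.
# The (multi-GPU, single-GPU baseline) pairings are chosen for illustration only.

resolutions = ["1280x1024", "1600x1200", "1920x1200", "2560x1600"]

# Average fps taken directly from the results table.
results = {
    "9600 GT":         [60, 50, 44, 30],
    "9600 GT SLI":     [73, 68, 63, 51],
    "8800 Ultra":      [58, 56, 53, 44],
    "8800 Ultra SLI":  [58, 56, 54, 52],
    "HD 3870":         [57, 51, 47, 34],
    "HD 3870X2":       [63, 60, 59, 52],
    "HD 3870X2 (x 2)": [63, 60, 60, 58],
}

# Multi-GPU configuration paired with its single-GPU (or single-card) baseline.
pairs = [
    ("9600 GT SLI", "9600 GT"),
    ("8800 Ultra SLI", "8800 Ultra"),
    ("HD 3870X2", "HD 3870"),
    ("HD 3870X2 (x 2)", "HD 3870X2"),
]

for multi, single in pairs:
    factors = [m / s for m, s in zip(results[multi], results[single])]
    line = "  ".join(f"{res}: {f:.2f}x" for res, f in zip(resolutions, factors))
    print(f"{multi} vs {single} -> {line}")
```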

Comments

  • MAIA - Tuesday, March 11, 2008 - link

    "After rebooting a few times to let windows do its thing, we installed the driver and all was well."

    This sentence is soooooo microsoft windows !!! :))

    Sorry .... had to say it.
  • dash2k8 - Tuesday, March 11, 2008 - link

    I'm just wondering: instead of piling on the number of GPU's, why hasn't a manufacturer just come out with ONE monstrous GPU that does away with the need of using multiple video cards? If someone is crazy enough to spend moola on 4 GPU's, I imagine that person would be equally willing to buy ONE card that has the same horsepower. Just saying.
  • punko - Monday, March 10, 2008 - link

    Thanks Derek for a good review. As you indicated, this may be the future and it's good to see the tech reach a point where it is ready for use and can be improved upon as all tech goes forward.

    It also sounds like you had a lot of help directly from AMD on this one.

  • gsellis - Monday, March 10, 2008 - link

    "but today a WHQL drier is available "

    Hey Derek, typo in the beginning. Still mirthful about this one. Water cooling and you needed it drier to work with all GPUs?
  • ltcommanderdata - Sunday, March 9, 2008 - link

    I'm just curious as to whether you've checked to see if quad channel memory has any benefit for multiple GPU situations? With 3 or 4 GPUs sucking data, I would presume the additional memory bandwidth provided by quad DDR2-800 would increase performance, especially since dual channel FB-DIMMs are not as efficient as the best dual channel DDR2 or DDR3 setups on desktop boards. It would be interesting to see the results of a 4x1GB setup on Skulltrail vs the 2x2GB setup you used.
  • cerwin13 - Saturday, March 8, 2008 - link

    Would it be wise to try this upgrade without SP1 installed with Vista 32? I am currently using 2x Radeon HD3870 x2s and would like to benchmark with these new drivers, but apparently SP1 isn't officially out yet?
  • DerekWilson - Saturday, March 8, 2008 - link

    other people had luck without SP1; it's not a requirement, but some of our editors did find that it helped with a lot of stuff ...

    you'll want to make sure you have hotfixes:

    929777-v2
    936710
    938194
    938979
    940105
    945149

    as a minimum
  • Ananke - Saturday, March 8, 2008 - link

    XFX has Forceware 169.32; my guess is it was added after the 9600 GT appeared. On NVIDIA's official download site the highest version is 169.28.
  • Incisal flyer - Saturday, March 8, 2008 - link

    Derek, thanks for the very timely and detailed review. I'm going to be building a system for Flight Simulator X and have been trying to figure out the best graphics card(s) for that application. Have you considered benchmarking that sim? A lot of discussion right now on AVSIM etc on what to do in terms of GPUs for people building new systems. There is a lot of back and forth on advantages and disadvantages of different configs. I realize FSX is a bit of a niche product. Would FSX use multiple GPUs like 2 3870 x2s, and are the potential headaches of that configuration worth it if you are not a computer geek? Or am I better off just getting a couple of Nvidia 8800s in SLI or a single 3870 x2 and not hassling with the 4 GPU solution? Any help or advice would be appreciated. Thanks in advance for your time.

    Incisal Flyer
