Battlefield 2 Performance

This benchmark is performed using DICE's built-in demo playback functionality, with a few extras we developed in-house. When using BF2's built-in demo playback, frames rendered during the loading screen are counted in the benchmark results. To get an accurate picture of performance, we take the instantaneous frametime and frames-per-second data generated from a benchmark run, discard the data collected during the loading screen, and calculate a result that represents only the gameplay that was actually benchmarked. While DICE maintains that results over 100 fps aren't reliable, our methods have allowed us to get useful data from high performing systems.
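
As a rough illustration of the post-processing described above (not our actual in-house tooling), here is a minimal sketch in Python. It assumes a hypothetical log with one frametime in milliseconds per line, discards the near-instant frames rendered while the loading screen is up, and computes average FPS as total frames over total time rather than as a mean of instantaneous FPS values:

    # Minimal sketch; the input format and the 2 ms loading-screen cutoff
    # are assumptions for illustration, not BF2's actual output.
    def average_fps(frametimes_ms, loading_cutoff_ms=2.0):
        # Loading-screen frames render almost instantly; skip the leading
        # run of implausibly fast frames before gameplay begins.
        i = 0
        while i < len(frametimes_ms) and frametimes_ms[i] < loading_cutoff_ms:
            i += 1
        gameplay = frametimes_ms[i:]
        if not gameplay:
            return 0.0
        # Average FPS = frames rendered / seconds elapsed.
        return len(gameplay) / (sum(gameplay) / 1000.0)

    # Example: two loading-screen frames followed by ~60 fps gameplay.
    print(average_fps([0.4, 0.5, 16.7, 16.9, 17.1, 16.5]))  # ~59.5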

During the benchmark, the camera switches between players and vehicles in order to capture as much action as possible. There is a lot of smoke and plenty of explosions, making this a very GPU-intensive Battlefield 2 benchmark. The game itself is best experienced with average in-game framerates of 35 fps and up.

We ran Battlefield 2 using the highest quality graphics settings we could. Shadows, lighting, and especially view distance are very important to playing the game well. In our opinion, view distance should never be set below its maximum, but other settings can be dialed back slightly if a little more performance or a higher resolution is required.

[Graph: Battlefield 2, no AA]

At the very top of the charts, the GeForce 7900 GTX SLI manages to maintain just under a 12% advantage over the X1950 CrossFire, indicating to NVIDIA that it may not even need to respond to ATI's high-end launch today with a new product. A single 7950 GX2 offers virtually identical performance to the X1950 CrossFire, showcasing the main strength of the 7950 GX2: its ability to offer dual-card performance from a single slot on any platform. The performance advantage the X1950 CrossFire offers over its X1900 predecessor is 6%, definitely not enough to warrant an upgrade.
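
For reference, the relative advantages quoted throughout are simple ratios of average frame rates. A quick sketch with made-up numbers (not the actual benchmark results):

    # Hypothetical frame rates for illustration only.
    def advantage_pct(fps_a, fps_b):
        # How much faster A is than B, expressed as a percentage.
        return (fps_a / fps_b - 1.0) * 100.0

    print(advantage_pct(89.0, 80.0))  # 11.25 -> "just under 12%"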

Single card performance mirrors what we've seen from the multi-GPU setups, with the single 7900 GTX outperforming a single X1950 XTX. Note that the 7900 GTX's performance advantage actually grows in the move from one card to two, thanks to NVIDIA's SLI scaling better than ATI's CrossFire here.

At the very bottom of the chart we've got the X1900 XT 256MB, which really puts things into perspective. With even this card able to deliver 60 fps at 2048 x 1536, most users will be monitor limited before they are GPU limited in games like Battlefield 2; in that case, the clear recommendation here is the $280 X1900 XT (or the similarly priced factory overclocked 7900 GT we saw earlier in this review).

Although the 7900 GTX SLI performs better at higher resolutions, ATI's X1950 and X1900 CrossFire setups actually perform better at lower, more CPU-bound resolutions, indicating greater driver overhead with NVIDIA's SLI. CPU limitations are quite evident at lower resolutions with the multi-GPU setups, further reinforcing the idea that if you've got an LCD with a 1280 x 1024 maximum resolution, you may want to think twice about upgrading to a second GPU.
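
One way to spot this in raw benchmark data: if the average frame rate barely improves when the resolution drops, the CPU rather than the GPU is the bottleneck. A small sketch of that heuristic, using hypothetical numbers rather than our measured results:

    # fps_by_resolution holds (width, height, average fps) tuples; the 5%
    # tolerance is an assumed threshold, not a measured one.
    def cpu_bound_at_lowest(fps_by_resolution, tolerance=0.05):
        pts = sorted(fps_by_resolution, key=lambda r: r[0] * r[1])
        lowest_fps, next_fps = pts[0][2], pts[1][2]
        # If dropping to the lowest resolution barely raised the frame
        # rate, the GPU wasn't the limiting factor there.
        return (lowest_fps - next_fps) / next_fps <= tolerance

    data = [(1280, 1024, 130.0), (1600, 1200, 127.0), (2048, 1536, 98.0)]
    print(cpu_bound_at_lowest(data))  # True: little gained below 1600x1200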

The CPU limitations seen at 1280 x 1024 start to fade away as we move to 1600 x 1200, where the multi-GPU pack separates itself from the single-GPU cards. What's interesting is that, with the exception of NVIDIA's 7900 GTX SLI, the remaining multi-GPU setups have resolution scaling curves similar to those of the single cards, just at higher frame rates.

[Graph: Battlefield 2, AA enabled]

When AA gets kicked on, the numbers get shaken up a bit. The X1950 CrossFire essentially ties the 7900 GTX SLI for the performance lead, and the 7950 GX2 drops to the bottom of the multi-GPU pile. Single GPU performance becomes dominated by ATI cards, with the 7900 GTX falling to nearly the level of the 256MB X1900 XT. At the same time, it is a nice treat to realize that even the 256MB X1900 XT is playable at 2048 x 1536 with all the eye candy cranked up.

Our scaling graph doesn't show the same CPU limitation we saw without AA enabled. The 7900 GTX SLI does show a hint of performance loss due to driver overhead here as well, but otherwise all of these cards scale much as they did in the previous test.

Comments

  • JarredWalton - Wednesday, August 23, 2006 - link

    We used factory overclocked 7900 GT cards that are widely available. These are basically guaranteed overclocks for about $20 more. There are no factory overclocked ATI cards around, but realistically don't expect overclocking to get more than 5% more performance on ATI hardware.

    The X1900 XTX is clocked at 650 MHz, which isn't much higher than the 625 MHz of the XT cards. Given that ATI just released a lower-power card but kept the clock speed at 650 MHz, it's pretty clear that their GPUs are close to topped out. The RAM might have a bit more headroom, but memory bandwidth already appears to be less of a concern, as the X1950 isn't tremendously faster than the X1900.
  • yyrkoon - Wednesday, August 23, 2006 - link

    I think it's obvious why ATI is selling their cards for less now, and that reason is a lot of 'tech savvy' users are waiting for Direct3D 10 to be released and want to buy a capable card. This is probably to try to entice some people into buying technology that will be 'obsolete' when Direct3D 10 is released.

    Supposedly Vista will ship with DirectX 9L and DirectX 10 (Direct3D 10), but I've also read the contrary: that Direct3D 10 won't be released until sometime after Vista ships. Personally, I couldn't think of a better time to buy hardware, but a lot of people think that waiting, and then paying through the nose for a video card later, is going to save them money. *shrug*
  • Broken - Wednesday, August 23, 2006 - link

    In this review, the test bed was an Intel D975XBX (LGA-775). I thought this was an ATI CrossFire-only board that could not run two Nvidia cards in SLI. Are there hacked drivers that allow this, and if so, is there any penalty? Also, I see that this board is dual x8 PCIe and not dual x16... at high resolutions, could this be a limiting factor, or will that not matter for another year?

  • DerekWilson - Wednesday, August 23, 2006 - link

    Sorry about the confusion there. We actually used an nForce4 Intel x16 board for the NVIDIA SLI tests. Unfortunately, it is still not possible to run SLI on an Intel motherboard. Our test section has been updated with the appropriate information.

    Thanks for pointing this out.

    Derek Wilson
  • ElFenix - Wednesday, August 23, 2006 - link

    As we all should know by now, NVIDIA's default driver quality setting is lower than ATI's, and matching the quality settings through the drivers makes a significant difference in framerate. Your "The Test" page does not indicate that you changed the driver quality settings to match.
  • DerekWilson - Wednesday, August 23, 2006 - link

    Drivers were run with default quality settings.

    Default driver settings between ATI and NVIDIA are generally comparable from an image quality standpoint, unless shimmering or banding is noticed due to trilinear/anisotropic optimizations. None of the games we tested displayed any such issues during our testing.

    At the same time, for our Quad SLI follow-up we would like to include a series of tests run at the highest possible quality settings for both ATI and NVIDIA -- which would put ATI ahead of NVIDIA in terms of anisotropic filtering and in Chuck patch cases, and NVIDIA ahead of ATI in terms of adaptive/transparency AA (which is actually degraded by their gamma correction).

    If you have any suggestions on different settings to compare, we are more than willing to run some tests and see what happens.

    Thanks,
    Derek Wilson
  • ElFenix - Wednesday, August 23, 2006 - link

    Could you run each card with the quality slider turned all the way up, please? I believe that's the default setting for ATi, and the 'High Quality' setting for NVIDIA. Someone correct me if I'm wrong.

    thanks!

    michael
  • yyrkoon - Wednesday, August 23, 2006 - link

    I think as long as all settings from both offerings are as close as possible per benchmark, there is no real gripe.

    Although some people seem to think it necessary to run AA at high resolutions (1600x1200+), I'm not one of them. It's very hard for me to notice jaggies even at 1440x900, especially when concentrating on the game instead of standing still and hunting for jaggies with a magnifying glass . . .
  • mostlyprudent - Wednesday, August 23, 2006 - link

    When are we going to see a good number of Core 2 Duo motherboards that support CrossFire? The fact that AT is using an Intel-made board rather than a "true enthusiast" board says something about the current state of Core 2 Duo motherboards.
  • DerekWilson - Wednesday, August 23, 2006 - link

    Intel's boards are actually very good. The only reason we haven't been using them in our tests (aside from the lack of SLI support) is that we have not been recommending Intel processors for the past couple of years. Core 2 Duo makes Intel CPUs worth having, and you definitely won't go wrong with a good Intel motherboard.
