Black & White 2 Performance

The AnandTech benchmark for Black & White 2 is a FRAPS benchmark. Between the very first tutorial land and the second land, there is a well rounded cut scene rendered in-game, and this benchmark is indicative of real world performance in Black & White 2: we are able to see many of the commonly rendered objects in action. The most stressful part of the benchmark is a scene in which hundreds of soldiers come running over a hill, which really pounds the geometry capabilities of these cards. At launch, ATI cards were severely outmatched in B&W2 performance because of this scene, but two game patches and quite a few Catalyst revisions later, ATI cards have received a much needed boost in performance over what we first saw.
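For readers who want to crunch the same numbers themselves, the short Python sketch below turns a FRAPS frametimes log into an average framerate. It assumes the usual FRAPS CSV layout (a header row followed by one cumulative millisecond timestamp per frame); the filename is just an example, not our actual log.

    import csv

    def average_fps(path):
        # FRAPS frametimes logs record one cumulative timestamp (in ms) per frame
        with open(path, newline="") as f:
            reader = csv.reader(f)
            next(reader)  # skip the "Frame, Time (ms)" header row
            times = [float(row[1]) for row in reader if len(row) >= 2]
        elapsed = (times[-1] - times[0]) / 1000.0  # benchmark length in seconds
        return (len(times) - 1) / elapsed          # frames rendered per second

    print(f"Average: {average_fps('bw2 frametimes.csv'):.1f} fps")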

A desirable average framerate for Black & White 2 is anything over 20 fps. The game remains playable down to the 17-19 fps range, but this is where we usually start seeing the occasional annoying hiccup during gameplay. While this doesn't always interfere with actually playing the game, any jerkiness in frame rate degrades the overall experience.
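Those hiccups are also why an average alone can mislead: a run averaging over 20 fps can still contain individual frames that take far longer than their neighbors. A minimal sketch of counting such spikes from the same timestamp list follows; the 50 ms threshold (an instantaneous dip below 20 fps) is our own illustrative choice, not a formal part of the benchmark.

    def count_hiccups(times_ms, threshold_ms=50.0):
        # per-frame render times are the deltas between consecutive timestamps
        deltas = [b - a for a, b in zip(times_ms, times_ms[1:])]
        # any frame over 50 ms is an instantaneous dip below 20 fps
        return sum(1 for d in deltas if d > threshold_ms)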

We tested with all the options set to their highest quality settings under the custom menu. Antialiasing carries quite a high performance hit in this game and is generally not worth it at high resolutions unless the game is running on a super powerhouse of a graphics card. If you're the kind of person who simply must have AA enabled, you'll have to settle for a somewhat lower resolution than we tend to like on a reasonably priced graphics card. Black & White 2 is almost not worth playing at low resolutions without AA, depth of field, or bloom enabled; at that point, the image quality resembles that of the original Black & White. While various people believe the original was the better game, no one doubts the superiority of B&W2's amazing graphics.

[Black & White 2 performance chart]

So far things aren't looking good for ATI's multi-GPU solution making its way to the top of the charts, as the 7900 GTX SLI significantly outperforms the X1950 CrossFire setup. Once again, we see that the X1950 CrossFire is barely faster than a single 7950 GX2, but to ATI's credit, Black & White 2 has never been a strength of the X1000 series.

Single card performance is a bit closer, as the 7900 GTX offers the same performance as the X1950 XTX. Although the 7950 GX2 is technically a single card, its dual GPUs let it perform like a multi-card solution, and its price reflects that. The 7950 GX2 offers an interesting middle ground between the price and performance of a top of the line single GPU solution like the X1950 XTX or 7900 GTX and a full blown multi-card, multi-GPU setup.

Once again it's worth noting that even the $280 X1900 XT 256MB is able to average a playable frame rate at 2048 x 1536, making a case for the value to be had in a sub-$300 graphics card.

[Black & White 2 performance chart, 4X AA]

With AA enabled, the gap between the X1950 CrossFire and the 7900 GTX SLI narrows considerably, and on the single card side the X1950 XTX manages to outperform the 7900 GTX. Thanks to better scaling from NVIDIA's SLI, however, the 7900 GTX more than makes up for that single card deficit once you add a second card.

With 4X AA enabled, the X1900 XT 256MB can no longer hang with the big boys. However, it's worth mentioning that at higher resolutions the visual benefit of anti-aliasing quickly diminishes. As pixel size decreases, visible aliasing becomes much less of a problem, and if it's bothering you that much at 2048 x 1536, we may need to sit you down and have a talk about the old days when we didn't have anti-aliasing (and we had to benchmark in the snow).
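To put rough numbers behind that claim: what matters for visible jaggies is the angular size of a pixel, which shrinks as resolution rises. The quick calculation below assumes a 21-inch 4:3 monitor (about 16.8 inches wide) viewed from 24 inches; both figures are illustrative assumptions, not our test setup.

    import math

    def pixels_per_degree(horizontal_res, screen_width_in, distance_in):
        # horizontal angle the screen subtends at the viewer's eye, in degrees
        angle = 2 * math.degrees(math.atan((screen_width_in / 2) / distance_in))
        return horizontal_res / angle

    for res in (1024, 1600, 2048):
        print(f"{res} wide: {pixels_per_degree(res, 16.8, 24):.0f} pixels per degree")

At roughly double the angular pixel density of 1024-wide modes, stair-stepping at 2048 x 1536 is simply much harder to see.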

 

74 Comments

  • JarredWalton - Wednesday, August 23, 2006 - link

    We used factory overclocked 7900 GT cards that are widely available. These are basically guaranteed overclocks for about $20 more. There are no factory overclocked ATI cards around, but realistically, don't expect overclocking to get more than 5% more performance out of ATI hardware.

    The X1900 XTX is clocked at 650 MHz, which isn't much higher than the 625 MHz of the XT cards. Given that ATI just released a lower power card but kept the clock speed at 650 MHz, it's pretty clear that their GPUs are close to topped out. The RAM might have a bit more headroom, but memory bandwidth already appears to be less of a concern, as the X1950 isn't tremendously faster than the X1900.
  • yyrkoon - Wednesday, August 23, 2006 - link

    I think it's obvious why ATI is selling their cards for less now, and that reason is that a lot of 'tech savvy' users are waiting for Direct3D 10 to be released and want to buy a capable card. This is probably to try and entice some people into buying technology that will be 'obsolete' when Direct3D 10 is released.

    Supposedly Vista will ship with DirectX 9L and DirectX 10 (Direct3D 10), but I've also read the contrary: that Direct3D 10 won't be released until after Vista ships (sometime). Personally, I couldn't think of a better time to buy hardware, but a lot of people think that waiting, and just paying through the nose for a video card later, is going to save them money. *shrug*
  • Broken - Wednesday, August 23, 2006 - link

    In this review, the test bed was an Intel D975XBX (LGA-775). I thought this was an ATI CrossFire-only board that could not run two NVIDIA cards in SLI. Are there hacked drivers that allow this, and if so, is there any penalty? Also, I see that this board is dual x8 PCIe and not dual x16... at high resolutions, could this be a limiting factor, or is that not a concern for another year?

  • DerekWilson - Wednesday, August 23, 2006 - link

    Sorry about the confusion there. We actually used an nForce4 Intel x16 board for the NVIDIA SLI tests. Unfortunately, it is still not possible to run SLI on an Intel motherboard. Our test section has been updated with the appropriate information.

    Thanks for pointing this out.

    Derek Wilson
  • ElFenix - Wednesday, August 23, 2006 - link

    As we all should know by now, NVIDIA's default driver quality setting is lower than ATI's, and it makes a significant difference in framerate when you use the driver settings to match quality. Your "The Test" page does not indicate that you changed the driver quality settings to match.
  • DerekWilson - Wednesday, August 23, 2006 - link

    Drivers were run with default quality settings.

    Default driver settings between ATI and NVIDIA are generally comparable from an image quality standpoint unless shimmering or banding is noticed due to trilinear/anisotropic optimizations. None of the games we tested displayed any such issues during our testing.

    At the same time, during our Quad SLI followup we would like to include a series of tests run at the highest possible quality settings for both ATI and NVIDIA -- which would put ATI ahead of NVIDIA in terms of anisotropic filtering or in Chuck patch cases, and NVIDIA ahead of ATI in terms of adaptive/transparency AA (which is actually degraded by their gamma correction).

    If you have any suggestions on different settings to compare, we are more than willing to run some tests and see what happens.

    Thanks,
    Derek Wilson
  • ElFenix - Wednesday, August 23, 2006 - link

    Could you run each card with the quality slider turned all the way up, please? I believe that's the default setting for ATI, and the "High Quality" setting for NVIDIA. Someone correct me if I'm wrong.

    thanks!

    michael
  • yyrkoon - Wednesday, August 23, 2006 - link

    I think as long as all settings from both offerings are as close as possible per benchmark, there is no real gripe.

    Although some people seem to think it necessary to run AA at high resolutions (1600x1200+), I'm not one of them. It's very hard for me to notice jaggies even at 1440x900, especially when concentrating on the game instead of standing still and looking for jaggies with a magnifying glass . . .
  • mostlyprudent - Wednesday, August 23, 2006 - link

    When are we going to see a good number of Core 2 Duo motherboards that support CrossFire? The fact that AT is using an Intel-made board rather than a "true enthusiast" board says something about the current state of Core 2 Duo motherboards.
  • DerekWilson - Wednesday, August 23, 2006 - link

    Intel's boards are actually very good. The only reason we haven't been using them in our tests (aside from a lack of SLI support) is that we have not been recommending Intel processors for the past couple of years. Core 2 Duo makes Intel CPUs worth having, and you definitely won't go wrong with a good Intel motherboard.
