Black & White 2 Performance

The AnandTech benchmark for Black & White 2 is a FRAPS benchmark. Between the very first tutorial land and the second land there is a fairly well-rounded cut scene rendered in-game. This benchmark is indicative of real world performance in Black & White 2, as we are able to see many of the commonly rendered objects in action. The most stressful part of the benchmark is a scene where hundreds of soldiers come running over a hill, which really pounds the geometry capabilities of these cards. At launch, ATI cards were severely outmatched in B&W2 because of this scene, but two game patches and quite a few Catalyst revisions later, ATI cards have received a much needed boost over what we first saw.

A desirable average framerate for Black & White 2 is anything over 20 fps. The game remains playable down to the 17-19 fps range, but in that range we usually start seeing the occasional annoying hiccup during gameplay. While this isn't always a problem as far as getting things done and playing the game, any jerkiness in frame rate degrades the overall experience.

We tested with all options at their highest quality settings under the custom menu, with the exception of AA. Antialiasing carries quite a high performance hit in this game, and is generally not worth it at high resolutions unless the game is running on a super powerhouse of a graphics card. If you're the kind of person who simply must have AA enabled, you'll have to settle for a somewhat lower resolution than we tend to like. Black & White 2 is almost not worth playing at low resolutions without AA, depth of field, and bloom enabled; at that point, the image quality resembles the original Black & White. While some people believe the original Black & White was the better game, no one doubts the superiority of B&W2's graphics.


As with BF2, 1600x1200 is a viable target resolution for midrange graphics users, even with high settings enabled. Again, we won't be able to hit this target with AA enabled, but the game looks smooth enough that AA isn't strictly necessary. The X1800 GTO is the minimum on the ATI side for getting good framerates at this resolution, while the 7600 GT does just fine for NVIDIA. This is another benchmark where the 7900 GT edges out the X1900 GT in terms of performance, but the price of the X1900 GT still makes it the more attractive buy (though keep in mind the availability of overclocked 7900 GT options). Users of older midrange cards won't be able to hit this resolution, and the X1600 XT is once again a very poor performer at our target resolution.

Every card in the test is playable at 800x600 with the settings we used. But with cards like the 6600 GT, 6800 GS, X800 GTO and X1600 XT, the game would look much better with some settings turned down in favor of enabling antialiasing or a higher resolution. At low resolutions, the 7900 GT loses its advantage over the X1900 GT, but since we don't see any signs of CPU limitation in the all-powerful X1900 XT, we can reasonably conclude that the NVIDIA card scales better in this scenario. This should translate well when we look at overclocking. Going from roughly equivalent performance at 1024x768, the 7900 GT leads the X1900 GT by 25% at our 2.8 MPixel (1920x1440) resolution. But as the X1900 GT still maintains playability, we really have to give the X1900 GT the win as far as cost/benefit goes. As is consistently the case, the X1900 XT leads the pack here and can easily handle enabling AA even at 1920x1440 (though we didn't test this setting here, as most other cards are completely useless under such conditions).
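As an aside, the "2.8 MPixel" figure is simply the pixel count of the 1920x1440 resolution. A quick sketch of the arithmetic behind comparing resolutions by pixel count, and behind the percentage leads quoted above (the fps numbers in the example are purely illustrative, not measurements from this review):

```python
# Pixel counts for the resolutions discussed in this review.
resolutions = {
    "800x600": 800 * 600,
    "1024x768": 1024 * 768,
    "1600x1200": 1600 * 1200,
    "1920x1440": 1920 * 1440,
}

for name, pixels in resolutions.items():
    print(f"{name}: {pixels / 1e6:.2f} MPixels")  # 1920x1440 -> 2.76 MPixels

def percent_lead(fps_a: float, fps_b: float) -> float:
    """Relative lead of card A over card B, as a percentage."""
    return (fps_a / fps_b - 1) * 100

# Hypothetical example: if card A averages 50 fps and card B 40 fps,
# card A leads by 25%.
print(percent_lead(50, 40))  # -> 25.0
```
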

For the upper end of our comparison, the X1900 XT leads. It's clear that the stock 7900 GT isn't worth the price, but overclocking should make a difference here. Even when we look at the 7600 GT, which clearly outclasses the X1600 XT, the X1900 GT offers a great performance boost for its price.


74 Comments


  • Sharky974 - Friday, August 11, 2006 - link

    I tried comparing numbers for SCCT, FEAR and X3, but the problem is Anand didn't bench any of these with AA in this mid-range test, and other sites all use 4xAA as default. So in other words, no direct numbers comparison on those three games, at least with those two Xbit/FS articles, is possible.

    Although the settings are different, both FS and Anand showed FEAR as a tossup, though.

    It does appear other sites are confirming Anand's results more than I thought though.

    And the X1900GT for $230 is a kickass card.
  • JarredWalton - Friday, August 11, 2006 - link

    The real problem is that virtually every level of a game can offer higher/lower performance relative to the average, and you also get levels that use effects that work better on ATI or NV hardware. Some people like to make a point about providing "real world" gaming benchmarks, but the simple fact of the matter is that any benchmark is inherently different from actually sitting down and playing a game - unless you happen to be playing the exact segment benchmarked, or perhaps the extremely rare game where performance is nearly identical throughout the entire game. (I'm not even sure what an example of that would be - Pacman?)

    Stock clockspeed 7900GT cards are fairly uncommon these days, since the cards are so easy to overclock. Standard clocks are actually supposed to be 450/1360 IIRC, and most cards are at least slightly overclocked in one or both areas. Throw in all the variables, plus things like whether or not antialiasing is enabled, and it becomes difficult to compare articles between any two sources. I tend to think of it as providing various snapshots of performance, as no one site can provide everything. So if we determine the X1900 GT is a bit faster overall than the 7900 GT and another site determines the reverse, the truth is that the cards are very similar, with some games doing better on one architecture and other games on the other arch.

    My last thought is that it's important to look at where each GPU manages to excel. If for example (and I'm just pulling numbers out of the hat rather than referring to any particular benchmarks) the 7900 GT is 20% faster in Half-Life 2 but the X1900 GT still manages frame rates of over 100 FPS, but then the X1900 GT is faster in Oblivion by 20% and frame rates are closer to 40 FPS, I would definitely wait to Oblivion figures as being more important. Especially if you run on LCDs, super high frame rates become virtually meaningless. If you can average well over 60 frames per second, I would strongly recommend enabling VSYNC on any LCD. Of course, down the road we are guaranteed to encounter games that require more GPU power, but predicting what game engine is most representative of the future requires a far better crystal ball than what we have available.

    For what it's worth, I would still personally purchase an overclocked 7900 GT over an X1900 GT for a few reasons, provided the price difference isn't more than ~$20. First, SLI is a real possibility, whereas CrossFire with an X1900 GT is not (as far as I know). Second, I simply prefer NVIDIA's drivers -- the old-style, not the new "Vista compatible" design. Third, I find that NVIDIA always seems to do a bit better on brand new games, while ATI seems to need a patch or a new driver release to address performance issues -- not always, but at least that's my general impression; I'm sure there are exceptions to this statement. ATI cards are still good, and at the current price points it's definitely hard to pick a clear winner. Plus you have stuff like the reduced prices on X1800 cards, and in another month or so we will likely have new hardware in all of the price points. It's a never ending rat race, and as always people should upgrade only when they find that the current level of performance they had is unacceptable from their perspective.
  • arturnowp - Friday, August 11, 2006 - link

    I think another advantage of the 7900GT over the X1900GT is power consumption. I haven't checked the numbers on this matter, so I am not 100% sure.
  • coldpower27 - Saturday, August 12, 2006 - link


    Yes, this is completely true, going by Xbitlab's numbers.

    Stock 7900 GT: 48W
    eVGA SC 7900 GT: 54W
    Stock X1900 GT: 75W
  • JarredWalton - Friday, August 11, 2006 - link

    Speech-recognition + lack of proofing = lots of typos

    "... out of a hat..."
    "I would definitely weight..."
    "... level of performance they have is..."

    Okay, so there were only three typos that I saw, but I was feeling anal retentive.
  • Sharky974 - Friday, August 11, 2006 - link

    Not to beat this to death, but at FS the X1900GT vs 7900GT benchmarks:

    X1900GT:

    Wins-BF2, Call of Duty 2 (barely)

    Loses-Quake 4, Lock On Modern Air Combat, FEAR (barely),

    Toss ups- Oblivion (FS runs two benches, foliage/mountains, the cards split them) Far Cry w/HDR (X1900 takes two lower res benches, 7900 GT takes two higher res benches)

    From Xbit's X1900 GT vs 7900 GT conclusion:


    "The Radeon X1900 GT generally provides a high enough performance in today’s games. However, it is only in 4 tests out of 19 that it enjoyed a confident victory over its market opponent and in 4 tests more equals the performance of the GeForce 7900 GT. These 8 tests are Battlefield 2, Far Cry (except in the HDR mode), Half-Life 2, TES IV: Oblivion, Splinter Cell: Chaos Theory, X3: Reunion and both 3DMarks. As you see, Half-Life 2 is the only game in the list that doesn’t use mathematics-heavy shaders. In other cases the new solution from ATI was hamstringed by its having too few texture-mapping units as we’ve repeatedly said throughout this review."

    Xbit review: http://www.xbitlabs.com/articles/video/display/pow...
  • Geraldo8022 - Thursday, August 10, 2006 - link

    I wish you would do a similar article concerning video cards for HDTV and HDCP. It is very confusing. Even though certain cards might state they are HDCP ready, it is not enabled.
  • tjpark1111 - Thursday, August 10, 2006 - link

    the X1800XT is only $200 shipped, why not include that card? if the X1900GT outperforms it, then ignore my comment(been out of the game for a while)
  • LumbergTech - Thursday, August 10, 2006 - link

    so you want to test the cheaper gpu's for those who dont want to spend quite as much... ok... well why are you using the cpu you chose then? that isnt exactly in the affordable segment for the average pc user at this point
  • PrinceGaz - Thursday, August 10, 2006 - link

    Did you even bother reading the article, or did you just skim through it and look at the graphs and conclusion? May I suggest you read page 3 of the review, or in case that is too much trouble, read the relevant excerpt-

    quote:

    With the recent launch of Intel's Core 2 Duo, affordable CPU power isn't much of an object. While the midrange GPUs we will be testing will more than likely be paired with a midrange CPU, we will be testing with high end hardware. Yes, this is a point of much contention, as has always been the case. The arguments on both sides of the aisle have valid points, and there are places for system level reviews and component level reviews. The major factor is that the reviewer and readers must be very careful to understand what the tests are really testing and what the numbers mean.

For this article, one of the major goals is to determine which midrange cards offer the best quality and performance for the money at stock clock speeds at this point in time. If we test with a well aged 2.8GHz Netburst era Celeron CPU, much of our testing would show every card performing the same until games got very graphics limited. Of course, it would be nice to know how a graphics card would perform in a common midrange PC, but this doesn't always help us get to the bottom of the value of a card.

    For instance, if we are faced with 2 midrange graphics cards which cost the same and perform nearly the same on a midrange CPU, does it really matter which one we recommend? In our minds, it absolutely does matter. Value doesn't end with what performance the average person will get from the card when they plug it into a system. What if the user wants to upgrade to a faster CPU before the next GPU upgrade? What about reselling the card when it's time to buy something faster? We feel that it is necessary to test with high end platforms in order to offer the most complete analysis of which graphics solutions are actually the best in their class. As this is our goal, our test system reflects the latest in high end performance.
