Black and White 2 v1.1 vs. v1.2 Performance

We were interested to see what kind of performance difference there is between the 1.1 and 1.2 patches, so we tested the highest-end cards from both ATI and NVIDIA with both patches. We ran the benchmark at four different resolutions with AA enabled. Here are the results, starting with NVIDIA.

NVIDIA GeForce 7900 GTX




As the graphs show, there isn't a significant difference in framerates between the two patches for this card; the 1.2 update has essentially no effect on performance for NVIDIA hardware.

ATI Radeon X1900 XTX




On the X1900 XTX, however, we see a marked improvement at all resolutions with the 1.2 patch. While the 7900 GTX outperformed the X1900 XTX under the previous patch, it now lags slightly behind the ATI card. The difference between the X1900 XTX and 7900 GTX isn't great, but ATI's performance gain between patches is good news for any ATI owners playing this game.

Now let's look at the general performance tests.

Black and White 2 Test Setup
22 Comments

  • mino - Saturday, April 8, 2006 - link

    Hi Josh,

    could you consider a review of the (forever postponed) S.T.A.L.K.E.R. game?

    I think it is as of now the most visually appealing, realistic, and demanding game. IMHO this game is the way to test GPU performance on future titles.

    I'm sure the moment you check it, you will understand.
  • Josh Venning - Tuesday, April 11, 2006 - link

    We will definitely consider reviewing this game after it's released, whenever that may be.
  • Kremy - Thursday, April 6, 2006 - link

    Just wanted to add another vote for some Oblivion testing, and also for the inclusion of ATI's X800 and X850 series video cards. For the record, I'm playing Oblivion on an X850 XT running at 540/580 (PE speeds), and it's running fine on high settings, 1024x768, no AA, full distance. Great game...
  • AdamK47 3DS - Wednesday, April 5, 2006 - link

    Why is there an article about it now when this patch has been out for so long?
  • bupkus - Wednesday, April 5, 2006 - link

    Maybe to be fair to ATI.
  • AdamK47 3DS - Wednesday, April 5, 2006 - link

    I suppose there could have been some pressure from ATI to post this article in order to vindicate themselves. AnandTech did like to use the previous version of B&W2 for performance testing, and people were probably quick to blame ATI for the poor performance. I doubt this whole article would ever have been written had there not been some sort of outside influence.
  • JarredWalton - Wednesday, April 5, 2006 - link

    Actually, AFAIK, Josh just wanted to write about this subject. He's also the one who has done some of the regression testing (i.e., using old drivers). It's all in the search for knowledge. As for the patch, I'm *sure* that ATI helped Lionhead make some optimizations. Okay, that's a guess, but I would be amazed if they didn't. So, file this one under the heading of "why is it that we need to make specific optimizations to games and drivers?"

    Oblivion is even worse right now. For SLI, you have to make a custom profile and manually enable AFR2 rendering for best performance (apparently). For ATI CrossFire support, you actually have to rename the executable. So much for an out-of-box multi-GPU experience! Not that SLI/CF aren't faster, but they are frequently a hassle to deal with.
  • spinportal - Wednesday, April 5, 2006 - link

    Why doesn't any site test a 7900 GTX clocked down to 7900 GT core/memory speeds and see how it performs? I have a feeling there could be a US$400 market for such a tweaked GT with a 512MB card slotted between the GT (256MB) and GTX (512MB). Where o' where?
  • Araemo - Wednesday, April 5, 2006 - link

    "Unfortunately, one of the problems with this game has been that it tends to favor NVIDIA graphics cards over ATI cards, despite the ATI splash screen at the game's startup."

    Well, humorously enough, there has been at least one "NVIDIA: The way it's meant to be played" game that ran better on my 9700 Pro than on my friends' 5xxx- and 4xxx-series NVIDIA cards. :) Most dev houses are against making their game specifically more playable on one type of hardware than another, even with branding payments. ATI and NVIDIA are pretty even as far as gamer-level market share goes, so they're not going to fubar half their audience on purpose, and some games just run better on one architecture than another. Nothing really surprising, except that ATI and NVIDIA think this is worth spending their money on. :)
  • Warder45 - Wednesday, April 5, 2006 - link

    Let's see some Elder Scrolls Oblivion testing.
