Test Setup

We wanted to get a sense of how well the latest high-end hardware could run this game, but we also wanted to include older budget and midrange cards in our performance tests for comparison. We included as many cards from both ATI and NVIDIA as we could to gauge the game's overall playability, particularly after the 1.2 update.

We broke down the performance tests into three groups. For the high-end card tests, we enabled "High Antialiasing" (basically 4xAA with some tweaks and optimizations) and used the four resolutions tested on the last page (2048x1536, 1920x1440, 1600x1200, 1280x1024). The following cards make up the high-end group:

NVIDIA 7900 GTX
NVIDIA 7900 GT
NVIDIA 7800 GTX (512)
NVIDIA 7800 GT
ATI X1900 XTX
ATI X1900 XT
ATI X1800 XL


In the midrange group we tested the game at 1600x1200, 1280x1024, and 1024x768 without AA enabled. Here are the cards we tested in this group:

NVIDIA 7800 GT
NVIDIA 7600 GT
NVIDIA 6800 GS
NVIDIA 6600 GT
ATI X1800 XL
ATI X1800 GTO
ATI X1600 XT


Note that the 7800 GT and X1800 XL are borderline high-end/midrange, so we tested these two cards in both categories. If you're wondering how some of the other "high-end" cards fare without AA, you can use them as a guide. We also included X1800 GTO numbers, and we are starting to see a few of these parts available for purchase.

For the low-end tests, we tested at 1280x1024, 1024x768, 800x600, and 640x480 without AA. These are the cards for this group:

NVIDIA 7300 GS
NVIDIA 6200 TC (TurboCache)
ATI X1300


Our test system was equipped as follows:

ATI Radeon Express 200 based system
NVIDIA nForce 4 based system
AMD Athlon 64 FX-57
2x 1GB DDR400 2:3:2:8
120 GB Seagate 7200.7 HD
600 W OCZ PowerStream PSU


Note that sound was disabled for all tests.

BW2 v1.1 versus v1.2 Performance Tests
22 Comments

  • mino - Saturday, April 8, 2006 - link

    Hi Josh,

    could you consider a review of the (forever postponed) S.T.A.L.K.E.R. game ???

    I think it is as of now the most visually appealing, realistic and demanding game. IMHO this game is the way to test GPUs' performance on future titles.

    I'm sure the moment you check it you will understand.
  • Josh Venning - Tuesday, April 11, 2006 - link

    We will definitely consider reviewing this game after it's released, whenever that may be.
  • Kremy - Thursday, April 6, 2006 - link

    Just wanted to add another vote for some Oblivion testing, and ALSO an inclusion of ATI's X800 and X850 series vid cards. For the record, I'm playing Oblivion on an X850 XT running at 540/580 (PE speeds), and it's running fine on high settings, 1024x768, no AA, full distance. Great game...
  • AdamK47 3DS - Wednesday, April 5, 2006 - link

    Why is there an article about it now when this patch has been out for so long?
  • bupkus - Wednesday, April 5, 2006 - link

    Maybe to be fair to ATI.
  • AdamK47 3DS - Wednesday, April 5, 2006 - link

    I suppose there could have been some pressure from ATI to post this article in order to vindicate themselves. AnandTech did like to use the previous version of B&W2 for performance testing. People were probably quick to blame ATI for the poor performance. I doubt this whole article would ever have been written had there not been some sort of outside influence.
  • JarredWalton - Wednesday, April 5, 2006 - link

    Actually, AFAIK, Josh just wanted to write about this subject. He's also the one that has done some of the regression testing (i.e., using old drivers). It's all in the search for knowledge. As far as the patch, I'm *sure* that ATI helped Lionhead make some optimizations. Okay, that's a guess, but I would be amazed if they didn't. So, file this one under the heading of, "why is it that we need to make specific optimizations to games and drivers?"

    Oblivion is even worse right now. With SLI, you have to make a custom profile and manually enable AFR2 rendering for best performance (apparently). For ATI CF support, you actually have to rename the executable. So much for the out-of-box multi-GPU experience! Not that SLI/CF aren't faster, but they are frequently a hassle to deal with.
  • spinportal - Wednesday, April 5, 2006 - link

    Why doesn't any site test a 7900 GTX clocked down to a 7900 GT part for core/mem and see its performance? I have a feeling there could be a US$400 market for such a tweaked GT w/ 512MB card in between a GT (256MB) & GTX (512MB). Where o'where?
  • Araemo - Wednesday, April 5, 2006 - link

    "Unfortunately, one of the problems with this game has been that it tends to favor NVIDIA graphics cards over ATI cards, despite the ATI splash screen at the game's startup."

    Well, humorously enough, there has been at least one "Nvidia: The way it's meant to be played" game that ran better on my 9700 Pro than my friends' 5xxx and 4xxx series nvidia cards. :) Most dev houses are against making their game specifically more playable on one type of hardware than another, even with branding payments. ATI and nVidia are pretty even as far as gamer-level market share goes, so they're not going to fubar half their audience on purpose, and some games just run better on one architecture than another. Nothing really surprising except that ATI and nVidia think that is worth spending their money on. :)
  • Warder45 - Wednesday, April 5, 2006 - link

    Let's see some Elder Scrolls Oblivion testing.
