Wolfenstein

Finally in our benchmark suite we have Wolfenstein, the most recent game released using id Software's id Tech 4 engine. All things considered it's not a very graphically intensive game, but at this point it's the most recent OpenGL title available. More than likely the entire OpenGL landscape will be turned upside down once id releases Rage later this year.

Even at 2560x1600, Wolfenstein is very close to being CPU limited when we're working with SLI/CF. There's just enough room for the 6990 to once again fall behind the 6950CF; however, even the all-powerful 6970CF only ekes out a few more frames per second. In these conditions the test is less about the hardware and more about the software.

130 Comments


  • iamezza - Tuesday, March 8, 2011 - link

    This could make for an extremely valuable article for gamers on a budget. When does lack of PCIe bandwidth become an issue for running SLI/crossfire?

    Testing 580 SLI in 2 x 8 and 2 x 16 modes would be a good place to start...
  • therealnickdanger - Tuesday, March 8, 2011 - link

    It will be interesting to see what impact the bandwidth has... then again, even with the restriction, the current Sandy Bridge systems still dominate the previous chips.

    In reality, 16/16 or 8/8 really doesn't have much impact. The difference even at 2560x1600 with all the fixins in even the most demanding games is <1%. Unless AT's new test system will feature six displays and 4K+ resolutions, I'm not sure SNB-E is worth waiting so long for (yes, that could be perceived as a challenge!)

    In any case, I'm looking forward to it! Thanks for the article!
  • shaggart5446 - Tuesday, March 8, 2011 - link

    I hope you say the same thing when your friends at NVIDIA release their 590 card. I also hope you use the exact same words: that the 590 doesn't make any sense, since a pair of 560s or 570s can give you the same performance as the 590. I can't wait to see your article on the 590; I'll be waiting for Anand on this one, because we all know the 590 is going to be downclocked.
  • ClownPuncher - Tuesday, March 8, 2011 - link

    With cards designed specifically with multi monitor gaming in mind, you may want to include those resolutions. Buying this card for 1920x1200 would make zero sense.
  • 7Enigma - Wednesday, March 9, 2011 - link

    I think it was good to have both. The people buying this card will likely have 30" displays, but I'm sure some (competitive FPS players, for example) will want extremely fluid framerates even in busy scenes, as will the person who doesn't yet have the cash to upgrade to a big screen but plans to in the near future.

    I would also argue that there are likely vastly more people playing on large single-screen displays than Eyefinity folks, so this does make more sense. And honestly, when some of the games are averaging in the sub-80-100 fps range, those minimum framerates approach questionable playability depending on the type of game.

    So basically as crazy as it is to say this, the graphical power isn't quite there yet to use Eyefinity at high detail settings in more recent and demanding games.
  • Nentor - Tuesday, March 8, 2011 - link

    "With but a trio of exceptions, the 6990 doesn’t make sense compared to a pair of cards in Crossfire."

    This product is not meant to make any sense from a financial, performance or even practical standpoint.

    It IS the fastest video card and that is that.

    I was watching a video on YouTube last night of a chainsaw powered by a Buick V8 engine (hG5sTLY0-V8). It goes through a tree trunk in the blink of an eye, but it had to be lifted by TWO men.

    Sure is cool though.
  • Squuiid - Sunday, March 13, 2011 - link

    It makes complete sense if you want dual-GPU performance in a small form factor, mATX and such (as I do).
    PCIe slots are at a premium, and so is space on a mATX board/case.

    However, I think I'm going to wait and see what the 590 looks like...
  • Fhistleb - Tuesday, March 8, 2011 - link

    I didn't even think that was possible. Though with what this card is pushing out, it's a little expected, I suppose.
  • stangflyer - Tuesday, March 8, 2011 - link

    I would like to see a 6990 and 5970 comparison in Crysis and Metro at Eyefinity and single-monitor resolutions, but with the 5970 at default clocks and close to 5870 clocks. When I'm playing these games I have my 5970 at 850 core and 1150 memory, and it runs all day without any throttling.

    The 5970 is handicapped at its default speeds, as everyone can run at or really close to 5870 speeds. The core is easy at 850, but you may need to back the memory down to 1150 or 1175.

    Would love to see the true difference in the 5970 and 6990 this way.

    The framebuffer will be the big difference at Eyefinity resolutions with any AA applied.
  • stangflyer - Tuesday, March 8, 2011 - link

    One thing I do like about the dual-GPU AMD cards is that I play a few games that use PhysX. (I have a 5970.) I have a 250 GTS in the second PCIe slot; both my slots are x16. This way I have a powerful GPU and PhysX! I play my games at 5040x1050 and a single card just doesn't cut it. I did use NVIDIA Surround for 2 months but like my Eyefinity setup better. To go CrossFire and then have PhysX, you need a motherboard that doesn't knock your PCIe slots down to x8 with 3 cards, which are few and expensive, and also a case that has space for that 3rd card, like a Cooler Master HAF 932X. I have a HAF 932 (not the X) and I could not go 3 cards unless the 3rd card is single slot.

    On a side note, the reason I am sticking with my 5970 until the 28nm cards show up is that I like the way the cooler is set up. With the fan on the end, I have my 250 GTS below it with about 3/8 inch of clearance. But the 250 GTS is only about 7.5-8 inches long and does not cover the fan at all, because the fan is at the end. I have a 120mm fan at the bottom of my HAF 932 case that blows straight up into the 5970's fan.

    If I used a 6990, the 250 GTS would cover the 6990's fan.

    My choices would then be to sell the 250 GTS and get a single-slot card (a GTS 450, probably).

    I think I am just going to stay with what I have for now.

    Maybe! LOL!
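
A recurring question in the thread above is how much PCIe link width (x8 vs. x16) actually matters for SLI/CrossFire. The theoretical gap is easy to sketch; the figures below assume PCIe 2.0, where each lane carries 500 MB/s per direction after 8b/10b encoding overhead:

```python
# Theoretical one-way PCIe 2.0 link bandwidth.
# Assumption: 500 MB/s per lane per direction (5 GT/s signaling
# with 8b/10b encoding already factored in).
PCIE2_PER_LANE_MBPS = 500

def link_bandwidth_gbs(lanes: int) -> float:
    """Theoretical one-way bandwidth in GB/s for a PCIe 2.0 link."""
    return lanes * PCIE2_PER_LANE_MBPS / 1000

for lanes in (8, 16):
    print(f"x{lanes}: {link_bandwidth_gbs(lanes):.1f} GB/s per direction")
# x8  -> 4.0 GB/s, x16 -> 8.0 GB/s
```

As the <1% benchmark deltas cited in the comments suggest, even the halved 4 GB/s of an x8 slot is rarely a bottleneck for games of this era; the math only shows the ceiling, not how close any given title gets to it.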
