Wolfenstein

Finally, rounding out our benchmark suite we have Wolfenstein, the most recent game released on id Software's id Tech 4 engine. All things considered it's not a very graphically intensive game, but at this point it's the most recent OpenGL title available. It's more than likely the entire OpenGL landscape will be turned upside down once id releases Rage later this year.

Wolfenstein ends up being another title where the GTX 590 and 6990 are quite close, largely because the game quickly becomes CPU limited past a point. Still, the GTX 590 OC picks up almost 10%, and even the EVGA GTX 590 is good for around 2%.

Comments

  • RaistlinZ - Thursday, March 24, 2011 - link

    What about the 6950 2GB? It can be had for $245.00 after rebate and it's plenty powerful.
  • the_elvino - Thursday, March 24, 2011 - link

    Is it so hard to admit that the 6990's performance is better across the board? Multi-monitor setups were left out in order to make NVIDIA look good.

    Remember when the GTX 580 SLI review was published? AT didn't include a 5970 CrossFire setup because they sadly only had one 5970.

    Yes, the GTX 590 is less noisy, but then again you can underclock the 6990 to GTX 590 performance levels and it will be quieter too, so that's not really an argument.

    The GTX 590 is slower (especially at super high resolutions) and draws more power than the 6990 at the same price, AMD wins! Simple!
  • softdrinkviking - Thursday, March 24, 2011 - link

    If the 590 can only drive 2 displays, is the reason it has 3 DVI ports only for people who buy 2 cards, so that you can run all three displays off of one card?
  • Ryan Smith - Friday, March 25, 2011 - link

    The individual GPUs can only drive 2 monitors each. NVIDIA is using the display capabilities of both GPUs together in order to drive 4 monitors.
  • softdrinkviking - Friday, March 25, 2011 - link

    ah, got it. i should read more carefully.
    thanks for answering. :)
  • The Jedi - Friday, March 25, 2011 - link

    Surely if each GPU can run two displays, two GPUs on one card can run four displays?
  • Soulkeeper - Thursday, March 24, 2011 - link

    Wow, that is massive.
    I wouldn't put that in my PC even if someone else bought it for me.
  • hab82 - Friday, March 25, 2011 - link

    For me the gaming differences between AMD and NVIDIA at this level would not get me to change allegiance. In my case I bought two EVGA Classifieds, but they may never see a game. I use them as low-cost replacements for Tesla cards. For parallel processing I held off until we had powers of two, i.e. 256, 512, or 1024 cores; I did not like the 240, 432, and 480 alternatives. It's just easier to work with powers of 2 (a quick sketch after this comment shows why). The lower noise and lower power were big winners. Even if the power supplies can handle them, you still have to get the heat out of the room. My 4 nodes of 1024 CUDA cores each are a pain to air condition in my den.
    I really love you gamers; you buy enough GPUs to keep the competition going and lower the price of impressive computation power.
    Thanks all!

    Hubert
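
Hubert's power-of-two preference maps to a common pattern in CUDA: the classic shared-memory tree reduction halves the active thread count on every step, which only works out evenly when the block size is a power of two. Below is a minimal sketch of that pattern; the kernel name and launch sizes are illustrative assumptions, not anything from the article or comments.

```cuda
#include <cuda_runtime.h>

// Block-level sum reduction. Assumes blockDim.x is a power of two
// (256, 512, 1024, ...) so the stride can be halved cleanly each step.
__global__ void reduce_sum(const float *in, float *out, int n)
{
    extern __shared__ float sdata[];
    unsigned int tid = threadIdx.x;
    unsigned int i   = blockIdx.x * blockDim.x + tid;

    sdata[tid] = (i < n) ? in[i] : 0.0f;  // zero-pad the tail
    __syncthreads();

    // Tree reduction: 512 -> 256 -> 128 -> ... -> 1 active threads.
    for (unsigned int s = blockDim.x / 2; s > 0; s >>= 1) {
        if (tid < s)
            sdata[tid] += sdata[tid + s];
        __syncthreads();
    }

    if (tid == 0)
        out[blockIdx.x] = sdata[0];       // one partial sum per block
}

// Example launch: 512 threads per block, shared memory sized to match.
// reduce_sum<<<numBlocks, 512, 512 * sizeof(float)>>>(d_in, d_out, n);
```

With a thread count like 480, the repeated halving eventually reaches an odd width (480 → 240 → 120 → 60 → 30 → 15), and the next shift down to 7 silently drops an element; that extra bookkeeping is exactly what power-of-two sizes avoid.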
  • Calin - Friday, March 25, 2011 - link

    USA manages to stay just south of Canada
  • rs2 - Friday, March 25, 2011 - link

    I thought you guys learned your lesson the last time you pitted a factory-overclocked reference NVIDIA card against a stock AMD card. If you're going to use an overclocked card in an article that's meant to establish the performance baseline for a series of graphics cards (as this one does for the GTX 590 series), then you really should do at least one of the following:

    1. Downclock the overclocked "reference" card to stock levels.
    2. Overclock the competing AMD card by a comparable amount (or use a factory-overclocked AMD card as the reference point).
    3. Publish a review using a card that runs at reference clocks, and then afterwards publish a separate review for the overclocked card.
    4. Overclock both cards to their maximum stable levels, and use that as the comparison point.

    Come on now, this is supposed to be AnandTech. It shouldn't be up to the readers to lecture you about journalistic integrity or the importance of providing proper "apples to apples" comparisons. Using a factory-overclocked card to establish a performance baseline for a new series of GPUs is biased, no matter how small and insignificant the overclock may seem. Cut it out.
