Final Thoughts

If my final thoughts start sounding like a broken record, it’s because once again a set of NVIDIA & AMD product launches have resulted in a pair of similarly performing products.

The crux of the matter is that NVIDIA and AMD have significantly different architectures, and once again this has resulted in cards that are roughly equal on average but all over the place in individual games and applications. If we just look at the mean performance lead/loss for all games at 2560, the GTX 590 is within 1% of the 6990; however, within those games there’s a great deal of variance. The GTX 590 does extremely well in Civilization V as we’d expect, along with DiRT 2, Mass Effect 2, and HAWX. Meanwhile in Crysis, BattleForge, and especially STALKER the GTX 590 comes up very short. Thus choosing the most appropriate card is heavily reliant on what games are going to be played on it, and as a result there is no one card that can be crowned king.

Of the games NVIDIA does well in, only Civ5 is a game we’d classify as highly demanding; the rest are games where the GTX 590 is winning, but it’s also getting 100+ frames per second. Meanwhile in the games AMD does well in, the average framerate is much lower, and all of them are what we’d consider demanding. Past performance does not perfectly predict future performance, but there’s a good chance the 6990 is going to have a similar lead in future, similarly intensive games (at least as long as extreme tessellation isn’t a factor). So if you had to choose a card based on planned future use as opposed to current games, the 6990 is probably the better choice from a performance perspective. Otherwise, if you’re choosing based on the games you’d play today, you need to look at the individual games.

With that said, the wildcard right now is noise. Dual-GPU cards are loud, but the GTX 590 ends up being the quieter of the two by quite a bit; the poor showing of the 6990 ends up making the GTX 590 look a lot more reasonable than it necessarily is. The situation is a lot like the launch of the GTX 480, where we saw the GTX 480 take the performance crown, but at the cost of noise. The 6990’s performance advantage in shader-intensive games goes hand-in-hand with a much louder fan; whether this is a suitable tradeoff is going to be up to you to decide.

Ultimately we’re still looking at niche products here, so we shouldn’t lose sight of that fact. A pair of single-GPU cards in SLI/CF is still going to be faster and a bit quieter, if a bit more power hungry, all for the same price or less. The GTX 590 corrects the 6990’s biggest disadvantage versus a pair of single-GPU cards, but it ends up being no faster on average than a pair of $280 6950s, and slower than a pair of $350 GTX 570s. At the end of the day the only thing really threatened here is the GTX 580 SLI; while it’s bar none the fastest dual-GPU setup there is, at $1000 for a pair of the cards a quad-GPU setup is only another $400. For everything else, as was the case with the Radeon HD 6990, it’s a matter of deciding whether you want two video cards on one PCB or two PCBs.

Quickly, let's also touch upon factory overclocked/premium cards, since we had the chance to look at one today with the EVGA GeForce GTX 590 Classified. EVGA’s factory overclock isn’t anything special, and indeed if it were much less it wouldn’t even be worth the time to benchmark. Still, EVGA is charging 4% more for about as much of a performance increase, and then is coupling that with a lifetime warranty; ignore the pack-in items and you have your usual EVGA value-added fare, and all told it’s a reasonable deal, particularly when most other GTX 590s don’t come with that kind of warranty. Meanwhile EVGA’s overclocking utility suite is nice to see as always, though with the changes to OCP (and the inability to see when it kicks in) I’m not convinced GTX 590 is a great choice for end-user overclocking right now.

Update: April 2nd, 2011: Starting with the 267.91 drivers and release 270 drivers, NVIDIA has disabled overvolting on the GTX 590 entirely. This is likely a consequence of several highly-publicized incidents where GTX 590 cards died as a result of overvolting. Although it's unusual to see a card designed to not be overclockable, clearly this is where NVIDIA intends to be.

Finally, there’s still the multi-monitor situation to look at. We’ve only touched on a single monitor at 2560; with Eyefinity and NVIDIA/3D Vision Surround things can certainly change, particularly with the 6990’s extra 512MB of RAM per GPU to better handle higher resolutions. But that is a story for another day, so for that you will have to stay tuned…

Comments

  • RaistlinZ - Thursday, March 24, 2011 - link

    What about the 6950 2GB? It can be had for $245.00 after rebate and it's plenty powerful.
  • the_elvino - Thursday, March 24, 2011 - link

    Is it so hard to admit that the 6990's performance is better across the board? Multi-monitor setups were left out in order to make NVIDIA look good.

    Remember when the GTX 580 SLI review was published, AT didn't include a 5970 crossfire setup, because they sadly only had one 5970.

    Yes, the GTX 590 is less noisy, but then again you can underclock the 6990 to GTX 590 performance levels and it will be quieter too, not really an argument.

    The GTX 590 is slower (especially at super high resolutions) and draws more power than the 6990 at the same price, AMD wins! Simple!
  • softdrinkviking - Thursday, March 24, 2011 - link

    If the 590 can only drive 2 displays, is the reason it has 3 DVI ports only for people who buy 2 cards, so you can run all three displays off of one card?
  • Ryan Smith - Friday, March 25, 2011 - link

    The individual GPUs can only drive 2 monitors each. NVIDIA is using the display capabilities of both GPUs together in order to drive 4 monitors.
  • softdrinkviking - Friday, March 25, 2011 - link

    ah, got it. i should read more carefully.
    thanks for answering. :)
  • The Jedi - Friday, March 25, 2011 - link

    Surely if each GPU can run two displays, two GPUs on one card can run four displays?
  • Soulkeeper - Thursday, March 24, 2011 - link

    Wow that is massive
    I wouldn't put that in my pc if someone else bought it for me.
  • hab82 - Friday, March 25, 2011 - link

    For me the gaming differences between AMD and NVIDIA at this level would not get me to change allegiance. In my case I bought two EVGA Classifieds, but they may never see a game. I use them as low-cost replacements for Tesla cards. For parallel processing I held off until we had powers of two, i.e. 256, 512, 1024 cores; I did not like the 240, 432, 480 alternatives. It's just easier to work with powers of 2. The lower noise and lower power were big winners. Even if the power supplies can handle them, you still have to get the heat out of the room. My 4 nodes of 1024 CUDA cores each are a pain to air condition in my den.
    I really love you gamers; you buy enough GPUs to keep the competition going and lower the price of impressive computational power.
    Thanks all!

  • Calin - Friday, March 25, 2011 - link

    USA manages to stay just south of Canada
  • rs2 - Friday, March 25, 2011 - link

    I thought you guys learned your lesson the last time you pitted a factory-overclocked reference NVIDIA card against a stock AMD card. If you're going to use an overclocked card in an article that's meant to establish the performance baseline for a series of graphics cards (as this one does for the GTX 590 series), then you really should do at least one of:

    1. Downclock the overclocked "reference" card to stock levels.
    2. Overclock the competing AMD card by a comparable amount (or use a factory overclocked AMD card as the reference point)
    3. Publish a review using a card that runs at reference clocks, and then afterwards publish a separate review for the overclocked card.
    4. Overclock both cards to their maximum stable levels, and use that as the comparison point.

    Come on now, this is supposed to be Anandtech. It shouldn't be up to the readers to lecture you about journalistic integrity or the importance of providing proper "apples to apples" comparisons. Using a factory-overclocked card to establish a performance baseline for a new series of GPUs is biased, no matter how small and insignificant the overclock may seem. Cut it out.
