Final Thoughts

If my final thoughts start sounding like a broken record, it’s because once again a set of NVIDIA & AMD product launches have resulted in a pair of similarly performing products.

The crux of the matter is that NVIDIA and AMD have significantly different architectures, and once again this has resulted in cards that are quite equal on average but are all over the place in individual games and applications. If we just look at the mean performance lead/loss for all games at 2560, the GTX 590 is within 1% of the 6990; however, within those games there’s a great deal of variance. The GTX 590 does extremely well in Civilization V as we’d expect, along with DIRT 2, Mass Effect 2, and HAWX. Meanwhile in Crysis, BattleForge, and especially STALKER the GTX 590 comes up very short. Thus choosing the most appropriate card is heavily reliant on what games are going to be played on it, and as a result there is no one card that can be crowned king.

Of the games NVIDIA does well in, only Civ5 is one we’d classify as highly demanding; the rest are games where the GTX 590 is winning, but it’s also already getting 100+ frames per second. Meanwhile in the games AMD does well in, the average framerate is much lower, and all of them are what we’d consider demanding. Past performance does not perfectly predict future performance, but there’s a good chance the 6990 is going to have a similar lead on future, similarly intensive games (at least as long as extreme tessellation isn’t a factor). So if you had to choose a card based on planned future use as opposed to current games, the 6990 is probably the better choice from a performance perspective. Otherwise, if you’re choosing based on the games you’d play today, you need to look at the individual games.

With that said, the wildcard right now is noise. Dual-GPU cards are loud, but the GTX 590 ends up being the quieter of the two by quite a bit; the poor showing of the 6990 ends up making the GTX 590 look a lot more reasonable than it necessarily is. The situation is a lot like the launch of the GTX 480, where we saw the GTX 480 take the performance crown, but at the cost of noise. The 6990’s performance advantage in shader-intensive games goes hand-in-hand with a much louder fan; whether this is a suitable tradeoff is going to be up to you to decide.

Ultimately we’re still looking at niche products here, so we shouldn’t lose sight of that fact. A pair of single-GPU cards in SLI/CF is still going to be faster and a bit quieter if not a bit more power hungry, all for the same price or less. The GTX 590 corrects the 6990’s biggest disadvantage versus a pair of single-GPU cards, but it ends up being no faster on average than a pair of $280 6950s, and slower than a pair of $350 GTX 570s. At the end of the day the only thing really threatened here is the GTX 580 SLI; while it’s bar none the fastest dual-GPU setup there is, at $1000 for a pair of the cards a quad-GPU setup is only another $400. For everything else, as was the case with the Radeon HD 6990, it’s a matter of deciding whether you want two video cards on one PCB or two PCBs.

Quickly, let's also touch upon factory overclocked/premium cards, since we had the chance to look at one today with the EVGA GeForce GTX 590 Classified. EVGA’s factory overclock isn’t anything special; indeed, if it were much smaller it wouldn’t even be worth the time to benchmark. Still, EVGA is charging 4% more for about as much of a performance increase, and couples that with a lifetime warranty; ignore the pack-in items and you have the usual EVGA value-added fare. All told it’s a reasonable deal, particularly when most other GTX 590s don’t come with that kind of warranty. Meanwhile EVGA’s overclocking utility suite is nice to see as always, though with the changes to OCP (and the inability to see when it kicks in) I’m not convinced the GTX 590 is a great choice for end-user overclocking right now.

Update, April 2nd, 2011: Starting with the 267.91 and release 270 drivers, NVIDIA has disabled overvolting on the GTX 590 entirely. This is likely a consequence of several highly-publicized incidents where GTX 590 cards died as a result of overvolting. Although it's unusual to see a card deliberately designed not to be overclockable, this is clearly where NVIDIA intends to be.

Finally, there’s still the multi-monitor situation to look at. We’ve only touched on a single monitor at 2560; with Eyefinity and NVIDIA/3D Vision Surround things can certainly change, particularly with the 6990’s extra 512MB of RAM per GPU to better handle higher resolutions. But that is a story for another day, so for that you will have to stay tuned…


  • Ruger22C - Thursday, March 24, 2011 - link

    Don't spew nonsense to the people reading this! Write a disclaimer if you're going to do that.
  • The Finale of Seem - Saturday, March 26, 2011 - link

    Um...no. For one, HUD elements tend to shrink in physical size as resolution rises, meaning that games with a lot of HUD (WoW comes to mind) benefit by letting you see more of what's going on, which means that 720p is pretty friggin' awful. For two, 1920x1080 has become the standard for most monitors over 21" or so, and a lot of gamers get 1920x1080 displays, especially if they're also watching 1080p video or doing significant multitasking. Non-native resolutions look like ass, and as such, 1680x1050 is right out as you won't want to play at anything but 1920x1080.

    Now, you can say that there isn't much point going above that, and right now, that may be so as cost is pretty prohibitive, but that may not always be the case.
  • rav55 - Thursday, March 31, 2011 - link

    What good is it if you can't buy it? Nvidia cherry-picked the GPUs to work on this card and could only release a little over 1000 units. It is now sold out in the US and available in limited amounts in Europe.

    Basically the GTX 590 is vapourware!!! What a joke!
  • wellortech - Thursday, March 24, 2011 - link

    Reviews seem to still agree that 6950CF or 570 SLI are just as powerful, and much less expensive. Guess I'll be keeping my pair of 6950s while continuing to enjoy 30" 2550x1600 heaven.
  • DanNeely - Thursday, March 24, 2011 - link

    Yeah, these only really make sense if you're going for a 4-GPU setup in an ATX box, or have a larger mATX case and want 2 GPUs and some other card.
  • jfelano - Thursday, March 24, 2011 - link

    You go boy. I'll continue to have a life.
  • The_Comfy_Chair - Thursday, March 24, 2011 - link

    Get over yourself.

    YOU are trolling on a forum about a video card on a tech-geek site on the internet. You have no more of a life than wellortech or anyone else here - self included.
  • ShumOSU - Thursday, March 24, 2011 - link

    You're 16,000 pixels short. :-)
  • egandt - Thursday, March 24, 2011 - link

    Would have been better to see what these cards did with 3x 1920x1200 displays, as obviously they are overkill for any single display.
  • Dudler - Thursday, March 24, 2011 - link

    Couldn't agree more, but since we know from the 1.5GB 580 that the nVidia cards do poorly at higher resolutions, AnandTech will probably never test any such setup. Expect 12x10 instead, as nVidia tends to do better at low resolutions than AMD. 19x12 is already irrelevant with these cards.
