Final Words

Traditionally, dual-GPU cards have been a mixed bag. More often than not they have to sacrifice a significant amount of single-GPU performance in order to put two GPUs on a single card, and on the rare occasions where that tradeoff doesn’t happen there’s some other compromise, such as a loud cooler or immense power consumption. NVIDIA told us that they could break this tradition and put two full GTX 680s on a single card, and that they could do so while making it quieter and less power-hungry than a dual video card SLI setup. After going through our benchmarking process, we can safely say that NVIDIA has met their goals.

From a gaming performance perspective we haven’t seen a dual-GPU card reach the performance of a pair of high-end cards in SLI/CF since the Radeon HD 4870X2 in 2008, so it’s quite refreshing to see someone get so close again 4 years later. The GTX 690 doesn’t quite reach the performance of the GTX 680 SLI, but it’s very, very close. Based on our benchmarks we’re looking at 95% of the performance of the GTX 680 SLI at 5760x1200 and 96% of the performance at 2560x1600. These are measurable differences, but only just. For all practical purposes the GTX 690 is a single card GTX 680 SLI – a single card GTX 680 SLI that consumes noticeably less power under load and is at least marginally quieter too.
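For those who want to sanity-check figures like these, the math is simple; below is a minimal Python sketch, with hypothetical framerates standing in for our actual benchmark data:

```python
# Minimal sketch: express a dual-GPU card's performance as a percentage
# of a two-card SLI setup. The framerates below are hypothetical
# placeholders, not our benchmark results.

def relative_performance(card_fps: float, sli_fps: float) -> float:
    """Return card_fps as a percentage of sli_fps."""
    return 100.0 * card_fps / sli_fps

# Hypothetical average framerates at 2560x1600:
gtx690_fps = 72.0      # single-card GTX 690
gtx680_sli_fps = 75.0  # two GTX 680s in SLI

pct = relative_performance(gtx690_fps, gtx680_sli_fps)
print(f"GTX 690 delivers {pct:.0f}% of GTX 680 SLI performance")  # -> 96%
```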

With that said, this would typically be the part of the review where we would inject a well-placed recap of the potential downsides of multi-GPU technology, but in this case there’s really no need. Unlike the GTX 590 and the GTX 295, NVIDIA is not making a performance tradeoff here compared to their single-GPU flagship card. When SLI works the GTX 690 is the fastest card out there, and when SLI doesn’t work the GTX 690 is still the fastest card out there. For the first time in a long time, using a dual-GPU card doesn’t mean sacrificing single-GPU performance, and that’s a game changer.

At this point in time NVIDIA offers two different but compelling solutions for ultra-enthusiast performance: the GTX 690 and the GTX 680 SLI, and they complement each other well. In most situations the GTX 690 is going to be the way to go thanks to its lower power consumption and lower noise levels, but for cases that require fully exhausting video cards the GTX 680 SLI can offer the same gaming performance at the same price. Unfortunately we’re going to have to put AMD out of the running here; as we’ve seen in games like Crysis and Metro, the 7970 in Crossfire has a great deal of potential, but as it stands Crossfire is simply too broken overall to recommend.

The only real question, I suppose, is simply this: is the GTX 690 worthy of its $999 price tag? I don’t believe there’s any argument to be had over whether the GTX 690 is worth getting over the GTX 680 SLI, as we’ve clearly answered that above. As a $999 card it doesn’t double the performance of the $499 GTX 680, but SLI has never offered quite that much of a performance boost. At the same time, however, SLI has almost always been good enough to justify the cost of a second GPU if you must have better performance than the fastest single GPU can provide, and this is one of those times.

Is $999 expensive? Absolutely. Is it worth it? If you’re gaming at 2560x1600 or 5760x1200, the GTX 690 is at least worth considering. You can certainly get by on less, but if you want 60fps or better, and you want it with the same kind of ultra-high quality that single-GPU cards can already deliver at 1920x1080, then you can’t do any better than the GTX 690.

Wrapping things up, there is one question left that I feel we still don’t have a good answer to: how much RAM a $999 card should have. NVIDIA went with a true equal for the GTX 680 SLI, right down to the 2GB of VRAM per GPU. Looking back at what happened to the Radeon HD 5970 and its 1GB of VRAM per GPU – we can’t even run a couple of our 2560x1600 benchmarks on it, let alone our 5760x1200 benchmarks – I’m left uneasy. None of our benchmarks today seem to require more than 2GB of VRAM, but that much VRAM has been common in high-end cards since late 2010; the day will come when 2GB isn’t enough, and I’m left to wonder when. A GTX 690 with 4GB of VRAM per GPU would be practically future-proof, but with 2GB of VRAM NVIDIA is cutting it close.
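As a rough illustration of why higher resolutions put pressure on VRAM, here is a back-of-the-envelope sketch; it counts only a single 32-bit color buffer, while real-world VRAM usage is dominated by textures, geometry, and additional buffers, so treat these figures as illustrative lower bounds rather than measurements:

```python
# Back-of-the-envelope estimate of a single color buffer's size at the
# resolutions we test. Real VRAM usage (textures, geometry, extra render
# targets) is far larger; these are illustrative lower bounds only.

def render_target_mb(width: int, height: int, bytes_per_pixel: int = 4,
                     msaa: int = 1) -> float:
    """Approximate size of one color buffer in mebibytes."""
    return width * height * bytes_per_pixel * msaa / (1024 ** 2)

for w, h in [(1920, 1080), (2560, 1600), (5760, 1200)]:
    print(f"{w}x{h}: {render_target_mb(w, h):5.1f} MB per buffer, "
          f"{render_target_mb(w, h, msaa=4):6.1f} MB with 4x MSAA")
```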

Comments

  • InsaneScientist - Sunday, May 6, 2012 - link

    Except that nVidia wins in the article and all of the accumulated benches here, even at 1920x1200 (which this card would be a complete waste on...), so what exactly are you complaining about?
    It's bias if they say that the AMD cards are better when they're not, but in the benchmarks and in the conclusions (here and elsewhere), nVidia is consistently ahead, so any claims of bias are completely groundless...
  • CeriseCogburn - Tuesday, May 8, 2012 - link

    Read my first post instead of asking; or, having already read it, attack like you just did and continue to be a jerk. Who cares, right?
    You obviously are all seriously upset about the little facts I gave in my very first post. You’re all going bonkers over it, and you all know I’m correct; that’s what really bothers all of you.
    Continue to be bothered; you cannot help it, that’s for sure.
  • Sabresiberian - Thursday, May 3, 2012 - link

    It's certainly not crazy, I'd certainly run 3 1920x1200 monitors over 3 1920x1080s.

    ;)
  • CeriseCogburn - Thursday, May 3, 2012 - link

    I guess all of you got very upset that my point was made: you’re looking at a set of benchmarks biased toward AMD. I’m sure most of you are very happy about that, but angered that it has been effectively pointed out.
  • Makaveli - Thursday, May 3, 2012 - link

    The only thing we are upset about is you being a tool!

    And what point? You haven’t shown a shred of evidence to back up this bias claim, only what’s floating around in your head!
  • CeriseCogburn - Sunday, May 6, 2012 - link

    Go look at the link you missed, since you cannot read and only attack and call names.
  • james.jwb - Thursday, May 3, 2012 - link

    I always love these guys who behave like this.

    On the one hand, if they are trolling just for the reaction, it's fascinating. What kind of weird creature lies behind the internet persona? In most cases, we all know it must be a sad figure of a person with all sorts of interesting personality problems.

    But on the flip side, if this person actually means and believes what they say is some sort of honest analysis, it's just as fascinating. What kind of thick bastard must then lurk behind the keyboard in question?

    It boggles the mind :)
  • silverblue - Thursday, May 3, 2012 - link

    Reminds me of SiliconDoc. That particular numpty got banned as far as I remember.
  • Galidou - Thursday, May 3, 2012 - link

    I think that these creatures are Nvidia fanboys; they always react the same way. CeriseCogburn reminds me of one from a little while ago, though I can’t remember his name. He was convinced that the 7970’s pricing was the worst thing to happen to humanity since the birth of Justin Bieber, or at least it looked a lot like that. Sure, the price wasn’t attractive, but there’s a limit you must not cross if you want to stay in the real world.

    So as weird as these creatures can be, I believe they are the result of Nvidia funding them to spread insanity in forums about video cards. They can’t be natural things, after all; they just don’t make sense. Their closed-mindedness is second to none. Either that, or they can only type insanities, with a filter on the replies to stop any information from entering their brains.
  • Parhel - Friday, May 4, 2012 - link

    Do you really think ATI and nVidia would pay these weird, sad little trolls to piss off readers every time one of their products is reviewed? It’s an embarrassment and a distraction. No, I think they would pay someone like that NOT to talk about their products if they could. I’m sure that employees do write comments on product reviews, but guys like this are bad for business. Nobody wants someone like that on their side. If I were nVidia, I’d pay that guy to become an AMD fan!!!
