Final Words

Traditionally, dual-GPU cards have been a mixed bag. More often than not they have to sacrifice a significant amount of single-GPU performance in order to put two GPUs on a single card, and on the rare occasions where that tradeoff doesn’t happen there’s some other compromise, such as a loud cooler or immense power consumption. NVIDIA told us that they could break this tradition and put two full GTX 680s on a single card, and that they could do so while making it quieter and less power-hungry than a dual video card SLI setup. After going through our benchmarking process we can safely say that NVIDIA has met their goals.

From a gaming performance perspective we haven’t seen a dual-GPU card come this close to the performance of a pair of high-end cards in SLI/CF since the Radeon HD 4870X2 in 2008, so it’s quite refreshing to see someone manage it again 4 years later. The GTX 690 doesn’t quite reach the performance of the GTX 680 SLI, but it’s very, very close. Based on our benchmarks we’re looking at 95% of the performance of the GTX 680 SLI at 5760x1200 and 96% of the performance at 2560x1600. These are measurable differences, but only just. For all practical purposes the GTX 690 is a single-card GTX 680 SLI – a single-card GTX 680 SLI that consumes noticeably less power under load and is at least marginally quieter too.

With that said, this would typically be the part of the review where we would inject a well-placed recap of the potential downsides of multi-GPU technology, but in this case there’s really no need. Unlike the GTX 590 and the GTX 295 before it, NVIDIA is not making a performance tradeoff here compared to their single-GPU flagship card. When SLI works the GTX 690 is the fastest card out there, and when SLI doesn’t work the GTX 690 is still the fastest card out there. For the first time in a long time using a dual-GPU card doesn’t mean sacrificing single-GPU performance, and that’s a game changer.

At this point in time NVIDIA offers two different but compelling solutions for ultra-enthusiast performance: the GTX 690 and the GTX 680 SLI, and they complement each other well. For most situations the GTX 690 is going to be the way to go thanks to its lower power consumption and lower noise levels, but for chassis that require fully exhausting video cards the GTX 680 SLI can offer the same gaming performance at the same price. Unfortunately we’re going to have to put AMD out of the running here; as we’ve seen in games like Crysis and Metro, the 7970 in Crossfire has a great deal of potential, but as it stands Crossfire is simply too broken overall to recommend.

The only real question, I suppose, is simply this: is the GTX 690 worthy of its $999 price tag? I don’t believe there’s any argument to be had with respect to whether the GTX 690 is worth getting over the GTX 680 SLI, as we’ve clearly answered that above. As a $999 card it doesn’t double the performance of the $499 GTX 680, but then SLI has never offered quite that much of a performance boost. At the same time, however, SLI has almost always scaled well enough to justify the cost of a second GPU if you must have more performance than the fastest single GPU can provide, and this is one of those times.

Is $999 expensive? Absolutely. Is it worth it? If you’re gaming at 2560x1600 or 5760x1200, the GTX 690 is at least worth considering. You can certainly get by on less, but if you want 60fps or better, and you want it with the same kind of ultra-high image quality that single-GPU cards can already deliver at 1920x1080, then you can’t do any better than the GTX 690.

Wrapping things up, there is one question left that I feel we still don’t have a good answer to: how much RAM a $999 card should have. NVIDIA went with a true equal to the GTX 680 SLI, right down to the 2GB of VRAM per GPU. Looking back at what happened to the Radeon HD 5970 and its 1GB of VRAM per GPU – we can’t even run our 5760x1200 benchmarks on it, let alone a couple of our 2560x1600 benchmarks – I’m left uneasy. None of our benchmarks today seem to require more than 2GB of VRAM, but that much VRAM has been common in high-end cards since late 2010; the day will come when 2GB isn’t enough, and I’m left to wonder when. A GTX 690 with 4GB of VRAM per GPU would be practically future-proof, but with 2GB of VRAM NVIDIA is going to be cutting it close.

Comments

  • james.jwb - Thursday, May 3, 2012 - link

    You are correct, I don't own one... I own three in triple screen. Dell U2412Ms.

    I really am at a loss as to what you are on about. It is well known that 16:10 is preferred amongst enthusiasts and professionals for a few reasons. If you want 16:9, fine, go for it, but don't act like it's weird that AT are benching with 16:10 just because you went with cheap ass 16:9 screens.
  • CeriseCogburn - Friday, May 4, 2012 - link

    Yes, of course you are at a loss; you don't understand a word, so why reply?
    You're all at a loss.
    ROFL
  • yelnatsch517 - Friday, May 4, 2012 - link

    Are you being sarcastic or an idiot?
    From my experience 1920x1200 24" monitors are the MAJORITY. My work has roughly 50 24" monitors, all at that resolution. My HP ZR24W is 1920x1200 as well. The only 1080p 24" monitor that I have even seen is the TN panel that came with an HP computer.

    If you are talking about computer monitors, 1920x1200 is the dominant resolution. If you are talking about TVs, then obviously 1080p is the norm.
  • CeriseCogburn - Saturday, May 5, 2012 - link

    There are 242 - count them, well over 200, nearly 250 - 1920x1080 monitors at the egg.
    In your great experience, there are 16 monitors that fit your 1920x1200 dreampipe FUD scenario at the egg, with most of them, well over half, over $400 each, while the 242 common monitors you all buy as you pinch every penny and whine about a $10 difference in videocard prices are well under $200 each a lot of the time.
    So now suddenly you all spend way over $300 to $400-plus for 11% more pixels... ROFL HAHAHHAHHA instead of $150 or $200...
    I guess that's why this place is so biased, the little bloggers are just as whacked when it comes to being honest.
  • InsaneScientist - Saturday, May 5, 2012 - link

    Good grief... resorting to personal attacks isn't exactly a good way to get people to listen to you.

    I'm not going to argue that 1080p isn't more common (from what I've read, no one is), because it is more common; you are quite correct there. However, I must point out that the logic you used to arrive at that conclusion is faulty:
    You're contending that 1080p is more common (it is) because there are more models available on Newegg, but just knowing how many models are available doesn't tell us how many units each of those models sells.
    If, for example, each of those 22 models of 1920x1200 monitors moves 10 times as much stock as each of the 1920x1080 models, nearly as many 1920x1200 monitors will have been sold as 1920x1080 ones.
    Now, I don't think that's likely, and I do agree with you that 1080p is more common nowadays (see next point), but your argument is invalid, even though you have come to the correct conclusion.
    Consider this: there are currently two models of iPhone available, compared to dozens of Android phones. By the same logic as you're using, I could say that the iPhone is incredibly rare - I'd get laughed out of town if I tried to make that argument.

    The second point is that 1920x1200 hasn't been nearly as rare in the past as it is today. When I bought my previous monitor and my laptop (both 1920x1200), 1080p monitors were almost unheard of. Since monitors tend to last a while, it's not at all unreasonable for a disproportionate amount of people to be using them compared to their current sales.

    Thirdly, there is a point of diminishing returns. Notice the complete lack of any benchmarks at or below 1680x1050? These cards are so fast that comparisons at those resolutions are pointless - they're all fast enough for anything you could do to them at that screen res - even Crysis. 1920x1080 almost falls into that category, heck, even 1920x1200 almost falls into that category. Benchmarks are only about who wins if there is some advantage to winning. Below 2560x1600, which card you're using is almost completely irrelevant, so why does it even matter whether they used 1920x1080 or 1920x1200?
  • CeriseCogburn - Tuesday, May 8, 2012 - link

    Blah blah blah blah and I'm still 100% correct and you are not at all.
  • Decembermouse - Tuesday, May 8, 2012 - link

    You're quite a character.
  • anirudhs - Thursday, May 3, 2012 - link

    I use 2 at work - HP ZR24W.
  • piroroadkill - Sunday, May 6, 2012 - link

    Hm, odd.
    Not only do I have a 1920x1200 monitor on my desktop, I have TWO laptops with 1920x1200 screens. Using one right now.
    Yes, they're rarer than 1080p screens, but this is a site for enthusiasts, so it's more likely that readers here are using them.
  • Ryan Smith - Thursday, May 3, 2012 - link

    The truth is a bit simpler than that. 5760x1200 is because our choice in monitors for multi-monitor testing was based on my personal monitor, which is another PA246Q. NVIDIA's limited display flexibility (same res + same sync) meant that it was easiest to just pair the PA246Q with some more PA246Qs. Consequently it's easier to just test these monitors at their native resolution when we're using NVIDIA cards.
