It’s been quite a while since we’ve looked at triple-GPU CrossFire and SLI performance – or, for that matter, at GPU scaling in-depth. While NVIDIA in particular likes to promote multi-GPU configurations as a price-practical upgrade path, such configurations are still almost always the domain of the high-end gamer. At $700 we have the recently launched GeForce GTX 590 and Radeon HD 6990, dual-GPU cards whose existence hinges on how well games will scale across multiple GPUs. Beyond that we move into the truly exotic: triple-GPU configurations using three single-GPU cards, and quad-GPU configurations using a pair of the aforementioned dual-GPU cards. If you have the money, NVIDIA and AMD will gladly sell you upwards of $1500 in video cards to maximize your gaming performance.

These days multi-GPU scaling is a given – at least to some extent. Below the price of a single high-end card, our recommendation is always going to be to get a bigger card before you get more cards: multi-GPU scaling is rarely perfect, and with cutting-edge games there’s often a lag between a game’s release and the driver profile that enables multi-GPU scaling. Once we’re looking at the Radeon HD 6900 series or the GF110-based GeForce GTX 500 series though, going faster is no longer an option, and thus we have to look at going wider.

Today we’re going to be looking at the state of GPU scaling for dual-GPU and triple-GPU configurations. While we accept that multi-GPU scaling will rarely (if ever) hit 100%, just how much performance are you getting out of that 2nd or 3rd GPU versus how much money you’ve put into it? That’s the question we’re going to try to answer today.
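To make that question concrete, here is a minimal sketch of the arithmetic involved: scaling efficiency is simply the measured framerate divided by what perfect linear scaling would deliver, and value is framerate per dollar spent. The FPS and price figures below are hypothetical placeholders for illustration, not measurements from this article:

```python
# Minimal sketch of multi-GPU value math. The FPS and price numbers
# below are hypothetical placeholders, not measured results.

def scaling_efficiency(single_fps, multi_fps, num_gpus):
    """Fraction of perfect (linear) scaling actually achieved."""
    return multi_fps / (single_fps * num_gpus)

# Hypothetical: a $500 card doing 60 FPS alone, 110 FPS with two
# cards, and 150 FPS with three.
card_price = 500.0
configs = [(1, 60.0), (2, 110.0), (3, 150.0)]

for num_gpus, fps in configs:
    eff = scaling_efficiency(60.0, fps, num_gpus)
    fps_per_kilodollar = fps / (card_price * num_gpus) * 1000
    print(f"{num_gpus} GPU(s): {fps:5.1f} FPS, "
          f"{eff:.0%} of linear scaling, "
          f"{fps_per_kilodollar:.0f} FPS per $1000")
```

Under these made-up numbers the second GPU achieves 92% of linear scaling while the third drops to 83%, and each added card buys progressively fewer frames per dollar – exactly the tradeoff we’ll be quantifying with real data.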

From the perspective of a GPU review, we find ourselves in an interesting situation in the high-end market right now. AMD and NVIDIA just finished their major pushes for this high-end generation, but the CPU market is not in sync. In January Intel launched their next-generation Sandy Bridge architecture, but unlike the past launches of Nehalem and Conroe, the high-end market was initially passed over. For $330 we can get a Core i7-2600K and crank it up to 4GHz or more, but what we get to pair it with is lacking.

Sandy Bridge only supports a single PCIe x16 link coming from the CPU – an otherwise excellent CPU held back by a limited amount of off-chip connectivity: just DMI and those 16 PCIe lanes. For two GPUs we can split that into x8 and x8, which shouldn’t be too bad. But what about three GPUs? With PCIe bridges we can mitigate the issue somewhat by allowing the GPUs to talk to each other at x16 speeds and by dynamically allocating CPU-to-GPU bandwidth based on need, but at the end of the day we’re splitting a single x16 link’s worth of bandwidth across three GPUs.
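As a back-of-the-envelope illustration of what that split means, here is a quick sketch assuming PCIe 2.0’s roughly 500MB/s per lane per direction after 8b/10b encoding (real-world throughput is lower):

```python
# Back-of-the-envelope PCIe 2.0 bandwidth arithmetic for Sandy
# Bridge's 16 CPU lanes. Assumes ~500 MB/s per lane per direction
# (PCIe 2.0 after 8b/10b encoding); real throughput is lower.

PCIE2_MBPS_PER_LANE = 500  # MB/s, one direction

def avg_per_gpu_bandwidth_gbps(cpu_lanes, num_gpus):
    """Average CPU-to-GPU bandwidth with cpu_lanes shared evenly.
    A bridge chip can reallocate lanes dynamically and let GPUs
    talk to each other at full speed, but it cannot exceed the
    upstream link back to the CPU."""
    return cpu_lanes * PCIE2_MBPS_PER_LANE / num_gpus / 1000

for num_gpus, label in [(1, "x16 to a single GPU"),
                        (2, "x8/x8 split"),
                        (3, "16 lanes behind three GPUs")]:
    bw = avg_per_gpu_bandwidth_gbps(16, num_gpus)
    print(f"{label}: {bw:.2f} GB/s per GPU on average")
```

By the same math, X58’s 32 lanes leave a full 8GB/s per GPU for two cards and over 5GB/s each for three, which is why the platform choice matters here.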

The alternative is to take a step back and work with Nehalem and the X58 chipset. Here we have 32 PCIe lanes to work with, doubling the amount of CPU-to-GPU bandwidth, but the tradeoff is the CPU. Gulftown and Nehalem are capable chips in their own right, but per-clock the Nehalem architecture is normally slower than Sandy Bridge, and neither chip can clock quite as high on average. Gulftown does offer more cores – 6 versus 4 – but very few games are held back by the number of cores; instead the ideal configuration is to maximize the performance of a few cores.

Later this year Sandy Bridge-E will correct this by offering a Sandy Bridge platform with more memory channels, more PCIe lanes, and more cores; the best of both worlds. Until then it comes down to choosing one of two platforms: a faster CPU or more PCIe bandwidth. For dual-GPU configurations this should be an easy choice, but for triple-GPU configurations it’s not quite as clear cut. For now we’re going with the latter, testing on our trusty Nehalem + X58 testbed, which largely eliminates the bandwidth bottleneck in exchange for a CPU bottleneck.

Moving on, today we’ll be looking at multi-GPU performance under dual-GPU and triple-GPU configurations; quad-GPU will have to wait. Normally we only have two reference-style cards of any product on hand, so we’d like to thank Zotac and PowerColor for providing a reference-style GTX 580 and Radeon HD 6970 respectively.


  • Ryan Smith - Monday, April 4, 2011 - link

    There are 2 reasons for that:

    1) We can't immediately get another 6990. I know it seems odd that we'd have trouble getting anything, but vendors are generally uninterested in sampling cards that are reference, which is why we're so grateful to Zotac and PowerColor for the reference 580/6970.

    2) We actually can't run a second 6990 with our existing testbed. The Rampage II Extreme only has x16 slots at positions 2 and 4; position 6 is x8. The spacing needs for a 6990CF setup require 2 empty slots, meaning we'd have to install it in position 6. Worse yet is that position 6 is abutted by our Antec 1200W PSU - this isn't a problem with single-GPU cards as the blowers are well clear of the PSU, but a center-mounted fan like the 6990 would get choked just as if there was another card immediately next to it.

    We will be rebuilding our testbed for SNB and using a mobo with better spacing, but that's not going to happen right away. The point being that we're not ignoring the 590/6990 multiple card configurations, it's just not something we're in a position to test right now.
  • piroroadkill - Monday, April 4, 2011 - link

    As long as it's in the works, that's alright. Seems like you have your reasons for it being the way it is.
  • Rukur - Monday, April 4, 2011 - link

    This whole technology is stupid with monitors.

    Why don't you stitch together 3 projectors for a seamless canvas to play a game?
  • SlyNine - Monday, April 4, 2011 - link

    "This whole technology is stupid with monitors." Do you suppose neural interfaces will be her soon. kick ass.
  • Rukur - Monday, April 4, 2011 - link

    Can you read more than one sentence?
  • monkeyshambler - Monday, April 4, 2011 - link

    Interesting stuff, but for a 3-card SLI/CrossFire setup what I'd really want to see is the framerates when every setting on the card is maxed,
    e.g. 24x AA, 16x AF, high-quality settings selected in the driver control panels, etc.
    Supplement this with the performance of triple SLI on three 1920x1080 monitors @ 4x AA.
    Let's face it: if you're going to spend this sort of money (and likely on a watercooling rig too, as there's no way three cards are tolerable otherwise), you want a genuine show of why you should invest.
    The current resolutions just will never stretch the cards or enable them to differentiate significantly from a standard SLI setup.

    Hope we can see some of the above in a future article....
  • Rukur - Monday, April 4, 2011 - link

    I tend to agree. How is maxing everything out any worse than half-inch monitor bezels all over your play area?

    The whole idea of Eyefinity is stupid unless we all look through windows with one-inch gaps while racing extreme cars.

    How about some projectors stitched together for real people to actually try.
  • erple2 - Tuesday, April 5, 2011 - link

    Wasn't there an analysis a while back comparing 1x, 2x, 4x, 8x and 16x AA? I thought that the conclusion to that was that there's no discernible difference between 8x and 16x AA, and the differences between 4x and 8x were only visible in careful examination of static images. Under normal play, you couldn't actually tell any difference between them.

    Maybe I'm just remembering wrong.

    Also, I think that Ryan mentioned why they haven't yet done the triple monitor tests yet (lack of hardware).
  • DanNeely - Tuesday, April 5, 2011 - link

    That's generally correct. Tom's Hardware has run PCIe restriction tests roughly once per GPU generation. The only game that ever really suffered at x4 bandwidth was Microsoft Flight Simulator.

    PCIe bandwidth can also impact some compute tasks: Einstein@Home runs about 30% faster on a GTX 460 in an x16 slot vs. an x8.
  • fepple - Monday, April 4, 2011 - link

    With my two 5870s I have a weird problem in CrossFire. I have two screens, a 24'' LCD and a 37'' LED TV. When in CrossFire, if I play video on the second screen it gets some odd artifacts of black(ish) horizontal lines across the bottom of the screen. The only solution I've found is to not have the cards in CrossFire and plug the TV/screen into different cards for watching stuff.

    Annoying, any thoughts?
