Setting up SLI

NVIDIA's nForce4 SLI reference design calls for a small slot on the motherboard that controls how many PCI Express lanes go to the second x16 slot. Remember that despite there being two x16 slots on the motherboard, at most 16 total lanes are allocated between them - meaning that each slot is electrically only a x8, but with a physical x16 connector. While a x8 connection gives each slot half the bandwidth of a full x16 implementation, the real-world performance impact is essentially nil. In fact, gaming performance barely changes even in a x4 configuration, and even a x1 configuration takes only a negligible hit.
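
To put those lane counts in perspective, here is a quick back-of-the-envelope sketch (our own illustration, assuming first-generation PCI Express: 2.5 GT/s per lane with 8b/10b encoding, or roughly 250 MB/s of usable bandwidth per lane in each direction):

```python
# Approximate one-direction bandwidth of a first-generation PCI Express
# link: 2.5 GT/s per lane, with 8b/10b encoding leaving ~250 MB/s per lane.
MB_PER_LANE = 250

def pcie_bandwidth_mb(lanes: int) -> int:
    """Rough one-direction bandwidth in MB/s for a PCIe 1.x link."""
    return lanes * MB_PER_LANE

for width in (16, 8, 4, 1):
    print(f"x{width}: ~{pcie_bandwidth_mb(width)} MB/s per direction")
```

Even the x8 link leaves about 2 GB/s in each direction, which current graphics cards rarely saturate; that is why the narrower links cost so little in practice.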

The SLI card slot looks much like a SO-DIMM connector.

The card itself can be inserted in one of two orientations: installed in one direction, it configures the PCI Express lanes so that only one of the slots is a x16; flipped around, the 16 PCI Express lanes are split evenly between the two x16 slots. You can run a single graphics card in either mode, but to run a pair of cards in SLI you need the latter configuration. There are ways around NVIDIA's card-based design for reconfiguring the PCI Express lanes, but none to date are as elegant, as they require a long row of jumpers.

With two cards installed, a bridge PCB connects the golden fingers atop both cards. Only GeForce 6600GT and higher cards feature the SLI-enabling golden fingers, although we suspect that nothing disables SLI on the lower-end GPUs other than a non-accommodating PCB layout. With a little engineering effort, we believe the video card manufacturers could come up with a board design that enables SLI on 6200 and non-GT 6600 cards as well. We have talked to manufacturers about doing this, but we will have to wait and see what comes of their experiments.

As far as board requirements go, the main thing to ensure is that both of your GPUs are identical. Clock speeds don't have to match; NVIDIA's driver will set the clocks on both boards to the lowest common denominator. Combining different GPU types (e.g. a 6600GT and a 6800GT) is not recommended - it may still be allowed, but it produces some rather strange results in certain cases.
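
The lowest-common-denominator behavior amounts to taking the lower of each pair of clocks. A tiny sketch (hypothetical clock values; the actual driver logic is NVIDIA's and not public):

```python
def sli_effective_clocks(card_a: dict, card_b: dict) -> dict:
    """Both boards end up running at the lower of each pair of clocks."""
    return {domain: min(card_a[domain], card_b[domain]) for domain in card_a}

# Hypothetical pairing: a stock 6600GT and a factory-overclocked one.
stock = {"core_mhz": 500, "mem_mhz": 500}
overclocked = {"core_mhz": 525, "mem_mhz": 550}
print(sli_effective_clocks(stock, overclocked))  # both cards run at 500/500
```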

You only need to connect a monitor to the first PCI Express card; even though you have two graphics cards, only the video outputs on the first card will work, so anyone hoping for a quad-display setup alongside SLI is somewhat out of luck. I say somewhat because if you toggle SLI mode off (a driver option), the two cards work independently and you can run a four-head display configuration. With SLI mode enabled, however, the outputs on the second card go blank. That alone isn't too inconvenient, but you currently need to reboot whenever you change SLI modes in software, which could get annoying for anyone who wants SLI enabled only for games and four display outputs the rest of the time.

We used a beta version of NVIDIA's 66.75 drivers with SLI support enabled for our benchmarks. The 66.75 driver includes a Multi-GPU configuration panel, as you can see below:

Ticking the check box requires a restart to enable (or disable) SLI, but after you've rebooted, everything is good to go.

We mentioned before that the driver is very important to SLI performance. The reason is that NVIDIA has implemented several SLI algorithms in the driver to determine how rendering is split between the graphics cards, depending on the application and load. For example, in some games it may make sense for one card to handle a certain percentage of the screen while the other card handles the remainder; in others, it may make sense for each card to render a separate frame. The driver alternates between these algorithms, and can even disable SLI altogether, depending on the game. The other important thing to remember is that the driver is also responsible for the rendering split between the GPUs: each GPU rendering 50% of the scene doesn't always work out to an evenly split workload, so the driver has to estimate what rendering ratio would put an equal load on both GPUs.
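
That kind of load estimation can be sketched as a simple feedback loop (our own illustration, not NVIDIA's actual algorithm: the split point is nudged toward whichever GPU finished its portion of the last frame sooner):

```python
def rebalance_split(split: float, t_top_ms: float, t_bottom_ms: float,
                    step: float = 0.02) -> float:
    """Adjust the fraction of the screen assigned to the 'top' GPU.

    If the top GPU took longer than the bottom GPU on the last frame,
    give it less of the screen next frame, and vice versa. Clamped so
    that neither GPU is ever starved completely.
    """
    if t_top_ms > t_bottom_ms:
        split -= step
    elif t_bottom_ms > t_top_ms:
        split += step
    return max(0.1, min(0.9, split))

# Hypothetical scene where the top half costs 1.5x as much per pixel:
split = 0.5
for _ in range(10):
    t_top = 20.0 * split * 1.5
    t_bottom = 20.0 * (1.0 - split)
    split = rebalance_split(split, t_top, t_bottom)
# The split settles near 0.4, i.e. the top GPU renders less of the screen.
```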


  • bob661 - Friday, October 29, 2004 - link

    I think some of these guys are mad because the motherboard that suits their needs won't be considered "the best". For some, it's an image thing. If it isn't, then why do you care that SLI is even available? Just buy the NF4 Ultra. Then there are some who come here just to piss people off.
  • bob661 - Friday, October 29, 2004 - link

    Two GPUs on one card would be more expensive, and there would probably be some heat issues.
  • Pete - Friday, October 29, 2004 - link

    Whoops. NV43 has only four ROPs, while NV40 has sixteen. So SLI'ed 6600GTs still have only half the ROPs of a single 6800GT. Mah bad.
  • Tides - Friday, October 29, 2004 - link

    SLI is meant for one thing, HIGH END. It's like spending $800 on an Athlon FX. Before now the option wasn't there; now it is. What's the problem?
  • Pete - Friday, October 29, 2004 - link

    Thanks for the preview, Anand (and MSI). One note:

    "At 1280 x 1024 we see something quite unusual, the 6800GT gains much more from SLI than the 6600GT. The 6800GT received a 63.5% performance boost from SLI while the 6600GT gets "only" a 45.7% improvement; given the beta nature of the drivers we'll avoid hypothesizing about why."

    Not enough RAM? 12x10 4xAA is getting pretty RAM-intensive, no? That's one of the reasons I'm not that excited about SLI'ing two 6600GTs to the level of a 6800GT, but without the extra breathing room afforded by 256MB.

    Two questions for you, too, Anand:

    (1) The 6600GT is 500MHz core, 8 pipe, 4 ROP, 500MHz 128-bit memory. The 6800GT is 350MHz core, 16 pipe, eight ROP, 500MHz 256-bit memory. All else being equal, I'd have thought the SLI'ed 6600GTs would way outperform the 6800GT because they have the same specs and a 40% higher core clock. Is this just a matter of some efficiency lost due to SLI overhead?

    (2) Is there a way to tell if the cards are rendering in "SLI" or AFR mode, or even to force one or the other? I'd be curious to know which helps which app more.
  • justauser - Friday, October 29, 2004 - link

    I don't get it. Why not just put two GPUs on one x16 card? This bridge thing is so hokey.
  • Tides - Friday, October 29, 2004 - link

    Better yet, don't buy the SLI version of the mobo; there ARE 3 versions of NF4 boards, after all.
  • Tides - Friday, October 29, 2004 - link

    Why are people complaining about an additional feature on motherboards that you are in no way forced to use? It's like having 2 AGP slots on a motherboard: it's ROOM FOR UPGRADE. What's wrong with that?
  • xsilver - Friday, October 29, 2004 - link

    I think the performance boost is viable; you just need to know when to buy.

    6600GT SLI is close to a 6800GT in most benchies, and where it isn't, that may be due to driver issues rather than raw performance... however, 2x 6600GT does not equal a 6800GT in price. But in, say, 12 months' time, will a 6600GT + the price of the old 6600GT equal or come in under the original price of a 6800GT?
    The new mainstream product in 12 months' time should still perform below a 6600GT in SLI.
    Think of it as getting a good card on "layaway" (am I saying this right? I'm not in the US :)

    The other viability is of course having 2x 6800GT and saying you've got the best performance money can buy... again, you should not be superseded within 12-18 months.

  • haris - Friday, October 29, 2004 - link

    This is a horrible move by Nvidia. Several people have already pointed out the main problems: heat, noise, power requirements, and the fact that SLI may only work if the driver supports that specific game/engine. It might work out great for Nvidia, since they will be able to get people to pay for two cards instead of a more powerful single-card solution that would work just as well, if not better, in every game. For most people, by the time they would be ready to upgrade a low-to-mid-range card, it would probably still be more cost effective to just buy a new card.

    I love the performance boost as much as the next guy/girl, but I still think that this is just plain stupid.
