Setting up SLI

NVIDIA's nForce4 SLI reference design calls for a small slot on the motherboard that holds a card controlling how many PCI Express lanes go to the second x16 slot. Remember that despite the two x16 slots on the motherboard, there are still only 16 total lanes allocated to them at most - meaning that in SLI mode each slot is electrically only a x8, but with a physical x16 connector. While a x8 link means less bandwidth than a full x16 implementation, the real-world performance impact is negligible. In fact, gaming performance barely changes even in a x4 configuration, and even the impact of a x1 configuration is negligible.
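As a rough back-of-the-envelope check, the per-direction bandwidth of each link width can be computed from first-generation PCI Express signaling (2.5 GT/s per lane, which after 8b/10b encoding works out to roughly 250 MB/s per lane per direction):

```python
# Approximate per-direction bandwidth of a first-generation PCI Express link.
# 2.5 GT/s per lane with 8b/10b encoding yields ~250 MB/s per lane.
MB_PER_LANE = 250

def link_bandwidth_mb(lanes: int) -> int:
    """Per-direction bandwidth in MB/s for a link of the given width."""
    return lanes * MB_PER_LANE

for lanes in (16, 8, 4, 1):
    print(f"x{lanes}: {link_bandwidth_mb(lanes)} MB/s")
```

So each x8 slot in SLI mode still offers about 2 GB/s in each direction - far more than the games of the day actually use, which is why the drop from x16 goes unnoticed.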


The SLI card slot looks much like a SO-DIMM connector:


The card itself can be inserted in one of two orientations: in one direction, it configures the PCI Express lanes so that only one of the slots is a x16; in the other, the 16 PCI Express lanes are split evenly between the two x16 slots. You can run a single graphics card in either mode, but to run a pair of cards in SLI mode you need the latter configuration. There are ways around NVIDIA's card-based design to reconfigure the PCI Express lanes, but none to date are as elegant, as they require a long row of jumpers.


With two cards installed, a bridge PCB connects the golden fingers atop both cards. Only GeForce 6600GT and higher cards feature the SLI-enabling golden fingers, although we suspect that nothing disables SLI on the lower-end GPUs other than a non-accommodating PCB layout. With a little engineering effort, we believe video card manufacturers could come up with a board design that enables SLI on 6200 and 6600 non-GT cards as well. We've talked to manufacturers about doing this, but we'll have to wait and see what comes of their experiments.

As far as board requirements go, the main thing to make sure of is that both of your GPUs are identical. Clock speeds don't have to match; NVIDIA's driver will set the clocks on both boards to the lowest common denominator. Combining different GPU types (e.g., a 6600GT and a 6800GT) is not recommended - the driver may still allow it, but it can produce some rather strange results in certain cases.
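In effect, the driver's clock matching amounts to something like the following sketch (purely hypothetical - the actual logic is internal to NVIDIA's driver):

```python
# Hypothetical sketch of matching two boards' clocks to the lowest
# common denominator before enabling SLI, as described above.
def match_clocks(card_a: dict, card_b: dict) -> dict:
    """Return the core/memory clocks both boards would actually run at."""
    return {
        "core_mhz": min(card_a["core_mhz"], card_b["core_mhz"]),
        "mem_mhz": min(card_a["mem_mhz"], card_b["mem_mhz"]),
    }

# e.g. a stock-clocked card paired with a factory-overclocked one:
stock = {"core_mhz": 500, "mem_mhz": 900}
overclocked = {"core_mhz": 525, "mem_mhz": 950}
print(match_clocks(stock, overclocked))  # both boards run at the stock clocks
```

The upshot is that pairing a factory-overclocked card with a stock one simply wastes the overclock.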

You only need to connect a monitor to the first PCI Express card; despite the fact that you have two graphics cards, only the video outputs on the first card will work, so anyone wanting both a quad-display setup and SLI is somewhat out of luck. I say somewhat because if you toggle off SLI mode (a driver option), the two cards work independently and you can run a 4-head display configuration. With SLI mode enabled, however, the outputs on the second card go blank. That wouldn't be too inconvenient, except that you currently need to reboot whenever you change SLI modes in software, which could get annoying for anyone who only wants SLI enabled in games and four display outputs the rest of the time.

We used a beta version of NVIDIA's 66.75 drivers with SLI support enabled for our benchmarks. The 66.75 driver includes a configuration panel for Multi-GPU as you can see below:

Clicking the check box requires a restart to enable (or disable) SLI, but after you've rebooted everything is good to go.

We mentioned before that the driver is very important to SLI performance. The reason is that NVIDIA has implemented several SLI algorithms in the driver to determine how to split up rendering between the graphics cards, depending on the application and load. For example, in some games it makes sense for one card to handle a certain percentage of the screen while the other card handles the remainder; in others, it makes sense for each card to render a separate frame. The driver will alternate between these algorithms - or even disable SLI altogether - depending on the game. The other important thing to remember is that the driver is also responsible for the rendering split between the GPUs; having each GPU render 50% of the scene doesn't always work out to an evenly split workload, so the driver has to estimate what rendering ratio would put an equal load on both GPUs.
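The split-screen case can be pictured as a simple feedback loop. The sketch below is purely illustrative (NVIDIA's real balancing heuristics are internal to the driver): after each frame, the dividing line moves toward whichever GPU finished later, so both converge on roughly equal frame times.

```python
def rebalance(split: float, time_top: float, time_bottom: float,
              step: float = 0.02) -> float:
    """Adjust the screen split (fraction rendered by the top GPU)
    so that both GPUs finish a frame at about the same time."""
    if time_top > time_bottom:
        split -= step   # top GPU is overloaded: give it less of the screen
    elif time_bottom > time_top:
        split += step   # bottom GPU is overloaded: shrink its share
    return min(max(split, 0.1), 0.9)  # keep the split within sane bounds

# Example: the top half of the frame is consistently more expensive,
# so the split drifts downward over successive frames.
split = 0.5
for _ in range(5):
    split = rebalance(split, time_top=9.0, time_bottom=6.0)
print(round(split, 2))  # the top GPU now renders about 40% of the screen
```

A real driver would weigh scene complexity rather than a fixed step, but the principle is the same: the 50/50 screen split is a starting point, not the steady state.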




  • darkrequiem - Tuesday, November 16, 2004 - link

    What I'd really like to see is how Doom3 performs with this SLI setup running in Ultra quality mode. They recommend a 512MB card for this mode, and here we have a total of 512MB between the two cards. It'd be interesting to see how many FPS can be achieved at this quality setting.
  • MightyB - Wednesday, November 10, 2004 - link

    Hmm, I'm gonna wait out for the ATI solution and hope they will make it possible to use two different types of cards. Would love to have the new x800 All in Wonder matched with another X-series card :-) This way I can get great performance along with the much better ATI picture quality and even TV.. :-)

    And before you flame me for being an ATI fanboy.. I own an Nvidia card.. I'm talking about 2D (windows desktop) and movies when I refer to picture quality. I see no real difference in games!

    Best regards
  • MiLaMber - Sunday, October 31, 2004 - link

    Perhaps slightly annoying this article to those ppl who have a Geforce 6800GT, like myself, just like the fact that the NF 4 is a no go.
    Looks like a possible upgrade to PCI Express will be on the cards in..18 months, but I see this as a good thing, better motherboards, a maturer PCI Express solution.

    Guess could always sell the 6800gt agp when the pcie version comes out though huh lol, and then start considerin SLI again and indeed NF4.
  • Reflex - Sunday, October 31, 2004 - link

    That's his point: because each card does 50% no matter how much work a section of the screen has, it won't necessarily utilize both cards fully. If you're wandering through UT2k4 and there are no vehicles in the sky, but there is a massive ground battle going on, you will see hardly any benefit from the Alienware setup, but you'd see a huge benefit from the nVidia SLI solution since it will give the card rendering the top half of the screen more to do rather than sitting idle.

    Believe what you want, but I have a bad feeling that the Alienware tech will never actually appear on the market. It was announced months ago, and now if it arrived it would more or less be an Ati only solution, since if you had nVidia cards you can do SLI natively without Alienware's technology(and nVidia's solution should be faster most of the time). It was a good attempt by them, but at this point its not worth pursuing.
  • GhandiInstinct - Sunday, October 31, 2004 - link

    Swaid: Your ratios are a bit off, did you see the E3 demonstration? Maybe you should have another look at it. The ratio is always 50/50 even if the lower or upper half has a bit more work to be done; the thing is, it's only half the screen resolution, making the cards work a lot less than if they were single. Nevertheless, it's always seamless no matter how high your graphics settings.

    Ever heard of frame locking? Frame locking synchronizes display refresh and buffer swaps across multiple cards, preventing visual artifacts and ensuring image continuity in multi-monitor (or multiple video card) applications like simulations.
  • Reflex - Sunday, October 31, 2004 - link

    Hell, I'm an anti-nVidia guy and I still recognize this as a nice deal for those of us who'd love a cheap way to upgrade in the future when a secondary card will be cheap.

    I still won't be buying it however; the 2D performance of nVidia solutions is still crap at high res, and I spend a lot more time in front of a web browser than I do in front of games. But since Civ3 is the most recent game I have purchased, I don't imagine it's really much of an issue for me. A Parhelia would meet my gaming needs just fine. ;)
  • Swaid - Sunday, October 31, 2004 - link

    His logic is correct, while yours on the other hand needs to be double checked. In your case (X2/Video Array), we can take an example of a situation where the upper half of the screen has a more complex scene (or more activity) going on, thus making Card 1 (the video card rendering the upper half) work at 100% while Card 2 (the video card rendering the bottom half) only needs to work at 80% to keep up with Card 1. That's not an efficient solution. This is where SLI's technology can really shine. If you take that same scene that's being played through and use SLI, Card 1 will now render the upper 40% while Card 2 will render the bottom 60%, and this would keep the 2 GPUs at 100% load. Now that is an efficient solution. That scenario can pretty much go for all FPS games. Dynamic load balancing appears to be a better 'logical' solution. But I am willing to bet that you are an ATI fan, or maybe it has to do with some sort of loyalty towards Alienware, so SLI leaves you feeling left behind, thus your unwillingness to reason. There are many articles that try to explain the whole situation with SLI.
  • Zebo - Sunday, October 31, 2004 - link

    Oh and thanks Anand for this exclusive. I'm with the camp which says "single card" and "frequent upgrades" only because of noise, heat, power and very low resale when I'm finally done with the two cards.
  • Zebo - Sunday, October 31, 2004 - link

    More options are always a good thing. How anyone can be down on this tech is beyond me.
  • GhandiInstinct - Saturday, October 30, 2004 - link

    SLI requires special circuitry to be incorporated into GPUs and, for extra speed gain, into core-logic. Alienware’s Video Array technology does not require any special logic to be incorporated into graphics or system chips.

    This makes it less of a driver prone problem than SLI.
