Setting up SLI

NVIDIA's nForce4 SLI reference design calls for a small slot on the motherboard that determines how many PCI Express lanes are routed to the second x16 slot. Remember that although there are two x16 slots on the motherboard, only 16 lanes in total are allocated between them at most - meaning that each slot is electrically only a x8, but with a physical x16 connector. While a x8 link offers less bandwidth than a full x16 implementation, the real-world performance impact is essentially nil. In fact, gaming performance barely changes down to even a x4 configuration, and even a x1 configuration carries only a small penalty.
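To put some numbers behind that: first-generation PCI Express runs at 2.5 GT/s per lane with 8b/10b encoding, which works out to 250 MB/s of usable bandwidth per lane, per direction. A quick sketch of what each slot width delivers:

```python
# PCIe 1.x: 2.5 GT/s per lane, 8b/10b encoded -> 250 MB/s usable
# per lane, per direction.
PCIE1_MB_PER_LANE = 250

def slot_bandwidth_mb(lanes: int) -> int:
    """Peak one-directional bandwidth of a PCIe 1.x slot, in MB/s."""
    return lanes * PCIE1_MB_PER_LANE

for lanes in (16, 8, 4, 1):
    print(f"x{lanes}: {slot_bandwidth_mb(lanes)} MB/s")
```

So each x8 slot in SLI mode still has 2 GB/s in each direction available to it, which helps explain why the drop from x16 is invisible in practice.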



The SLI card slot looks much like an SO-DIMM connector:



The card itself can be inserted in one of two orientations: in one, it configures the PCI Express lanes so that only one of the slots is a x16; in the other, the 16 PCI Express lanes are split evenly between the two x16 slots. You can run a single graphics card in either mode, but in order to run a pair of cards in SLI mode you need the latter configuration. There are ways around NVIDIA's card-based design to reconfigure the PCI Express lanes, but none of them to date is as elegant - they require a long row of jumpers instead.
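The paddle card is effectively a two-position switch over the lane routing. A minimal sketch of the two states (the names `PaddleMode` and `lane_allocation` are ours for illustration, not NVIDIA's):

```python
from enum import Enum

class PaddleMode(Enum):
    SINGLE = "single"  # all 16 lanes routed to the first x16 slot
    SLI = "sli"        # lanes split evenly between the two slots

def lane_allocation(mode: PaddleMode) -> tuple:
    """Return (slot1_lanes, slot2_lanes) for a given paddle orientation."""
    return (16, 0) if mode is PaddleMode.SINGLE else (8, 8)
```

Either orientation works with one card installed; SLI requires the 8/8 split.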



With two cards installed, a bridge PCB is used to connect the golden fingers atop both of the cards. Only GeForce 6600GT and higher cards will feature the SLI-enabling golden fingers, although we hypothesize that nothing has been done to disable SLI on the lower-end GPUs other than a non-accommodating PCB layout. With a little engineering effort, we believe that video card manufacturers could come up with a board design to enable SLI on both 6200 and 6600 non-GT cards. We've talked to manufacturers about doing this, but we'll have to wait and see what comes of their experiments.

As far as board requirements go, the main thing to make sure of is that both of your GPUs are identical. Clock speeds don't have to match, but NVIDIA's driver will set the clocks on both boards to the lowest common denominator. Combining different GPU types (e.g. a 6600GT and a 6800GT) is not recommended; the driver may still allow it, but it can produce some rather strange results in certain cases.
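The lowest-common-denominator clocking behavior can be sketched as follows; the clock figures in the usage are illustrative only, not actual board specifications:

```python
def effective_clocks(card_a: dict, card_b: dict) -> dict:
    """Model of the driver's behavior: run both boards at the
    slower card's core and memory clocks."""
    return {
        "core": min(card_a["core"], card_b["core"]),
        "mem": min(card_a["mem"], card_b["mem"]),
    }

# Hypothetical example: a stock card paired with a factory-overclocked one.
stock = {"core": 500, "mem": 900}
overclocked = {"core": 525, "mem": 1000}
print(effective_clocks(stock, overclocked))  # both run at the stock clocks
```

In other words, pairing an overclocked board with a stock one throws away the overclock.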

You only need to connect a monitor to the first PCI Express card; despite having two graphics cards, only the video outputs on the first card will work, so anyone wanting both a quad-display setup and SLI is somewhat out of luck. I say somewhat because if you toggle off SLI mode (a driver option), the two cards work independently and you can run a 4-head display configuration. With SLI mode enabled, however, the outputs on the second card go blank. That's not too inconvenient on its own, but currently you need to reboot between SLI mode changes, which could get annoying for those who only want SLI enabled in games and 4-display output the rest of the time.

We used a beta version of NVIDIA's 66.75 drivers with SLI support enabled for our benchmarks. The 66.75 driver includes a configuration panel for Multi-GPU as you can see below:

Clicking the check box requires a restart to enable (or disable) SLI, but after you've rebooted everything is good to go.

We mentioned before that the driver is very important to SLI performance. The reason is that NVIDIA has implemented several SLI algorithms in the driver to determine how rendering is split between the graphics cards, depending on the application and load. For example, in some games it may make sense for one card to render a certain percentage of the screen while the other handles the remainder; in others, it may make sense for each card to render a separate frame entirely. The driver will alternate between these algorithms, or even disable SLI altogether, depending on the game. The other important thing to remember is that the driver is also responsible for the rendering split between the GPUs: having each GPU render 50% of the scene doesn't always work out to an even workload, so the driver has to estimate what rendering ratio would put an equal load on both GPUs.
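NVIDIA hasn't disclosed how its driver balances the split-frame case, but the idea can be illustrated with a hypothetical feedback loop: measure how long each GPU took on its portion of the last frame, then nudge the split toward the GPU that finished first. Everything below is our sketch, not NVIDIA's code:

```python
def rebalance_split(split: float, time_gpu0: float, time_gpu1: float,
                    step: float = 0.05) -> float:
    """Adjust the screen-split ratio based on last frame's render times.

    `split` is the fraction of the screen assigned to GPU 0. If GPU 0's
    portion took longer, shrink its share; if GPU 1's took longer, grow it.
    The result is clamped so neither GPU is ever starved of work.
    """
    if time_gpu0 > time_gpu1:
        split -= step
    elif time_gpu1 > time_gpu0:
        split += step
    return min(max(split, 0.1), 0.9)
```

Run every frame, a loop like this converges on whatever ratio equalizes the two GPUs' render times, even when scene complexity is concentrated in one region of the screen.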


  • Dasterdly - Saturday, October 30, 2004 - link

    I'm willing to settle :p
    Also the 2 GPUs on one card, or even on one chip, would be good. Probably what ATI should/will do now to keep up.
    I had the V2 12MB and it was the fastest card for my games for more than a year. After that I bought another one and was good for another year or so till the GF2 GTS came.
    With the product cycles bumped up by Nv (and everyone else to compete) to 6 mo, I don't know if it would be worth it till they reach their cap.
  • Grishnakh - Saturday, October 30, 2004 - link

    Well, human beings seem predisposed to criticize what they just don't need.
    If you think SLI means nothing to you, that means you just don't need these behemoths, so you'll never buy nF4 SLI, KT890, etc., and SLI simply isn't your concern.
    And honestly, I wonder what kind of loss this is for nVidia. If you don't need it, fine - most products meet your demand. If you do need it, even better: you'd pay double, and the company earns double.
    SLI is a little like dual CPUs - there's always a certain population, though not a large one, that needs it.
  • GhandiInstinct - Friday, October 29, 2004 - link

    Well X2 utilizes each GPU to perform half the screen making a more efficient cooperative effort than SLI. Plus you won't need to keep updating your drivers like SLI and the drivers will come straight from AlienWare.

    It's more appealing to use any combination of GPUs you want rather than SLI. So I want the best performance so I have to pay a premium to be stuck with Nvidia again? Not making that mistake again...
  • caliber fx - Friday, October 29, 2004 - link

    Wonder why a lot of you are saying that the driver needs to be "specially written" for a game, because even Anand said that "In our conversation with NVIDIA, we noted that this technology should work with all software without any modification necessary". If you're talking about driver tweaking, then even single-GPU solutions are guilty of that one. The tweaks toward the NV30, or ATI with their AI solution, are just a few examples, and I bet if the previewer had more time with the system in the right place he would have run many other applications. I think most of you have gotten dual-core CPUs mixed up with SLI, and I don't blame you, because there are so many just-introduced features that are currently not in use in a lot of software, like AMD64, SSE3, PS 3.0 and multithreading. Funny thing: if there are games out there that can take advantage of all these features to the fullest, I can't imagine what that would produce, and the thing is, all these features can be implemented on one machine. Also, that Alienware solution seems less efficient than SLI.
  • GhandiInstinct - Friday, October 29, 2004 - link

    I'm sure everyone agrees that the drawback with this technology is it only supports inferior Nvidia GPUs.

    I'm looking forward to Alienware's X2 technology that combines any gpu combination at a much more efficient architecture.
  • TrogdorJW - Friday, October 29, 2004 - link

    My only question is about the small HSF on the NF4 Ultra chipset. That appears to sit directly underneath the second PCIe slot. Kind of odd, that. How difficult was it to install the cards in that board, Anand? It will also be interesting to see how performance changes over time. With their higher clock speed, I think SLI 6600GT should do better than a 6800GT. Seems like a driver optimization problem to me, although the lack of RAM might also come into play.

    And #11, what was that crap about requiring more geometry processing power to do SLI!? Do you have some reference that actually states this? Seems to me like it's just a blatant guess with not a lot of thought behind it. A card might need to do more geometry work in SLI relative to non-SLI, but twice as much? Hardly. I have a strong suspicion that the vast majority of applications do not come near to maxing out the geometry processing of current cards. Look at 6600GT vs. X700XT: 3 vertex pipelines vs. 6 vertex pipelines. Why then does the 6600GT win out in the majority of tests?
  • Reflex - Friday, October 29, 2004 - link

    #44: Why would DD encoding be a selling point? It is a compression algorithm among other things, and as a result it will degrade your sound quality. It makes sense for DVDs, but for quality PC audio it makes no sense at all. If you want multi-channel (sound on your back speakers), just use analog connections and specify in the control panel for whatever card you're using that you'd like it; most give the option.

    Contrary to popular misconception, Dolby Digital, while nice for movies, is a bad thing for PC audio in general. It is one of the reasons that the SoundStorm is not considered a high-end solution, despite how nVidia marketed it. Regardless, if you use a digital connection and you have a DD source (a DVD movie, for instance), your sound card, no matter what brand, will pass that signal through to your receiver and allow it to decode DD.
  • DrumBum - Friday, October 29, 2004 - link

    is it possible to run three monitors off of an SLI setup and run extended desktop across all three?

    (play a game or watch a dvd across three monitors)
  • Mrvile - Friday, October 29, 2004 - link

    Wow nVidia totally blew ATI away in Farcry (which is weird cuz Farcry is Direct3D) according to http://www.anandtech.com/video/showdoc.aspx?i=2044... benchmarks. But these are kinda old benchies, from May...
  • gplracer - Friday, October 29, 2004 - link

    I think this is a good solution for the time being. If I were going to build a new system I would want the NF4 with SLI capabilities. What if someone bought this board and one 6800 GT? Then at a later date, would it be possible to buy another, newer nvidia card and run it in SLI, or would it have to be the exact same card? Also, no one has noted that this SLI capability is great for AMD and not so good for Intel. Some people will want this, and Intel has nothing to currently offer that I am aware of.
