Setting up SLI

NVIDIA's nForce4 SLI reference design calls for a small slot on the motherboard that determines how the PCI Express lanes are routed to the second x16 slot. Remember that despite the fact that there are two x16 slots on the motherboard, there are still only 16 total lanes allocated to them at most - meaning that each slot is electrically only a x8, but with a physical x16 connector. While a x8 link offers half the bandwidth of a full x16 implementation, the real-world performance impact is negligible. In fact, gaming performance barely changes down to even a x4 configuration, and even a x1 configuration carries only a small penalty.
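To put the lane counts in perspective, here is a quick back-of-the-envelope sketch using the standard first-generation PCI Express figures (2.5 GT/s per lane with 8b/10b encoding, i.e. roughly 250 MB/s of usable bandwidth per lane, per direction); the function name is ours, for illustration only:

```python
# Rough per-direction bandwidth of a PCIe 1.x link.
# Assumes 2.5 GT/s signaling with 8b/10b encoding: ~250 MB/s usable per lane.
PCIE1_MBPS_PER_LANE = 250

def link_bandwidth_mbps(lanes: int) -> int:
    """Per-direction bandwidth (MB/s) for a link with the given lane count."""
    return lanes * PCIE1_MBPS_PER_LANE

for lanes in (16, 8, 4, 1):
    print(f"x{lanes}: {link_bandwidth_mbps(lanes)} MB/s")
```

Even a x8 link's 2 GB/s per direction is far more than games of this era actually push across the bus, which is why the benchmarks show so little difference.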


The SLI card slot looks much like a SO-DIMM connector:


The card itself can be inserted in one of two orientations: installed in one direction, it configures the PCI Express lanes so that only one of the slots gets all 16 lanes; in the other direction, the 16 lanes are split evenly between the two x16 slots. You can run a single graphics card in either mode, but in order to run a pair of cards in SLI mode you need the latter configuration. There are ways around NVIDIA's card-based design to reconfigure the PCI Express lanes, but none of them to date are as elegant; they require a long row of jumpers instead.


With two cards installed, a bridge PCB is used to connect the golden fingers atop both of the cards. Only GeForce 6600GT and higher cards feature the SLI-enabling golden fingers, although we hypothesize that nothing has been done to disable SLI on the lower-end GPUs other than a non-accommodating PCB layout. With a little engineering effort, we believe that the video card manufacturers could come up with a board design to enable SLI on both 6200 and 6600 non-GT cards. We've talked to manufacturers about doing this, but we'll have to wait and see what comes of their experiments.

As far as board requirements go, the main thing to make sure of is that both of your GPUs are identical. Clock speeds don't have to be the same; NVIDIA's driver will set the clocks on both boards to the lowest common denominator. Combining different GPU types (e.g. a 6600GT and a 6800GT) is not recommended; although the driver may still allow it, doing so can produce some rather strange results in certain cases.
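The lowest-common-denominator behavior can be sketched in a few lines (the function name and card data here are hypothetical; this only illustrates the clock matching, not NVIDIA's actual driver logic):

```python
def sli_effective_clocks(cards):
    """Illustrative only: run both boards at the slower card's clocks."""
    core = min(c["core_mhz"] for c in cards)
    mem = min(c["mem_mhz"] for c in cards)
    return core, mem

pair = [
    {"core_mhz": 525, "mem_mhz": 1050},  # e.g. a factory-overclocked 6600GT
    {"core_mhz": 500, "mem_mhz": 1000},  # e.g. a reference-clocked 6600GT
]
print(sli_effective_clocks(pair))  # -> (500, 1000)
```

In other words, pairing an overclocked card with a stock one simply wastes the overclock.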

You only need to connect a monitor to the first PCI Express card; despite having two graphics cards, only the video outputs on the first card will work, so anyone wanting quad-display output and SLI is somewhat out of luck. We say somewhat because if you toggle off SLI mode (a driver option), the two cards work independently and you can run a four-head display configuration. With SLI mode enabled, however, the outputs on the second card go blank. That's not too inconvenient in itself, but you currently need to reboot between SLI mode changes, which could get annoying for anyone who only wants SLI enabled for games and four display outputs the rest of the time.

We used a beta version of NVIDIA's 66.75 drivers with SLI support enabled for our benchmarks. The 66.75 driver includes a configuration panel for Multi-GPU as you can see below:

Clicking the check box requires a restart to enable (or disable) SLI, but after you've rebooted everything is good to go.

We mentioned before that the driver is very important to SLI performance. The reason is that NVIDIA has implemented several SLI algorithms in the driver to determine how to split up the rendering between the graphics cards, depending on the application and load. For example, in some games it makes sense for one card to handle a certain percentage of the screen and the other card the remaining percentage, while in others it makes sense for each card to render alternate frames. The driver switches between these algorithms, and can even disable SLI altogether, on a per-game basis. The other important thing to remember is that the driver is also responsible for the rendering split between the GPUs: having each GPU render 50% of the screen doesn't always work out to an evenly split workload, so the driver has to estimate what rendering ratio would put an equal load on both GPUs.
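The load-balancing idea behind the screen-split mode can be sketched as a simple feedback loop: measure how long each GPU took on its share of the last frame, then nudge the split toward the faster GPU. Everything below (function name, step size, timings) is invented for illustration; NVIDIA's actual heuristics are not public:

```python
def rebalance_split(split, time_top_ms, time_bottom_ms, step=0.02):
    """Toy split-frame load balancer: shrink the share of whichever
    GPU took longer on the previous frame."""
    if time_top_ms > time_bottom_ms:
        split -= step
    elif time_bottom_ms > time_top_ms:
        split += step
    return min(max(split, 0.1), 0.9)  # keep both GPUs doing some work

split = 0.5  # fraction of the screen given to the GPU rendering the top
# Suppose the top of the scene is consistently heavier: the top GPU's
# share shrinks until the frame times converge.
for _ in range(5):
    split = rebalance_split(split, time_top_ms=18.0, time_bottom_ms=12.0)
print(round(split, 2))  # -> 0.4
```

A real driver would damp the adjustment and account for per-game quirks, which is exactly why NVIDIA ships per-application profiles.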

84 Comments

  • GhandiInstinct - Saturday, October 30, 2004 - link

    I don't understand why you think X2's split work screen will be worse... Even if the scenes get more complex its still only half! So if a single X800XT renders the complex scene at 70fps then two will chop up? Your logic is FLAWED!!!
  • TrogdorJW - Saturday, October 30, 2004 - link

    61 - Sokaku, I wasn't any more rude than you were in your original post. You were incorrect in your claims, as #62 pointed out. I'll repeat: it was a claim without a whole lot of thought/research behind it. Certainly SLI isn't a huge step forward, but to call it a step backwards is ludicrous. By that logic, SMP would also be a step backwards, and dual-core would be pointless as well. Obviously, the 22 year old webmaster knows quite a few things that you don't. Being wrong at the top of your lungs is why pure guesses aren't used when writing any professional-level article.

    One thing I find odd is that there's mention of the new Scalable Link Interface SLI doing either screen division - i.e. one card renders the top 2/3 and the other renders the bottom 1/3 - or Alternate Frame Rendering (AFR). I thought ATI created and patented AFR back with their Rage MAXX card, just like 3dfx created and patented Scan Line Interleave. (One problem with Scan Line Interleave, for those that don't realize this, is that it basically makes AA impossible to do without a massive performance hit. That's why NVIDIA calls the new SLI Scalable Link Interface.) I can't see ATI allowing NV to use AFR technology without a lawsuit, unless there was some other agreement that we haven't heard about.
  • IamTHEsnake - Saturday, October 30, 2004 - link

    Come on ATi, surprise me!!!!!!!
  • GhandiInstinct - Saturday, October 30, 2004 - link

    In addition, I hate overhyping brand new technology, it's so pointless. I can see analyzing this 6 months from now after people have been using it and more benchmarks are revealed.
  • GhandiInstinct - Saturday, October 30, 2004 - link

    No one is debating that SLI delivers phenomenal performance. The issue is with the limit on manufacturing creating an "SLI monopoly" lol.

    Everyone knows Nvidia cheats on visual quality and that ATI's cards perform better on an overwhelming amount of games. So if I have $400 to spend on my SLI setup I'd go for the latter of cards. Get it? It's nothing complex here folks.
  • mkruer - Saturday, October 30, 2004 - link

    Here is my 2 cents on SLI.

    I think there is a misconception here: buying two 6600GTs at the start, thinking it is going to be cheaper than a single 6800GT, is incorrect. The TCO (total cost of ownership) for people jumping on it right away just is not there. Currently it is just as expensive as picking up a single-card solution.

    Here is where SLI makes sense.

    12-18 months down the line the next latest and greatest game will arrive demanding twice the processing power that you currently have. Now you could purchase the bleeding edge graphic card for another $400US or you could pick up another 6800GT for half that and get nearly double the performance (that would also translate into the same performance of the new card), if not better. TCO is now about 75% of picking up a new bleeding edge $400US card.

    So I guess my recommendation to all of you out there that are thinking of picking up two 6600GT, don’t. Spend the same amount of money and get the 6800GT, and in the next 12-18 months pick up a second 6800GT for half the price, and you will still be getting the same performance as nVidia’s next generation, but for half the cost.

    Possible future.

    ATI is undoubtedly working on a similar solution, and possibly working on a few “flaws” in nVidia's current design, namely the SLI bridge connection. I suspect that in the future the SLI bridge connection will disappear completely and instead be migrated to the last 8x of the 16x PCI-E connection, thereby creating a direct point-to-point connection between the two cards. The advantage is that both cards could then share their collective memory, similarly to how AMD does with its processors between memory banks. This would allow two 256MB cards to truly act as one 512MB card.
  • Tides - Saturday, October 30, 2004 - link

    dual core gpus in the future?
  • Sokaku - Saturday, October 30, 2004 - link

    #62 - PrinceGaz

    Thanks for clearing that up, I stand corrected. :-)
  • Ivo - Saturday, October 30, 2004 - link

    The enthusiast market, where the two-graphics-card SLI solution is positioned, is something like F1 for cars: it advertises and proves new technologies, but it doesn't sell directly in profitable quantities. The mainstream market will probably never adopt it, if only because it is too expensive and too noisy. Nevertheless, a modified SLI solution, with an IGP and ONE graphics card, could still be interesting for this market. In that case, the card, a 3D accelerator, should sit idle during non-intensive 3D applications, and the SLI should support effective combination of two unequal GPUs.
  • Denial - Saturday, October 30, 2004 - link

    I'm glad I can buy anything I want at my job as the CFO doesn't know what a 6800GT is. "Uhhh, it keeps the flux capacitor cool."

    How long till the SLI boards come out? Better yet, that dual SLI board from Tyan. I hope all this is out before the new year so I can slip it in with all the other end-of-year hardware purchases.
