Performance: x16 vs. x16/x2 vs. x8/x8 (SLI)

The best way to verify the success of the mod was to run benchmarks. We had already done extensive testing of SLI performance in Anand's NVIDIA GeForce 6 SLI: Demolishing Performance Barriers. To get right to the point, we tested the Ultra modded to SLI with Half Life 2, Doom 3, and Far Cry at both 1280x1024 and 1600x1200. We also benchmarked both resolutions with and without the eye candy, since Anti-Aliasing and Anisotropic Filtering can exact a large hit on a single GPU.

We were also interested to see exactly what performance two video cards could deliver on the Ultra board before the mod to SLI, so we benchmarked the Ultra's x16/x2 dual-video card mode as well.

All tests were run on a DFI LANParty UT nF4 Ultra-D and a DFI LANParty nF4 SLI-DR. We first confirmed that test results were the same on the LANParty UT modified to SLI and the LANParty nF4 SLI, which is a native SLI chipset board. There was no difference in performance after the SLI modification to the Ultra chipset, so results are reported simply as SLI and apply equally to the native SLI board and the Ultra modified to SLI.

Video cards were a single MSI 6800 Ultra PCIe, or a matched pair of MSI 6800 Ultras for the SLI and x16/x2 modes. Memory in all benchmarks was OCZ 3200 Platinum Rev. 2 (Samsung TCCD) at 2-2-2-10 timings. The CPU was an Athlon 64 4000+, and the power supply was an OCZ PowerStream 600.

In the course of testing, we found that we could actually run x16/x2 mode on either the SLI board or the Ultra board by leaving the jumpers in normal mode, using an SLI bridge across the two video cards, and enabling SLI in the nVidia driver. As expected, x16/x2 results were identical on the SLI board, the nF4 Ultra board as shipped, and the Ultra after SLI modification. The one huge advantage of the SLI mod was that once we had SLI-modded the Ultra chip, we could run x16/x2 mode with any nVidia Forceware driver up to 70.xx. The 70.90 driver was the highest driver version to support x16/x2 mode, even with an SLI chip. x16/x2 would not run, however, with the most recent 71.xx drivers: the 71.xx drivers report the board to be SLI-capable, but they do not recognize the second card as an appropriate card for SLI. Clearly, nVidia must have turned off x16/x2 support in the most recent driver as well, allowing only their specified x8/x8 mode to work. We suspect that enthusiasts will find a way around this very quickly.

UPDATE: The Gigabyte 3D1 is a single video card with two 6600GT GPUs. It will only work in x8/x8 (nVidia) SLI mode on a Gigabyte SLI board. However, we did find that the 3D1 will operate in x16/x2 mode on both DFI boards with the jumpers in the "normal" position. We have added results for both a single 6600GT and the 3D1 in x16/x2 dual-video mode to our charts. The Gigabyte 3D1 raises the interesting possibility of a form of SLI performance on single x16-slot Ultra boards with the SLI mod.

85 Comments

  • DrDisconnect - Thursday, January 20, 2005 - link

    Does nVidia management have any links with Bausch and Lomb? They were selling the exact same contact lenses in two different product channels, i.e. daily wear and monthly wear. Those who bought daily wear threw out a perfectly good product after a few days, while those who bought monthly wear spent a fortune on the same product the daily-wear people were throwing away.

    Selling flawed chips (eg. missing pipelines) as a less powerful product I can understand. But this is just outright customer abuse by nVidia.
  • HardwareD00d - Thursday, January 20, 2005 - link

    Maybe Anand can do an article on how to make a custom SLI bridge ;) Maybe someone could create a flexible bridge that could be like a "universal adapter".
  • HardwareD00d - Thursday, January 20, 2005 - link

    unless you use the 3D1 card, #62
  • adnauseam - Thursday, January 20, 2005 - link

    #28, Please note I went to the DFI site again today and they have CHANGED the picture that was there the other day. It no longer shows an SLI bridge in the photo. See here: http://www.dfi.com.tw/Upload/Product_Picture/Cable...
    and here:
    http://www.dfi.com.tw/Product/xx_product_spec_deta...
    Compare with the photo from the #28 post. I don't see why more people are not addressing this - it makes the mod worthless if you can't get a bridge.
  • cryptonomicon - Thursday, January 20, 2005 - link

    if the new DFI board is anything like the LP nf3 250gb, it will be the best overclocking board for 939, not to mention this incredible sli exploit
  • Wesley Fink - Thursday, January 20, 2005 - link

    #57, #58, #59 - The single-card/dual-GPU Gigabyte 3D1 ran in x16/x2 dual-video mode on both DFI boards with the jumpers set to "Normal". As stated in the comments and the article, the 3D1 would not run in x8/x8 (nVidia SLI) mode with the jumpers set to SLI, because that mode requires special BIOS hooks supplied only by the Gigabyte board.

    This is not a change from what we described in the review - just more information about alternate modes.

    We do agree that the single-card/dual-GPU idea has promise for the future. That is why we tested the 3D1 on the boards and shared the results. Even in x16/x2 mode, the 3D1's performance boost compared to a single 6600GT card was significant.
  • poor Leno - Thursday, January 20, 2005 - link

    @ Wesley:

    But if all the tweaking can make the 3D1 run in "other speeds"/configs (x16/x2 and not x8/x8), it shows that there is some flexibility, maybe... is there not a way in this lifetime to run two 3D1s, maybe at 4x8? Two x8 links on one bus and two x8 on the other? Do you think a mod to dual x16 will be possible in the future? Because that way the 4x8 would be possible, I think (correct me if I'm wrong) :P
  • Pete84 - Thursday, January 20, 2005 - link

    #57 My thoughts exactly. I thought that the Gigabyte drivers were specially tweaked so that SLI would run out of a single x16 slot . . .
  • johnsonx - Wednesday, January 19, 2005 - link

    To Wesley Fink:

    What happened to the Gigabyte 3D1 board requiring special BIOS hooks to POST, which only the one Gigabyte mainboard had? I thought that was mentioned at least twice in your original article on the 3D1.
  • ksherman - Wednesday, January 19, 2005 - link

    Since obviously Anandtech doesn't mind soft-modding hardware... why not a SoftQuadro mod for the video cards? I would really like to see an article about soft-modding gaming cards into workstation graphics cards. Yes, I realise that they will not perform as well as the actual card - I wonder how close they come. Also, since cards like the 6800GT can be used in SLI, it might be interesting to see an SLI workstation setup...
