Performance: x16 vs. x16/x2 vs. x8/x8 (SLI)

The best way to verify the success of the mod was to run benchmarks. We had already done extensive testing of SLI performance in Anand's review, NVIDIA's GeForce 6 SLI: Demolishing Performance Barriers. To get right to the point, we tested the Ultra modded to SLI with Half Life 2, Doom 3, and Far Cry at both 1280x1024 and 1600x1200. We also benchmarked at both resolutions with and without the eye candy, since Anti-Aliasing and Anisotropic Filtering can exact a large performance hit on a single GPU.

We were also interested in exactly what performance two video cards could deliver on the Ultra board before the mod to SLI, so we benchmarked the Ultra's x16/x2 dual-video-card mode as well.
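Before getting to the numbers, it is worth working out the raw PCI Express bandwidth behind each slot arrangement. The short sketch below is a back-of-the-envelope illustration of our own (not part of the benchmark suite), assuming the PCIe 1.x rate of 250 MB/s per lane in each direction:

    # Back-of-the-envelope PCIe 1.x bandwidth for each slot configuration.
    # The 250 MB/s per-lane figure is the PCIe 1.x spec rate after 8b/10b
    # encoding overhead; real throughput is lower, but the ratios show why
    # the x2 slot is the potential bottleneck in x16/x2 mode.

    LANE_MBPS = 250  # PCIe 1.x: 250 MB/s per lane, per direction

    configs = {
        "x16 (single card)":        [16],
        "x16/x2 (Ultra dual-card)": [16, 2],
        "x8/x8 (nVidia SLI)":       [8, 8],
    }

    for name, lanes in configs.items():
        slots = " + ".join(f"{n * LANE_MBPS / 1000:.1f} GB/s" for n in lanes)
        print(f"{name:26}  {slots} per direction")

    # Prints:
    #   x16 (single card)           4.0 GB/s per direction
    #   x16/x2 (Ultra dual-card)    4.0 GB/s + 0.5 GB/s per direction
    #   x8/x8 (nVidia SLI)          2.0 GB/s + 2.0 GB/s per direction

In other words, the second card in x16/x2 mode gets only a quarter of the bus bandwidth that each card gets in x8/x8 - worth keeping in mind when reading the charts that follow.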

All tests were run on a DFI LANParty UT nF4 Ultra-D and a DFI LANParty nF4 SLI-DR. We first confirmed that test results were the same on the LANParty UT modified to SLI and on the LANParty nF4 SLI-DR, which is a native SLI chipset board. There was no difference in performance after the SLI modification to the Ultra chipset, so results are reported simply as SLI and apply to either a native SLI board or an Ultra modified to SLI.

Video cards were a single MSI 6800 Ultra PCIe or a matched pair of MSI 6800 Ultras for the SLI and x16/x2 modes. Memory in all benchmarks was OCZ 3200 Platinum Rev. 2 (Samsung TCCD) at 2-2-2-10 timings. The CPU was an Athlon 64 4000+, and the power supply was an OCZ PowerStream 600.

In the course of testing, we found that we could actually run x16/x2 mode on either the SLI board or the Ultra board by leaving the jumpers in normal mode, using an SLI bridge across the two video cards, and enabling SLI in the nVidia driver. Results in x16/x2 mode were, as expected, the same on the SLI board, on the nF4 Ultra board as shipped, and on the Ultra after the SLI modification. The one huge advantage of the SLI mod was that once we had SLI-modded the Ultra chip, we could run x16/x2 mode with any nVidia ForceWare driver up to 70.xx; 70.90 was the highest driver to support x16/x2 mode, even with an SLI chip. x16/x2 would not run, however, with the most recent 71.xx drivers. The 71.xx drivers report the board to be SLI-capable, but they do not recognize the second card as an appropriate card for SLI. Clearly, nVidia must have turned off x16/x2 support in the most recent driver as well, allowing only their specified x8/x8 mode to work. We suspect that enthusiasts will find a way around this very quickly.
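To keep the driver situation straight, here is a quick summary of what we observed, expressed as a small sketch; only the driver series we actually tested are listed, and the exact cut-off within each series may vary:

    # Which ForceWare driver series allowed x16/x2 mode in our testing on
    # the SLI-modded Ultra board; only series we actually tried are listed.
    x16x2_support = {
        "66.xx": True,   # worked
        "70.xx": True,   # 70.90 was the highest driver to allow x16/x2
        "71.xx": False,  # reports SLI-capable but refuses the second card
    }

    print([series for series, ok in x16x2_support.items() if ok])
    # ['66.xx', '70.xx']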

UPDATE: The Gigabyte 3D1 is a single video card with two 6600GT GPUs. It will only work in x8/x8 (nVidia) SLI mode on a Gigabyte SLI board. However, we did find that the 3D1 will operate in x16/x2 mode on both DFI boards with the jumpers in the "normal" position. We have added test results to our charts for both a single 6600GT and the 3D1 in x16/x2 dual-video mode. The Gigabyte 3D1 thus offers the interesting possibility of a form of SLI performance on single-x16-slot Ultra boards with the SLI mod.

Comments

  • Klaasman - Tuesday, January 18, 2005 - link

    What would be sweet is TWO Gigabyte 3D1s for a total of four GPUs.
  • mclearn - Tuesday, January 18, 2005 - link

    Wesley,
    Are there any visible differences between the DFI Ultra and SLI boards? Maybe the fix for this is as simple as the chipset's "upgrade".
  • cnq - Tuesday, January 18, 2005 - link

    Wesley,
    Nice article, and thanks for switching your SLI measurements to 12x10 & 16x12 (I savaged you for including 10x7 in a prev review, which probably no SLI user will bother with).

    One suggestion for future FarCry runs on SLI: please try with the magnificent eye candy setting "HDR" enabled (new with FC v1.3). It looks great, but is a graphics card crippler -- and thus the **perfect** test for a 2x6800U SLI system.
  • Wesley Fink - Tuesday, January 18, 2005 - link

    Unfortunately, the Gigabyte 3D1 dual-GPU 6600GT does NOT work on the DFI when the jumpers are switched to SLI mode. The nVidia driver sees that the system is SLI-capable, but it does not recognize the 2nd GPU as there for SLI. This is true with the 66.93, 70.90, and 71.40 drivers. If the Gigabyte single-slot dual-GPU card worked with more boards, they would sell a lot more of them.

    However, the Gigabyte 3D1 in x16/x2 mode performs quite well when jumpers are left in normal mode. After the mod to SLI it works fine with drivers to 70.xx.
  • crazyeddie - Tuesday, January 18, 2005 - link

    It would be a little late in the game for Nvidia to go back and re-engineer the NF4 chipset to make it less moddable (new word?). Nvidia will either:

    A) Let the NF4 Ultra out the door as is, eat the lost sales of the more profitable SLI chipsets, and take solace in the fact that their graphics card sales will be quite brisk. This presumes that they can actually ship enough video chips to take advantage of the increased demand.

    B) Dry up the current supply of Ultra chipsets and go back to the drawing board to disable them more thoroughly. They would miss out on shipping Ultra chipsets to motherboard manufacturers, which may or may not cause contract problems. It would ensure the continued desirability of the SLI chipset at higher margins, however.

    I've personally been hoping for an inexpensive PCI-E board for the Socket 939 Athlon 64 platform that I can pair with a Radeon X800XL. This news story jeopardizes the shipment of the NF4 Ultra if Nvidia is determined to protect margins at the expense of overall volume. I guess it's no loss to me, because I couldn't buy an X800XL right now anyway.

    I guess we'll have to wait and see whether Nvidia wants to focus on volume or margin-per-unit.
  • Wesley Fink - Tuesday, January 18, 2005 - link

    #22 - Your idea was so intriguing that I had to give it a try. With the DFI UT Ultra modded to SLI, the jumpers in the normal (non-SLI) position, and the Gigabyte dual-GPU 3D1, I was able to run SLI fine with drivers up to 70.xx. This suggests that the Gigabyte dual 6600GT might run in the single slot of any nF4 Ultra motherboard in a "semi-SLI" mode. That means two video cards are potentially NOT required. Modding to SLI would enable a wider range of working drivers. More testing needs to be done before reaching any conclusions.

    I am getting ready to try the Gigabyte 3D1 now in full SLI (x8/x8) mode to see if that works on the DFI.
  • ChiefNutz - Tuesday, January 18, 2005 - link

    #23, #27 - I dunno; when you go to the DFI website, it shows a picture with the SLI bridge in the box contents for the Ultra:
    http://www.dfi.com.tw/Upload/Product_Picture/Cable... for the box contents, and
    http://www.dfi.com.tw/Product/xx_product_spec_deta...
    for the main product page. It's sitting right there in 3 of the 4 pictures, right below the Package listing. What gives?
  • adnauseam - Tuesday, January 18, 2005 - link

    The one thing I think everyone is missing is.... where do you get an SLI bridge without purchasing an actual SLI board? Remember, the bridges ship with the boards, not the cards, because the spacing between the PCIe slots could be different on each manufacturer's board. It seems the only way to get one would be to wait until someone who doesn't plan on using it sells one on eBay. Unless........ there is a way to purchase a replacement.... I'll have to check that out, actually.............
  • razor2025 - Tuesday, January 18, 2005 - link

    This is an awesome find. It gives us more choices. I was planning to buy an NF4 board so that I can use the X800XL I have on pre-order. I didn't want an SLI-ready setup, because the cost is too much. However, if I can get an SLI-capable board (after the hack) like the DFI UT for around $130-140, I'll definitely go for it. Most single-slot NF4 boards are fetching around $130-140, and if the DFI UT and the Epox board retail for around the same price, everyone who wanted a single-slot NF4 will change their decision to these awesome boards. Even though I won't be doing SLI anytime soon, I'm sure there will be capable and cheaper cards that can run SLI on these boards down the road. If that doesn't happen, oh well, I still needed a PCI-E board for the Athlon 64.
