Performance: x16 vs. x16/x2 vs. x8/x8 (SLI)

The best way to verify the success of the mod was to run benchmarks. We had already done extensive testing of SLI performance in Anand's "NVIDIA's GeForce 6 SLI: Demolishing Performance Barriers". To get right to the point, we tested the Ultra modded to SLI with Half Life 2, Doom 3, and Far Cry at both 1280x1024 and 1600x1200. We also benchmarked both resolutions with and without the eye candy - since Anti-Aliasing and Anisotropic Filtering can exact a large performance hit on a single GPU.

We were interested to see exactly what performance you could get with two video cards on the Ultra board before the mod to SLI, so we also ran benchmarks in the Ultra's x16/x2 dual video card mode.

All tests were run on a DFI LANParty UT nF4 Ultra-D and a DFI LANParty nF4 SLI-DR. We first confirmed that test results were the same on the LANParty UT modified to SLI and on the LANParty nF4 SLI-DR, which is a native SLI chipset board. Since there was no difference in performance after the SLI modification to the Ultra chipset, results are reported simply as SLI and apply equally to a native SLI board or an Ultra modified to SLI.

Video cards were a single MSI 6800 Ultra PCIe, or a matched pair of MSI 6800 Ultras in the SLI and x16/x2 modes. Memory in all benchmarks was OCZ 3200 Platinum Rev. 2 (Samsung TCCD) at 2-2-2-10 timings. The CPU was an Athlon 64 4000+, and the power supply was an OCZ PowerStream 600.

In the course of testing, we found that we could actually run x16/x2 mode on either the SLI board or the Ultra board by leaving the jumpers in normal mode, using an SLI bridge across the two video cards, and enabling SLI in the nVidia driver. As expected, x16/x2 results were the same whether we used the SLI board, the nF4 Ultra board as shipped, or the Ultra after SLI modification. The one huge advantage of the SLI mod was that once we had SLI-modded the Ultra chip, we could run x16/x2 mode with any nVidia Forceware driver up to 70.xx; the 70.90 driver was the highest driver to support x16/x2 mode, even with an SLI chip. x16/x2 would not run, however, with the most recent 71.xx drivers. The 71.xx drivers report the board to be SLI-capable, but they do not recognize the second card as an appropriate card for SLI. Clearly, nVidia must have turned off x16/x2 support in the most recent driver as well, allowing only its specified x8/x8 mode to work. We suspect that enthusiasts will find a way around this very quickly.
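
If you try this mod yourself, it is worth confirming which link width each card actually negotiated. As a hedged illustration (not something we used in our own testing), the short Python sketch below assumes a Linux machine with the standard pciutils lspci tool: the LnkSta field of each device reports the PCIe link width it actually trained at, which makes it easy to tell x16 from x8 or x2 operation.

```python
#!/usr/bin/env python3
"""Print the negotiated PCIe link width of every VGA device.

A minimal sketch, assuming a Linux box with the standard pciutils
'lspci' tool; run as root so the capability registers are readable.
"""
import re
import subprocess

def gpu_link_widths():
    """Return a dict mapping each VGA device header to its LnkSta width."""
    # 'lspci -vv' dumps the PCI Express capability block for each
    # device, including the LnkSta (link status) line.
    out = subprocess.run(["lspci", "-vv"], capture_output=True,
                         text=True, check=True).stdout
    widths = {}
    current = None
    for line in out.splitlines():
        if line and not line[0].isspace():
            # Unindented lines start a new device block.
            current = line if "VGA compatible controller" in line else None
        elif current and "LnkSta:" in line:
            match = re.search(r"Width (x\d+)", line)
            if match:
                widths[current] = match.group(1)  # negotiated width
    return widths

if __name__ == "__main__":
    for device, width in gpu_link_widths().items():
        print(f"{width:>4}  {device}")
```

On an x16/x2 setup you would expect one card to report x16 and the other x2, while in nVidia's specified SLI mode both cards should report x8.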

UPDATE: The Gigabyte 3D1 is a single video card with two 6600GT GPUs. It will only work in x8/x8 (nVidia) SLI mode on a Gigabyte SLI board. However, we did find that the 3D1 will operate in x16/x2 mode on both DFI boards with the jumpers in the "normal" position. We have added test results to our charts for both a single 6600GT and the 3D1 in x16/x2 dual video mode. The Gigabyte 3D1 raises the interesting possibility of a form of SLI performance on single x16-slot Ultra boards with the SLI mod.

Comments

  • nitrus - Wednesday, January 19, 2005 - link

    What's wrong with nVidia making a little money? I'd like to see a healthy ATI and nVidia so they would make cards cheaper (older models) and better every six months. If we don't support these companies, then imagine a motherboard with no expansion slots as companies start to integrate everything. ATI/nVidia are starting to branch out into other sectors, and I'd gladly support them for "cinematic" graphics. I have an AN8-SLI I bought for $199.99 from ZZF with a 6800GT and 6600GT so I can run 4 LCDs. Worth every penny...
  • Deucer - Wednesday, January 19, 2005 - link

    For those of you who are asking for an application for this mod, I have one. I want to play HL 2 now. I don't have enough money to afford an SLI rig, but I can scrape together enough for a 6600GT and this board. That would be significantly better than what I'm running right now. I want SLI as an upgrade path so that when the 6600GT costs as much as the 9600 does now, I can buy another one and get a very affordable boost in performance. And I'm relatively sure that I will be able to find an SLI bridge within the next year, so that takes care of that issue too.

    Of course, this doesn't make any sense for someone who is running dual 6800 Ultras; this is a cost-lowering solution. Think about what people on a budget are actually buying and how this could help them.
  • MarkM - Tuesday, January 18, 2005 - link

    Thank you Wesley for the intriguing article & detective work. It's really neat to have a resource who's not afraid to be your guinea pig and risk frying a very precious commodity right now, an nF4 board :)

    #30 - I don't see why A and B are exclusive -- why can't they keep producing and shipping Ultra chips in the current architecture while at THE SAME TIME preparing a change to prevent this mod from working, and then just switch to that when it's ready?

    #50 - LOL, never spent more than $200 on a single piece of computer equipment?!?! I can tell that you weren't buying computers in the 80s!!
  • Wesley Fink - Tuesday, January 18, 2005 - link

    The graphs have been updated with results from a single Gigabyte 6600 GT and the "dual 6600GT on a single card" Gigabyte 3D1 running in x16/x2 dual video mode. The Gigabyte 3D1 provides the interesting possibility of a form of SLI performance on single x16-slot Ultra boards with the SLI mod.
  • DigitalDivine - Tuesday, January 18, 2005 - link

    I have never spent more than $200 on a single piece of computer equipment. And I don't plan to in the future.

    Hopefully, VIA can get their act together quickly and release an SLI solution on the cheap for guys like me. Honestly, I bought an ASRock K7V88 at Newegg for $50 and it overclocked my 2500+ Barton to a 3200+ like a champ - and it was a VIA KT880 chipset, a very good performance competitor to the nForce2.

    I mean, I just bought an Epox nForce3 250Gb at Newegg for $70 for my Sempron 3100+ clocked at 2.4GHz, and if an SLI solution comes in at around $100, I will surely hop on the PCI Express boat, and maybe buy an Athlon 64.
  • Wesley Fink - Tuesday, January 18, 2005 - link

    #49 - Corrected. Thanks for bringing this to our attention.

    ALL - I have some very interesting numbers with the Gigabyte 3D1 dual-GPU card in single-GPU vs. dual-GPU x16/x2 mode on the DFI. As I said earlier, the 3D1 does not work properly in x8/x8 mode on any SLI board except the Gigabyte, but it does work fine in x16/x2 mode on both the SLI and the modded SLI DFI with the SLI jumpers in the normal (x16/x2) position instead of the SLI (x8/x8) position. I am considering adding the numbers to the charts.
  • Azsen - Tuesday, January 18, 2005 - link

    There is a possible typo you might like to fix?

    In the Half Life 2 tests, scroll down to the 3rd set of graphs (1280x1024 4xAA/8xAF).

    It has "2 x 6800U x8/x8 (nVidia SLI)" listed twice?

    Shouldn't the green bar be labelled: "2 x 6800U x16/x2 (Ultra SLI)" as in all the other graphs?

    Great article anyway, cheers. :)
  • Cygni - Tuesday, January 18, 2005 - link

    Ahhh, got it Wesley. I was confused by that.
  • DEMO24 - Tuesday, January 18, 2005 - link

    Nice article! Now to get the review of the DFI up so we can all stop wondering how it performs :(
