Breaking the SLI "Code"

With the flood of nForce4 motherboards getting ready to enter the market, we had a decent selection of very recent nForce4 Ultra and nForce4 SLI motherboards. We also had both the SLI and the Ultra versions of the DFI board, which are based on the same PCB. With this wide selection of boards, we could look at the differences between the Ultra and SLI chipsets and also confirm that they were not unique in any way.

If you look closely at the pictures of the SLI and Ultra, you will see that the chips themselves appear identical. However, a closer look at the resistors and pads surrounding each chip shows some differences. The resistors appear the same on both, but three sets of resistor pads are closed on the SLI chipset while just two sets are closed on the Ultra. The vertical set of resistor pads near the right edge of the chip itself is closed on SLI and open on Ultra. We could find no other obvious differences between the two chipsets. Could it be this simple?

We closed the set of resistor pads on the DFI LANParty UT nF4 Ultra-D with conductive paint, as you can see in the photo below.

We set the jumpers to SLI, attached the top bridge from an SLI board, since the Ultra boards do not ship with an SLI bridge, and fired up the system. The chipset was immediately recognized as SLI on boot and in Windows XP by our latest 71.40 Forceware drivers. Our very easy modification had "turned" the Ultra chipset into SLI. We no longer had driver limitations, and performance was now identical to what we achieved with a normal SLI chipset.

We also tried modifying an Ultra to SLI with an ordinary #2 pencil. It worked perfectly, and since there is so much room around the set of resistor pads, you don't have to be especially neat. If you close the pads, you have converted the Ultra to SLI. Those of you who remember Athlon XP modding for CPU speed will recall how close together the sets of pads were in that mod, which required masking and careful painting of the pads to be closed. With the Ultra-to-SLI mod, there is huge real estate around the resistor pads you are working on. As a result, even "all thumbs" modders should have an easy time with this one.

Comments

  • DrDisconnect - Thursday, January 20, 2005 - link

    Does nVidia management have any links with Bausch and Lomb?? They were selling the exact same contact lenses in two different product channels, i.e. daily wear and monthly wear. Those who bought daily wear threw out a perfectly good product after a few days; those who bought monthly wear spent a fortune on the same product the daily wear people threw out after a few days.

    Selling flawed chips (eg. missing pipelines) as a less powerful product I can understand. But this is just outright customer abuse by nVidia.
  • HardwareD00d - Thursday, January 20, 2005 - link

    Maybe Anand can do an article on how to make a custom SLI bridge ;) Maybe someone could create a flexible bridge that could be like a "universal adapter".
  • HardwareD00d - Thursday, January 20, 2005 - link

    unless you use the 3D1 card, #62
  • adnauseam - Thursday, January 20, 2005 - link

    #28, Please note I went to the DFI site again today and they have CHANGED the picture that was there the other day. It no longer shows an SLI bridge in the photo. See here:
    and here:
    Compare with the photo from the #28 post. I don't see why more people are not addressing this; it makes the mod worthless if you can't get a bridge.
  • cryptonomicon - Thursday, January 20, 2005 - link

    if the new DFI board is anything like the LP nf3 250gb, it will be the best overclocking board for 939, not to mention this incredible sli exploit
  • Wesley Fink - Thursday, January 20, 2005 - link

    #57, #58, #59 - The single-card/dual-GPU Gigabyte 3D1 ran in x16/x2 dual video mode on both DFI boards with the jumper setting at "Normal". As stated in the comments and the article, the Gigabyte would not run in x8/x8 (nVidia SLI) with the jumper set to SLI, because that mode requires special BIOS hooks supplied only by the Gigabyte board.

    This is not a change from what we described in the review - just more information about alternate modes.

    We do agree that the single-card/dual-GPU idea has promise for the future. That is why we tested the 3D1 on the boards and shared the results. Even in x16/x2, the 3D1's performance boost compared to a single 6600GT card was significant.
  • poor Leno - Thursday, January 20, 2005 - link

    @ Wesley:

    But if all the tweaking can make the 3D1 run at "other speeds"/configs (x16/x2 and not x8/x8), it shows that there is some flexibility, maybe... is there not a way in this lifetime to run two 3D1's, maybe at 4x8? 2x8 on one bus and 2x8 on the other? Do you think in the future it will be possible to mod to 2x16? Because that way the 4x8 would be possible, I think (correct me if I'm wrong) :P
  • Pete84 - Thursday, January 20, 2005 - link

    #57 My thoughts exactly. I thought that the Gigabyte drivers were specially tweaked so that SLI would be running out of a single x16 slot . . .
  • johnsonx - Wednesday, January 19, 2005 - link

    To Wesley Fink:

    What happened to the Gigabyte 3D1 board requiring special BIOS hooks to POST, which only the one Gigabyte mainboard had? I thought that was mentioned at least twice in your original article on the 3D1.
  • ksherman - Wednesday, January 19, 2005 - link

    Since obviously Anandtech doesn't mind soft-modding hardware... why not a SoftQuadro mod for the video cards? I would really like to see an article about soft-modding gaming cards into workstation graphics cards. Yes, I realise that they will not perform as well as the actual card, but I wonder how close they get... Also, since cards like the 6800GT can be used in SLI, it might be interesting to see an SLI workstation setup...
