If you have been following the news, some very strange things are going on with the nVidia nForce4 chipsets. About six weeks ago, MSI showed an nForce4 Ultra motherboard with a regular x16 PCIe slot plus an open-ended x4 PCIe slot. Those who saw the demos said that MSI was running two matched video cards in what they called a "semi-SLI mode", which delivered about 90% of the performance of normal nVidia SLI. This was an interesting development because nF4 Ultra chipsets are cheaper than nF4 SLI chipsets, so boards based on the Ultra chipset are much cheaper than the high-end SLI parts that we are seeing in the market. An arrangement like this would be a godsend for budget-conscious enthusiasts who would still like to enjoy most of the benefits of dual video-card SLI performance.

Just as quickly, we learned that nVidia was not happy with this "SLI hack", and they promptly changed their drivers so that semi-SLI "would not work with current and later Forceware drivers." It appears that the later Forceware drivers check the chipset ID, and if the driver sees "Ultra", SLI is not enabled. MSI decided to kill the semi-SLI board because supporting a board that would only run with older nVidia SLI drivers would have been a nightmare.

Then, at CES, DFI was displaying both nForce4 SLI and nForce4 Ultra motherboards with two x16 PCIe slots. We were told that Epox also had an nForce4 Ultra motherboard with another semi-SLI solution based on the cheaper Ultra chipset. DFI told us that they used the same PCB for both versions of the nForce4 boards for economy, and that the nForce4 Ultra board could in fact run a dual-video x16/x2 mode with earlier nVidia Forceware drivers, in addition to the standard single x16 video mode. Given AnandTech's close working relationship with DFI, we had arranged an exclusive look at both DFI boards. When the boards arrived, we were indeed able to run an x16/x2 dual video mode on the nForce4 Ultra with driver version 66.75 - a very early nVidia SLI driver. We tried many, many Forceware versions and found that 70.41 also worked after adding one line to the registry. However, as with MSI's board, Ultra dual-video only worked on very old SLI drivers or on drivers with a Registry mod.

It was clear at this point that this Ultra dual-video solution did work, but that nVidia had turned it off in recent drivers. This caused us to wonder what was really going on with nForce4 chipsets. If nVidia could enable or disable this Ultra SLI in drivers, then the base chips must be very, very similar. In fact, it would be logical if the nF4 Ultra and nF4 SLI were exactly the same chip, with some modification making the chip an Ultra in one case and an SLI in the other. The pin-out configurations of the two chipsets are, after all, identical.

It was with this idea that we took a closer look into the possibilities, and what we found will surprise you! It turns out that the nForce4 Ultra is apparently just an nForce4 SLI with SLI turned off. What is even more important is that we also found a way to turn on the disabled SLI!

Breaking the SLI "Code"

  • DrDisconnect - Thursday, January 20, 2005 - link

    Does nVidia management have any links with Bausch and Lomb?? They were selling the exact same contact lenses in two different product channels, i.e. daily wear and monthly wear. Those who bought daily wear threw out a perfectly good product after a few days; those who bought monthly wear spent a fortune on the same product the daily-wear people threw out after a few days.

    Selling flawed chips (eg. missing pipelines) as a less powerful product I can understand. But this is just outright customer abuse by nVidia.
  • HardwareD00d - Thursday, January 20, 2005 - link

    Maybe Anand can do an article on how to make a custom SLI bridge ;) Maybe someone could create a flexible bridge that could be like a "universal adapter".
  • HardwareD00d - Thursday, January 20, 2005 - link

    unless you use the 3D1 card, #62
  • adnauseam - Thursday, January 20, 2005 - link

    #28, Please note I went to the DFI site again today and they have CHANGED the picture that was there the other day. It no longer shows an SLI bridge in the photo. see here:
    and here:
    Compare with the photo from the #28 post. I don't see why more people are not addressing this; it makes the mod worthless if you can't get a bridge.
  • cryptonomicon - Thursday, January 20, 2005 - link

    if the new DFI board is anything like the LP nf3 250gb, it will be the best overclocking board for 939, not to mention this incredible sli exploit
  • Wesley Fink - Thursday, January 20, 2005 - link

    #57,#58,#59 - The single card/dual GPU Gigabyte 3D1 ran in x16/x2 dual video mode on both DFI boards with the jumper setting at "Normal". As stated in the comments and the article, the Gigabyte card would not run in x8/x8 (nVidia SLI) mode with the jumper set to SLI, because that mode requires special BIOS hooks supplied only by the Gigabyte board.

    This is not a change from what we described in the review - just more information about alternate modes.

    We do agree the single card/dual GPU idea has promise for the future. That is why we tested the 3D1 on the boards and shared results. Even in x16/x2 the 3D1 performance boost compared to a single 6600GT card was significant.
  • poor Leno - Thursday, January 20, 2005 - link

    @ Wesley:

    But if all the tweaking can make the 3D1 run at "other speeds"/configs (x16/x2 and not x8/x8), it shows that there is maybe some flexibility... is there not a way in this lifetime to run two 3D1's, maybe at 4x8????? 2x8 on one bus and 2x8 on the other? Do you think in the future it will be possible to mod to 2x16?? Because that way the 4x8 would be possible, I think (correct me if I'm wrong) :P
  • Pete84 - Thursday, January 20, 2005 - link

    #57 My thoughts exactly. I thought that the Gigabyte drivers were specially tweaked so that SLI would be running out of a single x16 slot . . .
  • johnsonx - Wednesday, January 19, 2005 - link

    To Wesley Fink:

    What happened to the Gigabyte 3D1 board requiring special BIOS hooks to POST, which only the one Gigabyte mainboard had? I thought that was mentioned at least twice in your original article on the 3D1.
  • ksherman - Wednesday, January 19, 2005 - link

    since obviously Anandtech doesn't mind soft-modding hardware... why not a SoftQuadro mod for the video cards? I would really like to see an article about soft-modding gaming cards into workstation graphics cards. Yes, I realise that they will not perform as well as the actual card; I wonder how close they'd come... Also, since cards like the 6800GT can be used in SLI, it might be interesting to see an SLI workstation setup...
