
If you have been following the news, some very strange things are going on with nVidia's nForce4 chipsets. About six weeks ago, MSI showed an nForce4 Ultra motherboard with a regular x16 PCIe slot plus an open-ended x4 PCIe slot. Those who saw the demos said that MSI was running two matched video cards in what it called a "semi-SLI" mode, which delivered about 90% of the performance of normal nVidia SLI. This was an interesting development because nF4 Ultra chipsets are cheaper than nF4 SLI chipsets, so boards based on the Ultra chipset sell for much less than the high-end SLI parts that we are seeing in the market. An arrangement like this would be a godsend for budget-conscious enthusiasts who still want most of the benefits of SLI dual video-card performance.

Just as quickly, we learned that nVidia was not happy with this "SLI hack" and had changed its drivers so that "semi-SLI would not work with current and later Forceware drivers." It appears that the newer Forceware drivers check the chipset ID, and if the driver sees "Ultra", SLI is not enabled. MSI decided to kill the semi-SLI board because supporting a board whose dual-video mode only ran with older nVidia SLI drivers would have been a nightmare.
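
To picture how such a gate works, here is a minimal sketch in C of a driver-side chipset-ID check. This is purely our illustration: the device IDs and the function below are hypothetical placeholders, not anything taken from nVidia's Forceware code, which nVidia has not published.

```c
/* sli_gate.c - hypothetical illustration of a driver-side chipset check.
 * The device IDs and the decision logic are assumptions for illustration;
 * they are NOT taken from nVidia's actual Forceware drivers. */
#include <stdio.h>
#include <stdint.h>

#define PCI_VENDOR_NVIDIA   0x10DE  /* nVidia's real PCI vendor ID */
#define NF4_DEVICE_ID_SLI   0x005E  /* hypothetical placeholder */
#define NF4_DEVICE_ID_ULTRA 0x005F  /* hypothetical placeholder */

/* Returns 1 if the driver would allow SLI on this chipset, 0 otherwise. */
static int sli_allowed(uint16_t vendor, uint16_t device)
{
    if (vendor != PCI_VENDOR_NVIDIA)
        return 0;
    /* The kind of gate described above: anything identifying itself
     * as "Ultra" is refused, even if the silicon is identical. */
    return device == NF4_DEVICE_ID_SLI;
}

int main(void)
{
    printf("SLI chipset:   %s\n",
           sli_allowed(PCI_VENDOR_NVIDIA, NF4_DEVICE_ID_SLI) ? "SLI enabled" : "SLI disabled");
    printf("Ultra chipset: %s\n",
           sli_allowed(PCI_VENDOR_NVIDIA, NF4_DEVICE_ID_ULTRA) ? "SLI enabled" : "SLI disabled");
    return 0;
}
```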

Then, at CES, DFI was displaying both nForce4 SLI and nForce4 Ultra motherboards with two x16 PCIe slots. We were told that Epox also had an nForce4 Ultra motherboard with another semi-SLI solution based on the cheaper Ultra chipset. DFI told us that they used the same PCB for both versions of the nForce4 boards for economy, and that the nForce4 Ultra board could, in fact, run a dual-video x16/x2 mode with earlier nVidia Forceware drivers in addition to the standard single x16 video mode. Given AnandTech's close working relationship with DFI, we arranged an exclusive look at both DFI boards. When the boards arrived, we were indeed able to run an x16/x2 dual-video mode on the nForce4 Ultra with driver version 66.75 - a very early nVidia SLI driver. We tried many Forceware versions and found that 70.41 also worked after adding one line to the registry. However, as with the MSI board, Ultra dual-video only worked with very old SLI drivers or with a Registry mod.
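
The article does not name the registry value involved, so the key path and value name in the following sketch are hypothetical placeholders. It only illustrates what "adding one line to the registry" looks like when done programmatically through the standard Win32 registry API rather than regedit.

```c
/* regmod.c - sketch of adding a single registry value on Windows.
 * HYPOTHETICAL: the key path and value name below are placeholders;
 * the actual value used with Forceware 70.41 is not given here. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    HKEY hKey;
    DWORD disposition;
    DWORD enable = 1;
    /* Placeholder path - NOT the real Forceware key. */
    const char *subkey = "SOFTWARE\\ExampleVendor\\ExampleDriver";

    LONG rc = RegCreateKeyExA(HKEY_LOCAL_MACHINE, subkey, 0, NULL,
                              REG_OPTION_NON_VOLATILE, KEY_SET_VALUE, NULL,
                              &hKey, &disposition);
    if (rc != ERROR_SUCCESS) {
        fprintf(stderr, "RegCreateKeyExA failed: %ld\n", rc);
        return 1;
    }

    /* The "one line": a single DWORD value flipping a feature flag. */
    rc = RegSetValueExA(hKey, "EnableDualVideo", 0, REG_DWORD,
                        (const BYTE *)&enable, sizeof(enable));
    RegCloseKey(hKey);

    if (rc != ERROR_SUCCESS) {
        fprintf(stderr, "RegSetValueExA failed: %ld\n", rc);
        return 1;
    }
    puts("Registry value written.");
    return 0;
}
```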

It was clear at this point that this Ultra dual-video solution did work, but that nVidia had turned it off in recent drivers. This made us wonder what was really going on with nForce4 chipsets. If nVidia could enable or disable this Ultra SLI in drivers, then the base chips must be very similar. In fact, it would be logical if the nF4 Ultra and nF4 SLI were exactly the same chip, with some small modification making it an Ultra in one case and an SLI in the other. The pin-out configurations are, after all, identical on both chipsets.
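
Readers who want to compare the two bridges themselves can simply read back the PCI IDs that each chipset reports. The short C sketch below is our own illustration, assuming a Linux system with a 2.6-style sysfs tree; it lists every device whose vendor ID is nVidia's 0x10DE so the Ultra and SLI device IDs can be compared side by side.

```c
/* pciids.c - list PCI vendor/device IDs so the nF4 Ultra and nF4 SLI
 * bridges can be compared side by side. Reads the Linux sysfs tree;
 * this is our own illustration, not a tool from the article. */
#include <dirent.h>
#include <stdio.h>

static int read_hex(const char *path, unsigned int *out)
{
    FILE *f = fopen(path, "r");
    if (!f)
        return -1;
    int ok = fscanf(f, "%x", out) == 1;
    fclose(f);
    return ok ? 0 : -1;
}

int main(void)
{
    const char *base = "/sys/bus/pci/devices";
    DIR *dir = opendir(base);
    if (!dir) {
        perror(base);
        return 1;
    }

    struct dirent *de;
    while ((de = readdir(dir)) != NULL) {
        if (de->d_name[0] == '.')
            continue;

        char path[512];
        unsigned int vendor, device;

        snprintf(path, sizeof(path), "%s/%s/vendor", base, de->d_name);
        if (read_hex(path, &vendor) != 0)
            continue;
        snprintf(path, sizeof(path), "%s/%s/device", base, de->d_name);
        if (read_hex(path, &device) != 0)
            continue;

        /* 0x10de is nVidia's PCI vendor ID; print only its devices. */
        if (vendor == 0x10de)
            printf("%s  vendor=0x%04x  device=0x%04x\n",
                   de->d_name, vendor, device);
    }
    closedir(dir);
    return 0;
}
```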

It was with this idea that we took a closer look at the possibilities, and what we found will surprise you: the nForce4 Ultra is apparently just an nForce4 SLI with SLI turned off. Even more important, we also found a way to turn the disabled SLI back on!

Comments (85)

  • nitrus - Wednesday, January 19, 2005 - link

    What's wrong with nVidia making a little money? I'd like to see a healthy ATI and nVidia so they keep making cards cheaper (older models) and better every six months. If we don't support these companies, then imagine a motherboard with no expansion slots as companies start to integrate everything. ATI and nVidia are starting to branch out into other sectors, and I'd gladly support them for "cinematic" graphics. I have an AN8-SLI I bought for $199.99 from ZZF with a 6800GT and a 6600GT so I can run 4 LCDs. Worth every penny...
  • Deucer - Wednesday, January 19, 2005 - link

    For those of you who are asking for an application for this mod, I have one: I want to play HL2 now. I don't have enough money to afford an SLI rig, but I can scrape together enough for a 6600GT and this board, which would be significantly better than what I'm running right now. I want SLI as an upgrade path so that when the 6600GT costs as much as the 9600 does now, I can buy another one and get a very affordable boost in performance. And I'm relatively sure that I will be able to find an SLI bridge within the next year, so that takes care of that issue too.

    Of course, this doesn't make any sense for someone who is running dual 6800 Ultras. This is a cost-lowering solution; think about what people on a budget are actually buying and how this could help them.
  • MarkM - Tuesday, January 18, 2005 - link

    Thank you, Wesley, for the intriguing article & detective work. It's really neat to have a resource who's not afraid to be your guinea pig and risk frying a very precious commodity right now: an nF4 board :)

    #30 - I don't see why A and B are exclusive -- why can't they keep producing and shipping Ultra chips in the current architecture while AT THE SAME TIME preparing a change to prevent this mod from working, and then just switch to that when it's ready?

    #50 LOL, never spent more than $200 on a single piece of computer equipment?!?! I can tell that you weren't buying computers in the 80s!!
  • Wesley Fink - Tuesday, January 18, 2005 - link

    The graphs have been updated with results from a single Gigabyte 6600 GT and the "dual 6600GT on a single card" Gigabyte 3D1 running in x16/x2 dual video mode. The Gigabyte 3D1 provides the interesting possibility of a form of SLI performance on single x16-slot Ultra boards with the SLI mod.
  • DigitalDivine - Tuesday, January 18, 2005 - link

    I have never spent more than $200 on a single piece of computer equipment, and I don't plan to in the future.

    Hopefully, VIA can get their act together quickly and release an SLI solution for guys like me on the cheap. Honestly, I bought an ASRock K7V88 at Newegg for $50 and it overclocked my 2500+ Barton to a 3200+ like a champ, and it was a VIA KT880 chipset, a very good performance competitor to the nForce2.

    I mean, I just bought an Epox nForce3 250Gb at Newegg for $70 for my Sempron 3100+ clocked at 2.4GHz, and if an SLI solution comes in at around $100, I will surely hop on the PCI Express boat and maybe buy an Athlon 64.
  • Wesley Fink - Tuesday, January 18, 2005 - link

    #49 - Corrected. Thanks for bringing this to our attention.

    ALL - I have some very interesting numbers for the Gigabyte 3D1 dual-GPU card in single-GPU vs. dual-GPU x16/x2 mode on the DFI. As I said earlier, the 3D1 does not work properly in x8/x8 mode on any SLI board except the Gigabyte, but it does work fine in x16/x2 mode on both the SLI and the SLI-modded Ultra DFI with the SLI jumpers in the normal (x16/x2) position instead of the SLI (x8/x8) position. I am considering adding the numbers to the charts.
  • Azsen - Tuesday, January 18, 2005 - link

    There is a possible typo you might like to fix?

    In the Half Life 2 tests, scroll down to the 3rd set of graphs (1280x1024 4xAA/8xAF).

    It has "2 x 6800U x8/x8 (nVidia SLI)" listed twice?

    Shouldn't the green bar be labelled: "2 x 6800U x16/x2 (Ultra SLI)" as in all the other graphs?

    Great article anyway, cheers. :)
  • Cygni - Tuesday, January 18, 2005 - link

    Ahhh, got it Wesley. I was confused by that.
  • DEMO24 - Tuesday, January 18, 2005 - link

    Nice article! Now to get the review of the DFI up so we can all stop wondering how it performs :(
