Half-Life 2: x16 vs. x16/x2 vs. x8/x8 (nVidia SLI)


[Benchmark graphs: Half-Life 2 - Guru3D Demo 5, at increasing resolution and AA/AF settings]

As in past SLI reviews, SLI performance gains grow as resolution and "eye-candy" settings increase. The improvement ranged from a modest 6% at 1280x1024 with no AA or AF to a significant 61% at 1600x1200 with 4xAA and 8xAF. The relative gains are even larger when moving from a single 6600 GT, 6800, or 6800 GT to dual cards in SLI mode, so consider these results the smallest improvement you are likely to see, since we are testing with top-of-the-line video cards.
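For readers who want to reproduce the scaling math from the graphs, here is a minimal sketch of how the quoted percentages are derived from raw frame rates. The FPS values in it are hypothetical, chosen only to illustrate the arithmetic; the article's graphs hold the measured numbers.

```python
# Hypothetical FPS values for illustration only -- the real numbers
# are in the benchmark graphs above.
def sli_scaling(single_fps: float, sli_fps: float) -> float:
    """Return the percentage improvement of SLI over a single card."""
    return (sli_fps / single_fps - 1.0) * 100.0

# Example: if a single card rendered 60.0 fps at 1600x1200 4xAA/8xAF
# and the SLI pair rendered 96.6 fps, the improvement would be 61%.
print(f"{sli_scaling(60.0, 96.6):.1f}%")  # -> 61.0%
```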

Comments

  • nitrus - Wednesday, January 19, 2005 - link

    What's wrong with NVIDIA making a little money? I'd like to see a healthy ATI and NVIDIA so they keep making cards cheaper (older models) and better every six months. If we don't support these companies, imagine a motherboard with no expansion slots as companies start to integrate everything. ATI/NVIDIA are starting to branch out into other sectors, and I'd gladly support them for "cinematic" graphics. I have an AN8-SLI I bought for $199.99 from ZZF with a 6800 GT and a 6600 GT so I can run 4 LCDs. Worth every penny...
  • Deucer - Wednesday, January 19, 2005 - link

    For those of you asking for an application for this mod, I have one. I want to play HL2 now. I don't have enough money for an SLI rig, but I can scrape together enough for a 6600 GT and this board, which would be significantly better than what I'm running right now. I also want SLI as an upgrade path: when the 6600 GT costs as much as the 9600 does now, I can buy another one and get a very affordable boost in performance. And I'm relatively sure I'll be able to find an SLI bridge within the next year, so that takes care of that issue too.

    Of course this doesn't make any sense for someone running dual 6800 Ultras. This is a cost-lowering solution; think about what people on a budget are actually buying and how this could help them.
  • MarkM - Tuesday, January 18, 2005 - link

    Thank you Wesley for the intriguing article and detective work. It's really neat to have a resource who's not afraid to be your guinea pig and risk frying a very precious commodity right now, an nF4 board :)

    #30 - I don't see why A and B are exclusive -- why can't they keep producing and shipping Ultra chips in the current architecture while AT THE SAME TIME preparing a change to prevent this mod from working, and then just switch to that when it's ready?

    #50 LOL, never spent more than $200 on a single piece of computer equipment?!? I can tell that you weren't buying computers in the 80s!!
  • Wesley Fink - Tuesday, January 18, 2005 - link

    The graphs have been updated with results from a single Gigabyte 6600 GT and the "dual 6600GT on a single card" Gigabyte 3D1 running in x16/x2 dual video mode. The Gigabyte 3D1 provides the interesting possibility of a form of SLI performance on single x16-slot Ultra boards with the SLI mod.
  • DigitalDivine - Tuesday, January 18, 2005 - link

    I have never spent more than $200 on a single piece of computer equipment, and I don't plan to in the future.

    Hopefully VIA can get its act together quickly and release an SLI solution for guys like me on the cheap. Honestly, I bought an ASRock K7V88 at Newegg for $50 and it overclocked my 2500+ Barton to a 3200+ like a champ, and it was a VIA KT880 chipset, a very good performance competitor to the nForce2.

    I just bought an EPoX nForce3 250Gb at Newegg for $70 for my Sempron 3100+ clocked at 2.4GHz, and if an SLI solution comes in at around $100, I will surely hop on the PCI Express boat and maybe buy an Athlon 64.
  • Wesley Fink - Tuesday, January 18, 2005 - link

    #49 - Corrected. Thanks for bringing this to our attention.

    ALL - I have some very interesting numbers with the Gigabyte 3D1 dual-GPU card in single-GPU vs. dual-GPU x16/x2 mode on the DFI. As I said earlier, the 3D1 does not work properly in x8/x8 mode on any SLI board except the Gigabyte, but it does work fine in x16/x2 mode on both the SLI and modded SLI DFI with the SLI jumpers in the normal (x16/x2) position instead of the SLI (x8/x8) position. I am considering adding these numbers to the charts.
  • Azsen - Tuesday, January 18, 2005 - link

    There's a possible typo you might want to fix.

    In the Half-Life 2 tests, scroll down to the third set of graphs (1280x1024 4xAA/8xAF).

    It has "2 x 6800U x8/x8 (nVidia SLI)" listed twice.

    Shouldn't the green bar be labelled "2 x 6800U x16/x2 (Ultra SLI)" as in all the other graphs?

    Great article anyway, cheers. :)
  • Cygni - Tuesday, January 18, 2005 - link

    Ahhh, got it Wesley. I was confused by that.
  • DEMO24 - Tuesday, January 18, 2005 - link

    Nice article! Now to get the DFI review up so we can all stop wondering how it performs :(
