Quad SLI Redux?

With the hardware requirements met, it's time to look at the software requirements. Currently, 3-way SLI is only supported under Windows Vista, not XP. Other than the OS stipulation, 3-way SLI isn't really any different from conventional 2-card SLI. Many of you will remember the ill-fated Quad SLI product NVIDIA brought to market just under two years ago.

Quad SLI had three major problems that kept it from being a worthwhile product:

1) It relied on the 7950 GX2, a single-card, dual-GPU solution. The problem is that each GPU on a 7950 GX2 was slower than a 7900 GTX, so a single 7950 GX2 was slower than a pair of 7900 GTXs. Quad SLI used two of these 7950 GX2s, so even at its best the performance improvement over a pair of 7900 GTXs wasn't all that great.

2) The best performing games with Quad SLI used AFR (Alternate Frame Rendering) to divide up the rendering workload, where each GPU was responsible for rendering its own frame: GPU 1 would render frame 1, while GPU 2 worked on the next frame, GPU 3 on the third, and GPU 4 on the fourth. Unfortunately, DirectX 9 only allowed a 3-frame render-ahead, meaning that 4-way AFR couldn't work. With the vast majority of games being DX titles, this posed a significant problem for Quad SLI performance.

3) The final issue with Quad SLI was that by the end of the year, G80 was out, and G80 was much faster. A pair of 8800 GTXs demolished a Quad SLI setup, and in some cases even a single card was faster.
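The AFR scheduling described above boils down to round-robin frame assignment, which is also why the DX9 render-ahead limit was fatal for a fourth GPU. A minimal sketch follows; the `afr_assignments` helper is a hypothetical illustration for clarity, not actual NVIDIA driver code:

```python
def afr_assignments(num_gpus, num_frames):
    """Return a list mapping each frame index to the GPU that renders it,
    using the simple round-robin scheme AFR is based on."""
    return [frame % num_gpus for frame in range(num_frames)]

# With 4 GPUs (Quad SLI), frames rotate across GPUs 0-3. But if the API only
# lets the CPU queue 3 frames ahead of the one on screen, the 4th GPU can
# never be handed work in time, so 4-way AFR stalls under DX9.
print(afr_assignments(4, 8))  # [0, 1, 2, 3, 0, 1, 2, 3]
```

With three GPUs the rotation fits inside a 3-frame render-ahead window, which is part of why 3-way AFR scales where 4-way could not.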

Thankfully, 3-way SLI doesn't have these problems. The three-card SLI setup relies on regular 8800 GTX/Ultra cards, which are still among the fastest GPUs that NVIDIA offers today. The 3-frame render ahead limitations of DX9 aren't present in DX10, so we can get good scaling with AFR in DX10 titles.

The problem of planned obsolescence is a concern, though, and it's almost inevitable that 3-way SLI based on G80 will be replaced very soon. There's no doubt that NVIDIA will eventually replace the 8800 GTX and Ultra with G92-based variants, which will reduce power consumption and improve performance. The fact that G80 came out over a year ago should preclude any thoughts of purchasing a brand new 3-way SLI setup, but for users who already have two 8800 GTX or Ultra cards, adding a third is a mostly reasonable proposition.

The Test

Special thanks to both EVGA and ASUS for providing us with hardware for this review. Both EVGA and ASUS sent us 8800 Ultras and 780i-based motherboards for this comparison, although it is worth mentioning that you could use a 680i motherboard and any brand (or mixture of brands) of 8800 GTX/Ultra cards - provided, of course, that your 680i motherboard has the necessary x16 PCIe slots.

Test Setup
CPU: Intel Core 2 Extreme QX9650 @ 3.33GHz
Motherboard: EVGA nForce 780i SLI
Video Cards: NVIDIA GeForce 8800 Ultra x 3
Video Drivers: NVIDIA 169.18
Hard Drive: Seagate 7200.9 300GB 8MB 7200RPM
RAM: 4x1GB Corsair XMS2 DDR2-800 4-4-4-12
Operating System: Windows Vista Ultimate 32-bit
 


48 Comments

  • IKeelU - Tuesday, December 18, 2007 - link

    When will this nonsense stop? It is perfectly reasonable for a game company to "permit" users to increase the detail if they so choose. On "high" the game looks and runs great on a sub-$400 video card. In fact, on "high" it looks better than anything out there, on any platform. At least with a "very high" setting available, the game will continue to look good a year from now when other games have caught up.
  • andrew007 - Tuesday, December 18, 2007 - link

    Uuuh... no, Crysis is not playable at "high" at any decent resolution on my 8800GT and 3.4GHz overclocked quad core Q6600. Decent being 1280 x whatever. And when you drop to medium, the game looks nothing special. Sure, there are a few areas that look great (forest level for example) but overall I was certainly not blown away. Unlike replaying Bioshock in 1920x1200 which this setup is capable of running very smoothly and which looks amazing in DX10. Quite simply, Crysis is one of the worst optimized games ever. At least it doesn't crash, that's something I guess. Looking forward to replaying it in 2 years. Come to think of it, it was the same with Far Cry, it took 2 years to be able to play that game with decent frame rates.
  • JarredWalton - Tuesday, December 18, 2007 - link

    There's nothing that says Crytek can't make a game where maximum detail settings exceed the capacity of every PC currently available. We've seen this in the past (Oblivion for one), and then a year later suddenly the game is more than playable at max settings on less expensive hardware. It doesn't appear that Tri-SLI is fully implemented for many titles, and considering the age of Crysis I'd expect more performance over time. Just like many games don't fully support SLI (or support it at all in some cases) at launch, only to end up greatly benefiting once drivers are optimized.

    FWIW, I'm playing Crysis on a single 8800 GTX at High detail settings and 1920x1200 with a Core 2 Duo 3.0GHz 2MB (OC'ed E4400). It might be too sluggish for online play where ping and frame rates matter more, but for single player I'm having no issues with that res/settings. It's a matter of what you feel is necessary. I'm willing to shut off AA to get the performance I want.
  • tshen83 - Tuesday, December 18, 2007 - link

    First of all, people who will fork over $1500 worth of GPUs will want to play all games at the highest settings. That means the highest AA and AF settings. I don't think you used AA and AF in your testing. It is almost pointless to play without AA on such a nice setup at 120fps (Bioshock), where you are becoming CPU bound rather than GPU bound.

    Secondly, your Crysis test used 1920x1200. Why not 2560x1600? Why not 2560x1600 at 4xAA and 16xAF? Crysis at 1920x1200 without AA and AF is severely CPU bound in your case, as you have witnessed that a faster CPU gave you linear scaling.

    Third, there is actually no point in testing triple SLI at any resolution other than 2560x1600. The target audience for triple SLI is those with 30 inch Cinema Displays.

    I think to be fair, you should rerun the benchmarks in a non-CPU bound situation with AA+AF on, you will see the proper scaling then.

    Thanks,

  • eternalkp - Tuesday, December 25, 2007 - link

    very good point Tshen

    I have a 30inch monitor.
    the 7900gtx was killing my frame rate.
    i was getting average 25fps @ 2560x1600, medium, 2X AA, 16X aniso...in FEAR Perseus Mandate.

    Just bought MSI OC 8800GTS G92 and very happy with it.

    Now i can crank up maximum graphic setting, 4X AA, 16X aniso @ average 40fps...very nice. :D

    Crysis is a hot engine, i only get 30fps @ medium, AA off.

    YES. what is the point of 3 GPU and have your AA/Aniso off?
    game will look like crap.
    Crysis recommends 4gb of ram.
  • kmmatney - Tuesday, December 18, 2007 - link

    "Third, there is actually no point of testing triple SLI at any other resolution other than 2560x1600"

    The point was testing at settings that are "playable". Who cares if the framerate goes from 8 to 12 @ 2560x1600? It's unplayable.

    I don't see how even an "enthusiast" wouldn't see triple SLI as a waste of money, though.
  • cmdrdredd - Tuesday, December 18, 2007 - link

    The point is that running 1600x1200 is really not anything you shouldn't be able to do with one card. Even 1920x1080 in many games is perfect. Showing off 10000000fps means jack, turn the res and AA/AF up and show us what it can push out.
  • defter - Tuesday, December 18, 2007 - link

    The author missed one advantage of 3-way SLI:
    Of course it doesn't make any sense to spend >$1500 on three 8800 GTX/Ultras today, but what about those folks who already have a pair of 8800 GTX/Ultras in SLI?

    For them adding a third card could be a reasonable upgrade option in comparison to replacing both cards with new G92 based cards.

    3-way SLI isn't for everyone, but it has its advantages.
  • praeses - Tuesday, December 18, 2007 - link

    I was under the impression that bioshock did not support AA in DX10. If that is indeed the case, that's hardly the fault of the benchmarker/reviewer.

    Also, I see much merit in benchmarking at 1920x1200; it's a much more common and desktop-friendly resolution given the physical footprint of monitors. Let's be honest, many gamers aren't sitting 4ft from their displays. At 2-3ft, a 24" display, which most likely has 1920x1200, is much more comfortable for longer action-based viewing. Ideally, though, they would have a lower dot pitch or simply a higher resolution on the smaller screen.
  • tshen83 - Tuesday, December 18, 2007 - link

    One more thing: you are using Vista Ultimate 32-bit with 4GB of memory. Since you have three 768MB Ultras (roughly 2.3GB of address space reserved just for the video cards), the 32-bit OS will only see about 1.5GB of memory. That is not sufficient system memory for high resolution benchmarks, especially Crysis.
