Quad SLI with 9800 GX2: Pushing a System to its Limit
by Derek Wilson on March 25, 2008 9:00 AM EST
Sometimes it’s the little quirks in life that sneak up on you and change the way you look at the world. The past couple of weeks of testing all this new high-end gear have done just that. Sure, we’ve had our problems testing bleeding edge stuff before, but in putting everything from CrossFireX through 9800 GX2 Quad SLI to the test, we’ve gotten ourselves lost in some other dimension of existence. It’s the only explanation, really. As Holmes would have said … whatever remains, however improbable … But I’m getting ahead of myself.
Skulltrail (the Intel D5400XS board we’re using that runs both CrossFire and SLI) rivals Frankenstein’s monster. With its two LGA775 sockets, FB-DIMMs, and NVIDIA nForce 100 PCIe chips, it’s not for the faint of heart. We’ve been determined to test on a single platform in order to compare CrossFire and SLI, and working out the kinks has given us a little trouble. AMD, NVIDIA, and Intel have all worked with us to try to make things go more smoothly (thanks, everyone), but there are still some things going on that we just can’t explain.
After loads of nearly useless testing and many conversations with different people, we set out on a trail of mystical discovery that has unlocked secrets of the universe heretofore untold. We’ll get to that in a bit, but first we’ve got to stop and take a look at what we are covering today.
Quad SLI. NVIDIA would like us to tell you it’s the new hotness. Certainly, without even opening a page of this review you should all know that this is the top of the top of the line and nothing is faster right now. But we do need to answer a few key questions about this $1200 setup: how does it scale from two GPUs to four, how does that scaling compare to CrossFireX, and what kind of performance and value does this solution actually offer?
Honestly, we also have to acknowledge from our previous review of the 9800 GX2 that a single card is enough to run almost any game at maximum settings … that is, with the glaring exception of Crysis. Can Quad SLI change that? From what we saw in our CrossFireX testing from AMD, we would have thought not. However, NVIDIA has managed to get Crysis to scale across all four GPUs despite the interframe dependencies that make it so difficult. Is it enough to run Crysis at a decent resolution with all the eye candy turned on?
Let’s find out …
DDH III - Thursday, October 23, 2008
So you're saying that when my first 9800 GX2 gets here in the mail next week, it will actually help my landlord with his heating bill? That's good, because he pays for the electric too, and I feel kinda bad already. Heating bills in Interior Alaska are nasty, and the electric is just as .. but anyways.
Great review, though I won't be able to run these in quad on my current mobo. But then, I never liked FPSes much since Quake, which brings me to my point:
These days I play insane amounts of Supreme Commander. It is CPU intensive. If you are looking for another benchmark when you start cranking up the GHz ... well, just give it a look.
To say a bit more: 8-player games are CPU pain. I only have dual cores, and my first dual cores were AMD (my games never needed the most rocking FPS); I had to go C2D just for game speed .. forget the eye candy.
In short, I read this article and went .. huh .. me too.
Anyways, I am one of those Dell 3007 owners, and I learned a lot. TY. And I don't live in North Pole, AK; I only work there.
luther01 - Friday, May 9, 2008
Hi guys, I have a quad 9800 GX2 system as follows:
Q6600 processor @ 2.8GHz
4GB PC2-6400 RAM
I must say that so far I haven't noticed any performance gains over one 9800 GX2, and in some games, like The Witcher, I have actually had a performance drop of 10-15 fps. In Crysis, the frame rate at Very High settings is disappointingly low, sometimes dropping to 15 fps in large environments. I think maybe there is another bottleneck in my system.
Das Capitolin - Sunday, April 27, 2008
It appears that NVIDIA is aware of the problem, yet there's no word on what might be done or when it might be fixed.
Mr Roboto - Sunday, March 30, 2008
Over at TweakTown they're coming up with the same results as Anand did, even with the 9800 GTX.
Is it possible a lot of the problems are because the G92/G94 has a smaller 256-bit memory interface? Could it cause that much of a difference, especially combined with less VRAM? Just a thought.
ianken - Thursday, March 27, 2008
... logging CPU loading shows it evenly loaded across four cores at about 50%. But then, I'm running a lowly 8800 GTS (640MB) SLI setup.
The numbers in this review just seem kinda off.
DerekWilson - Friday, March 28, 2008
What settings were you testing on?
We see that high quality and very high quality settings have an increasing impact on the CPU ... the higher the visual quality, the higher the CPU impact. It's kind of counterintuitive, but it's a fact.
You actually might not be able to hit the CPU limit with very high settings, because the GPU limit might be below the CPU limit ... which seems to be somewhere between 40 and 50 fps (depending on the situation) with a 3.2GHz Intel processor.
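The reasoning Derek sketches here is a min-of-bottlenecks model: delivered frame rate is capped by whichever stage, CPU or GPU, is slower at a given quality setting. A minimal sketch, with hypothetical fps numbers chosen only to illustrate the GPU limit sitting below the CPU limit:

```python
# Toy bottleneck model (not from the article): the frame rate you see
# is bounded by the slower of the CPU and GPU pipeline stages.
def effective_fps(cpu_limit_fps: float, gpu_limit_fps: float) -> float:
    """Delivered fps is the minimum of the two stage limits."""
    return min(cpu_limit_fps, gpu_limit_fps)

# Hypothetical values: at Very High settings the GPU limit (~35 fps)
# sits below a ~45 fps CPU limit, so the CPU ceiling is never reached.
print(effective_fps(45, 35))  # GPU-bound: prints 35
print(effective_fps(45, 90))  # CPU-bound: prints 45
```

Under this model, raising CPU clocks only helps when the CPU limit is the lower of the two, which is why a GPU-bound Very High run never exposes the CPU ceiling.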
seamusmc - Wednesday, March 26, 2008
I like HardOCP's review because, while the Quad SLI solution does put up higher numbers than an 8800 Ultra SLI solution, they pointed out some serious problems in all games besides Crysis.
It appears that memory is a bottleneck, and many games have severe momentary drops in FPS at high resolutions and/or with AA, making the gaming experience worse than with an 8800 Ultra SLI solution. I strongly recommend folks take a look at HardOCP's review.
AnandTech's review only covers average FPS, which does not address or reveal the kinds of issues the Quad SLI solution is having.
B3an - Wednesday, March 26, 2008
Thanks for the mention of the HardOCP review. A lot better than Anand's.
Very disappointed with Anand on this article. I posted a comment asking why I was getting such bad frame rates at high res with AA, and Anand did not even address this. They must have run into it at 2560x1600, and just about all the games they tested at 2560x1600 will get killed with AA because of the memory bottleneck. I'm talking from trying this myself. If I had known about it I wouldn't have gotten a GX2, as it's pretty pointless with a 30" monitor.
So are they getting paid by NV or what?
AssBall - Wednesday, March 26, 2008
"Very disappointed with Anand on this article. I posted a comment asking why i was getting such bad frame rates at high res with AA, and Anand did not even address this."
They are trying to answer their own questions and solve their own problems right now. This comment section is for comments on the article, not your personal technical support line to Anand.
B3an - Thursday, March 27, 2008
When I said, "I posted a comment asking why i was getting such bad frame rates at high res with AA, and Anand did not even address this,"
I meant: why has this website not addressed this issue in any of their GX2 articles, as any decent website would, especially since it's not hard to run into? A high-end card like this is for people with 24" and 30" monitors like myself; at anything lower than 1920x1200 a GTX would be more than enough. Yet the card has massive memory bandwidth problems and/or not enough usable VRAM, so frame rates get completely crippled when a certain amount of AA is used in games.
Making the card pretty pointless.