(Not so) Final Words

Unfortunately, we can’t really draw a fair final conclusion from the data we have here. Certainly this is an expensive solution, and it is absolutely not for everyone. But does it fill the needs of those who would want it and could afford it? Maybe and maybe not.

In almost every game other than Crysis, we don’t have a need to move beyond one 9800 GX2 (or 8800 GT/GTS/GTX/Ultra SLI). And in Crysis, we aren’t simply going to trust NVIDIA when they say we should see 60% scaling. We need to actually make it happen ourselves. The fact that we’ve run into some pretty strange problems doesn’t bode well for the solution in general, but we are willing to see what we can do to make it perform near to what we expect before we cast final judgment.

At the same time, that final judgment must include all the facts about what you gain from Quad SLI for the money. If it makes Crysis smooth as butter at Very High settings (it is actually playable even with the 40 FPS average system limitation), then that is something. But $1200 for a Crysis accelerator is overkill. NVIDIA has made the point to me that $1200 spent on graphics cards is better placed than $1200 spent on an Extreme Edition CPU. That isn’t a particularly compelling argument for us, as I don’t believe we have ever recommended the purchase of an Extreme Edition processor (except perhaps for overclocking enthusiasts). You certainly don’t get what you pay for unless you really need it for a specific CPU-heavy workload. Content creation, engineering, math, and workstation applications might be a good fit, but certainly not gaming, and especially not in a world where the more extreme the CPU, the more cores you get rather than faster ones.

Which brings me to a side rant. Parallelism is a good thing, but neither Intel nor AMD can ignore the single-threaded code path. Not everything can be split up easily, and every thread will always be limited in performance by the speed of the core it is running on. Of course, specialized cores on a heterogeneous processor would help, as would dedicated hardware. That just makes us lament the death of the PPU through the NVIDIA acquisition even more. But I digress.
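The ceiling described above is just Amdahl's law: no matter how many cores you add, the serial portion of the work still runs at single-core speed, which caps the total speedup. A minimal sketch of the math (the 80% parallel fraction below is an illustrative assumption, not a measurement of any real game):

```python
# Amdahl's law: overall speedup is limited by the serial fraction of the
# workload, no matter how many cores the parallel fraction is spread across.

def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Upper bound on speedup when `parallel_fraction` of the work scales
    perfectly across `cores` cores and the remainder runs serially."""
    serial_fraction = 1.0 - parallel_fraction
    return 1.0 / (serial_fraction + parallel_fraction / cores)

# Hypothetical workload that is 80% parallelizable.
for cores in (1, 2, 4, 8, 16):
    print(f"{cores:2d} cores -> {amdahl_speedup(0.8, cores):.2f}x")
```

With 80% of the work parallelizable, sixteen cores yield only a 4x speedup, and even infinite cores never exceed 5x; the remaining 20% always runs at the speed of a single core, which is why neither Intel nor AMD can afford to neglect single-threaded performance.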

On the topic of 9800 GX2 Quad SLI, there are benefits aside from the potential of Crysis that we haven’t covered here. NVIDIA has enabled AA in S.T.A.L.K.E.R., but it is very compute- and memory-heavy. Quad SLI could enable playable frame rates at higher resolutions with 2xAA enabled. With Clear Sky coming out soon, this could be a good thing for fans. You also get SLI AA modes. These do offer a benefit, but AA has diminishing returns at higher resolutions, and especially at higher AA levels. We will be testing SLI AA again at some point, but we want to look at both image quality and performance when we do so.

These cards also act as incredible space heaters. That may not be important right now with summer coming on, but any of our readers that live at the North Pole (Hi Santa! I've been good!) or in Antarctica (sentient penguins running Linux, for example) might appreciate the ability to turn down the thermostat while they sit next to their toasty Quad SLI system.

The bottom line right now is that this is not a solution for most people. Unless we see great scaling in Crysis, there are only a few other compelling features that can currently be enabled through the use of Quad SLI. Sure, it might handle future games with ease, but we always advise against trying to future proof when it comes to graphics. That’s always a path that leads to heartache and the hemorrhaging of money. Just ask previous 7950 GX2 Quad SLI owners about how happy they've been with support and performance over the past year. If you aren’t obsessed with Crysis, skip it. If you are obsessed with Crysis, we’ll get back to you with some numbers on what performance is like once we find a system we can get some headroom on: I’ll have 790i this week and I’ll be hard at work testing it out.


53 Comments


  • DDH III - Thursday, October 23, 2008 - link

    So you're saying that when my first 9800GX2 gets here in the mail next week, it will actually help my landlord with his heating bill? That's good, because he pays for the electric too, and I feel kinda bad already. While heating bills in Interior Alaska are nasty, the electric is just as .. but anyways.

    Great review. Though I won't be able to run these in quad on my current mobo. But then, I never liked FPSes much since Quake, which brings me to my point:

    These days I play insane amounts of Supreme Commander. It is CPU intensive. If you are looking for another benchmark when you start cranking up the GHz, well, just give it a look.

    But to say a bit: 8-player is CPU pain. I only have dual cores, but my first dual cores were AMD (my games never needed the most rocking FPS); I had to go C2D just for game speed.. forget the eye-candy.

    In short I read this article and went..huh...me too.

    Anyways, I am one of those Dell 3007 owners, and I learned a lot. TY. And I don't live in North Pole, AK; I only work there.

  • luther01 - Friday, May 09, 2008 - link

    hi guys. i have a quad 9800 gx2 system as follows:

    q6600 processor @2.8GHz
    4 gig pc-6400 RAM
    780i motherboard
    2x 9800gx2

    i must say that so far i haven't noticed any performance gains over one 9800GX2, and in some games like The Witcher i have actually had a performance drop of 10-15 fps. In Crysis, the frame rate at Very High settings is disappointingly low, sometimes dropping to 15fps in large environments. i think maybe there is another bottleneck in my system.
  • Das Capitolin - Sunday, April 27, 2008 - link

    It appears that NVIDIA is aware of the problem, yet there's no word on what or when it might be fixed.

    http://benchmarkreviews.com/index.php?option=com_c...
  • Mr Roboto - Sunday, March 30, 2008 - link

    Over at TweakTown they're coming up with the same results as Anand did, even with the 9800GTX.
    Is it possible a lot of the problems are because the G92/G94 has a narrower 256-bit memory interface? Could it cause that much of a difference, especially combined with less VRAM? Just a thought.

    http://www.tweaktown.com/articles/1363/1/page_1_in...
  • ianken - Thursday, March 27, 2008 - link

    ...logging CPU loading shows it evenly loaded across four cores at about 50%. But then I'm running a lowly 8800GTS (640MB) SLI setup.

    The numbers in this review just seem kinda off.
  • DerekWilson - Friday, March 28, 2008 - link

    what settings were you testing on?

    we see that High and Very High quality settings have an increasing impact on the CPU ... the higher the visual quality, the higher the CPU load ... it's kind of counterintuitive, but it's a fact.

    you actually might not be able to hit the CPU limit with Very High settings because the GPU limit might be below the CPU limit ... which seems to be somewhere between 40 and 50 fps (depending on the situation) with a 3.2GHz Intel processor.
  • seamusmc - Wednesday, March 26, 2008 - link

    I like HardOCP's review because, while the Quad SLI solution does put up higher numbers than an 8800 Ultra SLI solution, they pointed out some serious problems in every game besides Crysis.

    It appears that memory is a bottleneck, and many games have severe momentary drops in FPS at high resolutions and/or with AA, making the gaming experience worse than an 8800 Ultra SLI solution. I strongly recommend folks take a look at HardOCP's review.

    AnandTech's review only covers average FPS, which does not address or reveal the kinds of issues the Quad SLI solution is having.
  • B3an - Wednesday, March 26, 2008 - link

    Thanks for the mention of the HardOCP review. A lot better than Anand's.

    Very disappointed with Anand on this article. I posted a comment asking why I was getting such bad frame rates at high res with AA, and Anand did not even address this. They must have run into it at 2560x1600, and just about all the games they tested at 2560x1600 will get killed with AA because of the memory bottleneck. I'm talking from trying this myself. If I had known about it I wouldn't have got a GX2, as it's pretty pointless with a 30" monitor.
    So are they getting paid by NV or what?

    Very disappointed.
  • AssBall - Wednesday, March 26, 2008 - link

    "Very disappointed with Anand on this article. I posted a comment asking why i was getting such bad frame rates at high res with AA, and Anand did not even address this."


    They are trying to answer their own questions and solve their own problems right now. This comment section is for comments on the article, not your personal technical support line to Anand.
  • B3an - Thursday, March 27, 2008 - link

    When I said "I posted a comment asking why i was getting such bad frame rates at high res with AA, and Anand did not even address this."

    I meant: why has this website not addressed this issue in any of their GX2 articles? Any decent website would, especially as it's not hard to run into. A high-end card like this is for people with 24" and 30" monitors like myself; at anything lower than 1920x1200, a GTX would be more than enough. Yet the card has massive memory bandwidth problems and/or not enough usable VRAM, so frame rates get completely crippled when a certain amount of AA is used in games.
    Making the card pretty pointless.
