(Not so) Final Words

Unfortunately, we can’t really draw a fair final conclusion from the data we have here. Certainly this is an expensive solution, and it is absolutely not for everyone. But does it fill the needs of those who would want it and could afford it? Maybe and maybe not.

In almost every game other than Crysis, there's no need to move beyond a single 9800 GX2 (or 8800 GT/GTS/GTX/Ultra SLI). And in Crysis, we aren't simply going to trust NVIDIA when they say we should see 60% scaling; we need to actually make it happen ourselves. The fact that we've run into some pretty strange problems doesn't bode well for the solution in general, but we are willing to see what we can do to make it perform close to what we expect before we cast final judgment.
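That claimed 60% is easy to sanity-check with back-of-the-envelope math. As a rough sketch (the function and all of the FPS figures below are illustrative assumptions, not measurements):

```python
# Back-of-the-envelope multi-GPU scaling estimate (illustrative only).
# `scaling` is the fractional gain from each doubling of GPU count,
# e.g. 0.6 for the 60% figure NVIDIA quotes; real games vary widely.

def estimated_fps(single_gpu_fps, doublings, scaling=0.6):
    """Estimate FPS after `doublings` of the GPU count (1 = SLI, 2 = Quad SLI)."""
    fps = single_gpu_fps
    for _ in range(doublings):
        fps *= 1 + scaling
    return fps

# Hypothetical baseline: a single G92 managing 15 FPS in Crysis at
# Very High would work out to 24 FPS in SLI and roughly 38 FPS in
# Quad SLI -- if, and only if, 60% scaling held at each step.
print(round(estimated_fps(15, 1)))  # 24
print(round(estimated_fps(15, 2)))  # 38
```

Even under that optimistic assumption, Quad SLI only just approaches comfortable frame rates, which is exactly why we want to verify the scaling ourselves rather than take the number on faith.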

At the same time, that final judgment must weigh everything you gain from Quad SLI against the money. If it makes Crysis smooth as butter at Very High settings (it is actually playable even with the 40 FPS average system limitation), then that is something. But $1200 for a Crysis accelerator is overkill. NVIDIA has made the point to me that $1200 spent on graphics cards is better placed than $1200 spent on an Extreme Edition CPU. That isn't a particularly compelling argument for us, as I don't believe we have ever recommended the purchase of an Extreme Edition processor (except, perhaps, for overclocking enthusiasts). You certainly don't get what you pay for unless you really need it for a specific CPU-heavy workload. Content creation, engineering, math, and workstation applications might be a good fit, but certainly not gaming, and especially not in a world where going more "extreme" increasingly just means getting more cores.

Which brings me to a side rant. Parallelism is a good thing, but neither Intel nor AMD can ignore the single-threaded code path. Not everything can be split up easily, and every thread will always be limited in performance by the speed of the core it is running on. Of course, specialized cores on a heterogeneous processor would help, as would dedicated hardware. That just makes us lament the death of the PPU through the NVIDIA acquisition even more. But I digress.

On the topic of 9800 GX2 Quad SLI, there are benefits aside from the potential of Crysis that we haven't covered here. NVIDIA has enabled AA in S.T.A.L.K.E.R., but it is very compute- and memory-intensive; Quad SLI could enable playable frame rates at higher resolutions with 2xAA enabled. With Clear Sky coming out soon, this could be a good thing for fans. You also get SLI AA modes. These do offer a benefit, but AA has diminishing returns at higher resolutions, and especially at higher AA levels. We will be testing SLI AA again at some point, but we want to look at both image quality and performance when we do.

These cards also act as incredible space heaters. That may not be important right now with summer coming on, but any of our readers that live at the North Pole (Hi Santa! I've been good!) or in Antarctica (sentient penguins running Linux, for example) might appreciate the ability to turn down the thermostat while they sit next to their toasty Quad SLI system.

The bottom line right now is that this is not a solution for most people. Unless we see great scaling in Crysis, there are only a few other compelling features that Quad SLI currently enables. Sure, it might handle future games with ease, but we always advise against trying to future-proof when it comes to graphics; that's a path that leads to heartache and the hemorrhaging of money. Just ask previous 7950 GX2 Quad SLI owners how happy they've been with support and performance over the past year. If you aren't obsessed with Crysis, skip it. If you are obsessed with Crysis, we'll get back to you with numbers on performance once we find a system that gives us some headroom: I'll have 790i this week and will be hard at work testing it out.

Comments

  • iceveiled - Tuesday, March 25, 2008 - link

    I understand Crysis is a good game to test the muscle of video cards, but if anybody out there hasn't played the game yet and wants the best setup for it, please don't spend $1200 on video cards. I've played through Half-Life 2 numerous times, Call of Duty 4 numerous times, and Crysis only once. Once you get over the wow factor of the graphics, it's not that amazing an experience....
  • mark3450 - Tuesday, March 25, 2008 - link

    In this and the last article on the 9800GX2, the following Crysis benchmark data at 2560x1600 has shown up in a chart.

    9800GX2 - 8.9FPS
    8800Ultra - 16.3FPS
    8800GT - 12.3FPS

    Now I look at this and I say the FPS scaling you get by adding a second card is generally around 50% to 60% in the best case scenario. If we assume that, then 2x 8800Ultra would be getting around 25FPS, which is getting into playable range especially with the motion blur that Crysis uses. Obviously this is assuming decent scaling, but this data just screams give it a try.

    On a slightly related note, I also see that the same Crysis chart shows that two cards scale roughly linearly with resolution up to 2560x1600 (8800GT and 8800Ultra) while the others show a sharp drop at 2560x1600 (9800GX2, all AMD cards). This makes me ask what's different about these two groups of cards. One common feature I note is that the cards that scale linearly are all using PCIe 2.0, while the ones that have a sharp drop-off @2560x1600 are using PCIe 1.x (the 9800GX2 is externally PCIe 2.0, but internally the two cards are connected via PCIe 1.x). Maybe it has nothing to do with the type of PCIe connection, but it certainly correlates.

    Basically all this makes me think that for gaming at 2560x1600 I'm likely to be better off with two 8800Ultra's (or even 8800 GTX's) than with one or even two 9800GX2's (and since I and a lot of people interested in gaming on high-end rigs at 2560x1600 likely have an 8800GTX/Ultra already, it would be far cheaper as well). This is of course all speculation, since there are no reported benchmarks for 8800GTX/Ultra in SLI mode in these comparisons, which is why I'd like to request them. :)

    -Mark
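The PCIe bandwidth gap mark3450 speculates about is easy to put rough numbers on. The per-lane link rates below are the spec figures; the frame-traffic estimate is purely an illustration of AFR-style frame shuttling:

```python
# Per-direction bandwidth of an x16 link (PCIe spec figures):
# PCIe 1.x: 250 MB/s per lane; PCIe 2.0: 500 MB/s per lane.
pcie1_x16_gbs = 250 * 16 / 1000.0   # 4.0 GB/s
pcie2_x16_gbs = 500 * 16 / 1000.0   # 8.0 GB/s

# Illustrative traffic for moving finished frames at 2560x1600 with
# 32-bit color, as alternate-frame multi-GPU rendering must do:
frame_mb = 2560 * 1600 * 4 / 1e6          # ~16.4 MB per frame
traffic_gbs_60fps = frame_mb * 60 / 1000  # ~0.98 GB/s at 60 FPS

print(pcie1_x16_gbs, pcie2_x16_gbs, round(traffic_gbs_60fps, 2))
```

Finished frames alone sit well under even the PCIe 1.x rate, so if the interconnect really is the culprit at 2560x1600, texture and geometry traffic would be the more likely reason; the correlation remains speculation until SLI numbers are published.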
  • mark3450 - Tuesday, March 25, 2008 - link

    Turns out hardocp has a review up at

    http://enthusiast.hardocp.com/article.html?art=MTQ...
    (can't insert a proper link for some reason)

    that compares 2x 9800GX2's with 2x 8800GTX's. The short summary is that 2x 8800GTX's are better than 2x 9800GX2's at hi-res gaming. The 9800GX2's often have higher average frame rates, but the 8800GTX's have much more consistent frame rates (the 9800GX2's often had their frame rates crash to unacceptable levels for short periods, whereas the 8800GTX's were playable throughout).

    Essentially it looks like I am better off getting a second 8800GTX rather than 1 or 2 9800GX2's for gaming at 2560x1600, and it's way cheaper to boot.

    I will still wait till next week to see how the 9800GTX performs, but given the leaked info on it and recent history of anemic releases by NVIDIA I'm not holding out much hope for the 9800GTX.

    -Mark
  • zshift - Tuesday, March 25, 2008 - link

    In the second paragraph you noted Skulltrail as having 2x LGA775 sockets, but I'm pretty sure it has LGA771 sockets only. If I'm mistaken, I apologize; if I'm right, please correct the error so other less knowledgeable readers don't receive false information.
  • Tilmitt - Tuesday, March 25, 2008 - link

    You guys shouldn't be using Skulltrail to benchmark games. It's not a gaming platform. Most games run slower on it than on a single-socket quad-core system due to the FB-DIMMs. It provides a sub-optimal environment for both SLI and CrossFire, which negates any value it might have for levelling the playing field there. I think the author is letting his personal desire to use the Skulltrail system get in the way of doing a proper review. The fact of the matter is that Skulltrail is slow for games and doesn't reflect how the vast majority of people would run their SLI and CrossFire setups.

    As to the multi-GPU-ness, I think you'd have to be mad to buy them given the price and horrendous scaling. As always, the next generation of cards will mostly outperform a multi-GPU setup at lower cost, with lower power consumption and more consistent performance across all games.
  • tynopik - Tuesday, March 25, 2008 - link

    faint of heart, not feint ;)
  • cactusjack - Tuesday, March 25, 2008 - link

    This is my point. The testers here had "some problems," and these guys are very experienced and technically savvy. They also have access to a lot of PSUs, RAM, etc. to try if things don't work right. If it were a car or a television it would be sent back as what it is: a failure and a lemon. Why do we accept it with PC parts?
  • Inkjammer - Tuesday, March 25, 2008 - link

    Case in point, I have the 9800 GX2:
    * I cannot run multiple monitors with SLI enabled, so I have to swap between my 24" monitor and my Wacom Cintiq 21". When I change over, the drivers won't auto-detect the resolution and instead use resolutions and refresh rates the Wacom doesn't support, giving me an "out of signal" error. I have to disable SLI to use my $2,500 art tablet as a secondary monitor.
    * I'm a graphic designer, and I can't take screenshots anymore without them coming out garbled like this:
    http://www.inkjammer.com/broken_screencaps.jpg

    I could find workarounds, get a screen cap program or just disable SLI, but this is all basic functionality gone bad.

    There are a LOT of little problems that could impede testing without being visible. The fact that SLI breaks basic functions like multi-monitor setups and screen capture in Vista is puzzling. These drivers feel like betas lacking basic functionality. If I even try Crysis with 8X FSAA my entire system crashes.
  • dare2savefreedom - Tuesday, March 25, 2008 - link

    Did you report this to nvidia:

    http://www.nvidia.com/object/vistaqualityassurance...

  • Inkjammer - Tuesday, March 25, 2008 - link

    I upgraded from an 8800 GTX to a 9800 GX2, and I'm really frustrated with the drivers - there are a lot of issues running it in "SLI". The card has a lot of raw performance, but seeing it doubled up with two cards...

    Costs aside, it really seems like anything beyond two GPUs at this point in time is rather useless. The technology is there, but the drivers are still too immature, and the rest of the tech required to make it useful hasn't caught up.
