Sometimes it’s the little quirks in life that sneak up on you and change the way you look at the world. The past couple of weeks of testing all this new high-end gear have done exactly that. Sure, we’ve had our problems testing bleeding-edge hardware before, but in putting everything from CrossFireX through 9800 GX2 Quad SLI to the test, we’ve gotten ourselves lost in some other dimension of existence. It’s the only explanation, really. As Holmes would have said … whatever remains, however improbable … But I’m getting ahead of myself.

Skulltrail (the Intel D5400XS board we’re using that runs both CrossFire and SLI) rivals Frankenstein’s monster. With its two LGA775 sockets, FB-DIMMs, and NVIDIA nForce 100 PCIe chips, it’s not for the faint of heart. We’ve been determined to test on a single platform so we can compare CrossFire and SLI directly, and working out the kinks has given us a little trouble. AMD, NVIDIA, and Intel have all worked with us to try to make things go more smoothly (thanks, everyone), but there are still some things going on that we just can’t explain.

After loads of nearly useless testing and many conversations with different people, we set out on a trail of mystical discovery that has unlocked heretofore untold secrets of the universe. We’ll get to that in a bit, but first we’ve got to stop and take a look at what we’re covering today.

Quad SLI. NVIDIA would like us to tell you it’s the new hotness. Certainly, without even opening a page of this review, you should all know that this is the top of the top of the line and that nothing is faster right now. But we do need to answer a few key questions about this $1200 setup: how does it scale from two GPUs to four? How does that scaling compare to CrossFireX? And what kind of performance and value does this solution actually offer?

Honestly, we also have to acknowledge from our previous review of the 9800 GX2 that a single card is enough to run almost any game at maximum settings … that is, with the glaring exception of Crysis. Can Quad SLI change that? From what we saw in our CrossFireX testing from AMD, we would have thought not. However, NVIDIA has managed to get Crysis to scale across all four GPUs despite the interframe dependencies that make it so difficult. Is it enough to run Crysis at a decent resolution with all the eye candy turned on?
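
A quick illustration of why those interframe dependencies matter: under alternate frame rendering (AFR), each GPU renders every Nth frame, and that only scales if frames are independent of one another. Below is a minimal scheduling sketch in Python; the frame times, sync cost, and dependency model are illustrative assumptions on our part, not measurements from this testbed.

```python
# Minimal AFR (alternate frame rendering) scheduling sketch.
# Assumption: every frame costs RENDER_MS of GPU time, and a
# "dependent" frame cannot start until the previous frame has
# finished and its output has been copied across GPUs (SYNC_MS).
RENDER_MS = 30.0   # hypothetical per-frame GPU cost
SYNC_MS = 5.0      # hypothetical inter-GPU copy/sync cost
FRAMES = 400

def afr_fps(num_gpus, interframe_dependency):
    gpu_free = [0.0] * num_gpus      # when each GPU next goes idle
    prev_done = 0.0                  # when the previous frame finished
    for frame in range(FRAMES):
        gpu = frame % num_gpus       # round-robin frame assignment
        start = gpu_free[gpu]
        if interframe_dependency:
            # Frame N reads a surface written by frame N-1, so it must
            # wait for that frame and pay the cross-GPU transfer cost.
            start = max(start, prev_done + SYNC_MS)
        done = start + RENDER_MS
        gpu_free[gpu] = done
        prev_done = done
    return FRAMES / prev_done * 1000.0   # ms -> frames per second

for gpus in (1, 2, 4):
    print(f"{gpus} GPU(s): independent {afr_fps(gpus, False):6.1f} fps, "
          f"dependent {afr_fps(gpus, True):6.1f} fps")
```

With independent frames the model scales linearly with GPU count; with a hard frame-to-frame dependency it collapses to the serialized rate no matter how many GPUs are present, which is why getting Crysis to scale across four GPUs at all is notable.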

Let’s find out …

The Setup and The Test

54 Comments

  • strikeback03 - Wednesday, March 26, 2008 - link

    You posted your comment 20 hrs ago. I don't see any comments posted by any AnandTech staff in this review since that time, so there's no guarantee they have actually read your comment.
  • gudodayn - Tuesday, March 25, 2008 - link

    < Layzer253 ~ No, they shouldn't. It works just fine >

    Given that the 9800x2 is a more powerful card than the 3870x2, the 9800x2 will run into CPU limitations first..........

    Then in theory, when using the same platform (same CPU, RAM, etc.), shouldn't the 9800x2 score at least within the ballpark of the 3870x2??

    It would just mean that with a faster CPU the 9800x2 would have lots of room for improvement, whereas the 3870x2 wouldn't!!

    But that's not what's happening here though, is it??

    For some reason, the 3870x2 in CrossFire is scaling a lot better than the 9800x2 in SLI ~ in a lot of the tests!!

    Either the SLI drivers are messed up or the 9800x2 can't run quad GPUs effectively.........
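
gudodayn's argument above is essentially a bottleneck model: a frame ships only as fast as the slower of the CPU and the GPU array. Here is a rough Python sketch of that model; all the per-frame costs are made up purely for illustration.

```python
# Toy bottleneck model: a frame is limited by whichever of the CPU
# or the GPU array takes longer to process it. All numbers below
# are hypothetical and for illustration only.
def fps(cpu_ms, gpu_ms, num_gpus, scaling=1.0):
    # Effective GPU time shrinks with GPU count, degraded by a
    # scaling-efficiency factor (1.0 = perfect AFR scaling).
    effective_gpu_ms = gpu_ms / (num_gpus * scaling)
    return 1000.0 / max(cpu_ms, effective_gpu_ms)

CPU_MS = 12.0  # hypothetical CPU cost per frame
for gpu_ms, label in ((40.0, "slower GPU pair (3870x2-like)"),
                      (25.0, "faster GPU pair (9800x2-like)")):
    print(label)
    for n in (2, 4):
        print(f"  {n} GPUs: {fps(CPU_MS, gpu_ms, n):.0f} fps")
```

In this toy model the faster card hits the CPU ceiling first, so its measured two-to-four GPU scaling looks worse even when nothing is wrong with the hardware; scaling that stalls well below that ceiling would instead point at drivers.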
  • LemonJoose - Tuesday, March 25, 2008 - link

    I honestly don't know why the hardware vendors keep trying to push these high-end solutions that aren't stable and don't work the way they are supposed to. Exactly who is the market for $500 motherboards that require expensive RAM designed for servers, $1000 CPUs that will be outperformed by mainstream CPUs in a year or less, and $600 video cards that have stability issues and driver problems even when not run in SLI mode?

    I would really love it if AnandTech and other hardware sites would come out and give these products the 1-out-of-10 or 2-out-of-10 review scores they deserve, and tell the hardware companies to spend more time developing solutions for real enthusiasts with mainstream pocketbooks, instead of wasting their engineering resources on these high-end solutions that nobody wants.
  • strikeback03 - Wednesday, March 26, 2008 - link

    Dunno if anyone has actually bought Skulltrail, but obviously people do buy $1000 CPUs and $500 video cards, as some GX2 owners have already posted in this thread, and some people previously bought the Ultra even when it was well over $500. Not something I would do, but there are obviously those with the time and money to play with this kind of thing.
  • B3an - Tuesday, March 25, 2008 - link

    To the writer of this article, or anyone else... I have this strange problem with the GX2.

    In the article you're getting 60+ FPS @ 2560x1600 with 4xAA. Now if I do this I get nowhere near that frame rate. Without AA it's perfect, but with AA it completely kills performance at that res.

    At first I was thinking that the 512MB of usable VRAM was not enough for 2560x1600 + AA, so I get a slide show with 4xAA at that res. But from these tests, you're not getting that.

    What backed up my theory is that in games with higher-res/bigger textures, even 2xAA would kill frame rates. I'd go from 60+ FPS to literally single-digit FPS just by turning on 2xAA @ 2560x1600. But with older games with lower-res textures I can turn the AA up a lot higher before this happens.

    Does anyone know what the problem could be? Because I'd really like to run games at 2560x1600 with AA, but cannot at the moment.

    I'm on Vista SP1, with 4GB RAM, and a Quad @ 3.5GHz.
    I've also tried 3 different sets of drivers.
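
B3an's VRAM theory is easy to sanity-check with back-of-the-envelope arithmetic. The sketch below estimates render-target memory at 2560x1600, assuming 32-bit color, a 32-bit depth/stencil buffer, MSAA multiplying both, and one resolved front/back buffer pair; real drivers compress samples and allocate differently, so treat these as rough upper-end figures.

```python
# Rough render-target footprint at a given resolution and MSAA level.
# Assumes 4 bytes/pixel color + 4 bytes/pixel depth-stencil, both
# multiplied by the sample count, plus a resolved front/back pair.
def render_target_mb(width, height, msaa):
    pixels = width * height
    sampled = pixels * msaa * (4 + 4)   # multisampled color + depth
    resolved = pixels * 4 * 2           # resolved front + back buffers
    return (sampled + resolved) / (1024 ** 2)

for msaa in (1, 2, 4):
    print(f"2560x1600 @ {msaa}xAA: ~{render_target_mb(2560, 1600, msaa):.0f} MB "
          "of render targets")
```

At 4xAA that works out to roughly 156MB of render targets before a single texture is loaded, so a game with large high-resolution textures could plausibly spill past the 512MB visible to each GPU and fall off a performance cliff, consistent with what B3an describes.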
  • nubie - Tuesday, March 25, 2008 - link

    OK, it is about time socketable GPUs were on the market. How about a mATX board with an edge-connected PCIe x32 slot? Make the video board ATX-compliant (i.e., video board + mATX board = ATX-compliant).

    Then we can finally cool these damn video cards, and maybe find a way of getting them power that doesn't get in the way.

    You purchase the proper daughterboard for your class (mainstream, enthusiast, overclocker/benchmarker), and it comes with the RAM slots (or onboard RAM) and the proper voltage circuitry. Then you can change just the GPU when you need to upgrade.

    I know it would be hard to implement and confusing, but it would be less confusing than the current situation once we got used to it, and it would be a hell of a lot more sane.

    You could use a PCI-e extender to connect it to existing PCI-e mATX or full ATX motherboards.

    It is either this, or put the GPU socket on the damn motherboard already; it is 2008, it needs to happen. (If the video card were connected over HyperTransport III you wouldn't have any problem with bandwidth.) The next step really needs to be a general-purpose processor built into the video card, so that you aren't CPU/system bound like the current ridiculous setup. (The 790i chipset seems to help, with the ability to have the northbridge batch commands across the GPUs, but at minimum we need to see some physics moving away from the CPU. General physics haven't changed since the universe started, so why waste a general-purpose CPU on them?)
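
For what it's worth, the HyperTransport III aside can be checked with simple arithmetic. The sketch below compares per-direction bandwidth of PCIe 2.0 (500 MB/s per lane after 8b/10b encoding) against a 16-bit HT 3.0 link at its 2.6 GHz maximum clock, double data rate:

```python
# Back-of-the-envelope link bandwidth comparison (per direction).
# PCIe 2.0: 5 GT/s per lane, 8b/10b encoding -> 500 MB/s usable.
# HT 3.0:   2.6 GHz clock, double data rate -> 5.2 GT/s per bit lane.
def pcie2_gbs(lanes):
    return lanes * 0.5                  # GB/s per direction

def ht3_gbs(link_width_bits):
    return 5.2 * link_width_bits / 8    # GB/s per direction

print(f"PCIe 2.0 x16:  {pcie2_gbs(16):.1f} GB/s each way")
print(f"PCIe 2.0 x32:  {pcie2_gbs(32):.1f} GB/s each way")
print(f"HT 3.0 16-bit: {ht3_gbs(16):.1f} GB/s each way")
```

A 16-bit HT 3.0 link does edge out PCIe 2.0 x16 per direction, though the x32 slot proposed above would outrun both.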
  • nubie - Tuesday, March 25, 2008 - link

    Just to reiterate: why am I buying a new card every time when they just seem to be recycling the reference design? All GPU systems need the same things as a motherboard: clean voltage to the main processor and its memory, a bus for the RAM, and some I/O. The 7900, 8800, and 9600 are practically the same boards with the same memory bus; can't we have it socketed?
  • tkrushing - Tuesday, March 25, 2008 - link

    I am all for SLI/CrossFire or whatever you can afford to do, but why are we starting to lose focus on single-card users? I'm sure if I really wanted to sacrifice I could save up for a multi-GPU system, but a single G92 costs less than half the price for a relatively small performance hit in Crysis. And yes, I love it, but I'm saying it: I just think Crysis is a poorly optimized game to some degree. Give us new, not reused, single-card solutions! (9 series)
  • tviceman - Tuesday, March 25, 2008 - link

    I have been thinking the same thing lately, but last week, after reading about Intel's plans to use one of its CPU cores as a graphics unit, I started thinking about the ramifications. I am willing to bet that NVIDIA is trying to develop a hybrid CPU/GPU to compete on the same platform that Intel and AMD will eventually have. If this is true, then it's probably a reasonable explanation for why there has been a severe lack of all-new GPUs since the launch of the 8xxx series a year ago.
  • dlasher - Tuesday, March 25, 2008 - link

    page 1, paragraph 2, line 1:

    Is:
    "...it’s not for the feint of heart.."

    Should be:
    "...it’s not for the faint of heart.."
