Original Link: http://www.anandtech.com/show/2488

Sometimes it’s the little quirks in life that sneak up on you and change the way you look at the world. The past couple of weeks of testing all this new high end gear have done just that. Sure, we’ve had our problems testing bleeding edge hardware before, but in putting everything from CrossFireX to 9800 GX2 Quad SLI to the test, we’ve gotten ourselves lost in some other dimension of existence. It’s the only explanation, really. As Holmes would have said … whatever remains, however improbable … But I’m getting ahead of myself.

Skulltrail (the Intel D5400XS board we’re using that runs both CrossFire and SLI) rivals Frankenstein’s monster. With its two LGA775 sockets, FB-DIMMs, and NVIDIA nForce 100 PCIe chips, it’s not for the faint of heart. We’ve been determined to test on a single platform so we can compare CrossFire and SLI directly, and working out the kinks has given us a little trouble. AMD, NVIDIA, and Intel have all worked with us to try to make things go more smoothly (thanks, everyone), but there are still some things going on that we just can’t explain.

After loads of nearly useless testing and many conversations with different people, we set out on a trail of mystical discovery that has unlocked secrets of the universe heretofore untold. We’ll get to that in a bit, but first we’ve got to stop and take a look at what we are covering today.

Quad SLI. NVIDIA would like us to tell you it’s the new hotness. Certainly, without even opening a page of this review you should all know that this is the top of the top of the line and nothing is faster right now. But we do need to answer a few key questions about this $1200 setup: how does it scale from two GPUs to four, how does that scaling compare to CrossFireX, and what kind of performance and value does this solution actually offer?

Honestly, we also have to acknowledge from our previous review of the 9800 GX2 that a single card is enough to run almost any game at maximum settings … that is, with the glaring exception of Crysis. Can Quad SLI change that? From what we saw in our CrossFireX testing from AMD, we would have thought not. However, NVIDIA has managed to get Crysis to scale across all four GPUs despite the interframe dependencies that make it so difficult. Is it enough to run Crysis at a decent resolution with all the eye candy turned on?

Let’s find out …

The Setup and The Test

We did have some issues again with this one. If you’ve already got one 9800 GX2 with the driver installed, uninstall the driver, reboot, power down, plug in the second card, boot, reboot, then install the driver. Trust me, it will save you a headache. It seems NVIDIA and AMD still need some time to sort out Vista and multi-GPU when adding a card or removing a card. We didn’t have the same problems we did with CrossFireX, but the potential is there to cause some frustration.

We also ran into a huge (in our opinion) bug that was very difficult to track down. The first two things we do once our graphics driver is installed are to disable vsync and to disable image scaling to fit panel size. We run with no scaling at centered timings. This affords us the ability to see things at the same DPI across the board, and it also lets us tell what resolution we are running just by looking at the screen. That saves us a lot of trouble when things inevitably get mucked up for one reason or another. I also tend toward the obsessive/compulsive, and if I can’t see it, I need to set the res like four times just to be sure.

In any case, 2x 9800 GX2 cards in Quad SLI will not run any games at less than panel resolution if scaling is disabled. Run a game and you get a black screen. Change the resolution in-game to something lower than panel res and you get a black screen. Well, to be fair, it’s not just a black screen: it’s a hard lock. This needs to be fixed. It happens on both Skulltrail and 780i, so it’s not an isolated issue.

Also, NVIDIA decided to install a link to a trial version of Portal on the user's desktop when its driver is installed. I suppose a link to the site is better than bundling Earthsim, but not even asking whether the customer wants more clutter on their desktop before putting it there is terribly inappropriate. I don’t care about bundling a trial, but please ask before you put something on my desktop.

The test system we used is the same as the one from the 9800 GX2 review, as are the driver revisions.

Test Setup
CPU: 2x Intel Core 2 Extreme QX9775 @ 3.20GHz
Motherboard: Intel D5400XS (Skulltrail)
Video Cards: ATI Radeon HD 3870 X2
  ATI Radeon HD 3870
  NVIDIA GeForce 9600 GT 512MB
  NVIDIA GeForce 8800 GT 512MB
  NVIDIA GeForce 8800 Ultra
  NVIDIA GeForce 9800 GX2
Video Drivers: Catalyst 8.3
  ForceWare 174.53
Hard Drive: Seagate 7200.9 120GB 8MB 7200RPM
RAM: 2x Micron 2GB FB-DIMM DDR2-800
Operating System: Windows Vista Ultimate 64-bit SP1

Thanks go out to EVGA for supplying the two 9800 GX2 units for this review.

As for power consumption, here’s what we’ve got from these beasts.

Idle Power

Load Power

Overall Performance Scaling With 4 GPUs

Before we present the percent scaling when moving from SLI / CrossFire to Quad-SLI / CrossFireX, we need to make note of a few things.

First, the 9800 GX2 is a higher performance part and is going to run into CPU limitations more readily than the 3870 X2. This means that sometimes scaling won’t reflect the true potential of the NVIDIA solution.

Second, anything over 50% scaling shows that the game is running on all four GPUs. However, less than 50% scaling doesn’t mean that all four GPUs are sitting idle. If performance doesn’t scale nearly linearly from one GPU to two, it likely won’t scale linearly from two to three either, so you could be seeing work done on three GPUs at less than 50% scaling over the single-card dual-GPU solutions we have here. Adding a fourth GPU might then not push the percentage high enough to make it clear that four GPUs matter.

What more than 50% scaling does tell us is that all four GPUs absolutely matter and the game scales well with a quad solution. With that in mind, let’s take a look at the numbers.
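To make that 50% threshold concrete, here’s a small Python sketch. The frame rates are hypothetical, chosen only to illustrate the logic: if a third GPU scaled perfectly from two, the best case is +50%, so anything above that requires the fourth GPU to be pulling its weight.

```python
def percent_scaling(fps_dual_gpu, fps_quad_gpu):
    """Percent performance gain moving from a dual-GPU to a quad-GPU setup."""
    return (fps_quad_gpu / fps_dual_gpu - 1.0) * 100.0

# Hypothetical frame rates (not our measured results):
# Perfect scaling from 2 GPUs to 3 tops out at +50%, so a result
# over 50% means the fourth GPU unambiguously contributes.
print(percent_scaling(40.0, 64.0))  # over 50%: all four GPUs clearly matter
print(percent_scaling(40.0, 56.0))  # under 50%: three or four GPUs could be at work
```

Note the asymmetry: a number above 50% is conclusive, while a number below it tells us nothing definitive about how many GPUs are busy.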

So this becomes very interesting when you hear that NVIDIA claims 60% scaling with Quad-SLI in Crysis when running at Very High settings. We ran these numbers at High + Very High shaders, as we really didn’t expect that Very High would be anything more than a slide show. We’ll take a look at this later. For now, suffice it to say that High settings plus Very High shaders is CPU bound even at 1920x1200 under Crysis. Very High settings are also playable with Quad-SLI, but more on that later.

The Oblivion without-AA numbers on NVIDIA are low because a single 9800 GX2 actually outperforms Quad-SLI until you hit 2560x1600. This beast is made for high res gaming as long as bandwidth doesn’t kill it. 2560x1600 with Crysis isn’t here yet (unless you want to turn the quality way down), but that resolution is only for people with 30” panels anyway. If the only thing you want to buy 9800 GX2 Quad-SLI for is Crysis, then by all means save the cash and get a 1920x1200 panel. We certainly don’t recommend such flagrant spending for one title though, and if you want your money’s worth, you’ll want the biggest display possible.

So, CrossFireX doesn’t scale at all in Crysis, all of our AMD cards started crashing out of the World in Conflict benchmark, and CrossFireX won’t run with OpenGL games yet. (AMD will release a driver supporting this at a later time.)

Neither quad solution hits every mark perfectly. We’ll take a look in a minute at what happens with Crysis at higher settings, but in Oblivion - since performance is actually lower than on AMD hardware - we wouldn’t expect to see any sort of system limitation here.

What about Crysis on Very High?

So we tested Crysis at Very High settings with our single 9800 GX2 and our quad setup. Here’s what we got:

As we can see, with very high quality, performance starts to diverge between the dual and quad GPU NVIDIA solutions. It’s also interesting to note that performance doesn’t drop a great deal when moving up in quality. This indicates that the 9800 GX2 is still system bound in some way. And oddly enough, it looks like it is more system bound at the higher quality setting.

To test this absurd theory, we decided to see what happened when we overclocked our eight cores from 3.2GHz to 4GHz. Let’s test the theory that we are CPU bound to about 45-50 fps at High quality and 40 fps at Very High quality. Here is what our 25% overclock netted us when we tested both Very High and High quality using the 9800 GX2 in Quad SLI.

I’ll give you all a second to pick your jaws up off the floor....

Ok, break’s over. Performance improved by more than half a percent for every percent increase in CPU clock speed: our 25% clock speed increase netted us more than a 12.5% gain in real performance under Very High quality settings. We saw less than that at lower resolutions with High quality plus Very High shaders, until we hit 1920x1200, where the 25% increase in clock speed netted us a 15% gain.
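To put that arithmetic in one place, here’s a quick Python sketch. The frame rates are hypothetical, chosen only to illustrate the ratio; the clock speeds are our actual 3.2GHz stock and 4GHz overclock.

```python
def overclock_efficiency(base_clock, oc_clock, base_fps, oc_fps):
    """Fraction of the clock-speed gain that shows up as real performance.

    A value near 1.0 means fully CPU bound; near 0.0 means the CPU
    doesn't matter at all for this workload.
    """
    clock_gain = oc_clock / base_clock - 1.0  # 3.2 -> 4.0 GHz is a 25% gain
    perf_gain = oc_fps / base_fps - 1.0
    return perf_gain / clock_gain

# Hypothetical FPS numbers for illustration: a 25% overclock yielding a
# 15% frame rate gain means roughly 60% of the clock increase was
# recovered as performance -- a strongly CPU-bound scenario.
print(overclock_efficiency(3.2, 4.0, 40.0, 46.0))
```

Anything above 0.5 on this scale is what we mean by “more than half a percent of performance per percent of clock speed.”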

This indicates that the higher the graphical quality, the MORE CPU bound we are. Crazy isn’t it? It's counter-intuitive, but pure fact. In speaking with NVIDIA about this (they have helped us a lot in understanding some of our issues here), the belief is that more accurate and higher quality physics at higher graphical quality settings is what causes this overhead. Also, keep in mind that we are testing in a timedemo with AI disabled.

And that’s not where it ends. We are platform bound as well. Yes, I said platform bound. A quick check on 780i returned these results:

Note that we are still CPU bound here even though performance is about 25% higher than on Skulltrail (we benefit even more from a platform change than from an overclock). This is with the same speed CPU. It’s quite unfortunate that we stumbled across all of this only last night, with the help of NVIDIA, while trying to troubleshoot our issues. Remember we said that NVIDIA expects a 60% improvement in Crysis at 19x12 with Very High quality settings. The added number of cores, the fact that I’m only able to run two FB-DIMMs at the moment (for half of the system memory bandwidth I should have), the arrangement of the PCIe lanes on Skulltrail … All of this contributes to our system limitation here and our inability to see scaling from 9800 GX2.

Now, we have seen better performance on Skulltrail in the past, so it is unclear if there is something we can do to remove some of this system limitation at this point. We will certainly be exploring this further as we would still like to make a single platform work for all of our graphics testing. If we can’t then we’ll move on, but it is useful to discover if this is an Intel issue, an OS issue, a driver issue, or something else. If it’s fixable we need to find out how to fix it, as there are probably two or three people out there who’ve purchased a D5400XS board and will not be happy if it performs much worse than 780i boards in cases like this. My working theory right now is that I applied some hotfix or changed some seemingly benign OS setting that caused some problem somewhere. But like I said, we’ll track it down.

Do we see the same thing with World in Conflict?

World in Conflict is also system bound. In our overclock test, we did not see any improvement in performance using the built-in benchmark. This indicates that the limitation is coming from somewhere else. Our platform swap did give us almost 10% improvement but our performance “curve” was still just as flat, meaning we’re still limited somewhere.

With the type of high-end hardware now in our hands that has enabled us to expose such limitations, we still have a lot of investigating to do before we better understand the issues. It is clear that the benchmark for World in Conflict is much harder on the system than actual gameplay, so we will try out some in-game tests to see if we can’t make any more sense of this one. For now, here’s what we see on Skulltrail.

All the rest

Call of Duty 4: Modern Warfare Performance

NVIDIA rocks the house on this benchmark. Not that AMD hardware performs poorly, but 4xAA (or beyond) is a no-brainer at any res here.

The Elder Scrolls IV: Oblivion Performance

No, really, AMD is just that much more badass under Oblivion than NVIDIA. The R6xx architecture owns at Oblivion. Someone emailed me and asked if I had gotten the resolutions off when I did the AMD tests. I double-checked for you, and these numbers are correct.

Enemy Territory: Quake Wars Performance

As we mentioned, future AMD drivers will support CrossFireX under OpenGL. Currently this is not supported, so the numbers here reflect two-GPU CrossFire performance.

S.T.A.L.K.E.R. Performance

Quad SLI does push frame rates up at high res, and for those adventurous enough to enable AA for S.T.A.L.K.E.R., NVIDIA has made it possible to force it on in the driver. Performance drops big time, but it should be possible to get 2xAA at 1920x1200 at playable frame rates. CrossFireX performance also falls off much more quickly as resolution increases in this test.

(Not so) Final Words

Unfortunately, we can’t really draw a fair final conclusion from the data we have here. Certainly this is an expensive solution, and it is absolutely not for everyone. But does it fill the needs of those who would want it and could afford it? Maybe and maybe not.

In almost every game other than Crysis, we don’t have a need to move beyond one 9800 GX2 (or 8800 GT/GTS/GTX/Ultra SLI). And in Crysis, we aren’t simply going to trust NVIDIA when they say we should see 60% scaling. We need to actually make it happen ourselves. The fact that we’ve run into some pretty strange problems doesn’t bode well for the solution in general, but we are willing to see what we can do to make it perform near to what we expect before we cast final judgment.

At the same time, that final judgment must include all the facts about what you gain from Quad SLI for the money. If it makes Crysis smooth as butter at Very High settings (it is actually playable even with the 40 average FPS system limitation), then that is something. But $1200 for a Crysis accelerator is a bit of overkill. NVIDIA has made the point to me that $1200 spent on graphics cards is better placed than $1200 spent on an Extreme Edition CPU. That isn’t a particularly compelling argument for us as I don’t believe we have ever recommended the purchase of an Extreme Edition processor (except for overclocking enthusiasts, perhaps). You certainly don’t get what you pay for unless you really need it for a specific CPU heavy workload. Content creation, engineering, math, and workstation applications might be a good fit, but certainly not gaming and especially not in a world where the more extreme you get the more cores you have.

Which brings me to a side rant. Parallelism is a good thing, but neither Intel nor AMD can ignore the single-threaded code path. Not everything can be split up easily, and every thread will always be limited in performance by the speed of the core it is running on. Of course, specialized cores on a heterogeneous processor would help, as would dedicated hardware. That just makes us lament the death of the PPU through the NVIDIA acquisition even more. But I digress.

On the topic of 9800 GX2 Quad SLI, there are benefits aside from the potential of Crysis that we haven’t covered here. NVIDIA has enabled AA on S.T.A.L.K.E.R., but it is very processing and memory heavy. Quad SLI could enable playable frame rates at higher resolutions with 2xAA enabled. With Clear Sky coming out soon, this could be a good thing for fans. You also get SLI AA modes. These do offer a benefit, but AA has diminishing returns at higher resolutions, and especially at higher AA levels. We will be testing SLI AA again at some point, but we want to look at both image quality and performance when we do so.

These cards also act as incredible space heaters. That may not be important right now with summer coming on, but any of our readers that live at the North Pole (Hi Santa! I've been good!) or in Antarctica (sentient penguins running Linux, for example) might appreciate the ability to turn down the thermostat while they sit next to their toasty Quad SLI system.

The bottom line right now is that this is not a solution for most people. Unless we see great scaling in Crysis, there are only a few other compelling features that can currently be enabled through the use of Quad SLI. Sure, it might handle future games with ease, but we always advise against trying to future proof when it comes to graphics. That’s always a path that leads to heartache and the hemorrhaging of money. Just ask previous 7950 GX2 Quad SLI owners about how happy they've been with support and performance over the past year. If you aren’t obsessed with Crysis, skip it. If you are obsessed with Crysis, we’ll get back to you with some numbers on what performance is like once we find a system we can get some headroom on: I’ll have 790i this week and I’ll be hard at work testing it out.
