Overall Performance Scaling With 4 GPUs

Before we present the percent scaling when moving from SLI / CrossFire to Quad-SLI / CrossFireX, we need to make note of a few things.

First, the 9800 GX2 is a higher performance part and is going to run into CPU limitations more readily than the 3870 X2. This means that sometimes scaling won’t reflect the true potential of the NVIDIA solution.

Second, anything over 50% scaling shows that the game is running on all four GPUs. Less than 50% scaling, however, doesn't mean the extra GPUs are sitting idle. On the contrary, if two GPUs don't scale near linearly, moving from two to three likely won't scale linearly either, meaning you could be seeing work done on three GPUs at less than 50% scaling over the single-card dual-GPU solutions we have here. Adding a fourth GPU might then not push the percentage high enough to make it clear that four GPUs matter.
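To put numbers to that argument, here is a minimal sketch in Python, using hypothetical frame rates rather than our measured results, of how the 50% threshold falls out when the baseline is a single-card dual-GPU score:

    # Hypothetical frame rates only -- this illustrates the 50% threshold,
    # it is not benchmark data from this review.
    def percent_scaling(dual_fps, quad_fps):
        """Percent gain going from the dual-GPU card to the quad setup."""
        return (quad_fps / dual_fps - 1.0) * 100.0

    # With two GPUs as the baseline, three perfectly scaling GPUs would be
    # +50% and four would be +100%.
    dual = 40.0
    print(percent_scaling(dual, 62.0))  # ~55 -> above 50%, so the fourth GPU must matter
    print(percent_scaling(dual, 58.0))  # ~45 -> three or even four GPUs may still be
                                        #        contributing, just not scaling linearly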

What more than 50% scaling means is that all four GPUs absolutely matter and the game scales well with a Quad solution. With that in mind, let's take a look at the numbers.


So this becomes very interesting when you hear that NVIDIA claims 60% scaling with Quad-SLI in Crysis at Very High settings. We ran our numbers at High settings + Very High shaders, as we really didn't expect Very High to be anything more than a slide show. We'll take a look at this later. For now, suffice it to say that High settings plus Very High shaders is CPU bound even at 1920x1200 under Crysis. Very High settings are also playable with Quad-SLI, but more on that later.

The Oblivion numbers without AA are low on NVIDIA because a single 9800 GX2 actually outperforms Quad-SLI until you hit 2560x1600. This beast is made for high-res gaming as long as bandwidth doesn't kill it. Playable 2560x1600 in Crysis isn't here yet (unless you want to turn the quality way down), but that resolution is only relevant to people with 30" panels. If the only thing you want to buy 9800 GX2 Quad-SLI for is Crysis, then by all means save the cash and get a 1920x1200 panel. We certainly don't recommend such extravagant spending for one title though, and if you want your money's worth, you'll want the biggest display possible.

So, CrossFireX doesn’t scale at all in Crysis, all of our AMD cards started crashing out of the World in Conflict benchmark, and CrossFireX won’t run with OpenGL games yet. (AMD will release a driver supporting this at a later time.)

Neither quad solution hits every mark perfectly. We'll take a look in a minute at what happens with Crysis at higher settings, but in Oblivion - since performance is actually lower than it is on AMD hardware - we wouldn't expect the shortfall to be any sort of system limitation.

Comments

  • strikeback03 - Wednesday, March 26, 2008

    you posted your comment 20 hrs ago. I don't see any comments posted by any Anandtech staff in this review since that time, so no guarantee they have actually read your comment.
  • gudodayn - Tuesday, March 25, 2008

    < Layzer253 ~ No, they shouldn't. It works just fine >

    Given that the 9800x2 is a more powerful card than the 3870x2, the 9800x2 will run into CPU limitations..........

    Then in theory, when using the same platform (same CPU, RAM, etc.), shouldn't 9800x2 score at least within the ball park range of the 3870x2??

    It would just mean with a faster CPU, 9800x2 will have lots of room for improvement whereas the 3870x2 wouldn't!!

    But that's not what's happening here though, is it??

    For some reason, 3870x2 in Crossfire is scaling a lot better than 9800x2 in SLI ~ in a lot of the tests!!

    Either SLI drivers are messed up or the 9800x2 can't run quad GPUs effectively.........
  • LemonJoose - Tuesday, March 25, 2008

    I honestly don't know why the hardware vendors keep trying to push these high end solutions that aren't stable and don't work the way they are supposed to. Exactly who is the market for $500 motherboards that require expensive RAM designed for servers, $1000 CPUs that will be outperformed by mainstream CPUs in a year or less, and $600 video cards that have stability issues and driver problems even when not run in SLI mode?

    I would really love it if Anandtech and other hardware sites would come out and give these products the 1 out of 10 or 2 out of 10 review scores that they deserve, and tell the hardware companies to spend more time developing solutions for real enthusiasts with mainstream pocketbooks, instead of wasting their engineering resources on these high end solutions that nobody wants.
  • strikeback03 - Wednesday, March 26, 2008

    Dunno if anyone has actually bought Skulltrail, but obviously people do buy $1000 CPUs and $500 video cards, as some GX2 owners have already posted in this thread, and some people previously bought the Ultra even when it was well over $500. Not something I would do, but there are obviously those with the time and money to play with this kind of thing.
  • B3an - Tuesday, March 25, 2008

    To the writer of this article, or anyone else... i have this strange problem with the GX2.

    In the article you're getting 60+ FPS @ 2560x1200 res with 4xAA. Now if i do this i get nowhere near that frame rate. Without AA it's perfect, but with AA it completely kills it at that res.

    At first i was thinking that the 512MB usable VRAM was not enough for 2560x1600 + AA so i get a slide show with 4xAA at that res. But from these tests, you're not getting that.

    What backed up my theory for this is that with games with higher res/bigger textures, even 2xAA would kill frame rates. I'd go from 60+ FPS to literally single digit FPS just by turning on 2xAA @ 2560x1200. But with older games with lower res textures i can turn the AA up a lot higher before this happens.

    Does anyone know what the problem could be? Because i'd really like to run games at 2560x1600 with AA, but cannot at the moment.

    I'm on Vista SP1, with 4GB RAM, and a Quad @ 3.5GHz.
    I've also tried 3 different sets of drivers.
  • nubie - Tuesday, March 25, 2008

    OK, it is about time that socketable GPUs are on the market. How about a mATX board with an edge-connected PCIe x32 slot? Make the video board ATX compliant (i.e. video board + mATX board = ATX compliant).

    Then we can finally cool these damn video cards, and maybe find a way of getting them power that doesn't get in the way.

    You purchase the proper daughterboard for your class (mainstream, enthusiast, overclocker/benchmarker), and it will come with the RAM slots (or onboard RAM) and proper voltage circuitry. Then you can change just the GPU when you need to upgrade.

    I know it would be hard to implement and confusing, but it would be less confusing than the current situation once we got used to it, and it would be a hell of a lot more sane.

    You could use a PCI-e extender to connect it to existing PCI-e mATX or full ATX motherboards.

    It is either this, or put the GPU socket on the damn motherboard already; it is 2008, and it needs to happen. (If the video card were connected over HyperTransport III you wouldn't have any problem with bandwidth.) The next step really needs to be a general-purpose processor built into the video card, so that you aren't CPU/system bound like the current ridiculous setup. (The 790i chipset seems to be helping with the ability to have the northbridge do batch commands across the GPUs, but at minimum we need to see some physics moving away from the CPU. General physics haven't changed since the universe started, so why waste a general purpose CPU on them?)
  • nubie - Tuesday, March 25, 2008

    Just to re-iterate, why am I buying a new card every time when they just seem to be recycling the reference design? All GPU systems need the same things as a motherboard: clean voltage to the main processor and its memory, as well as a bus for the RAM and some I/O. The 7900, 8800, and 9600 are practically the same boards with the same memory bus, so can't we have it socketed?
  • tkrushing - Tuesday, March 25, 2008

    I am all for SLI/Crossfire or whatever you can afford to do, but why are we starting to lose focus on single card solution users? I'm sure if I really wanted to sacrifice I could save up for a multiple GPU system, but a single G92 is less than half the price for a relatively small performance hit in Crysis. And yes I love it, but I'm saying it: I just think Crysis is a poorly optimized game to some degree. Give us new and not reused single card solutions! (9 series)
  • tviceman - Tuesday, March 25, 2008

    I have been thinking the same thing lately, but last week, after reading about Intel's plans to use one of the cores in its CPUs as a graphics unit, I started thinking about the ramifications of this. I am willing to bet that Nvidia is trying to develop a hybrid CPU/GPU to compete on the same platform that Intel and AMD will eventually have. If this is true, then it's probably a reasonable explanation as to why there has been a severe lack of all-new GPUs since the launch of the 8xxx series a year ago.
  • dlasher - Tuesday, March 25, 2008

    page 1, paragraph 2, line 1:

    Is:
    "...it’s not for the feint of heart.."

    Should be:
    "...it’s not for the faint of heart.."
