The Test

Our test configuration was as follows:

AMD Athlon 64 FX-55 (2.6GHz)

MSI K8N Neo4 Platinum/SLI

2 x 512MB Corsair DDR400

NVIDIA Graphics Cards:

NVIDIA GeForce 6600GT x 2
NVIDIA GeForce 6800GT x 2

NVIDIA 66.75 Drivers

Windows XP with DirectX 9.0c

Because of our limited time, and the fact that we were thousands of miles away from our labs, we could only test the cards that MSI had on hand at the time, which were NVIDIA-only.




Comments

  • bob661 - Friday, October 29, 2004 - link

    I think some of these guys are mad because the motherboard that suits their needs won't be considered "the best". For some, it's an image thing. If it isn't, then why do you care that SLI is even available? Just buy the NF4 Ultra. Then there are some that come here just to piss people off.
  • bob661 - Friday, October 29, 2004 - link

    Two GPUs on one card would be more expensive, and there would probably be some heat issues.
  • Pete - Friday, October 29, 2004 - link

    Whoops. NV43 has only four ROPs, while NV40 has sixteen. So SLI'ed 6600GTs still have only half the ROPs of a single 6800GT. Mah bad.
  • Tides - Friday, October 29, 2004 - link

    SLI is meant for one thing: the HIGH END. It's like spending $800 on an Athlon FX. Before now the option wasn't there; now it is. What's the problem?
  • Pete - Friday, October 29, 2004 - link

    Thanks for the preview, Anand (and MSI). One note:

    "At 1280 x 1024 we see something quite unusual, the 6800GT gains much more from SLI than the 6600GT. The 6800GT received a 63.5% performance boost from SLI while the 6600GT gets "only" a 45.7% improvement; given the beta nature of the drivers we'll avoid hypothesizing about why."

    Not enough RAM? 12x10 with 4xAA is getting pretty RAM-intensive, no? That's one of the reasons I'm not that excited about SLI'ing two 6600GTs: you get roughly the level of a 6800GT, but without the extra breathing room afforded by 256MB.

    Two questions for you, too, Anand:

    (1) The 6600GT is 500MHz core, 8 pipes, 4 ROPs, 500MHz 128-bit memory. The 6800GT is 350MHz core, 16 pipes, 8 ROPs, 500MHz 256-bit memory. All else being equal, I'd have thought the SLI'ed 6600GTs would way outperform the 6800GT, because they have the same specs and a 40% higher core clock. Is this just a matter of some efficiency lost due to SLI overhead?

    (2) Is there a way to tell if the cards are rendering in "SLI" or AFR mode, or even to force one or the other? I'd be curious to know which helps which app more.
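    The arithmetic behind Pete's question can be sketched in a few lines, using the corrected ROP counts from his follow-up above. This is purely theoretical peak-rate arithmetic; the helper names are illustrative, not a real API, and SLI overhead is not modeled:

    ```python
    # Back-of-the-envelope peak-throughput comparison: two 6600GTs in SLI
    # vs a single 6800GT. Figures come from the comments above, with the
    # NV40 ROP count corrected to sixteen.

    def pixel_fill_mpix(rops, core_mhz):
        """Theoretical peak pixel fill rate in Mpixels/s: ROPs x core clock."""
        return rops * core_mhz

    def texel_rate_mtex(pipes, core_mhz):
        """Theoretical peak texel rate in Mtexels/s: pipelines x core clock."""
        return pipes * core_mhz

    # 6600GT (NV43): 500 MHz core, 8 pipes, 4 ROPs; two cards in SLI.
    sli_6600_fill = 2 * pixel_fill_mpix(4, 500)   # 4000 Mpix/s
    sli_6600_tex = 2 * texel_rate_mtex(8, 500)    # 8000 Mtex/s

    # 6800GT (NV40): 350 MHz core, 16 pipes, 16 ROPs.
    gt6800_fill = pixel_fill_mpix(16, 350)        # 5600 Mpix/s
    gt6800_tex = texel_rate_mtex(16, 350)         # 5600 Mtex/s
    ```

    The SLI pair leads on texturing throughput but trails on raw fill rate, which is one plausible reason it doesn't simply run away from the 6800GT despite the 40% higher core clock.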
  • justauser - Friday, October 29, 2004 - link

    I don't get it. Why not just put two GPUs on one x16 card? This bridge thing is so hokey.
  • Tides - Friday, October 29, 2004 - link

    Better yet, don't buy the SLI version of the mobo; there ARE three versions of NF4 boards, after all.
  • Tides - Friday, October 29, 2004 - link

    Why are people complaining about an additional feature on motherboards that you are in no way forced to use? It's like having two AGP slots on a motherboard: it's ROOM FOR UPGRADE. What's wrong with that?
  • xsilver - Friday, October 29, 2004 - link

    I think the performance boost is viable; you just need to know when to buy.

    6600GT SLI is close to a 6800GT in most benchies, and where it isn't, the gap may be due to driver issues rather than raw performance. However, 2x 6600GT does not equal a 6800GT in price. But in, say, 12 months' time, will a new 6600GT plus the price of the old 6600GT equal, or come in under, the original price of a 6800GT?
    The new mainstream product in 12 months' time should still perform worse than a 6600GT in SLI.
    Think of it as getting a good card on "layaway" (am I saying this right? I'm not in the US :)

    The other viability is of course having 2x 6800GT and saying "I've got the best performance money can buy"... again, you should not be superseded within 12-18 months.
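    xsilver's "layaway" idea can be made concrete with a toy cost comparison. The prices below are hypothetical placeholders chosen for illustration, not figures from the article; only the comparison logic matters:

    ```python
    # Toy "layaway" cost comparison: buy one mid-range card now and add a
    # second later, vs buying the high-end card today. All prices are
    # hypothetical placeholders.
    price_6600gt_today = 200       # assumed street price of one 6600GT now
    price_6800gt_today = 400       # assumed street price of a 6800GT now
    price_6600gt_next_year = 120   # assumed 6600GT price after a year

    # Buy one 6600GT now, add a second (cheaper) one in 12 months.
    sli_upgrade_path = price_6600gt_today + price_6600gt_next_year

    # The staggered path wins only if the second card's future price stays
    # below the initial price gap between the two cards.
    path_is_cheaper = sli_upgrade_path < price_6800gt_today
    ```

    Under these assumed prices the staggered path totals 320 vs 400, though it ignores resale value and the fact that a year-old pair may trail the next generation's single-card option.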

  • haris - Friday, October 29, 2004 - link

    This is a horrible move by NVIDIA. Several people have already pointed out the main problems: heat, noise, power requirements, and the fact that SLI may only work if the driver supports that specific game/engine. It might work out great for NVIDIA, since they can get people to pay for two cards instead of a more powerful single-card solution that would work just as well, if not better, in every game. For most people, by the time they'd be ready to upgrade a low-to-mid-range card, it would probably still be more cost effective to just buy a new card.

    I love the performance boost as much as the next guy/girl, but I still think that this is just plain stupid.
