The Hardware

First up on the chopping block is the Gigabyte 3D1. Gigabyte is touting this board as a 256-bit, 256MB card, but that's only half the story. Each of the two 6600GT GPUs has access to only half of the bandwidth and RAM on the card. The 3D1 is certainly a 256MB card, since it physically carries 256MB of RAM on board. But, due to the nature of SLI, two separate 128-bit/128MB buses do not deliver the same performance as a 256-bit/256MB setup does on a single-GPU card. The reason is that a great deal of duplicate data must be stored in each GPU's local memory. Also, when AA/AF are applied to the scene, we see cases where both GPUs end up becoming memory bandwidth limited. Honestly, it would be much more efficient (though also more costly) to design a shared memory system from which both GPUs could draw, had NVIDIA known someone was going to drop both chips on one PCB. Current NVIDIA SLI technology really is designed to connect graphics cards together at the board level rather than at the chip level, and that inherently makes the 3D1 design a bit of a kludge.
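To put some rough numbers on that, here's a quick back-of-the-envelope sketch. The memory clocks used are our assumption (a stock 6600GT runs its GDDR3 at 500MHz, or 1000MHz effective), not figures from Gigabyte:

```python
# Back-of-the-envelope memory bandwidth comparison. Clock figures are
# assumed: stock 6600GT GDDR3 at 500MHz (1000MHz effective data rate).

def bandwidth_gb_s(bus_width_bits: int, effective_clock_mhz: int) -> float:
    """Peak theoretical bandwidth in GB/s for one memory bus."""
    return bus_width_bits / 8 * effective_clock_mhz * 1e6 / 1e9

single_256bit = bandwidth_gb_s(256, 1000)   # one wide bus: 32.0 GB/s
per_gpu_128bit = bandwidth_gb_s(128, 1000)  # each 3D1 GPU sees: 16.0 GB/s

# The two 128-bit buses total the same raw 32 GB/s on paper, but since
# each GPU must keep its own copy of textures and geometry, the usable
# memory and bandwidth don't simply add up the way one wide bus would.
print(single_256bit, per_gpu_128bit)
```

Each GPU working out of its own 16 GB/s pool, with duplicated data in both, is why the "256-bit/256MB" label oversells things.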



Of course, without the innovators, we would never see great products. Hopefully, Gigabyte will inspire NVIDIA to take single-board multi-chip designs into account when building future multi-GPU options into silicon. Even if we do see a "good" shared memory design at some point, the complexity added would be orders of magnitude beyond what this generation of SLI offers. We would certainly not expect to see anything more than a simple massage of the current feature set in NV5x.

The 3D1 does ship with a default RAM overclock of 60MHz (120MHz effective), which will boost memory-intensive performance a bit. Technically, since this card just places two 6600GT GPUs physically on a single board and permanently links their SLI interfaces, there should be no other performance advantage over other 6600GT SLI configurations.
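For context, that factory overclock works out to a modest bandwidth gain. The stock clock below is our assumption (500MHz, 1000MHz effective, for a standard 6600GT), not a number from Gigabyte:

```python
# Rough estimate of the memory bandwidth gain from the 3D1's factory
# RAM overclock. Stock data rate is assumed (standard 6600GT).
stock_effective = 1000  # MHz effective, assumed stock 6600GT data rate
oc_effective = 1120     # MHz effective, with the 120MHz effective boost

gain_pct = (oc_effective - stock_effective) / stock_effective * 100
print(f"{gain_pct:.0f}% more peak memory bandwidth per GPU")
```

A roughly 12% bump in peak memory bandwidth per GPU, which lines up with the "a bit" of extra memory-intensive performance noted above.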

One thing that the 3D1 gives up relative to other SLI solutions is the ability to turn off SLI and run two cards driving more than two monitors. Personally, I really enjoy running three desktops and working on the center one. It just exudes a sense of productivity that far exceeds single and dual monitor configurations.

The only motherboard that can run the 3D1 is the GA-K8NXP-SLI. These products will be shipping together as a bundle within the month, and will cost about as much as buying a motherboard and two 6600GT cards. As usual, Gigabyte has managed to pack just about everything but the kitchen sink into this board. The two physical x16 PCIe connectors are wired up with x16 and x8 electrical connections. It's an overclocker-friendly setup (though overclocking is beyond the scope of this article), and easy to set up and get running. We will have a full review of the board coming along soon.



As it pertains to the 3D1, when connected to the GA-K8NXP-SLI, the x16 PCIe slot is split into two x8 connections, one dedicated to each GPU. This requires that the motherboard's SLI selector card be flipped to the "single" setting rather than "SLI".

On the Intel side, Gigabyte is hoping that NVIDIA will decide to release full-featured multi-GPU drivers that don't require SLI motherboard support. Their GA-8AENXP Dual Graphic is a very well done 925XE board that parallels their AMD solution. On this board, Gigabyte went with x16 and x4 PCI Express graphics connections. SLI performance is, unfortunately, not where we would expect it to be. It's hard to tell exactly where these limitations come from, given that Intel platform drivers lag behind those for the AMD platform. One interesting thing to note is that whenever we had more than one graphics card plugged into the board, the card in the x4 PCIe slot (the bottom PCI Express slot on the motherboard) took the master role. There was no BIOS option to select which PCI Express slot to boot first, as there was on the AMD board. Hopefully, this will be addressed in a BIOS revision. We don't think that this explains the SLI performance (as we've seen other Intel boards perform at less than optimal levels), but having the SLI master in a x4 PCIe slot probably isn't going to help.
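The gap between those slot configurations is easy to quantify. As a rough sketch (using the PCIe 1.x figure of 250 MB/s per lane in each direction, a number we're supplying, not one from the article):

```python
# Peak one-direction PCI Express 1.x bandwidth for the electrical slot
# configurations discussed above (x16/x8 on the AMD board, x16/x4 on
# the Intel board). 250 MB/s per lane is the PCIe 1.x per-lane peak.
LANE_MB_S = 250

def slot_bandwidth_gb_s(lanes: int) -> float:
    """Peak per-direction bandwidth for a PCIe 1.x slot with `lanes` lanes."""
    return lanes * LANE_MB_S / 1000

for lanes in (16, 8, 4):
    print(f"x{lanes}: {slot_bandwidth_gb_s(lanes):.1f} GB/s")
```

An x4 electrical slot offers half the bus bandwidth of the x8 links on the AMD board, which is why putting the SLI master card there is unlikely to do the Intel platform any favors.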

The revision on the GA-8AENXP Dual Graphic that we have is 0.1, so there is definitely some room for improvement.

But let's take a look at the numbers and see what the tests have to say.


43 Comments


  • johnsonx - Friday, January 7, 2005 - link

    To #19:

    from page 1:

    "....even if the 3D1 didn't require a special motherboard BIOS in order to boot video..."

    In other words, the mainboard BIOS has to do something special to deal with a dual-GPU card, or at least the current implementation of the 3D1.

    What NVidia should do is:

    1. Update their drivers to allow SLI any time two GPUs are found, whether they be on two boards or one.
    2. Standardize whatever BIOS support is required for the dual GPU cards to POST properly, and include the code in their reference BIOS for the NForce4.

    At least then you could run a dual-GPU card on any NForce4 board. Maybe in turn Quad-GPU could be possible on an SLI board.


  • bob661 - Friday, January 7, 2005 - link

    #19
    I think the article mentioned a special bios is needed to run this card. Right now only Gigabyte has this bios.
  • pio!pio! - Friday, January 7, 2005 - link

    #18 use a laptop
  • FinalFantasy - Friday, January 7, 2005 - link

    Poor Intel :(
  • jcromano - Friday, January 7, 2005 - link

    From the article, which I enjoyed very much:
    "The only motherboard that can run the 3D1 is the GA-K8NXP-SLI."

    Why exactly can't the ASUS SLI board (for example) use the 3D1? Surely not just because Gigabyte says it can't, right?

    Cheers,
    Jim
  • phaxmohdem - Friday, January 7, 2005 - link

    ATI Rage Fury MAXX Nuff said...

    lol #6 I think you're on to something though. Modern technology is becoming incredibly power hungry. I think that more steps need to be taken to reduce power consumption and heat production; however, with the current pixel-pushing slugfest we are witnessing, FPS has obviously displaced these two worries for our beloved video card manufacturers. At some point, though, when consumers refuse to buy the latest GeForce or Radeon card with a heatsink taking up 4 extra PCI slots, I think that they will get the hint. I personally consider a dual slot heatsink solution ludicrous.

    Nvidia, ATI, Intel, AMD... STOP RAISING MY ELECTRICITY BILL AND ROOM TEMPERATURE!!!!
  • KingofCamelot - Friday, January 7, 2005 - link

    #16 I'm tired of you people acting like SLI is only doable with an NVIDIA motherboard, which is obviously not the case. SLI only applies to the graphics cards. On motherboards, SLI is just a marketing term for NVIDIA. Any board with 2 16x PCI-E connectors can pull off SLI with NVIDIA graphics cards. NVIDIA's solution is unique because they were able to split a 16x line and give each connector 8x bandwidth. Other motherboard manufacturers are doing 16x and 4x.
  • sprockkets - Thursday, January 6, 2005 - link

    I'm curious to see how all those lame Intel configs by Dell and others pulled off SLI long before this mb came out.
  • Regs - Thursday, January 6, 2005 - link

    Once again - history repeats itself. Dual core SLI solutions are still a far reach from reality.
  • Lifted - Thursday, January 6, 2005 - link

    Dual 6800GT's???? hahahahahehhehehehahahahah.

    Not laughing at you, but those things are so hot you'd need a 50 pound copper heatsink on the beast with 4 x 20,000 RPM fans running full bore just to prevent a China Syndrome.

    Somebody say dual core? Maybe with GeForce 2 MX series cores.
