Final Words

We would like to begin our conclusion by thanking Gigabyte for being the first to come out with such a creative, bleeding-edge product. We love to see companies pushing the envelope wherever possible, and this is exactly the kind of thinking we want to see more of. Of course, it might be a little easier to work on technology like this if NVIDIA weren't so restrictive about what they will and will not enable in their drivers.

Unfortunately, in light of the performance tests, there really isn't much remarkable to say about the 3D1. In fact, unless Gigabyte can become very price competitive, there isn't much reason to recommend the 3D1 over a two-card SLI solution. At current prices, buying all of the parts separately costs about the same as what Gigabyte plans to charge for the bundle.

The drawbacks to the 3D1 are its limited compatibility (it will only run on the GA-K8NXP-SLI), the fact that it performs no better than two-card SLI, and the fact that the user loses a DVI and an HD-15 display connection compared to the two-card solution.

Something like this might be very cool for use in an SFF system with an NVIDIA SLI chipset motherboard that has only one physical PCIe x16 connector. But until NVIDIA relaxes its driver restrictions, and unless Gigabyte can find a way to get the card to boot on non-Gigabyte boards, there aren't many other "killer" applications for the 3D1.

The Gigabyte 3D1 does offer single-card SLI in a convenient package, and the bundle will be quite powerful for those who choose to buy it. Even so, we can't recommend it.

As for the Intel solution, a lot rests on NVIDIA's shoulders here as well. With their own Intel chipset coming down the pipeline, it could be that they simply don't want SLI to work well on other platforms. Maybe they just want to sell more of their own parts. Maybe they are genuinely concerned that end users won't have the best possible experience on hardware that hasn't been fully tested and qualified for SLI. In the end, we will have to wait and see what NVIDIA does in terms of support for other hardware and for the concoctions that their partners and customers cook up.

ATI should take note of the issues that NVIDIA is dealing with now, as there are many ways that they could take advantage of the present landscape.

Again, while we can't recommend the Gigabyte 3D1 over standard 6600 GT SLI solutions, we do hope to see other products like this step up to the plate. Ideally, future single-card, multi-GPU solutions would offer full compatibility with any motherboard, true 256-bit memory buses for each GPU (so that scaling holds up at memory-intensive settings as well; multiple NV41 GPUs would be nice to see), and three or four external display connectors rather than just two. It may be a lot to ask, but if we're expected to pay for all that silicon, we want to be able to take full advantage of it.

Comments

  • johnsonx - Friday, January 7, 2005 - link

    To #19:

    from page 1:

    "....even if the 3D1 didn't require a special motherboard BIOS in order to boot video..."

    In other words, the mainboard BIOS has to do something special to deal with a dual-GPU card, or at least the current implementation of the 3D1.

    What NVidia should do is:

    1. Update their drivers to allow SLI any time two GPUs are found, whether they be on two boards or one.
    2. Standardize whatever BIOS support is required for the dual GPU cards to POST properly, and include the code in their reference BIOS for the NForce4.

    At least then you could run a dual-GPU card on any NForce4 board. Maybe in turn Quad-GPU could be possible on an SLI board.


  • bob661 - Friday, January 7, 2005 - link

    #19
    I think the article mentioned that a special BIOS is needed to run this card. Right now, only Gigabyte has this BIOS.
  • pio!pio! - Friday, January 7, 2005 - link

    #18 use a laptop
  • FinalFantasy - Friday, January 7, 2005 - link

    Poor Intel :(
  • jcromano - Friday, January 7, 2005 - link

    From the article, which I enjoyed very much:
    "The only motherboard that can run the 3D1 is the GA-K8NXP-SLI."

    Why exactly can't the ASUS SLI board (for example) use the 3D1? Surely not just because Gigabyte says it can't, right?

    Cheers,
    Jim
  • phaxmohdem - Friday, January 7, 2005 - link

    ATI Rage Fury MAXX Nuff said...

    lol #6 I think you're on to something though. Modern technology is becoming incredibly power hungry, and I think more steps need to be taken to reduce power consumption and heat production. However, with the current pixel-pushing slugfest we are witnessing, FPS has obviously displaced those two worries for our beloved video card manufacturers. At some point, when consumers refuse to buy the latest GeForce or Radeon card with a heatsink taking up 4 extra PCI slots, I think they will get the hint. I personally consider a dual-slot heatsink solution ludicrous.

    Nvidia, ATI, Intel, AMD... STOP RAISING MY ELECTRICITY BILL AND ROOM TEMPERATURE!!!!
  • KingofCamelot - Friday, January 7, 2005 - link

    #16 I'm tired of you people acting like SLI is only doable with an NVIDIA motherboard, which is obviously not the case. SLI only applies to the graphics cards. On motherboards, SLI is just a marketing term for NVIDIA. Any board with two 16x PCI-E connectors can pull off SLI with NVIDIA graphics cards. NVIDIA's solution is unique because they were able to split a 16x link and give each connector 8x bandwidth. Other motherboard manufacturers are doing 16x and 4x.
  • sprockkets - Thursday, January 6, 2005 - link

    I'm curious to see how all those lame Intel configs by Dell and others pulled off SLI long before this motherboard came out.
  • Regs - Thursday, January 6, 2005 - link

    Once again - history repeats itself. Dual core SLI solutions are still a far reach from reality.
  • Lifted - Thursday, January 6, 2005 - link

    Dual 6800GT's???? hahahahahehhehehehahahahah.

    Not laughing at you, but those things are so hot that you'd need a 50-pound copper heatsink on the beast with 4 x 20,000 RPM fans running full bore just to prevent a China Syndrome.

    Somebody say dual core? Maybe with GeForce 2 MX series cores.
