Final Words

We would like to begin our conclusion by thanking Gigabyte for being the first to come out with such a creative, bleeding-edge product. We love to see companies pushing the envelope wherever possible, and this kind of thinking is exactly what we want to see. Of course, it might be a little easier to work on technology like this if NVIDIA weren't so tight about what they will and will not enable in their drivers.

Unfortunately, in light of the performance tests, there really isn't much remarkable to say about the 3D1. In fact, unless Gigabyte can become very price competitive, there isn't much reason to recommend the 3D1 over a two-card SLI solution. Currently, buying all the parts separately costs about the same as what Gigabyte plans to charge for the bundle.

The drawbacks to the 3D1 are its limited compatibility (it will only run on the GA-K8NXP-SLI), the fact that it performs no better than two-card SLI, and the fact that the user loses a DVI and an HD-15 display connection compared to the two-card solution.

Something like this might be very cool in a small form factor system whose NVIDIA SLI chipset motherboard has only one physical PCIe x16 connector. But until NVIDIA relaxes its driver restrictions, and unless Gigabyte finds a way to make the card boot on non-Gigabyte boards, there aren't many other "killer" apps for the 3D1.

The Gigabyte 3D1 does offer single-card SLI in a convenient package, and the bundle will be quite powerful for those who choose to acquire it. Even so, we can't recommend it.

As for the Intel solution, a lot rests on NVIDIA's shoulders here as well. With their new Intel chipset coming down the pipeline at some point in the future, it could be that they just don't want it to work well with others. Maybe they just want to sell more of their own parts. Maybe they are actually concerned that the end user won't have the best possible experience on hardware that hasn't been fully tested and qualified to work with SLI. In the end, we will have to wait and see what comes out of NVIDIA in terms of support for other hardware and the concoctions that their partners and customers cook up.

ATI should take note of the issues that NVIDIA is dealing with now, as there are many ways that they could take advantage of the present landscape.

Again, while we can't recommend the Gigabyte 3D1 over standard 6600 GT SLI solutions, we do hope to see other products like this step up to the plate. Ideally, in future single-card, multi-GPU solutions, we would like to see full compatibility with any motherboard, true 256-bit memory buses for each GPU (so that scalability extends to memory-intensive settings as well - multiple NV41 GPUs would be nice to see), and three or four external display connectors rather than just two. It may be a lot to ask, but if we're expected to pay for all that silicon, we want the ability to take full advantage of it.

43 Comments

View All Comments

  • reactor - Thursday, January 6, 2005 - link

    so basically it performs the same as sli and for the same price as the sli setup, but only works with gb boards. wouldve like to see some power/cooling comparisons and pics although ive already seen it.

    in the end id rather get a 6800gt.
  • mkruer - Thursday, January 6, 2005 - link

    Just wait we will see Dual Core GPU's soon enough.
  • yelo333 - Thursday, January 6, 2005 - link

    #5,#7,#9 - you've hit the nail on the head...

    Esp. for something like this, we need those pics!

    For those who need to slake their thirst for pics, just run a google search for "gigabyte 3d1" - it turns up plenty of other reviews w/ pics.
  • Paratus - Thursday, January 6, 2005 - link

  • Speedo - Thursday, January 6, 2005 - link

    yea, not a single pic in the whole review...
  • semo - Thursday, January 6, 2005 - link

    yeah, it's bad enough i can never own one

    we want to see some pretty pictures!
  • miketheidiot - Thursday, January 6, 2005 - link

    I agree with #5

    wheres the pics?
  • pio!pio! - Thursday, January 6, 2005 - link

    #4 dual core video cards in SLI on a dual core cpu dual cpu mobo w/ quad power supplies
  • pio!pio! - Thursday, January 6, 2005 - link

    no pics of this card in the article??
  • Gigahertz19 - Thursday, January 6, 2005 - link

    It's only a matter of time until we see dual video cards that each have dual cores in a system...>>Homer Simpson>>ahhhhhgggggaaaahhhhhhhhh Quad GPU's :)
