Final Words

We would like to begin our conclusion by thanking Gigabyte for being the first to market with such a creative, bleeding-edge product. We love to see companies pushing the envelope wherever possible, and this is exactly the kind of thinking we want to see. Of course, it might be a little easier to work on technology like this if NVIDIA weren't so restrictive about what they will and will not enable in their drivers.

Unfortunately, in light of the performance tests, there really isn't much remarkable to say about the 3D1. In fact, unless Gigabyte can become very price-competitive, there isn't much reason to recommend the 3D1 over a two-card SLI solution. As it stands, buying all the parts separately costs about the same as what Gigabyte plans to charge for the bundle.

The 3D1's drawbacks are its limited compatibility (it will only run on the GA-K8NXP-SLI), performance no better than two-card SLI, and the loss of one DVI and one HD-15 display connection compared to the two-card solution.

Something like this could be very cool in an SFF (small form factor) system built around an NVIDIA SLI chipset motherboard with only one physical PCIe x16 connector. But until NVIDIA relaxes its driver restrictions, and unless Gigabyte can find a way to get the card to boot on non-Gigabyte boards, there aren't many other "killer" applications for the 3D1.

The Gigabyte 3D1 does offer single-card SLI in a convenient package, and the bundle will be quite powerful for those who choose to buy it. All the same, we aren't going to recommend it.

As for the Intel solution, a lot rests on NVIDIA's shoulders here as well. With their own Intel chipset coming down the pipeline at some point in the future, it could be that they simply don't want SLI to work well on anyone else's platform. Maybe they just want to sell more of their own parts. Maybe they are genuinely concerned that end users won't have the best possible experience on hardware that hasn't been fully tested and qualified for SLI. In the end, we will have to wait and see what comes out of NVIDIA in terms of support for other hardware and for the concoctions that their partners and customers cook up.

ATI should take note of the issues that NVIDIA is dealing with now, as there are many ways that they could take advantage of the present landscape.

Again, while we can't recommend the Gigabyte 3D1 over standard 6600 GT SLI solutions, we do hope to see other products like this step up to the plate. Ideally, future single-card, multi-GPU solutions would offer full compatibility with any motherboard, a true 256-bit memory bus for each GPU (so that scaling holds up at memory-intensive settings as well; multiple NV41 GPUs would be nice to see), and three or four external display connectors rather than just two. It may be a lot to ask, but if we're expected to pay for all that silicon, we want the ability to take full advantage of it.
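To put some rough numbers behind that 256-bit wish, here is a minimal back-of-the-envelope sketch of the peak-bandwidth arithmetic, assuming the 6600 GT's stock 500 MHz (1 GHz effective) GDDR3; the function name and figures are illustrative, not vendor specifications:

```python
# Peak theoretical memory bandwidth = (bus width in bytes) x (effective clock).
# Assumes stock 6600 GT GDDR3 at 1000 MHz effective; illustrative numbers only.
def peak_bandwidth_gb_s(bus_width_bits: int, effective_clock_mhz: float) -> float:
    bytes_per_transfer = bus_width_bits / 8
    return bytes_per_transfer * effective_clock_mhz * 1e6 / 1e9

print(f"128-bit bus: {peak_bandwidth_gb_s(128, 1000):.0f} GB/s per GPU")  # ~16 GB/s
print(f"256-bit bus: {peak_bandwidth_gb_s(256, 1000):.0f} GB/s per GPU")  # ~32 GB/s
```

Doubling the per-GPU bus width doubles peak bandwidth, which is exactly where high-resolution, antialiased settings tend to bottleneck.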

Comments

  • sprockkets - Friday, January 7, 2005 - link

Thanks for the clarification. But some were also using the server Intel chipset for SLI instead of the desktop chipset, because it had two x16 slots. Like the article said, though, the latest drivers only like the NVIDIA SLI chipset.
  • ChineseDemocracyGNR - Friday, January 7, 2005 - link

    #29,

    The 6800GT PCI-E is probably going to use a different chip (native PCI-E) than the broken AGP version.

    One big problem with nVidia's SLI that I don't see enough people talking about is this:
    http://www.pcper.com/article.php?aid=99&type=e...
  • Jeff7181 - Friday, January 7, 2005 - link

Why does everyone think dual-core CPUs and dual-GPU video cards are so far-fetched? Give it 6-12 months and you'll see it.
  • RocketChild - Friday, January 7, 2005 - link

I seem to recall ATi was frantically working on a solution like this to bypass NVIDIA's SLI, and I'm not reading anything about their progress. Given that the article points to BIOS hurdles, does it look like we'll have to wait for ATi to release their first chipset before a multi-GPU ATi card is supported? Anyone here have any information or speculation?
  • LoneWolf15 - Friday, January 7, 2005 - link

#25, the reason you'd want to buy two 6600GTs instead of one 6800GT is that PureVideo functions work completely on the 6600GT, whereas they are partially broken on the 6800GT. If this solution weren't limited to Gigabyte boards, I'd certainly consider it myself.
  • skiboysteve - Friday, January 7, 2005 - link

I'm confused as to why anyone would buy this card at all. You're paying the same price as a 6800GT and getting the same performance, with all the issues that go with Gigabyte SLI. That's retarded.
  • ceefka - Friday, January 7, 2005 - link

    Are there any cards available for the remaining PCI-E slots?
  • Ivo - Friday, January 7, 2005 - link

Obviously, the future belongs to matrix CPU/GPU (IGP?) solutions with optimized performance/power-consumption ratios. But there is still a relatively long way (2 years?) to go. NVIDIA's recent NF4-SLI game is more marketing than technical in nature. They are simply testing the market, the competition, and... the enthusiast community :-) The response is moderate, as is the challenge. But the excitement is guaranteed.
    Happy New Year 2005!
  • PrinceGaz - Friday, January 7, 2005 - link

    I don't understand why anyone would want to buy a dual-core 6600GT rather than a similarly priced 6800GT.
  • DerekWilson - Friday, January 7, 2005 - link

I apologize for the omission of pictures from the article at publication.

    We have updated the article with images of the 3D1 and the K8NXP-SLI for your viewing pleasure.

    Thanks,
    Derek Wilson
