43 Comments

  • beany323 - Tuesday, January 11, 2005 - link

    Wow, I have been planning on getting a new computer (from Cyberpower) and was set on getting PCIe... then I saw the SLI... so I read up on it, and now I am confused! I bought an HP two years ago, and now I would be lucky if I could use it to play WoW (I tried already; the video card is too old). I am willing to spend some money but don't want to get stuck with a big paperweight. I thought the idea with two cards sounded good (I was even thinking I might as well get two Ultras :) ), but not sure now... any more thoughts? Just from reading this thread, you guys know WAY more than I could ever sit down and read... so thanks in advance!!


    beany323
  • endrebjorsvik - Tuesday, January 11, 2005 - link

    Would it be possible to use two 3D1 cards in non-SLI mode and still use all the GPUs? Then you would be able to get serious performance in triple- or quad-monitor systems.
  • sxr7171 - Sunday, January 09, 2005 - link

    I thought the whole point of SLI was to offer an upgrade path that allowed consumers to stagger their spending by upgrading their performance in two stages. Buying a single card with two GPUs that costs the same and performs worse than a single 6800 ultra is quite pointless.

    There are absolutely no situations where a dual 6600GT card outperforms a single 6800 ultra card. There are no synergies in having two GPUs on the same card and no incentive to buy such a card.
  • PrinceGaz - Saturday, January 08, 2005 - link

    Just in case that sounded a bit harsh, all I will say is: would you personally swap a 6800GT-based card for that 3D1 multi-core card?
  • PrinceGaz - Saturday, January 08, 2005 - link

    I'm sorry, but I still see no sensible reason whatsoever to buy a dual-core 6600GT card when a 6800GT could be bought for a similar price. I can see lots of stupid reasons for buying the dual-core 6600GT, but nothing people have said in the comments or in the review gives a good reason for choosing it in preference to a 6800GT. I'm afraid I'm going to have to jump on a bit of a bandwagon and wonder if everything reviewed here is always considered good generally, as I can't remember the last time something got a well-deserved slating. This 3D1 should have been given one, because of the hardware compatibility issues, the software compatibility issues with games that aren't SLI-recognised, and the fact that it will run at half speed (one core) with games nVidia hasn't bothered looking at.

    When you review stuff you see good and bad, but all we ever read about here are products which are great, or products which will be very good after they fix this and that. The only review recently with any constructive criticism was that of normal 6600GTs, where you looked at the fan-mounting method. Maybe you only get to see the very best products because that is all the manufacturers will send you (which explains why there was no mid-range/budget memory round-up, as they don't want to send a 512MB ValueWhatnot stick that will perform worse than everything else).

    What I'd have said about the 3D1 after looking at the performance is that they shouldn't have bothered with a 6600GT dual-core, but instead should have done an NV41-based dual-core card. You'd be a fool to buy the 3D1 the way it is at the moment.

    Review quotes like "Until then, bundling the GA-K8NXP-SLI motherboard and 3D1 is a very good solution for Gigabyte. Those who want to upgrade to PCI Express and a multi-GPU solution immediately have a viable option here. They get the motherboard needed to run an SLI system and two GPUs in one package with less hassle." make me wonder if someone paid you to say that.
  • AtaStrumf - Saturday, January 08, 2005 - link

    *lose some performance* - compared to a more powerful but equally priced single-chip solution
  • AtaStrumf - Saturday, January 08, 2005 - link

    johnsonx, the reason for buying one card with two 6600 GTs, or two separate 6600GTs, instead of one 6800 GT may be that you get very close to 6800 *Ultra* performance, provided you don't use AA or aniso in newer games. Granted, OC-ing a 6800GT will do the same, plus give you AA/aniso performance of the same level.

    I guess the reason to go for SLI is its ability to provide a cheap upgrade path, not for two GPUs to be put on the same board, save you $0, and lose some performance. Gigabyte may have missed an important point here. What they should be making is one board with 2 x 6800 GTs, since that provides unheard-of performance in a single card (something NEW), and/or at least lower the price of the 3D1 to make it cheaper than two separate cards, and of course make it work in non-SLI boards. Simply put, give it some tangible extra value over a two-board SLI solution, and not the other way around as it is now - only two monitors and such.
  • MadAd - Saturday, January 08, 2005 - link

    Nvidia are looking more and more like 3dfx every day - big boards, SLI, late with real releases

    I'd prefer an AIW X900SLI board though
  • glennpratt - Saturday, January 08, 2005 - link

    johnsonx - I agree, though I have a theory. If you're like me, and the 10 other computers in your house/immediate family consist of hand-me-down parts from a couple of primary computers, it would be nice to have two decent video cards when you upgrade next. Still stupid...
  • johnsonx - Friday, January 07, 2005 - link

    What I find puzzling about this whole SLI thing right now (not this dual-GPU card in particular) is all the people buying an ultra-expensive SLI board and dual 6600GTs to build a new system. I understand buying an SLI board and one 6600GT or 6800GT to allow for a future upgrade, and I even understand buying SLI and dual 6800GTs for maximum performance (though it sure seems like overkill and overspending to me).

    But buying dual 6600GTs on purpose just doesn't make any sense at all. I guess they just want to say they have it? Even though it costs far more and performs the same as a non-SLI board with a 6800GT....

    This particular dual-GPU card doesn't make any sense right now either, for all the obvious reasons.

    Maybe later....
  • sprockkets - Friday, January 07, 2005 - link

    Thanks for the clarification. But also, some were using the server Intel chipset, because it had two 16x slots, instead of the desktop chipset to use SLI. Like the article said though, the latest drivers only like the nVidia SLI chipset.
  • ChineseDemocracyGNR - Friday, January 07, 2005 - link

    #29,

    The 6800GT PCI-E is probably going to use a different chip (native PCI-E) than the broken AGP version.

    One big problem with nVidia's SLI that I don't see enough people talking about is this:
    http://www.pcper.com/article.php?aid=99&type=e...
  • Jeff7181 - Friday, January 07, 2005 - link

    Why is everyone thinking dual-core CPUs and dual-GPU video cards are so far-fetched? Give it 6-12 months and you'll see it.
  • RocketChild - Friday, January 07, 2005 - link

    I seem to recall ATI was frantically working on a solution like this to bypass Nvidia's SLI solution, and I am not reading anything about their progress. Given that the article points to BIOS hurdles, does it look like we are going to have to wait for ATI to release its first chipset to support a multi-GPU ATI card? Anyone here have any information or speculation?
  • LoneWolf15 - Friday, January 07, 2005 - link

    25, the reason you'd want to buy two 6600GTs instead of one 6800GT is that PureVideo functions work completely on the 6600GT, whereas they are partially broken on the 6800GT. If this solution weren't limited to only Gigabyte boards, I'd certainly consider it myself.
  • skiboysteve - Friday, January 07, 2005 - link

    I'm confused as to why anyone would buy this card at all. You're paying the same price as a 6800GT and getting the same performance, with all the issues that go with Gigabyte SLI. That's retarded.
  • ceefka - Friday, January 07, 2005 - link

    Are there any cards available for the remaining PCI-E slots?
  • Ivo - Friday, January 07, 2005 - link

    Obviously, the future belongs to matrix CPU/GPU (IGP?) solutions with optimized performance/power-consumption ratios. But there is still a relatively long way (2 years?) to go. The recent NVIDIA NF4-SLI game is more marketing than technical in nature. They are simply testing the market, the competition, and … the enthusiastic IT society :-) The response is moderate, as is the challenge. But the excitement was predictable.
    Happy New Year 2005!
    Reply
  • PrinceGaz - Friday, January 07, 2005 - link

    I don't understand why anyone would want to buy a dual-core 6600GT rather than a similarly priced 6800GT.
  • DerekWilson - Friday, January 07, 2005 - link

    I apologize for the omission of pictures from the article on publication.

    We have updated the article with images of the 3D1 and the K8NXP-SLI for your viewing pleasure.

    Thanks,
    Derek Wilson
  • johnsonx - Friday, January 07, 2005 - link

    To #19:

    from page 1:

    "....even if the 3D1 didn't require a special motherboard BIOS in order to boot video..."

    In other words, the mainboard BIOS has to do something special to deal with a dual-GPU card, or at least the current implementation of the 3D1.

    What NVidia should do is:

    1. Update their drivers to allow SLI any time two GPUs are found, whether they be on two boards or one.
    2. Standardize whatever BIOS support is required for the dual GPU cards to POST properly, and include the code in their reference BIOS for the NForce4.

    At least then you could run a dual-GPU card on any NForce4 board. Maybe in turn Quad-GPU could be possible on an SLI board.


  • bob661 - Friday, January 07, 2005 - link

    #19
    I think the article mentioned a special BIOS is needed to run this card. Right now only Gigabyte has this BIOS.
  • pio!pio! - Friday, January 07, 2005 - link

    #18 use a laptop
  • FinalFantasy - Friday, January 07, 2005 - link

    Poor Intel :(
  • jcromano - Friday, January 07, 2005 - link

    From the article, which I enjoyed very much:
    "The only motherboard that can run the 3D1 is the GA-K8NXP-SLI."

    Why exactly can't the ASUS SLI board (for example) use the 3D1? Surely not just because Gigabyte says it can't, right?

    Cheers,
    Jim
  • phaxmohdem - Friday, January 07, 2005 - link

    ATI Rage Fury MAXX. 'Nuff said...

    lol #6, I think you're on to something though. Modern technology is becoming incredibly power-hungry. I think more steps need to be taken to reduce power consumption and heat production; however, with the current pixel-pushing slugfest we are witnessing, FPS has obviously displaced these two worries for our beloved video card manufacturers. At some point, though, when consumers refuse to buy the latest GeForce or Radeon card with a heatsink taking up 4 extra PCI slots, I think they will get the hint. I personally consider a dual-slot heatsink solution ludicrous.

    Nvidia, ATI, Intel, AMD... STOP RAISING MY ELECTRICITY BILL AND ROOM TEMPERATURE!!!!
  • KingofCamelot - Friday, January 07, 2005 - link

    #16 I'm tired of you people acting like SLI is only doable with an NVIDIA motherboard, which is obviously not the case. SLI only applies to the graphics cards. On motherboards, SLI is just a marketing term for NVIDIA. Any board with two 16x PCI-E connectors can pull off SLI with NVIDIA graphics cards. NVIDIA's solution is unique because they were able to split a 16x link and give each connector 8x bandwidth. Other motherboard manufacturers are doing 16x and 4x.
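
    The lane split described above works out as follows, using the nominal PCIe 1.x figure of roughly 250 MB/s per lane per direction. This is a rough sketch for comparing the two slot layouts; real-world throughput is lower due to encoding and protocol overhead.

    ```python
    # Nominal one-directional bandwidth per PCIe 1.x lane, in MB/s.
    MB_PER_LANE = 250

    def slot_bandwidth(lanes):
        """Nominal one-way bandwidth of a PCIe 1.x slot, in MB/s."""
        return lanes * MB_PER_LANE

    # NVIDIA's SLI chipset splits one x16 link into two x8 slots:
    nvidia_sli = [slot_bandwidth(8), slot_bandwidth(8)]    # 2000 MB/s each
    # Other boards pair a full x16 slot with an x4 slot:
    x16_plus_x4 = [slot_bandwidth(16), slot_bandwidth(4)]  # 4000 and 1000 MB/s

    print(nvidia_sli, x16_plus_x4)
    ```

    So the symmetric 8x/8x split gives each card the same bandwidth, while the 16x/4x layout starves the second card.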
  • sprockkets - Thursday, January 06, 2005 - link

    I'm curious to see how all those lame Intel configs by Dell and others pulled off SLI long before this motherboard came out.
  • Regs - Thursday, January 06, 2005 - link

    Once again - history repeats itself. Dual-core SLI solutions are still a far reach from reality.
  • Lifted - Thursday, January 06, 2005 - link

    Dual 6800GT's???? hahahahahehhehehehahahahah.

    Not laughing at you, but those things are so hot you'd need a 50-pound copper heatsink on the beast, with 4 x 20,000 RPM fans running full bore, just to prevent a China Syndrome.

    Somebody say dual core? Maybe with GeForce 2 MX series cores.
  • reactor - Thursday, January 06, 2005 - link

    So basically it performs the same as SLI, and for the same price as the SLI setup, but only works with GB boards. Would've liked to see some power/cooling comparisons and pics, although I've already seen them elsewhere.

    In the end I'd rather get a 6800GT.
  • mkruer - Thursday, January 06, 2005 - link

    Just wait, we will see dual-core GPUs soon enough.
  • yelo333 - Thursday, January 06, 2005 - link

    #5,#7,#9 - you've hit the nail on the head...

    Esp. for something like this, we need those pics!

    For those who need to slake their thirst for pics, just run a Google search for "gigabyte 3d1" - it turns up plenty of other reviews with pics.
  • Paratus - Thursday, January 06, 2005 - link

  • Speedo - Thursday, January 06, 2005 - link

    Yea, not a single pic in the whole review...
  • semo - Thursday, January 06, 2005 - link

    Yeah, it's bad enough I can never own one

    we want to see some pretty pictures!
  • miketheidiot - Thursday, January 06, 2005 - link

    I agree with #5

    Where's the pics?
  • pio!pio! - Thursday, January 06, 2005 - link

    #4 dual-core video cards in SLI, on a dual-core-CPU dual-CPU mobo, with quad power supplies
  • pio!pio! - Thursday, January 06, 2005 - link

    No pics of this card in the article??
  • Gigahertz19 - Thursday, January 06, 2005 - link

    It's only a matter of time until we see dual video cards that each have dual cores in a system... >>Homer Simpson>> ahhhhhgggggaaaahhhhhhhhh quad GPUs :)
  • Gigahertz19 - Thursday, January 06, 2005 - link

    1st is the worst... 2nd is the best... 3rd is the one with the hairy chest :)
  • bbomb - Thursday, January 06, 2005 - link

    It seems like Nvidia just wants to make sure that none of their partners can benefit from SLI technology, to ensure that Nvidia has some new technology to introduce in the future.

    I bet Nvidia already has a multi-GPU card that works on any board, and can probably work in SLI with another multi-GPU card, sitting in a cabinet somewhere until Nvidia sees fit to let us get our hands on the technology.

    I hope ATI's solution stomps Nvidia's into the ground, but then again, Nvidia's software team can't seem to get it right and they still blow away ATI's driver program, which leads me to believe that ATI will have driver problems as well.
  • HardwareD00d - Thursday, January 06, 2005 - link

    Yippie, first post!
