
74 Comments


  • Kolbe - Monday, February 14, 2005 - link

    BUYER BEWARE!!
    First, I have this ASUS motherboard and two Gigabyte 6600GTs. After countless hours of trying to get this to work, and of course upgrading the BIOS and drivers, I came to find out that these two Gigabyte cards are not certified by NVIDIA and will not work in SLI mode on this ASUS board. ASUS has not returned my calls or my emails, but Gigabyte, bless their hearts, wrote me back and said in essence: "our 6600GT cards work on OUR board", so too bad. Thank God for Newegg and their awesome return policy. I am returning these two and getting one 6800GT, but of course, not from Gigabyte!
    Reply
  • mashie - Sunday, December 05, 2004 - link

    It would be nice to see tests at 2048x1536. After all, if you can afford the video cards for SLI, I bet you can get a proper monitor as well ;) Reply
  • Denial - Thursday, December 02, 2004 - link

    Again, why no vanilla 6800s? How would they compete with the 6600GTs in SLI? This is starting to get ridiculous. Reply
  • nserra - Friday, November 26, 2004 - link

    Sorry, forgot the link.

    http://graphics.tomshardware.com/graphic/20041123/...
    Reply
  • nserra - Friday, November 26, 2004 - link

    #69 Yeah, I agree.
    But let me tell you, I already see something that SLI will give me:

    having a 6600 and an X700 in the same PC.
    Reply
  • piroroadkill - Friday, November 26, 2004 - link

    I don't know if anyone's said this, but SLI is an absolutely stupid idea. Why on earth don't they take the 3dfx Voodoo5 approach and just stick two GPUs on one card? Surely this would yield similar benefits without special mobo requirements; 16x PCIe is easily enough bandwidth to cope. Then just double the amount of RAM on the card, and surely this is a more viable solution? Sure, it'd be an insanely costly card, but still cheaper than an SLI setup. And let's face it, once a single card can outpace your shiny new SLI setup, that SLI setup is going to look like poor value for money and you're just going to waste both cards. It seems obscene. Reply
  • stance - Thursday, November 25, 2004 - link

    Will the new dual-core AMD CPUs that come out next year be supported by this motherboard? Reply
  • tombman - Wednesday, November 24, 2004 - link

    ANAND, please answer:

    1.) Can you really force SLI for games with no profile in the driver?
    2.) Please run tests at 2048x1536 or higher (my CRT can do 2304x1440 :D)
    3.) Please run 8xAA tests
    4.) Please check if HDR (high dynamic range rendering) in Far Cry works in SLI mode (other sites say no)

    Especially #1 is very important.

    If only games with a profile can run in SLI mode, SLI will not become very popular, IMO. We know NVIDIA: they will only have profiles for benchmarks and the most commonly hyped games. For less popular games there surely will be no profiles...

    thx
    Reply
  • CrystalBay - Wednesday, November 24, 2004 - link

    #61 good point, I also wonder how well it runs at super high resolutions... Reply
  • PrinceGaz - Wednesday, November 24, 2004 - link

    We've seen profiles for many games in nVidia's drivers since the 5x.xx series; I expect the SLI setting is just something they've added to them for when the cards are running in SLI. If in doubt, I'd have thought SLI AFR will be fine for most games that don't have a profile defined (assuming you can choose the SLI mode), or SFR for games that use motion blurring (usually certain types of racing games). Reply
  • Pampero - Wednesday, November 24, 2004 - link

    Where does the review talk about the profiles?
    Is there a list of the games that can enable SLI?

    Reply
  • Gatak - Wednesday, November 24, 2004 - link

    Where are the 2048x1536 tests? It is clear that 1600x1200 is no match for the SLI setup in most games. Why not do testing at a higher resolution?

    If 2048x1536 ran smoothly, then the demand for better monitors would be stronger, giving manufacturers a reason to make better monitors for us =).
    Reply
  • PrinceGaz - Wednesday, November 24, 2004 - link

    #55 - If SLI can only be used on games nVidia have profiled, as you say, and the user cannot force it to run in SLI AFR or SFR mode, then that's a serious problem for anyone who plays non-mainstream games and might be considering SLI.

    After reading this article, I now believe more than ever that the only people who should be seriously considering SLI are those who are willing to buy two 6800GT or 6800 Ultra cards in one go, in order to have a top of the range system. The upgrade option doesn't make sense.

    Anand didn't compare the costs of an SLI upgrade against a non-SLI upgrade; instead he compared buying a second 6600GT later on when they're cheaper, to buying a high-end card initially and *not* doing any upgrade. Of course it's going to be more expensive if you buy a high-end card from the outset.

    The true upgrade alternative is that instead of buying a second (now cheaper) 6600GT to achieve roughly 6800GT performance, you would sell your 6600GT while it can still fetch a good price and put the money towards a (now also cheaper) 6800GT or maybe a mid-range next-generation card that has the required performance. When you look at how much prices fall on high-end cards when something even a little faster comes out, pushing them nearer to mid-range cards; it should be obvious that replacing the card with a faster one is a more cost-effective option for anyone considering an upgrade at a later date, than buying a second identical card on top of the one you already have.

    Yes there's the hassle of selling your first card, but not only do you have total flexibility over what you upgrade to (with SLI you have none); you also don't need an SLI mobo, you won't have two graphics-cards generating excess noise, and you'll have a lot more PCI/PCI-e slots left free for other cards.
    Reply
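    PrinceGaz's cost comparison above can be put into numbers. The sketch below uses entirely hypothetical prices (none of these dollar figures come from the article or the comments) just to show how the two upgrade paths compare:

    ```python
    # All prices are made-up illustrations, not quotes from the article.
    PRICES = {
        "6600GT_at_launch": 200,   # mid-range card bought today
        "6600GT_later": 150,       # the same card once prices fall
        "6600GT_resale": 120,      # what the used card might still fetch
        "6800GT_later": 300,       # high-end card after its own price drop
        "sli_mobo_premium": 50,    # extra cost of an SLI board over a normal one
    }

    def sli_path_cost(p):
        """Keep the first 6600GT and add a second one later (needs an SLI board)."""
        return p["sli_mobo_premium"] + p["6600GT_at_launch"] + p["6600GT_later"]

    def sell_and_replace_cost(p):
        """Sell the 6600GT while it still fetches a good price; buy a 6800GT."""
        return p["6600GT_at_launch"] + p["6800GT_later"] - p["6600GT_resale"]

    print(sli_path_cost(PRICES))         # total outlay for the SLI route
    print(sell_and_replace_cost(PRICES)) # total outlay for sell-and-replace
    ```

    With these made-up numbers, the sell-and-replace path comes out slightly cheaper and leaves you with a genuinely faster single card, which is the crux of the argument; real prices at the time of upgrade will of course vary.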
  • bob661 - Wednesday, November 24, 2004 - link

    IMHO, SLI is for people like me and also for people that need to have the latest and greatest. People like me don't upgrade every 3 months, 6 months, or even a year. I upgrade every 2.5 to 3 years. It would be nice to be able to run 2006 or 2007 games on 2004 technology. Who knows, this might extend my upgrades to 4 years. ;-) Reply
  • Momental - Wednesday, November 24, 2004 - link

    #56 I would imagine that nVidia is aware of this and who knows, they may implement a utility within their driver that automatically flashes the BIOS of the "older" card, if one is detected. Either that, or they could write something into the driver to search for another GPU and once it's found, ask you if you would like to flash its BIOS upon restart. And voila!

    The fact of the matter is that it's way too early to speculate as to whether or not SLI is a viable and cost-effective solution. Something tells me it will be, because it's not as if the "next big thing", i.e. cards that are twice as fast, is right around the corner. If it were, then I'd say no; it wouldn't be worth it, for the reasons stated by #42.
    Reply
  • nserra - Wednesday, November 24, 2004 - link

    I think the issue that the cards MUST have the same BIOS is ENORMOUS. So "buy one now and buy the other LATER" will not be possible. I doubt that a year-old card has the same BIOS as a brand-new one.

    Too many "issues"...
    Reply
  • Elliot - Wednesday, November 24, 2004 - link

    I want one of these SLI boards, but the article said that you can force the driver to enable SLI on games without profiles in the NVIDIA drivers, and this is not true.

    If no SLI profile exists for a game, there is no SLI rendering. It is not possible to force SLI mode or generate your own profile. According to NVIDIA, however, the driver already contains over 50 profiles for games running with SLI. For newer titles this means that SLI system owners have to wait for a new driver. But even then there is no guarantee that SLI will be possible with a particular game. So this is not very good news.
    Reply
  • glennpratt - Wednesday, November 24, 2004 - link

    #51 - Yeah, and the Voodoo2 passed the signal from one card to the other externally in analog. What would you suggest, that nVidia make a PCI card which combines the signals over analog cables, degrading your video quality? Idiocy.

    How many people owned a V2 SLI setup and ran it on a crappy computer anyway?
    Reply
  • bob661 - Wednesday, November 24, 2004 - link

    You guys could buy a cheaper CPU and do a mild overclock to get the performance needed. I have a 3500+ and I still plan on getting SLI. There are ways to get around the price issue. If I were buying a new system right now, I would get a 3200+ "Winnie" and OC it to 2.6GHz. That would put you at FX-55 speeds. If you're lucky, you could hit 2.8 to 2.9GHz. Reply
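    The overclock suggested above is easy to sanity-check. The Athlon 64 3200+ "Winchester" runs at 2000MHz stock and the FX-55 at 2600MHz, so the required clock increase works out as follows (the helper name is mine, not from the article):

    ```python
    def oc_percent(stock_mhz, target_mhz):
        """Clock increase needed to reach the target, as a percentage over stock."""
        return 100.0 * (target_mhz - stock_mhz) / stock_mhz

    # 3200+ "Winchester" (2000 MHz stock) pushed to FX-55 speed (2600 MHz):
    print(oc_percent(2000, 2600))  # 30.0 -- an aggressive but commonly reported OC
    # The "if you're lucky" ceiling of 2.9 GHz:
    print(oc_percent(2000, 2900))  # 45.0
    ```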
  • bob661 - Wednesday, November 24, 2004 - link

    #49
    You don't need an extravagant budget to afford a monitor that can handle 1600x1200. The Samsung SyncMaster 997DF-T/T 19" CRT can do that for $209 on Newegg.com. I have the older 955DF version, which does it too.
    Reply
  • nserra - Wednesday, November 24, 2004 - link

    The VOODOO2 didn't NEED any of this; it worked on any MOBO, any monitor, any CARD, any...... Reply
  • nserra - Wednesday, November 24, 2004 - link

    Buy Monitor.
    Buy PSU.
    Buy MOBO.
    Buy 2 graphics cards.
    Buy good CASE.
    Buy the top of the line processor.


    Too many purchases, and all of these items HAVE TO BE TOP ($$$) OF THE LINE!!!

    I don't have the money; sorry, not for me.
    Reply
  • Gundamit - Wednesday, November 24, 2004 - link

    It sounds like if you don't already have a monitor that supports 1600x1200, you'll have a hard time justifying the SLI expense, since you won't see nearly as much performance gain over the single-card setups at lower resolutions. Just one more expense to consider. Thank goodness LCD panel prices seem to be dropping. I'm on board for SLI with 6800GTs in late Q1 '05. There should be plenty of info and mobo selections out by then. Reply
  • AtaStrumf - Wednesday, November 24, 2004 - link

    I said it before and it looks like I need to say it again:

    An SLI-like performance improvement (40-70% where it counts) in a single GPU over the previous-generation single GPU isn't going to happen for AT LEAST 2 years! Example: 9700 Pro (2002) / X800 XT (2004).

    The other benefit is obviously a MUCH lower upgrade cost. Theoretical example: a new $200 9800 Pro versus a $400 GF 6800 GT. And this is really the worst-case scenario for SLI; it would have a lot of performance to make up, but I think that won't happen for a long time.

    And don't forget that we are hitting walls with current technologies, so future generations of cards may take much longer than 2 years to bring a 9700-to-X800-like performance improvement.

    Just look at what ATi is doing. They're going for SLI as well, because there is no way in hell they can compete with it using a single GPU, or any kind of single-card design that wouldn't require its own power supply and air conditioning unit.

    SLI and dual core are the future; just not for me :-( TOO EXPENSIVE!
    Reply
  • SignalPST - Wednesday, November 24, 2004 - link

    Thank you for the article, Anand. It was very informative and exciting.

    I would like to make a suggestion. Since SLI configurations, as everyone knows, are targeted at the very top-notch enthusiasts, I think it would make a lot of sense to include benchmarks at HDTV (1920x1080) and 2048x1536 resolutions. A lot of high-end 22" CRT monitors, as well as high-end 23" widescreen LCDs, support these resolutions. I imagine the enthusiasts looking at SLI solutions would also be using those types of displays and wondering what kind of performance they would get with their dual-video-card setup.

    -SignalPST
    Reply
  • SuperStrokey - Wednesday, November 24, 2004 - link

    I wish this would work with 2 AGP cards too. It would be nice if I could upgrade my BFG 6800GT to a second one on the cheap when the new cards come out, rather than having to buy a new card. Reply
  • kongming - Wednesday, November 24, 2004 - link

    Never mind, the V9999 is still AGP-only for the time being. Hopefully they will offer this card in PCIe in the future. Reply
  • kongming - Wednesday, November 24, 2004 - link

    What I would like to see is the SLI performance increase of the ASUS V9999 6800GT with only 128MB of memory, compared with a stock 6800GT with 256MB. If this card gets a particularly good boost from SLI, that would make it an even better deal. Reply
  • coldpower27 - Tuesday, November 23, 2004 - link

    Well, there's also the rumor that next-generation cards will not double performance, as it's likely we're going to see 6-quad solutions from NV and ATI next year; 8 quads is just too much for even the 90nm process to handle. Though I would be pleasantly surprised if that's not the case. Reply
  • Drayvn - Tuesday, November 23, 2004 - link

    What I'm wondering is what happens in a case like the 9800 Pro: a year later we got the X800 XT PE or Ultra.

    It was twice as fast, and in terms of technology we also got SM3, which we now have in one game.

    The 9800 Pro's performance was doubled by the PE and Ultra. So what would have happened if SLI had been out then?

    Why would anyone want to buy another 9800 Pro, when you could get the PE with a few added features which are being used now?

    It doubles the performance, and the extra features further widen that gap. So is buying two 9800 Pros worth it? Especially when they are still really expensive.

    Of course this is all hypothetical, and I love SLI, but what I'm getting at is: is it now time for nVidia and ATi to slow down their product life cycles?

    Will they have no refresh cards anymore? When they bring out their next-gen cards, 6 months down the road there will be no need to buy a refresh, as that only adds a little performance, and everyone can just buy another card for the same price and get double.
    Reply
  • ceefka - Tuesday, November 23, 2004 - link

    PCI-E, SLI, it's all graphics so far. Can this technology also be used for sound cards? Can we have 24 channels of 192kHz/32-bit someday on PCIe SLI? If so, the whole bunch should be reconfigurable, meaning that you can spread capacity equally over all slots or place emphasis where needed. If that's where we're heading, we're in for some exciting computing. Reply
  • R3MF - Tuesday, November 23, 2004 - link

    #30 is correct: SLI for the GeForce 6 generation makes a lot of sense if your pockets are deep enough.

    SLI for GeForce 7 will be a different proposition. The imminent move to 0.09µ and DX10 will create a generational leap when GeForce 8 arrives, so running two GeForce 7s won't be so clever.
    Reply
  • bob661 - Tuesday, November 23, 2004 - link

    I think SLI is worth the money and the present games can use it. But damn is it expensive. I'll still get it though. :-) Reply
  • KAM04m - Tuesday, November 23, 2004 - link

    I personally think SLI is not worth the money. Plus, I don't run games at 1600x1200, only 1024x768. SLI setup prices will drop in the future, and that's when the newer games will really need the extra bandwidth! Until then, AGP is still my bud for another year. Reply
  • sophus - Tuesday, November 23, 2004 - link

    CPU limited...? Anyone care to theorize whether dualies would help increase performance? Or what the limiting factor is (bandwidth)? Reply
  • Filibuster - Tuesday, November 23, 2004 - link

    #33 You can find a few XFX 6800GT PCIe cards on Pricewatch, but they want something like $550 for them.
    (I just looked, and they are not there anymore though.)

    There was a reference card on eBay the other day for $400, though...

    They are basically impossible to get without paying a ridiculous amount.
    Reply
  • bob661 - Tuesday, November 23, 2004 - link

    #33
    I found one here.
    http://www.sharbor.com/products/EVGN5300004.html
    I don't know if they actually have one in stock though.
    Reply
  • jshuck3 - Tuesday, November 23, 2004 - link

    Where are they getting the 6800GT PCI Express cards? I can't find them anywhere...are they even out yet or are these just review boards? Reply
  • L1FE - Tuesday, November 23, 2004 - link

    #28 If the next-gen video cards are also SLI capable, then SLI offers even more performance at a new GPU launch. If you don't want SLI, that's your choice, but SLI offers consumers a wider range of choices simply because the combinations now make things that much more complicated. Whether that's a good or bad thing remains to be seen, but I like how it keeps things exciting between new GPU releases. Reply
  • T8000 - Tuesday, November 23, 2004 - link

    Although it is nice to see Nvidia take PC gaming quality one step further, these solutions are more expensive than ever before.

    But where does this money come from, you ask? Well, since CPUs have been stuck at around 3GHz (or nearly so) for some time now, people look for other upgrades to buy.

    And SLI is an easy way to explain that a top-end GPU solution now costs $1000 instead of $500, because it now contains two $500 cards.

    But since SLI is much cheaper than pre-overclocked (Falcon/Alienware) solutions, it is currently worth its premium for a lot of users.

    It also creates an interesting problem for ATI: sell technology that is way behind at lower prices, or copy the SLI concept and hope their users are willing to wait.
    Reply
  • miketheidiot - Tuesday, November 23, 2004 - link

    #28 the next gen of both nVidia and ATI will be only a tiny jump over the current generation. We won't see another big jump until DX10 has been out for a while. The next jump will be a 9700 to 9800 style jump, if that.

    http://www.anandtech.com/news/shownews.aspx?i=2340...
    Reply
  • VIAN - Tuesday, November 23, 2004 - link

    Yeah, where is 8xAA/16xAF? I want that tested. I mean, with all that power, who wouldn't want to see the results of the fabulous 8xAA IQ? Reply
  • FICo - Tuesday, November 23, 2004 - link

    So does Nvidia now want everyone to buy two of their cards? I really hope it's not popular. They should just design faster GPUs rather than relying on such a sledgehammer approach. Nvidia seems to bring out a new GPU once a year, with an update 6 months into a product's life. So the new "GeForce 7" chip will be out by the end of spring next year. Of course the performance of a single-card "GeForce 7 Ultra" will be a big jump as usual, and will most likely outperform today's PCs with dual 6800 Ultras. Nvidia's SLI technology is certainly interesting; shame it's such poor value for money. Surely a dual-core approach would be cheaper for the public to buy, while still offering extra performance. Reply
  • Filibuster - Tuesday, November 23, 2004 - link

    Dual Voodoo2 cards is what, 200Mpixel/s? :) Reply
  • Jeff7181 - Tuesday, November 23, 2004 - link

    Thinking about it more, I think I'd rather just see some 32-pipeline GPUs with 512MB of RAM and their very own nuclear reactor for power :) Reply
  • Souka - Tuesday, November 23, 2004 - link

    Anyone want SLI cheap? You don't even have to upgrade your motherboard.....

    For sale: two 8MB 3dfx Voodoo2 boards with SLI cable... PCI interface, of course..... it rocked in the '90s.... why not now?

    :)

    Reply
  • bob661 - Tuesday, November 23, 2004 - link

    #18
    The hardcore gamers would just buy new video cards.
    Reply
  • reboos - Tuesday, November 23, 2004 - link

    "Nvidia bought the patents, pending patent applications, trademarks, brand names, and chip inventory related to the graphics business of 3dfx."

    http://slashdot.org/articles/00/12/15/2244256.shtm...
    Reply
  • fuzzynavel - Tuesday, November 23, 2004 - link

    I think 3DFX was bought by nvidia... or at least the rights to the technology... so it is technically the same company. I remember the days of 3DFX scan-line interleave... fantastic! Reply
  • bob661 - Tuesday, November 23, 2004 - link

    #17
    Two Opterons would be downright scary if they were limited too. But a 4000+ is no slouch. :-) It's still amazing. I happen to agree with #12, but the real test of that theory would be to test slower CPUs and see how the performance scales.
    Reply
  • reboos - Tuesday, November 23, 2004 - link

    Odd as it may sound, should we be thanking 3DFX for this?

    http://slashdot.org/articles/00/12/15/2244256.shtm...
    Reply
  • Gnoad - Tuesday, November 23, 2004 - link

    Although SLI is exciting, I found myself wanting more info on the Asus board... Reply
  • haris - Tuesday, November 23, 2004 - link

    I just had some more thoughts about why SLI/multi-rendering might not be such a great move by Nvidia/ATI.

    When they launch their next-generation cards, they are expecting to rake in some extra money from the extreme gamers, right? So what happens to that card when those same gamers start purchasing relatively cheap last-gen cards instead? In order to get that additional money during the beginning of a next-gen card's life cycle, they might have to slow down the production cycle to give each card more time in the high-end position.
    Reply
  • Jeff7181 - Tuesday, November 23, 2004 - link

    #14... why? You have TWO GPUs here... and ONE CPU. Why is it so amazing that two GPUs can put the squeeze on one CPU? Now, stick a 6800U SLI setup with a couple of Opteron 250s and an application that's multi-threaded, and THEN I'd be amazed if it was still CPU limited. Reply
  • Aquila76 - Tuesday, November 23, 2004 - link

    Or was that 330 Watts the total system usage? (doubtful) Reply
  • Aquila76 - Tuesday, November 23, 2004 - link

    What power supply was used in your testbed? If the SLI setup draws ~330W at load, I would think you'd need around a 550W unit for your setup. Reply
  • bob661 - Tuesday, November 23, 2004 - link

    I find it absolutely amazing that they were CPU limited using a 4000+. Reply
  • Avalon - Tuesday, November 23, 2004 - link

    Anand, you keep saying that a 6600GT in SLI outperforms a 6800U in Doom 3 and HL2, but your benchmarks only partially support that conclusion. It would be more correct to say that a 6600GT in SLI outperforms a single 6800U in lower-resolution, lower-bandwidth situations (such as 12x10 with low AA/AF, or less), but in high-resolution, high-bandwidth situations (such as 16x12 with a bit of AA/AF), the 6600GT doesn't appear to be able to keep up with a single 6800U at all. Buyers will need to take that into consideration, to make sure that the video setup they purchase meets their specific needs. Reply
  • lifeguard1999 - Tuesday, November 23, 2004 - link

    One simple question: Are there Linux drivers that support SLI?

    Historically, people have talked about a setup either being CPU bound or GPU bound. That is no longer the case. With SLI it appears that the limiting factor is data. Simply put, there is not enough data for the dual GPU's to render. This is a common problem in parallel programming, especially when you are talking about thousands of processors. By increasing the amount of data for the GPU to render, one can see that SLI performs better.

    For example, at 1600x1200 the increase is only 20% going from a single 6800U to 6800U SLI. Now, by increasing the amount of work for the GPU to perform (1600x1200 with 4X AA and 8X AF), the performance increase grows to 48% going from a single 6800U to 6800U SLI.

    What this means is that game developers can now have Low, Medium, High, Ultra, and Ultra-SLI rendering modes in their games. :) What a nice "problem" to have.

    In my line of work (Scientific Visualization) where we can have models up to hundreds of millions of polygons, SLI is going to cause a revolution in how we do business.
    Reply
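    The scaling numbers lifeguard1999 quotes above translate directly into a parallel-efficiency calculation, the same one used for any multi-processor system. A quick sketch (the helper name is mine, not from the article):

    ```python
    def sli_efficiency(gain_percent, n_gpus=2):
        """Fraction of ideal n-GPU scaling actually achieved.

        A gain of 100% on two GPUs would be perfect scaling (efficiency 1.0).
        """
        speedup = 1.0 + gain_percent / 100.0
        return speedup / n_gpus

    # 1600x1200, no AA/AF: +20% from the second GPU
    print(sli_efficiency(20))  # -> 0.6, i.e. 60% of ideal two-GPU scaling
    # 1600x1200 with 4X AA / 8X AF: +48%
    print(sli_efficiency(48))  # -> 0.74, i.e. 74% of ideal
    ```

    The jump from 60% to 74% efficiency is exactly the "give the GPUs more data" effect described in the comment: the heavier the per-frame workload, the better the second GPU amortizes its overhead.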
  • Alphafox78 - Tuesday, November 23, 2004 - link

    Wow, if my LCD went to 1600x1200, SLI might help me slightly... I wonder what % of people actually play at a res that high. At 1280x1024, with my 6800GT AGP overclocked to Ultra speeds and 4x AA and 8x anisotropic, I noticed no slowdowns; smooth as can be for 95% of the game. Reply
  • GhandiInstinct - Tuesday, November 23, 2004 - link

    Yes, a great performance increase, but it's too expensive for most people to afford. Unless you're 3 generations behind and need a whole new system, chances are you won't buy this; it isn't for people who already have stable, up-to-par systems. This just isn't a wise investment when ATI's multi-GPU technology is right around the corner, which means you won't be limited to the inferior Nvidia. Reply
  • OriginalReaper - Tuesday, November 23, 2004 - link

    I don't see why 8xAA wasn't used. It's clearly not GPU limited at 4xAA/8xAF. Reply
  • blckgrffn - Tuesday, November 23, 2004 - link

    I am glad that NVIDIA's drivers seem fairly mature already; hopefully they will have a new release out by the time SLI becomes "mainstream". It would have been awful had this real release been tainted by a lot of driver issues. So the real point here is that if you really want performance, you have to drop $800 at some point? Wow. Two 6600GTs are already not the optimum choice for 1600x1200, so a 6800GT is what should be purchased now if you are really all about fps. Hmmm. I suppose I do know people who would like the incremental upgrade path, but I am not one of them when it comes to graphics cards. I sell my "old" one just before the new product cycle really starts appearing in quantities on the shelves, and go for the new technology; I am guessing that many who have $800 to spend on graphics cards would do something similar. I am jealous of those who will be able to spring for a big-power SLI setup out of pocket... Reply
  • dak - Tuesday, November 23, 2004 - link

    I'm curious: is it possible to use two PCIe video cards in these SLI boards, but not have them configured for SLI? I have some applications that would benefit greatly from having two high-end dual-output cards in a single computer..... Reply
  • shabby - Tuesday, November 23, 2004 - link

    Great article, Anand. Can you mention which PSU you used to feed those hungry 6800 Ultras? Reply
  • Aquila76 - Tuesday, November 23, 2004 - link

    Looks like SLI'd 6800GT's are a great option for now. I'm greatly awaiting ATI's Multi-Rendering products now. Can't wait for that benchmark setup! Reply
  • JClimbs - Tuesday, November 23, 2004 - link

    A few things glossed over in the 'upgrade path' argument:
    Costly up-front mobo purchase. These boards will go down in price, but unless they're a total flop, they won't drop nearly as fast as a non-SLI board's price.
    Power supply. Let's face it, when you get around to running two cards, you will need to purchase a more robust PSU than those sold with most cases. If you've already spent the cash on a high-quality PSU, fine; but upgrade paths are generally not aimed at folks who spend big bucks on a PSU.
    Fans. Two GPUs under the hood will almost certainly want more cooling. Admittedly, they're cheap. Good thing...
    Electric bill. The power draw of today's graphics cards is already breathtaking. With two of 'em chugging away under the hood, that drain looks absolutely scary.
    Reply
  • Ecmaster76 - Tuesday, November 23, 2004 - link

    Anyone want to bet that the two GPUs are using a HyperTransport (or derived) interconnect? The bandwidth quoted is in the same neighborhood as an Athlon's, but probably a little faster, since the trace lengths are short and straight. Reply
  • ChronoReverse - Tuesday, November 23, 2004 - link

    Pretty nice speed bumps, but 6600GT SLI is disappointing in how it always seems to lag behind even a single 6800GT at high resolutions with AA+AF enabled. Reply
  • ariafrost - Tuesday, November 23, 2004 - link

    SLI has a lot of potential, that's for sure... :D Reply
