49 Comments

  • archerprimemd - Tuesday, April 01, 2008 - link

    i apologize for the noob question, just didn't know where to look for the info:

    dual gpu single card or single gpu dual cards (meaning, in SLI)

    which is better?

    also, isn't having the 2 gpus in one card sort of like doing an SLI?
    Reply
  • Tephlon - Friday, April 04, 2008 - link

    "dual gpu single card or single gpu dual cards (meaning, in SLI)
    which is better?"

    To be honest, this seems to vary based on timing and pricing.
    For instance, back in the 7 series, I bought two 7800GTs and SLI'd them. About a week later, the 7950GX2 became available. It offered similar (if not better, in some cases) performance to the two 7800GTs, so I returned the GTs and got the GX2.
    But at the same time... The 7800 Ultra's were available... and two of those in SLI were better than the GX2... but for nearly twice the money.
    Again, this might vary generation to generation, so YMMV.

    "also, isn't having the 2 gpus in one card sort of like doing an SLI?"

    the short answer is yes. I actually posted about this in more detail just a few pages ago, so for more on the subject see http://www.anandtech.com/video/showdoc.aspx?i=3266...
    Reply
  • Tephlon - Friday, April 04, 2008 - link

    oops, sorry.

    I meant to say see my post at http://www.anandtech.com/video/showdoc.aspx?i=3266...

    It's about halfway down the comments page, or you can just search 'Tephlon'.
    Reply
  • SlingXShot - Wednesday, March 26, 2008 - link

    You know 3dfx tried this SLI madness and put 16 chips on one board... and you know they failed... these products are not attractive to the average Joe, and the only people who care are the ones who install new computers for a living. It is not good practice putting 4 video cards together. People want new designs, etc. Reply
  • Ravensong - Friday, March 21, 2008 - link

    Ok, here's what I don't get and I hope someone can clarify this for me. In the article "ATI Radeon HD 3870 X2: 2 GPUs 1 Card, A Return to the High End" the CoD4 benchmarks running at 1920x1200 HQ settings, 0x AA/16x AF give a result of 107.3 fps, yet this article's benchmark shows a result of 53.8 for 1920x1200. When I saw this I yelled out like Lil Jon "WHAAT??" How did the frames drop this much? Perhaps the new 8.3 drivers are wrecking performance? This seems to be the case with every benchmark other than Crysis, which received a minor increase from the 8.3 drivers. I'm not a fanboy for ATI/AMD by any means, but I hardly see these scores as fair when just a few video articles ago this thing was doing well and then all of a sudden it has piss-poor performance when the GX2 launches. Reading this site on a daily basis, I figured that the weird drop in performance would have been noted?? Not sure if anyone else noticed this, but I surely did, right away. I know I've had nothing but headaches atm with 8.3 and trying to get the 4 3850s I bought running in CrossFire X. Thankfully that's just my secondary rig; if it had been my main I may have smashed it into pieces by now :D Reply
  • TheDudeLasse - Friday, March 21, 2008 - link

    It's gotta be Catalyst 8.3.
    The scores I'm getting with 8.2 are 70% better.


    Reply
  • Ravensong - Friday, March 21, 2008 - link

    Definitely, no other explanation as to why the scores are so horrid compared to only a month and a half ago when the original benches debuted. I wish all the sites using 8.3 would correct this injustice!! lol... Hardocp went as far as saying "The Radeon HD 3870 X2 gets utterly and properly owned, this is true “pwnage” on the highest level." ... just wow. :D Reply
  • Ravensong - Saturday, March 22, 2008 - link

    Any comments on this dilemma Sir Wilson?? (referring to the author) :D Reply
  • TheDudeLasse - Friday, March 21, 2008 - link

    I think you may have had some driver issues with the 3870X2.

    I'm running a Q6600 @ 3.4GHz and a 3870 X2.

    I've been running the same benchmarks as you describe and the results are completely different.

    For instance, the Call of Duty benchmark results differ by over 70%.
    I ran the same benchmark: "We start FRAPS as soon as the screen clears in the helicopter and we stop it right as the captain grabs his head gear."

    Example
    1920x1200, 4xAA and 16xAF. Your result: 42.3 fps average.
    My result: 76.056 fps average.
    That's over a 75% improvement on your score.
    What's the jig? Screwed up Catalyst drivers, or what?
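    (For anyone double-checking the arithmetic, a quick Python sketch using the two averages quoted above; the exact gap works out to just under 80%:)

```python
# Quick check of the benchmark gap quoted above (CoD4, 1920x1200, 4xAA/16xAF).
def percent_improvement(baseline_fps, new_fps):
    """How much faster new_fps is than baseline_fps, in percent."""
    return (new_fps - baseline_fps) / baseline_fps * 100.0

gap = percent_improvement(42.3, 76.056)
print(round(gap, 1))  # 79.8 -- so "over 75%" holds
```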


    Reply
  • 7Enigma - Thursday, March 20, 2008 - link

    Derek,

    I see you have not answered the requests regarding why 8800GT and 8800GTS SLI was not included in these benchmarks. I can understand if you were not allowed to due to some Nvidia NDA, and why you might not be able to talk about it.

    If you could please reply with a :) if this is the case, we would greatly appreciate it. Otherwise it looks like there is a gaping hole in this review.

    Thank you.
    Reply
  • dare2savefreedom - Thursday, March 20, 2008 - link

    i just wanted you guys to know that you're doing a poor job.

    I checked Tom's, Guru3D, and TechPowerUp, and they all kick your butt.

    Techpower even took the card apart

    things you should do next time:

    1) more Fing pictures - did you really review the card?
    2) use an AMD 4400 and see what happens - real world check - this would be different from everyone else, since everyone else did an overpriced intel cpu test.
    2a) test with a realistically priced intel processor, a Core 2 8400 or 8200
    3) put rulers in the pictures
    4) try Thief 2 for an old-game check; somebody on the forums is complaining about nvidia drivers
    5) where da FEAR bench?
    6) bench at least 1 racing game.
    7) stop playing with your mac and do the review.
    8) where's the PCI scan listing? i don't get an idea of how the card appears - is the fact that there are 2 boards hidden from PCI scans?
    9) a good artist copies; a great artist steals
    Reply
  • DoctorDeath - Wednesday, March 19, 2008 - link

    I bought two 9800GX2s from www.dandh.com for $562 each, and even without good drivers these cards start to shine above 1600x1200. I sold my three 8800 Ultras to buy them and I am not sorry that I did. When the new quad drivers are released next week, they are going to make a big difference in performance. My 2nd machine has two 512MB 8800GTSs, and to be honest they do not come close in performance. Both machines have 780i boards, overclocked QX9650s, the same 4GB of memory, even the same 1200W power supplies. So people that are saying that the GX2 is the same as running 512MB 8800GTSs are wrong, and the performance proves that. Reply
  • gochichi - Wednesday, March 19, 2008 - link

    While the 8800GT may be great, the GTS 512MB adds just enough that you feel like you went absurdly overboard without breaking the bank.

    I just purchased a very nice but basic Quad-Core processor based Dell. If I wanted to go absolutely crazy without pitching a wonderful computer away, I would go with this ultra mega card (and a power supply definitely).

    I think it's very wise of NVIDIA and ATI to offer something up for the people not willing to go through the hassle (or apparent hassle) of SLI. I mean, I know people that don't know much and they ask me about stuff... and sometimes they just want to splurge on something. They don't necessarily want to build a gaming rig from the ground up.

    We are very unique in our obsessions with computer parts and sensitivity to their price. Look at how well Apple is doing. Most people have more money than they have hardware expertise.

    $599.00 really is quite "reasonable". At least it TROUNCES the competition, it's not like it's close in terms of single slot performance. Compare that with buying two 8800GT's, replacing the motherboard, hoping everything pans out, re-certifying Vista (if it's even possible from an OEM version of it... literally OEM like Dell, Gateway, etc). Etc... this seems like you spend money (of which you may have much readily available, it's completely wrong to assume that money is tight for everyone) slap it in (change of power supply which messes this up, and I think this is a very big obstacle, as big as the price) and it pretty much "just works".

    The black case is very elegant, very desirable looking. Does the average Joe still want a MacBook Air, or does he want a 9800GX2 and a MacBook? You know what I'm saying here? It's not like we don't buy frivolous stuff ALL OF THE TIME. I have a pair of $250.00 jeans... I am absurd. I just bought YET ANOTHER computer... I am absurd. (Aren't you as well? Do you burn up $200.00 of fuel each month instead of taking the bus?) I don't see why we'd pick this wonderful piece of hardware out as particularly absurd. Like I said, not requiring SLI is huge. SLI = expert/geek and/or SLI = extremely overpriced computer with glowing lights.

    After seeing these benchmarks, there is a part of me that knows that this won't be beat out in a couple of years... so why not just buy it and get it over with. I am kind of in the market for a card... why not add $400.00 to the budget and experience "too much" for a change.

    And then I come back to reality, a reality where every game except Crysis will run fantastically at 1920x1200 on the very high-tech, slick, modern, efficient 9600GT, or "splurging" for the slightly less elegant 8800GTS 512, which is only a handful of dollars more than the standard 8800GT. This reality includes 8GB of RAM for merely $150.00 (and yet I don't have 8GB of RAM). The main reason I'm leaning more towards the 9600GT is that I really don't want to mess with a new power supply.

    I'm telling you all right now. If this 9800 GX2 wonderfulness were something small, and silent that you could just plug into a USB port (imagine here, try to think like normal people for a minute) and it would accelerate the crap out of games... it would become a best selling video card even at $600.00.

    You know what's really absurd? That we read about cards that are less outstanding than this card. I mean this card is a "Ferrari", it's never lame to gawk at a Ferrari. Sadly, I spend most of my time learning about the specs on the next Corolla.

    Buying this card may not be so awesome (for me at this particular time) but in terms of reading it is definitely awesome. I'm more of a daily commuter type.



    Reply
  • instant - Wednesday, March 19, 2008 - link

    Somehow I feel it would be fitting to include the 7950GX2 in the test of its successor.

    Reply
  • AnnonymousCoward - Wednesday, March 19, 2008 - link

    Page 2 says "Windows XP is still limited by a 3 frame render ahead), Quad SLI will be able to implement a 4 frame AFR". Is this the same as the setting that used to appear in the nvidia control panel called "max frames to render ahead"?

    That setting causes input lag. I use RivaTuner to force it to 1 frame, because at the default of 3 frames, games like Oblivion, Mythos, and Hellgate (and probably many others) have input lag at high resolutions! Does Quad SLI require 4 frames of rendering ahead / buffering? If so, forget it!
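    To put rough numbers on that concern: each queued frame adds about one frame-time of input lag, so the penalty grows as fps drops. A minimal back-of-the-envelope sketch (assuming frame-time is the only contributor to the added lag):

```python
# Approximate input lag added by the driver's render-ahead queue:
# about one frame-time (1000 / fps milliseconds) per queued frame.
def render_ahead_lag_ms(frames_ahead, fps):
    return frames_ahead * 1000.0 / fps

print(render_ahead_lag_ms(3, 60.0))  # XP default: 50.0 ms at 60 fps
print(render_ahead_lag_ms(4, 60.0))  # 4-frame AFR: ~66.7 ms at 60 fps
print(render_ahead_lag_ms(1, 60.0))  # forced to 1 frame: ~16.7 ms
```

    At a GPU-limited 30 fps the same 4-frame queue balloons to ~133 ms, which matches the complaint that the lag is worst at high resolutions.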
    Reply
  • dare2savefreedom - Tuesday, March 18, 2008 - link

    i don't get it - you guys are mac lovers, so maybe you could go counterculture or think different:

    How about instead of showing exactly the same stuff that every other hw site shows, why don't you guys show the 9800gx2 with an AMD X2 4200 or something, or an AMD X2 6000?

    i can justify spending 500 on a video card because i know it's worth it, but 1000 on an intel cpu that you guys usually bench with is not something that gives me value - i could SLI 8800GTXs for that 1000 spent on a cpu.

    Reply
  • Le Québécois - Tuesday, March 18, 2008 - link

    The main idea is to remove as many limitations on the GPU as possible, so you can get an idea of how it really performs when "nothing" else slows it down. Reply
  • andylawcc - Tuesday, March 18, 2008 - link

    the chart only shows results at the highest resolution setting, in which the ATI 3x 3870 falls behind both the 9800GX2 and 9600GT SLI. However, the 3x 3870 leads the 9600GT SLI the majority of the time at lower resolutions. Reply
  • JarredWalton - Tuesday, March 18, 2008 - link

    You don't normally buy a $600 video card to play at 1280x1024, or for many even 1920x1200. For this type of GPU, I think 30" LCDs are going to be reasonable. Which is to say that I don't think a lot of people are going to buy one of these, because it's so expensive, but if you have the money for a 9800 GX2 you probably have the money for a 30" LCD. Regardless, we have the resolution scaling charts for people that *do* care about other resolutions.

    Note also that there are some oddities in the results - i.e. 9600 GT SLI really shouldn't even match the 9800 GX2 in any situation, considering it has less memory bandwidth, half the number of shaders, fewer ROPs, and a lower core speed. And yet it does come out ahead in some of the tests in this article. We're still investigating that; the GX2 should actually be a lot closer to 8800 GTS 512MB SLI performance - i.e. faster than 8800 Ultra SLI in many cases.
    Reply
  • coldpower27 - Wednesday, March 19, 2008 - link

    9600 GT SLI would have greater ROP power, as you're still dealing with 16 ROPs per board, with a 256-bit interface each. The additional ROP power comes from the higher core speed in that setup.

    Should this card be faster than 8800 Ultra SLI? That setup has more memory bandwidth, due to a wider memory interface and higher memory clocks, and more ROP power, due to having 2 sets of 24 ROP blocks. From reviews I have seen, 8800 GTS 512 SLI is about equal to 8800 GTX SLI, with GTX SLI winning as you get to higher resolutions.

    All things equal, I would expect this card to slot somewhere between the 8800 GTS 512 SLI setup and the 8800 GT SLI setup. For the most part that is the case: I see it is indeed faster than the 8800 GT SLI looking at the TechReport results; not too sure about the 8800 GTS 512 SLI.
    Reply
  • andylawcc - Wednesday, March 19, 2008 - link

    okay, thanks for explaining. I never fathom myself playing at resolutions beyond 1600x1200. It just costs too much to get "one notch" above, to 1920x1200, with the extra cost for the monitor and video card. Reply
  • Le Québécois - Tuesday, March 18, 2008 - link

    Having 2 8800 GTs in SLI, I know there are some games that either don't show any improvement from SLI (not a problem here) or become really unstable, to the point of being unplayable. One title that comes to mind now is Colin McRae: DiRT. I have to disable SLI every time I want to play it. Even some of the fixes suggested on the official nVidia forum don't work for me.

    Now I don't want tech support;)...

    I just want to know what would happen with a game like that and the 9800GX2 (or the 3870 X2, for that matter), since those cards' "sli" configuration can't be disabled since the driver doesn't see those cards as dual card configuration but rather one single card.
    Reply
  • Tephlon - Wednesday, March 19, 2008 - link

    Sorry to be kind of off topic by not answering your question specifically- I actually don't own either card and can't comment on their specific operation.
    But your concern is quite valid, in fact the SLI issues seem to become more complicated with the hybrid "two cards in one" solutions than that of a normal two-card SLI setup.


    "...since those cards "sli" configuration can't be disabled since the driver doesn't see those cards as dual card configuration but rather one single card."

    Unless I just missed it in the review, there's nothing to make me believe this is actually true. It's certainly not true for nvidia's previous dual-card, the 7950gx2. While the term 'single card' makes it sound as if it would actually operate as a single card, both the OS and the drivers see the 7950gx2 twice, as two cards. Within the drivers, you disable SLI the same as if there were two physical cards in the system, sli'd together. It RELIES on this to function at its full potential... or, in other words, it is literally nothing more than just two cards sli'd, only crammed into one slot. Again, there's nothing to make me believe this wouldn't be true for the 9800 GX2 as well.

    This is actually the reason why I can never see myself owning one of these cards, maybe even ever do SLI in general, again.
    One annoyance that I discovered from owning the 7950gx2 is that while it is 'technically' two cards strapped together by SLI, nvidia doesn't consider it as such. It uses SLI profiles, and relies heavily on driver tweaks and support to optimize its use with games, but if you try to join the SLI Club to get some support/feedback for your card, they tell you you aren't SLI and boot you out the door.
    Nearly every game that came out needed special driver tweaks and profiles to make the game actually run at all, not to mention work correctly using both GPUs. It turned into a "wait 6 months after a game's release for new drivers" type situation. I very seldom could make this $600 powerhouse run a game the way it should right out of the box, or even within the first several weeks of release.
    I spent the majority of my time at LANs tweaking SLI profiles to try to make a game run instead of playing it. Your investment starts to look really worthless really quickly when the guy next to you has a $200 value gamer card of the same generation and can run the game as well as you, because you're waiting on driver updates to make both cards work correctly together for the game.

    In my opinion, it became a situation of getting all the pains of SLI (tinkering/struggling with SLI profiles, etc.) with very few of its perks (blazing speed/high-end gamer respect and support).

    And while my performance seemed decent while the card was the newest, greatest thing, it started to fade the instant the 8 series landed. As new products/sli configurations are released, they become the priority, and my then slightly-dated but serious hunk of technology got thrown under the bus. I swear the support for a game as simple as WoW is worse now for the 7950gx2 than it ever was.
    The cards simply seem to become 'not a priority', to the point where you wonder where your 600 dollars went. This seems to be especially true for these complicated, highly driver/profile reliant 'dual cards'.

    I bought the 7950 GX2 because even though the reviews had phrases like "the card currently doesn't support that feature (but that will be fixed in a future driver)" (sound familiar?), it was scoring well and seemed like a solid, high-end product that would kick some games' asses, and for less than two 7800 GTs could do at the time (by 60 bucks or so).

    I think the 'buy one now and be able to throw a second one in in a year for less' theory is starting to be debunked. The cards' prices hardly ever drop far enough fast enough, and by the time you're ready for a second card, the new series is out with a single card that will outperform two of what you own now... and all without the pain of SLI issues.

    I dunno, for me, I think from now on I'll do the best true single card solution available for the time, and leave all the Dual gpu/sli issues to the people who enjoy configuring their machines more than playing on them.
    Reply
  • Donkey2008 - Tuesday, March 18, 2008 - link

    "hmmm by hooflung, 6 hours ago - If AMD is competitive at 1680x1050 and 1900x1200 for ~200+ dollars less would the conclusion have been less favorable and start to nitpick the sheer awkwardness of this card?"


    I agree, and if you look at the charts carefully (which most people don't), you'll see that the 3870 X2 is dead even with, if not better than, the GeForce 8800 Ultra (the card it is meant to compete against) in performance at every resolution. Yet the 3870 X2 is also hundreds of dollars cheaper and runs at a very similar power threshold.

    Although these tests are not about performance-per-dollar or who has the better value for high-level gaming, they are about who has the biggest schlong. I'll admit that.

    Anyway, there is no way you can love and admire computer hardware as much as frequent visitors to Anandtech without realizing the sheer majesty of different companies raising the bar with truly groundbreaking products (i.e. Intel's Pentium CPU, the original Nvidia GeForce, AMD's Athlon 64, ATI's 9700 Pro).

    Although a very impressive piece of work by Nvidia, this product is not one of them.

    PS - I own and use a 8800 GTS in my home system.
    Reply
  • karioskasra - Tuesday, March 18, 2008 - link

    Yes, they owned the market for a long time with it, but it's cost to build was high and it was an expensive part

    it's = its
    Reply
  • AcydRaine - Tuesday, March 18, 2008 - link

    Way to test on a system that nobody has. For sure to show typical results..... Reply
  • Spacecomber - Tuesday, March 18, 2008 - link

    The point of using skulltrail is that it allows both crossfire and sli to be tested on the same system. What better way to compare amd and nvidia multicard solutions?

    The goal of the article is to see how the video card solutions stack up against each other, not to determine what results you'll get in your particular system.
    Reply
  • Pemalite - Tuesday, March 18, 2008 - link

    [quote]AMD finally pulled out a wild card with the 3870 X2, and rather than putting their money into a large high cost chip, they combined two GPUs onto one board for their high end offering. Even though NVIDIA was first out of the gate with a single board dual GPU product (the 7950 GX2), we haven't seen a similar solution from their DX10 lineup until today.[/quote]

    I think that part of the article is a little bit... wrong.
    AMD-ATI were the first to release a single card, Dual GPU solution in the form of the RAGE Fury MAXX. (Released in 1999).
    Reply
  • araczynski - Tuesday, March 18, 2008 - link

    unfortunately at that price point i don't care how miraculous it is.

    looks like the 8800GTS (G92) will be in my box next. unless i hear they plan on releasing an affordable 9800gt/gts?
    Reply
  • Malory - Tuesday, March 18, 2008 - link

    I had read elsewhere that even for a single card, they had to enable SLI support via the Nvidia console for this card to work.

    Was that the case here? If so, I assume that means that games which have little or no support for SLI wouldn't really gain much from this 'single' card.
    Reply
  • iceveiled - Tuesday, March 18, 2008 - link

    Probably the reason why dual 8800 GTs weren't tested is that the mobo in the setup doesn't support SLI (it's an intel mobo)....
    Reply
  • chizow - Tuesday, March 18, 2008 - link

    It's Skulltrail, which does support SLI (they actually mention a GX2 SLI, i.e. Quad SLI, review upcoming). More likely NV put an embargo or warning on direct 8800GT/GTS comparisons so the spotlight didn't shift to artificial clock speed and driver discrepancies. After all, they do want to sell these abominations. ;) Reply
  • madgonad - Tuesday, March 18, 2008 - link

    You clearly aren't paying attention to the market. A lot of people who cling to their PC gaming experience would also like to move their PC into the living room so that they can experience the big screen + 5/7.1 surround sound like their console brethren. The new Hybrid power and graphics solutions will allow a HTPC to have one of these Beasts as a partner for the onboard graphics. When watching movies or viewing the internet, this beast will be off and not making heat or noise. But once Crysis comes on, so does the discrete video card and it is off to the races. I have been waiting for the market to mature so that I can build a PC that games well, holds all my movies, and TiVos my shows - all in one box. All that I am waiting for is a Bitstream solution for the HD audio - which are due in Q2. Reply
  • JarredWalton - Tuesday, March 18, 2008 - link

    That's true... but the HybridPower + SLI stuff isn't out yet, right? We need 790g or some such first. I also seem to recall NVIDIA saying that HybridPower would only work with *future* NVIDIA IGPs, not with current stuff. So until we have the necessary chipset, GPU, and drivers I for one would not even think of putting a 9800 GX2 into an HTPC. We also need better HDMI audio solutions.

    Anyway, we're not writing off HTPC... we're just saying that for the vast majority of HTPC users this isn't going to be the ideal GPU. As such, we focused on getting the gaming testing done for this article, and we can talk about the video aspects in a future article. Then again, there's not much to say: this card handles H.264 offload as well as the other G92 solutions, which is good enough for most HTPC users. Still need better HDMI audio, though.
    Reply
  • casanova99 - Tuesday, March 18, 2008 - link

    While this is most likely a G92 variant, this isn't really akin to an SLI 8800GT setup, as the 8800GT has 112 shaders and 56 texture units. This card has 256 (128 * 2) shaders and 128 (64 * 2) texture units.

    It seems to match more with a 8800GTS 512MB, but with an underclocked core and shaders, paired with faster memory.
    Reply
  • chizow - Tuesday, March 18, 2008 - link

    While this is true, it only perpetuates the performance myths Nvidia propagates with its misleading product differentiation. As has been shown time and time again, the differences in shaders/texture units within G92 have much less impact on performance than core clock and memory speeds. There are lots of relevant reviews with same-clocked 8800GT vs GTS cards performing nearly identically (FiringSquad has excellent comparisons), but you really need to look no further than the 9600GT to see how overstated specs like shader count are with current gen GPUs. If you dig enough you'll find the info you're looking for, like an 8800GT vs 8800GTS both at 650/1000 (shocker: the 9800GTX is expected to weigh in at 675/1100). Problem is, most reviews will take the artificial stock clock speeds of both and compare them, so 600/900 vs 650/1000, and then point to irrelevant core differences as the reason for the performance gap. Reply
  • hooflung - Tuesday, March 18, 2008 - link

    Well, I am really disappointed in this review. It seems almost geared towards being a marketing tool for Nvidia. So it might be geared towards HD resolutions, but what about the other resolutions? If AMD is competitive at 1680x1050 and 1900x1200 for ~$200+ less, would the conclusion have been less favorable and start to nitpick the sheer awkwardness of this card? Also, I find it disturbing that 9600GTs can do nearly what this thing can do, probably at less power (who knows - you didn't review power consumption like every other card review does), and cost half as much.

    To me, Nvidia is grasping at straws.
    Reply
  • Genx87 - Tuesday, March 18, 2008 - link

    Eh? Grasping at straws with a solution that at times is clearly faster than the competition? This is a pretty solid single card offering if you ask me. Is it for everybody? Not at all. High end uber cards never are. But it definitely took the crown back from AMD with authority.

    Reply
  • hooflung - Tuesday, March 18, 2008 - link

    A single slot solution that isn't much better than an SLI 9600GT setup at those highest of high resolutions. "Not for everyone" is the understatement of the year. Yes, I can see it is the single fastest on the block, but at what cost? Another drop in the hat of a mislabeled 8 series product, in a package about as illustrious as a cold sore.

    This card is a road bump. The article is written around the conclusion that Nvidia cannot afford, and we cannot afford, a next generation graphics processor right now. To me, it smacks of laziness. Furthermore, gone are the times of $600 graphics cards, I am afraid. I guess Nvidia employees get company gas cards so they don't pay $3.39 a gallon for gasoline.

    How does this card fare for THE MAJORITY of users, on 22" and 24" LCDs? I don't care about gaps at resolutions I need a 30" monitor or HDTV to play on.
    Reply
  • Genx87 - Tuesday, March 18, 2008 - link

    Sounds like you have a case of sour grapes. Don't get so hung up on AMD's failings. I know you guys wanted the X2 to trounce, or remain competitive with, this "bump". But you have to remember AMD has to undo years of mismanagement at the hands of ATI's management.

    $600 cards keep showing up because they sell. Nobody is forcing you to buy one.
    Reply
  • chizow - Tuesday, March 18, 2008 - link

    Heh, ya, he's posted similarly all over the video forums as well. Not sure what he's whining on about though; the GX2 is what everyone expected it TO BE based on already known and readily available 8800GT SLI benchmarks. Even though the core is closer to a G92 GTS with 128 SPs, the core and shader clocks are closer to the stock 8800GT.

    Pricing isn't far off either; it's about as much as TWO G92 GTS, slightly more than TWO G92 GT. But here's the kicker: you don't need SLI to get the benefits of SLI, just as you didn't need a CF board for CF with the 3870 X2. With an SLI board, you can use TWO of these cards for what amounts to QUAD SLI, which isn't an option with any other NV solution and is certainly much cheaper than the previous high-end multi-GPU solution, Tri-SLI with 8800 GTX/Ultra on a 680/780 board and a 1000W+ PSU.

    For those with SLI capable boards, of course it's more economical to go with 2x 8800GT or 9600GT, or even 8800GTS, in SLI. For those who have ATI/Intel boards, this offers the same thing the X2 did for NV board owners. For those with native SLI boards, this offers the highest possible configuration for either camp, but it's going to cost you accordingly. Sure, it's not cheap now, but high-end never is. Expect prices to fall, but if you buy now you're going to pay a premium, just as all early adopters do.
    Reply
  • Methusela - Tuesday, March 18, 2008 - link

    I don't see any power draw comparisons in the review. Isn't this important? What about heat and sound output? Reply
  • Genx87 - Tuesday, March 18, 2008 - link

    According to HardOCP, the 9800 GX2 draws 196 watts at idle and 365 at load. The 3870 X2 draws 151 idle and 381 at load. Reply
  • Griswold - Tuesday, March 18, 2008 - link

    Which part of "For this test we used a wattage meter plugged in at the wall that measures total system power" did you not understand? No, these cards do not suck that much power; it's the whole system that draws 365W and 381W at load. Reply
  • Methusela - Tuesday, March 18, 2008 - link

    Derek, I'm shocked you didn't include SLI 8800gt 512mb in the test. Isn't this essentially the same thing as what's inside the 9800gx2, but would cost a lot less? Reply
  • Deusfaux - Tuesday, March 18, 2008 - link

    No, it'd make more sense to test with GTS 512 SLI.

    Even more sense if they were underclocked to 600 mhz core and 1600 shader, and overclocked to 1100 mem, to match the gpus in this card.
    Reply
  • chizow - Tuesday, March 18, 2008 - link

    I agree an 8800GT SLI comparison would've made more sense, although there are the 9600GT SLI and 8800 GT benches in there to compare with single card performance. Hopefully it was just an oversight on Derek's part and not something sinister like an NV enforced embargo. After all, the 9800GX2 is simply 2 G92 cores at stock GT speeds in SLI. But NV has tried hard to keep consumers in the dark about product differentiation, and reviewers all seem willing to toe the line. Reply
  • DigitalFreak - Tuesday, March 18, 2008 - link

    Over at Hardocp, they compare the GX2 with 8800GT SLI and 8800GTS 512 SLI Reply
  • Genx87 - Tuesday, March 18, 2008 - link

    Wow, at load the 3870 X2 draws more power than this while delivering about 60-70% of the performance? Reply
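    (Taking the system-level load numbers quoted earlier in the thread, 365W for the GX2 rig vs 381W for the 3870 X2 rig, and assuming the low end of that 60-70% estimate, here is a rough, purely illustrative perf-per-watt comparison - keeping in mind these are whole-system draws, so the gap for the cards alone would look different:)

```python
# Crude performance-per-watt comparison from the figures quoted above.
# Caveat: 365W / 381W are total SYSTEM draws, not card-only draws.
def perf_per_watt(relative_perf, system_watts):
    return relative_perf / system_watts

gx2 = perf_per_watt(1.00, 365.0)  # 9800 GX2 taken as the 100% baseline
x2 = perf_per_watt(0.65, 381.0)   # 3870 X2 at an assumed ~65% of GX2 perf
print(round(x2 / gx2, 2))  # 0.62 -- worse on this (very rough) metric
```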
