For quite a while now, the 8800 GTX and 8800 Ultra have been the fastest single-GPU cards around. Although no faster single-GPU solution has been introduced, it is only recently that the rest of the lineup has become compelling on either the NVIDIA or AMD front. Aiming high is a good thing for those who can afford it, but until the technology makes its way into cheaper products, most of us won't see the benefit.

It costs quite a bit of money to develop and produce single-GPU solutions of ever-increasing die size and complexity. It's a problem of engineering rather than science: yes, faster hardware could be built, but it doesn't matter how fast your product is if the people interested in it can't afford it. There are trade-offs and diminishing returns to consider when designing hardware, and production cost and market value always have something to say about the level of performance a company can target with a given product.

NVIDIA's G80 is a huge chip. Yes, NVIDIA owned the market with it for a long time, but it was expensive to build and an expensive part for end users to own as well. AMD finally pulled out a wild card with the 3870 X2: rather than putting its money into a large, high-cost chip, it combined two GPUs on one board for its high-end offering. Sure, NVIDIA had a single-board dual-GPU product a couple of years back (the 7950 GX2), and ATI tried the same thing back in the Rage MAXX days, but we haven't seen a similar solution in NVIDIA's DX10 lineup until today.

With G9x coming in as a glorified die shrink of G80, NVIDIA took the opportunity to move away from huge die sizes and shift to the cheaper option of combining two GPUs on a single board for its highest-end part. Using two chips is less expensive even when their combined area is larger than that of a monolithic design, because yields are so much better: NVIDIA gets more chips per wafer, and a higher percentage of those will be good compared to a large design.
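The yield argument can be illustrated with a toy calculation. This is only a rough sketch: the die areas, wafer diameter, and defect density below are made-up illustrative numbers (not G80 or G9x specifics), the dies-per-wafer estimate ignores edge loss, and the simple exponential yield model is just one common approximation of how defects scale with die area.

```python
import math

WAFER_DIAMETER_MM = 300   # illustrative 300mm wafer
DEFECTS_PER_CM2 = 0.5     # illustrative defect density

def good_dies_per_wafer(die_area_mm2: float) -> float:
    """Estimate usable dies: dies that fit on the wafer times the
    fraction expected to be defect-free (simple exponential model)."""
    wafer_area = math.pi * (WAFER_DIAMETER_MM / 2) ** 2
    dies = wafer_area / die_area_mm2                      # ignores edge loss
    yield_frac = math.exp(-DEFECTS_PER_CM2 * die_area_mm2 / 100)
    return dies * yield_frac

# One hypothetical 480 mm^2 monolithic die vs. two 280 mm^2 dies per board.
mono = good_dies_per_wafer(480)
dual = good_dies_per_wafer(280) / 2   # two good chips needed per board
print(f"boards per wafer: monolithic={mono:.0f}, dual-GPU={dual:.0f}")
```

Even though the two smaller dies add up to more total silicon per board, the dual-GPU approach yields more sellable boards per wafer in this sketch, because yield falls off sharply as die area grows.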

Of course, despite being cheaper to produce, this solution's performance advantage over the previous high end has earned the 9800 GX2 a pretty hefty price premium. At a retail price of at least $600 US, these bad boys will not be making their way into everyone's systems. There is always a price for having the best of the best.

As we mentioned, NVIDIA has done single-card dual-GPU in the past. But this board is different from both the 7950 GX2 and its current competitor, the 3870 X2. Let's take a look at the board and see just what the differences are.

The 9800 GX2 Inside, Out and Quad

  • archerprimemd - Tuesday, April 01, 2008 - link

    I apologize for the noob question; I just didn't know where to look for the info:

    dual gpu single card or single gpu dual cards (meaning, in SLI)

    which is better?

    also, isn't having the 2 gpus in one card sort of like doing an SLI?
    Reply
  • Tephlon - Friday, April 04, 2008 - link

    "dual gpu single card or single gpu dual cards (meaning, in SLI)
    which is better?"

    To be honest, this seems to vary based on timing and pricing.
    For instance, back in the 7 series, I bought two 7800GTs and SLI'd them. About a week later, the 7950GX2 became available. It offered similar (if not better in some cases) performance to the two 7800GTs, so I returned the GTs and got the GX2.
    But at the same time... the 7800 Ultras were available... and two of those in SLI were better than the GX2... but for nearly twice the money.
    Again, this might vary generation to generation, so YMMV.

    "also, isn't having the 2 gpus in one card sort of like doing an SLI?"

    The short answer is yes. I actually posted about this in more detail just a few pages ago, so for more on the subject see http://www.anandtech.com/video/showdoc.aspx?i=3266...
    Reply
  • Tephlon - Friday, April 04, 2008 - link

    oops, sorry.

    I meant to say see my post at http://www.anandtech.com/video/showdoc.aspx?i=3266...

    It's about halfway down the comments page, or you can just search 'Tephlon'.
    Reply
  • SlingXShot - Wednesday, March 26, 2008 - link

    You know 3dfx tried this SLI madness and put 16 chips on one board... and you know they failed. These products are not attractive to the average Joe, and the only people who care are the ones who build new computers for a living. It is not good practice putting 4 video cards together. People want new designs, etc. Reply
  • Ravensong - Friday, March 21, 2008 - link

    Ok, here's what I don't get, and I hope someone can clarify this for me. In the article "ATI Radeon HD 3870 X2: 2 GPUs 1 Card, A Return to the High End" the CoD4 benchmarks running at 1920x1200 HQ settings, 0x AA/16x AF give a result of 107.3 fps, yet this article's benchmark shows a result of 53.8 for 1920x1200. When I saw this I yelled out like Lil Jon: "WHAAT??" How did the frames drop this much? Perhaps the new 8.3 drivers are killing performance? This seems to be the case with every benchmark other than Crysis, which received a minor increase from the 8.3 drivers. I'm not a fanboy for ATI/AMD by any means, but I hardly see these scores as fair when just a few video articles ago this thing was doing well and then all of a sudden it has piss-poor performance when the GX2 launches. Reading this site on a daily basis, I figured that the weird drop in performance would have been noted?? Not sure if anyone else noticed this, but I surely did right away. I know I've had nothing but headaches atm with 8.3 and trying to get the 4 3850's I bought running in CrossFire X. Thankfully that's just my secondary rig; if it had been my main I may have smashed it into pieces by now :D Reply
  • TheDudeLasse - Friday, March 21, 2008 - link

    It's gotta be Catalyst 8.3.
    The scores I'm getting with 8.2 are 70% better.


    Reply
  • Ravensong - Friday, March 21, 2008 - link

    Definitely. There's no other explanation as to why the scores are so horrid compared to only a month and a half ago when the original benches debuted. I wish all the sites using 8.3 would correct this injustice!! lol... HardOCP went as far as saying "The Radeon HD 3870 X2 gets utterly and properly owned, this is true “pwnage” on the highest level." ... just wow. :D Reply
  • Ravensong - Saturday, March 22, 2008 - link

    Any comments on this dilemma Sir Wilson?? (referring to the author) :D Reply
  • TheDudeLasse - Friday, March 21, 2008 - link

    I think you may have had some driver issues with the 3870X2.

    I'm running a Q6600 @ 3.4GHz and a 3870X2.

    I've been running the same benchmarks as you describe, and the results are completely different.

    For instance, the Call of Duty benchmark results vary by over 70%.
    I ran the same benchmark: "We start FRAPS as soon as the screen clears in the helicopter and we stop it right as the captain grabs his head gear."

    Example:
    1920x1200, 4xAA and 16xAF. Your result: 42.3 fps average.
    My result: 76.056 fps average.
    That's an over 75% improvement on your score.
    What's the deal? Screwed-up Catalyst drivers or what?


    Reply
  • 7Enigma - Thursday, March 20, 2008 - link

    Derek,

    I see you have not answered the requests regarding why 8800 GT and 8800 GTS SLI were not included in these benchmarks. I can understand if you were not allowed to due to some NVIDIA NDA, and why you might not be able to talk about it.

    If you could please reply with a :) if this is the case, we would greatly appreciate it. Otherwise it looks like there is a gaping hole in this review.

    Thank you.
    Reply
