The 9800 GX2 Inside, Out and Quad

The most noticeable thing about the card is that it looks like a single PCB with a dual-slot HSF solution. Appearances are deceiving, though: on closer inspection, it is clear that there are really two PCBs hidden inside the black box that is the 9800 GX2. This is quite unlike the 3870 X2, which puts two GPUs on the same PCB, but it isn't quite the same as the 7950 GX2 either.

The special sauce on this card is the fact that the cooling solution is sandwiched between the GPUs. Having the GPUs actually face each other is definitely interesting, as it helps make the look of the solution quite a bit more polished than the 7950 GX2 (and let's face it, for $600+ you expect the thing to at least look like it has some value).

NVIDIA also opted not to put both display outputs on one PCB as it did with its previous design. The word on why: it is easier for layout and cooling. This adds an unexpected twist in that the DVI connectors are oriented in opposite directions. Not really a plus or a minus, but it's just a bit different. Moving from ISA to PCI was a bit awkward with everything turned upside down, and now we've got one of each orientation on the same piece of hardware.

On the inside, the GPUs are connected via PCIe 1.0 lanes in spite of the fact that the GPUs support PCIe 2.0. This is likely another case where cost-benefit analysis led the way and upgrading to PCIe 2.0 didn't offer any real benefit.

Because the 9800 GX2 is G9x based, it also features all the PureVideo enhancements contained in the 9600 GT and the 8800 GT. We've already talked about these features, but the short list is the inclusion of some dynamic image enhancement techniques (dynamic contrast and color enhancement), and the ability to hardware accelerate the decode of multiple video streams in order to assist in playing movies with picture in picture features.

We will definitely test these features out, but this card is certainly not aimed at the HTPC user. For now we'll focus on the purpose of this card: gaming performance. It is worth mentioning, though, that the cards coming out at launch are likely to all be reference design based, and thus they will all include an internal SPDIF connector and an HDMI output. Let's hope that next time NVIDIA puts these features on cards that really need them.

Which brings us to Quad SLI. Yes, the beast has reared its ugly head once again. And this time around, under Vista (Windows XP is still limited by a 3 frame render ahead), Quad SLI will be able to implement a 4 frame AFR mode for some blazing fast speed in certain games. Unfortunately, we can't bring you numbers today, but when we can we will absolutely pit it against AMD's CrossFireX. We do expect to see similarities with CrossFireX in that it won't scale quite as well when we move from 3 to 4 GPUs.


Once again, we are fortunate to have access to an Intel D5400XS board on which we can compare SLI to CrossFire on the same platform. While 4-way solutions are novel, they certainly are not for everyone, especially when the pair of cards costs between $1200 and $1300. But we are certainly interested in discovering just how much worse price / performance gets when you plug two 9800 GX2 cards into the same box.

It is also important to note that these cards come with hefty power requirements, and using a PCIe 2.0 power supply is a must. Unlike the AMD solutions, it is not possible to run the 9800 GX2 with a 6-pin PCIe power connector in the 8-pin PCIe 2.0 socket. NVIDIA recommends a 580W PSU with PCIe 2.0 support for a system with a single 9800 GX2. For Quad SLI, they recommend 850W+ PSUs.

NVIDIA notes that some PSU makers have built their connectors a little out of spec so the fit is tight. They say that some card makers or PSU vendors will be offering adapters but that future power supply revisions should meet the specifications better.

As this is a power hungry beast, NVIDIA is including HybridPower support for the 9800 GX2 when paired with a motherboard that features NVIDIA integrated graphics. This will allow normal usage of the system to run at relatively low power by turning off the 9800 GX2 (or both cards if you have a Quad SLI setup), and should save quite a bit on your power bill. We don't have a platform to test the power savings in our graphics lab right now, but it should be interesting to see just how big an impact this has.

49 Comments


  • chizow - Tuesday, March 18, 2008 - link

    Heh ya he's posted similarly all over the video forums as well. Not sure what he's whining on about though, the GX2 is what everyone expected it TO BE based on already known and readily available 8800GT SLI benchmarks. Even though the core is closer to a G92 GTS with 128 SP, the core and shader clocks are closer to the stock 8800GT.

Pricing isn't far off either; it's about 2x as much as TWO G92 GTS, slightly more than TWO G92 GT. But here's the kicker, you don't need SLI to get the benefits of SLI, just as you didn't need a CF board for CF with the 3870 X2. With an SLI board, you can use TWO of these cards for what amounts to QUAD SLI which isn't an option with any other NV solution and certainly much cheaper than the previous high-end multi-GPU solution, Tri-SLI with 8800 GTX/Ultra on a 680/780 board with a 1000W+ PSU.

For those with SLI capable boards, ofc it's more economical to go with 2x 8800GT or 9600GT or even 8800GTS in SLI. For those who have ATI/Intel boards this offers the same thing the X2 did for NV board owners. For those with native SLI boards this offers the highest possible configuration for either camp but it's going to cost you accordingly. Sure it's not cheap now, but high-end never is. Expect prices to fall but if you buy now you're going to pay a premium, just as all early adopters do.
    Reply
  • Methusela - Tuesday, March 18, 2008 - link

    I don't see any power draw comparisons in the review. Isn't this important? What about heat and sound output? Reply
  • Genx87 - Tuesday, March 18, 2008 - link

According to HardOCP the 9800GX2 draws 196 watts at idle and 365 at load. The 3870X2 draws 151 idle and 381 at load. Reply
  • Griswold - Tuesday, March 18, 2008 - link

Which part of "For this test we used a wattage meter plugged in at the wall that measures total system power" did you not understand? No, these cards do not suck that much power; it's the whole system that draws 365W and 381W at load. Reply
  • Methusela - Tuesday, March 18, 2008 - link

    Derek, I'm shocked you didn't include SLI 8800gt 512mb in the test. Isn't this essentially the same thing as what's inside the 9800gx2, but would cost a lot less? Reply
  • Deusfaux - Tuesday, March 18, 2008 - link

    No, it'd make more sense to test with GTS 512 SLI.

    Even more sense if they were underclocked to 600 mhz core and 1600 shader, and overclocked to 1100 mem, to match the gpus in this card.
    Reply
  • chizow - Tuesday, March 18, 2008 - link

I agree an 8800GT SLI comparison would've made more sense, although there are 9600GT SLI and 8800 GT benches in there to compare with single card performance. Hopefully it was just an oversight on Derek's part and not something sinister like some NV enforced embargo. After all, the 9800GX2 is simply 2 G92 cores at stock GT speeds in SLI. But NV has tried hard to keep consumers in the dark about product differentiation and reviewers all seem willing to toe the line. Reply
  • DigitalFreak - Tuesday, March 18, 2008 - link

    Over at Hardocp, they compare the GX2 with 8800GT SLI and 8800GTS 512 SLI Reply
  • Genx87 - Tuesday, March 18, 2008 - link

    Wow at load the 3870x2 draws more power than this while delivering about 60-70% of the performance? Reply
