Original Link: http://www.anandtech.com/show/2483

For quite a while now, the 8800 GTX and 8800 Ultra have been the fastest single GPU cards around. Although no faster single GPU solution has been introduced since, it is only recently that the rest of the lineup has become compelling on either the NVIDIA or AMD front. Aiming high is a good thing for those who can afford it, but until the technology makes its way into cheaper products, most of us won't see the benefit.

It costs quite a bit of money to develop and produce single GPU solutions of ever increasing die size and complexity. It's a problem of engineering rather than science: yes, faster hardware could be built, but it doesn't matter how fast your product is if the people who are interested can't afford it. There are trade-offs and diminishing returns to consider when designing hardware, and production cost and market value always have something to say about what type of performance a company will be able to target with a given product.

NVIDIA's G80 is a huge chip. Yes, they owned the market for a long time with it, but its cost to build was high and it was an expensive part for end users to own as well. AMD finally pulled out a wild card with the 3870 X2, and rather than putting their money into a large high cost chip, they combined two GPUs onto one board for their high end offering. Sure, NVIDIA had a single board dual GPU product a couple years back (the 7950 GX2) - and ATI tried that as well back in the Rage MAXX days - but we haven't seen a similar solution from their DX10 lineup until today.

With G9x coming in as a glorified die shrink of G80, NVIDIA took the opportunity to move away from huge die sizes and shift to the cheaper option of combining two GPUs on a single board for its highest end part. It is less expensive to make use of two chips, even if their combined size is larger than a monolithic design, because yields are so much better: NVIDIA gets more chips per wafer, and a higher percentage of those chips will be good compared to a large design.

Of course, in spite of being cheaper to produce, the increased performance of this solution over the previous high end has earned the 9800 GX2 a pretty hefty price premium. At a retail price of at least $600 US, these bad boys will not be making their way into everyone's systems. There is always a price for having the best of the best.

As we mentioned, NVIDIA has done single card dual GPU in the past. But this board is different from both the 7950 GX2 and its current competitor, the 3870 X2. Let's take a look at the board and see just what the differences are.

The 9800 GX2 Inside, Out and Quad

The most noticeable thing about the card is the fact that it looks like a single PCB with a dual slot HSF solution. Appearances are quite deceiving though, as on further inspection, it is clear that there are really two PCBs hidden inside the black box that is the 9800 GX2. This is quite unlike the 3870 X2, which puts two GPUs on the same PCB, but it isn't quite the same as the 7950 GX2 either.

The special sauce on this card is the fact that the cooling solution is sandwiched between the GPUs. Having the GPUs actually face each other is definitely interesting, as it helps make the look of the solution quite a bit more polished than the 7950 GX2 (and let's face it, for $600+ you expect the thing to at least look like it has some value).

NVIDIA also opted not to put both display outputs on one PCB as it did with its previous design. The word on why: it is easier for layout and cooling. This adds an unexpected twist in that the DVI connectors are oriented in opposite directions. Not really a plus or a minus, but it's just a bit different. Moving from ISA to PCI was a bit awkward with everything turned upside down, and now we've got one of each orientation on the same piece of hardware.



On the inside, the GPUs are connected via PCIe 1.0 lanes in spite of the fact that the GPUs support PCIe 2.0. This is likely another case where cost benefit analysis led the way and upgrading to PCIe 2.0 didn't offer any real benefit.

Because the 9800 GX2 is G9x based, it also features all the PureVideo enhancements contained in the 9600 GT and the 8800 GT. We've already talked about these features, but the short list is the inclusion of some dynamic image enhancement techniques (dynamic contrast and color enhancement), and the ability to hardware accelerate the decode of multiple video streams in order to assist in playing movies with picture in picture features.

We will definitely test these features out, but this card is certainly not aimed at the HTPC user. For now we'll focus on the purpose of this card: gaming performance. It is worth mentioning, though, that the cards coming out at launch are likely to all be reference design based, and thus they will all include an internal SPDIF connector and an HDMI output. Let's hope that next time NVIDIA puts these features on cards that really need them.

Which brings us to Quad SLI. Yes, the beast has reared its ugly head once again. And this time around, under Vista (Windows XP is still limited by a 3 frame render ahead), Quad SLI will be able to implement a 4 frame AFR mode for some blazing fast speed in certain games. Unfortunately, we can't bring you numbers today, but when we can we will absolutely pit it against AMD's CrossFireX. We do expect to see similarities with CrossFireX in that it won't scale quite as well when we move from 3 to 4 GPUs.



Once again, we are fortunate to have access to an Intel D5400XS board on which we can compare SLI to CrossFire on the same platform. While 4-way solutions are novel, they certainly are not for everyone, especially when the pair of cards costs between $1200 and $1300. But we are certainly interested in discovering just how much worse price / performance gets when you plug two 9800 GX2 cards into the same box.

It is also important to note that these cards come with hefty power requirements, and using a PCIe 2.0 power supply is a must. Unlike the AMD solutions, it is not possible to run the 9800 GX2 with a 6-pin PCIe power connector in the 8-pin PCIe 2.0 socket. NVIDIA recommends a 580W PSU with PCIe 2.0 support for a system with a single 9800 GX2. For Quad, they recommend 850W+ PSUs.

NVIDIA notes that some PSU makers have built their connectors a little out of spec so the fit is tight. They say that some card makers or PSU vendors will be offering adapters but that future power supply revisions should meet the specifications better.

As this is a power hungry beast, NVIDIA is including its HybridPower support for 9800 GX2 when paired with a motherboard that features NVIDIA integrated graphics. This will allow normal usage of the system to run on relatively low power by turning off the 9800 GX2 (or both if you have Quad set up), and should save quite a bit on your power bill. We don't have a platform to test the power savings in our graphics lab right now, but it should be interesting to see just how big an impact this has.

Technical Specs and The Test

First off we'll break down the technical specifications of the card. Of course, this part has two GPUs on it, so in order to get an idea of what each of the GPUs under the hood of the 9800 GX2 have in them, just divide most of these numbers by two. If the same number of transistors were shoved into a single piece of silicon, performance would be much higher (and it would cost a ton more and heat a small village).



Clearly the GPUs on this card are not a huge leap forward. Putting the two together is what makes this card what it is.

Our test setup is the same Skulltrail system we used in other recent graphics hardware reviews. Remember that isolating the graphics subsystem is important, as removing the CPU as a bottleneck gives us a better indication of the differences in performance between graphics cards. This time, we are also lucky in that the top of the line graphics hardware is meant to be paired with a top of the line system. Skulltrail fits the bill here, though NVIDIA would recommend the 790i in this case.

We would agree that a more gaming oriented board would be a better fit for most, even at the high end as the extra CPU processing power is only going to make a real difference in niche situations. In our case, the ability to run CrossFire and SLI in the same system trumps everything else.

Test Setup
CPU 2x Intel Core 2 Extreme QX9775 @ 3.20GHz
Motherboard Intel D5400XS (Skulltrail)
Video Cards ATI Radeon HD 3870 X2
ATI Radeon HD 3870
NVIDIA GeForce 9600 GT 512MB
NVIDIA GeForce 8800 GT 512MB
NVIDIA GeForce 8800 Ultra
NVIDIA GeForce 9800 GX2
Video Drivers Catalyst 8.3
ForceWare 174.53
Hard Drive Seagate 7200.9 120GB 8MB 7200RPM
RAM 2x Micron 2GB FB-DIMM DDR2-800
Operating System Windows Vista Ultimate 64-bit SP1

Call of Duty 4: Modern Warfare Performance

Version: 1.4

Settings: All Highest Quality

For this benchmark, we use FRAPS to measure average frame rate during the opening cut-scene of the game. We start FRAPS as soon as the screen clears in the helicopter and we stop it right as the captain grabs his head gear.
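To make the "average frame rate" we quote concrete, here is a minimal sketch of how such a number falls out of a per-frame capture. The log format and numbers are hypothetical stand-ins, not FRAPS's actual output; the only assumption is that the tool records a timestamp (in milliseconds) for every frame rendered during the capture window:

```python
# Hypothetical sketch: given per-frame timestamps in milliseconds, the
# average frame rate is frames elapsed divided by time elapsed.
def average_fps(timestamps_ms):
    """Average frame rate from a list of frame timestamps in milliseconds."""
    frames = len(timestamps_ms) - 1                      # frame intervals
    elapsed_s = (timestamps_ms[-1] - timestamps_ms[0]) / 1000.0
    return frames / elapsed_s

# Synthetic example: 600 frames rendered over 10 seconds -> 60.0 fps.
timestamps = [i * (10000.0 / 600) for i in range(601)]
print(round(average_fps(timestamps), 1))
```

Averaging over the whole cut-scene this way smooths out momentary dips, which is why we also keep the start and stop points identical between cards.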

This test shows the 9800 GX2 to be out in front of the competition. Well ahead of the Ultra, the 9800 GX2 nudges out the 9600 GT SLI solution for the lead in this benchmark.

Call of Duty 4


While the 9600 GT SLI setup does have an advantage below 2560x1600, those who are spending this much money will want to be playing at the highest resolution possible. Thus we give the nod to the 9800 GX2. On a budget, though, a pair of 9600 GT cards is a couple hundred dollars cheaper than the 9800 GX2, so it's nice to see the competition. The value factor is definitely not on the side of the 9800 GX2.


When we enable 4xAA, the 9600 GT SLI's slight advantage at lower resolutions disappears, and the 9800 GX2 still leads at the high end. The three way 3870 solution is much further behind here.

Call of Duty 4



Crysis Performance

Version: 1.2

Settings: High Quality with Shaders set to Very High

For this test, we recorded our own demo using the record and demo console commands. Each test was run three times, and we took the highest score of the three (usually the second and third runs were the same or very nearly so). Our recorded demo consisted of a 20 second run through the woods in the level "rescue" and we verified the performance of our timedemo using FRAPS. The run was near the beginning of the level and we stayed clear of enemies in order to reduce the impact of AI on our graphics benchmark.
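The best-of-three protocol described above can be sketched as a tiny harness. The callable and the fps numbers below are purely hypothetical stand-ins for an actual timedemo invocation; the point is simply running the demo a fixed number of times and keeping the highest score, since the first (cold) run tends to pay for level and shader caching while later runs settle to the same number:

```python
# Hypothetical best-of-n benchmark harness (not our actual tooling):
# run a benchmark callable n times and report the highest fps seen.
def best_of_n(run_benchmark, n=3):
    """Run the benchmark n times; return (best score, all scores)."""
    scores = [run_benchmark() for _ in range(n)]
    return max(scores), scores

# Stand-in results: the first run is slower, runs two and three agree.
results = iter([24.1, 25.8, 25.8])
best, all_runs = best_of_n(lambda: next(results))
print(best)  # 25.8
```

Taking the maximum rather than the mean keeps the one-time caching cost of the first run from dragging down a number that is meant to measure the graphics subsystem.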

This isn’t one that even the almighty 9800 GX2 can handle at 2560x1600. While we could play with Very High settings at lower resolutions, we found this quality setup to be the sweet spot for 1920x1200 gaming, which was quite nice indeed.

The 9800 GX2 solidly dominates the competition here. The 8800 Ultra and 9600 GT SLI setup are way behind before we’re CPU limited. We also left out the 3-way CrossFire setup on this title, as we saw no scaling from a 2-way setup.

It is important to note here that we were more limited on our Skulltrail system at lower resolutions than on the 790i board. It is possible to get performance over 45fps on a different platform with these settings, but we wanted to keep our numbers coming from a system where the hardware could be directly compared.

Crysis Performance



The Elder Scrolls IV: Oblivion Performance

Version: 1.2.0416 Shivering Isles

Settings: Ultra High Quality settings defaults with vsync disabled

Our Oblivion test takes place in the south of the Shivering Isles, running through woods over rolling hills toward a lake. This is a straight line run that lasts around 20 seconds and uses FRAPS to record framerate. This benchmark is very repeatable, but the first run is the most consistent between cards, so we only run the benchmark once through and take that number.

Under Oblivion, AMD still shows some promise. The three way 3870 solution leads the way without AA enabled. At the same time, the 9800 GX2 clearly has single card performance locked up. With the 9800 GX2 not costing that much more than the three way 3870 solution, it’s good to see AMD being competitive in at least one benchmark.

Oblivion Performance

With AA enabled, the gap closes a bit. The three way AMD solution still has an advantage at the high end where it counts though. The very high end solutions are playable at 2560x1600 with everything turned up and AA/AF enabled as well.

Oblivion Performance

Enemy Territory: Quake Wars Performance

Version: 1.4

Settings: Everything maxed out without AA, Soft particles enabled

For this benchmark, we created a new timedemo based on multiplayer action in the island level, as our old timedemo no longer works after the 1.4 update. This timedemo is about 10000 frames long and covers a lot of ground, so many aspects of gameplay are incorporated. We run it with the timenetdemo command and take the reported result. This is our only OpenGL benchmark.

This game does become CPU bound pretty quickly on higher end solutions, but looking at 2560x1600, we clearly see that the 9800 GX2 has a commanding lead. The three way 3870 solution wasn’t included here because CrossFireX does not yet support OpenGL games (but that will be fixed in a future driver).

Enemy Territory Quake Wars Performance



S.T.A.L.K.E.R. Performance

Version: 1.0005

Settings: full dynamic lighting, everything maxed without AA and no grass shadows

For this test, we walk in a straight line for about 30 seconds and use FRAPS to measure performance. We use the same save game every time and the path doesn't change. Our performance measurements are very consistent between runs. We do two runs and take the second.

While the 9800 GX2 once again leads the pack, the three way AMD solution did mount some competition here. It wasn’t enough, especially at the high end, where the 9600 GT SLI solution once again finished second to the 9800 GX2.

S.T.A.L.K.E.R. Performance

World in Conflict Performance

Version: 1.005

Settings: Very High quality with AA disabled

We tested this game using the built in benchmark feature of the game. In our experience this does a good job of testing the different graphical scenarios that can be encountered in the game.

With World in Conflict, we saw more of a system limitation than we would have expected from our Skulltrail system. On different platforms at lower resolutions, the 9800 GX2 was able to perform much better. But when everything is plugged into Skulltrail, this is what we get. We aren’t sure yet what the source of this issue is, but we also saw similar scaling problems in Crysis with lower quality settings than we used. We will continue to look into this matter, of course.

Our funky system limitation, however, didn’t apply at 2560x1600, and we were able to see that we can get excellent performance from our 9800 GX2 with very high quality settings under World in Conflict. On another platform, this difference would be even larger.

World in Conflict Performance

Final Words

Once again NVIDIA sets itself on top of the graphics card performance pile. AMD is able to keep up with 3x (or more) 3870 cards in Oblivion, but everywhere else it's the 9800 GX2 on top. As far as single card solutions go, the 9800 GX2 is currently the alpha dog.

Like the 3870 X2, we didn't have any trouble installing the driver and getting things rolling. The experience was smooth and clean as far as that was concerned. We did run into some problems that seem like they might be Skulltrail specific at this point. The latest BIOS and 174.53 driver from NVIDIA solve a graphics driver failure when playing Crysis, and we saw some strange scaling at lower resolutions, but we don't expect most users to run into those problems.

Despite the fact that this card does lead the rest of the field in performance, its price tag will be a limiting factor. There are advantages to having the fastest card around of course, and we expect that NVIDIA will use this card to position itself as the best option in computer graphics. Certainly they are the best option when you have deep pockets, but savvy gamers will still pay close attention to price / performance and overall value. It can be fun to explore what is possible with the best of the best, but at the end of the day you have to come home to whatever is in your own box.

AMD has been fighting back, first with strong offerings in the midrange and then with stronger price cuts. While it is clear AMD can't compete directly at the high end, they are still capable of competing for gamers' dollars. Just because another company has the top card out there doesn't mean that all of their other parts gain some mystic value. While it still isn't relevant for games yet and we can't even test the performance of it, AMD's hardware supports DX10.1 and NVIDIA is still lacking in that area. The built-in HD audio device that outputs sound over the DVI port when the AMD HDMI adapter is attached is incredibly convenient (though both vendors are lacking in how well they support audio over HDMI).

The point is that you can't judge a book by its cover or a graphics card by some other part that is much faster. Take each case as it comes and make a decision based on what is best at that price point at that time. If your price point happens to be $600 to $650 per card, by all means, pick up a 9800 GX2. If not, make sure you do your homework.
