The Card and The Test

This is one of the most unusual consumer-level graphics boards we have seen in quite a long time. While a few card makers have dropped two GPUs on one board before, this is the first product we have seen that uses PCIe switch technology to connect two independent PCIe devices through one slot. The 7950 GX2 is also the first consumer-level graphics add-in product we've had in our labs to be built from two separate PCBs.



If we take a close look at the card itself, we can see the PCIe connection between the two boards. This is an x8 PCIe connection, which saves a bit on board routing requirements and physical connector width. Even though PCIe is a serial bus and each additional lane only requires two more differential pairs (one in each direction), NVIDIA had a lot on their hands when designing this product. The first incarnation, the 7900 GX2 currently being used in Quad SLI systems, routes all 16 lanes to each graphics card, but that board is much larger than the 7950 GX2. It doesn't seem like cutting down the PCIe lanes alone would save so much board space, so it is likely that NVIDIA spent further time optimizing board features and layout.
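To put the x8 link in perspective, here is a quick back-of-the-envelope sketch of theoretical PCIe 1.x link bandwidth. The figures assume the standard 2.5 GT/s signaling rate with 8b/10b encoding, which works out to roughly 250 MB/s of usable bandwidth per lane per direction; these are illustrative ceiling numbers, not measurements of the 7950 GX2 itself.

```python
# Theoretical one-direction bandwidth of a PCIe 1.x link, assuming
# 2.5 GT/s signaling and 8b/10b encoding (~250 MB/s per lane).
PCIE1_MBPS_PER_LANE = 250

def link_bandwidth_mbps(lanes: int) -> int:
    """Theoretical bandwidth in MB/s, per direction, for a PCIe 1.x link."""
    return lanes * PCIE1_MBPS_PER_LANE

for lanes in (8, 16):
    print(f"x{lanes}: {link_bandwidth_mbps(lanes)} MB/s per direction")
```

So the inter-board link gives up half the theoretical slot bandwidth (2 GB/s vs. 4 GB/s each way), which is rarely a bottleneck for a single GPU of this era but explains the trade-off NVIDIA made for routing simplicity.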



The board does have an open SLI connector. Unfortunately, at this time, we have not been able to test DIY Quad SLI. NVIDIA has made it clear that they won't try to stop people from building their own Quad SLI systems, but they also won't actively support such activity. This is in line with their current stance on overclocking. At the moment, NVIDIA tells us that there are many roadblocks to configuring a working Quad SLI system with the 7950 GX2.

On top of BIOS issues with a single PCIe bridge in one x16 slot, supporting multiple bridges and four graphics cards might be beyond the capability of most motherboards at this point. Again, the physical ability is there, but BIOS support may still be lacking. Driver support is also a problem, as the drivers that support the 7950 GX2 do not support Quad SLI and vice versa. NVIDIA assures us that driver support should eventually come along, but until then we will be working on hacking our way around these issues.

As for multiple display capabilities, the 7950 GX2 is just like single-GPU boards. There are only two DVI outputs on board, both of which support dual-link bandwidth and are driven by only one of the GPUs. NVIDIA has confirmed that it is technically possible to provide four dual-link DVI outputs on one 7950 GX2 board, but it doesn't look like any board makers are going down that route. Additional driver support may also be necessary to make this happen, and demand might never get high enough to entice anyone to actually build a 7950 GX2 with all the necessary components. Still, it is nice to know that the only thing stopping someone from going down the quad-output route is the cost of the connectors.

As we are still working on getting the 7950 GX2 set up in SLI with a second card, we will be focusing on a comparison with other single card solutions. The exception, of course, will be 7900 GT SLI. The specifications of the 7950 GX2 indicate that it should perform very similarly to a 7900 GT SLI setup without the hassle. Each GPU on the 7950 GX2 is essentially a higher clocked 7900 GT. Additionally, the 7950 GX2 incorporates 512MB of 1200MHz (effective data rate) GDDR3 per GPU for a total of 1GB of on-board RAM. Here is a quick reference table NVIDIA provided showing the differences between the 7950 GX2, the 7900 GTX, and the 7900 GT. We do take some issue with reporting memory bandwidth, fill rate, and verts/sec as simple aggregates of the two GPUs' capabilities, as these quantities don't scale perfectly linearly in an SLI setup or on the 7950 GX2, but the data is still interesting.
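For reference, the aggregate memory bandwidth figure in NVIDIA's table is a simple doubling of the per-GPU number. The sketch below works that out, assuming each GPU sits on the same 256-bit memory bus as the other 7900-series parts and runs at the 1200MHz effective data rate quoted above; as noted, real workloads won't see perfectly linear scaling from the second GPU.

```python
# Back-of-the-envelope GDDR3 bandwidth per GPU on the 7950 GX2.
# Assumes a 256-bit bus per GPU (as on the 7900 GT/GTX) and the
# 1200 MHz effective data rate from the spec sheet.
BUS_WIDTH_BITS = 256
EFFECTIVE_MHZ = 1200

def bandwidth_gbps(bus_bits: int, effective_mhz: int) -> float:
    """Theoretical bandwidth in GB/s: bytes per transfer * transfers per second."""
    return bus_bits / 8 * effective_mhz * 1e6 / 1e9

per_gpu = bandwidth_gbps(BUS_WIDTH_BITS, EFFECTIVE_MHZ)
print(f"per GPU:   {per_gpu:.1f} GB/s")
print(f"aggregate: {2 * per_gpu:.1f} GB/s")
```

This gives 38.4 GB/s per GPU, or 76.8 GB/s as the marketing aggregate, which is the kind of simple doubling we take issue with above.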



For the rest of our comparisons, we will be looking at the solutions shown here:

AMD Athlon 64 FX-57
ASUS A8N32 SLI Deluxe NVIDIA nForce 4 X16 Motherboard
2GB 3:3:2:8 OCZ PC4000 EB
Seagate 7200.7 160GB HD
700W GameXStream PSU

NVIDIA GeForce 7950 GX2
NVIDIA GeForce 7900 GTX
NVIDIA GeForce 7800 GTX 512
NVIDIA GeForce 7900 GT (and SLI)
ATI Radeon X1900 XT
ATI Radeon X1900 GT

We will lead off with a side-by-side comparison of the 7950 GX2 and 7900 GT SLI, and then we will take a look at how the 7950 GX2 stacks up against other high-end parts. Before we get to that, here's a quick look at power.

Idle Power


Load Power


Our power tests at idle show that the Radeon X1900 XT takes the top spot, but the tables turn when we fire up Splinter Cell. Under load, the 7950 GX2 drops in behind the X1900 XT in power consumption. This difference between the NVIDIA and ATI high end would be even more pronounced if we tested the X1900 XTX, which expends even more energy for only a slight performance gain. While the GeForce 7950 GX2 draws more power than its NVIDIA brethren, we are still looking at a manageable power draw (especially when considering the incredibly high power requirements of 7900 GTX SLI or X1900 XT CrossFire in comparison).

Comments

  • kilkennycat - Monday, June 5, 2006 - link

    Just to reinforce another poster's comments. Oblivion is now the yardstick for truly sweating a high-performance PC system. A comparison of a single GX2 vs dual 7900GT in SLI would be very interesting indeed, since Oblivion pushes up against the 256Meg graphics memory limit of the 7900GT (with or without SLI), and will exceed it if some of the 'oblivion.ini' parameters are tweaked for more realistic graphics in outdoor environments, especially in combo with some of the user-created texture-enhancements mods.
  • Crassus - Monday, June 5, 2006 - link

    That was actually my first thought and the reason I read the article ... "How will it run Oblivion?". I hope you'll find the time to add some graphs for Oblivion. Thanks.
  • TiberiusKane - Monday, June 5, 2006 - link

    Nice article. Some insanely rich gamers may want to compare the absolute high-end, so they may have wanted to see 1900XT in Crossfire. It'd help with the comparison of value.
  • George Powell - Monday, June 5, 2006 - link

    Didn't the ATI Rage Fury Maxx post-date the Obsidian X24 card?

    Also on another point its a pity that there are no Oblivion benchmarks for this card.
  • Spoelie - Monday, June 5, 2006 - link

    Didn't the voodoo5 post date that one as well? ^^
  • Myrandex - Monday, June 5, 2006 - link

    For some reason page 1 and 2 worked for me, but when I tried 3 or higher no page would load and I received a "Cannot find server" error message.
  • JarredWalton - Monday, June 5, 2006 - link

    We had some server issues which are resolved now. The graphs were initially broken on a few charts (all values were 0.0) and so the article was taken down until the problem could be corrected.
  • ncage - Monday, June 5, 2006 - link

    This is very cool, but a better idea would be if nvidia used the socket concept where you can change out the VPU just like you can a cpu. So you could buy a card with only one VPU and then add another one later if you needed it....
  • BlvdKing - Monday, June 5, 2006 - link

    Isn't that what PCI-Express is? Think of a graphics card like a slot 1 or slot A CPU back in the old days. A graphics card is a GPU with its own cache on the same PCB. If we were to plug a GPU into the motherboard, then it would have to use system memory (slow) or use memory soldered onto the motherboard (not updatable). The socket idea for GPUs doesn't make sense.
  • DerekWilson - Monday, June 5, 2006 - link

    actually this isn't exactly what PCIe is ...

    but it is exactly what HTX will be with AMD's Torrenza and coherent HT links from the GPU to the processor. The CPU and the GPU will be able to work much more closely together with this technology.
