Introduction

Today marks the launch of the first GPU-maker-sanctioned single-card, multi-GPU solution for the consumer market in quite some time. Not since Quantum3D introduced the Obsidian X24 have we seen such a beast (which, interestingly enough, did actual Scan Line Interleaving on a single card). This time around, NVIDIA's flavor of SLI and PCIe are used to connect two boards into a full-featured multi-GPU solution that behaves like a single card as far as the end user is concerned. No special motherboard is required, the upcoming 90 series driver will support the card, and there is future potential for DIY quad SLI. There is still a ways to go until NVIDIA releases drivers that support quad SLI without the help of a system vendor, but they are working on it.

For now, we will take a look at the card and its intended use: a card using a single PCIe connection designed to be the fastest NVIDIA graphics board available. While some of SLI's drawbacks still apply to the 7950 GX2 (some games scale better than others), the major issues are quite nicely resolved: there is no need for an SLI motherboard, and it's much easier to make sure everything is hooked up correctly (with only one power connector, no SLI bridge needed, and only one card to plug in). The drivers start up and automatically configure support for multi-GPU rendering, and (after our motherboard's BIOS was flashed) we had no problem with the system recognizing the new technology.

While quad SLI is technically possible, its usefulness is still fairly limited - only users with ultra-high resolution monitors will see the benefit of four GPUs. At lower resolutions, CPU overhead becomes a factor, and some limitations of DX9 come into play. We certainly want to test quad SLI on the 7950 GX2, but we will have to wait until we get the equipment together and track down a driver that supports it. In this article, we will compare the 7950 GX2 with other high-end NVIDIA and ATI cards, and we'll also take a look at how well it scales compared to its close relatives: the 7900 GT and 7900 GT SLI. But before we get to the benchmarks, let's take a look at how NVIDIA puts it all together in a way that avoids the need for an SLI motherboard or an external power supply.

The Technology
60 Comments

  • kilkennycat - Monday, June 05, 2006 - link

    Just to reinforce another poster's comments: Oblivion is now the yardstick for truly sweating a high-performance PC system. A comparison of a single GX2 vs. dual 7900 GTs in SLI would be very interesting indeed, since Oblivion pushes up against the 256MB graphics memory limit of the 7900 GT (with or without SLI), and will exceed it if some of the 'oblivion.ini' parameters are tweaked for more realistic graphics in outdoor environments, especially in combination with some of the user-created texture-enhancement mods.
  • Crassus - Monday, June 05, 2006 - link

    That was actually my first thought and the reason I read the article ... "How will it run Oblivion?". I hope you'll find the time to add some graphs for Oblivion. Thanks.
  • TiberiusKane - Monday, June 05, 2006 - link

    Nice article. Some insanely rich gamers may want to compare the absolute high end, so they may have wanted to see the X1900 XT in CrossFire. It'd help with the comparison of value.
  • George Powell - Monday, June 05, 2006 - link

    Didn't the ATI Rage Fury Maxx postdate the Obsidian X24 card?

    Also, on another note, it's a pity that there are no Oblivion benchmarks for this card.
  • Spoelie - Monday, June 05, 2006 - link

    Didn't the Voodoo5 postdate that one as well? ^^
  • Myrandex - Monday, June 05, 2006 - link

    For some reason page 1 and 2 worked for me, but when I tried 3 or higher no page would load and I received a "Cannot find server" error message.
  • JarredWalton - Monday, June 05, 2006 - link

    We had some server issues which are resolved now. The graphs were initially broken on a few charts (all values were 0.0) and so the article was taken down until the problem could be corrected.
  • ncage - Monday, June 05, 2006 - link

    This is very cool, but a better idea would be for NVIDIA to use a socket concept where you can swap out the VPU just like you can a CPU. Then you could buy a card with only one VPU and add another one later if you needed it....
  • BlvdKing - Monday, June 05, 2006 - link

    Isn't that what PCI-Express is? Think of a graphics card like a Slot 1 or Slot A CPU back in the old days. A graphics card is a GPU with its own memory on the same PCB. If we were to plug a GPU into the motherboard, it would have to use system memory (slow) or memory soldered onto the motherboard (not upgradable). The socket idea for GPUs doesn't make sense.
  • DerekWilson - Monday, June 05, 2006 - link

    actually this isn't exactly what PCIe is ...

    but it is exactly what HTX will be with AMD's Torrenza and coherent HT links from the GPU to the processor. The CPU and the GPU will be able to work much more closely together with this technology.
