The Technology

There is a significant distinction to be made between NVIDIA's implementation of multi-GPU on a single card and previous attempts. In the past, solutions that drop two GPUs on one PCB (printed circuit board) have relied on the ability of an SLI motherboard to configure a single physical x16 PCIe connection as two x8 data paths. While this approach works, it is not optimal for bringing multi-GPU performance to the masses. Requiring a chipset that allows dynamic PCIe lane configuration, and thereby restricting NVIDIA graphics boards to motherboards built on NVIDIA core logic, really cuts down on the potential market.

With its first in-house multi-GPU design, NVIDIA has lifted the requirement for an SLI chipset and enabled the use of the 7950 GX2 on any motherboard with an x16 PCIe slot (provided the manufacturer has proper BIOS support, but more on that later). This chipset-agnostic implementation works by incorporating a PCIe switch which acts as a bridge between the system's x16 interface and the two GPUs. Because of the way PCIe works, the operating system sees the two graphics cards as if they were independent parts. You can think of this as similar to connecting a USB hub to a single USB port in order to plug in multiple devices; only in this case, the devices and the switch are all in one neat little package.
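To make the analogy concrete, here is a minimal sketch (assuming a Linux system, purely for illustration) that walks the kernel's PCI device list. On a board built this way, the switch shows up as an ordinary PCI-to-PCI bridge (class 0x0604) with two separate VGA controllers (class 0x0300) enumerated behind it:

```python
# Minimal sketch (assumes Linux): list PCI devices via sysfs and flag
# bridges and VGA controllers. A dual-GPU board with an on-board switch
# appears as one bridge plus two independent VGA devices behind it.
import os

SYSFS_PCI = "/sys/bus/pci/devices"

def read_attr(dev, attr):
    with open(os.path.join(SYSFS_PCI, dev, attr)) as f:
        return f.read().strip()

for dev in sorted(os.listdir(SYSFS_PCI)):
    cls = int(read_attr(dev, "class"), 16) >> 8  # drop the prog-if byte
    if cls == 0x0604:
        print(f"{dev}: PCI-to-PCI bridge (e.g. an on-board switch port)")
    elif cls == 0x0300:
        print(f"{dev}: VGA controller (one of the GPUs)")
```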

The PCIe switch itself is a 48 lane device, capable of routing each of its three x16 connections to either of the other two depending on a packet's intended destination. On the 7900 GX2, NVIDIA takes full advantage of this, but for the 7950 GX2, only 8 lanes are routed from the switch to each GPU. The end result is that the lane switching the chipset would otherwise have had to manage is moved onto the board itself.
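For a rough picture of the lane budget, here is a hypothetical tally based purely on the configuration described above (the port widths are our reading of that description, not vendor documentation):

```python
# Hypothetical lane tally for the 7950 GX2's switch as described above:
# one x16 upstream port to the motherboard slot, and an x8 downstream
# port per GPU, out of 48 total lanes on the switch.
SWITCH_LANES = 48

ports_7950_gx2 = {"upstream (slot)": 16, "GPU 0": 8, "GPU 1": 8}

used = sum(ports_7950_gx2.values())
for name, lanes in ports_7950_gx2.items():
    print(f"{name}: x{lanes}")
print(f"lanes used: {used} of {SWITCH_LANES} "
      f"({SWITCH_LANES - used} unrouted in this configuration)")
```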

We mentioned BIOS compatibility, which can be a potential problem. The reason we could see some issues here is that, while PCI Express switches are perfectly valid and useful devices, we haven't seen any real commercial attempt to take advantage of them on an add-in board. Combine this with the fact that many motherboard makers only expect graphics hardware in their x16 PCIe slots, and we end up with some wrinkles which need to be smoothed. The system BIOS must be able to handle finding a PCIe switch, and furthermore it must be able to recognize that a graphics card sits beyond the switch in order to load the video BIOS.
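For the curious, the enumeration step in question is plain PCI mechanics. A rough sketch (assuming Linux, for illustration only) of reading a bridge's bus-number registers from the standard type-1 config-space offsets shows the kind of walk a BIOS must perform to discover a GPU living behind the switch:

```python
# Rough sketch (assumes Linux): read a bridge's primary/secondary/
# subordinate bus numbers from standard config-space offsets 0x18-0x1A.
# Firmware does essentially this during enumeration before it can find
# a graphics device on the bus behind a switch.
import sys

def bridge_buses(bdf):  # bdf like "0000:01:00.0"
    with open(f"/sys/bus/pci/devices/{bdf}/config", "rb") as f:
        cfg = f.read(64)                    # standard header portion
    if cfg[0x0E] & 0x7F != 1:               # header type 1 = PCI-to-PCI bridge
        raise ValueError(f"{bdf} is not a bridge")
    return cfg[0x18], cfg[0x19], cfg[0x1A]  # primary, secondary, subordinate

if __name__ == "__main__":
    pri, sec, sub = bridge_buses(sys.argv[1])
    print(f"primary bus {pri}; devices behind the bridge live on "
          f"buses {sec}..{sub}")
```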

NVIDIA has been working hard with the rest of the industry to help get BIOS updates ready and available for launch. The list is relatively long at this point, and we can confirm that the 7950 GX2 will actually run in many ATI based motherboards right now with the proper BIOS update. Inevitably, there will be some systems which will not run the 7950 GX2 at launch. Just how large a problem this is remains to be seen, but we can't place too much of the blame on NVIDIA's shoulders. Motherboard makers do need to support more than just graphics devices in their x16 slots, and the proper handling of PCIe switches is important as well. It just so happens that NVIDIA has become the catalyst for vendors to roll out support for this type of device. While we do worry about some customers being left out in the cold, often this is the price of admission to the high-tech bleeding edge of computing. To be safe, we strongly recommend interested buyers confirm that their motherboard has proper support before purchasing.

This is also the first NVIDIA product line that fully supports HDCP over DVI. This means that, when combined with a monitor or TV that also supports HDCP over DVI, content which requires HDCP will play without any problem. While the entire lineup of NVIDIA and ATI GPUs has been capable of supporting HDCP, no full product line has actually implemented the required solution.

The reason this is a first comes down to the requirements of HDCP. Not only must the hardware be capable of transmitting HDCP protected content, but it must also provide a vendor-specific key. Vendors receive these keys only after paying a hefty fee. Until now, with the lack of protected content and compatible display devices, graphics board makers have not wanted to shell out the cash for HDCP keys. The keys are stored on a chip that must be integrated onto the graphics card, so even though older GPUs have the potential for HDCP, the lack of the key chip means those cards cannot support the feature.
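To illustrate why licensed, device-specific key material is central to the scheme, here is a toy sketch of the HDCP 1.x key agreement. In HDCP 1.x each device holds 40 secret 56-bit keys plus a public 40-bit KSV with exactly twenty 1 bits; each side sums its own secret keys at the bit positions set in the other side's KSV, and the licensor's symmetric master matrix guarantees both sums match. Everything below is a random stand-in we generated for illustration; real key material comes only from the licensing body.

```python
# Toy model of HDCP 1.x key agreement. All values are random stand-ins.
import random

MASK56 = (1 << 56) - 1

def random_ksv():
    """A KSV is 40 bits with exactly twenty 1s (toy version of the rule)."""
    ksv = 0
    for b in random.sample(range(40), 20):
        ksv |= 1 << b
    return ksv

def derive_private_keys(master, own_ksv):
    """Licensor's step: key i is the sum (mod 2^56) of master-matrix row i
    over the columns selected by the device's own KSV bits."""
    return [sum(master[i][j] for j in range(40) if own_ksv >> j & 1) & MASK56
            for i in range(40)]

def shared_key(private_keys, other_ksv):
    """Device's step: sum your own secret keys at the bit positions set
    in the peer's public KSV (mod 2^56)."""
    return sum(private_keys[i] for i in range(40) if other_ksv >> i & 1) & MASK56

# A symmetric secret matrix (M[i][j] == M[j][i]) makes both sums agree.
master = [[0] * 40 for _ in range(40)]
for i in range(40):
    for j in range(i, 40):
        master[i][j] = master[j][i] = random.getrandbits(56)

aksv, bksv = random_ksv(), random_ksv()
a_keys = derive_private_keys(master, aksv)
b_keys = derive_private_keys(master, bksv)
assert shared_key(a_keys, bksv) == shared_key(b_keys, aksv)  # same Km
```

The takeaway is that a device without a licensed key array simply cannot complete this handshake, which is why the key chip has to be designed onto the board.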

While we could spend a few thousand words here editorializing on the wastefulness of content "protection" in consumer markets, we'll keep our thoughts brief. Real pirates will always find a way to make their money by selling stolen content. Cost and technical barriers are not sufficient deterrents to people who make their living through the illegal distribution of content. If it can be seen or heard in a decrypted format, it can always be copied. Short of mandatory decryption hardware with a private key for everyone implanted into our brains, media designed for mass distribution can never really be fully protected from copying. Content protection is a flaming pit into which an industry terrified of change is demanding that hardware designers, programmers and governments toss as much money as possible.

That being said, the inclusion of HDCP support on the 7950 GX2 is a good thing. There's no reason to make it more difficult for the end user who just wants to watch or listen to the media they paid for. If content providers are going to go down this route either way, then it is certainly better to be prepared. While we have not spoken with every vendor, NVIDIA assures us that every 7950 GX2 will have HDCP key hardware onboard.

Comments

  • kilkennycat - Monday, June 5, 2006 - link

    Just to reinforce another poster's comments. Oblivion is now the yardstick for truly sweating a high-performance PC system. A comparison of a single GX2 vs dual 7900GT in SLI would be very interesting indeed, since Oblivion pushes up against the 256Meg graphics memory limit of the 7900GT (with or without SLI), and will exceed it if some of the 'oblivion.ini' parameters are tweaked for more realistic graphics in outdoor environments, especially in combo with some of the user-created texture-enhancements mods.
  • Crassus - Monday, June 5, 2006 - link

    That was actually my first thought and the reason I read the article ... "How will it run Oblivion?". I hope you'll find the time to add some graphs for Oblivion. Thanks.
  • TiberiusKane - Monday, June 5, 2006 - link

    Nice article. Some insanely rich gamers may want to compare the absolute high-end, so they may have wanted to see 1900XT in Crossfire. It'd help with the comparison of value.
  • George Powell - Monday, June 5, 2006 - link

    Didn't the ATI Rage Fury Maxx post-date the Obsidian X24 card?

    Also, on another point, it's a pity that there are no Oblivion benchmarks for this card.
  • Spoelie - Monday, June 5, 2006 - link

    Didn't the voodoo5 post-date that one as well? ^^
  • Myrandex - Monday, June 5, 2006 - link

    For some reason page 1 and 2 worked for me, but when I tried 3 or higher no page would load and I received a "Cannot find server" error message.
  • JarredWalton - Monday, June 5, 2006 - link

    We had some server issues which are resolved now. The graphs were initially broken on a few charts (all values were 0.0) and so the article was taken down until the problem could be corrected.
  • ncage - Monday, June 5, 2006 - link

    This is very cool, but a better idea would be if nvidia used the socket concept where you can change out the VPU just like you can a CPU. Then you could buy a card with only one VPU and add another one later if you needed it....
  • BlvdKing - Monday, June 5, 2006 - link

    Isn't that what PCI-Express is? Think of a graphics card like a slot 1 or slot A CPU back in the old days. A graphics card is a GPU with its own cache on the same PCB. If we were to plug a GPU into the motherboard, then it would have to use system memory (slow) or memory soldered onto the motherboard (not upgradable). The socket idea for GPUs doesn't make sense.
  • DerekWilson - Monday, June 5, 2006 - link

    actually this isn't exactly what PCIe is ...

    but it is exactly what HTX will be with AMD's Torrenza and coherent HT links from the GPU to the processor. The CPU and the GPU will be able to work much more closely together with this technology.
