The Technology

There is a significant distinction to be made between NVIDIA's implementation of multi-GPU on a single card and previous attempts. In the past, solutions that drop two GPUs on one PCB (printed circuit board) have relied on the ability of an SLI motherboard to split a single physical x16 PCIe connection into two x8 data paths. While this approach works, it is not optimal for bringing multi-GPU performance to the masses. Requiring a chipset that allows dynamic PCIe lane configuration, and thereby restricting NVIDIA-based graphics boards to motherboards built on NVIDIA core logic, really cuts down on the potential market.

With its first in-house multi-GPU design, NVIDIA has lifted the requirement for an SLI chipset and enabled the use of its 7950 GX2 on any motherboard with an x16 PCIe slot (provided the manufacturer has proper BIOS support, but more on that later). This chipset-agnostic implementation works by incorporating a PCIe switch which acts as a bridge between the system's x16 interface and the two GPUs. Because of the way PCIe works, the operating system sees the two graphics cards as if they were independent parts. You can think of this as being similar to connecting a USB hub to a single USB port in order to plug in multiple devices; only in this case, the devices and switch are all in one neat little package.

The PCIe switch itself is a 48-lane device, capable of routing traffic from any of its three x16 connections to either of the other two, depending on the intended destination. On the 7900 GX2, NVIDIA takes full advantage of this, but on the 7950 GX2 only 8 lanes are routed from the switch to each GPU. The end result is that the work the chipset would otherwise have had to do, the 7950 GX2 handles on board.
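As a rough illustration of the lane budget described above (a toy model only, not NVIDIA's actual switch design or firmware), the 48-lane switch can be thought of as three ports drawing from a shared pool of lanes:

```python
# Toy model of a 48-lane PCIe switch with three ports: one upstream
# port facing the motherboard's x16 slot and two downstream ports,
# one per GPU. Purely illustrative; the class and names are invented.

class PCIeSwitch:
    TOTAL_LANES = 48
    PORTS = ("upstream", "gpu0", "gpu1")

    def __init__(self, lanes_per_port):
        # lanes_per_port maps each port name to its configured width
        if set(lanes_per_port) != set(self.PORTS):
            raise ValueError("must configure all three ports")
        if sum(lanes_per_port.values()) > self.TOTAL_LANES:
            raise ValueError("lane budget exceeded")
        self.lanes = dict(lanes_per_port)

    def lanes_in_use(self):
        return sum(self.lanes.values())

# 7950 GX2 configuration as the article describes it: a full x16 link
# to the host, but only x8 routed to each GPU (32 of 48 lanes used).
gx2 = PCIeSwitch({"upstream": 16, "gpu0": 8, "gpu1": 8})
print(gx2.lanes_in_use())  # 32

# A configuration using the switch's full capability would route x16
# to every port, consuming all 48 lanes.
full = PCIeSwitch({"upstream": 16, "gpu0": 16, "gpu1": 16})
print(full.lanes_in_use())  # 48
```

The point of the sketch is simply that the 7950 GX2 leaves lane capacity on the table, while the host still sees a single x16 device in the slot.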

We mentioned BIOS compatibility, which can be a potential problem. The reason we could see some issues here is that, while PCI Express switches are perfectly valid and useful devices, we haven't seen any real commercial attempt that takes advantage of them on an add-in board. Combine this with the fact that many motherboard makers only recognize graphics hardware in their x16 PCIe slots, and we end up with some wrinkles which need to be smoothed. The system BIOS must be able to handle finding a PCIe switch, and furthermore it must be able to recognize that a graphics card is beyond the switch in order to load the video BIOS.

NVIDIA has been working hard with the rest of the industry to help get BIOS updates ready and available for launch. The list is relatively long at this point, and we can confirm that the 7950 GX2 will actually run in many ATI based motherboards right now with the proper BIOS update. Inevitably, there will be some systems which will not run the 7950 GX2 at launch. Just how large a problem this is remains to be seen, but we can't place too much of the blame on NVIDIA's shoulders. Motherboard makers do need to support more than just graphics devices in their x16 slots, and the proper handling of PCIe switches is important as well. It just so happens that NVIDIA has become the catalyst for vendors to roll out support for this type of device. While we do worry about some customers being left out in the cold, often this is the price of admission to the high-tech bleeding edge of computing. To be safe, we strongly recommend interested buyers confirm that their motherboard has proper support before purchasing.

This is also the first NVIDIA product line that will fully and completely support HDCP over DVI. This means that, when combined with a monitor or TV that also supports HDCP over DVI, content which requires HDCP will play without any problems. While the entire lineup of NVIDIA and ATI GPUs has been capable of supporting HDCP, no full product line has actually implemented the required solution.

The reason this is a first is due to the requirements of HDCP. Not only must the hardware be capable of transmitting HDCP-protected content, but it must also provide a vendor-specific key. These keys are only provided to vendors after paying a hefty fee. Until now, with the lack of protected content and compatible display devices, graphics board makers have not wanted to shell out the cash for HDCP keys. The keys are stored on a chip that must be integrated onto the graphics card, so even though older GPUs are capable of HDCP, the absence of this key chip means those cards cannot support the feature.

While we could take a few thousand words here to editorialize on the wastefulness of content "protection" in consumer markets, we'll keep our thoughts brief. Real pirates will always find a way to make their money by selling stolen content. Cost and technical barriers are not sufficient deterrents to people who make their living through illegal distribution of content. If it can be seen or heard in a decrypted format, it will always be possible to copy. Until it becomes mandatory for everyone to have decryption hardware and a private key implanted into their brains, media designed for mass distribution can never really be fully protected from copying. Content protection is a flaming pit into which an industry terrified of change is demanding hardware designers, programmers and governments toss as much money as possible.

That being said, the inclusion of HDCP support on the 7950 GX2 is a good thing. There's no reason to make it more difficult on the end user who just wants to watch or listen to the media they paid for. If content providers are going to go down this route either way, then it is certainly better to be prepared. While we have not spoken with every vendor, NVIDIA assures us that every 7950 GX2 will have HDCP key hardware onboard.

Comments

  • JarredWalton - Monday, June 5, 2006 - link

    Yes, SLI profiles are used for full utilization of the GX2 card. (AFAIK - Derek can correct me if I'm wrong.)
  • DerekWilson - Monday, June 5, 2006 - link

    SLI profiles are used if available, but SLI profiles are never required to enable multi-GPU support on NVIDIA hardware.

    there are some advanced options for enabling multi-GPU or single-GPU rendering in the control panel -- even down to the AFR or SFR mode type (and SLIAA modes as a fallback if nothing else will work for you).

    in short -- required: no, used: yes.
  • araczynski - Monday, June 5, 2006 - link

    haven't read the article yet as I didn't see reference to Oblivion benchmarks, and let's be honest, that's the only game out these days that's worth benchmarking (in terms of actually giving the high end cards an actual workout).
  • DigitalFreak - Monday, June 5, 2006 - link

    It's amazing all the cool stuff you can do with PCI Express.
  • Sniderhouse - Monday, June 5, 2006 - link

    quote:

    Not since Quantum3D introduced the Obsidian X24 have we seen such a beast (which, interestingly enough, did actual Scan Line Interleaving on a single card).


    The Voodoo5 5500 had two GPUs on a single card which did true SLI, not to mention the Voodoo5 6000 which had four GPUs, but never really made it to market.
  • shabby - Tuesday, June 6, 2006 - link

    The X24 was also a dual PCB video card, that's what he meant. Not dual chip or whatever.
  • timmiser - Monday, June 5, 2006 - link

    Exactly what I was thinking!
  • DerekWilson - Monday, June 5, 2006 - link

    Perhaps I should have said successful products ... or products that were available in any real quantity :-)
  • photoguy99 - Monday, June 5, 2006 - link

    From page 1, what limitations are being referred to?

    quote:

    At lower resolutions, CPU overhead becomes a factor, and some limitations of DX9 come into play
  • Ryan Smith - Monday, June 5, 2006 - link

    DX9 itself has a good deal of overhead in some situations, something Microsoft is changing for DX10. We'll have more on that in our upcoming Vista article later this week.
