Original Link: http://www.anandtech.com/show/1526

While the world turned, I was on a flight over to Taiwan to meet and discuss future products with just about every Taiwanese motherboard and video card manufacturer I could get a meeting with. The discussions yielded a great deal of important information, such as roadmap updates, a better understanding of some of the current supply shortages and some insight into how the markets in Taiwan and globally were holding up. While I'll cover most of these topics in a separate article, I couldn't resist posting information on a very interesting product I managed to get some "alone-time" with while in Taiwan.

Just a few weeks ago our own Wesley Fink and I traveled to NYC to meet with NVIDIA and, more importantly, to get some first hand experience with nForce4 and nForce4 SLI platforms. As you may know from our previous coverage of the topic, nForce4 SLI is the highest-end nForce4 offering, outfitted with a configurable number of PCI Express lanes. The beauty of a configurable number of PCI Express lanes is that you can have a single PCI Express x16 slot, or you can split that one slot into two x8 slots - perfect for installing two graphics cards.

NVIDIA is less than a month away from sending final shipping nForce4 SLI boards out to reviewers, but we managed to get some quality benchmarking time with a pre-release nForce4 SLI board from MSI. The important thing to note here is that it was pre-release and we had a very limited amount of time with it - not to mention that I'm about halfway around the world from my testing equipment and benchmarks, so forgive me if the number of tests or benchmarks is not as complete as you're used to seeing on AnandTech.

There will be two versions of the MSI nForce4 SLI board shipping worldwide; in the US it will be called the MSI K8N Neo4 Platinum/SLI but in the rest of the world it will be called the MSI K8N Diamond. There will be some slight changes in the specs between the two but nothing major.


The MSI motherboard we tested is actually the very first working sample of the K8N Neo4 Platinum/SLI; in fact, as of right now there are only 4 working nForce4 SLI samples at MSI in Taiwan, two of which happen to be in my hotel room. Despite the early nature of the motherboard, it was 100% stable and didn't crash once during our hours of testing nor in the 12 hours of burn-in before that. There were some rendering issues during some of the testing but we'd chalk that up to drivers that need some work; one thing to keep in mind is that SLI is extremely driver intensive and we'll explain why in a moment. Please be sure to read our nForce4 review and SLI preview before continuing on with this review to understand what's behind nForce4 and SLI.

We did not have time to run a full gamut of benchmarks, so all of our tests are limited to 1024 x 768, 1280 x 1024 and 1600 x 1200 with 4X AA enabled. We tested using an Athlon 64 FX-55 with 1GB of Corsair DDR400 under Windows XP Professional with DX9c. Finding game benchmarks was a bit of a challenge in Taiwan, but despite the Chinese boxes our copies of Doom 3 and Far Cry were basically the English versions. We also included the Counter-Strike: Source Visual Stress Test in our impromptu test suite. But before we get to the benchmarks, let's talk a little bit about how you actually get SLI working.

Setting up SLI

NVIDIA's nForce4 SLI reference design calls for a slot to be placed on the motherboard that controls how many PCI Express lanes go to the second x16 slot. Remember that despite the fact that there are two x16 slots on the motherboard, there are still only 16 total lanes allocated to them at most - meaning that each slot is electrically only an x8, but with a physical x16 connector. While an x8 bus connection means that the slots have less bandwidth than a full x16 implementation, the real world performance impact is absolutely nothing. In fact, gaming performance doesn't really change down to even an x4 configuration; even the performance impact of an x1 configuration is negligible.
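To put the x8 vs. x16 distinction in concrete terms: first-generation PCI Express signals at 2.5 GT/s per lane with 8b/10b encoding, leaving 250 MB/s of usable bandwidth per lane in each direction. A quick back-of-the-envelope sketch (our own arithmetic, not vendor figures):

```python
# PCIe 1.x: 2.5 GT/s per lane, 8b/10b encoding -> 2.0 Gbit/s
# (250 MB/s) of usable bandwidth per lane, per direction.
MB_PER_LANE = 250

def pcie_bandwidth_mb(lanes: int) -> int:
    """Usable one-direction bandwidth in MB/s for a PCIe 1.x link."""
    return lanes * MB_PER_LANE

for lanes in (16, 8, 4, 1):
    print(f"x{lanes}: {pcie_bandwidth_mb(lanes)} MB/s")
```

Even the x8 links in SLI mode leave 2 GB/s in each direction - more than today's games actually push across the bus, which is why the narrower link costs nothing in practice.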


The SLI card slot looks much like a SO-DIMM connector:


The card itself has two ways of being inserted; if installed in one direction, the card will configure the PCI Express lanes so that only one of the slots is an x16. In the other direction, the 16 PCI Express lanes are split evenly between the two x16 slots. You can run a single graphics card in either mode, but in order to run a pair of cards in SLI mode you need the latter configuration. There are ways around NVIDIA's card-based design to reconfigure the PCI Express lanes, but none of them to date are as elegant, as they require a long row of jumpers.


With two cards installed, a bridge PCB is used to connect the golden fingers atop both of the cards. Only GeForce 6600GT and higher cards will feature the SLI-enabling golden fingers, although we hypothesize that nothing has been done to disable SLI on the lower-end GPUs other than a non-accommodating PCB layout. With a little bit of engineering effort, we believe that the video card manufacturers could come up with a board design to enable SLI on both the 6200 and the 6600 non-GT. Although we've talked to manufacturers about doing this, we'll have to wait and see what the results of their experiments are.

As far as board requirements go, the main thing to make sure of is that both of your GPUs are identical. While clock speeds don't have to be the same, NVIDIA's driver will set the clocks on both boards to the lowest common denominator. Combining different GPU types (e.g. a 6600GT and a 6800GT) is not recommended; while doing so may still be allowed, it can produce some rather strange results in certain cases.
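The clock-matching behavior amounts to taking the minimum of each clock across the pair. A minimal sketch of that rule, with hypothetical clock values chosen purely for illustration:

```python
def sli_effective_clocks(card_a: dict, card_b: dict) -> dict:
    """Both boards run at the lowest common clocks of the pair."""
    return {key: min(card_a[key], card_b[key]) for key in card_a}

# Hypothetical factory-overclocked card paired with a stock one:
stock = {"core_mhz": 500, "mem_mhz": 1000}
oc    = {"core_mhz": 525, "mem_mhz": 1050}
print(sli_effective_clocks(stock, oc))  # both cards run at the stock clocks
```

The practical upshot: pairing an overclocked board with a stock one buys you nothing over two stock boards.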

You only need to connect a monitor to the first PCI Express card; despite the fact that you have two graphics cards, only the video outputs on the first card will work, so anyone wanting to have a quad-display and SLI is somewhat out of luck. I say somewhat because if you toggle off SLI mode (a driver option), the two cards work independently and you can have a 4-head display configuration. But with SLI mode enabled, the outputs on the second card go blank. While that's not too inconvenient, currently you need to reboot between SLI mode changes in software, which could get annoying for those who only want to enable SLI while in games and use 4-display outputs while not gaming.

We used a beta version of NVIDIA's 66.75 drivers with SLI support enabled for our benchmarks. The 66.75 driver includes a configuration panel for Multi-GPU as you can see below:

Clicking the check box requires a restart to enable (or disable) SLI, but after you've rebooted everything is good to go.

We mentioned before that the driver is very important to SLI performance. The reason is that NVIDIA has implemented several SLI algorithms into the driver to determine how to split up the rendering between the graphics cards depending on the application and load. For example, in some games it may make sense for one card to handle a certain percentage of the screen and the other card to handle the remaining percentage, while in others it may make sense for each card to render a separate frame. The driver will alternate between these algorithms, or even disable SLI altogether, depending on the game. The other important thing to remember is that the driver is also responsible for the rendering split between the GPUs; each GPU rendering 50% of the scene doesn't always work out to an evenly split workload, so the driver has to estimate what rendering ratio would put an equal load on both GPUs.
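As a rough illustration of the split-screen balancing problem (our own sketch, not NVIDIA's actual algorithm), the driver can nudge the split point toward whichever GPU finished its portion faster, so the two frame times converge over successive frames:

```python
def rebalance(split: float, time_top: float, time_bottom: float,
              step: float = 0.05) -> float:
    """Nudge the screen split toward the GPU that finished sooner.

    split is the fraction of the screen rendered by the top GPU;
    time_top/time_bottom are the last frame's render times in ms.
    """
    if time_top > time_bottom:      # top GPU overloaded: shrink its share
        split -= step
    elif time_bottom > time_top:    # bottom GPU overloaded: grow top's share
        split += step
    return min(max(split, 0.1), 0.9)  # keep the split within sane bounds

# Hypothetical scene whose top half is expensive (say, heavy sky shaders),
# so the top GPU's share of the screen shrinks frame by frame:
split = 0.5
for _ in range(4):
    split = rebalance(split, time_top=20.0, time_bottom=10.0)
print(round(split, 2))  # 0.3
```

This is why a naive 50/50 split is rarely optimal: equal screen area is not equal work, and the estimate has to be revised continuously as the scene changes.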

The Test

Our test configuration was as follows:

AMD Athlon 64 FX-55 (2.6GHz)

MSI K8N Neo4 Platinum/SLI

2 x 512MB Corsair DDR400

NVIDIA Graphics Cards:

NVIDIA GeForce 6600GT x 2
NVIDIA GeForce 6800GT x 2

NVIDIA 66.75 Drivers

Windows XP with DirectX 9.0c

Because of our limited time and the fact that we were thousands of miles away from our labs, we could only test the cards that MSI had on hand at the time, which were NVIDIA-only.


Doom 3 Performance

This benchmark requires no introduction - currently the most stressful game for today's GPUs, Doom 3 is the perfect place to look at the benefits of SLI.

Starting at 1024 x 768, we see that despite the low resolution, the GeForce 6800GT got a nice 46% performance boost from SLI. The performance improvement is much greater on the slower 6600GT, which is more GPU limited at 1024 x 768 than the 6800GT: its gain weighs in at just under 72%.

Here the performance of two 6600GTs is equivalent to that of a single GeForce 6800GT.

Doom 3

As the resolution goes up, so does the performance benefit from SLI. The 6800GT moved up to a 68% performance improvement, while the 6600GT inched up to a 78.2% gain. The 6800GTs in SLI configuration actually make 1280 x 1024 with 4X AA smooth as butter under Doom 3.

Doom 3

At 1600 x 1200 we see some huge performance gains from SLI: 75.3% and 85% for the 6800GT and the 6600GT respectively. It's clear that SLI makes even the highest resolutions with AA enabled quite playable.

Doom 3
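For reference, the gains quoted throughout this review are simple percent improvements of the SLI result over the single-card result. A one-line sketch of the formula, with made-up fps values purely to show the arithmetic:

```python
def sli_gain_percent(single_fps: float, sli_fps: float) -> float:
    """Percent improvement of the SLI result over a single card."""
    return (sli_fps / single_fps - 1) * 100

# Hypothetical card doing 40 fps alone and 70 fps in SLI:
print(round(sli_gain_percent(40, 70), 1))  # 75.0
```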

Counter-Strike: Source VST Performance

With Half Life 2 less than a month away, performance under the Source VST is very important.

At 1024 x 768 the performance is already pretty impressive, but with SLI it skyrockets to a completely new level, with the 6800GT showing a 23.8% increase in performance and the 6600GT going up by around 54%.

It's impressive to see that at 1024 x 768 with 4X AA the 6800GTs in SLI mode are still CPU limited by the fastest AMD chip on the planet.

Counter-Strike: Source Visual Stress Test

At 1280 x 1024 we see something quite unusual: the 6800GT gains much more from SLI than the 6600GT. The 6800GT receives a 63.5% performance boost from SLI while the 6600GT gets "only" a 45.7% improvement; given the beta nature of the drivers, we'll avoid hypothesizing about why.

Counter-Strike: Source Visual Stress Test

At 1600 x 1200 the performance gains peak just under 70%.

Counter-Strike: Source Visual Stress Test

Far Cry Performance

For our final set of tests we've got Far Cry 1.1.

At 1024 x 768 the performance gains remain under 50%, as the game is still CPU limited at such a "low" resolution. As expected, the GeForce 6600GT gets more of a performance boost from SLI as it is the slower card.

Far Cry 1.1

As the resolution increases, the performance benefit the 6800GT sees is significantly higher, reaching a healthy 70.7% at 1280 x 1024.

Far Cry 1.1

What is truly impressive is the 105.9% performance improvement at 1600 x 1200 for the GeForce 6800GT in SLI mode - and we thought Doom 3 saw some impressive performance gains. At 1600 x 1200 with 4X AA, Far Cry goes from mostly smooth with some stuttering to silky smooth performance thanks to SLI with the 6800GTs. The 6600GT doesn't fare as well, averaging under 40 fps.

Far Cry 1.1

Final Words

The performance advantages of SLI are nothing to be disappointed with: using two GPUs, NVIDIA is able to deliver next-generation graphics performance with today's cards. Keep in mind that our numbers were taken at relatively high resolutions with 4X AA enabled; without AA and at lower resolutions, the performance gains from SLI become much smaller as you are far more CPU bound.

The GeForce 6600GT is the prime candidate for the SLI poster child as it is the most affordable card with SLI support from NVIDIA. Unfortunately our tests here today are more geared towards the higher end cards as the 6600GT, even in SLI mode, is still generally outperformed by a single 6800GT. At lower resolutions or with AA disabled, the performance of two 6600GTs would definitely be more similar to that of a single 6800GT. But the important thing to keep in mind here isn't what you can do with two cheaper cards and SLI, but rather the upgrade potential SLI offers. Buying a $200 6600GT today and upgrading to another one several months down the road, at a potentially much lower price, is a great way of getting the performance you want today while at the same time having a cheap upgrade path for when tomorrow's games come out.

The GeForce 6800GT in SLI mode truly skyrocketed to a new level of performance, but a very costly one. With a pair of 6800GTs selling for about the price of most users' upgrade budgets, we once again see more potential in the upgrade value of SLI rather than the initial purchase value. However, if you can afford it, a pair of 6800GTs in SLI mode will definitely offer some serious performance in all of today's games. Interestingly enough, spending close to $1000 on graphics cards still won't let you play at 1600 x 1200 with 4X AA at over 100 fps in Doom 3; but if you're willing to settle, over 60 fps is a piece of cake.

Although motherboard and graphics support for SLI is definitely close to being ready, we are not so certain about the maturity of the drivers. NVIDIA's own tests were conducted under three applications: Doom 3, Halo and 3DMark05. Although our own tests added two more benchmarks, they didn't run without their fair share of display issues. The complexity of the SLI driver and ensuring game compatibility is undoubtedly a major factor in the release date of SLI. We are also hearing that chipset availability is a bit on the limited side for nForce4 SLI, with most manufacturers planning on shipping boards in early 2005. ASUS and MSI both seem to be on track for a release by the end of 2004, which will definitely give them the lead if NVIDIA can get finalized drivers out in time.

All is not quiet on the ATI front, though; rumor has it that they are also planning SLI-like solutions on both the chipset and GPU side. Given the flexibility of PCI Express to support multiple high-bandwidth slots for graphics, we would think that there's no reason (other than driver support) not to have SLI support within a product family. The introduction of SLI could lengthen GPU product cycles as performance can be guaranteed for much longer, but it could also raise expectations for upcoming GPUs like NV50 and R500. We would not be too surprised if supply issues for many of the popular SLI cards developed right before the launch of a new GPU to prevent a lackluster introduction.

In the end we're rather pleased with SLI as it promises to increase the life span of your graphics card investment, something that we've been dying to have for quite some time. We will be sure to do a full review on the final shipping SLI motherboards and GPUs when they are available, but until then we hope you've enjoyed our preview.

Very special thanks goes out to Vincent and Iris of MSI for putting themselves and their engineers through hell in order to make this review possible. You would not believe how difficult this little benchmarking opportunity was to put together :)
