Original Link: http://www.anandtech.com/show/1552
NVIDIA's GeForce 6 SLI: Demolishing Performance Barriers — by Anand Lal Shimpi on November 23, 2004 10:23 AM EST
For months we’ve been waiting to take advantage of NVIDIA’s SLI and it’s looking like the tier one motherboard manufacturers will be doing their best to bring the first nForce4 SLI motherboards to market before the end of this year. So is SLI all it’s cracked up to be?
With a final board and final drivers, it’s time to look at SLI from a final perspective to see if NVIDIA squandered the opportunity to regain technology and performance leadership or if SLI really lives up to its promise…
How SLI Works
NVIDIA’s Scalable Link Interface (SLI) is based on the simple principle of symmetric load distribution, meaning that the architecture depends on (and will only really work if) both GPUs receive the exact same load. The nature of NVIDIA’s SLI means that odd combinations, such as cards with different GPU feature sets (e.g. 16 pipes + 8 pipes), will not work. NVIDIA’s driver will run two cards with mismatched clocks at the lowest common clock speed, but there’s nothing you can do to get different GPUs to work in SLI mode; the driver simply won’t let you enable the option.
NVIDIA’s first task in ensuring that the load distributed to both GPUs would be balanced and symmetrical was to equip the nForce4 SLI chipset with identical-width PCI Express graphics slots. By default, PCI Express graphics cards use a x16 slot, which features 16 PCI Express lanes offering 8GB/s of total bandwidth. Instead of outfitting the chipset with 16 more PCI Express lanes, NVIDIA simply allows the lanes to be reconfigured as either a single x16 slot or two x8 slots, via a small card on the motherboard itself. The physical slots are both x16 slots, but electrically they can be configured as two x8 slots. This won’t cause any compatibility issues with x16 cards, which will simply use fewer lanes for data transfers, and the real-world performance impact in games is negligible, which is what NVIDIA is counting on.
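To put numbers on the lane arithmetic: a PCI Express 1.x lane runs at 2.5GT/s with 8b/10b encoding, which works out to 250MB/s per lane in each direction, so the totals quoted above fall out of a quick back-of-the-envelope script:

```python
# PCI Express 1.x: 2.5GT/s per lane with 8b/10b encoding
# = 250MB/s per lane, per direction.
MB_PER_LANE_PER_DIRECTION = 250

def total_bandwidth_gb(lanes):
    """Aggregate (both directions) bandwidth in GB/s for a slot."""
    return lanes * MB_PER_LANE_PER_DIRECTION * 2 / 1000

print(total_bandwidth_gb(16))  # a full x16 slot: 8.0 GB/s total
print(total_bandwidth_gb(8))   # each x8 slot in SLI mode: 4.0 GB/s total
```

Halving a slot to x8 halves its bandwidth to 4GB/s total, which is still more than games of this era actually push across the bus.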
The next trick is to make sure that the GPUs receive the exact same vertex data from the CPU, which is done by the CPU sending all vertex data to the primary GPU, which then forwards it on to the secondary GPU. Once data arrives at the primary GPU via the PCI Express bus, all GPU-to-GPU communication is handled via NVIDIA’s video bridge. The video bridge is a bus that connects directly to the GPU and is used for transferring data from the frame buffer of one GPU directly to the next. NVIDIA isn’t offering too much information on the interface, other than saying that it is capable of transferring data at up to 10GB/s. While it is possible to have this GPU-to-GPU communication go over the PCI Express bus, NVIDIA insists that it would be silly to do so because of latency issues and bandwidth constraints, and has no plans to move in that direction.
NVIDIA’s driver plays an important role in maintaining symmetry in the rendering by looking at the workload and making two key decisions: 1) determining rendering method, and depending on the rendering method, 2) determining the workload split between the two GPUs.
NVIDIA supports two main rendering methods: Alternate Frame Rendering (AFR) and Split Frame Rendering (SFR). As the names imply, AFR has each GPU render a separate frame (e.g. GPU 1 renders all odd frames and GPU 2 renders all even frames) while SFR splits up the rendering of a single frame amongst the two GPUs. NVIDIA’s driver does not determine whether to use AFR or SFR on the fly, instead NVIDIA’s software engineers have profiled the majority of the top 100 games and created profiles for each and every one, determining whether they should default to AFR or SFR mode in each game. NVIDIA’s driver defaults to AFR as long as there are no dependencies between frames; for example, in some games that use slow motion special effects the game itself doesn’t clear the frame buffer and will render the next frame on top of the previous frame, alpha blending the two frames together to get the slow motion effect – in this case there is a frame to frame dependency and AFR cannot be used.
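The distinction between the two modes can be sketched in a few lines. This is purely illustrative logic in our own (hypothetical) function names, not NVIDIA’s driver code:

```python
# Toy sketch of how the two SLI rendering modes divide work between
# GPU 0 and GPU 1; the real logic lives inside NVIDIA's driver.

def afr_assign(frame_number):
    """Alternate Frame Rendering: whole frames alternate between GPUs."""
    return frame_number % 2

def sfr_assign(scanline, split_line):
    """Split Frame Rendering: one frame is divided at a horizontal line;
    GPU 0 renders everything above the split, GPU 1 everything below."""
    return 0 if scanline < split_line else 1

# Frames 0, 2, ... go to GPU 0; frames 1, 3, ... go to GPU 1.
print([afr_assign(f) for f in range(4)])  # [0, 1, 0, 1]
```

AFR’s frame-to-frame dependency problem is visible even in this sketch: if frame N reads frame N-1’s buffer, that buffer lives on the other GPU.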
If AFR can’t be used, SFR is used instead, but now the driver must determine how much of each frame to send to GPU 1 vs. GPU 2. Since the driver can count on both GPUs being the exact same speed (see why it’s important?), it makes an educated guess at what the load split should be. The educated guess comes through the use of a history table that stores the load each GPU was placed under for the past several frames. Based on the outcomes stored in this history table, NVIDIA’s driver predicts what the rendering split between the two GPUs should be for future frames and adjusts the load factor accordingly. This should all sound very familiar to anyone who has ever heard of a branch predictor in a CPU, and just like a branch predictor, there is a penalty for predicting incorrectly. If NVIDIA’s driver predicts incorrectly, one GPU will finish its rendering task much sooner than the other, giving it nothing to do but wait until the other GPU is done, thus reducing the overall performance potential of the SLI setup.
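A minimal sketch of that history-based balancing, assuming a simple moving window of frame times (all class and method names here are our own invention, not NVIDIA's):

```python
# Hypothetical sketch of history-based SFR load balancing: track how
# long each GPU took on recent frames and move the split line so that
# both GPUs finish at the same time.
from collections import deque

class SplitPredictor:
    def __init__(self, height, history=8):
        self.height = height              # frame height in scanlines
        self.split = height // 2          # start with an even split
        self.times = deque(maxlen=history)

    def record(self, gpu0_ms, gpu1_ms):
        """Store how long each GPU took to render its portion."""
        self.times.append((gpu0_ms, gpu1_ms))

    def predict(self):
        """Predict the next frame's split line from recent history."""
        if not self.times:
            return self.split
        t0 = sum(t[0] for t in self.times)
        t1 = sum(t[1] for t in self.times)
        # Approximate each GPU's speed in scanlines per millisecond,
        # then pick the split so both finish simultaneously. A bad
        # prediction leaves one GPU idle waiting on the other.
        r0 = self.split / max(t0, 1e-9)
        r1 = (self.height - self.split) / max(t1, 1e-9)
        self.split = round(self.height * r0 / (r0 + r1))
        return self.split
```

For example, if GPU 0 suddenly takes twice as long on its half of a 1200-line frame, the predictor hands it only a third of the next frame, mirroring the branch-predictor analogy above.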
By now you can begin to see where the performance benefits of SLI come into play. With twice the GPU rendering power you effectively have a 32-pipe 6800GT with twice as much memory bandwidth if you pair two of the cards together, a configuration that you won’t see in a single card for quite some time. At the same time you should see that SLI does have a little bit of overhead associated with it, and at lower CPU-bound resolutions you can expect SLI to be slightly slower than a single card. Then again, you don’t buy an SLI setup to run at lower resolutions.
Once both GPUs have completed their rendering, whether in AFR or SFR mode, the secondary GPU sends its frame buffer to the primary GPU via NVIDIA’s video bridge. The important thing here is that the data is sent digitally, so there’s no loss in image quality as a result of SLI. The primary GPU recombines the data and outputs the final completed frame (or frames) through its outputs. Sounds simple enough, right?
Surprisingly enough, throughout all of our testing, we didn’t encounter any rendering issues in SLI mode. NVIDIA insists that they have tested quite a few of the top 100 games to ensure that there aren’t any issues with SLI mode and it does seem that they’ve done a good job with their driver. If the driver hasn’t been profiled with a game, it will default to single-GPU mode to avoid any rendering issues, but the user can always force SLI mode if they wish.
ASUS’ A8N-SLI Deluxe
ASUS and NVIDIA have been working very closely with each other on the nForce4 SLI project. NVIDIA took ASUS’ A8N-SLI Deluxe on tour with them, doing demonstrations to reviewers all over the world based on this one motherboard. Obviously the partnership has irritated a few of ASUS’ competitors, and thus it looks like Gigabyte and MSI are doing their best to get their competing boards out as soon as possible. But ASUS was the first to get us a final board and thus we have them in our review today.
The very first A8N-SLI Deluxe motherboard we received was horribly unstable and we spent the majority of our time just trying to get the thing to work. Our sample was one of 10 in the world and fortunately not a mass production sample. ASUS managed to get us another board in time for the publication of this review, and the updated board fixed all of our issues. We will be sure to do a full review on ASUS’ SLI motherboard featured here, but for now here’s some brief information about the board.
The A8N-SLI Deluxe is a very interesting solution from ASUS as it will be targeted at both the high end and mainstream Socket-939 markets. With a price point of around $180, ASUS is hoping that all types of users, from casual to hardcore gamers will flock to the A8N-SLI Deluxe to either take advantage of SLI immediately or have the security of a SLI upgrade path.
The board itself is as feature filled as you could possibly imagine. Featuring 3 x 32-bit PCI, 2 PCI Express x1 and 2 PCI Express x8 slots, the board is pretty balanced when it comes to add-in card expansion.
ASUS spread the two PCI Express x8 slots out a bit more than some manufacturers have planned to do, in order to improve cooling when running two cards in SLI mode. ASUS also supplies a bridge PCB appropriately sized to accommodate the distance between the two PCI Express connectors. The card that reconfigures the PCI Express lane arrangement from the chipset is wedged in between the two PCI Express x8 slots. The card can be a little difficult to get to at times, but with a bit of patience it’s not too big of a deal.
The actual nForce4 SLI chipset is placed between the two PCI Express x8 slots, but shifted down to be as far away from the heat producing GPUs as possible. The problem is that with 2-slot cards such as the GeForce 6800 Ultra there is not much clearance over the top of the chipset’s heatsink, which limited the size of the heatsink that ASUS could put on the motherboard. The end result is that while the heatsink and fan do the best job they can, the heatsink gets extremely hot. Just something to keep an eye out for.
The nForce4 SLI chipset on the A8N-SLI Deluxe
By using a separate Silicon Image SATA controller in conjunction with the nForce4 SLI’s built-in SATA controller, the A8N-SLI supports a maximum of 8 SATA drives. Impressively enough, ASUS provides 4-pin molex to SATA power adapters and SATA cables for all of the ports. ASUS went one step further and also bundles a card that allows you to plug a SATA drive (and power) into your motherboard externally, without ever opening your case. By running two SATA ports and one power connector to a slot cutout, you can plug in any SATA drive and use it externally. Remember that since the nForce4 SLI chipset supports the SATA II specification, you can use this external port with hot-pluggable SATA II drives.
In order to aid in power delivery to a power hungry SLI setup, ASUS implemented what they are calling their “EZ-PLUG” connector on the board. The EZ-PLUG is basically a 4-pin molex connector on the board itself that is designed to provide an additional 12V line to the graphics cards in SLI mode. Using the plug isn’t necessary (we tested both with and without it and in both cases it worked fine), but ASUS insists you use it in SLI mode to guarantee stability. If you don’t apply power to the EZ-PLUG and you are in SLI mode, a red LED lights up on the motherboard and a warning will appear at POST telling you that you forgot to supply power to the EZ-PLUG.
ASUS is expecting mass production of the A8N-SLI Deluxe to commence in the coming weeks; this is an extremely important motherboard for ASUS and they have extended their promise to us that it will be widely available before the holidays, most likely starting the first week of December.
We are pretty happy with what we’ve seen from ASUS with their A8N-SLI Deluxe, but we’ll save the full evaluation of the motherboard for our review of the board itself. With a working sample of the board in hand it looks like ASUS has worked out any issues we had with the first sample of the board, and it should make for a nice gift (for someone special or yourself of course) for the holiday gamer.
SLI – The Requirements
There’s been a lot of confusion as to what is required to run a SLI configuration, so we put together a quick list of the things you’ll need:
- Everything necessary to put together a working system, including SLI motherboard
- Two graphics cards with identical GPUs from the same manufacturer. Video BIOS revisions must also be identical. Note that if the cards run at different clock speeds, the driver will run both cards at the lower clock speed of the two. NVIDIA has announced their SLI certification program, which means that two SLI certified cards should have no problems working in tandem. Currently only NVIDIA cards will work in SLI mode although ATI plans on introducing SLI technology in 2005.
- A power supply capable of supplying adequate power to the system as well as both graphics cards. Note: you may need one or two 2 x 4-pin to 1 x 6-pin PCI Express power adapters if you are using two 6800GT or 6800 Ultra graphics cards with a power supply that has either no or only one 6-pin PCI Express power connector.
- A SLI video bridge connector. This connector should be provided with your nForce4 SLI motherboard.
- NVIDIA drivers with SLI support. Currently the 66.93s are the only NVIDIA sanctioned drivers with SLI support, however NVIDIA is working on rolling in SLI support to all of their drivers, including the newly released 67.02 driver.
It’s no big surprise that you can’t use different GPUs; in our tests we tried combining a 6800 Ultra with a 6600GT, but NVIDIA’s driver wouldn’t even let us enable SLI on the combination. When we tried to combine two different 6600GTs (non SLI certified) we could enable SLI through the driver, but there were tons of stability problems. Accessing the NVIDIA Control Panel would cause the system to lock up, presumably because the control panel had issues reading from two different video BIOSes. If we didn’t bother with the NVIDIA Control Panel and just tried to run a game, we were met with video corruption issues and lockups. Right now it seems like the only option for SLI is to have two identical cards; in theory they can be from different manufacturers as long as the video BIOSes and all of the hardware specifications are identical. In order to make upgrading easier, NVIDIA introduced their SLI certification program, which is designed to ensure compatibility between all identical-GPU cards going forward. Only time will tell whether or not this actually pans out to make upgrading to a SLI configuration easy.
One thing to make sure of is that you have sufficient power connectors coming off of your power supply. If you are using two 6600GTs then it’s not a big deal, since the cards themselves don’t require any external power. However, with two 6800GTs, each card is outfitted with a 6-pin PCI Express power connector, which must be used for proper/stable SLI operation. Since most power supplies include only one (or no) PCI Express power connectors, chances are that you’ll have to use a 4-pin molex to 6-pin PCI Express power adapter, which takes two regular 4-pin power connectors and combines them into a single 6-pin PCI Express connector. You should, in theory, use two separate power cables with the adapter (in order to avoid pulling too much current off of a single cable and violating the ATX spec), but in practice we had no issues with using two connectors off of a single cable to power one of the graphics cards. If you have no PCI Express power connectors on your power supply, then you’d need four separate power connectors just to power your graphics cards; add another one for ASUS’ EZ-PLUG and then you can start thinking about powering up things like your hard drive and DVD drive. While purchasing a SLI motherboard will pave a nice upgrade path for you in the future, you may need to enable that future by upgrading your power supply as well.
We’ve already described the SLI setup process in our Preview of NVIDIA SLI Performance, but we will revisit it here today using the ASUS A8N-SLI Deluxe board as there are some differences.
The first step in enabling SLI is to reconfigure the PCI Express x16 lanes from the nForce4 SLI chipset into two x8 lanes; this is done by inserting the SLI card in the appropriate direction:
Next, you plug in both PCI Express graphics cards. They must be the same GPU type, but you can use cards from different manufacturers if you would like (although it is recommended to have the same BIOS revisions, etc…).
Third, connect the two PCI Express graphics cards using the ASUS supplied bridge PCB.
Fourth, connect the appropriate power connectors to both PCI Express graphics cards.
Fifth, connect power to ASUS’ on-board 4-pin power connector.
Finally, connect your monitor to either one of the outputs on the first PCI Express card and power up your system.
Once in Windows, using the 66.93 drivers, you simply enable SLI mode from NVIDIA’s control panel and reboot your system to enable SLI. Note that only your primary graphics card’s display outputs will be active in SLI mode.
Clicking the check box requires a restart to enable (or disable) SLI, but after you've rebooted everything is good to go.
SLI Power Consumption
We mentioned before that a strict requirement of SLI is that you have a powerful power supply; in order to further stress our point we performed a few system power tests where we measured peak power consumption of the power supply for our test bed at both idle and load states.
As you can see, at idle and load, two 6600GTs in SLI mode consume about as much power as a single 6800 Ultra. But in SLI mode, two 6800 Ultras require 35% more power than a system with just a single 6800 Ultra.
Due to our initial sample having so many problems and our replacement board arriving one day before this review went live, we weren’t able to run any benchmarks with ATI’s X800 XT on ASUS’ nForce4 board for comparison. Feel free to look at our other reviews to get an idea for how ATI stacks up to a single 6800 Ultra in order to estimate where ATI would fit in today’s performance tests.
Our system was configured as follows:
AMD Athlon 64 4000+
ASUS A8N-SLI Deluxe (nForce4 SLI) Motherboard
1GB OCZ DDR400 3-3-3-10 (our first board required the use of very lax timings)
2 x NVIDIA GeForce 6800 Ultra PCI Express cards (clocked at 400/1100)
2 x NVIDIA GeForce 6800GT PCI Express cards (clocked at 350/1000)
2 x NVIDIA GeForce 6600GT PCI Express cards (clocked at 500/500)
Windows XP Service Pack 2 with DirectX 9.0c
NVIDIA ForceWare 66.93 Drivers
Half Life 2 Performance
Given our recent focus on Half Life 2 performance, it should be no surprise that we start off our performance coverage with Valve’s latest title. The benchmarks we used for this review were created in house and are all documented (as well as available for download) in Parts 1 and 2 of our Half Life 2 GPU Roundups. One word of caution, however: the benchmarks in those reviews used NVIDIA’s 67.02 drivers, but those drivers in particular do not yet have SLI support, so we were forced to use the older 66.93 drivers for this review – which means our NVIDIA Half Life 2 numbers are not directly comparable between these reviews.
Half Life 2: AT_canals_08
In this first demo we notice a couple of things; for starters, there is a small but consistent performance drop when enabling SLI. The reason for this performance drop is that at lower resolutions we are still CPU limited with the higher end 6800 based GPUs, so the additional overhead of splitting up the rendering and forwarding data from one GPU to the next for recombination ends up making SLI slightly slower than just a single GPU. That being said, no one would realistically have $800+ worth of video cards and run at 1024 x 768.
Looking at 1280 x 1024 there begins to be more of a performance benefit to SLI, but the benefits are not really significant until we hit 1600 x 1200. At 1600 x 1200 the 6800GT and Ultra are still somewhat CPU limited, but the 6600GT is far from it. Moving to two 6600GTs increases performance by 67% and delivers higher performance than a single 6800 Ultra, 9.5% faster to be exact. This scenario alone showcases the upgrade potential for SLI; purchasing a single 6600GT today allows you to run Half Life 2 at 1024 x 768 quite well, but adding a second card later on will let you run at higher resolutions (or more GPU intensive games). Assuming you can add that second card later on for less than you purchased the first one, your upgrade path actually puts you in a better situation overall than had you just purchased a single, more expensive card at the start. Keep an eye on this comparison between two 6600GTs and a single 6800GT/Ultra to evaluate the mainstream upgrade path benefits of SLI.
With 4X AA and 8X AF enabled, the situation changes dramatically. Although the performance gains are impressive, the pair of 6600GTs isn’t able to outperform a single 6800GT/Ultra. For 6800GT/Ultra owners, SLI enables smooth playability at high resolutions with AA and AF enabled. We play tested much of Half Life 2 with two 6800GTs at high resolutions with AA/AF enabled and for the most part the game was butter smooth; at 1600 x 1200 with 4X AA/8X AF enabled however there were some areas where even two 6800 Ultras would get choppy. But most resolutions and settings that were not smooth before on a single card were definitely playable thanks to SLI.
Half Life 2: AT_coast_05
In our at_coast_05 benchmark we see a fairly similar situation, with all of the GPUs being CPU bound at resolutions below 1600 x 1200. At 1600 x 1200 the 6600GT begins to drop off, but toss on a second card and it delivers performance slightly faster than that of a single 6800 Ultra.
With AA/AF enabled, the 6600GT actually does a lot better in our second demo; using SLI it manages to offer performance equal to that of a single 6800GT. The 6800GT and Ultra don’t need SLI too badly in this demo, even with AA/AF enabled; only at 1600 x 1200 with AA/AF enabled does the 6800GT see a real performance gain from SLI.
Half Life 2: AT_coast_12
The at_coast_12 benchmark begins to show GPU limitations earlier on in the resolution scale than the previous two tests; at 1280 x 1024 we start seeing some performance benefits to SLI, but the real win once again comes at 1600 x 1200. At 1600 x 1200 the 6600GT, once paired up in SLI mode, enjoys a 44% increase in performance which allows it to outperform a single 6800 Ultra by 10%. SLI mode also allows the 6800GT and 6800 Ultra to both achieve 100+ fps frame rates in this benchmark.
With AA/AF enabled all of the cards do extremely well; the 6600GT SLI configuration ends up being faster than a single 6800GT/Ultra, and the latter two cards manage to remain mostly CPU limited at resolutions up to and including 1600 x 1200.
Half Life 2: AT_prison_05
Our fourth Half Life 2 benchmark shows results similar to what we saw in the last benchmark, with the margins of improvement increasing in this benchmark.
Half Life 2: AT_c17_12
Our final Half Life 2 benchmark ends up being mostly CPU bound and thus SLI has little to offer, except at higher resolutions with AA/AF enabled.
Half Life 2 Performance Summary
In order to best characterize the performance improvement from SLI in Half Life 2 we averaged the performance gains across our five benchmarks at each resolution:
[Table: Half Life 2 Average Performance Gain due to SLI – GeForce 6800 Ultra, 6800GT and 6600GT at 1024 x 768, 1280 x 1024 and 1600 x 1200]
At 1024 x 768 you see there’s basically no performance improvement with SLI enabled, and in some cases it’s actually slightly slower. Even at 1280 x 1024, only the 6600GT gets a decent performance improvement, but at 1600 x 1200 we see performance improvements across the board. The 6600GT has the most to gain at 1600 x 1200, moving up by about 40%, and offering performance greater than both a single 6800GT and 6800 Ultra.
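The averaging behind these summary figures is simple arithmetic; a minimal sketch (the frame rates below are invented placeholders, not our measured data):

```python
# Sketch of how an "average gain due to SLI" figure is computed: take
# the percent gain in each benchmark demo, then average across demos.
# All frame rates here are made-up placeholders.

def sli_gain(single_fps, sli_fps):
    """Percent change going from one card to an SLI pair."""
    return (sli_fps / single_fps - 1) * 100

def average_gain(results):
    """results: (single_fps, sli_fps) pairs, one per benchmark demo."""
    return sum(sli_gain(s, d) for s, d in results) / len(results)

# Five hypothetical 1600 x 1200 results for a mid-range card:
demos = [(40, 58), (45, 63), (50, 72), (38, 55), (60, 82)]
print(f"{average_gain(demos):.1f}%")
```

Averaging per-benchmark percentages (rather than averaging raw frame rates) keeps a single high-frame-rate demo from dominating the summary.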
With AA enabled, all of the cards make decent gains from enabling SLI. The 6600GT reaches its performance limit at 41%, but the 6800 Ultra manages a 48% gain at 1600 x 1200 and the 6800GT does even better with an average improvement of 54%.
[Table: Half Life 2 Average Performance Gain due to SLI with 4X AA & 8X AF – GeForce 6800 Ultra, 6800GT and 6600GT at 1024 x 768, 1280 x 1024 and 1600 x 1200]
Overall SLI will make a pair of 6600GTs perform like an overclocked 6800 Ultra or it will let 6800GT owners run at even higher resolutions/AA modes under Half Life 2.
Doom 3 Performance
As you can expect, SLI offers no benefits at 1024 x 768, but as early as 1280 x 1024 we start to see some reasonable performance gains. The 6800 Ultra gets a 22% increase in performance, while the 6800GT gets a slightly bigger bump of 26% thanks to SLI. The big winner here is the GeForce 6600GT whose frame rate jumps 43% from 63.6 up to 91.1 thanks to SLI. Here we begin to see some of the upgrade potential of SLI, with two 6600GTs offering slightly greater performance than a single GeForce 6800 Ultra at 1280 x 1024.
At 1600 x 1200 the 6800 Ultra sees a 39% performance increase in SLI mode, breaking the 100 fps barrier in Doom 3’s built in demo. The 6800GT gets even more of a performance boost at 53%, bringing it to within striking distance of a SLI 6800 Ultra setup. The 6600GT also becomes much more playable at 1600 x 1200 with SLI enabled.
Enabling Antialiasing simply increases the benefits of SLI. Now at 1024 x 768 there is a performance advantage to having two GPUs, and for the 6800 Ultra that’s a 34% increase in performance. Once again, the margins of improvement get better as you move to slower GPUs – 36% for the 6800GT and 66% for the 6600GT. At 1024 x 768 with 4X AA the two 6600GTs manage to offer performance that’s just slightly faster than a single 6800GT.
Going up in resolution we continue to see some impressive gains, but what matters here isn’t that SLI results in a 63% performance increase for the 6800 Ultra and 72% for a 6800GT; what matters is that SLI makes 1280 x 1024 with 4X AA and 8X AF very smooth, something that was not possible with only a single card. Despite the performance improvement, two 6600GTs are not able to pull ahead of even a single 6800 Ultra in this test, which shows you some of the limits of SLI. While the 6600GT in SLI mode does much better than a single 6800 Ultra at “lower” resolutions like 1280 x 1024 with AA disabled, turning on antialiasing still exposes the bandwidth and fillrate limitations of an 8-pipe 6600GT with only 16GB/s of memory bandwidth.
What’s important to note here is that the recommendation varies greatly based on resolution. While the 6800GT does incredibly well paired up with another card, the 6600GT only offers better performance than a single 6800 Ultra at non-AA resolutions. As soon as you enable AA, even a pair of 6600GTs isn’t faster than a single 6800 Ultra (or GT).
Far Cry Performance
Once again, it’s no surprise that at 1024 x 768 there are no real performance gains from SLI – having two of the fastest GPUs around means that even an Athlon 64 4000+ isn’t fast enough to extract the full performance of two GPUs at 1024 x 768. The difference between Far Cry and a game like Doom 3 is that even at 1280 x 1024, SLI doesn’t help out much either. For the 6800 Ultra the performance gain due to SLI is barely 6%, and only 10% on the 6800GT; while we would normally be happy with these types of performance gains, remember that they come at the expense of purchasing two cards. The 6600GT gains a reasonable 19% boost in SLI mode, but that’s not enough to make it faster than a single 6800GT or Ultra.
We start to see some performance gains at 1600 x 1200, 23% for the 6800 Ultra, 31% for the 6800GT and 43% for the 6600GT. Once again, only the 6600GT really gets an appreciable performance gain here, but it’s still not enough to outperform a single 6800GT – although it comes extremely close.
The real performance gains for the 6800 Ultra and 6800GT happen at 1600 x 1200 with 4X AA and 8X AF enabled, improving performance by close to 60% for the 6800 Ultra and by almost 70% for the 6800GT. Once again, SLI makes higher resolution AA/AF modes playable where they weren’t before. For the 6600GT, SLI brings the performance levels up to that of a single 6800GT, but no faster.
Unreal Tournament 2004 Performance
In UT2004 we see that even up to 1600 x 1200 we’re basically CPU bound with the higher end GPUs. The 6600GT is the only GPU that shows a performance improvement due to SLI; at 1600 x 1200 the gain is just under 16%.
What’s interesting to note is that SLI is slightly slower at CPU bound resolutions than just a single card configuration; we ran the single card configurations in PCI Express x8 mode (just like the SLI cards run), so the performance difference, although negligible, is completely due to SLI driver and GPU bridge overhead.
It isn’t until we hit 1600 x 1200 with 4X AA and 8X AF that we see a real performance gain from SLI.
Wolfenstein: Enemy Territory Performance
We included Wolfenstein: ET in our test suite for SLI for one reason and one reason alone: to look at an older game and show SLI’s impact on a title that ran very well even on a 6600GT.
You can see by the resolution scaling graph that we are completely CPU bound at all resolutions here, much like what we saw from UT2004. Only the 6600GT actually benefits from SLI, bringing it up to 6800GT speeds thanks to a 31% performance improvement.
With the 6800GT and Ultra, even at 1600 x 1200 with AA/AF enabled the performance gains from SLI are nothing major. The 6600GT continues to provide impressive performance gains, bringing it up to the level of a 6800GT thanks to SLI.
The important thing to take away from these numbers is that on today’s high end cards, SLI is very much a technology that is suited for the latest games as well as tomorrow’s titles. But for midrange cards SLI can definitely enable higher resolution and/or AA gaming.
Battlefield: Vietnam Performance
For our final benchmark we only looked at performance at 1600 x 1200 with 4X AA and 8X AF enabled.
The performance improvement due to SLI is pretty strong as you would expect at such a high resolution. The 6600GT moves up 57%, while the 6800GT and Ultra gain an impressive 71% from SLI mode.
The 6600GT in SLI mode comes close to the performance of a single 6800GT, but not equal unfortunately. Both the 6800GT and the 6800 Ultra reach very playable frame rates thanks to SLI.
From a performance standpoint, SLI is just about as good as it gets. If you have the budget for it, a pair of GeForce 6800GTs will let you run at 1600 x 1200 with 2X or 4X AA enabled in the latest games while still maintaining a very smooth gaming experience – something that no single card is able to do.
The GeForce 6600GT seemed to scale reasonably well, with a pair of 6600GTs outperforming a single 6800 Ultra in Doom 3 and Half Life 2. It doesn’t make too much sense to buy a pair of 6600GTs today however, as you’d be much better off getting a single 6800GT and upgrading to a second one down the road, which brings us to our next point, the upgrade value of SLI.
If NVIDIA can make their SLI certification program successful enough, and if motherboard manufacturers can get SLI boards cheap enough, then the upgrade value of SLI is significant. We’ve already seen that going from a single $200 GeForce 6600GT to a pair of them offers performance greater than that of a single $400 GeForce 6800GT. Take into account that the price of these cards goes down over time and you’re looking at a pretty decent upgrade path for the future, requiring minimal investment today.
The upgrade path for 6800GT owners is even more enticing; if you’ve only got $400 to spend on a card today you can’t beat the 6800GT as a single card solution. Then, as the price of the 6800GT drops, it may become more attractive for you to upgrade to a second card rather than buying a next generation GPU. As long as we’re between DirectX cycles, SLI enables you to have the fastest most robust graphics setup out there without missing out on much.
While companies like ASUS, Gigabyte and MSI are working to get their boards out before the end of the year, it looks like the majority of manufacturers won’t have product on the streets until the first few months of 2005. We’d anticipate that by the middle of 2005, you’ll be able to purchase SLI motherboards for near mainstream Socket-939 prices, which should definitely drive higher adoption and lower prices on SLI products (not to mention wider availability of NVIDIA certified SLI products).
The power requirements, as well as the lack of NVIDIA certified SLI products on the market today, do trouble us a bit, but we’ll have to keep an eye out over the coming weeks to see how things change to better accommodate the introduction of SLI. At the same time, should SLI catch on, it remains to be seen how NVIDIA and their partners will change pricing/availability strategies for older cards.
ATI will also have their own SLI chipsets and graphics cards in 2005, which should lend further credibility to SLI as a viable upgrade option.
The main thing to keep in mind is that SLI is an option for those who want it; and whenever we have an option that offers a 40 – 70% performance increase where it counts, we welcome it with open arms (and wallets).