What resolutions do you need to be running?

If you're willing to spend over $1,500 on graphics cards alone, we're going to assume that you've already got a 30" display at your disposal. But on the off chance that you don't, will you see any benefit from having this much GPU power? We took a closer look at three of our benchmarks to find out.

Bioshock, the best 3-way SLI scaler we've seen today, paints a very clear picture. The 3-way 8800 Ultra setup is CPU bound until we hit 2560 x 1600, while the normal 2 card setup doesn't even come close to being CPU limited, even at 1680 x 1050.

What this tells us is that as long as the game is stressful enough, you'll see a benefit to a 3-way SLI setup even at low resolutions, just not as much as you would at higher resolutions. Pretty simple, right?
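The bottleneck logic above is easy to model. Here's a toy sketch (all frame-time numbers are made up for illustration, not measured): each frame costs some CPU time plus some GPU time that, ideally, splits evenly across the cards, and the slower of the two stages caps the frame rate.

```python
# Toy bottleneck model: each frame costs fixed CPU time plus GPU time that
# is (ideally) divided across the cards. The slower stage caps frame rate.
# All millisecond figures below are illustrative, not measured.

def frame_rate(cpu_ms, gpu_ms, n_gpus):
    """Frames per second under an idealized max(CPU, GPU/n) model."""
    return 1000.0 / max(cpu_ms, gpu_ms / n_gpus)

# A "stressful" game: lots of GPU work per frame relative to CPU work.
stressful = (frame_rate(8.0, 30.0, 2), frame_rate(8.0, 30.0, 3))
# A lighter game: the CPU becomes the ceiling as soon as GPU time shrinks.
light = (frame_rate(10.0, 18.0, 2), frame_rate(10.0, 18.0, 3))

print(stressful)  # the third GPU still helps: the GPU stage is the bottleneck
print(light)      # the third GPU adds nothing: the CPU caps both configs
```

Under this model a stressful enough game keeps the GPU stage as the bottleneck even at low resolutions, which is exactly why BioShock still gains from the third card while lighter workloads hit the CPU wall.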

Unreal Tournament 3 shows absolutely no benefit to adding a third card, and even shows a slight performance decrease at 1680 x 1050. It isn't until 2560 x 1600 that we see any performance difference at all between the two and three card setups.

With Crysis we didn't adjust resolution; instead we varied the image quality settings: medium, high, and very high. Just as with varying resolution, raising the image quality settings increases the impact of 3-way SLI. Unfortunately, where 3-way makes its biggest impact (very high quality), we're at an unplayable setting for much of the game.

What sort of a CPU do you need for this thing?

We've already established that at higher resolutions 3-way SLI can truly shine, but how ridiculous of a CPU do you need to run at those high detail settings?

The theory is that the better a game scales from two to three GPUs, the more GPU bound and less CPU bound it is. Conversely, the worse a game scales, the greater the chance that it's CPU bound (although there are many other possible reasons for poor scaling from two to three GPUs).

Clock speed   Bioshock   Oblivion   Crysis
3.33GHz       103.8      49.0       43.2
2.66GHz       101.7      48.3       37.3
2.00GHz       90.9       47.3       30.9

In Bioshock, the difference in performance between 2.66GHz and 3.33GHz is negligible, but once we drop the clock speed to 2.0GHz performance starts to fall off. What this tells us is that at mid-2GHz clock speeds, even a 3-way 8800 Ultra setup is GPU bound in Bioshock. And even at 2.0GHz, the 3-way setup is far from fully CPU bound, as performance is still better than that of the two card system with a 3.33GHz CPU.

Similarly, Oblivion isn't CPU bound at all. Even at 2.0GHz, we don't see a significant drop in performance.

Crysis does actually benefit from faster CPUs at our 1920 x 1200 high quality settings. Surprisingly enough, there's even a difference between our 3.33GHz and 2.66GHz setups. We suspect that the difference would disappear at higher resolutions/quality settings, but so would the ability to maintain a smooth frame rate. It looks like the hardware needed to run Crysis smoothly under all conditions has yet to be released.
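The CPU-scaling argument above can be made mechanical: compare each game's score at the lowest clock against its score at the highest. A quick sketch using the numbers from the table (treating the scores as average frame rates, which is our assumption about the units):

```python
# Scores from the clock-scaling table above: {game: {GHz: score}}.
scores = {
    "Bioshock": {3.33: 103.8, 2.66: 101.7, 2.00: 90.9},
    "Oblivion": {3.33: 49.0,  2.66: 48.3,  2.00: 47.3},
    "Crysis":   {3.33: 43.2,  2.66: 37.3,  2.00: 30.9},
}

def cpu_sensitivity(game):
    """Fraction of performance lost going from 3.33GHz to 2.00GHz."""
    s = scores[game]
    return 1.0 - s[2.00] / s[3.33]

for game in scores:
    print(f"{game}: {cpu_sensitivity(game):.1%} lost at 2.00GHz")
# Oblivion barely moves (~3.5%), Bioshock loses ~12%, and Crysis sheds
# roughly 28% -- matching the conclusion that only Crysis meaningfully
# rewards a faster CPU at these settings.
```

The larger the fraction, the more CPU bound the game is at these settings; by this measure Crysis is the only title in the table that justifies a faster chip.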

We feel kind of silly even entertaining this question, but no: if you want to build a system with three 8800 Ultras, you don't need to spend $1000 on a CPU. You can get by with a 2.66GHz chip just fine.


48 Comments


  • chizow - Tuesday, December 18, 2007

    Derek Wilson in 8800GT Review:

    For this test, we are using a high end CPU configured with 4GB of DDR2 in an NVIDIA 680i motherboard. While we are unable to make full use of the 4GB of RAM due to the fact that we're running 32-bit Vista, we will be switching to 64-bit within the next few months for graphics. Before we do so we'll have a final article on how performance stacks up between the 32-bit and 64-bit versions of Vista, as well as a final look at Windows XP performance.


    Completely valid point about using 32-bit vs. 64-bit, and somewhat of a hot topic over in the video forums. Honestly, you have $5000+ worth of hardware in front of you, yet getting a 64-bit version of Vista running benchmarks at the resolutions/settings where 64-bit and 2GB+ would help the most is too difficult? C'mon guys, seriously, this is the 2nd sub-par review in a row (the 512 GTS review was poor too).

    Also, could you clarify the bit about 680i boards being able to accomplish the same thing? Exactly what spurred this change in Tri-SLI support? Driver support? Seems Anand used 169.08 but I thought the 169.25 was the first to officially support Tri-SLI from the patch notes. Or has it always been supported and the 780i just hyping up a selling point that has been around for months? Also, the 780i article hinted there would be OC'ing tests with the chipset and I don't see any here. Going to come in a different article? Thanks.
  • blppt - Tuesday, December 18, 2007

    Yeah, seriously. Especially since the 64-bit Crysis executable does away with the texture streaming engine entirely... how can you make a serious "super high end ultimate system" benchmark without utilizing the most optimized, publicly available version of the game? Is it that the 64-bit Vista drivers don't support 3-way SLI yet?

    Otherwise, putting together a monster rig with three $500 video cards and then testing it with 32-bit Vista seems rather silly...
  • Ryan Smith - Tuesday, December 18, 2007

    Address space consumption isn't 1:1 with video memory, it's only correlated, and even less so in SLI configurations where some data is replicated between the cards. I'm not sure what exact value Anand had, but I'm confident Anand had more than 2GB of free address space.
  • JarredWalton - Tuesday, December 18, 2007

    Testing at high resolutions with ultra-insane graphics settings serves one purpose: it makes hardware like Quad-SLI and Tri-SLI appear to be much better than it really is. NVIDIA recommended 8xAA for quad-SLI back in the day just to make sure the difference was large. It did make QSLI look a lot better, but when you stopped to examine the sometimes sub-20 FPS results it was far less compelling.

    Run 8xAA on a 30" LCD at native resolution and it's more than just a little difficult to see the image quality difference over 4xAA, sometimes at half the frame rate. A far better solution than maxing out every setting possible is to increase quality where it's useful. Even 4xAA is debatable at 2560x1600 - certainly not required - and it's the first thing I turn off when my system is too slow for a game. Before bothering with 8xAA, try transparency supersampling AA; it usually addresses the same issue with much less impact on performance.

    At the end of the day, it comes down to performance. If you can't enable 8xAA while keeping frame rates above ~40 FPS (and minimums above 30 FPS), I wouldn't touch it. I play many games with 0xAA and rarely notice aliasing on a 30" LCD. Individual pixels are smaller than on 24", 20", 19", etc. LCDs, so aliasing doesn't matter as much, and the high resolution compensates in other areas. Crysis at 2560x1600 with Very High settings? The game is already a slide show, so why bother?
  • 0roo0roo - Tuesday, December 18, 2007

    faster is faster; the best is expensive and sometimes frivolous. at that price point you aren't thinking like a budget buyer anymore. like exotic cars, you can't be that rational about it. it's simply power... power NOW.
  • crimson117 - Tuesday, December 18, 2007

    If it's true that it's all about power, then just find the most expensive cards you can buy, install them, and don't bother playing anything. Also, tip your salesman a few hundred bucks to make the purchase that much more expensive.
  • 0roo0roo - Tuesday, December 18, 2007

    look, it's not like you don't get any advantage from it. it's not across the board at this point, but it's still a nice boost for any 30" gamer.

    seriously, there are handbags that cost more than this sli stuff.
  • JarredWalton - Tuesday, December 18, 2007

    Next up from AnandTech: Overclocked Handbags!

    Stay tuned - we're still working on the details...
