Power Consumption

This was probably the most fun we had in testing for this review: measuring the power consumption of the 3-way SLI setup. We plugged the entire system into our Extech power meter and bet on how much power it would use. Let's just say that NVIDIA isn't too far off with its minimum PSU requirements; see for yourself.

At idle, our 3-way SLI testbed drew just shy of 400W of power. To put that into perspective, that's more than any of our normal CPU or GPU testbeds draw under full load... and this system is just sitting at the Windows desktop. The third 8800 Ultra manages to pull close to another 90W without trying. Now let's see how much power this thing needs when playing a game.

We started with Crysis, which is normally our most stressful game test. Normally doesn't apply here, though: Crysis didn't scale well from two to three GPUs, with the third graphics card improving performance by only a few percentage points at our most playable settings, so power consumption won't hit its peak here.

Averaging 660W, the 3-way SLI system is now drawing more than twice the power of a normal gaming system outfitted with an 8800 GT. But it gets better.

Bioshock gave us the best scaling we saw out of the lot, so the third GPU is working its hardest in this test, meaning we should be able to get our Extech to produce some stunningly high numbers.

And stunningly high numbers we got. The 3-way SLI system averaged 730W in Bioshock and, get this, we even saw the machine pull over 800W from the wall outlet.

Configuration      Idle Power   Bioshock Load   Crysis Load
8800 Ultra x1      217W         329W            337W
8800 Ultra x2      300W         475W            520W
8800 Ultra x3      388W         730W            660W
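
If you want to tease apart how much each additional card contributes, the table makes the arithmetic easy. The quick Python sketch below is purely illustrative (not something we ran as part of testing): it computes the wall-power increase for the second and third cards from the figures above, and the 80% PSU efficiency used to estimate the DC-side load at our 800W peak is an assumed round number, not a measured value.

```python
# Illustrative arithmetic on the wall-power figures from the table above.
# The 0.80 PSU efficiency is an assumed round number, not a measured value.

wall_watts = {
    # number of 8800 Ultras: (idle, Bioshock load, Crysis load)
    1: (217, 329, 337),
    2: (300, 475, 520),
    3: (388, 730, 660),
}

PSU_EFFICIENCY = 0.80  # assumption; real efficiency varies with load

for cards in (2, 3):
    idle_delta = wall_watts[cards][0] - wall_watts[cards - 1][0]
    bioshock_delta = wall_watts[cards][1] - wall_watts[cards - 1][1]
    print(f"Card #{cards}: +{idle_delta}W idle, +{bioshock_delta}W in Bioshock (at the wall)")

# Our peak reading was over 800W at the wall; at ~80% PSU efficiency that
# corresponds to roughly this much power actually delivered to the components:
print(f"Estimated DC load at an 800W wall reading: ~{800 * PSU_EFFICIENCY:.0f}W")
```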

I was talking to Matthew Witheiler, our first dedicated graphics editor here at AnandTech, and told him how much power the system used under load and at idle. His response? "JESUS." "No," I said, "not even Jesus needs this much power."

48 Comments

  • chizow - Tuesday, December 18, 2007 - link

    Derek Wilson, in the 8800 GT review:

    For this test, we are using a high end CPU configured with 4GB of DDR2 in an NVIDIA 680i motherboard. While we are unable to make full use of the 4GB of RAM due to the fact that we're running 32-bit Vista, we will be switching to 64-bit within the next few months for graphics. Before we do so we'll have a final article on how performance stacks up between the 32-bit and 64-bit versions of Vista, as well as a final look at Windows XP performance.


    Completely valid point about using 32-bit vs. 64-bit, and somewhat of a hot topic over in the video forums. Honestly, you have $5000+ worth of hardware in front of you, yet getting a 64-bit version of Vista running benchmarks at resolutions/settings where 64-bit and 2GB+ would help the most is too difficult? C'mon guys, seriously, this is the second sub-par review in a row (the 512 GTS review was poor too).

    Also, could you clarify the bit about 680i boards being able to accomplish the same thing? Exactly what spurred this change in Tri-SLI support? Driver support? Seems Anand used 169.08, but from the patch notes I thought 169.25 was the first driver to officially support Tri-SLI. Or has it always been supported, with the 780i just hyping up a selling point that has been around for months? Also, the 780i article hinted there would be OC'ing tests with the chipset and I don't see any here. Going to come in a different article? Thanks.
  • blppt - Tuesday, December 18, 2007 - link

    Yeah, seriously. Especially since the 64-bit Crysis executable does away with the texture streaming engine entirely... how can you make a serious "super high end ultimate system" benchmark without utilizing the most optimized, publicly available version of the game? Is it that the 64-bit Vista drivers don't support 3-way SLI yet?

    Otherwise, putting together a monster rig with three $500 video cards and then testing it with 32-bit Vista seems rather silly....
  • Ryan Smith - Tuesday, December 18, 2007 - link

    Address space consumption isn't 1:1 with video memory; it's only correlated, and even less so in SLI configurations where some data is replicated between the cards. I'm not sure of the exact value, but I'm confident Anand had more than 2GB of free address space.
  • JarredWalton - Tuesday, December 18, 2007 - link

    Testing at high resolutions with ultra-insane graphics settings serves one purpose: it makes hardware like Quad-SLI and Tri-SLI appear to be much better than it really is. NVIDIA recommended 8xAA for quad-SLI back in the day just to make sure the difference was large. It did make QSLI look a lot better, but when you stopped to examine the sometimes sub-20 FPS results it was far less compelling.

    Compare 8xAA to 4xAA on a 30" LCD at native resolution and it's more than just a little difficult to see the image quality difference, yet 8xAA can run at half the frame rate of 4xAA. A far better solution than maxing out every setting possible is to increase quality where it's useful. 4xAA is even debatable at 2560x1600 - certainly not required - and it's the first thing I turn off when my system is too slow for a game. Before bothering with 8xAA, try transparency supersampling AA. It usually addresses the same issue with much less impact on performance.

    At the end of the day, it comes down to performance. If you can't enable 8xAA while keeping frame rates above ~40 FPS (and minimums above 30 FPS), I wouldn't touch it. I play many games with 0xAA and rarely notice aliasing on a 30" LCD. Individual pixels are smaller than on 24", 20", 19", etc. LCDs so it doesn't matter as much, and the high resolution compensates in other areas. Crysis at 2560x1600 with Very High settings? The game is already a slide show, so why bother?
  • 0roo0roo - Tuesday, December 18, 2007 - link

    faster is faster, the best is expensive and sometimes frivolous. at that price point you aren't thinking like a budget buyer anymore. like exotic cars, you can't be that rational about it. it's simply power... power NOW.
  • crimson117 - Tuesday, December 18, 2007 - link

    If it's true that it's all about power, then just find the most expensive cards you can buy, install them, and don't bother playing anything. Also, tip your salesman a few hundred bucks to make the purchase that much more expensive.
  • 0roo0roo - Tuesday, December 18, 2007 - link

    look, it's not like you don't get any advantage from it. it's not across the board at this point, but it's still a nice boost for any 30" gamer.

    seriously, there are handbags that cost more than this SLI stuff.
  • JarredWalton - Tuesday, December 18, 2007 - link

    Next up from AnandTech: Overclocked Handbags!

    Stay tuned - we're still working on the details...
