Power Consumption

This was probably the most fun we had in testing for this review: measuring the power consumption of the 3-way SLI setup. We plugged the entire system into our Extech power meter and bet on how much power it'd use. Let's just say that NVIDIA isn't too far off with its minimum PSU requirements; see for yourself.

At idle, our 3-way SLI testbed drew just shy of 400W. To put that into perspective, that's more power than any of our normal CPU or GPU testbeds draws under full load...just sitting at the Windows desktop. The third 8800 Ultra pulls close to an extra 90W without even trying. Now let's see how much power this thing needs when playing a game.

We started with Crysis, which is normally our most stressful game test. Normally doesn't apply here, though: Crysis didn't scale well from two to three GPUs, with the third graphics card improving performance by only a few percentage points at our most playable settings, so power consumption won't hit its peak here.

Averaging 660W of power, the 3-way SLI system is now using over 2x the power of a normal gaming system outfitted with an 8800 GT. But it gets better.

Bioshock gave us the best scaling of the lot, so the third GPU works its hardest in this test; that means we should be able to get our Extech to produce some stunningly high numbers.

And stunningly high we got. The 3-way SLI system averaged 730W in Bioshock, and get this, we even saw the machine pull over 800W from the wall outlet.

Configuration     Idle Power    Bioshock Load    Crysis Load
8800 Ultra x1     217W          329W             337W
8800 Ultra x2     300W          475W             520W
8800 Ultra x3     388W          730W             660W
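
If you want to play with these numbers yourself, here's a minimal Python sketch of the per-card deltas in the table above. Keep in mind these are whole-system figures measured at the wall, so each delta also folds in PSU inefficiency, and the single-8800 GT load at the end is an assumed value for illustration only; the article itself only says the 3-way rig draws over twice as much.

    # Quick back-of-the-envelope pass over the wall-power numbers above.
    # These are total system draws at the outlet, so per-card deltas also
    # include PSU losses and extra platform load, not just the GPU itself.
    wall_power = {
        # configuration: (idle, Bioshock load, Crysis load) in watts
        "8800 Ultra x1": (217, 329, 337),
        "8800 Ultra x2": (300, 475, 520),
        "8800 Ultra x3": (388, 730, 660),
    }

    configs = list(wall_power)
    for prev, curr in zip(configs, configs[1:]):
        idle_d, bio_d, cry_d = (c - p for p, c in zip(wall_power[prev], wall_power[curr]))
        print(f"{prev} -> {curr}: idle +{idle_d}W, Bioshock +{bio_d}W, Crysis +{cry_d}W")

    # Assumed (not measured here) wall draw for a single-8800 GT gaming system,
    # used only to show how the "over 2x" comparison works out.
    assumed_8800gt_load_w = 320
    ratio = wall_power["8800 Ultra x3"][2] / assumed_8800gt_load_w
    print(f"3-way SLI in Crysis pulls about {ratio:.1f}x an assumed {assumed_8800gt_load_w}W 8800 GT system")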

I was talking to Matthew Witheiler, our first dedicated graphics editor here at AnandTech, and told him how much power the system used under load and at idle. His response? "JESUS". "No," I said, "not even Jesus needs this much power."


Comments

  • paydirt - Friday, February 15, 2008 - link

    Physics belongs on the GPU; Crysis put it on the CPU. (Search: AGEIA Crysis.)

    This is partly why framerates stink in Crysis: it's bogging down a processor that isn't designed to handle physics properly.
  • LtUh8meDoncha - Monday, January 7, 2008 - link

    So yeah. On the first page of these comments OrooOroo hit the nail on the head. If you bought 2 Ultras, buying a third one (even at the end of its lifecycle) isn't going to bother you. It's like upgrading the twin turbos on a Ferrari. No, you don't need to, but it would be cool if you did! There will always be Honda drivers that look at you like you're crazy, but you're not buying it for them (although if you did, I bet their opinion on 3-way SLI would change).

    The article sounds like it was written by someone who knew they would have to return the product and go back to their 22" widescreen and single 8800 GT setup. I love how he/she just brushed off the Bioshock results because they didn't support their argument, then made some half-baked excuse about how CPU speed had something to do with it and dropped it from the off-topic "how many games benefit from 3-way SLI" test. Stick to your benches. That's all you have. If you say one part of your test is faulty, why should I believe any of the others are working?

    Keep it simple. Just the facts. I bought 3-way because, simply put, it IS faster (ugh... I already had 2x SLI GTX and got the third on eBay for like $380, if that makes anyone on a budget feel any better). If you have to justify the cost, you have no business even buying 1 Ultra, much less 2, or even thinking (or talking) about the next-gen top end, because you're not going to buy that either. What you're going to do is try to make excuses why no one should buy the card you can't afford, until a year later when they come out with something that's in your price range and is almost as fast (eghehm... 8800 GT). You'd do better saving your money for some off-brand 17" rims or really nice spinner hubcaps.
  • borisof007 - Thursday, January 3, 2008 - link

    No Xbox or any console game will do well on the PC platform (assuming it was made for the console first), so shut up about it.

    Now, regarding the video cards, Tri SLI is a waste of money, end of discussion. We've beaten this horse for 5 pages now; we can all agree on this.

    Moving on, differentiating between Nvidia and ATI is actually very easy.

    If you want high-end performance no matter the cost, go Nvidia dual SLI. If you want high-end performance with cost in mind but still want solid bang for your buck, go with ATI's 3850/3870 lineup in CrossFire. The 790FX chipset is very nice, and the 3850s offer dominating performance in their category and for their cost.

    Done.
  • LaZr - Thursday, December 27, 2007 - link

    Why buy an NVIDIA card when it doesn't run 3DMark 2008?

    http://r800.blogspot.com/2007/12/3dmark-vantage-br...

    Lack of dx 10.1

    DiggIt that fanboys!!!!!
  • falc0ne - Monday, December 24, 2007 - link

    The graphics are probably the best around these days, but that SIMPLY WON'T JUSTIFY THE AMOUNT OF HARDWARE CONSUMED!
    C'mon guys... get real!
    In my view this path of multiple video cards... is a one-way wrong street. Multiple GPUs on a single board, YES! That would be another story.
    Why didn't Doom 3 or H2 require SLI or CF to work when they appeared?!
    So, CRYTEK, thanks but... no thanks! It's not reasonable at all to pay double (to get an SLI config) to play a SINGLE GAME, which in my view is a better-looking version of Far Cry. Poor story/scenario too, poor idea... You are the one man, one hero, left at the North Pole with a toothbrush, in your underwear, to survive, after which you are transferred to an island to fight Rambo style, Me vs. ALL: "bring it on, you maggots, I'm gonna teach you all...!"
    Well, that's the funny side of it: if you try to entertain yourself (yes, games are supposed to be entertaining, just not anymore), you won't be able to... because you'll be preoccupied with surrounding enemies, your suit's battery, and ammo depletion. Weapons and ammo are scarce, and enemies die rather like in Hitman (very hard); you have to empty 3 clips to take down 3 guys... wow, so much fun...
    Sorry for going somewhat off topic...

  • Pneumothorax - Thursday, December 20, 2007 - link

    In the closing comments the author is basically complaining about the stagnation of the GPU market. Nvidia, with its $1+ billion in cash, should develop multi-core GPU dies instead of the same tried-and-true $$$ approach of releasing year after year of >$500 video cards. Also notice that since ATI is playing a distant second fiddle at the high end, Nvidia has REALLY slowed down on its improvements. We're looking at a long dark age in PC gaming until we get a viable competitor to Nvidia. Intel's delay of its 45nm mainstream chip release due to the Phenom failure is another sign we're heading back to $900+ mainstream chips (remember those dreary P3/early P4 days, until Athlons started cleaning Intel's clock), with stagnation on the CPU end as well.
  • ViperV990 - Wednesday, December 19, 2007 - link

    I'm curious whether it's possible to run three 8800 GTs, each hooked up to its own monitor (say a 20" UXGA LCD), for a nice triple-monitor setup. No SLI whatsoever. If this works as well as the TripleHead2Go from Matrox on the software side, I'd very much be interested in getting it.
  • araczynski - Wednesday, December 19, 2007 - link

    Sadly, I think these kinds of things are what's rapidly taking the 'fun' out of staying in the PC gaming scene. I've been playing PC games since about '86 or so (so much longer than many these days, and yet not as long as many others), but only in the last few years have I been getting 'tired' of all the 'improvements' that hardware companies seem to come up with on a monthly basis. Not to mention the developers who keep giving them reasons to come up with new junk.

    I finally jumped into the console gaming world, have all 3 consoles, and quite frankly it feels much more relaxing these days to play a console game and know that it'll just 'work'.

    There seems to be less and less incentive to waste time with PC gaming every day. As soon as they get real MMOs going on the consoles, I think the PC gaming scene will finally just fade away. And I'll be the first to say 'good riddance'.

    Anyway, just venting. Ignore me.
