The Test, Power, Temps, and Noise

CPU: Intel Core i7-920 @ 3.33GHz
Motherboard: Asus Rampage II Extreme
Chipset Drivers: Intel 9.1.1.1015 (Intel)
Hard Disk: OCZ Summit (120GB)
Memory: Patriot Viper DDR3-1333 3x2GB (7-7-7-20)
Video Cards: AMD Radeon HD 6990
AMD Radeon HD 6970
PowerColor Radeon HD 6970
EVGA GeForce GTX 590 Classified
NVIDIA GeForce GTX 580
Zotac GeForce GTX 580
Video Drivers: NVIDIA ForceWare 266.58
AMD Catalyst 11.4 Preview
OS: Windows 7 Ultimate 64-bit

With that out of the way, let’s start our look at power, temperature, and noise. We have included our jury-rigged triple-CF setup in these results for the sake of comparison, but please keep in mind that it is not a viable long-term setup, which is why we have starred its results. These results also include the GTX 590 from last week, which carries its own handicap under FurMark due to NVIDIA’s OCP. That handicap does not apply to the triple SLI setup, where we are able to bypass OCP.

Given NVIDIA’s higher idle power consumption, there shouldn’t be any surprises here. Three GTX 580s in SLI make for a fairly wide gap of 37W – in fact even two GTX 580s in SLI draw 7W more than the triple 6970 setup. Multi-GPU configurations are always going to be a limited market opportunity, but if it were possible to completely power down unused GPUs, it would certainly improve the idle numbers.

With up to three GPUs, power consumption under load gets understandably high. For FurMark in particular we see the triple GTX 580 setup come just shy of 1200W due to our disabling of OCP – it’s an amusingly absurd number. Meanwhile the triple 6970 setup picks up almost nothing over the dual 6970, which is clearly a result of AMD’s drivers not having a 3-way CF profile for FurMark. Thus the greatest power load we can place on the triple 6970 is under HAWX, where it pulls 835W.

With three cards packed tightly together, the middle card has the most difficult time, so it’s that card which sets the highest temperatures here. Even so, idle temperatures only tick up a couple of degrees in a triple-GPU configuration.

Even when we forcibly wedge the 6970s apart, the triple 6970 setup still ends up being the warmest under Crysis – this being after Crysis temperatures dropped 9C from the separation. Meanwhile the triple GTX 580 gets quite warm on its own, but under Crysis and HAWX it’s nothing we haven’t seen before. FurMark is the only outlier here, where temperatures stabilized at 95C, 2C under GF110’s thermal threshold. It’s safe, but I wouldn’t recommend running FurMark all day just to prove it.

With a 3rd card in the mix, idle noise creeps up some, but much like idle temperatures it’s not significantly higher. For some perspective though, we’re still looking at idle noise levels equivalent to the GTX 560 Ti running FurMark, so it’s by no means a silent operation.

It turns out adding a 3rd card doesn’t make all that much more noise. Under HAWX the triple GTX 580 setup does get 3dB louder, but under FurMark the difference is under a dB. The triple 6970 setup does better in both situations, but that has more to do with our jury-rigging and the fact that FurMark doesn’t scale with a 3rd AMD GPU. Amusingly, the triple GTX 580 setup is still quieter under FurMark than the 6990 by nearly 3dB even though we’ve disabled OCP for the GTX 580s, and under HAWX the difference is only 0.2dB in AMD’s favor. It’s simply not possible to do worse than the 6990 without overvolting/overclocking, it seems.
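
For some context on why the third card adds so little, noise from roughly independent sources combines on a power basis rather than linearly, so going from two to three similar cards only adds about 1.8dB in the idealized case. Below is a minimal sketch of that arithmetic in Python; the 55dB(A) per-card figure is a hypothetical placeholder rather than anything we measured.

```python
import math

def combined_spl(levels_db):
    """Combine incoherent noise sources: convert each SPL to a relative
    power, sum them, and convert the total back to decibels."""
    return 10 * math.log10(sum(10 ** (level / 10) for level in levels_db))

# Hypothetical per-card level of 55 dB(A); not a measured value.
two_cards = combined_spl([55, 55])        # ~58.0 dB(A)
three_cards = combined_spl([55, 55, 55])  # ~59.8 dB(A)
print(f"2 cards: {two_cards:.1f} dB(A)")
print(f"3 cards: {three_cards:.1f} dB(A)")
print(f"delta:   {three_cards - two_cards:.1f} dB")
```

In practice fan speeds ramp with the hotter middle card, which likely explains why the HAWX delta slightly exceeds that idealized figure.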

Comments

  • Ryan Smith - Sunday, April 3, 2011 - link

    It took a while, but we finally have 3 120Hz 1080P monitors on the way. So we'll be able to test Eyefinity, 3D Vision, and 3D Vision Surround; all of which have been neglected around here.
  • Kaboose - Sunday, April 3, 2011 - link

    I await these tests with breathless anticipation!
  • veri745 - Sunday, April 3, 2011 - link

    While this article was very well written, I think it is hardly worth it without the multi-monitor data. No-one (sane) is going to get 3x SLI/CF with a single monitor, so it's mostly irrelevant.

    The theoretical scaling comparison is interesting, but I'm a lot more interested in the scaling at 3240x1920 or 5760x1080.
  • DanNeely - Sunday, April 3, 2011 - link

    This is definitely a step in the right direction; but with other sites having 3x 1920x1200 or even 3x 2560x1600 test setups you'll still be playing catchup.
  • RK7 - Sunday, April 3, 2011 - link

    Finally! I created an account just to write this comment :) That's what's missing and what definitely needs to be tested! Especially 3D Vision Surround – it would be good to know whether it's worth putting so much money into such a setup, because a single card can already be on the edge of playable performance for modern games in stereoscopic mode on a single monitor. A good example is Metro 2033, which is mind-blowing in 3D, but I found that with a single GTX 570 @ 900MHz it's only playable at 1600x900 in 3D with maximum settings (without DoF and AA), and even then it can drop to ~12 fps in some action scenes with heavy lighting. So if three cards can scale well and deliver per-monitor performance on a 3-monitor setup close to what a single card delivers on one monitor, then we're there and it's definitely worth it; but if the numbers look like the single-monitor scaling, then folks should be aware that there's no way to get maximum visual quality on 3 monitors with current hardware.
  • Dustin Sklavos - Monday, April 4, 2011 - link

    Not completely neglected. I've added triple-monitor surround testing to my boutique desktop reviews whenever able. :)
  • Crazymech - Sunday, April 3, 2011 - link

    I'm having my doubts about the capabilities of the 920 OC'd to 3.33 GHz matched up with 3 of the most powerful single GPUs.

    I understand straying away from SB because of the lanes, but you could at least have upped the OC to 3.8-4.0GHz, which many people do (and which I would think most who consider a triple setup would use).

    To underline it I point to the small differences between the 4.5GHz 2600K and the lower-clocked one in the boutique build reviews, with the highest-clocked CPU coupled with weaker GPUs nipping at the heels of the more powerful GPU.

    I suggest you at least experiment in a single test (say Metro, for example, or Battlefield) with what a higher-clocked X58 CPU (or the 980's 6 cores) could do for the setup.
    If I'm wrong, it would at least be good to know that.
  • BrightCandle - Sunday, April 3, 2011 - link

    The fact that Sandy Bridge has a PCIe lane problem is grounds for testing the impact.

    Still, I would rather see the numbers on X58 and triple-screen gaming before seeing the impact that SB has on the performance of SLI/CF setups.
  • Ryan Smith - Sunday, April 3, 2011 - link

    For what it's worth, 3.33GHz is actually where this specific 920 tops out. It won't take 3.5GHz or higher, unfortunately.

    We'll ultimately upgrade our testbed to SNB - this article is the impetus for that - but that's not going to happen right away.
  • Crazymech - Monday, April 4, 2011 - link

    It won't take 3.5? Really? That amazes me.
    Though it's very unfortunate for the purpose of this test.

    The main focus is (of course) always on how a new GPU improves framerates relative to a standard CPU, but once in a while it would be interesting to see what different CPUs do for GPU performance and FPS as well. Like the old Doom III articles showing the Athlon dominating the PenIV.

    Thanks for the answer anyhows :).
