The Test, Power, Temps, and Noise

CPU: Intel Core i7-920 @ 3.33GHz
Motherboard: Asus Rampage II Extreme
Chipset Drivers: Intel 9.1.1.1015 (Intel)
Hard Disk: OCZ Summit (120GB)
Memory: Patriot Viper DDR3-1333 3x2GB (7-7-7-20)
Video Cards: AMD Radeon HD 6990
AMD Radeon HD 6970
PowerColor Radeon HD 6970
EVGA GeForce GTX 590 Classified
NVIDIA GeForce GTX 580
Zotac GeForce GTX 580
Video Drivers: NVIDIA ForceWare 266.58
AMD Catalyst 11.4 Preview
OS: Windows 7 Ultimate 64-bit

With that out of the way, let's start our look at power, temperature, and noise. We have included our jury-rigged triple-CF setup in these results as a comparison point, but please keep in mind that it is not a viable long-term setup, which is why we have starred its results. These results also include the GTX 590 from last week, which has its own handicap under FurMark due to NVIDIA's OCP. This does not apply to the triple SLI setup, on which we can bypass OCP.

Given NVIDIA's higher idle power draw, there shouldn't be any surprises here. Three GTX 580s in SLI make for a fairly wide gap of 37W; in fact, even two GTX 580s in SLI still draw 7W more than the triple 6970 setup. Multi-GPU configurations are always going to be a limited market opportunity, but if it were possible to completely power down unused GPUs, it would certainly improve the idle numbers.

With up to three GPUs, power consumption under load gets understandably high. For FurMark in particular we see the triple GTX 580 setup come just shy of 1200W due to our disabling of OCP – it’s an amusingly absurd number. Meanwhile the triple 6970 setup picks up almost nothing over the dual 6970, which is clearly a result of AMD’s drivers not having a 3-way CF profile for FurMark. Thus the greatest power load we can place on the triple 6970 is under HAWX, where it pulls 835W.

With three cards packed tightly together, the middle card has the most difficult time, so it's that card setting the highest temperatures here. Even so, idle temperatures only tick up a couple of degrees in a triple-GPU configuration.

Even when we forcibly wedge the 6970s apart, the triple 6970 setup still ends up being the warmest under Crysis – this being after Crysis temperatures dropped 9C from the separation. Meanwhile the triple GTX 580 gets quite warm on its own, but under Crysis and HAWX it’s nothing we haven’t seen before. FurMark is the only outlier here, where temperatures stabilized at 95C, 2C under GF110’s thermal threshold. It’s safe, but I wouldn’t recommend running FurMark all day just to prove it.

With a 3rd card in the mix idle noise creeps up some, but much like idle temperatures it’s not significantly more. For some perspective though, we’re still looking at idle noise levels equivalent to the GTX 560 Ti running FurMark, so it’s by no means a silent operation.

It turns out that adding a 3rd card doesn't generate all that much more noise. Under HAWX the triple GTX 580 setup does get 3dB louder, but under FurMark the difference is under a dB. The triple 6970 setup does better in both situations, but that has more to do with our jury-rigging and the fact that FurMark doesn't scale with a 3rd AMD GPU. Amusingly, the triple GTX 580 setup is still quieter under FurMark than the 6990 by nearly 3dB even though we've disabled OCP, and under HAWX the difference is only 0.2dB in AMD's favor. It seems it's simply not possible to do worse than the 6990 without overvolting/overclocking.
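
For a bit of context on what these deltas mean, the dB scale is logarithmic: a 3dB gap corresponds to roughly double the radiated sound power, while a 0.2dB gap is essentially inaudible. A minimal Python sketch of the standard conversion (nothing here is a measured value from our testing, just the math):

    # Convert a dB delta between two noise readings into a sound-power ratio.
    def power_ratio(delta_db: float) -> float:
        return 10 ** (delta_db / 10)

    print(power_ratio(3.0))  # ~2.0x - the 3dB HAWX delta noted above
    print(power_ratio(0.2))  # ~1.05x - effectively indistinguishable by ear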

Comments

  • A5 - Sunday, April 3, 2011 - link

    I'm guessing Ryan doesn't want to spend a month redoing all of their benchmarks over all the recent cards. Also the only one of your more recent games that would be at all relevant is Shogun 2 - SC2 runs well on everything, no one plays Arma 2, and the rest are console ports...
  • slickr - Sunday, April 3, 2011 - link

    Apart from Shift, no game here is a console port.

    Mafia 2, SC2, Arma 2, and Shogun 2 are PC-only; Dead Space is also not a console port, it's a PC port to consoles.
  • Ryan Smith - Sunday, April 3, 2011 - link

    We'll be updating the benchmark suite in the next couple of months, in keeping with our twice-a-year schedule. Don't expect us to drop Civ V or Crysis, however.
  • Dustin Sklavos - Monday, April 4, 2011 - link

    Jarred and I have gone back and forth on this stuff to get our own suite where it needs to be, and the games Ryan's running have sound logic behind them. For what it's worth...

    Aliens vs. Predator isn't worth including because it doesn't really leverage that much of DX11 and nobody plays it because it's a terrible game. Crysis Warhead STILL stresses modern gaming systems. As long as it does that it'll be useful, and at least provides a watermark for the underwhelming Crysis 2.

    Battleforge and Shogun 2 I'm admittedly not sure about, same with HAWX and Shift 2.

    Civ 5 should stay, but StarCraft II should definitely be added. There's a major problem with SC2, though: it's horribly, HORRIBLY CPU bound. SC2 is criminally badly coded given how long it's been in the oven and doesn't scale AT ALL with more than two cores. I've found situations even with Sandy Bridge hardware where SC2 is more liable to demonstrate how much the graphics drivers and subsystem hit the CPU rather than how the graphics hardware itself performs. Honestly my only justification for including it in our notebook/desktop suites is because it's so popular.

    Mass Effect 2 to Dead Space 2 doesn't make any sense; Dead Space 2 is a godawful console port while Mass Effect 2 is currently one of the best if not THE best optimized Unreal Engine 3 games on the PC. ME2 should get to stay almost entirely by virtue of being an Unreal Engine 3 representative, ignoring its immense popularity.

    Wolfenstein is currently the most demanding OpenGL game on the market. It may seem an oddball choice, but it really serves the purpose of demonstrating OpenGL performance. Arma 2 doesn't fill this niche.

    Mafia II's easy enough to test that it couldn't hurt to add it.
  • JarredWalton - Monday, April 4, 2011 - link

    Just to add my two cents....

    AvP is a lousy game, regardless of benchmarks. I also toss HAWX and HAWX 2 into this category, but Ryan has found a use for HAWX in that it puts a nice, heavy load on the GPUs.

    Metro 2033 and Mafia II aren't all that great either, TBH, and so far Crysis 2 is less demanding *and* less fun than either of the two prequels. (Note: I finished both Metro and Mafia, and I'd say both rate something around 70%. Crysis 2 is looking about 65% right now, but maybe it'll pick up as the game progresses.)
  • c_horner - Sunday, April 3, 2011 - link

    I'm waiting for the day when someone actually reports on the perceived usability of multi-GPU setups in comparison to a single high-end GPU.

    What I mean is this: oftentimes, even though you might be getting an arbitrarily larger frame count, the lag and overall lack of smoothness mean the game is nowhere near as playable and enjoyable as one that can be run properly on a single GPU.

    Having tried SLI in the past, I was left with a rather large distaste for plopping down the cost of another high-end card. Not all games worked properly, not all games scaled well, some games would scale well in the areas they could render easily but minimum frame rates sucked, etc., and the list goes on.

    When are some of these review sites going to post subjective and real world usage information instead of a bunch of FPS comparisons?

    There's more to the story here.
  • semo - Sunday, April 3, 2011 - link

    I think this review covers some of your concerns. It seems that AMD, with their latest drivers, achieves better minimum FPS scores than nVidia.

    I've never used SLI myself, but I would think you wouldn't be able to notice the latency from having more than one GPU in game. Wouldn't such latencies be in the microseconds?
  • SlyNine - Monday, April 4, 2011 - link

    And yet, those microseconds seemed like macro seconds. Microstutter was one of the most annoying things ever! I hated my 8800GT SLI experience.

    Haven't been back to multi videocard setups since.
  • DanNeely - Monday, April 4, 2011 - link

    Look at HardOCP.com's reviews. Instead of FPS numbers from canned benchmarks, they play the games and list the highest settings that were acceptable. Minimum FPS levels and, for SLI/CrossFire, microstuttering problems can push their recommendations down, because even when the average numbers look great the experience might not actually be playable.
  • robertsu - Sunday, April 3, 2011 - link

    How is microstuttering with 3 GPUs? Is there any in these new versions?
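
On the microstutter question raised in the last few comments: an average FPS figure can look fine while alternating long and short frame times make a multi-GPU setup feel choppy, which is why frame-time data tells more of the story than the average alone. A small illustrative Python sketch (the frame-time numbers are made up for illustration, not captured from any of the cards tested here):

    # Hypothetical frame-time log in milliseconds; real data would come from a
    # frame-time capture tool such as FRAPS' frametimes logging.
    frame_times = [16.7, 33.1, 16.9, 32.8, 17.0, 33.0, 16.8, 32.9]

    avg_fps = 1000 / (sum(frame_times) / len(frame_times))
    worst_fps = 1000 / max(frame_times)

    # Microstutter shows up as large swings between consecutive frames,
    # even when the average frame rate looks healthy.
    swings = [abs(b - a) for a, b in zip(frame_times, frame_times[1:])]
    mean_swing = sum(swings) / len(swings)

    print(f"average FPS: {avg_fps:.1f}")                     # ~40 FPS on paper
    print(f"worst-frame FPS: {worst_fps:.1f}")               # ~30 FPS
    print(f"frame-to-frame swing: {mean_swing:.1f} ms avg")  # ~16 ms of jitter

Here the log averages about 40 FPS, yet every other frame takes roughly twice as long as its neighbor, which is exactly the alternating pattern people describe as microstutter.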
