The Test, Power, Temperature, & Noise

CPU: Intel Core i7-920 @ 3.33GHz
Motherboard: Asus Rampage II Extreme
Chipset Drivers: Intel 9.1.1.1015
Hard Disk: OCZ Summit (120GB)
Memory: Patriot Viper DDR3-1333 3x2GB (7-7-7-20)
Video Cards: AMD Radeon HD 6970
AMD Radeon HD 6950
NVIDIA GeForce GTX 580
NVIDIA GeForce GTX 570
NVIDIA GeForce GTX 560 Ti
EVGA GeForce GTX 560 Ti 2Win
Video Drivers: NVIDIA GeForce Driver 285.62
AMD Catalyst 11.9
OS: Windows 7 Ultimate 64-bit

As the gaming performance of the GTX 560 Ti 2Win is going to be rather straightforward – it’s a slightly overclocked GTX 560 Ti SLI – we’re going to mix things up and start with a look at the unique aspects of the card. The 2Win is the only dual-GPU GTX 560 Ti on the market, so its power/noise/thermal characteristics are rather unique.

GeForce GTX 560 Ti Series Voltage
2Win GPU 1 Load: 1.025v
2Win GPU 2 Load: 1.05v
GTX 560 Ti Load: 0.95v

It’s interesting to note that the 2Win does not have a common GPU voltage like other dual-GPU cards. EVGA wanted it to be 2 GTX 560 Tis in a single card and it truly is, right down to different voltages for each GPU. One of the GPUs on our sample runs at 1.025v, while the other runs at 1.05v. The latter is a bit higher than any other GF114 product we’ve seen, which indicates that EVGA may be goosing the 2Win a bit. The lower power consumption of GF114 (versus GF110 in the GTX 590) means that the 2Win doesn’t need to adhere to a strict voltage requirement to make spec.

Kicking things off as always is idle power. Unfortunately we don’t have a second reference GTX 560 Ti on hand, so we can’t draw immediate comparisons to a GTX 560 Ti SLI setup. However, we believe the 2Win should draw a bit less power than a two-card setup in virtually all cases.

In any case, as is to be expected with 2 GPUs, the 2Win draws more power than any single-GPU card, even with GF114’s low idle power consumption. At 180W it’s 7W over the GTX 580, and actually 9W over the otherwise more powerful Radeon HD 6990. But at the same time it’s below any true dual-card setup.

Moving on to load power, we start with the venerable Crysis. As the 2Win is priced against the GTX 580, that's the card to watch for both power characteristics and gaming performance. Starting there, we can see that the 2Win setup draws significantly more power under Crysis – 496W versus 389W – which again is to be expected for a dual-GPU card. As we’ll see in the gaming performance section the 2Win is going to be notably faster than the GTX 580, but the cost will be power.

One thing that caught us off guard here was that power consumption is almost identical to the GTX 590 and the Radeon HD 6990.  At the end of the day those cards are a story about the benefits of aggressive chip binning, but it also means that the 2Win is drawing similar amounts of power for not nearly the performance. Given the 2Win’s much cheaper pricing these cards aren’t direct competitors, but it means the 2Win doesn’t have the same aggressive performance-per-watt profile we see in most other dual-GPU cards.
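The performance-per-watt comparison above can be made concrete with a small sketch. The wattages (496W for the 2Win and 389W for the GTX 580 under Crysis) come from this review; the frame rates below are invented placeholders, since the gaming results appear in a later section.

```python
# Hypothetical illustration of the performance-per-watt point above.
# Wattages are from this review; the fps values are placeholders only.

def fps_per_watt(fps, watts):
    """Simple performance-per-watt metric: frames per second per watt."""
    return fps / watts

cards = {
    "GTX 560 Ti 2Win": (60.0, 496),  # placeholder fps, measured watts
    "GTX 580": (45.0, 389),          # placeholder fps, measured watts
}
for name, (fps, watts) in cards.items():
    print(f"{name}: {fps_per_watt(fps, watts):.3f} fps/W")
```

With any plausible frame rates plugged in, the metric shows why a dual-GPU card must be substantially faster, not just somewhat faster, to match a single GPU's efficiency profile.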

As NVIDIA continues to use OverCurrent Protection (OCP) on their GTX 500 series, FurMark is largely hit or miss depending on how it’s being throttled. In this case we’ve seen an interesting throttle profile that we haven’t experienced in past reviews: the 2Win would quickly peak at over 500W, retreat to anywhere between 450W and 480W, then once again rise to over 500W and come back down, with the framerate fluctuating alongside the power draw. This is in contrast to a hard cap, where we’d see power draw stay constant. 510W was the highest wattage we saw sustained for over 10 seconds. That’s 40W less than the Radeon HD 6990, 50W more than the heavily capped GTX 590, and only 20W off of the GTX 580. In essence we can see the throttle working to keep power consumption not much higher than what we see with the games in our benchmark suite.
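The spiky throttle behavior described above can be sketched as a toy control loop. This is purely illustrative and not NVIDIA's actual OCP algorithm; the 510W peak, 450W floor, and step size are loose readings of the figures in this review.

```python
# Illustrative sketch (not NVIDIA's real OCP logic): contrast a hard power
# cap, which clamps every sample to a constant ceiling, with an oscillating
# throttle that backs off after hitting a peak and then ramps back up --
# the sawtooth behavior described above.

def hard_cap(demand_w, cap_w=510):
    """A hard cap simply clamps every sample to the ceiling."""
    return [min(d, cap_w) for d in demand_w]

def oscillating_throttle(demand_w, peak_w=510, floor_w=450, step_w=15):
    """Back off after hitting the peak, then ramp up again from the floor."""
    out, limit, backing_off = [], peak_w, False
    for d in demand_w:
        drawn = min(d, limit)
        out.append(drawn)
        if drawn >= peak_w:
            backing_off = True          # peak hit: start backing off
        elif limit <= floor_w:
            backing_off = False         # floor hit: start ramping up
        limit = limit - step_w if backing_off else min(limit + step_w, peak_w)
    return out

demand = [520] * 12  # FurMark-like constant demand above the cap
print(hard_cap(demand))              # flat at the cap
print(oscillating_throttle(demand))  # sawtooth between 450W and 510W
```

Under the hard cap the reported draw is flat; under the oscillating throttle it rises and falls between the floor and the peak, which is exactly the pattern our power meter showed.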

Thanks to the open air design of the 2Win, idle temperatures are quite good. It can’t match the GTX 560 Ti of course, but even with the cramped design of a dual-GPU card our warmest GPU only idles at 36C, below the GTX 580, and just about every other card for that matter.

Looking at GPU temperatures while running Crysis, the open air design again makes itself noticed. Our warmest GPU hits 81C, 10C warmer than a single GTX 560 Ti, but only 2C warmer than a GTX 580 in spite of the extra heat being generated. This also ends up being several degrees cooler than the 6990 and GTX 590. The tradeoff is that 300W+ of heat is being dumped into the case, whereas the other dual-GPU cards dump only roughly half that. If we haven’t made it clear before we’ll make it clear now: you’ll need good case ventilation to keep the 2Win this cool.

Because the OCP throttling keeps power consumption under FurMark so close to what it was under Crysis, our temperatures don’t change a great deal when looking at FurMark. Just as with the power situation the temperature situation is spiky; 85C was the hottest spike before temperatures dropped back down to the low 80s. As a result the 2Win’s 85C is in good company among our FurMark results.

We haven’t reviewed very many video cards with 3 (or more) fans, but generally speaking more fans result in more noise. The 2Win adheres to this rule of thumb, humming along at 43dB. This is slightly louder than a number of single-GPU cards, but still quiet enough that at least on our testbed it doesn’t make much of a difference. All the same it goes without saying that the 2Win is not for those of you seeking silence.

At 56dB the load noise chart makes the 2Win look very good, and to be honest I don’t entirely agree with the numbers. Objectively the 2Win is quieter than the GTX 580, but subjectively it has a slight whine to it that blowers simply don’t have. The 2Win may technically be quieter, but I’d say it’s more noticeable than the GTX 580 or similar cards. With that said it’s definitely still quieter and less noticeable than our lineup of multi-GPU configurations, and of course the poor GTX 480. Ultimately it’s quite quiet for a dual-GPU configuration (let alone one on a single card), but it has to make tradeoffs to properly cool a pair of GPUs.
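As a rule of thumb for reading these noise numbers: decibels are logarithmic, so small gaps imply large power ratios. A minimal sketch of the standard conversion, using only the 2Win figures quoted above:

```python
def power_ratio(delta_db):
    """Sound power ratio implied by a dB difference (10 dB = 10x power)."""
    return 10 ** (delta_db / 10)

# A ~3 dB gap roughly doubles sound power, while a 10 dB gap is ten times
# the power and is commonly perceived as roughly "twice as loud".
print(round(power_ratio(3), 2))  # → 2.0
print(power_ratio(10))           # → 10.0
```

By this math the 13dB gap between the 2Win's 43dB idle and 56dB load figures corresponds to roughly 20x the radiated sound power, which is why load noise dominates any subjective impression of the card.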

56 Comments

  • Death666Angel - Saturday, November 5, 2011 - link

    "But keep in mind, nVidia has been pushing SLI hard for TEN YEARS "
    You mean 7 years?
  • Revdarian - Saturday, November 5, 2011 - link

    Microstutter is a thing with BOTH companies.

    The stutter (real stutter, nothing micro there) experienced in that review was because of the use of Caps on top of the driver that actually already had the crossfire fix, and if they would have contacted AMD about it (bitched at them properly actually, nothing wrong in bitching a bit directly on private to the company if you are a known reviewer), they would have got that answer.

    BTW microstutter depends from person to person, but you will feel it "easier" once the average fps of the dual card solution slips below 60fps.
  • Death666Angel - Saturday, November 5, 2011 - link

    Oh, and here are 2 tests that show CF to not be any worse than SLI concerning micro stutter:
    http://preview.tinyurl.com/6exquor
    http://preview.tinyurl.com/6kuutnl
    Got a spam notice and couldn't post with the normal links. I hope it works with tinyurl...
  • Fiah - Saturday, November 5, 2011 - link

    Micro-stuttering is very much a Nvidia problem as well, just look at the ominously green graphs here: http://techreport.com/articles.x/21516/5

    I'm not convinced that either camp will solve this problem anytime soon, as it is as much a game engine problem as a problem of the drivers/GPU.
  • marraco - Sunday, November 6, 2011 - link

    The article shows that Crossfire does worse than SLI.
  • Fiah - Sunday, November 6, 2011 - link

    Your point being?
  • Uritziel - Monday, November 7, 2011 - link

    Me too! Great performance for the price. The only thing they've not quite been able to tackle is Metro 2033 in 3D at highest settings and 1920x1080 res. Not a single issue with SLI either. Who knows when I'll need to upgrade.
  • Sabresiberian - Tuesday, November 8, 2011 - link

    If you read the article more thoroughly, you will see that it says results vary with application; microstuttering with Nvidia's SLI shows up more in other games and potentially more with different settings.

    Another thing I'll say is that microstuttering is one of those things that is terribly annoying to some people, and just isn't noticed at all by others. General reading though shows me it's a problem recognized by both Nvidia and AMD. Personally, I say it shows up most in multi-card solutions, but isn't entirely exclusive to them.

    My experience has only been with Crossfire, and I found it very distracting.

    I particularly find it annoying that someone would go to the trouble and expense of setting up a multi-card system and end up with worse performance. (We can talk about "performance" in terms of frame rates alone if and only if the quality does not deteriorate; if the quality does, then performance is worse, not better, even if the frame rate improves.) This is an issue that needs to be addressed much more aggressively by both companies, and I will say it does not impress me that it hasn't been solved by either one.

    It makes me long for the days when Matrox was a player in gaming graphics.

    ;)
  • Sabresiberian - Tuesday, November 8, 2011 - link

    You have to realize that the card also includes the cost of NF200 bridge chip, which allows non-SLI capable mainboards to actually use this card.

    From the article:

    "While there were some issues with this on the GTX 460 2Win, this has apparently been resolved (the presence of NF200 should satisfy all SLI license requirements in the first place). EVGA has said that the 2Win will work on non-SLI mobos, making it fully compatible with every motherboard."

    In other words, if it's got a PCIe x16 slot in it, it will work in your mainboard. Most dual-GPU cards can't do that.

    ;)
  • keitaro - Saturday, November 5, 2011 - link

    What's missing are performance numbers on Surround and Eyefinity resolutions. EVGA is also touting Surround 3D capability on this card and it is something to at least consider. I've seen so many single-monitor scores and these days they bore me. Get us some Surround/Eyefinity benchmark numbers so we can see how they fare when pressed with higher pixel count to render.
