Back at CES 2011 EVGA showed off an interesting concept card featuring 2 GF104 GPUs on a single board. NVIDIA has long designed multi-GPU cards using their high-end GPUs to carve out a market segment above their top single-GPU cards, but while NVIDIA promotes SLI across almost the entire GeForce spectrum, it's promoted strictly as a multi-card option for anything other than those halo cards. Over the years a handful of AMD and NVIDIA's board partners have struck out on their own and designed their own multi-GPU boards, and at CES 2011 EVGA joined that club.

The resulting product was the EVGA GeForce GTX 460 2Win, which combined 2 overclocked GTX 460s onto a single board. Unfortunately for EVGA, NVIDIA launched the GTX 560 Ti and its associated GF114 GPU mere weeks after CES 2011. GF104 was (and still is) a very capable GPU, but at the end of the day GF114 allowed the GTX 560 Ti to offer a 30% performance improvement for only a very slight increase in power consumption. The GTX 460 2Win did well enough for EVGA to continue with the design, but like the GTX 460 itself, it was clear that the 2Win design was never going to reach its full potential with GF104.

So now in November of 2011 EVGA is back with their next 2Win card: the EVGA GeForce GTX 560 Ti 2Win. Having replaced the GF104 GPUs with GF114 and tweaked the board to handle the extra power consumption, EVGA is giving it another shot. And this time they’re gunning for NVIDIA’s flagship single-GPU card, the GTX 580. Their proposition? For only a little more than the GTX 580 they can offer 30% better performance.

                             EVGA GTX 560 Ti 2Win          GTX 580                       GTX 570                      GTX 560 Ti
Stream Processors            2 x 384                       512                           480                          384
Texture Address / Filtering  2 x 64/64                     64/64                         60/60                        64/64
ROPs                         2 x 32                        48                            40                           32
Core Clock                   850MHz                        772MHz                        732MHz                       822MHz
Shader Clock                 1700MHz                       1544MHz                       1464MHz                      1644MHz
Memory Clock (GDDR5)         1002MHz (4008MHz data rate)   1002MHz (4008MHz data rate)   950MHz (3800MHz data rate)   1002MHz (4008MHz data rate)
Memory Bus Width             2 x 256-bit                   384-bit                       320-bit                      256-bit
VRAM                         2 x 1GB                       1.5GB                         1.25GB                       1GB
FP64                         1/12 FP32                     1/8 FP32                      1/8 FP32                     1/12 FP32
Transistor Count             2 x 1.95B                     3B                            3B                           1.95B
Manufacturing Process        TSMC 40nm                     TSMC 40nm                     TSMC 40nm                    TSMC 40nm
Price Point                  $519                          $489                          $329                         $229

EVGA advertises the GTX 560 Ti 2Win as a dual GTX 560 Ti card, and true to their word that's what it is. It's an important distinction to make between the 2Win and ultra high end multi-GPU cards like the GTX 590 and Radeon HD 6990, as both of those are a best effort to squeeze two high-end GPUs into a single card while staying within a 375W power budget under normal operation. The end result is that NVIDIA and AMD have to heavily bin GPUs to find those that will perform at a suitably low voltage, and even then these cards aren't clocked as high as the single-GPU behemoths they're based on.

The 2Win on the other hand is exactly what it says on the label. Composed of 2 GF114 GPUs, the 2Win is a GTX 560 Ti SLI setup on a single card, with all of the specs and none of the compromises we see in ultra high end cards. In fact the 2Win is a factory overclocked card, if only slightly – its 850MHz core clock is a mild 3% higher than the 822MHz core clock of the baseline GTX 560 Ti, while the memory clock is identical at 1002MHz (4008MHz data rate). This is paired with 2GB of GDDR5, which is reduced to 1GB of effective VRAM due to the dual-GPU nature of the card.

When it comes to power consumption EVGA doesn't officially specify a TDP for the 2Win, but given that it's designed to be a true dual GTX 560 Ti, its power requirements closely track those of a GTX 560 Ti SLI setup. In this case that puts the load TDP around 340W, not accounting for any efficiencies gained from having 2 GPUs on a single card or the power consumption of a PCIe bridge chip. As a result this is fairly close to the GTX 590 and Radeon HD 6990, both of which are heavily binned to stay under 375W.

But the real story here of course is the performance for the price. We've seen the performance of the GTX 560 Ti SLI in the past, and it is quite remarkable. For some time now a pair of NVIDIA's mid-tier video cards in SLI have been able to surpass a single high-end card, and this performance is the basis of the 2Win. EVGA promotes the 2Win as being more than 30% faster than the GTX 580, and this is something that's easily achieved in games where SLI scales well.

At the same time the 2Win is priced close to the GTX 580 to further cement its competitive status. EVGA has put the MSRP of the 2Win at $519, which ranges from $50 more than the very cheapest GTX 580s to roughly the same price as factory overclocked models. Ostensibly this makes the 2Win more expensive than the GTX 580, but not significantly so given that we're talking about the high-end video card market. Overall this puts the 2Win in a very good position versus the GTX 580, so long as it can deliver on its 30% performance claims.

Next to its performance against the GTX 580, the other angle EVGA is using to promote the 2Win is the benefits derived from having multiple GPUs on a single card: namely NVIDIA Surround support. As with the GTX 590, by having 2 GPUs on a single card EVGA can team together the display outputs on the GPUs to drive up to 4 displays, versus 2 displays on a single GPU. This gives the 2Win the ability to drive a triple monitor Surround setup on its own, and with 2 GTX 560 Ti GPUs it should have the horsepower to do so in most cases. 3D Vision Surround is also a viable possibility thanks to the 3 DL-DVI ports, but the performance hit from 3D Vision is likely more than the 2Win can handle.

Ultimately the 2Win's status as a multi-GPU card composed of GTX 560 Tis puts it in a unique place. Next to the GTX 580, its only other meaningful competitors are the Radeon HD 6950 CF and the regular GTX 560 Ti in SLI. The bad news for the 2Win is that both of those are cheaper options – you're paying a price premium to get it on a single card.

Meet The EVGA GeForce GTX 560 Ti 2Win
56 Comments
  • Death666Angel - Saturday, November 05, 2011 - link

    "But keep in mind, nVidia has been pushing SLI hard for TEN YEARS "
    You mean 7 years?
    Reply
  • Revdarian - Saturday, November 05, 2011 - link

    Microstutter is a thing with BOTH companies.

    The stutter (real stutter, nothing micro there) experienced in that review was because of the use of CAPs on top of a driver that actually already had the CrossFire fix. If they had contacted AMD about it (bitched at them properly actually, nothing wrong in bitching a bit directly in private to the company if you are a known reviewer), they would have gotten that answer.

    BTW microstutter varies from person to person, but you will feel it "easier" once the average fps of the dual card solution slips below 60fps.
    Reply
  • Death666Angel - Saturday, November 05, 2011 - link

    Oh, and here are 2 tests that show CF to not be any worse than SLI concerning micro stutter:
    http://preview.tinyurl.com/6exquor
    http://preview.tinyurl.com/6kuutnl
    Got a spam notice and couldn't post with the normal links. I hope it works with tinyurl...
    Reply
  • Fiah - Saturday, November 05, 2011 - link

    Micro-stuttering is very much a Nvidia problem as well, just look at the ominously green graphs here: http://techreport.com/articles.x/21516/5

    I'm not convinced that either camp will solve this problem anytime soon, as it is as much a game engine problem as a problem of the drivers/GPU.
    Reply
  • marraco - Sunday, November 06, 2011 - link

    The article shows that Crossfire does worse than SLI.
    Reply
  • Fiah - Sunday, November 06, 2011 - link

    Your point being?
    Reply
  • Uritziel - Monday, November 07, 2011 - link

    Me too! Great performance for the price. The only thing they've not quite been able to tackle is Metro 2033 in 3D at highest settings and 1920x1080 res. Not a single issue with SLI either. Who knows when I'll need to upgrade.
    Reply
  • Sabresiberian - Tuesday, November 08, 2011 - link

    If you read the article more thoroughly, you will see that it says results vary with application; microstuttering with Nvidia's SLI shows up more in other games and potentially more with different settings.

    Another thing I'll say is that microstuttering is one of those things that is terribly annoying to some people, and just isn't noticed at all by others. General reading though shows me it's a problem recognized by both Nvidia and AMD. Personally. I say it shows up most in multi-card solutions, but isn't entirely exclusive to them.

    My experience has only been with Crossfire, and I found it very distracting.

    I particularly find it annoying that someone would go to the trouble and expense of setting up a multi-card system and end up with worse performance. (We can talk about "performance" in terms of frame rates alone if and only if the quality does not deteriorate; if the quality does, then performance is worse, not better, even if the frame rate improves.) This is an issue that needs to be addressed much more aggressively by both companies, and I will say it does not impress me that it hasn't been solved by either one.

    It makes me long for the days when Matrox was a player in gaming graphics.

    ;)
    Reply
  • Sabresiberian - Tuesday, November 08, 2011 - link

    You have to realize that the card also includes the cost of the NF200 bridge chip, which allows non-SLI capable mainboards to actually use this card.

    From the article:

    "While there were some issues with this on the GTX 460 2Win, this has apparently been resolved (the presence of NF200 should satisfy all SLI license requirements in the first place). EVGA has said that the 2Win will work on non-SLI mobos, making it fully compatible with every motherboard."

    In other words, if it's got a PCIe x16 slot in it, it will work in your mainboard. Most dual-GPU cards can't do that.

    ;)
    Reply
  • keitaro - Saturday, November 05, 2011 - link

    What's missing are performance numbers at Surround and Eyefinity resolutions. EVGA is also touting Surround 3D capability on this card and it is something to at least consider. I've seen so many single-monitor scores and these days they bore me. Get us some Surround/Eyefinity benchmark numbers so we can see how they fare when pressed with a higher pixel count to render.
    Reply
