Crysis: Warhead, BattleForge, & Metro 2033

As we mentioned previously, the EVGA GeForce GTX 560 Ti 2Win is a one-off product. At $520 it doesn’t have any specific competition – at least none that’s reciprocal – but EVGA likes to call it a GTX 580 competitor, and it’s definitely priced close enough to the GTX 580 to make that a meaningful comparison. As a multi-GPU product it also competes with multi-card setups, primarily the regular GTX 560 Ti SLI and the Radeon HD 6950 CF.

Starting as always with Crysis, we get a good setup for the rest of the benchmarks to come. The 2Win is well ahead of the similarly priced GTX 580, turning in a score 30% better. At the same time its performance is almost identical to that of the regular GTX 560 Ti SLI; the 2Win’s only advantage on paper is its 3% higher core clock, and at no point here do the two separate by an appreciable margin.

Looking at the raw numbers, the 2Win turns in a solid result at 2560 and is well above 60fps at 1920. SLI doesn’t change the fact that AMD and NVIDIA regularly jockey for position depending on the game, however, so even with a strong showing here the 6950 CF still ends up being quite a bit faster.

The Crysis minimum framerate test is one of the handful of tests we have right now that pays much attention to more than 1GB of VRAM. The fact that the 2Win does worse than the regular GTX 560 Ti SLI is not a mistake here – it’s a telltale sign of running low on VRAM and having to swap resources in and out. The 2Win is generally capable of handling 2560 in terms of rendering power, but with 1GB of effective VRAM it can run into other bottlenecks first, as we see here.
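
To make the “1GB of effective VRAM” point concrete, here’s a minimal illustrative sketch (Python, not part of our test suite; the helper function is hypothetical) of how capacity works under AFR-style SLI/CrossFire: each GPU keeps its own full copy of the working set in its local VRAM, so usable capacity stays at the per-GPU amount while only compute throughput and aggregate bandwidth scale with GPU count.

    # Illustrative sketch only: effective VRAM under AFR-style SLI/CrossFire.
    # Each GPU holds its own full copy of the working set in local VRAM, so
    # capacity does not add up across GPUs; only compute and aggregate memory
    # bandwidth scale (at best) with GPU count.
    def effective_vram_gb(per_gpu_vram_gb, num_gpus):
        # num_gpus deliberately has no effect on capacity -- assets are duplicated.
        return per_gpu_vram_gb

    # GTX 560 Ti 2Win: two GPUs with 1GB each, advertised as "2GB", 1GB usable.
    print(effective_vram_gb(1.0, 2))  # -> 1.0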

Given that it’s a good deal more powerful than the GTX 580, the 2Win has no problem breezing through BattleForge. Even at 2560 it hits 88fps, 35% better than the GTX 580.

When it comes to SLI scaling Metro has always been a challenge, and as a result the 2Win loses some of its advantage here. It’s still ahead of the GTX 580 at 2560, but only by 16%. Again we’re looking at a potential lack of VRAM, but also the result of Metro’s significant workload at 2560. This is one of the only titles we currently use that is still a struggle for every card we have, and as a result it’s one of the few titles that the 2Win just isn’t cut out for at 2560. It’s only at 1920 that the 2Win can pick up enough speed to be playable, at which point its lead over the GTX 580 grows slightly to 20%.

56 Comments

  • Tchamber - Monday, November 7, 2011 - link

    I never understood why they advertise 2GB, but the review says 1GB of useable memory. Can someone explain that to me?
  • Ryan Smith - Monday, November 7, 2011 - link

    With current multi-GPU technology (SLI and Crossfire), each GPU has its own copy of the working set of data and assets. Furthermore they can only work from their local VRAM, and they cannot (practically) work from the other GPU's VRAM.

    As a result you must duplicate the working set and assets across each GPU, which means you get half the effective VRAM.

    At the end of the day boardmakers can advertise having 2GB because it does in fact physically have 2GB, but logically it only has 1GB to work with.
  • Tchamber - Friday, November 11, 2011 - link

    Thanks for clearing that up Ryan. So in any multi-GPU setup, the VRAM does not increase unless the card itself has more of its own. Interesting limitation, so only compute power and bandwidth increase?
  • AnnonymousCoward - Monday, November 7, 2011 - link

    It seems like it might make sense for nvidia to base their architecture on having 4, 8, or 16 GPU dies on every board. This would improve yield across the entire low/medium/high end due to smaller die sizes, and it would give a huge boost to the high end (assuming power limits are figured out). In today's age of supercomputers having 4000 chips per rack, it does not seem optimal to put just 1 or 2 per video card PCB.
  • Sabresiberian - Tuesday, November 8, 2011 - link

    Great job, as usual; I have to agree with the conclusions made under "Final Thoughts". The only reason I'd go this route is if I needed the connectivity in terms of monitors and only had a single PCIe x16 slot on my mainboard. That being said, a 30% performance increase in your particular favorite game is nothing to ignore.

    One of the things I've been hoping is that EVGA would send Anandtech or Tomshardware (preferably both) a Classified card so one of these sites could run thorough overclocking tests on it. I highly doubt that the Classified could make up the 30% difference on air, but how much better than stock it can reach will be good to know before I buy.

    (It would also be interesting to know when AMD is going to release their next-gen GPU and whether or not it's going to be worth waiting a month or so for, but their recent CPU release puts them in the "I'm not holding my breath" category.)

    ;)
  • Wakanabi - Monday, February 6, 2012 - link

    I went with this card and I'll tell you why

    TEMPERATURE!!!

    I had two 560 Tis when I first built my PC, and with the SLI bridge and the cards close together on the board, one card would be up at 65 to 70 Celsius under full load, and the other would be at 85 to 92!

    Anytime you have multiple graphics cards, the fan on the top card is pulling hot air directly off the other card's PCB backside.

    I had sold one of my 560s a while back for full price ($250), so instead of buying another one now, I sold the other for $200 and bought the 2Win. Now I only get up to 78 Celsius total. And once I change my case next week to a higher-airflow case it will be even better.

    This is the best card I've ever had, better than two 6870s, a single GTX 580, or even the 6990 I was using for mining. I have mine overclocked to 900MHz and get another 10-12% increase in performance. Unbeatable as far as single cards go, especially considering the 6990 is $700 and the 590 is around that too.
