DIRT 2, Mass Effect 2, Wolfenstein, & Compute Performance

DIRT 2 is another title modern cards can power on through. Even at 2560 the 2Win gets better than 100fps, turning in another large lead over the GTX 580.

Mass Effect 2 is a rather interesting test because above all else it appears to be texture bound rather than shader bound, which is a fortunate scenario for the GTX 560 Ti, as it has nearly as much texture throughput as the GTX 580. As a result the 2Win, with its two GPUs, does exceptionally well here. At 2560 it offers 92fps, and more importantly it surpasses the GTX 580 by a hair over 50%. This is the exception rather than the rule of course, but it’s also a prime example of why dual-GPU cards can be a threat to high-performance single-GPU cards like the GTX 580.

Wrapping up our gaming benchmarks is Wolfenstein multiplayer. The game becomes CPU limited much beyond 120fps, and even at 2560 the 2Win nearly hits that mark.

Our final benchmark is the Civilization V leader texture decompression benchmark, a compute benchmark measuring the ability of a DirectCompute program to decompress textures. While not a game in and of itself, it does a good job highlighting the 2Win’s biggest weakness: it’s only as good as SLI is. Texture decompression isn’t something that can be split among GPUs, and as a result the 2Win is suddenly no better than a regular GTX 560 Ti. At these performance levels that isn’t an issue for Civilization V, but it’s not the only game using this kind of system. Rage is similar in application and in its SLI limitations, which becomes an issue because Rage’s CUDA-accelerated texture decoder really needs a GTX 570 or better.
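
To make the SLI limitation concrete, below is a minimal sketch in C++ against the Direct3D 11 API of how a DirectCompute pass like this is issued. It is not Firaxis’ or id’s actual code; the shader contents, buffer size, and thread-group size are illustrative assumptions. The structural point is that the entire workload is a single Dispatch() recorded on the device’s one immediate context, and SLI’s alternate frame rendering only splits work between GPUs at frame granularity, so a lone compute pass like this never gets spread across the 2Win’s two GF114s.

// A minimal sketch, not any game's actual code: one DirectCompute
// "decompression" pass in Direct3D 11. Shader contents, buffer size, and
// thread-group size are illustrative assumptions.
#include <d3d11.h>
#include <d3dcompiler.h>
#include <cstring>
#include <cstdio>
#pragma comment(lib, "d3d11.lib")
#pragma comment(lib, "d3dcompiler.lib")

// Stand-in shader: each thread writes one dword of "decoded" output.
// A real decoder would expand block-compressed texture data here.
static const char* kCS =
    "RWByteAddressBuffer dst : register(u0);\n"
    "[numthreads(64, 1, 1)]\n"
    "void main(uint3 id : SV_DispatchThreadID)\n"
    "{ dst.Store(id.x * 4, id.x); }\n";

int main()
{
    ID3D11Device* dev = nullptr;
    ID3D11DeviceContext* ctx = nullptr;   // single immediate context for the logical device
    if (FAILED(D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                 nullptr, 0, D3D11_SDK_VERSION, &dev, nullptr, &ctx)))
        return 1;

    // Compile and create the compute shader (cs_5_0, as a DX11 title would).
    ID3DBlob* blob = nullptr;
    if (FAILED(D3DCompile(kCS, strlen(kCS), nullptr, nullptr, nullptr,
                          "main", "cs_5_0", 0, 0, &blob, nullptr)))
        return 1;
    ID3D11ComputeShader* cs = nullptr;
    dev->CreateComputeShader(blob->GetBufferPointer(), blob->GetBufferSize(), nullptr, &cs);

    // 1 MB output buffer with a raw UAV (size is arbitrary for the sketch).
    const UINT bytes = 1 << 20;
    D3D11_BUFFER_DESC bd = {};
    bd.ByteWidth = bytes;
    bd.Usage = D3D11_USAGE_DEFAULT;
    bd.BindFlags = D3D11_BIND_UNORDERED_ACCESS;
    bd.MiscFlags = D3D11_RESOURCE_MISC_BUFFER_ALLOW_RAW_VIEWS;
    ID3D11Buffer* dst = nullptr;
    dev->CreateBuffer(&bd, nullptr, &dst);

    D3D11_UNORDERED_ACCESS_VIEW_DESC ud = {};
    ud.Format = DXGI_FORMAT_R32_TYPELESS;
    ud.ViewDimension = D3D11_UAV_DIMENSION_BUFFER;
    ud.Buffer.NumElements = bytes / 4;
    ud.Buffer.Flags = D3D11_BUFFER_UAV_FLAG_RAW;
    ID3D11UnorderedAccessView* uav = nullptr;
    dev->CreateUnorderedAccessView(dst, &ud, &uav);

    // The whole decompression workload is one dispatch. AFR SLI alternates
    // *frames* between GPUs; it has no way to divide a single dispatch, which
    // is why the 2Win is no faster than a single GTX 560 Ti in this benchmark.
    ctx->CSSetShader(cs, nullptr, 0);
    ctx->CSSetUnorderedAccessViews(0, 1, &uav, nullptr);
    ctx->Dispatch(bytes / 4 / 64, 1, 1);
    ctx->Flush();

    printf("compute pass dispatched on a single GPU\n");
    return 0;
}

Built as an ordinary Win32 console program against d3d11.lib and d3dcompiler.lib, the pass runs on whichever GPU the driver assigns, which is exactly the single-GPU behavior this benchmark exposes.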

Comments

  • luv2liv - Friday, November 4, 2011 - link

    they can't make it physically bigger than this?
    I'm disappointed.

    /s
  • phantom505 - Friday, November 4, 2011 - link

    That was so lazy... It looks like they took 3 case fans and tie-strapped them to the top. I think I could have made that look better, and I have no design experience whatsoever.
  • irishScott - Sunday, November 6, 2011 - link

    Well, it apparently works. That's good enough for me, but then again I don't have a side window.
  • Strunf - Monday, November 7, 2011 - link

    Side window and mirrors to see the fans... I don't understand why people even comment on aesthetics; it's not like they'll spend their time looking at the card.
  • phantom505 - Monday, November 7, 2011 - link

    If they were lazy here, where else were they lazy?
  • Sabresiberian - Tuesday, November 8, 2011 - link

    What is obviously lazy here is your lack of thinking and reading before you made your post.
  • Velotop - Saturday, November 5, 2011 - link

    I still have a GTX580 in shrink wrap for my new system build. Looks like it's a keeper.
  • pixelstuff - Saturday, November 5, 2011 - link

    Seems like they missed the mark on pricing. Shouldn't they have been able to price it at exactly 2x a GTX 560 Ti card, or $460? Theoretically they should be saving money on the PCB material, connectors, and packaging.

    Of course we all know that they don't set these price brackets based on how much more a card costs over the next model down. They set prices based on the maximum they think they can get someone to pay. Oh well. It probably would have sold like hot cakes otherwise.
  • Kepe - Saturday, November 5, 2011 - link

    In addition to just raw materials and manufacturing costs, you must also take into account the amount of money poured into the development of the card. This is a custom PCB and as such takes quite a bit of resources to develop. Also, this is a low-volume product that will not sell as many units as a regular 560 Ti does, so all those extra R&D costs must be spread over a small number of products.
    R&D costs on reference designs such as the 560 Ti are pretty close to zero compared to something like the 560 Ti 2Win.
  • Samus - Saturday, November 5, 2011 - link

    I've been running a pair of EVGA GTX 460 768MB cards in SLI with the superclock BIOS for almost 2 years. Still faster than just about any single card you can buy, even now, at a cost of $300 total when I bought them.

    I'm the only one of my friends who didn't need to upgrade their video card for Battlefield 3. I've been completely sold on SLI since buying these cards, and believe me, I'd been avoiding SLI for years for the same reason most people do: compatibility.

    But keep in mind, nVidia has been pushing SLI hard for TEN YEARS with excellent drivers, frequent updates, and compatibility with a wide range of motherboards and GPU models.

    Micro-stutter is an ATI issue. It's not noticeable (and barely measurable) on nVidia cards.

    http://www.tomshardware.com/reviews/radeon-geforce...

    In reference to Ryan's conclusion, I'd say consider SLI for nVidia cards without hesitation. If you're in the ATI camp, get one of their beasts or run three-way CrossFire to eliminate micro-stutter.
