DIRT 2, Mass Effect 2, Wolfenstein, & Compute Performance

DIRT 2 is another title modern cards can power on through. Even at 2560 the 2Win gets better than 100fps, turning in another large lead over the GTX 580.

Mass Effect 2 is a rather interesting test because above all else it appears to be texture bound rather than shader bound, which is a very fortunate scenario for the GTX 560 Ti, as it has nearly as much texture throughput as the GTX 580. As a result the 2Win with its two GPUs does exceptionally well here. At 2560 it offers 92fps, and more importantly it surpasses the GTX 580 by a hair over 50%. This is the exception rather than the rule, of course, but it’s also a prime example of why dual-GPU cards can be a threat to high-performance single-GPU cards like the GTX 580.

Wrapping up our gaming benchmarks is Wolfenstein multiplayer. The game becomes CPU limited much beyond 120fps, and even at 2560 the 2Win nearly hits that mark.

Our final benchmark is the Civilization V leader texture compression benchmark, a compute performance benchmark measuring the ability of a DirectCompute program to decompress textures. While not a game in and of itself, it does a good job of highlighting the 2Win’s biggest weakness: it’s only as good as SLI is. Texture decompression isn’t something that can be split among GPUs, and as a result the 2Win is suddenly no better than a regular GTX 560 Ti. At these performance levels that isn’t an issue for Civilization V, but it isn’t the only game using this kind of system. Rage is similar both in application and in its SLI limitations, which becomes an issue because Rage’s CUDA-accelerated texture decoder really needs a GTX 570 or better.
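For context, DirectCompute work like this is dispatched against a single D3D11 device, which is why AFR-style SLI can't simply spread one decompression job across both GPUs. The sketch below is a minimal, generic compute-shader dispatch in C++ meant only to show that shape; the shader body, buffer sizes, and names (kShaderSrc, CSMain, gOut) are placeholders for illustration, not Civilization V's actual decompression code.

```cpp
// Minimal DirectCompute dispatch sketch (generic illustration only).
// Build on Windows with: cl /EHsc cs_dispatch.cpp d3d11.lib d3dcompiler.lib
#include <d3d11.h>
#include <d3dcompiler.h>
#include <cstdio>
#include <cstring>

// Placeholder shader: each thread stands in for decoding one compressed block.
static const char* kShaderSrc = R"(
RWStructuredBuffer<uint> gOut : register(u0);
[numthreads(64, 1, 1)]
void CSMain(uint3 id : SV_DispatchThreadID)
{
    gOut[id.x] = id.x; // a real decoder would write decompressed texels here
}
)";

int main()
{
    // The device is created on one adapter; everything dispatched through it
    // runs on that single GPU, regardless of how many cards are installed.
    ID3D11Device* dev = nullptr;
    ID3D11DeviceContext* ctx = nullptr;
    if (FAILED(D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                 nullptr, 0, D3D11_SDK_VERSION,
                                 &dev, nullptr, &ctx)))
        return 1;

    // Compile the compute shader for Shader Model 5.0 (DX11-class hardware).
    ID3DBlob* bytecode = nullptr;
    ID3DBlob* errors = nullptr;
    D3DCompile(kShaderSrc, strlen(kShaderSrc), nullptr, nullptr, nullptr,
               "CSMain", "cs_5_0", 0, 0, &bytecode, &errors);
    ID3D11ComputeShader* cs = nullptr;
    dev->CreateComputeShader(bytecode->GetBufferPointer(),
                             bytecode->GetBufferSize(), nullptr, &cs);

    // Output buffer plus a UAV so the shader has somewhere to write.
    const UINT kElems = 64 * 1024;
    D3D11_BUFFER_DESC bd = {};
    bd.ByteWidth = kElems * sizeof(UINT);
    bd.BindFlags = D3D11_BIND_UNORDERED_ACCESS;
    bd.MiscFlags = D3D11_RESOURCE_MISC_BUFFER_STRUCTURED;
    bd.StructureByteStride = sizeof(UINT);
    ID3D11Buffer* buf = nullptr;
    dev->CreateBuffer(&bd, nullptr, &buf);

    D3D11_UNORDERED_ACCESS_VIEW_DESC ud = {};
    ud.Format = DXGI_FORMAT_UNKNOWN;
    ud.ViewDimension = D3D11_UAV_DIMENSION_BUFFER;
    ud.Buffer.NumElements = kElems;
    ID3D11UnorderedAccessView* uav = nullptr;
    dev->CreateUnorderedAccessView(buf, &ud, &uav);

    // One thread group per 64 "blocks"; the whole job lands on one GPU.
    ctx->CSSetShader(cs, nullptr, 0);
    ctx->CSSetUnorderedAccessViews(0, 1, &uav, nullptr);
    ctx->Dispatch(kElems / 64, 1, 1);
    ctx->Flush();

    printf("Dispatched %u threads on a single D3D11 device\n", kElems);
    return 0;
}
```

Under AFR, SLI alternates whole frames between GPUs rather than splitting a single dispatch like this one, so a standalone compute job doesn't get any faster with a second GPU; that is the behavior the Civ V benchmark exposes.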

56 Comments

  • Death666Angel - Saturday, November 5, 2011 - link

    "But keep in mind, nVidia has been pushing SLI hard for TEN YEARS "
    You mean 7 years?
  • Revdarian - Saturday, November 5, 2011 - link

    Microstutter is a thing with BOTH companies.

The stutter (real stutter, nothing micro there) experienced in that review was because of the use of Caps on top of a driver that actually already had the Crossfire fix, and if they had contacted AMD about it (bitched at them properly, actually; nothing wrong with bitching a bit directly in private to the company if you are a known reviewer), they would have gotten that answer.

BTW, microstutter varies from person to person, but you will notice it more easily once the average fps of the dual-card solution slips below 60fps.
  • Death666Angel - Saturday, November 5, 2011 - link

    Oh, and here are 2 tests that show CF to not be any worse than SLI concerning micro stutter:
    http://preview.tinyurl.com/6exquor
    http://preview.tinyurl.com/6kuutnl
    Got a spam notice and couldn't post with the normal links. I hope it works with tinyurl...
  • Fiah - Saturday, November 5, 2011 - link

    Micro-stuttering is very much a Nvidia problem as well, just look at the ominously green graphs here: http://techreport.com/articles.x/21516/5

    I'm not convinced that either camp will solve this problem anytime soon, as it is as much a game engine problem as a problem of the drivers/GPU.
  • marraco - Sunday, November 6, 2011 - link

    The article shows that Crossfire does worse than SLI.
  • Fiah - Sunday, November 6, 2011 - link

    Your point being?
  • Uritziel - Monday, November 7, 2011 - link

    Me too! Great performance for the price. The only thing they've not quite been able to tackle is Metro 2033 in 3D at highest settings and 1920x1080 res. Not a single issue with SLI either. Who knows when I'll need to upgrade.
  • Sabresiberian - Tuesday, November 8, 2011 - link

    If you read the article more thoroughly, you will see that it says results vary with application; microstuttering with Nvidia's SLI shows up more in other games and potentially more with different settings.

Another thing I'll say is that microstuttering is one of those things that is terribly annoying to some people, and just isn't noticed at all by others. General reading, though, shows me it's a problem recognized by both Nvidia and AMD. Personally, I say it shows up most in multi-card solutions, but isn't entirely exclusive to them.

    My experience has only been with Crossfire, and I found it very distracting.

I particularly find it annoying that someone would go to the trouble and expense of setting up a multi-card system and end up with worse performance. (We can talk about "performance" in terms of frame rates alone if and only if the quality does not deteriorate; if the quality does, then performance is worse, not better, even if the frame rate improves.) This is an issue that needs to be addressed much more aggressively by both companies, and I will say it does not impress me that it hasn't been solved by either one.

    It makes me long for the days when Matrox was a player in gaming graphics.

    ;)
  • Sabresiberian - Tuesday, November 8, 2011 - link

You have to realize that the card's price also includes the cost of the NF200 bridge chip, which allows non-SLI-capable mainboards to actually use this card.

    From the article:

    "While there were some issues with this on the GTX 460 2WIn, this has apparently been resolved (the presence of NF200 shouldsatisfy all SLI license requirements in the first place). EVGA has said that the 2Win will work on non-SLI mobos, making it fully compatible with every motherboard."

In other words, if your mainboard has a PCIe x16 slot, this card will work in it. Most dual-GPU cards can't do that.

    ;)
  • keitaro - Saturday, November 5, 2011 - link

What's missing are performance numbers at Surround and Eyefinity resolutions. EVGA is also touting Surround 3D capability on this card, and that is something to at least consider. I've seen so many single-monitor scores and these days they bore me. Get us some Surround/Eyefinity benchmark numbers so we can see how the cards fare when pressed with a higher pixel count to render.
