HAWX, Civ V, Battlefield BC2, & STALKER

HAWX, in spite of its high framerates on modern cards, is still rather GPU limited. As a result of that limitation and superb SLI scaling, the 2Win manages to generate 165fps even at 2560. In fact it’s second only to the GTX 570 SLI, and a solid 30% ahead of the GTX 580.

NVIDIA has continued to work on their Civilization V performance since the last time we took a look at the high end, and as a result SLI scaling is looking very good. The 2Win nearly doubles the performance of a single GTX 560 Ti, and even the GTX 580 trails it by 33%. Thanks to these further driver improvements the 2Win is capable of cracking 60fps, even at 2560.

Battlefield: Bad Company 2 is another title that scales well with SLI, further vaulting the 2Win over the GTX 580. At 74.5fps at 2560 it’s not only an extremely smooth experience, but also 36% faster than a GTX 580. At the same time this is another title where the Radeons put in a strong showing, allowing the 6950 CF to pass the 2Win. Meanwhile our Waterfall benchmark shakes things up slightly, but not for the better for the 2Win: all of our results fall in a much narrower spread, and as a result the 2Win gives up much of its advantage.

STALKER is our other VRAM-hungry benchmark. The 2Win still beats a single GTX 580 by 17%, but it loses to the 6950 CF and the GTX 570 SLI by more than usual. Both of those setups have additional VRAM per GPU (2GB and 1.25GB respectively), allowing them to get the better of the 2Win.

The significance of this is that with the STALKER benchmark approaching two years old, it’s in many ways a taste of things to come. We’re not done with the subject of VRAM, but it’s already clear that there are situations where the 2Win is being held back by a lack of it.

56 Comments

  • Tchamber - Monday, November 7, 2011

    I never understood why they advertise 2GB, but the review says 1GB of usable memory. Can someone explain that to me?
  • Ryan Smith - Monday, November 7, 2011

    With current multi-GPU technology (SLI and Crossfire), each GPU has its own copy of the working set of data and assets. Furthermore they can only work from their local VRAM, and they cannot (practically) work from the other GPU's VRAM.

    As a result you must duplicate the working set and assets across each GPU, which means you get half the effective VRAM.

    At the end of the day board makers can advertise having 2GB because the card does in fact physically have 2GB, but logically it only has 1GB to work with.
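
    As a rough sketch of that math (illustrative Python only, not anything from the actual drivers):

        # Each GPU in an SLI/Crossfire pair holds its own full copy of the
        # working set, so capacity does not add up across GPUs.
        def multi_gpu_vram(vram_per_gpu_gb, gpu_count):
            physical = vram_per_gpu_gb * gpu_count  # what the box advertises
            usable = vram_per_gpu_gb                # what a game can actually work with
            return physical, usable

        print(multi_gpu_vram(1.0, 2))  # GTX 560 Ti 2Win: (2.0, 1.0)
        print(multi_gpu_vram(2.0, 2))  # Radeon 6950 CF: (4.0, 2.0)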
  • Tchamber - Friday, November 11, 2011

    Thanks for clearing that up, Ryan. So in any multi-GPU setup the VRAM does not increase, unless the card itself has more of its own. Interesting limitation, so only compute power and bandwidth increase?
  • AnnonymousCoward - Monday, November 7, 2011

    It seems like it might make sense for nvidia to base their architecture on having 4, 8, or 16 GPU dies on every board. This would improve yield across the entire low/medium/high end due to smaller die sizes, and it would give a huge boost to the high end (assuming power limits are figured out). In today's age of supercomputers having 4000 chips per rack, it does not seem optimal to put just 1 or 2 per video card PCB.
  • Sabresiberian - Tuesday, November 8, 2011

    Great job, as usual; I have to agree with the conclusions made under "Final Thoughts". The only reason I'd go this route is if I needed the connectivity in terms of monitors and only had a single PCIe x16 slot on my mainboard. That being said, a 30% performance increase in your particular favorite game is nothing to ignore.

    One of the things I've been hoping is that EVGA would send Anandtech or Tomshardware (preferably both) a Classified card so one of these sites could run thorough overclocking tests on it. I highly doubt that the Classified could make up the 30% difference on air, but how much better than stock it can reach will be good to know before I buy.

    (It would also be interesting to know when AMD is going to release their next-gen GPU and whether or not it's going to be worth waiting a month or so for, but their recent CPU release puts them in the "I'm not holding my breath" category.)

    ;)
  • Wakanabi - Monday, February 6, 2012

    I went with this card, and I'll tell you why:

    TEMPERATURE!!!

    I had two 560 Tis when I first built my PC, and with the SLI bridge and the cards close together on the board, one card would be up at 65 to 70 Celsius under full load, and the other would be at 85 to 92!

    Anytime you have multiple graphics cards, the fans on the top card are pulling hot air directly off the other card's PCB backside.

    I had sold one of my 560s a while back for full price ($250), so instead of buying another one now, I sold the other for $200 and bought the 2Win. Now I only get up to 78 Celsius total. And once I switch to a higher airflow case next week it will be even better.

    This is the best card I've ever had, better than two 6870s, a single GTX 580, or even the 6990 I was using for mining. I have mine overclocked to 900MHz and get another 10-12% increase in performance. Unbeatable as far as single cards go, especially considering the 6990 is $700 and the 590 is around that too.
