Civ V, Battlefield, STALKER, and DiRT 2

Civilization V continues to be the oddball among our benchmarks. It started out as a title with low framerates and poor multi-GPU scaling, but in recent months AMD and NVIDIA have rectified this somewhat. As a result it's now possible to crack 60fps at 2560 with a pair of high-end GPUs, albeit with some difficulty. In our experience Civ V is a hybrid bottlenecked game: we have every reason to believe it's bottlenecked by the CPU at certain points, but the disparity between NVIDIA and AMD's performance indicates there's a big difference in how the two are setting things up under the hood.

When we started using Bad Company 2 a year ago, it was actually a rather demanding benchmark; anything above 60fps at 2560 required SLI/CF. Today that's still true, but at 52fps the GTX 580 comes close to closing that gap. On the flip side, two GPUs push framerates well past 60fps, and three GPUs will push that over 120fps. Now if we could just get a 120Hz 2560 monitor…

The Bad Company 2 Waterfall benchmark is our other minimum framerate benchmark, as it provides very consistent results. NVIDIA normally does well here with one GPU, but with two GPUs the gap closes to the point where NVIDIA may be CPU limited as indicated by our 580SLI/590 scores. At three GPUs AMD falls just short of a 60fps minimum, while the triple GTX 580 setup drops in performance. This would indicate uneven performance scaling for NVIDIA with three GPUs.

STALKER is another title that is both shader heavy and potentially VRAM-intensive. When moving from 1GB cards to 2GB cards we’ve seen the average framerate climb a respectable amount, which may be why AMD does so well here with multiple GPUs given the 512MB advantage in VRAM. With three GPUs the GTX 580 can crack 60fps, but the 6970 can clear 90fps.

We’ve seen DiRT 2 become CPU limited with two GPUs at 1920, so it shouldn’t come as a surprise that with three GPUs a similar thing happens at 2560. Although we can never be 100% sure that we’re CPU limited versus just seeing poor scaling, the fact that our framerates top out at only a few FPS above our top 1920 scores is a solid sign of this.

                          Radeon HD 6970           GeForce GTX 580
GPUs                     1->2   2->3   1->3       1->2   2->3   1->3
Civilization V           168%    99%   167%       170%    95%   160%
Battlefield: BC2 Chase   200%   139%   278%       189%   129%   246%
Battlefield: BC2 Water   206%   131%   272%       148%    85%   125%
STALKER: CoP             189%   121%   231%       149%   104%   157%
DiRT 2                   181%   120%   219%       177%   105%   186%
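
For those curious how these figures are derived, each column simply expresses the framerate with the larger GPU count as a percentage of the framerate with the smaller count, which is why the 1->3 column works out to roughly the product of the other two within rounding (AMD's 168% and 99% in Civilization V multiply out to about 167%). The snippet below is a minimal sketch of that arithmetic, not our benchmark tooling; the framerate values in it are hypothetical placeholders rather than measured numbers.

```python
# A minimal sketch (not the article's benchmark code) of how the scaling
# figures above are computed. The framerate values here are hypothetical
# placeholders, not measured results.

def scaling(fps_before, fps_after):
    """Framerate after adding GPUs, as a percentage of the framerate before."""
    return round(100.0 * fps_after / fps_before)

# Hypothetical framerates for one game at 1, 2, and 3 GPUs.
fps = {1: 40.0, 2: 67.0, 3: 80.0}

s_1_2 = scaling(fps[1], fps[2])   # 1 -> 2 GPUs
s_2_3 = scaling(fps[2], fps[3])   # 2 -> 3 GPUs
s_1_3 = scaling(fps[1], fps[3])   # 1 -> 3 GPUs

print(f"1->2: {s_1_2}%   2->3: {s_2_3}%   1->3: {s_1_3}%")

# The 1->3 column is (to within rounding) the product of the other two,
# e.g. the 6970 in Civ V: 168% x 99% is roughly 167%.
assert abs(s_1_3 - s_1_2 * s_2_3 / 100.0) < 2.0
```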

So what does multi-GPU scaling look like in this batch of games? The numbers favor AMD at this point, particularly thanks to STALKER. Throwing out the CPU-limited DiRT 2, the average gain for an AMD card moving from one GPU to two GPUs is 185%; NVIDIA's gain under the same circumstances is only 169%.

For the case of two GPUs, AMD's worst showing is Civilization V at 168%, while for NVIDIA it's STALKER at 149%. In Civilization V's case, the similar gains (168% vs. 170%) hide the fact that the GTX 580 already starts out at a much better framerate, so while the gains are alike the final performance is not. STALKER meanwhile presents us with an interesting case where the GTX 580 and Radeon HD 6970 start out close and end up far apart; AMD has both the scaling and the performance advantage thanks to NVIDIA's limited gains here.

As for scaling with three GPUs, the results are once again in AMD's favor. We still see some weak scaling at times, or effectively none in the case of Civilization V, but AMD's average gain of 120% over a dual-GPU configuration isn't too bad. NVIDIA's additional gain is basically only half of AMD's at 110%, owing to an even larger performance loss in Civilization V and almost no gain in STALKER. Battlefield: Bad Company 2 is the only title where NVIDIA sees significant gains, and while the specter of CPU limits always looms overhead, I'm not sure what's going on in STALKER for NVIDIA; perhaps we're looking at the limits of 1.5GB of VRAM?

Looking at minimum framerates through the Battlefield: Bad Company 2 Waterfall benchmark, the situation is strongly in AMD's favor for both two and three GPUs, as AMD scales practically perfectly with two GPUs and relatively well with three. I strongly believe this has more to do with the game than the technology, but at the end of the day NVIDIA's poor triple-GPU scaling under this benchmark really puts a damper on things.

Comments

  • Sabresiberian - Tuesday, April 5, 2011 - link

    I've been thinking for quite a while that we need something different, and this is the primary reason why - I can't get all I want to install on any ATX mainboard I know of.

    ;)
  • Sabresiberian - Tuesday, April 5, 2011 - link

    I've always thought minimum frame rate is where the focus should be in graphics card tests (when looking at the frame rate performance aspect), instead of the average. It's the minimum frame rate that bothers people or even makes a game unplayable.

    Thanks!

    ;)
  • mapesdhs - Wednesday, April 6, 2011 - link


    I hate to say it but with the CPU at only 3.33GHz, the results don't really mean that much. I know
    the 920 used can't go higher, but it just seems a bit pointless to do all these tests when the
    results can't really be used as the basis for making a purchasing decision because of a very
    probable CPU bottleneck. Surely it would have been sensible for an article like this to replace
    the 920 with a 950 and redo the OC to 4GHz+. The 950 is good value now as well. Or even the
    entry 6-core.

    Re slot spacing, perhaps if one insists on using P67 it can be hard to sort that out, but there
    *are* X58 boards which provide what one needs, eg. the Asrock X58 Extreme6 does have
    double-slot spacing between each PCIe slot, so 3 dual-slot cards would have a fully empty
    slot between each card for better cooling. Do other vendors make a board like this? I couldn't
    find one after a quick check on the Gigabyte or ASUS sites. Only down side is with all 3 slots
    used the Extreme6 operates slots 2 and 3 at 8x/8x; for many games this isn't an issue (depends
    on the game), but I'm sure some would moan nonetheless.

    Would be interesting to know how that would compare though, ie. a 4GHz 950 on an Extreme6
    for these tests.

    Unless I missed it somehow, I'm a tad surprised Gigabyte don't make an X58 board with this type
    of slot spacing, or do they?

    Ian.
  • xAlex79 - Thursday, April 14, 2011 - link

    I am a bit disappointed, Ryan, in the way you put your conclusions.

    At the start of the article you highlight how you are going to look at Tri-Fire and Tri-SLI and compare how they do for the value.

    Yet at the end, in your conclusion, there isn't a single mention or even adjusted scores considering value at all. And that makes Nvidia look a lot better than they should. It is as if you completely forgot that three 580s cost you $1500 and that three 6970s cost you $900.

    Based on that and the fact YOU stated you would take value into account (and personally I think posting any kind of review without value nowadays is just irresponsible and biased), I am very disappointed with an otherwise very good set of tests.

    I also understand that this is labeled "Part 1" and that the value might come into "Part 2", but you should have CLEARLY outlined that in your conclusion were that the case. And given the quality of reviews that we have come to expect from AnandTech, the final numbers should ALWAYS include a value perspective.

    I will just point out that it is poor form and not very professional, and that in the end the people you should care about are us, your readers, not how you look or try to look to hardware manufacturers. If this was a mistake, you should correct it asap. It does not make you look good.
  • L1qu1d - Friday, April 15, 2011 - link

    I wonder why they didn't opt for the 270.51 drivers and went with 3-month-old drivers?

    Compared to the tested drivers:

    GeForce GTX 580:

    Up to 516% in Dragon Age 2 (SLI 2560x1600 8xAA/16xAF Very High, SSAO on)
    Up to 326% in Dragon Age 2 (1920x1200 8xAA/16xAF Very High, SSAO on)
    Up to 11% in Just Cause 2 (1920x1200 8xAA/16xAF, Concrete Jungle)
    Up to 11% in Just Cause 2 (SLI 2560x1600 8xAA/16xAF, Concrete Jungle)
    Up to 7% in Civilization V (1920x1200 4xAA/16xAF, Max settings)
    Up to 6% in Far Cry 2 (SLI 2560x1600 8xAA/16xAF, Max settings)
    Up to 5% in Civilization V (SLI 1920x1200 8xAA/16xAF, Max settings)
    Up to 5% in Left 4 Dead 2 (1920x1200 noAA/AF, Outdoor)
    Up to 5% in Left 4 Dead 2 (SLI 2560x1600 4xAA/16xAF, Outdoor)
    Up to 4% in H.A.W.X. 2 (SLI 1920x1200 8xAA/16xAF, Max settings)
    Up to 4% in Mafia 2 (SLI 2560x1600 AA on/16xAF, PhysX = High)
  • Fony - Thursday, April 28, 2011 - link

    taking forever for the Eyefinity/Surround testing.
  • vipergod2000 - Thursday, May 5, 2011 - link

    The one thing that irks me is that the i7-920 was OCed to only ~3.3GHz, which greatly reduces the scaling of 3 cards compared to other forum users who run 3 or 4 cards in CFX or SLI with fantastic scaling, but those setups pair the cards with an i7-2600K at 5GHz minimum or a 980X/990X at 4.6GHz+.
