Civ V, Battlefield, STALKER, and DiRT 2

Civilization V continues to be the oddball among our benchmarks. Having started out as a title with low framerates and poor multi-GPU scaling, in recent months AMD and NVIDIA have rectified this somewhat. As a result it’s now possible to crack 60fps at 2560 with a pair of high-end GPUs, albeit with some difficulty. In our experience Civ V is a hybrid bottlenecked game – we have every reason to believe it’s bottlenecked by the CPU at certain points, but the disparity between NVIDIA and AMD’s performance indicates there’s a big difference in how the two are setting things up under the hood.

When we started using Bad Company 2 a year ago, it was actually a rather demanding benchmark; anything above 60fps at 2560 required SLI/CF. Today that’s still true, but at 52fps the GTX 580 comes close to closing that gap on its own. On the flip side, two GPUs push scores well past that mark, and three GPUs will push that over 120fps. Now if we could just get a 120Hz 2560 monitor…

The Bad Company 2 Waterfall benchmark is our other minimum framerate benchmark, as it provides very consistent results. NVIDIA normally does well here with one GPU, but with two GPUs the gap closes to the point where NVIDIA may be CPU limited, as indicated by our GTX 580 SLI/590 scores. At three GPUs AMD falls just short of a 60fps minimum, while the triple GTX 580 setup actually drops in performance. This would indicate uneven performance scaling for NVIDIA with three GPUs.

STALKER is another title that is both shader heavy and potentially VRAM-intensive. When moving from 1GB cards to 2GB cards we’ve seen the average framerate climb a respectable amount, which may be why AMD does so well here with multiple GPUs given the 512MB advantage in VRAM. With three GPUs the GTX 580 can crack 60fps, but the 6970 can clear 90fps.

We’ve seen DiRT 2 become CPU limited with two GPUs at 1920, so it shouldn’t come as a surprise that with three GPUs a similar thing happens at 2560. Although we can never be 100% sure that we’re CPU limited versus just seeing poor scaling, the fact that our framerates top out at only a few FPS above our top 1920 scores is a solid sign of this.

                          Radeon HD 6970        GeForce GTX 580
GPUs                     1->2   2->3   1->3    1->2   2->3   1->3
Civilization V           168%    99%   167%    170%    95%   160%
Battlefield: BC2 Chase   200%   139%   278%    189%   129%   246%
Battlefield: BC2 Water   206%   131%   272%    148%    85%   125%
STALKER: CoP             189%   121%   231%    149%   104%   157%
DiRT 2                   181%   120%   219%    177%   105%   186%
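
As a sanity check on the table, the 1->3 column should be roughly the product of the two per-step factors, with small mismatches coming from the per-step figures being rounded. A quick sketch (values transcribed from the table above):

```python
# Chaining the 1->2 and 2->3 scaling factors should roughly reproduce
# the 1->3 column; the per-step figures are rounded, so we allow a few
# percentage points of slack.
table = {
    # game: ((amd_1to2, amd_2to3, amd_1to3), (nv_1to2, nv_2to3, nv_1to3))
    "Civilization V": ((168,  99, 167), (170,  95, 160)),
    "BC2 Chase":      ((200, 139, 278), (189, 129, 246)),
    "BC2 Waterfall":  ((206, 131, 272), (148,  85, 125)),
    "STALKER: CoP":   ((189, 121, 231), (149, 104, 157)),
    "DiRT 2":         ((181, 120, 219), (177, 105, 186)),
}

for game, vendors in table.items():
    for s12, s23, s13 in vendors:
        chained = s12 * s23 / 100  # e.g. 2.00x then 1.39x -> ~2.78x
        assert abs(chained - s13) <= 3, (game, chained, s13)
```

Every entry checks out to within three points, so the table is internally consistent.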

So what does multi-GPU scaling look like in this batch of games? The numbers favor AMD at this point, particularly thanks to STALKER. Throwing out the CPU-limited DiRT 2, the average scaling for an AMD card moving from one GPU to two GPUs is 185%; NVIDIA’s gains under the same circumstances are only 169%.

For the case of two GPUs, AMD’s worst showing is Civilization V at 168%, while for NVIDIA it’s STALKER at 149%. In the case of Civilization V the similarity to NVIDIA’s gains (168% vs. 170%) hides the fact that the GTX 580 already starts out at a much better framerate, so while the gains are similar the final performance is not. STALKER meanwhile presents us with an interesting case where the GTX 580 and Radeon HD 6970 start out close and end up far apart; AMD has the scaling and performance advantage thanks to NVIDIA’s limited performance gains here.

As for scaling with three GPUs, as was the case with two GPUs the results are in AMD’s favor. We still see some weak scaling at times – or none, as in the case of Civilization V – but AMD’s average gain of 120% over a dual-GPU configuration isn’t too bad. NVIDIA’s average gain of 110% is effectively only half of AMD’s, owing to an even larger performance loss in Civilization V and almost no gain in STALKER. Battlefield: Bad Company 2 is the only title that NVIDIA sees significant gains in, and while the specter of CPU limits always looms overhead, I’m not sure what’s going on in STALKER for NVIDIA; perhaps we’re looking at the limits of 1.5GB of VRAM?
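
The quoted averages can be reproduced from the table. Note that this is an inference from the numbers, not the article’s stated methodology: the figures only work out if, alongside the CPU-limited DiRT 2, the minimum-framerate Waterfall test is also excluded, leaving Civ V, BC2 Chase, and STALKER:

```python
# Average scaling across Civ V, BC2 Chase, and STALKER only.
# Assumption: DiRT 2 is dropped as CPU limited and BC2 Waterfall as a
# minimum-framerate test; this reproduces the quoted figures to within
# a point of rounding.
amd_1to2 = [168, 200, 189]
nv_1to2  = [170, 189, 149]
amd_2to3 = [ 99, 139, 121]
nv_2to3  = [ 95, 129, 104]

def avg(xs):
    return sum(xs) / len(xs)

print(avg(amd_1to2))  # ~185.7, quoted as 185%
print(avg(nv_1to2))   # ~169.3, quoted as 169%
print(avg(amd_2to3))  # ~119.7, quoted as 120%
print(avg(nv_2to3))   # ~109.3, quoted as 110%
```
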

Looking at minimum framerates through Battlefield: Bad Company 2, the situation is strongly in AMD’s favor for both two and three GPUs, as AMD scales practically perfectly with two GPUs and relatively well with three GPUs. I strongly believe this has more to do with the game than the technology, but at the end of the day NVIDIA’s poor triple-GPU scaling under this benchmark really puts a damper on things.

Comments (97)

  • Ryan Smith - Sunday, April 3, 2011 - link

    It took awhile, but we finally have 3 120Hz 1080P monitors on the way. So we'll be able to test Eyefinity, 3D Vision, and 3D Vision Surround; all of which have been neglected around here.
  • Kaboose - Sunday, April 3, 2011 - link

    I await these tests with breathless anticipation!
  • veri745 - Sunday, April 3, 2011 - link

    While this article was very well written, I think it is hardly worth it without the multi-monitor data. No one (sane) is going to get 3x SLI/CF with a single monitor, so it's mostly irrelevant.

    The theoretical scaling comparison is interesting, but I'm a lot more interested in the scaling at 3240x1920 or 5760x1080.
  • DanNeely - Sunday, April 3, 2011 - link

    This is definitely a step in the right direction; but with other sites having 3x 1920x1200 or even 3x 2560x1600 test setups you'll still be playing catchup.
  • RK7 - Sunday, April 3, 2011 - link

    Finally! I created an account just to write this comment :) That's what's missing and what definitely needs to be tested! Especially 3D Vision Surround - it would be good to know whether it's worth putting so much money into such a setup, because a single card may already be on the edge of performance for modern games in stereoscopic mode with a single monitor (a good example is Metro 2033, which is mind-blowing in 3D, but I found that with a single GTX 570@900MHz it's only playable at 1600x900 in 3D with maximum settings without DoF and AA, and even then it can drop to ~12 fps in some action scenes with heavy lighting...). So if three cards can achieve good scaling and provide per-monitor performance on a three-monitor setup close to a single card on one monitor, then we're there and it's definitely worth it; but if the numbers are anything like those for single-monitor scaling, then folks should be aware that there's no way to get maximum visual quality on three monitors with current hardware...
  • Dustin Sklavos - Monday, April 4, 2011 - link

    Not completely neglected. I've added triple-monitor surround testing to my boutique desktop reviews whenever able. :)
  • Crazymech - Sunday, April 3, 2011 - link

    I'm having my doubts about the capabilities of the 920 OC'd to 3.33GHz matched up with three of the most powerful single GPUs.

    I understand straying away from SB because of the lanes, but you could at least have upped the OC to 3.8-4.0GHz, which many people run (and I would think most who consider a triple setup would use).

    To underline it I point to the small differences between the 4.5GHz 2600K and the lower-overclocked one in the boutique build reviews, with the highest-clocked CPU coupled with weaker GPUs nipping at the heels of the more powerful GPUs.

    I suggest you at least experiment in a single test (say Metro, for example... or Battlefield) with what a higher-clocked X58 chip (or the 980's six cores) could do for the setup.
    If I'm wrong, it would at least be good to know that.
  • BrightCandle - Sunday, April 3, 2011 - link

    The fact that Sandy Bridge has a PCIe lane problem is grounds for testing the impact.

    Still, I would rather see the numbers on X58 and triple-screen gaming before seeing the impact SB has on the performance of SLI/CF setups.
  • Ryan Smith - Sunday, April 3, 2011 - link

    For what it's worth, 3.33GHz is actually where this specific 920 tops out. It won't take 3.5GHz or higher, unfortunately.

    We'll ultimately upgrade our testbed to SNB - this article is the impetus for that - but that's not going to happen right away.
  • Crazymech - Monday, April 4, 2011 - link

    It won't take 3.5? Really? That amazes me.
    Though it's very unfortunate for the purpose of this test.

    The main focus is (of course) always on how the new GPU improves the framerate relative to a standard CPU, but once in a while it would be interesting to see what different CPUs do to the GPU and FPS as well. Like the old Doom III articles showing the Athlon dominating the Pentium IV.

    Thanks for the answer anyhow :).
