Civ V, Battlefield, STALKER, and DiRT 2

Civilization V continues to be the oddball among our benchmarks. It started out as a title with low framerates and poor multi-GPU scaling, but in recent months AMD and NVIDIA have rectified this somewhat. As a result it’s now possible to crack 60fps at 2560 with a pair of high-end GPUs, albeit with some difficulty. In our experience Civ V is a hybrid-bottlenecked game – we have every reason to believe it’s CPU-bottlenecked at certain points, but the disparity between NVIDIA’s and AMD’s performance indicates there’s a big difference in how the two are setting things up under the hood.

When we started using Bad Company 2 a year ago it was actually a rather demanding benchmark; anything above 60fps at 2560 required SLI/CF. Today that’s still true, but at 52fps the GTX 580 comes close to closing that gap on its own. On the flip side, two GPUs push scores well past 60fps, and three GPUs will push that over 120fps. Now if we could just get a 120Hz 2560 monitor…

The Bad Company 2 Waterfall benchmark is our other minimum framerate benchmark, as it provides very consistent results. NVIDIA normally does well here with one GPU, but with two GPUs the gap closes to the point where NVIDIA may be CPU limited, as our GTX 580 SLI/GTX 590 scores indicate. At three GPUs AMD falls just short of a 60fps minimum, while the triple GTX 580 setup actually loses performance, indicating uneven performance scaling for NVIDIA with three GPUs.

STALKER is another title that is both shader heavy and potentially VRAM-intensive. When moving from 1GB cards to 2GB cards we’ve seen the average framerate climb a respectable amount, which may be why AMD does so well here with multiple GPUs given the 512MB advantage in VRAM. With three GPUs the GTX 580 can crack 60fps, but the 6970 can clear 90fps.

We’ve seen DiRT 2 become CPU limited with two GPUs at 1920, so it shouldn’t come as a surprise that with three GPUs a similar thing happens at 2560. Although we can never be 100% sure that we’re CPU limited versus just seeing poor scaling, the fact that our framerates top out at only a few FPS above our top 1920 scores is a solid sign of this.
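The test applied above – comparing a setup’s 2560 framerate against its own 1920 ceiling – can be sketched in a few lines. The framerates and the 5fps margin below are hypothetical placeholders, not values from our results.

```python
# Rough sketch of the CPU-limit heuristic described above: if a setup's
# framerate at the higher resolution lands within a few FPS of its score
# at the lower resolution, the CPU is the likely bottleneck. The margin
# and the sample framerates are illustrative, not measured values.

def looks_cpu_limited(fps_low_res, fps_high_res, margin=5.0):
    """True if the high-res score sits within `margin` FPS of the low-res score."""
    return abs(fps_high_res - fps_low_res) <= margin

# hypothetical scores: two GPUs at 1920 vs. three GPUs at 2560
print(looks_cpu_limited(fps_low_res=102.0, fps_high_res=99.0))   # True
print(looks_cpu_limited(fps_low_res=102.0, fps_high_res=60.0))   # False
```

As the article notes, this can never fully distinguish a CPU bottleneck from plain poor scaling; it only flags the suspicious cases.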

                         Radeon HD 6970        GeForce GTX 580
GPUs                     1->2   2->3   1->3    1->2   2->3   1->3
Civilization V           168%    99%   167%    170%    95%   160%
Battlefield: BC2 Chase   200%   139%   278%    189%   129%   246%
Battlefield: BC2 Water   206%   131%   272%    148%    85%   125%
STALKER: CoP             189%   121%   231%    149%   104%   157%
DiRT 2                   181%   120%   219%    177%   105%   186%
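For readers curious how the percentages in the table are derived: each entry is simply the framerate of the larger configuration expressed as a percentage of the smaller one. A minimal sketch, using hypothetical framerates rather than our actual results:

```python
# Each table entry is the multi-GPU framerate as a percentage of the
# configuration it is compared against. The FPS values below are
# hypothetical placeholders, not numbers from this review.

def scaling(fps_more, fps_fewer):
    """Return the larger config's framerate as a percentage of the smaller's."""
    return round(100 * fps_more / fps_fewer)

# hypothetical framerates for one game with 1, 2, and 3 GPUs
fps = {1: 40.0, 2: 74.0, 3: 92.0}

one_to_two = scaling(fps[2], fps[1])    # 185
two_to_three = scaling(fps[3], fps[2])  # 124
one_to_three = scaling(fps[3], fps[1])  # 230
```

The 1->3 column is roughly the product of the first two, though rounding in the published figures means it won’t always match exactly.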

So what does multi-GPU scaling look like in this batch of games? The numbers favor AMD at this point, particularly thanks to STALKER. Throwing out the CPU-limited DiRT 2, the average gain for an AMD card moving from one GPU to two is 185%; NVIDIA’s gain under the same circumstances is only 169%.

For the case of two GPUs, AMD’s worst showing is Civilization V at 168%, while for NVIDIA it’s STALKER at 149%. In Civilization V the similarity of the gains (168% vs. 170%) hides the fact that the GTX 580 starts out at a much better framerate, so while the gains are similar the final performance is not. STALKER meanwhile presents an interesting case where the GTX 580 and Radeon HD 6970 start out close and end up far apart; AMD has both the scaling and the performance advantage thanks to NVIDIA’s limited gains here.

As for scaling with three GPUs, as was the case with two GPUs the results are in AMD’s favor. We still see some weak scaling at times – or none at all, as in the case of Civilization V – but AMD’s average gain of 120% over a dual-GPU configuration isn’t too bad. NVIDIA’s average of 110% represents only half that improvement, owing to an even larger performance loss in Civilization V and almost no gain in STALKER. Battlefield: Bad Company 2 is the only title where NVIDIA sees significant gains, and while the specter of CPU limits always looms overhead, I’m not sure what’s going on in STALKER for NVIDIA; perhaps we’re looking at the limits of 1.5GB of VRAM?

Looking at minimum framerates through the Battlefield: Bad Company 2 Waterfall benchmark, the situation is strongly in AMD’s favor for both two and three GPUs, as AMD scales practically perfectly with two GPUs and relatively well with three. I strongly believe this has more to do with the game than the technology, but at the end of the day NVIDIA’s poor triple-GPU scaling in this benchmark really puts a damper on things.


  • Castiel - Monday, April 4, 2011 - link

    Why didn't you just use a P67 board equipped with a NF200 chip for testing? Using X58 is a step in the wrong direction.
  • UrQuan3 - Monday, April 4, 2011 - link

    Mr Smith,
    When you do the multi-monitor SLI\Crossfire review, could you briefly go over different connection modes? The last time I messed with SLI, it forced all monitors to be connected to the first card. Since the cards in question only had two outputs, I had to turn off SLI to connect three monitors. This caused some strange problems for 3D software.

    Would you go over the options currently available in your next review?
  • Ryan Smith - Monday, April 4, 2011 - link

    When was this? That doesn't sound right; you need SLI to drive 3 monitors at the present time.
  • UrQuan3 - Thursday, April 7, 2011 - link

    Right this second I'm typing on a PC with 2 GTX 260s (not sure which revision) with two monitors plugged into the first and a third monitor plugged into the second. At the time, SLI would only allow monitors plugged into the first card. Of course, since IT doesn't trust us to do our own upgrades, I'm still running driver version 260.89.

    Of course, Windows supports multiple dissimilar cards with a monitor or two on each, even different brand cards. However, 3D support in this mode is, er, creative. In this mode most programs (games) can only drive one card's monitors. You can, however, have different programs running 3D on different cards' monitors.

    Since you'll have the hardware sitting on your desk, I'd love to see a quick test of the options.
  • BLHealthy4life - Monday, April 4, 2011 - link

    How the heck did you get 11.4 preview to work with crossfire??

    I have 6970 crossfire and I cannot for the life of me get 11.4p to work. I have used 11.2 and 11.3 with no problems. I removed previous drivers with ATI uninstaller followed by driver sweeper. Then I've installed 11.4 p 3/7 and 3/29 and neither one of them work.

    I even went as far as to do TWO fresh installs of W7 x64 Ultimate and then install 11.4p, and the f*cking driver breaks crossfire....
  • Ryan Smith - Monday, April 4, 2011 - link

    I'm afraid there's not much I can tell you. We did not have any issues with 11.4 and the 6970s whatsoever.
  • quattro_ - Monday, April 4, 2011 - link

    did you use DOF when benching METRO ? i find the HD6990's score high! i only get 37fps average : 980x @4.4 and single hd6990 stock clocks and 11.4 preview driver .
  • Ryan Smith - Monday, April 4, 2011 - link

    No, we do not. Metro is bad enough; DOF crushes performance.
  • ClagMaster - Monday, April 4, 2011 - link

    I will never understand why people will buy 2 or 3 graphics cards, requiring a 1200W power supply, so they can get 10-20 more fps or subtler eye candy.

    There are some things that are beyond the point of reason and fall into the madness of Captain Ahab. This is just about as crazy as insisting on a 0.50 cal Browning Target rifle rather than a more sensible 0.308 Win Target rifle for 550m target shooting and white tail deer hunting. The 0.308 Win is less punishing on the body and pocketbook to shoot than the 0.50 Browning.

    I always believed in working with one (1) graphics card that takes up 1 slot and requires 65 to 85W of power. A 9600GT plays all my games on a 1600x1200 CRT just fine.
  • looper - Tuesday, April 5, 2011 - link

    Excellent post... well-said.
