Mass Effect 2, Wolfenstein, and Civ V Compute

Mass Effect 2 is a game we figured would be CPU limited with three GPUs, so it's quite surprising that it's not. There does look to be a cap at around 200fps, but we can't hit it at 2560 even with three GPUs. With two or more GPUs, however, you can be quite confident that your framerates will be nothing short of amazing.

For that reason, and because ME2 is a DX9-only game, we also gave it a shot with SSAA on both the AMD and NVIDIA setups at 1920. Surprisingly it's almost fluid in this test even with one GPU. Move to two GPUs and we're looking at 86fps, and again this is with 4x supersampling going on. I don't think we're too far off from being able to supersample a number of games (at least the console ports) with this kind of performance.

Wolfenstein is quite CPU limited even with two GPUs, so we didn't expect much with three GPUs. In fact the surprising bit wasn't the performance, it was the fact that AMD's drivers completely blew a gasket with this game. It runs fine with two GPUs, but with three GPUs it will crash almost immediately after launching. Short of a BSOD, this is the worst possible failure mode for an AMD setup, as AMD does not provide per-game CrossFire settings, unlike NVIDIA, which allows SLI to be enabled or disabled on a game-by-game basis. As a result the only way to play Wolfenstein on a triple-GPU setup is to change CrossFire modes globally, a hardware reconfiguration that takes several seconds and involves a couple of blank screens.

We only have one OpenGL game in our suite, so we can't isolate whether this is an AMD OpenGL issue or an issue solely with Wolfenstein. It's disappointing to see AMD have this problem though.

We don't normally look at multi-GPU numbers with our Civilization V compute test, but in this case we had the data, so we wanted to throw it out there as an example of where SLI/CF and the concept of alternate frame rendering just don't contribute much to a game. Texture decompression needs to happen on each card, so it can't be divided up the way rendering can. As a result additional GPUs reduce NVIDIA's score, while two GPUs do end up helping AMD somewhat, only for a third GPU to bring scores crashing down. None of these scores are worth worrying about; the cards are still more than fast enough for the leader scenes the textures are for, but it's a nice theoretical example.

                     Radeon HD 6970       GeForce GTX 580
GPUs                 1->2   2->3   1->3   1->2   2->3   1->3
Mass Effect 2        180%   142%   256%   195%   139%   272%
Mass Effect 2 SSAA   187%   148%   280%   198%   138%   284%
Wolfenstein          133%   0%     0%     151%   96%    145%
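One way to sanity-check the table: the 1->2 and 2->3 factors chain multiplicatively, so the 1->3 column should be roughly the product of the two step-wise columns (only roughly, since the published percentages are rounded from the underlying framerates). A quick sketch of the arithmetic:

```python
def chained_scaling(step1_pct: float, step2_pct: float) -> float:
    """Combine two step-wise scaling factors (given as percentages)
    into an overall scaling percentage: the factors multiply, not add."""
    return step1_pct * step2_pct / 100.0

# GTX 580, Mass Effect 2: 195% (1->2) then 139% (2->3)
overall = chained_scaling(195, 139)
print(f"{overall:.0f}%")  # ~271%, in line with the table's 272%
```

The small discrepancies between this product and the published 1->3 numbers come from rounding each step to a whole percentage.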

Since Wolfenstein is so CPU limited, the scaling story out of these games is really about Mass Effect 2. Again dual-GPU scaling is really good, both with MSAA and SSAA; NVIDIA in particular achieves almost perfect scaling. What makes this all the more interesting is that with three GPUs the roles are reversed: scaling is still strong, but now it's AMD achieving almost perfect scaling on Mass Effect 2 with SSAA, which is quite a feat given the uneven scaling of triple-GPU configurations overall. It's just a shame that AMD doesn't have an SSAA mode for DX10/DX11 games; if it were anything like their DX9 SSAA mode, it could certainly sell the idea of a triple-GPU setup to users looking to completely eliminate all forms of aliasing at any price.

As for Wolfenstein, with two GPUs NVIDIA has the edge, but they also had the lower framerate in the first place. With the game undoubtedly CPU limited even at two GPUs, there's not much to draw from here.

97 Comments

  • A5 - Sunday, April 3, 2011 - link

    I'm guessing Ryan doesn't want to spend a month redoing all of their benchmarks over all the recent cards. Also the only one of your more recent games that would be at all relevant is Shogun 2 - SC2 runs well on everything, no one plays Arma 2, and the rest are console ports...
  • slickr - Sunday, April 3, 2011 - link

    Apart from Shift, no game is a console port.

    Mafia 2, SC2, Arma 2, and Shogun 2 are PC-only; Dead Space is also not a console port, it's a PC port to consoles.
  • Ryan Smith - Sunday, April 3, 2011 - link

    We'll be updating the benchmark suite in the next couple of months, in keeping with our twice-a-year schedule. Don't expect us to drop Civ V or Crysis, however.
  • Dustin Sklavos - Monday, April 4, 2011 - link

    Jarred and I have gone back and forth on this stuff to get our own suite where it needs to be, and the games Ryan's running have sound logic behind them. For what it's worth...

    Aliens vs. Predator isn't worth including because it doesn't really leverage that much of DX11 and nobody plays it because it's a terrible game. Crysis Warhead STILL stresses modern gaming systems. As long as it does that it'll be useful, and at least provides a watermark for the underwhelming Crysis 2.

    Battleforge and Shogun 2 I'm admittedly not sure about, same with HAWX and Shift 2.

    Civ 5 should stay, but StarCraft II should definitely be added. There's a major problem with SC2, though: it's horribly, HORRIBLY CPU bound. SC2 is criminally badly coded given how long it's been in the oven and doesn't scale AT ALL with more than two cores. I've found situations even with Sandy Bridge hardware where SC2 is more liable to demonstrate how much the graphics drivers and subsystem hit the CPU rather than how the graphics hardware itself performs. Honestly my only justification for including it in our notebook/desktop suites is because it's so popular.

    Swapping Mass Effect 2 for Dead Space 2 doesn't make any sense; Dead Space 2 is a godawful console port while Mass Effect 2 is currently one of the best, if not THE best, optimized Unreal Engine 3 games on the PC. ME2 should get to stay almost entirely by virtue of being an Unreal Engine 3 representative, ignoring its immense popularity.

    Wolfenstein is currently the most demanding OpenGL game on the market. It may seem an oddball choice, but it really serves the purpose of demonstrating OpenGL performance. Arma 2 doesn't fill this niche.

    Mafia II's easy enough to test that it couldn't hurt to add it.
  • JarredWalton - Monday, April 4, 2011 - link

    Just to add my two cents....

    AvP is a lousy game, regardless of benchmarks. I also toss HAWX and HAWX 2 into this category, but Ryan has found a use for HAWX in that it puts a nice, heavy load on the GPUs.

    Metro 2033 and Mafia II aren't all that great either, TBH, and so far Crysis 2 is less demanding *and* less fun than either of the two prequels. (Note: I finished both Metro and Mafia, and I'd say both rate something around 70%. Crysis 2 is looking about 65% right now, but maybe it'll pick up as the game progresses.)
  • c_horner - Sunday, April 3, 2011 - link

    I'm waiting for the day when someone actually reports on the perceived usability of multi-GPU setups in comparison to a single high-end GPU.

    What I mean is this: oftentimes, even though you might be receiving an arbitrarily larger frame count, the lag and overall smoothness make the game nowhere near as playable and enjoyable as one that runs properly on a single GPU.

    Having tried SLI in the past I was left with a rather large distaste for plopping down the cost of another high end card. Not all games worked properly, not all games scaled well, some games would scale well in the areas it could render easily but minimum frame rates sucked etc. etc. and the list goes on.

    When are some of these review sites going to post subjective and real world usage information instead of a bunch of FPS comparisons?

    There's more to the story here.
  • semo - Sunday, April 3, 2011 - link

    I think this review covers some of your concerns. It seems that AMD with their latest drivers achieve a better min FPS score compared to NVIDIA.

    I've never used SLI myself, but I would think that you wouldn't be able to notice the latency from having more than one GPU in game. Wouldn't such latencies be in the microseconds?
  • SlyNine - Monday, April 4, 2011 - link

    And yet, those microseconds seemed like macroseconds; microstutter was one of the most annoying things ever! I hated my 8800GT SLI experience.

    Haven't been back to multi videocard setups since.
  • DanNeely - Monday, April 4, 2011 - link

    Look at HardOCP.com's reviews. Instead of FPS numbers from canned benchmarks, they play the games and list the highest settings that were acceptable. Minimum FPS levels and, for SLI/CrossFire, microstuttering problems can push their recommendations down, because even when the average numbers look great the experience might not actually be playable.
  • robertsu - Sunday, April 3, 2011 - link

    How is microstuttering with 3 GPUs? Is there any with these new versions?
