Company of Heroes 2

The second game in our benchmark suite is Relic Entertainment’s Company of Heroes 2, the developer’s World War II Eastern Front themed RTS. For Company of Heroes 2 Relic was kind enough to put together a very strenuous built-in benchmark, captured from one of the most demanding, snow-bound maps in the game, giving us a great look at CoH2’s performance at its worst. Consequently, if a card can do well here, it should have no trouble throughout the rest of the game.

Our first strategy game is also our first game that is flat-out AFR incompatible, and as a result the only way to get the best performance out of Company of Heroes 2 is with the fastest single-GPU card available. To that end this is a very clear victory for the 290X, and in fact it will be the largest lead for the 290X in any of our benchmarks. At 2560 it’s a full 29% faster than the GTX 780, which all but puts the 290X in a class of its own. This game also shows some of the greatest gains for the 290X over the 280X, with the 290X surpassing its Tahiti-based predecessor by an equally chart-topping 41%. It’s not clear at this time what it is about the 290X that Company of Heroes 2 loves in particular, but as far as this game is concerned AMD has put together an architecture that maps well to the game’s needs.

Briefly, because of the lack of AFR compatibility, 4K is only barely attainable with any kind of GPU setup. In fact we’re only throwing in the scale-less SLI/CF numbers to showcase that fact. We had to dial our quality settings down to Low in CoH2 in order to get a framerate above 30fps; even though we can be more liberal about playable framerates in strategy games, there still needs to be a cutoff for average framerates around that point. As a result the 280X, GTX Titan, and 290X are the only cards to make that cutoff, with the 290X being the clear winner. But the loss in quality needed to make 4K attainable is hardly worth it.

 

Moving on to minimum framerates, we see that at its most stressful points nothing, not even the 290X, can keep its minimums above 30fps. For a strategy game this is bearable, but we certainly wouldn’t mind more performance. AMD will be pleased though, as their performance advantage over the GTX 780 is only further extended here; a 29% average performance advantage becomes a 43% minimum performance advantage at 2560.

Finally, while we don’t see any performance advantages from AFR in this game, we ran our FCAT benchmarks anyhow to quickly capture the delta percentages. Company of Heroes 2 has higher than average frame time variance even among single cards, which results in deltas above 5%. The difference between 5% and 7% is not going to be too significant in practice, but along with AMD’s performance advantage they also have slightly more consistent frame times than the GTX 780. In the case of both the 280X and the 290X, though, we’re looking at essentially the same deltas, so while the 290X improves on framerates versus the 280X, it doesn’t bring any improvement in frame time consistency along with it.
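
For readers curious how a frame time delta percentage can be derived from FCAT-style data, here is a minimal sketch in Python. It assumes per-frame render times (in milliseconds) have already been extracted, and it uses one common formulation of the metric: the mean absolute frame-to-frame delta as a share of the mean frame time. The function name and the sample numbers are purely illustrative and are not the exact pipeline used to produce the charts in this review.

```python
# Illustrative sketch only: one way to compute a frame time "delta percentage"
# from a list of per-frame render times. The exact FCAT post-processing used
# for the charts in this article may differ.

def delta_percentage(frame_times_ms):
    """frame_times_ms: list of per-frame render times in milliseconds."""
    if len(frame_times_ms) < 2:
        return 0.0
    # Absolute change in frame time from one frame to the next.
    deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    mean_delta = sum(deltas) / len(deltas)
    mean_frame_time = sum(frame_times_ms) / len(frame_times_ms)
    return 100.0 * mean_delta / mean_frame_time

# Made-up example: a card alternating between 15ms and 17ms frames shows a
# 12.5% delta, while a perfectly steady 16ms cadence shows 0%.
print(delta_percentage([15, 17, 15, 17]))      # -> 12.5
print(delta_percentage([16, 16, 16, 16, 16]))  # -> 0.0
```

Under this definition, a lower percentage means more consistent frame pacing, which is why single-card results typically sit well under the deltas seen from AFR setups.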

396 Comments

  • Spunjji - Friday, October 25, 2013 - link

    Word.
  • extide - Thursday, October 24, 2013 - link

    That doesn't mean that AMD can't come up with a solution that might even be compatible with G-Sync... Time will tell..
  • piroroadkill - Friday, October 25, 2013 - link

    That would not be in NVIDIA's best interests. If a lot of machines (AMD, Intel) won't support it, why would you buy a screen for a specific graphics card? Later down the line, maybe something like the R9 290X comes out, and you can save a TON of money on a high performing graphics card from another team.

    It doesn't make sense.

    For NVIDIA, their best bet at getting this out there and making the most money from it, is licensing it.
  • Mstngs351 - Sunday, November 3, 2013 - link

    Well it depends on the buyer. I've bounced between AMD and Nvidia (to be upfront I've had more Nvidia cards) and I've been wanting to step up to a larger 1440 monitor. I will be sure that it supports Gsync as it looks to be one of the more exciting recent developments.

    So although you are correct that not a lot of folks will buy an extra monitor just for Gsync, there are a lot of us who have been waiting for an excuse. :P
  • nutingut - Saturday, October 26, 2013 - link

    Haha, that would be something for the cartel office then, I figure.
  • elajt_1 - Sunday, October 27, 2013 - link

    This doesn't prevent AMD from making something similar, if Nvidia decides to not make it open.
  • hoboville - Thursday, October 24, 2013 - link

    Gsync will require you to buy a new monitor. Dropping more money on graphics and smoothness will apply at the high end and for those with big wallets, but for the rest of us there's little point to jumping into Gsync.

    In 3-4 years when IPS 2560x1440 has matured to the point where it's both mainstream (cheap) and capable of delivering low-latency ghosting-free images, then Gsync will be a big deal, but right now only a small percentage of the population have invested in 1440p.

    The fact is, most people have been sitting on their 1080p screens for 3+ years and probably will for another 3 unless those same screens fail--$500+ for a desktop monitor is a lot to justify. Once the monitor upgrades start en masse, then Gsync will be a market changer because AMD will not have anything to compete with.
  • misfit410 - Thursday, October 24, 2013 - link

    G-String didn't kill anything, I'm not about to give up my Dell Ultrasharp for another Proprietary Nvidia tool.
  • anubis44 - Tuesday, October 29, 2013 - link

    Agreed. G-sync is a stupid solution to a non-existent problem. If you have a fast enough frame rate, there's nothing to fix.
  • MADDER1 - Thursday, October 24, 2013 - link

    Mantle could be for either higher frame rate or more detail. Gsync sounds like just frame rate.
