Company of Heroes 2

The second benchmark in our suite is Relic Entertainment's Company of Heroes 2, the developer's World War II Eastern Front themed RTS. For Company of Heroes 2 Relic was kind enough to put together a very strenuous built-in benchmark, captured from one of the most demanding, snow-bound maps in the game, giving us a great look at CoH2's performance at its worst. Consequently, if a card can do well here, it should have no trouble throughout the rest of the game.

Since Company of Heroes 2 is not an AFR-friendly game, multi-GPU alternate frame rendering offers no benefit here, and getting the best performance out of it requires having the fastest single GPU. While the GTX 780 Ti has a clear lead over the 290X across the average of our games, in this specific case it comes up short, as AMD's performance in this game is simply too strong to be overcome without a larger advantage. Conversely, this means the GTX 780 Ti and 290X are still close enough that NVIDIA won't be able to sweep every game; in games where AMD does exceptionally well, its cards can still close the gap and surpass the GTX 780 Ti.

Meanwhile, looking at a straight-up NVIDIA comparison, the GTX 780 Ti holds a slightly smaller-than-normal lead over its counterparts. At 5% faster than the GTX Titan and 17% faster than the GTX 780, it's still the fastest of the cards, but it won't pull ahead in this game by as much as it does elsewhere.
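
As an aside, the short Python sketch below shows the arithmetic behind an "X% faster" comparison like the ones above: divide the faster card's average framerate by the slower card's and subtract one. The framerate numbers in it are hypothetical placeholders chosen only to reproduce the 5% and 17% leads quoted here; they are not the measured results.

```python
# A minimal sketch (not the review's own tooling) of how an "X% faster" figure
# is derived from average framerates: the ratio of the two averages, minus one.
# The values below are hypothetical placeholders, not measured results.

avg_fps = {
    "GTX 780 Ti": 63.0,  # hypothetical average FPS
    "GTX Titan": 60.0,   # hypothetical average FPS
    "GTX 780": 53.8,     # hypothetical average FPS
}

baseline = "GTX 780 Ti"
for card, fps in avg_fps.items():
    if card == baseline:
        continue
    # Relative lead of the baseline card over this card, in percent
    lead_pct = (avg_fps[baseline] / fps - 1.0) * 100.0
    print(f"{baseline} is {lead_pct:.0f}% faster than {card}")
```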

The minimum framerate story is largely the same. The GTX 780 Ti is the fastest NVIDIA card, but it trails the 290X by over 10% in both scenarios.

Comments

  • yuko - Monday, November 11, 2013 - link

    For me, neither of them is a game changer... G-Sync, Shield... nice stuff I don't need.
    Mantle: another nice approach to creating a semi-closed standard. It's not as if DirectX and OpenGL already exist and work quite well; no, we need another low-level standard where AMD creates the API (and to be honest, they would be quite stupid not to optimize it for their own hardware).

    I have no faith in Mantle and hope that it will flop; it does no favors to customers or the industry. It's just good for marketing but has no real-world use.
  • Kamus - Thursday, November 7, 2013 - link

    Nope, it's confirmed for every Frostbite 3 game coming out, and that's at least a dozen so far. Not to mention it's also officially coming to Star Citizen, which runs on CryEngine 3, I believe.
    But yes, even with those titles it's still a huge difference, obviously.

    That said, you can expect that any engine optimized for GCN on consoles could wind up with Mantle support, since the hard work is already done. And in the case of Star Citizen... well, that's a PC exclusive, and it's still getting Mantle.
  • StevoLincolnite - Thursday, November 7, 2013 - link

    Mantle is confirmed for all Frostbite-powered games.
    That is, Battlefield 4, Dragon Age 3, Mirror's Edge 2, Need for Speed, Mass Effect, Star Wars Battlefront, Plants vs. Zombies: Garden Warfare, and probably others that haven't been announced yet by EA.
    Star Citizen and Thief will also support Mantle.

    So that's EA, Cloud Imperium Games, and Square Enix that will support the API, and it hasn't even been released yet.
  • ahlan - Thursday, November 7, 2013 - link

    And for G-Sync you will need a new monitor with G-Sync support. I won't buy a new monitor just for that.
  • jnad32 - Thursday, November 7, 2013 - link

    http://ir.amd.com/phoenix.zhtml?c=74093&p=irol...
    BOOM!
  • Creig - Friday, November 8, 2013 - link

    G-Sync will only work on Kepler and newer video cards.

    So if you have an older card, not only do you have to buy an expensive G-Sync capable monitor, you also need a new Kepler-based video card. Even if you already own a Kepler video card, you still have to purchase a new G-Sync monitor, which will cost you $100 more than an identical non-G-Sync monitor.

    Whereas Mantle is a free performance boost for all GCN video cards.

    Summary:
    G-Sync cost - purchase a new computer monitor, +$100 for the G-Sync module.
    Mantle cost - free performance increase for all GCN-equipped video cards.

    Pretty easy to see which one offers the better value.
  • neils58 - Sunday, November 10, 2013 - link

    As you say, Mantle is very exciting, but we don't know how much performance we are talking about yet. My thinking in saying that CrossFire was AMD's only answer is that, in order to avoid the stuttering effect of dropping below the Vsync rate, you have to ensure that the minimum framerate stays much higher, which means adding more cards or turning down quality settings. If Mantle turns out to be a huge performance increase, things might work out, but we just don't know.

    Sure, TN isn't ideal, but people with gaming priorities will already be looking for monitors with low input lag, fast refresh rates, and features like backlight strobing for motion blur reduction. G-Sync will basically become a standard feature in a brand's lineup of gaming-oriented monitors. I think it'll come down in price a fair bit too once there are a few competing brands.

    It's all made things tricky for me. I'm currently on a 1920x1200 VA monitor with a 5850 and was considering going up to a 1440p 27" screen (which would have required a new GPU purchase anyway). G-Sync adds enough value to gaming TNs to push me over to them.
  • jcollett - Monday, November 11, 2013 - link

    I've got a large 27" IPS panel, so I understand the concern. However, a good high-refresh panel need not cost very much and can still look great. Check out the ASUS VG248QE; I've been hearing good things about that panel, and it is relatively cheap at about $270. I assume it would work with G-Sync, but I haven't confirmed that myself. I'll be looking for reviews of Battlefield 4 using Mantle this December, as that could make up a big part of the decision on whether my next card comes from Team Green or Red.
  • misfit410 - Thursday, November 7, 2013 - link

    I don't buy that it's a game changer. I have no intention of replacing my three Dell UltraSharp monitors anytime soon, and even if I did, I have no intention of dealing with buggy DisplayPort as my only option for hooking up a synced monitor.
  • Mr Majestyk - Thursday, November 7, 2013 - link

    +1

    I've got two high-end Dell 27" monitors, and it's a joke to think I'd swap them out for garbage TN monitors just to get G-Sync.

    I don't see the 780 Ti as being any skin off AMD's nose. It's much dearer for very small gains, and we haven't seen the custom AMD boards yet. For now I'd probably get the R9 290, assuming custom boards can greatly improve on its cooling and thermals.
