Company of Heroes 2

The second game in our benchmark suite is Relic Entertainment's Company of Heroes 2, the developer's World War II Eastern Front themed RTS. For Company of Heroes 2, Relic was kind enough to put together a very strenuous built-in benchmark, captured from one of the most demanding snow-bound maps in the game, giving us a great look at CoH2's performance at its worst. Consequently, if a card can do well here then it should have no trouble with the rest of the game.

Company of Heroes 2 - 3840x2160 - Low Quality

Company of Heroes 2 - 2560x1440 - Maximum Quality + Med. AA

Company of Heroes 2 - 1920x1080 - Maximum Quality + Med. AA

Since CoH2 is not AFR compatible, the best performance you're going to get out of it is whatever you can get out of a single GPU, and in that case the GTX 980 is the fastest card out there for this game. AMD's R9 290XU does hold up well, though; the GTX 980 may have the lead, but AMD is never more than a few percent behind at 4K and 1440p. The lead over the GTX 780 Ti, on the other hand, is much more substantial at 13% to 22%. So NVIDIA has finally taken this game back from AMD, as it were.

Elsewhere, against the GTX 680 this is another very strong showing for the GTX 980, with a performance advantage of over 80%.
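As a quick illustration of how these generational leads are computed (the framerate numbers below are placeholders for illustration, not the review's measured data):

```python
def percent_advantage(new_fps: float, old_fps: float) -> float:
    """Relative performance advantage of new_fps over old_fps, in percent."""
    return (new_fps / old_fps - 1.0) * 100.0

# Hypothetical framerates, chosen only to show the arithmetic:
print(percent_advantage(45.0, 25.0))  # 80.0 -> an "over 80%" lead means a ratio above 1.8x
```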

On an absolute basis, at these settings you're looking at an average framerate in the 40s, which for an RTS is solid performance.

Company of Heroes 2 - Min. Frame Rate - 3840x2160 - Low Quality

Company of Heroes 2 - Min. Frame Rate - 2560x1440 - Maximum Quality + Med. AA

Company of Heroes 2 - Min. Frame Rate - 1920x1080 - Maximum Quality + Med. AA

However, when it comes to minimum framerates the GTX 980 can't quite stay on top. In every case it is edged out by the R9 290XU by a fraction of a frame per second; AMD seems to weather the hardest framerate drops just a bit better than NVIDIA does. That said, neither card can quite hold the line at 30fps at 1440p and 4K.


274 Comments


  • Viewgamer - Friday, September 19, 2014 - link

    To Ryan Smith. How can the GTX 980 possibly have a 165W TDP when it actually consumes 8 watts more than the 195W TDP GTX 680 !? please explain ? did Nvidia just play games with the figures to make them look more impressive ?
  • ArmedandDangerous - Friday, September 19, 2014 - link

    TDP =/= Power consumption although they are related. TDP is the amount of heat it will output.
  • Carrier - Friday, September 19, 2014 - link

    You're right, power consumption and heat output are related. That's because they're one and the same! What else could that electricity be converted to? Light? A massive magnetic field? Mechanical energy? (The fan, slightly, but the transistors aren't going anywhere.)
  • Laststop311 - Friday, September 19, 2014 - link

    no they aren't the same. Not all the electricity used is converted to heat. This is where the word EFFICIENCY comes into play. Yes it is related in a way, but Maxwell is more efficient with the electricity it draws, using more of it and losing less of it to heat output. It's all in its design.
  • bernstein - Friday, September 19, 2014 - link

    bullshit. since a gpu doesn't do chemical nor mechanical transformations all the energy used is converted to heat (by way of moving electrons around). efficiency in a gpu means how much energy is used for a fixed set of calculations (for example: flops)
  • Senpuu - Friday, September 19, 2014 - link

    It's okay to be ignorant, but not ignorant and belligerent.
  • bebimbap - Friday, September 19, 2014 - link

    there is "work" being done, as transistors have to "flip" by use of electrons. Even if you don't believe that "input energy =\= output heat" think of it this way
    100w incandescent bulb produces X amount of useful light
    18w florescent bulb also produces X amount of useful light

    in this sense the florescent bulb is much more efficient as it uses only 18w to produce the same light as the 100w incandescent. so if we say they produce the same amount of heat, then
    100w florescent would produce ~5x the light of a 100w incandescent.
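The light-bulb arithmetic in the comment above can be checked directly; the wattages are the commenter's own figures, and the "~5x" is really closer to 5.6x:

```python
# Checking the comment's luminous-efficacy arithmetic (wattages from the comment):
incandescent_w = 100
fluorescent_w = 18   # stated to produce the same useful light as the 100 W incandescent
scale = incandescent_w / fluorescent_w  # light-per-watt advantage of the fluorescent
print(round(scale, 1))  # 5.6 -> a 100 W fluorescent would give ~5.6x the light
```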
  • Laststop311 - Saturday, September 20, 2014 - link

    ur so smart bro
  • Guspaz - Friday, September 19, 2014 - link

    The power draw figures in this article are overall system power draw, not GPU power draw. Since the 980 offers significantly more performance than the 680, it's cranking out more frames, which causes the CPU to work harder to keep up. As as result, the CPU power draw increases, counteracting the benefits of lower GPU power draw.
  • Carrier - Friday, September 19, 2014 - link

    I don't think that can explain the whole difference. It performs similarly to a 780 Ti in Crysis 3, so the difference in power consumption can only come from the card. The 980 is rated 85W less in TDP but consumes only 68W less at the wall. The discrepancy gets worse when you add losses in the power supply.

    My guess is the TDP is rated at nominal clock rate, which is cheating a little because the card consistently runs much higher than nominal because of the boost.
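The discrepancy Carrier describes can be sketched numerically. The 85 W and 68 W deltas come from the comment itself; the 90% PSU efficiency is an assumption for illustration, not a measured figure from the review:

```python
# Converting an at-the-wall power delta to a DC-side (component) delta.
tdp_delta_w = 85        # GTX 780 Ti TDP (250 W) minus GTX 980 TDP (165 W)
wall_delta_w = 68       # measured difference at the wall, per the comment
psu_efficiency = 0.90   # assumed PSU efficiency (hypothetical, not measured)

# Only ~90% of each wall watt reaches the components, so the DC-side
# difference between the two cards is smaller than the wall reading suggests.
dc_delta_w = wall_delta_w * psu_efficiency
print(round(dc_delta_w, 1))  # 61.2 -> even further from the 85 W the TDPs imply
```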
