Company of Heroes 2

The second game in our benchmark suite is Relic Entertainment's Company of Heroes 2, the developer's World War II Eastern Front themed RTS. For Company of Heroes 2 Relic was kind enough to put together a very strenuous built-in benchmark, captured from one of the most demanding, snow-bound maps in the game, giving us a good look at CoH2's performance at its worst. Consequently, if a card can do well here then it should have no trouble throughout the rest of the game.

Company of Heroes 2 - 1920x1080 - High Quality + Low AA

Company of Heroes 2 - 1920x1080 - Low Quality

Company of Heroes 2 is a game that has consistently favored AMD's GPUs, so the outcome here is no surprise. The R7 260 is competitive with the otherwise much more expensive GTX 660, leaving the R7 265 to happily stomp all over every NVIDIA card on this chart.

On a side note, for Radeon HD 5700 series owners looking for an AMD card that can double their performance for under $150, it looks like AMD has finally delivered, as evidenced by the performance of the R7 265 in this and other games. We’re looking at a fairly consistent 100% increase in performance here. Coming some 4 years after the 5770 launched at $159, that is clearly a long wait, but it does happen to coincide nicely with the 4 year upgrade cycle that many video card buyers are on these days.

Company of Heroes 2 - Min. Frame Rate - 1920x1080 - High Quality + Low AA

Company of Heroes 2 - Min. Frame Rate - 1920x1080 - Low Quality

Comments

  • just4U - Thursday, February 13, 2014 - link

While you may be right... AMD/ATI does like throwing popular configurations into the mix. The 265 reminds me a lot of the 4830; while that card was fairly short-lived, it was a hot seller for them, as it straddled two performance areas but came in at a nicer price point.
  • jabber - Friday, February 14, 2014 - link

Indeed, I swapped from being a longtime Nvidia user to AMD back in 2009, as I got fed up with Nvidia regurgitating the old 8800 chips three times in a row for the mid level.

Stuff doesn't have to change radically performance-wise, but it's nice to know new features are added and other things get revised and tweaked. A simple name change isn't enough, really.
  • MrSpadge - Thursday, February 13, 2014 - link

I'm actually happy they're finally making use of that last digit in their 3-number scheme. From my point of view they could have ditched the X altogether and made the R9 270X an R9 275 (or whatever is appropriate). And speaking of R9: they could have given the R7 265 the rating R9 265 to more closely connect it with the R9 270. Or just drop that prefix as well, if the numbers don't overlap anyway and the R9/7/3 prefix isn't related to features either!

    Speaking about the cards:
- boost clocks an additional 25 MHz again? I have no idea why these are there. Make it 100+ MHz or leave it out.
    - 1.175 V for a mere 925 MHz? The chip should be able to do 1.0 GHz at ~1.0 V, maybe 1.10 V for guaranteed clocks
    - same for R7 260 - that voltage is ridiculously high

    Anyway, the cards themselves are fine (just like the 7000 series) and the coolers really fit them.
  • silverblue - Thursday, February 13, 2014 - link

    The single GPU frame latency issue has been fixed for more than six months. I doubt it's going to become a problem again like with AMD's handling of 2D a while back.

There are remarks concerning the availability of the R9 270 series and the inability of these parts to keep to their RRP, neither of which would be present if this were some sort of fanboy review.
  • Spuke - Thursday, February 13, 2014 - link

    Has it been 6 months? I thought they recently fixed that problem.
  • silverblue - Thursday, February 13, 2014 - link

    It was fixed in Cat 13.8 Beta 1, dated 1st August.
  • silverblue - Thursday, February 13, 2014 - link

    My bad - that's when CrossFire had its first fix. Apparently, single-GPU was fixed beforehand, though I can't find which driver version it was.
  • Solid State Brain - Thursday, February 13, 2014 - link

Anandtech: it would be interesting if you tested idle power consumption in multi-monitor scenarios. I think you would find some surprises.
  • creed3020 - Thursday, February 13, 2014 - link

    Excellent point!

I had a friend with a 6950, and he was furious that his video card would never idle down its GPU/memory frequencies when he had a second monitor connected.

    I personally have a 6850 and two 20" LCDs connected over DVI. I have not looked for the same behaviour but would not be surprised if it were the same.

Power efficiencies are out the window once the user chooses to go multi-monitor to be more productive.
  • Solid State Brain - Thursday, February 13, 2014 - link

    I have the same issue with my HD7770 to a lesser extent and my workaround for that is connecting my two secondary displays on the integrated Intel GPU. This saves a significant amount of power.
