Gaming Performance, Continued

[Benchmark charts: The Witcher 3 - Ultra Quality (No Hairworks); The Division - Ultra Quality; Grand Theft Auto V - Very High Quality, average and 99th percentile framerates; each tested at 2560x1440 and 1920x1080]
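For readers curious where the 99th percentile numbers come from: they are typically derived from logged per-frame render times rather than from instantaneous framerates. A minimal sketch of the conversion, assuming frame times recorded in milliseconds (the data and function name here are hypothetical, not review tooling):

    import numpy as np

    def percentile_99_framerate(frame_times_ms):
        # The 99th percentile frame time is the threshold that 99% of
        # frames render faster than; inverting it gives the framerate
        # sustained for all but the worst 1% of frames.
        worst_case_ms = np.percentile(frame_times_ms, 99)
        return 1000.0 / worst_case_ms  # ms per frame -> frames per second

    # Hypothetical log: mostly ~16.7 ms frames with occasional 30 ms stutters
    frame_times = [16.7] * 990 + [30.0] * 10
    print(f"99th percentile framerate: {percentile_99_framerate(frame_times):.1f} fps")

This is why a card with a good average framerate can still post a poor 99th percentile number: a handful of long frames barely moves the average but dominates the tail.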

While AMD’s launch drivers for the RX 480 have by and large been stable, the one outlier here has been Grand Theft Auto V. In the current drivers there is an issue that appears to affect the game’s built-in benchmark on GCN 1.1 and later cards, causing stuttering, reduced performance, and in the case of the 380X, complete crashes. AMD has told me that they’ve discovered the issue as well and will be issuing a fixed driver, but it was not ready in time for the review.

[Benchmark charts: Hitman - Ultra Quality, DX11 and DX12, each tested at 2560x1440 and 1920x1080]

Continuing our look at gaming performance, it's becoming increasingly clear that the RX 480 tracks closely with the last-generation Radeon R9 390 and the GeForce GTX 970. Given their architectural similarity, in many ways this is a repeat of the 390 vs. 970 matchup: the two cards are sometimes equal and sometimes far apart, but on average they end up close together across our 2016 benchmark suite.
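As an aside on methodology, "close together on average" is usually quantified with a geometric mean of per-game performance ratios, which weights every title equally so no one outlier can dominate. A minimal sketch with made-up numbers (not actual review data):

    from math import prod

    # Hypothetical per-game average framerates; illustrative only
    rx480  = {"Witcher 3": 40, "Division": 42, "GTA V": 48, "Hitman": 55}
    gtx970 = {"Witcher 3": 42, "Division": 40, "GTA V": 50, "Hitman": 50}

    # Geometric mean of the per-game ratios
    ratios = [rx480[g] / gtx970[g] for g in rx480]
    geomean = prod(ratios) ** (1 / len(ratios))
    print(f"RX 480 vs GTX 970: {geomean:.3f}x")

With numbers like these the result lands within a couple percent of 1.0x, which is the pattern the charts above show: individual games swing both ways, while the suite-wide average is nearly a tie.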

For mainstream video card users, this means that last year’s enthusiast-level performance has come down to mainstream prices.

Comments

  • Flunk - Thursday, June 30, 2016

    The Crossfire reviews I've read have said the GTX 1070 is faster on average than RX 480 Crossfire, maybe you should go read those reviews.
  • Murloc - Tuesday, July 5, 2016

    comparing crossfire/sli to a single gpu is really useless. Multigpu means lots of heat, noise, power consumption, driver and game support issues, and performance that is most certainly not doubled on many games.

    Most people want ONE video card and they're going to get the one with the best bang for buck.
  • R0H1T - Wednesday, June 29, 2016

    For $200 I'll take this over the massive cash grab i.e. FE obviously!
  • Wreckage - Wednesday, June 29, 2016

    Going down with the ship eh? It took AMD 2 years to compete with the 970. I guess we will have to wait until 2018 to see what they have to go against the 1070
  • looncraz - Wednesday, June 29, 2016

    Two years to compete with the 970?

    The 970's only advantage over AMD's similarly priced GPUs was power consumption. That advantage is now gone - and AMD is charging much less for that level of performance.

    The RX480 is a solid GPU for mainstream 1080p gamers - i.e. the majority of the market. In fact, right now, it's the best GPU to buy under $300 by any metric (other than the cooler).

    Better performance, better power consumption, more memory, more affordable, more up-to-date, etc...
  • stereopticon - Wednesday, June 29, 2016

    are you kidding me?! better power consumption?! its about the same as the 970... it used something like 13 less watts while running crysis 3... if the gtx1060 ends up being as good as this card for under 300 while consuming less watts i have no idea what AMD is gonna do. I was hoping for this to have a little more power (more along the 980) to go inside my secondary rig.. but we will see how the 1060 performs.

    i still believe this is a good card for the money.. but the hype was definitely far greater than what the actual outcome was...
  • adamaxis - Wednesday, June 29, 2016

    Nvidia measures power consumption by average draw. AMD measures by max.

    These cards are not even remotely equal.
  • dragonsqrrl - Wednesday, June 29, 2016

    "Nvidia measures power consumption by average draw. AMD measures by max."

    That's completely false.
  • CiccioB - Friday, July 1, 2016

    Didn't you know that when using AMD HW the watt meter switches to "maximum mode", while when the probes are applied to nvidia HW it switches to "average mode"?

    Ah, ignorance, what a funny thing it is
  • dragonsqrrl - Friday, July 8, 2016

    @CiccioB

    No I didn't, source? Are you suggesting that the presence of AMD or Nvidia hardware in a system has some influence over the metering hardware used to measure power consumption? What about total system power consumption measured at the wall?

    At least in relation to advertised TDP, which is what my original comment was referring to, I know that what adamaxis said about avg and max power consumption is false.
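For what it's worth, the average-versus-peak distinction being argued over in this thread is easy to see in any power log: a card can spike well above its typical draw for brief moments while still averaging far below the peak. A minimal sketch with hypothetical wall-meter samples (not measured data):

    # Hypothetical 1-second power samples in watts; illustrative only
    samples = [148, 152, 160, 155, 210, 150, 149, 158, 151, 153]

    average_draw = sum(samples) / len(samples)  # typical sustained draw
    peak_draw = max(samples)                    # single transient spike

    print(f"average: {average_draw:.0f} W, peak: {peak_draw} W")
    # -> average: 159 W, peak: 210 W

Which of the two figures a vendor advertises as board power is a specification choice, not a property of the meter, which is the point being made above.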
