Synthetics Performance

Next up are synthetic tests.

Synthetic: TessMark, Image Set 4, 64x Tessellation


Synthetic: Beyond3D Suite - Pixel Fillrate


Synthetic: Beyond3D Suite - Integer Texture Fillrate (INT8)



Comments

  • peevee - Tuesday, February 12, 2019 - link

    "that the card operates at a less-than-native FP64 rate"

    The chip is capable of 2× higher FP64 performance. Marketoids must die.
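
    The point about the capped rate can be checked with back-of-the-envelope arithmetic. The sketch below uses approximate Radeon VII figures (3840 shaders, ~1.75 GHz boost, FP64 shipped at a 1:4 rate versus the 1:2 rate the Vega 20 silicon supports); exact clocks vary by workload.

    ```python
    # Rough peak-throughput estimate; shader count and clock are approximate.
    def peak_tflops(shaders, clock_ghz, flops_per_clock=2):
        # flops_per_clock=2 counts a fused multiply-add as two FLOPs.
        return shaders * clock_ghz * flops_per_clock / 1000

    fp32 = peak_tflops(3840, 1.75)   # ~13.4 TFLOPS FP32
    fp64_shipped = fp32 / 4          # 1:4 FP64 rate as the card ships
    fp64_native = fp32 / 2           # 1:2 FP64 rate of the underlying Vega 20 die
    ```

    So the silicon could deliver roughly twice the FP64 throughput the product exposes, which is what the comment above is objecting to.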
  • FreckledTrout - Thursday, February 7, 2019 - link

    Performance-wise, it did better than I expected. This card is pretty loud and runs a bit hot for my tastes. Nice review. Where are the 8K and 16K tests :)
  • IGTrading - Thursday, February 7, 2019 - link

    When drivers mature, AMD Radeon VII will beat the GF 2080.

    Just like Radeon Fury X beats the GF 980 and Radeon Vega 64 beats the GF 1080.

    When drivers mature and nVIDIA's blatant sabotage against its older cards (and AMD's cards) gets mitigated, the long-time owner of the card will enjoy better performance.

    Unfortunately, on the power side, nVIDIA still has the edge, but I'm confident that those 16 GB of VRAM will really show their worth in the following year.
  • cfenton - Thursday, February 7, 2019 - link

    I'd rather have a card that performs better today than one that might perform better in two or three years. By that point, I'll already be looking at new cards.

    This card is very impressive for anyone who needs FP64 compute and lots of VRAM, but it's a tough sell if you primarily want it for games.
  • Benjiwenji - Thursday, February 7, 2019 - link

    AMD cards have traditionally aged much better than Nvidia's. GamersNexus just re-benchmarked the 290X from 2013 on modern games and found it comparable to the 980, 1060, and 580.

    The GTX 980 came out in late 2014 with a $550 USD price tag, and now struggles at 1440p.

    Not to mention that you can get a lot out of AMD cards if you're willing to tinker. My Vega 56, which I got from Microcenter in Nov 2017 for $330 (a total steal), now performs at GTX 1080 level after a BIOS flash + OC.
  • eddman - Friday, February 8, 2019 - link

    What are you talking about? The GTX 980 still performs as it should at 1440p.
  • Icehawk - Friday, February 8, 2019 - link

    My 970 does just fine too; I can play at 1440p maxed or near-maxed in everything, and 4K in older/simpler games too (e.g. Overwatch). I was planning on a new card this gen for 4K, but pricing is just too high for the gains, so I'm going to hold off one more round...
  • Gastec - Tuesday, February 12, 2019 - link

    That's because, as the legend has it, Nvidia is, or was in the past, gimping their older-generation cards via drivers.
  • kostaaspyrkas - Sunday, February 10, 2019 - link

    At the same frame rates, Nvidia gameplay gives me a sense of choppiness... AMD Radeon gives more fluid gameplay...
  • yasamoka - Thursday, February 7, 2019 - link

    This wishful in-denial conjecture needs to stop.

    1) AMD Radeon VII is based on the Vega architecture, which has been on the market since June 2017. That's about 20 months; the drivers have had more than enough time to mature. It's obvious that in certain cases there are clear bottlenecks (e.g. GTA V), but this seems to be the fundamental nature of AMD's drivers when it comes to DX11 performance in games that issue a lot of draw calls. Holding out for improvements here isn't going to please you much.

    2) The Radeon Fury X was meant to go against the GTX 980 Ti, not the GTX 980. The Fury, sitting slightly below the Fury X, easily covers the GTX 980 performance bracket. The Fury X still doesn't beat the GTX 980 Ti, in particular due to its limited VRAM, where it even falls behind the RX 480 8GB and its siblings (RX 580, RX 590).

    3) There is no evidence of Nvidia's sabotage against any of its older cards when it comes to performance, and frankly your dig against GameWorks "sabotaging" AMD's cards performance is laughable when the same features, when enabled, also kill performance on Nvidia's own cards. PhysX has been open-source for 3 years and has now moved on to its 4th iteration, being used almost universally now in game engines. How's that for vendor lockdown?

    4) 16GB of VRAM will not even begin to show their worth in the next year. That's wishful thinking, or more like excusing all the bad decisions AMD tends to make when it comes to product differentiation between their compute and gaming cards. It's baffling at this point that they still haven't learned to diverge their product lines and establish separate architectures, in order to optimize power draw and bill of materials on the gaming card by cutting architectural features that are unneeded for gaming. 16GB is unneeded, 1TB/s of bandwidth is unneeded, HBM is expensive and unneeded. The RTX 2080 is averaging higher scores with half the bandwidth, half the VRAM capacity, and GDDR6.

    The money is in the gaming market and the professional market; the prosumer market is a sliver in comparison. Look at what Nvidia does: each generation it releases a mere handful of mascots, all similar to one another (the Titan series), to take care of that sliver. You'd think they'd have a bigger portfolio if it were such a lucrative market? Meanwhile, on the gaming end: entire lineups. On the professional end: entire lineups (Quadro, Tesla).

    Get real.
