Final Words

While there are definitely more areas to investigate, what we've seen so far makes one thing clear: the Radeon VII is the first 7nm gaming GPU, and that is no small feat. Beyond that, bringing it to consumers gives buyers a mid-generation option, and the more enthusiast-grade choices, the merrier. The Radeon VII may be a dual-use prosumer/gaming product at heart, but it still has to measure up to its billing as the fastest gaming card in the Radeon stack.

At the risk of being redundant, I can’t help but emphasize how surprised both Ryan and I are that this card is even here at this time. We’re still very early into the 7nm generation, and prior to last month, AMD seemed content to limit the Vega 20 GPU to their server-grade Radeon Instinct cards. Instead, a confluence of factors has come together to allow AMD to bring a chip that, by their own admission, was originally built for servers to the consumer market as a mid-generation kicker. There isn’t really a good precedent for the Radeon VII and its launch, which makes things quite interesting from a tech enthusiast’s point of view.

Kicking off our wrap-up then, let's talk about the performance numbers. Against its primary competition, the GeForce RTX 2080, the Radeon VII ends up 5-6% behind in our benchmark suite. Unfortunately, the only games where it takes the lead are Far Cry 5 and Battlefield 1, so the Radeon VII doesn't get to ‘trade blows’ as much as I'm sure AMD would have liked to see. Meanwhile, much like NVIDIA with the RTX 2080, AMD isn't looking to push the envelope on price-to-performance ratios here, so the Radeon VII isn't undercutting the 2080's pricing in any way. This is a perfectly reasonable choice for AMD to make given the state of the current market, but it does mean that when the card underperforms, there's no pricing advantage to help pick it back up.

Comparing the performance uplift over the original RX Vega 64 puts the Radeon VII in a better light, and a bit of a surprising one. By the numbers, the latest Radeon flagship is around 24% faster at 1440p and 32% faster at 4K than its predecessor. So despite an interesting core configuration that sees the Radeon VII ship with fewer CUs than the RX Vega 64, the newer card pulls well ahead. Reference-to-reference, this might even be grounds for an upgrade rather than a side-grade.
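
For readers curious how a suite-wide figure like this is typically derived, each game's result is normalized against the comparison card and the ratios are then averaged, often with a geometric mean so that no single title dominates. The short Python sketch below illustrates the idea; the frame rates in it are hypothetical placeholders, not our benchmark data, and it is not a claim about our exact methodology.

    from math import prod

    def relative_performance(card_fps, baseline_fps):
        """Geometric mean of per-game frame rate ratios (card vs. baseline)."""
        ratios = [card_fps[game] / baseline_fps[game] for game in card_fps]
        return prod(ratios) ** (1 / len(ratios))

    # Hypothetical 4K frame rates, purely for illustration -- not review data.
    radeon_vii = {"Game A": 62.0, "Game B": 88.0, "Game C": 54.0}
    rx_vega_64 = {"Game A": 48.0, "Game B": 66.0, "Game C": 41.0}

    uplift_pct = (relative_performance(radeon_vii, rx_vega_64) - 1) * 100
    print(f"Radeon VII vs. RX Vega 64 (geomean): {uplift_pct:+.1f}%")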

All told, AMD came into this launch facing an uphill battle, both in terms of technology and product positioning. And the results for AMD are mixed. While it's extremely difficult to extract the benefits of 16GB of VRAM in today's games, I'm not ready to write it off as unimportant quite yet; video card VRAM capacities haven't changed much in the last two and a half years, and perhaps it's time they did. At this moment, however, AMD's extra VRAM isn't going to do much for gamers.

Content creation, on the other hand, is a more interesting story. Unlike games, there is no standard workload here, so I can only speak in extremely broad strokes. The Radeon VII is a fast card with 16GB of VRAM; it's a card that has no parallel in the market. So for prosumers or other professional visualization users looking to work on the cheap, if you have a workload that really does need more than the 8 to 11 gigabytes of VRAM found in similarly priced cards, then the Radeon VII at least warrants a bit of research. At which point we get into the merits of professional support, AMD's pro drivers, and what AMD will undoubtedly present to pro users down the line in a Radeon Pro-grade Vega 20 card.

As for AMD's technology challenges, the upside for the company is that the Radeon VII is definitely Vega improved. The downside for AMD is that the Radeon VII is still Vega. I won't harp too much on ray tracing or other gaming matters here, because I'm not sure there's anything meaningful to say that we haven't already said in our GeForce reviews. But at a broad level, Vega 20 introduces plenty of small, neat additions to the Vega architecture, even if they aren't really for consumers.

The bigger concern here is that AMD's strategy for configuring their cards hasn't really changed versus the RX Vega 64: AMD is still chasing performance above all else. This makes a great deal of sense given AMD's position, but it also means that the Radeon VII doesn't really try to address some of its predecessor's shortcomings, particularly against the competition. The Radeon VII has its allures, but power efficiency isn’t one of them.

Overall then, the Radeon VII puts its best foot forward when it offers itself as a high-VRAM prosumer card for gaming content creators. And at its $699 price point, that's not a bad place to occupy. However, for pure gamers, it's a little too difficult to suggest this card over NVIDIA's better-performing GeForce RTX 2080.

So where does this leave AMD? Fortunately for the Radeon rebels, their situation is improved even if the overall competitive landscape hasn’t significantly changed. It's not a win for AMD, but being able to compete with NVIDIA at this level means just that: AMD is still competitive. They can compete on performance, and thanks to Vega 20 they have a new slew of compute features to work with. It's going to win AMD business today, and it's going to help prepare the company for tomorrow and the next phase that is Navi. It's still an uphill battle, but with the Radeon VII and Vega 20, AMD is now one more step up that hill.

Comments

  • repoman27 - Thursday, February 7, 2019 - link

    The Radeon Pro WX 7100 is Polaris 10, which does not do DSC. DSC requires fixed function encoding blocks that are not present in any of the Polaris or Vega variants. They do support DisplayPort 1.3 / 1.4 and HBR3, but DSC is an optional feature of the DP spec. AFAIK, the only GPUs currently shipping that have DSC support are NVIDIA's Turing chips.

    The CPU in the iMac Pro is a normal, socketed Xeon W, and you can max the RAM out at 512 GB using LRDIMMs if you're willing to crack the sucker open and shell out the cash. So making those things user accessible would be the only benefit to a modular Mac Pro. CPU upgrades are highly unlikely for that platform though, and I doubt Apple will even provide two DIMM slots per channel in the new Mac Pro. However, if they have to go LGA3647 to get an XCC based Xeon W, then they'd go with six slots to populate all of the memory channels. And the back of a display that is also 440 square inches of aluminum radiator is not necessarily a bad place to be, thermally. Nothing is open about Thunderbolt yet, by the way, but of course Apple could still add existing Intel TB3 controllers to an AMD design if they wanted to.

    So yeah, in order to have a product, they need to beat the iMac Pro in some meaningful way. And simply offering user accessible RAM and PCIe slots in a box that's separate from the display isn't really that, in the eyes of Apple at least. Especially since PCIe slots are far from guaranteed, if not unlikely.
  • halcyon - Friday, February 8, 2019 - link

    Apple cannot ship a Mac Pro with a vacuum cleaner. That 43 dBA is insane. Even if Apple downclocked and undervolted it in the BIOS, I doubt they could make it very quiet.

    Also, I doubt AMD is willing to sell tons of them at a loss.
  • dark_light - Thursday, February 7, 2019 - link

    Well written, balanced, and comprehensive review that covers all the bases with just the right amount of detail.

    Thanks Nate Oh.

    Anandtech is still arguably the best site for this content. Kudos guys.
  • drgigolo - Thursday, February 7, 2019 - link

    So I got a 1080Ti at launch, because there was no other alternative at 4K. Finally we have an answer from AMD; unfortunately, it's no faster than my almost 2-year-old GPU, and priced the same no less.

    I really think this would've benefitted from 128 ROPs, or 96.

    If they had priced this at 500 dollars, it would've been a much better bargain.

    I can't think of anyone who I would recommend this to.
  • sing_electric - Thursday, February 7, 2019 - link

    To be fair, you could almost say the same thing about the 2080, "I got a 1080 Ti at launch and 2 years later, Nvidia released a GPU that barely performs better if you don't care about gimmicks like ray tracing."

    People who do gaming and compute might very well be tempted, and people who don't like Nvidia (or just do like AMD) might be tempted too.

    Unfortunately, the cost of the RAM in this thing alone is probably nearly $350, so there's no way AMD could sell this thing for $500 (but it wouldn't surprise me if we see it selling a little under MSRP if there is plentiful supply and Nvidia can crank out enough 2080s).
  • eva02langley - Thursday, February 7, 2019 - link

    That was the whole point of RTX. Besides the 2080 Ti, there was nothing new. You were getting the same performance for around the same price as the last generation. There was no price disruption.
  • Oxford Guy - Thursday, February 7, 2019 - link

    Poor AMD.

    We're supposed to buy a clearly inferior product (look at that noise) just so they can sell leftover and defective Instincts?

    We're supposed to buy an inferior product because AMD's bad business moves have resulted in Nvidia being able to devalue the GPU market with Turing?

    Nope. We're supposed to either buy the best product for the money or sit out and wait for something better. Personally, I would jump for joy if everyone would put their money into a crowdfunded company, with management that refuses to be eaten alive by a megacorp, to take on Nvidia and AMD in the pure gaming space. There was once space for three players, and there is space today. I am not holding my breath for Intel to do anything particularly valuable.

    Wouldn't it be nice to have a return to pure no-nonsense gaming designs, instead of this "you can buy our defective parts for high prices and feel like you're giving to charity" and "you can buy our white elephant feature well before its time has come and pay through the nose for it" situation?

    Capitalism has had a bad showing for some time now in the tech space. Monopolies and duopolies reign supreme.
  • eva02langley - Friday, February 8, 2019 - link

    Honestly, besides an RX 570/580, no GPUs make sense right now.

    Funny that Polaris is still the best bang for the $ today.
  • drgigolo - Saturday, February 9, 2019 - link

    Well, at least you can buy a 2080Ti, even though the 2080 is of course at the same price point as the 1080Ti. But I won't buy a 2080Ti either; it's too expensive and the performance increase is too small.

    The last decent AMD card I had was the R9 290X. Had that for a few years until the 1080 came out, and then replaced that with a 1080Ti when I got an Acer Predator XB321HK.

    I will wait until something better comes along. Would really like HDMI 2.1 output, so that I can use VRR on the upcoming LG OLED C9.
  • sing_electric - Thursday, February 7, 2019 - link

    Oh, also, FWIW: The other way of looking at it is "damn, that 1080 Ti was a good buy. Here I am 2 years later and there's very little reason for me to upgrade."
