Final Words

While there are definitely more areas to investigate, the Radeon VII remains the first 7nm gaming GPU, and that is no small feat. But beyond that, bringing it to consumers gives buyers a mid-generation option; and the more enthusiast-grade choices, the merrier. The Radeon VII may be a dual-use prosumer/gaming product at heart, but it still has to measure up as the fastest gaming card of the Radeon stack.

At the risk of being redundant, I can’t help but emphasize how surprised both Ryan and I are that this card is even here at this time. We’re still very early into the 7nm generation, and prior to last month, AMD seemed content to limit the Vega 20 GPU to their server-grade Radeon Instinct cards. Instead, a confluence of factors has come together to allow AMD to bring a chip that, by their own admission, was originally built for servers to the consumer market as a mid-generation kicker. There isn’t really a good precedent for the Radeon VII and its launch, and this makes things quite interesting from a tech enthusiast's point of view.

Kicking off our wrap-up then, let's talk about the performance numbers. Against its primary competition, the GeForce RTX 2080, the Radeon VII ends up 5-6% behind in our benchmark suite. Unfortunately, the only games where it takes the lead are Far Cry 5 and Battlefield 1, so the Radeon VII doesn't get to ‘trade blows’ as much as I'm sure AMD would have liked. Meanwhile, not unlike the RTX 2080 it competes with, AMD isn't looking to push the envelope on price-to-performance ratios here, so the Radeon VII isn't undercutting the 2080's pricing in any way. This is a perfectly reasonable choice for AMD to make given the state of the current market, but it does mean that when the card underperforms, there's no pricing advantage to help pick it back up.

Comparing the performance uplift over the original RX Vega 64 puts the Radeon VII in a better, if somewhat surprising, light. By the numbers, the latest Radeon flagship is around 24% faster at 1440p and 32% faster at 4K than its predecessor. So despite an interesting core configuration that sees the Radeon VII ship with fewer CUs than the RX Vega 64, the newer card pulls well ahead. Reference-to-reference, this might even be grounds for an upgrade rather than a side-grade.

All told, AMD came into this launch facing an uphill battle, both in terms of technology and product positioning. And the results for AMD are mixed. While it's extremely difficult to extract the benefits of 16GB of VRAM in today's games, I'm not ready to write it off as unimportant quite yet; video card VRAM capacities haven't changed much in the last two and a half years, and perhaps it's time they did. At this moment, however, AMD's extra VRAM isn't going to do much for gamers.

Content creation, on the other hand, is a more interesting story. Unlike games, there is no standard workload here, so I can only speak in extremely broad strokes. The Radeon VII is a fast card with 16GB of VRAM; it's a card that has no parallel in the market. So for prosumers or other professional visualization users looking to work on the cheap, if you have a workload that really does need more than the 8 to 11 gigabytes of VRAM found in similarly priced cards, then the Radeon VII at least warrants a bit of research. At that point we get into the merits of professional support, AMD's pro drivers, and what AMD will undoubtedly present to pro users down the line in a Radeon Pro-grade Vega 20 card.

As for AMD's technology challenges, the upside for the company is that the Radeon VII is definitely Vega improved. The downside for AMD is that the Radeon VII is still Vega. I won't harp too much on ray tracing here, or other gaming matters, because I'm not sure there's anything meaningful to say that we haven't said in our GeForce reviews. But at a broad level, Vega 20 introduces plenty of small, neat additions to the Vega architecture, even if they aren't really for consumers.

The bigger concern here is that AMD's strategy for configuring their cards hasn't really changed versus the RX Vega 64: AMD is still chasing performance above all else. This makes a great deal of sense given AMD's position, but it also means that the Radeon VII doesn't really try to address some of its predecessor's shortcomings, particularly against the competition. The Radeon VII has its allures, but power efficiency isn’t one of them.

Overall then, the Radeon VII puts its best foot forward when it offers itself as a high-VRAM prosumer card for gaming content creators. And at its $699 price point, that's not a bad place to occupy. However, for pure gamers, it's a little too difficult to recommend this card over NVIDIA's better-performing GeForce RTX 2080.

So where does this leave AMD? Fortunately for the Radeon rebels, their situation is improved even if the overall competitive landscape hasn’t significantly changed. It's not a win for AMD, but being able to compete with NVIDIA at this level means just that: AMD is still competitive. They can compete on performance, and thanks to Vega 20 they have a new slew of compute features to work with. It's going to win AMD business today, and it's going to help prepare the company for the next phase: Navi. It's still an uphill battle, but with the Radeon VII and Vega 20, AMD is now one more step up that hill.


  • eva02langley - Thursday, February 7, 2019 - link

    I should not say realistic, I should say credible.
  • webdoctors - Thursday, February 7, 2019 - link

Open source is NOT the only way a new standard can be adopted. Microsoft has been pushing DirectX 9/10/11, etc., and those are HUGELY popular standards. If MS is adopting it in their API, then yes, it'll show up in PC games.

Raytracing is not a gimmick; it's been around since before you were born or Nvidia was even founded. It hasn't been "feasible" for real-time use and as such has been largely ignored in gaming. Many other technologies were not feasible until they were, and then got incorporated. Graphics is more than just getting 60FPS; otherwise everything would just be black and white without shadows. It's about realism, which means proper lighting, shadows, and physics.

People need to call out the price. If you're a regular joe who's just getting a card for gaming and not mining or business use, why would you buy this over the competition? They seriously need to drop the price by $100 or it'll be a tiny seller.
  • D. Lister - Friday, February 8, 2019 - link

RTX is just Nvidia's way of doing DXR, which is Microsoft's IP. AMD has already announced specific development for it, to be integrated in their future GPUs. RT has been announced by both Sony and MS for their next consoles. Of course, because of their use of AMD GPUs, the application of RT would be of a lower quality compared to what RTX can do. It is very much like the current console implementation of anti-aliasing, HBAO, or tessellation, where on consoles you get a very basic level of those features, but on decent PCs they can be cranked up much higher.

    "The whole G-synch fiasco should have been enough to prove it."
This is nothing like G-Sync. The problem with G-Sync is the extra cost. Now considering that the 2080 offers the same price/performance as a Radeon VII, but has hardware DXR (RTX) as well, you're essentially getting the ray-tracing add-in for free.

Thirdly, while many things can be faked with rasterization to within an approximation of ray-tracing, it requires far greater work (not to mention artistic talent) to do so. In rasterization, a graphics designer has to first guess what a certain reflection or shadow would look like and then painstakingly make something that could pass for the real thing. Raytracing takes that guesswork out of the task. All you, as a developer, would need to do is place a light or a reflective surface, and RT would do the rest with mathematical accuracy, resulting in higher quality, much faster/smoother development, fewer glitches, and a much smaller memory/storage footprint for the final product.
  • D. Lister - Friday, February 8, 2019 - link

    A helpful link:

    https://blogs.msdn.microsoft.com/directx/2018/03/1...
  • Manch - Friday, February 8, 2019 - link

RTX is a proprietary implementation that is compatible with DirectX RT. AMD may eventually do DirectX RT, but it will be their own version. As far as consoles go, unless Navi has some kind of RT implementation, you're right, no RT of any significance. At best it will be a simple PC graphics option that works in a few titles, maybe like HairWorks lol.
  • eva02langley - Friday, February 8, 2019 - link

It is ... a GameWorks feature... as of now. RTX/DLSS are nothing more than 2 new GameWorks features... that will just break games, once again, to cripple the competition.

    The goal is not even to have RTX or DLSS, it is to force developers to use their proprietary tools to break game codes and sabotage the competition, like The Witcher 3.

RTX is nothing good as of now. It is a tax, and it hurts performance. Let's talk about it when it can be implemented in real-time. Until then, let Nvidia feel the burden of it.
  • eddman - Friday, February 8, 2019 - link

    I do agree that these RTX/DLSS features absolutely do not justify the current prices and that nvidia should've waited for 7nm to mature before adding them, but let's not get so emotional.

GameWorks features are simply modules that can be added to a game and are not part of the main code. Also, its GPU-based features can be disabled in the options, as was the case in The Witcher 3.
  • TheinsanegamerN - Thursday, February 7, 2019 - link

    And by flinging insults you have shown yourself to be an immature fanboi that is desperately trying to defend his favorite GPU company.
  • eva02langley - Friday, February 8, 2019 - link

    I didn't insult anyone, I just spoke the truth about RTX. I am not defending AMD, I am condemning Nvidia. Little difference...

To defend RTX as it is today is to be colored green all over. There is no way to defend it.
  • ballsystemlord - Thursday, February 7, 2019 - link

I agree; Huang should have listened to himself when he said that ray tracing would be a thing in 10 years (but he wanted to bring it to market now).
    Remember when there were 2D and 3D accelerators?
    I say we should be able to choose 3D or Ray-tracing accelerators.
