Final Words

While there are definitely more areas to investigate, what we've seen of the Radeon VII is still the first 7nm gaming GPU, and that is no small feat. But beyond that, bringing it to consumers gives buyers a mid-generation option; and the more enthusiast-grade choices, the merrier. The Radeon VII may be a dual-use prosumer/gaming product at heart, but it still has to measure up as the fastest gaming card in the Radeon stack.

At the risk of being redundant, I can't help but emphasize how surprised both Ryan and I are that this card is even here at this time. We're still very early into the 7nm generation, and prior to last month, AMD seemed content to limit the Vega 20 GPU to their server-grade Radeon Instinct cards. Instead, a confluence of factors has come together to allow AMD to bring a chip that, by their own admission, was originally built for servers to the consumer market as a mid-generation kicker. There isn't really a good precedent for the Radeon VII and its launch, and this makes things quite interesting from a tech enthusiast's point of view.

Kicking off our wrap-up then, let's talk about the performance numbers. Against its primary competition, the GeForce RTX 2080, the Radeon VII ends up 5-6% behind in our benchmark suite. Unfortunately, the only games where it takes the lead are Far Cry 5 and Battlefield 1, so the Radeon VII doesn't get to 'trade blows' as much as I'm sure AMD would have liked to see. Meanwhile, not unlike the RTX 2080 it competes with, AMD isn't looking to push the envelope on price-to-performance ratios here, so the Radeon VII isn't undercutting the pricing of the 2080 in any way. This is a perfectly reasonable choice for AMD to make given the state of the current market, but it does mean that when the card underperforms, there's no pricing advantage to help pick it back up.

Comparing the performance uplift over the original RX Vega 64 puts the Radeon VII in a better light, if a bit of a surprising one. By the numbers, the latest Radeon flagship is around 24% faster at 1440p and 32% faster at 4K than its predecessor. So despite an interesting core configuration that sees the Radeon VII ship with fewer CUs than the RX Vega 64, it pulls well ahead. Reference-to-reference, this might even be grounds for an upgrade rather than a side-grade.
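As a quick sanity check on that result, a back-of-the-envelope comparison of peak throughput helps explain it. The sketch below uses AMD's publicly listed specifications (stream processor counts, boost clocks, and memory bandwidth) as assumptions rather than our measured figures:

```python
# Back-of-the-envelope peak throughput, using AMD's published specs as
# assumptions (boost clocks in GHz, bandwidth in GB/s) -- not measured data.
cards = {
    "RX Vega 64": {"sps": 4096, "clock_ghz": 1.546, "bandwidth_gbps": 484},
    "Radeon VII": {"sps": 3840, "clock_ghz": 1.800, "bandwidth_gbps": 1024},
}

for name, spec in cards.items():
    # 2 FLOPs per stream processor per clock (fused multiply-add)
    tflops = 2 * spec["sps"] * spec["clock_ghz"] / 1000
    print(f"{name}: ~{tflops:.1f} TFLOPS FP32, {spec['bandwidth_gbps']} GB/s")

# Prints roughly 12.7 TFLOPS / 484 GB/s for the RX Vega 64 and
# 13.8 TFLOPS / 1024 GB/s for the Radeon VII.
```

On paper that works out to only a ~9% FP32 uplift, as the higher clocks more than offset the missing CUs; the larger gains we measure, particularly at 4K, point to the roughly doubled memory bandwidth (along with the rest of the Vega 20 tweaks) doing much of the heavy lifting.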

All told, AMD came into this launch facing an uphill battle, both in terms of technology and product positioning. And the results for AMD are mixed. While it's extremely difficult to extract the benefits of 16GB of VRAM in today's games, I'm not ready to write it off as unimportant quite yet; video card VRAM capacities haven't changed much in the last two and a half years, and perhaps it's time they did. At this moment, however, AMD's extra VRAM isn't going to do much for gamers.

Content creation, on the other hand, is a more interesting story. Unlike games, there is no standard workload here, so I can only speak in extremely broad strokes. The Radeon VII is a fast card with 16GB of VRAM; it's a card that has no parallel in the market. So for prosumers or other professional visualization users looking to work on the cheap, if you have a workload that really does need more than the 8 to 11 gigabytes of VRAM found in similarly priced cards, then the Radeon VII at least warrants a bit of research. At which point we get into the merits of professional support, AMD's pro drivers, and what AMD will undoubtedly present to pro users down the line in a Radeon Pro-grade Vega 20 card.

As for AMD's technology challenges, the upside for the company is that the Radeon VII is definitely Vega improved. The downside for AMD is that the Radeon VII is still Vega. I won't harp too much on ray tracing here, or other gaming matters, because I'm not sure there's anything meaningful to say that we haven't said in our GeForce reviews. But at a broad level, Vega 20 introduces plenty of small, neat additions to the Vega architecture, even if they aren't really for consumers.

The bigger concern here is that AMD's strategy for configuring their cards hasn't really changed versus the RX Vega 64: AMD is still chasing performance above all else. This makes a great deal of sense given AMD's position, but it also means that the Radeon VII doesn't really try to address some of its predecessor's shortcomings, particularly against the competition. The Radeon VII has its allures, but power efficiency isn’t one of them.

Overall then, the Radeon VII puts its best foot forward when it offers itself as a high-VRAM prosumer card for gaming content creators. And at its $699 price point, that's not a bad place to occupy. For pure gamers, however, it's a little too difficult to recommend this card over NVIDIA's better-performing GeForce RTX 2080.

So where does this leave AMD? Fortunately for the Radeon rebels, their situation is improved even if the overall competitive landscape hasn't significantly changed. It's not a win for AMD, but being able to compete with NVIDIA at this level means just that: AMD is still competitive. They can compete on performance, and thanks to Vega 20 they have a new slew of compute features to work with. It's going to win AMD business today, and it's going to help prepare the company for the next phase that is Navi. It's still an uphill battle, but with the Radeon VII and Vega 20, AMD is now one more step up that hill.

Comments

  • PeachNCream - Thursday, February 7, 2019

    Sorry about that. The Radeon VII is very much out of the range of prices I'm willing to pay for any single component, or even a whole system for that matter. I was zinging about the GPU being called high-end (which it rightfully is) because in another recent article, a $750 monitor was referred to as midrange. See:

    https://www.anandtech.com/show/13926/lg-launches-3...

    It was more to make a point about the inconsistency with which AT classifies products than an actual reflection of my own buying habits.

    As for my primary laptop, my daily driver is a Bay Trail HP Stream 11 running Linux so yeah, it's packing 2GB of RAM and 32GB of eMMC. I have a couple other laptops around which I use significantly less often that are older, but arguably more powerful. The Stream is just a lot easier to take from place to place.
  • Korguz - Friday, February 8, 2019

    it could be.. that maybe the manufacturer refers to it ( the monitor ) as a mid-range product in their product stack.. and AT.. just calls it that, because of that ?

    :-)
  • eva02langley - Friday, February 8, 2019

    I follow you on that. I bought a 1080 Ti and told myself this is the maximum I am willing to pay for a GPU.

    I needed something for 4K and it was the only option. If Navi is 15% faster than Vega 64 for $300, I am buying one at launch.
  • D. Lister - Saturday, February 9, 2019

    But why would you want to spend $300 for a downgrade from your 1080Ti?
  • HollyDOL - Thursday, February 7, 2019

    Purely on the gaming front this can't really compete with the RTX 2080 (unless some big enough perf change comes with new drivers soon)... it's performing almost the same, but at a little more power, hotter, and almost 10dB louder, which is quite a lot. Given that it won't be able to offer anything more (as opposed to possible adoption of DXR), I wouldn't expect it to try to compete at the same price level the RTX 2080 does.

    If it can get $50-$100 lower otoh, you get what many people asked for... kind of "GTX 2080" ... classic performance without ray tracing and DLSS extensions.

    With the current price though, it only makes sense if they are betting they can get enough compute buyers.
  • Oxford Guy - Thursday, February 7, 2019

    Yeah, because losing your hearing to tinnitus is definitely worth that $50-100.
  • HollyDOL - Friday, February 8, 2019

    Well, it's "lab conditions"; it can always get dampened to reasonable levels with a good chassis or chassis placement, and hopefully no one is playing with their head stuck inside the chassis... For me, subjectively, it would be too loud, but I wanted to give the card the benefit of the doubt; non-reference designs should hopefully get to lower levels.
  • Oxford Guy - Friday, February 8, 2019

    1) The Nvidia card will be quieter in a chassis. So, that excuse fails.

    2) I am not seeing significant room for doubt. The Fury X was a quiet product (except at idle, which some complained about, and, at least in some cases, coil whine). AMD has chosen to move backward, severely, in the noise department with this product.

    This card has a fancy copper vapor chamber with flattened heatpipes and three fans. It also runs hot. So how is it at all rational to expect 3rd-party cards to fix the noise problem? 3rd-party makers typically use 3-slot designs to increase clocks, and they typically cost even more.
  • HollyDOL - Friday, February 8, 2019

    Well, not really. If the quieter chassis cuts off enough dB to get it out of the disturbing range, it will be enough. It also depends on the environment... If you play in a loud environment (daytime, loud speakers), the noise won't be perceived as badly as if you play at night with quieter speakers. I.e., what can be tolerable during the day can turn into complete hell at night.

    That being said, I am by no means advocating +10dB, because it is a lot, but in the end it doesn't have to present such a terrible obstacle.

    It is very early; there can always be a bug in the drivers or BIOS causing this temp/noise issue, or it can be a design problem that cannot be circumvented. But that will be seen only after some time. I remember a bug in ForceWare that kept my old GTX 580 from dropping back to 2D frequencies once it kicked into 3D (or was it on the 8800 GT, I don't really remember)... You had to restart the machine. Such things simply can happen, which doesn't make them any better ofc.
  • Oxford Guy - Friday, February 8, 2019

    "If the quieter chassis cuts of enough dB to get it out of disturbing level it will be enough."

    Nope. I've owned the Antec P180. I have extensively modified cases and worked hard with placement to reduce noise.

    Your argument that the noise can simply be eliminated by putting it into a case is completely bogus. In fact, Silent PC Review showed that more airflow, from less restriction (i.e. a less closed-in case design) can substantially reduce GPU noise — the opposite of the P180 philosophy that Silent PC Review once advocated (and helped to design).

    The other problem for your argument is that it is 100% logically true that there is zero reason to purchase an inferior product. Since this GPU is not faster than a 2080 and costs the same, there is zero reason to buy a louder GPU; in actuality, noise doesn't just get absorbed and disappear when you put it into a case. In fact, this site wrote a review of a Seasonic PSU that could be heard "from rooms away", and I can hear noisy GPUs through walls, too.

    "It is very early, there can always be a bug in drivers or bios causing this temp/noise issue"

    Then it shouldn't be on the market and shouldn't have been sampled. Alpha quality designs shouldn't be review subjects, particularly when they're being passed off as the full product.
