Benchmarking Testbed Setup

To preface, because of the SMU changes mentioned earlier, no third-party utilities can currently read Radeon VII telemetry, though patches are expected shortly. AIB partner tools such as MSI Afterburner should presumably launch with support. In the meantime, Radeon Wattman was the only monitoring tool available, except we observed that recording its performance metric log and enabling the overlay sometimes caused issues with games.

On that note, a large factor in this review was the instability of the press drivers. Known issues include being unable to downclock HBM2 on the Radeon VII, which AMD clarified was a bug introduced in Adrenalin 2019 19.2.1, as well as system crashes when the Wattman voltage curve is set to a single min/max point. There are also DX11 game crashes, which we ran into early on and which AMD is also investigating.

For these reasons, we won't have Radeon VII clockspeed or overclocking data for this review. Simply put, these types of issues are mildly concerning; while Vega 20 is new to gamers, it is not new to drivers, and if the Radeon VII was indeed always in the plan, then game stability should have been a priority. Despite being a bit of a prosumer card, the Radeon VII is still AMD's new flagship gaming card. There's no indication that these are more than teething issues, but it does lend a little credence to the idea that the Radeon VII was launched as soon as feasibly possible.

Test Setup
CPU: Intel Core i7-7820X @ 4.3GHz
Motherboard: Gigabyte X299 AORUS Gaming 7 (F9g)
PSU: Corsair AX860i
Storage: OCZ Toshiba RD400 (1TB)
Memory: G.Skill TridentZ DDR4-3200, 4 x 8GB (16-18-18-38)
Case: NZXT Phantom 630 Windowed Edition
Monitor: LG 27UD68P-B
Video Cards: AMD Radeon VII
             AMD Radeon RX Vega 64 (Air)
             AMD Radeon R9 Fury X
             NVIDIA GeForce RTX 2080
             NVIDIA GeForce RTX 2070
             NVIDIA GeForce GTX 1080 Ti
Video Drivers: NVIDIA Release 417.71
               AMD Radeon Software 18.50 Press
OS: Windows 10 x64 Pro (1803), Spectre and Meltdown patched

Thanks to Corsair, we were able to get a replacement for our AX860i. While the plan was to use Corsair Link as an additional datapoint for power consumption, for the reasons mentioned above that was not feasible this time. On that note, power consumption figures will differ from earlier GPU 2018 Bench data.

In the same vein, for Ashes, GTA V, F1 2018, and Shadow of War, we've updated some of the benchmark automation and data processing steps, so results may vary at the 1080p mark compared to previous GPU 2018 data.
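To illustrate the kind of data-processing step such benchmark automation typically involves (this is a hypothetical sketch, not AnandTech's actual pipeline), here is a minimal routine that turns a log of per-frame render times into an average FPS figure and a 99th-percentile frame time:

```python
def summarize_frametimes(frametimes_ms):
    """Summarize a list of per-frame render times (in milliseconds).

    Returns (average FPS, 99th-percentile frame time in ms).
    This is an illustrative sketch; real benchmark pipelines also handle
    warm-up trimming, multiple runs, and outlier filtering.
    """
    if not frametimes_ms:
        raise ValueError("no frames recorded")
    total_seconds = sum(frametimes_ms) / 1000.0
    avg_fps = len(frametimes_ms) / total_seconds
    ordered = sorted(frametimes_ms)
    # Nearest-rank 99th percentile of frame time (higher = worse stutter)
    idx = max(0, min(len(ordered) - 1, round(0.99 * len(ordered)) - 1))
    return avg_fps, ordered[idx]

# Example: five frames, one of which is a stutter spike
avg, p99 = summarize_frametimes([16.7, 16.6, 17.0, 33.4, 16.8])
```

Changing any step like this (percentile method, warm-up handling, run averaging) is exactly why refreshed automation can shift results slightly against older data.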

289 Comments

  • mapesdhs - Friday, February 8, 2019 - link

    It's going to be hilariously funny if Ryzen 3000 series reverses this accepted norm. :)
  • mkaibear - Saturday, February 9, 2019 - link

    I'd not be surprised - given AnandTech's love for AMD (take a look at the "best gaming CPUs" article released today...)

    Not really "hilariously funny", though. More "logical and methodical"
  • thesavvymage - Thursday, February 7, 2019 - link

    It's not like it'll perform any better though... Intel still has generally better gaming performance. There's no reason to artificially hamstring the card, as it introduces a CPU bottleneck
  • brokerdavelhr - Thursday, February 7, 2019 - link

    Once again - in gaming for the most part... try again with other apps and there is a marked difference. Many of which are in AMD's favor. Try again...
  • jordanclock - Thursday, February 7, 2019 - link

    In every scenario that is worth testing a VIDEO CARD, Intel CPUs offer the best performance.
  • ballsystemlord - Thursday, February 7, 2019 - link

    Their choice of processor is kind of strange. An 8-core Intel on *plain* 14nm, now 2! years old, with rather low clocks at 4.3GHz, is not ideal for a gaming setup. I would have used a 9900K or 2700X personally[1].
    For a content creator I'd be using a Threadripper or similar.
    Re-testing would be an undertaking for AT though. Probably too much to ask. Maybe next time they'll choose some saner processor.
    [1] 9900K is 4.7GHz all cores. The 2700X runs at 4.0GHz turbo, so you'd lose frequency, but then you could use faster RAM.
    For citations see:
    https://www.intel.com/content/www/us/en/products/p...
    https://images.anandtech.com/doci/12625/2nd%20Gen%...
    https://images.anandtech.com/doci/13400/9thGenTurb...
  • ToTTenTranz - Thursday, February 7, 2019 - link

    Page 3 table:
    - The MI50 uses a Vega 20, not a Vega 10.
  • Ryan Smith - Thursday, February 7, 2019 - link

    Thanks!
  • FreckledTrout - Thursday, February 7, 2019 - link

    I wonder why this card absolutely dominates in the "LuxMark 3.1 - LuxBall and Hotel" HDR test? It's pulling in numbers 1.7x higher than the RTX 2080 on that test. That's a funky outlier.
  • Targon - Thursday, February 7, 2019 - link

    How much video memory is used? That is the key. Since many games and benchmarks are set up to test with a fairly low amount of video memory being needed (so those 3GB 1050 cards can run the test), what happens when you try to load 10-15GB into video memory for rendering? Cards with 8GB and under (the majority) will suddenly look a lot slower in comparison.
