Far Cry 5 (DX11)

The latest title in Ubisoft's Far Cry series lands us right into the unwelcoming arms of an armed militant cult in Montana, one of the many middles-of-nowhere in the United States. With a charismatic and enigmatic adversary, gorgeous landscapes of the northwestern American flavor, and lots of violence, it is classic Far Cry fare. Graphically intensive in an open-world environment, the game mixes in action and exploration.

Far Cry 5 supports Vega-centric features, namely Rapid Packed Math and Shader Intrinsics, and it also supports HDR (HDR10, scRGB, and FreeSync 2). This testing was done without HD Textures enabled, an option that was recently patched in.
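
For readers curious what Rapid Packed Math means in practice, the idea is that two FP16 operations are packed into a single 32-bit instruction slot, doubling FP16 throughput. Below is a minimal sketch of that packed-FP16 idea using CUDA's __half2 intrinsics; it is only an illustration of the concept, not the game's own shader code (Far Cry 5 is a DX11 title, so its shaders are HLSL, not CUDA), and the kernel name and values are made up for the example.

#include <cstdio>
#include <cuda_fp16.h>
#include <cuda_runtime.h>

// Each __half2 packs two FP16 values; __hfma2 computes a*b+c on both lanes
// at once, which is the same "two FP16 ops per instruction" idea as
// Rapid Packed Math. Build with: nvcc -arch=sm_53 (or newer) packed_fp16.cu
__global__ void fma_packed(const __half2* a, const __half2* b,
                           const __half2* c, __half2* out, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        out[i] = __hfma2(a[i], b[i], c[i]);
}

int main()
{
    const int n = 1 << 20;                    // 1M packed pairs = 2M FP16 values
    __half2 *a, *b, *c, *out;
    cudaMallocManaged(&a,   n * sizeof(__half2));
    cudaMallocManaged(&b,   n * sizeof(__half2));
    cudaMallocManaged(&c,   n * sizeof(__half2));
    cudaMallocManaged(&out, n * sizeof(__half2));
    for (int i = 0; i < n; ++i) {
        a[i] = __float2half2_rn(1.5f);        // broadcast 1.5 into both FP16 lanes
        b[i] = __float2half2_rn(2.0f);
        c[i] = __float2half2_rn(0.5f);
    }
    fma_packed<<<(n + 255) / 256, 256>>>(a, b, c, out, n);
    cudaDeviceSynchronize();
    printf("lane 0 of out[0] = %.2f (expect 3.50)\n",
           __half2float(__low2half(out[0])));
    return 0;
}

On hardware without double-rate FP16 the same math is simply issued as separate operations, which is why packed math is a throughput optimization rather than a visual feature.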

[Benchmark charts: Far Cry 5 - 3840x2160 / 2560x1440 / 1920x1080 - Ultra Quality]

The Far Cry 5 built-in benchmark isn't known for its sensitivity, and in this case the differences aren't drastic: the Radeon VII takes the lead at 4K but slips behind at lower resolutions. At 4K and 1440p, the takeaway is that the Radeon VII, GTX 1080 Ti FE, and RTX 2080 are more or less in the same league.


289 Comments


  • mapesdhs - Friday, February 8, 2019 - link

    It's going to be hilariously funny if the Ryzen 3000 series reverses this accepted norm. :)
  • mkaibear - Saturday, February 9, 2019 - link

    I'd not be surprised - given anandtech's love for AMD (take a look at the "best gaming CPUs" article released today...)

    Not really "hilariously funny", though. More "logical and methodical"
  • thesavvymage - Thursday, February 7, 2019 - link

    It's not like it'll perform any better though... Intel still has generally better gaming performance. There's no reason to artificially hamstring the card, as it introduces a CPU bottleneck.
  • brokerdavelhr - Thursday, February 7, 2019 - link

    Once again - in gaming for the most part... Try again with other apps and there is a marked difference, many of which are in AMD's favor. Try again...
  • jordanclock - Thursday, February 7, 2019 - link

    In every scenario that is worth testing a VIDEO CARD in, Intel CPUs offer the best performance.
  • ballsystemlord - Thursday, February 7, 2019 - link

    Their choice of processor is kind of strange. An 8-core Intel on *plain* 14nm, now 2! years old, with rather low clocks at 4.3GHz, is not ideal for a gaming setup. I would have used a 9900K or 2700X personally[1].
    For a content creator I'd be using a Threadripper or similar.
    Re-testing would be an undertaking for AT though. Probably too much to ask. Maybe next time they'll choose some saner processor.
    [1] The 9900K is 4.7GHz on all cores. The 2700X runs at 4.0GHz turbo, so you'd lose frequency, but then you could use faster RAM.
    For citations see:
    https://www.intel.com/content/www/us/en/products/p...
    https://images.anandtech.com/doci/12625/2nd%20Gen%...
    https://images.anandtech.com/doci/13400/9thGenTurb...
  • ToTTenTranz - Thursday, February 7, 2019 - link

    Page 3 table:
    - The MI50 uses a Vega 20, not a Vega 10.
  • Ryan Smith - Thursday, February 7, 2019 - link

    Thanks!
  • FreckledTrout - Thursday, February 7, 2019 - link

    I wonder why this card absolutely dominates in the "LuxMark 3.1 - LuxBall and Hotel" HDR test? It's pulling in numbers 1.7x higher than the RTX 2080 on that test. That's a funky outlier.
  • Targon - Thursday, February 7, 2019 - link

    How much video memory is used? That is the key. Since many games and benchmarks are set up to test with a fairly low amount of video memory being needed (so those 3GB 1050 cards can run the test), what happens when you try to load 10-15GB into video memory for rendering? Cards with 8GB and under (the majority) will suddenly look a lot slower in comparison.
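
The point raised in that last comment (whether a working set actually fits in VRAM) is easy to sanity-check outside of a game. Below is a minimal, hypothetical sketch using CUDA's cudaMemGetInfo to compare an assumed asset budget against free device memory; the 12 GiB figure is illustrative only and not taken from the review.

#include <cstdio>
#include <cuda_runtime.h>

int main()
{
    size_t free_bytes = 0, total_bytes = 0;
    cudaMemGetInfo(&free_bytes, &total_bytes);   // free vs. total device memory

    const double gib = double(1ull << 30);
    const size_t working_set = 12ull << 30;      // hypothetical 12 GiB asset budget

    printf("VRAM: %.1f GiB free of %.1f GiB total\n",
           free_bytes / gib, total_bytes / gib);

    if (working_set > free_bytes)
        printf("Working set exceeds free VRAM; expect spill-over to system RAM and a slowdown.\n");
    else
        printf("Working set fits in VRAM.\n");
    return 0;
}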
