Final Fantasy XV (DX11)

Upon arriving on PC earlier this year, Final Fantasy XV: Windows Edition was given a graphical overhaul as it was ported over from console, the fruits of Square Enix's successful partnership with NVIDIA, with hardly a hint of the troubles that marked Final Fantasy XV's original production and development.

In preparation for the launch, Square Enix opted to release a standalone benchmark, which it has since updated. The Final Fantasy XV standalone benchmark gives us a lengthy standardized sequence on which to utilize OCAT. Upon release, the benchmark received criticism for performance issues and general bugginess, as well as for its confusing graphical presets and its measurement of performance by 'score'. In its original iteration, the graphical settings could not be adjusted, leaving the user with presets tied to resolution, along with hidden settings such as GameWorks features.

Since then, Square Enix has patched the benchmark with custom graphics settings and bugfixes so that it more accurately profiles in-game performance and graphical options, though the 'score' measurement remains. For our testing, we enable or adjust all settings to their highest values except for NVIDIA-specific features and 'Model LOD', the latter of which is left at Standard. Final Fantasy XV also supports HDR, and it will support DLSS at a later date.
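
Because OCAT (which builds on PresentMon) logs per-frame present intervals to a CSV, the average and 99th percentile figures charted below can be derived directly from such a capture. A minimal post-processing sketch, assuming a PresentMon-style 'MsBetweenPresents' column and a hypothetical capture filename:

```python
# Minimal sketch of post-processing an OCAT frametime capture.
# The "MsBetweenPresents" column name and the file path are assumptions
# based on the PresentMon CSV format that OCAT builds on.
import csv
import statistics

def summarize(path):
    with open(path, newline="") as f:
        frametimes_ms = [float(row["MsBetweenPresents"]) for row in csv.DictReader(f)]

    avg_fps = 1000.0 / statistics.mean(frametimes_ms)
    # 99th percentile framerate: take the frametime that 99% of frames beat
    # (i.e. the slowest 1% boundary) and convert it back to frames per second.
    p99_frametime = sorted(frametimes_ms)[int(len(frametimes_ms) * 0.99)]
    p99_fps = 1000.0 / p99_frametime
    return avg_fps, p99_fps

avg, p99 = summarize("OCAT-FFXV-capture.csv")  # hypothetical capture file
print(f"Average: {avg:.1f} fps, 99th percentile: {p99:.1f} fps")
```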

Final Fantasy XV - 3840x2160 - Ultra Quality

Final Fantasy XV - 2560x1440 - Ultra Quality

Final Fantasy XV - 1920x1080 - Ultra Quality

Moving on to Final Fantasy XV, the Radeon VII's showing here is one of its least ideal scenarios. The game has historically performed well on NVIDIA hardware, so the RTX and GTX performance levels are well established. With a lot of ground to cover from the RX Vega 64's starting point, the Radeon VII does well, pushing up to a 34% speedup at 4K and 28% at 1440p. While that is enough to overtake the reference RTX 2070 at 4K and 1440p, the RTX 2080 and GTX 1080 Ti FE remain out of reach.
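
For clarity on how those figures are derived, the speedup percentages are simple ratios of average framerates. A quick sketch with hypothetical numbers (not the measured results):

```python
# Hypothetical framerates for illustration only, not measured results.
rx_vega_64_fps = 30.0
radeon_vii_fps = 40.2

speedup_pct = (radeon_vii_fps / rx_vega_64_fps - 1) * 100
print(f"Radeon VII speedup over RX Vega 64: {speedup_pct:.0f}%")  # ~34%
```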

Final Fantasy XV - 99th Percentile - 3840x2160 - Ultra Quality

Final Fantasy XV - 99th Percentile - 2560x1440 - Ultra Quality

Final Fantasy XV - 99th Percentile - 1920x1080 - Ultra Quality

289 Comments

  • tipoo - Sunday, February 10, 2019 - link

    It's MI50
  • vanilla_gorilla - Thursday, February 7, 2019 - link

    As a Linux prosumer who does light gaming, this card is a slam dunk for me.
  • LogitechFan - Friday, February 8, 2019 - link

    and a noisy one at that
  • BaneSilvermoon - Thursday, February 7, 2019 - link

    Meh, I went looking for a 16GB card about a week before they announced the Radeon VII because gaming was using up all 8GB of VRAM and 14GB of system RAM. This card is a no-brainer upgrade from my Vega 64.
  • LogitechFan - Friday, February 8, 2019 - link

    lemme guess, you're playing sandstorm?
  • Gastec - Tuesday, February 12, 2019 - link

    I was beginning to think that the "money" was in cryptocurrency mining with video cards, but I guess after the €1500+ RTX 2080 Ti I should reconsider :)
  • eddman - Thursday, February 7, 2019 - link

    Perhaps, but Turing is also a new architecture, so it's probable it'd get better with newer drivers too.

    Maxwell is from 2014 and still performs as it should.

    As for GPU-accelerated GameWorks, obviously NVIDIA is optimizing it for their own cards only, but that doesn't mean they actively modify the code to make it perform worse on AMD cards; not to mention that would be illegal. (GPU-only GameWorks effects can be disabled in game options if need be.)

    Many (most?) games just utilize the CPU-only GameWorks modules, so there's no performance difference between cards.
  • ccfly - Tuesday, February 12, 2019 - link

    You're joking, right?
    The first game where they did just that was Crysis (they hid models under water so ATI cards would render them too and be slower), and after that they cheated full time...
  • eddman - Tuesday, February 12, 2019 - link

    No, I'm not.

    There was no proof of misconduct in Crysis 2's case, just baseless rumors.

    For all we know, it was an oversight on Crytek's part. Also, DX11 was an optional feature, meaning it wasn't part of the game's main code, as I've stated.
  • eddman - Tuesday, February 12, 2019 - link

    ... I mean an optional toggle for Crysis 2. The game could be run in DX9 mode.
