Final Fantasy XV (DX11)

Upon arriving on PC in early 2018, Final Fantasy XV: Windows Edition received a graphical overhaul as it was ported over from console, the fruit of Square Enix's successful partnership with NVIDIA, with hardly any hint of the troubles that plagued Final Fantasy XV's original production and development.

In preparation for the launch, Square Enix opted to release a standalone benchmark, which they have since updated. The Final Fantasy XV standalone benchmark gives us a lengthy standardized sequence on which to utilize OCAT. Upon release, the standalone benchmark received criticism for performance issues and general bugginess, as well as for its confusing graphical presets and its opaque 'score'-based performance measurement. In its original iteration, graphical settings could not be adjusted, leaving the user with presets tied to resolution and hidden settings such as GameWorks features.

Since then, Square Enix has patched the benchmark with custom graphics settings and bugfixes so that it more accurately profiles in-game performance and graphical options, though the 'score' measurement remains. For our testing, we set everything to the highest setting except for NVIDIA-specific features and 'Model LOD', the latter of which is left at Standard. Final Fantasy XV also supports HDR, and it will support DLSS at some later date.
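As an aside on methodology: OCAT (a front end over PresentMon) logs per-frame present intervals to a CSV, and both the average and 99th percentile figures charted below can be derived from such a capture. Below is a minimal Python sketch, assuming the usual 'MsBetweenPresents' column of a PresentMon-style log; the capture file name is hypothetical, and the exact CSV layout should be checked against your OCAT version.

    import csv
    import statistics

    def fps_summary(path):
        """Summarize a PresentMon/OCAT-style frametime capture.

        Assumes a CSV with an 'MsBetweenPresents' column holding the
        time in milliseconds between successive frame presents.
        """
        with open(path, newline="") as f:
            frametimes_ms = [float(row["MsBetweenPresents"])
                             for row in csv.DictReader(f)]

        # Average framerate: reciprocal of the mean frametime.
        avg_fps = 1000.0 / statistics.mean(frametimes_ms)

        # The "99th percentile" framerate reported in reviews is the
        # framerate corresponding to the frametime that 99% of frames
        # beat -- a worst-case smoothness metric rather than an average.
        p99_frametime = sorted(frametimes_ms)[int(0.99 * len(frametimes_ms))]
        p99_fps = 1000.0 / p99_frametime

        return avg_fps, p99_fps

    if __name__ == "__main__":
        avg, p99 = fps_summary("ffxv_run1.csv")  # hypothetical capture file
        print(f"Average: {avg:.1f} fps, 99th percentile: {p99:.1f} fps")

Because the percentile metric keys on the slowest frames, stutter-prone configurations can fall much further down the percentile charts than their averages alone would suggest.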

[Charts: Final Fantasy XV at 2560x1440 and 1920x1080 (Ultra Quality), average and 99th percentile frame rates]

Final Fantasy XV is another strong title for NVIDIA across the board, and the GTX 1660 Ti not only surpasses the RX 590 and RX Vega 56, but comes very close to the RX Vega 64.

The GTX 960 is clearly out of its element, and given the 99th percentiles it's fair to say that the 2GB framebuffer shoulders a good amount of the blame. By comparison, this makes the GTX 1660 Ti look exceedingly good at offering basically triple the performance (and amusingly, triple the VRAM).

157 Comments

  • Psycho_McCrazy - Tuesday, February 26, 2019 - link

    Given that 21:9 monitors are also making great inroads into the gamer's purchase lists, can benchmark resolutions also include 2560.1080p, 3440.1440p and (my wishlist) 3840.1600p benchies??
  • eddman - Tuesday, February 26, 2019 - link

    2560x1080, 3440x1440 and 3840x1600

That's how you write it, and the "p" should not be used when stating the full resolution, since it's only supposed to be used for denoting video format resolutions.

    P.S. using 1080p, etc. for display resolutions isn't technically correct either, but it's too late for that.
  • Ginpo236 - Tuesday, February 26, 2019 - link

    A 3-slot ITX-sized graphics card. What ITX case can support this? Zero.
  • bajs11 - Tuesday, February 26, 2019 - link

    Why can't they just make a GTX 2080 Ti with the same performance as the RTX 2080 Ti, but without the useless RT and DLSS, and charge something like 899 USD (still 100 bucks more than the GTX 1080 Ti)?
    I bet it would sell like hotcakes, or at least better than their overpriced RTX 2080 Ti.
  • peevee - Tuesday, February 26, 2019 - link

    Do I understand correctly that this thing does not have PCIe4?
  • CiccioB - Thursday, February 28, 2019 - link

    No, it does not have a PCIe 4 bus.
    Do you think it should?
  • Questor - Wednesday, February 27, 2019 - link

    Why do I feel like this was a panic plan in an attempt to bandage the bleeding from the RTX failure? No support at launch, and months later still abysmal support, for a non-game-changing and insanely expensive technology.

    I am not falling for it.
  • CiccioB - Thursday, February 28, 2019 - link

    Yes, a "panic plan" that required about 3 years to create the chips.
    3 years ago they already know that they would have panicked at the RTX cards launch and so they made the RT-less chip as well. They didn't know that the RT could not be supported in performance with the low number of CUDA core low level cards have.
    They didn't know that the concurrent would have played with the only weapon it was left to it to battle, that is prize as they could not think that the concurrent was not ready with a beefed up architecture capable of the sa functionalities.
    So, yes, they panicked for sure. They were not prepared to anything of what is happening,
  • Korguz - Friday, March 1, 2019 - link

    " that required about 3 years to create the chips.
    3 years ago they already know that they would have panicked at the RTX cards launch and so they made the RT-less chip as well. They didn't know that the RT could not be supported in performance with the low number of CUDA core low level cards have. "

    and where did you read this ? you do understand, and realize... is IS possible to either disable, or remove parts of an IC with out having to spend " about 3 years " to create the product, right ? intel does it with their IGP in their cpus, amd did it back in the Phenom days with chips like the Phenom X4 and X3....
  • CiccioB - Tuesday, March 5, 2019 - link

    So they created the TU116, a completely new die without RT and Tensor cores, to reduce die size while losing about 15% of performance relative to the 2060, all in 3 months, because they panicked?
    You probably have no idea of the effort it takes to create a new 280mm^2 die.
    Well, judging by this and your previous posts, you have no idea what you are talking about at all.
