Final Fantasy XV (DX11)

Upon arriving on PC earlier this year, Final Fantasy XV: Windows Edition was given a graphical overhaul as it was ported over from console, the fruit of Square Enix's successful partnership with NVIDIA, with hardly any hint of the troubles during Final Fantasy XV's original production and development.

In preparation for the launch, Square Enix opted to release a standalone benchmark, which it has since updated. Using the Final Fantasy XV standalone benchmark gives us a lengthy standardized sequence on which to utilize OCAT. Upon release, the standalone benchmark received criticism for performance issues and general bugginess, as well as confusing graphical presets and performance measurement by 'score'. In its original iteration, the graphical settings could not be adjusted, leaving users with presets that were tied to resolution and hidden settings such as GameWorks features.

Since then, Square Enix has patched the benchmark with custom graphics settings and bugfixes, making it more accurate for profiling in-game performance and graphical options, though the 'score' measurement remains. For our testing, we enable or adjust settings to their highest values except for NVIDIA-specific features and 'Model LOD', the latter of which is left at standard. Final Fantasy XV also supports HDR, and it will support DLSS at some later date.
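
As a side note on measurement: OCAT records per-frame frametimes for the benchmark run, and the average and 99th percentile framerates charted below are derived from that capture. The snippet below is a minimal sketch of that kind of post-processing, assuming a PresentMon/OCAT-style CSV with a "MsBetweenPresents" column; the column name and file name are illustrative assumptions rather than a description of our exact tooling.

```python
# Minimal sketch (not our exact tooling) of turning an OCAT frametime
# capture into the numbers charted below: average framerate and the
# 99th percentile framerate. Assumes a PresentMon/OCAT-style CSV with a
# "MsBetweenPresents" column of per-frame frametimes in milliseconds.
import csv
import statistics

def summarize_capture(path):
    with open(path, newline="") as f:
        frametimes_ms = [float(row["MsBetweenPresents"])
                         for row in csv.DictReader(f)]

    # Average framerate over the whole run.
    avg_fps = 1000.0 / statistics.mean(frametimes_ms)

    # The "99th percentile framerate" is the framerate implied by the
    # 99th percentile frametime, i.e. the slowest ~1% of frames.
    p99_frametime = sorted(frametimes_ms)[int(0.99 * len(frametimes_ms))]
    p99_fps = 1000.0 / p99_frametime

    return avg_fps, p99_fps

if __name__ == "__main__":
    # Hypothetical capture file name, for illustration only.
    avg, p99 = summarize_capture("ffxv_1440p_ultra.csv")
    print(f"Average: {avg:.1f} fps, 99th percentile: {p99:.1f} fps")
```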

[Charts: Final Fantasy XV average and 99th percentile framerates at 2560x1440 and 1920x1080, Ultra Quality]

Final Fantasy XV is another strong title for NVIDIA across the board, and the GTX 1660 sits in a very comfortable slot between the GTX 1660 Ti and RX 590. The GTX 1060 6GB lost out to the RX 590 here, but the GTX 1660 overtakes it. Judging by the 99th percentiles, the GTX 1060 3GB is clearly struggling with its limited 3GB framebuffer at 1440p.

Comments

  • Qasar - Thursday, March 14, 2019 - link

    it also depends on whether the consoles even have any games someone would want to play... for me, those games are not on consoles, they are on a comp... not worth it for me to buy a console as it would just sit under my TV unused.
  • D. Lister - Saturday, March 16, 2019 - link

    @eva02langley: "...console hardware is more efficient since it is dedicated for gaming only."

    smh... console hardware used to be more efficient for gaming when it was composed of custom parts. Now, consoles use essentially the same parts as PCs, so that argument doesn't work anymore.

    Fact of the matter is, consoles remain competitive in framerate by either cutting down on internal resolution or on graphics quality features like AA, AF, and AO, or in many cases both res and features. Take a look at the face-offs conducted by Digital Foundry over at Eurogamer.net.
  • D. Lister - Saturday, March 16, 2019 - link

    @eva02langley: I also find it rather ironic that you, who have often criticized NVIDIA for not being open enough with their technologies, are making a case here for consoles that are completely proprietary and closed-off systems.
  • maroon1 - Thursday, March 14, 2019 - link

    Faster, consumes much less power, smaller, and produces less noise than the RX 590, which costs the same.

    Even if you ignore the performance advantage, the GTX 1660 is still the better of the two. No reason to buy a big, power-hungry GPU when it has no performance advantage.
  • 0ldman79 - Thursday, March 14, 2019 - link

    What is it with Wolfenstein that kills the 900 series?

    I mean they're still competitive in almost everything else, but Wolfenstein just buries the 900 series horribly. If it's that bad I'm glad I'm not addicted to that series. I had thought about picking up a copy, but damn...
  • Opencg - Thursday, March 14, 2019 - link

    they may be using some async techniques. The famous example is Doom, where many 900 series cards saw worse performance on Vulkan due to async being a CPU-based, driver-side implementation.
  • Dribble - Thursday, March 14, 2019 - link

    I think it's because it has FP16 code in the shaders, which Turing and newer AMD cards have hardware support for, but Pascal doesn't. It was AMD's trump card until Turing, so you'll find a few AMD-sponsored games using FP16.
  • Ryan Smith - Thursday, March 14, 2019 - link

    "What is it with Wolfenstein that kills the 900 series?"

    Memory capacity. It really wants more than 4GB when all of its IQ settings are cranked up, which leaves everything below the GTX 980 Ti a bit short on space.
  • AustinPowersISU - Thursday, March 14, 2019 - link

    A used GTX 1070 still makes the most sense. You can easily get one for less than this card and have much better performance.

    Nvidia needs to do better.
  • eva02langley - Thursday, March 14, 2019 - link

    It is still their best offering in terms of price/performance from Turing. However, yeah, that should have happened much sooner.
