Final Fantasy XV (DX11)

Upon arriving on PC in early 2018, Final Fantasy XV: Windows Edition was given a graphical overhaul in its port from console, the fruit of Square Enix's partnership with NVIDIA, with hardly any hint of the troubles of the game's original production and development.

In preparation for the launch, Square Enix opted to release a standalone benchmark, which they have since updated. The Final Fantasy XV standalone benchmark gives us a lengthy standardized sequence on which to utilize OCAT. Upon release, the benchmark received criticism for performance issues and general bugginess, as well as for its confusing graphical presets and its measurement of performance by 'score'. In its original iteration, graphical settings could not be adjusted individually, leaving the user to presets tied to resolution and to hidden settings such as GameWorks features.

Since then, Square Enix has patched the benchmark with custom graphics settings and bugfixes, allowing it to more accurately profile in-game performance and graphical options, though the 'score' measurement remains. For our testing, we enable or adjust settings to the highest quality, except for NVIDIA-specific features and 'Model LOD', the latter of which is left at standard. Final Fantasy XV also supports HDR, and it is slated to support DLSS at some later date.
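As a rough illustration of how the figures below are derived from a capture, here is a minimal sketch of turning an OCAT frame-time log into average and 99th-percentile framerates. It assumes OCAT's PresentMon-style CSV output with a MsBetweenPresents column (frame time in milliseconds); the file name is hypothetical.

    # Minimal sketch: average and 99th-percentile framerates from an OCAT capture.
    # Assumes a PresentMon-style CSV with a 'MsBetweenPresents' column; the
    # file name used below is hypothetical.
    import csv

    def summarize(path):
        with open(path, newline="") as f:
            frametimes = sorted(float(row["MsBetweenPresents"])
                                for row in csv.DictReader(f))
        # Average FPS: total frames divided by total seconds.
        avg_fps = 1000.0 * len(frametimes) / sum(frametimes)
        # 99th-percentile framerate: the frame time that 99% of frames beat,
        # expressed as FPS (this captures stutter that the average hides).
        p99_ms = frametimes[min(len(frametimes) - 1, int(len(frametimes) * 0.99))]
        return avg_fps, 1000.0 / p99_ms

    avg, p99 = summarize("OCAT-ffxv_s.exe-run1.csv")
    print(f"avg: {avg:.1f} fps / 99th percentile: {p99:.1f} fps")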

[Charts: Final Fantasy XV - 3840x2160 / 2560x1440 / 1920x1080 - Ultra Quality]

At 1080p and 1440p, the RTX 2060 (6GB) returns to its place between the GTX 1080 and GTX 1070 Ti. Final Fantasy XV is less favorable to the Vega cards, so the RTX 2060 (6GB) is already faster than the RX Vega 64. With the relative drop in performance at 4K, there are more hints that 6GB may be insufficient.

[Charts: Final Fantasy XV - 99th Percentile - 3840x2160 / 2560x1440 / 1920x1080 - Ultra Quality]

 


134 Comments


  • B3an - Monday, January 7, 2019 - link

    More overpriced useless shit. These reviews are very rarely harsh enough on this kind of crap either, and I mean tech media in general. This shit isn't close to being acceptable.
  • PeachNCream - Monday, January 7, 2019 - link

    Professionalism doesn't demand harshness. The charts and the pricing are reliable facts that speak for themselves and let a reader reach conclusions about the value proposition or the acceptability of the product as worthy of purchase. Since opinions between readers can differ significantly, it's better to exercise restraint. These GPUs are given out as media samples for free and, if I'm not mistaken, other journalists have been denied pre-NDA-lift samples for blasting the company or the product. With GPU shortages all around and the need to have a day-one review in order to get the search engine placement that drives traffic, there is incentive to tiptoe around criticism when possible.
  • CiccioB - Monday, January 7, 2019 - link

    It all depends on your definition of "shit".
    Shit may be something that costs too much for you (so Porsche, Lamborghini, and Ferrari are shit, but for someone else so are Audi, BMW, and Mercedes, and for someone else again, so are all C-segment cars), or it may be something that does not work as expected or underperforms relative to the resources it has.
    So for someone else, shit may be a chip that, with 230mm^2, 256GB/s of bandwidth, and 240W, performs like a chip that is 200mm^2, has 192GB/s of bandwidth, and uses half the power.
    Or it may be a chip that, with 480mm^2, 8GB of the latest HBM technology, and more than 250W, performs just a bit better than a 314mm^2 chip with GDDR5X that uses 120W less.

    To each their own definition of "shit", and of what should be bought to incentivize real technological progress.
  • saiga6360 - Tuesday, January 8, 2019 - link

    It's shit when your Porsche slows down when you turn on its fancy new features.
  • Retycint - Tuesday, January 8, 2019 - link

    The new feature doesn't subtract from its normal functions though - there is still an appreciable performance increase despite the focus on RTX and whatnot. Plus, you can simply turn RTX off and use it like a normal GPU? I don't see the issue here
  • saiga6360 - Tuesday, January 8, 2019 - link

    If you feel compelled to turn off the feature, then perhaps it is better to buy the alternative without it at a lower price. It comes down to how much the eye candy is worth to you at performance levels that you can get from a sub-$200 card.
  • CiccioB - Tuesday, January 8, 2019 - link

    It's shit when these fancy new features are held back by a console market that has difficulty handling less than half the polygons that Pascal can, let alone the new Turing GPUs.
    The problem is not the technology that is made available, but the market that is held back by obsolete "standards".
  • saiga6360 - Tuesday, January 8, 2019 - link

    You mean held back by economics? If Nvidia feels compelled to sell ray tracing in its infancy for thousands of dollars, what do you expect of console makers who are selling their hardware at a loss? Consoles sell games, and if the games are compelling without massive polygon counts and ray tracing, then the hardware limitations can be justified. Besides, this can hardly be said of modern consoles, which can push some form of 4K gaming at 30fps in AAA games that aren't even sold on PC. Ray tracing is nice to look at, but it hardly justifies the performance penalties at this price point.
  • CiccioB - Wednesday, January 9, 2019 - link

    The same may be said of 4K: fancy to see, but 4x the performance cost vs FullHD (3840x2160 is 8,294,400 pixels, exactly four times the 2,073,600 of 1920x1080) is too much.
    But as you can see, there are more and more people looking at 4K benchmarks to decide which card to buy.
    I would trade resolution for better graphics any day.
    Ray-traced films on Blu-ray (so in FullHD) look far better than any rasterized graphics at 4K.
    The path for graphics quality has been traced. Bear with it.
  • saiga6360 - Wednesday, January 9, 2019 - link

    4K vs ray tracing seems like an obvious choice to you, but people vote with their money, and right now 4K is the far less cost-prohibitive eye-candy choice. One company doing it alone will not solve this, especially at such a cost-to-performance ratio. We got to 4K and adaptive sync because they became affordable; it wasn't always so, but we are here now, and ray tracing is still just a fancy gimmick too expensive for most. Like it or not, it will take AMD and Intel getting on board for ray tracing in hardware across platforms, but before that, a game that truly shows the benefits of ray tracing. Preferably one that doesn't suck.
