Gaming Tests: Final Fantasy XV

Upon arriving on PC, Final Fantasy XV: Windows Edition was given a graphical overhaul as it was ported over from console. As a fantasy RPG with a long history, the game puts the fruits of Square Enix's successful partnership with NVIDIA on display. It uses the internal Luminous Engine, and as with other Final Fantasy titles, pushes the imagination of what can be done with the hardware underneath us. To that end, FFXV was one of the first games to promote the use of 'video game landscape photography', due in part to the extensive detail even at long range, but also to the integration of NVIDIA's Ansel software, which allows super-resolution imagery and post-processing effects to be applied.

In preparation for the launch of the game, Square Enix opted to release a standalone benchmark. Using the Final Fantasy XV standalone benchmark gives us a lengthy standardized sequence to record, although it should be noted that its heavy use of NVIDIA technology means that the Maximum setting has problems: it renders items off-screen. To get around this, we use the Standard preset, which does not exhibit these issues. We use the following settings:

  • 720p Standard, 1080p Standard, 4K Standard, 8K Standard

For automation, the title accepts command-line inputs for both resolution and settings, and auto-quits when finished. As with the other benchmarks, we repeat runs until 10 minutes have elapsed per resolution/setting combination, and then take averages. Realistically, because of the length of this test, this equates to two runs per setting.
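The timing logic described above is simple enough to sketch. Below is a minimal, hypothetical Python harness for this kind of loop: it repeats a run until the 10-minute budget is spent, then averages the results. The executable name, command-line switch names, and score parsing are placeholder assumptions, since the benchmark's actual CLI is not documented here.

```python
import time
from statistics import mean

def run_until_budget(run_once, budget_seconds=600):
    """Call run_once() repeatedly until at least budget_seconds of wall
    time have elapsed, then return the list of per-run results. With a
    benchmark sequence lasting several minutes, a 600-second budget
    works out to roughly two runs per resolution/setting combination."""
    results = []
    start = time.monotonic()
    while time.monotonic() - start < budget_seconds:
        results.append(run_once())
    return results

def summarize(fps_runs):
    """Average the per-run FPS figures across however many runs fit."""
    return mean(fps_runs)

# In a real harness, run_once would launch the benchmark with the
# resolution and quality preset on the command line (the switch names
# below are assumptions, not the benchmark's documented flags), wait for
# it to auto-quit, and parse the FPS figures the run leaves behind:
#   subprocess.run(["ffxv_benchmark.exe", "-res=1080p", "-preset=standard"],
#                  check=True)
```

Keeping the timing loop separate from the launcher means the same budget-and-average harness can be reused across every title in the suite.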

[Benchmark result graphs: AnandTech Low Resolution / Low Quality, Medium Resolution / Low Quality, High Resolution / Low Quality, and Medium Resolution / Max Quality, each reporting Average FPS and 95th Percentile frame rates.]

All of our benchmark results can also be found in our benchmark engine, Bench.

279 Comments

  • schujj07 - Wednesday, March 31, 2021 - link

    Intel 10nm is not TSMC 7nm.
  • watzupken - Wednesday, March 31, 2021 - link

    "What the he'll is that supposed to mean that you can't you can't get the frequency at 10 nm and therefore you have to stick with the 14 nm node? That's pure nonsense, AND is at 7 nm and they are getting the target frequencies. Maybe stop spreading the Coolaid and call a spade a spade...."

    I am not sure how true this is, but the clockspeeds for early versions of 10nm were abysmal. If you look at the first-gen 10nm chip from Intel, Cannon Lake, not only were clockspeeds low, the specs were bad. Second-gen 10nm, Ice Lake, shows a similar trend of very low clockspeeds. I am using an i5 Ice Lake U that is advertised with a base clock of 1GHz. It is only with 10nm SuperFin (third gen) that you start seeing higher clockspeeds. Also, yield with early 10nm was certainly an issue, or they would not have had to push out Rocket Lake at 14nm while laptops and (only recently) servers/workstations are on 10nm. I suspect Intel is pushing their 10nm down the same path as their current 14nm: feed it more power and push clockspeeds as high as possible. I will not be surprised if Alder Lake brings better performance with a max of 8 big cores, but power consumption at load may only see marginal improvements. Light loads may not expose the power inefficiency because the small cores will pick up the work.
  • boozed - Tuesday, March 30, 2021 - link

    There's some weirdness going on in at least one, possibly two of the FFXV 95th percentile graphs
  • watzupken - Wednesday, March 31, 2021 - link

    I feel I have to give Intel credit for moving forward with 14nm Rocket Lake, instead of hanging around like they did for the last 5 years with the same Skylake chip boosted with steroids. But evidently, 14nm is becoming a burden to their progress. I know Intel supporters will claim that 14nm is capable of competing with 7nm. On the surface, yes, but at the cost of massive power draw and heat output, with a regression in performance compared to the previous i9 in some cases. I would say the i5 is still a chip worth considering, but not the i7 or i9 if your main use case is gaming. At the respective price points, the price of an Intel i7 or i9 Rocket Lake chip alone appears cheap, but once you consider that you need a serious motherboard and cooler to keep the chip chugging at a high all-core clockspeed, the cost actually skyrockets.
    Personally, after looking at a number of Rocket Lake reviews, it seems to me it's a product that is too little and too late. Plus, if you are going for an i7 or i9, your upgrade path is dead since there will be no Rocket Lake with a higher core count. At least in the AMD camp, if you settled for a Ryzen 5 or 7, you may still have the option to scoop up a Ryzen 9 if prices come down with the introduction of Zen 4. In the absence of AMD chips at MSRP, I guess I will only recommend a Rocket Lake i5 because of the significant improvement over last gen. Otherwise, I don't think most will lose out much by going for the discounted Comet Lake chips.
  • Hifihedgehog - Wednesday, March 31, 2021 - link

    LOL. Keep dreaming...

    https://i.imgflip.com/53vqce.jpg
  • 529th - Wednesday, March 31, 2021 - link

    No chipset fans for their PCIe 4.0?
  • JMC2000 - Wednesday, March 31, 2021 - link

    Intel 500-series chipsets don't have PCI-E 4.0, only the CPU does.

    https://ark.intel.com/content/www/us/en/ark/produc...
  • yeeeeman - Wednesday, March 31, 2021 - link

    One of the few tech sites that remained professional and didn't use click-bait titles or disrespect Intel.
    Rocket Lake is clearly a stopgap and a product that doesn't make sense, but it is what it is, and as a professional tech writer you should treat it with decency, not write insulting words and call it poop like Hardware Unboxed did, for example.
  • XabanakFanatik - Wednesday, March 31, 2021 - link

    Ok Piednoel
  • Qasar - Wednesday, March 31, 2021 - link

    go see how well Gamers Nexus liked this cpu.
    intel deserves ALL the flak they get for this cpu, it's a joke, and a dud.
