Gaming: Final Fantasy XV

Upon arriving on PC earlier this year, Final Fantasy XV: Windows Edition was given a graphical overhaul as it was ported over from console, the fruits of Square Enix's successful partnership with NVIDIA, with hardly any hint of the troubles that marked Final Fantasy XV's original production and development.

In preparation for the launch, Square Enix opted to release a standalone benchmark, which it has since updated. The Final Fantasy XV standalone benchmark gives us a lengthy standardized sequence to record, although it should be noted that its heavy use of NVIDIA technology means the Maximum setting has problems: it renders items off screen. To get around this, we use the Standard preset, which does not have these issues.

Square Enix has since patched the benchmark with custom graphics settings and bug fixes, making it much more accurate at profiling in-game performance and graphical options. For our testing, we run the standard benchmark with a FRAPS overlay, taking a six-minute recording of the test.
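For readers who want to reproduce this kind of analysis from their own capture, the sketch below shows one way to turn a FRAPS frametime log into the two figures we chart, average FPS and a 95th percentile frame rate. This is not our internal tooling: the file name, the assumed two-column cumulative "Frame, Time (ms)" CSV layout, and the exact percentile convention are illustrative assumptions.

```python
# Hypothetical post-processing of a FRAPS "frametimes" CSV.
# Assumption: two columns per row, frame index and a cumulative timestamp in ms.
import csv
import statistics

def summarize_frametimes(path):
    # Read cumulative per-frame timestamps (ms) from the log.
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)                           # skip the header row
        stamps = [float(row[1]) for row in reader if row]

    # Frame durations are the differences between successive timestamps.
    durations = [b - a for a, b in zip(stamps, stamps[1:])]

    # Average FPS over the whole recording.
    avg_fps = 1000.0 * len(durations) / sum(durations)

    # 95th percentile frame time, converted to the frame rate
    # that roughly 95% of frames meet or exceed.
    p95_ms = statistics.quantiles(durations, n=100)[94]
    p95_fps = 1000.0 / p95_ms

    return avg_fps, p95_fps

if __name__ == "__main__":
    avg, p95 = summarize_frametimes("ffxv_frametimes.csv")  # hypothetical file name
    print(f"Average FPS: {avg:.1f}, 95th percentile FPS: {p95:.1f}")
```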

AnandTech CPU Gaming 2019 Game List
Game               Genre   Release Date   API    IGP             Low              Med           High
Final Fantasy XV   JRPG    Mar 2018       DX11   720p Standard   1080p Standard   4K Standard   8K Standard

All of our benchmark results can also be found in our benchmark engine, Bench.

[Charts: Final Fantasy XV Average FPS and 95th Percentile results at the IGP, Low, Medium, and High settings]


143 Comments


  • Cellar Door - Tuesday, November 13, 2018 - link

    The best part is that an i7 part (9800X) is more expensive than an i9 part (9900K). Intel smoking some good stuff.
  • DigitalFreak - Tuesday, November 13, 2018 - link

    You're paying more for those extra 28 PCI-E lanes
  • Hixbot - Tuesday, November 13, 2018 - link

    And much more L3. It's also interesting that HEDT is no longer behind in process node.
  • Hixbot - Tuesday, November 13, 2018 - link

    And AVX512
  • eastcoast_pete - Tuesday, November 13, 2018 - link

    @Ian: Thanks, good overview and review!
    Agree on the "iteration when an evolutionary upgrade was needed"; it seems that Intel's development was a lot more affected by its blocked/constipated transition to 10 nm (now scrapped), and the company's attention was also diverted by its forays into mobile (didn't work out so great) and looking for progress elsewhere (Altera acquisition). This current "upgrade" is mainly good for extra PCI-e lanes (nice to have more), but its performance is no better than the previous generation. If the new generation chips from AMD are halfway as good as they promise, Intel will lose a lot more profitable ground in the server and HEDT space to AMD.
    @Ian, and all: While Intel goes on about their improved FinFet 14 nm being the reason for better performance/Wh, I wonder how big the influence of better heat removal through the (finally again) soldered heat spreader is? Yes, most of us like to improve cooling to be able to overclock more aggressively, but shouldn't better cooling also improve the overall efficiency of the processor? After all, semiconductors conduct more current as they get hotter, leading to ever more heat and eventual "gate crashing". Have you or anybody else looked at performance/Wh between, for example, an i7 8700 with stock cooler and pasty glued heat spreader vs. the same processor with proper delidding, liquid metal replacement and a great aftermarket cooler, both at stock frequencies? I'd expect the better cooled setup to have more performance/Wh, but is that the case?
  • Arbie - Tuesday, November 13, 2018 - link

    The "Competition" chart is already ghastly for Intel. Imagine how much worse it will be when AMD moves to 7 nm with Zen 2.
  • zepi - Tuesday, November 13, 2018 - link

    How about including some kind of DB test?

    I think quite a few people are looking at these workstation-class CPUs to develop BI things, and it might be quite helpful to actually measure results with some SQL / NoSQL / BI suites. Assuming somewhat more complex parallel SQL executions with locking, these could show some interesting differences between the NUMA Threadrippers and Intel's parts.
  • GreenReaper - Wednesday, November 14, 2018 - link

    It's a good idea; Phoronix does them, so in the short term you could probably look there.
  • jospoortvliet - Friday, November 16, 2018 - link

    But then make sure it is realistic, not running in cache or such... A real DB suitable for these chips is terabytes, with merely the index kept in RAM... rule of thumb: if your index fits in cache, your database doesn't need this CPU ;-)
  • FunBunny2 - Tuesday, November 13, 2018 - link

    I guess I can run my weather simulation in Excel on my personal machine now. neato.
