The NVIDIA GeForce RTX 2080 Ti & RTX 2080 Founders Edition Review: Foundations For A Ray Traced Future
by Nate Oh on September 19, 2018 5:15 PM EST
Final Fantasy XV (DX11)
Upon arriving on PC earlier this year, Final Fantasy XV: Windows Edition was given a graphical overhaul as it was ported over from console, one of the fruits of Square Enix's partnership with NVIDIA, with hardly any hint of the troubles during Final Fantasy XV's original production and development.
In preparation for the launch, Square Enix opted to release a standalone benchmark that they have since updated. Using the Final Fantasy XV standalone benchmark gives us a lengthy standardized sequence on which to utilize OCAT. Upon release, the standalone benchmark received criticism for performance issues and general bugginess, as well as confusing graphical presets and performance measurement by 'score'. In its original iteration, the graphical settings could not be adjusted, leaving the user to presets that were tied to resolution, along with hidden settings such as GameWorks features.
Since then, Square Enix has patched the benchmark with custom graphics settings and bugfixes to be much more accurate in profiling in-game performance and graphical options, though it retains the 'score' measurement. For our testing, we enable or adjust settings to the highest except for NVIDIA-specific features and 'Model LOD', the latter of which is left at standard. Final Fantasy XV also supports HDR, and it will support DLSS at a later date.
| Final Fantasy XV | 1920x1080 | 2560x1440 | 3840x2160 |
|------------------|-----------|-----------|-----------|
| Average FPS      |           |           |           |
| 99th Percentile  |           |           |           |
NVIDIA, of course, is working closely with Square Enix, and the game is naturally expected to run well on NVIDIA cards in general, but the 1080 Ti truly lives up to its gaming flagship reputation in matching the RTX 2080. With Final Fantasy XV, the Founders Edition power and clocks again prove highly useful, with the 2080 FE pipping the 1080 Ti, while the 2080 Ti FE makes it across the psychological 60fps mark at 4K.
337 Comments
dustwalker13 - Sunday, September 23, 2018 - link
Way too expensive. If those cards were the same price as the 1080 / 1080 Ti it would be a generation change. This actually is a regression: the price has increased out of every proportion, even if you completely were to ignore that newer generations are EXPECTED to be a lot faster than the older ones for essentially the same price (plus a bit of inflation max).
paying 50% more for a 30% increase is a simple ripoff. no one should buy these cards ... sadly a lot of people will let themselves get ripped off once more by nvidia.
and no: raytracing is not an argument here, this feature is not supported anywhere and by the time it will be adopted (if ever) years will have gone by and these cards will be old and obsolete. all of this is just marketing and hot air.
mapesdhs - Thursday, September 27, 2018 - link
There are those who claim buying RTX to get the new features is sensible for future proofing; I don't understand why they ignore that the performance of said features on these cards is so poor. NVIDIA spent years pushing gamers into high frequency monitors, 4K and VR, now they're trying to flip everyone round the other way, pretend that sub-60Hz 1080p is ok. And btw, it's a lot more than 50%. Where I am (UK) the 2080 Ti is almost 100% more than the 1080 Ti launch price. It's also crazy that the 2080 Ti does not have more RAM, it should have had at least 16GB. My guess is NVIDIA knows that by pushing gamers back down to lower resolutions, they don't need so much VRAM. People though have gotten used to the newer display tech, and those who've adopted high refresh displays physically cannot go back.
milkod2001 - Monday, September 24, 2018 - link
I wonder if NV did not hike prices so much on purpose so they don't have to lower existing GTX prices that much. There is still a ton of GTX in stock.
poohbear - Friday, September 28, 2018 - link
Why the tame conclusion? Do us a favor will u? Just come out and say you'd have to be mental to pay these prices for features that aren't even available yet, and you'd be better off buying previous gen video cards that are currently heavily discounted.
zozaino - Thursday, October 4, 2018 - link
i really want to use it
Luke212 - Wednesday, October 24, 2018 - link
Why does the 2080ti not use the tensor cores for GEMM? I have not seen any benchmark anywhere showing it working. It would be a good news story if Nvidia secretly gimped the tensor cores.