The NVIDIA GeForce RTX 2080 Ti & RTX 2080 Founders Edition Review: Foundations For A Ray Traced Future
by Nate Oh on September 19, 2018 5:15 PM EST - Posted in
- GPUs
- Raytrace
- GeForce
- NVIDIA
- DirectX Raytracing
- Turing
- GeForce RTX
Total War: Warhammer II (DX11)
Last in our 2018 game suite is Total War: Warhammer II, built on the same engine as Total War: Warhammer. While there is a more recent Total War title, Total War Saga: Thrones of Britannia, that game was built on the 32-bit version of the engine. The first TW: Warhammer was a DX11 game that was, to some extent, developed with DX12 in mind, with preview builds showcasing DX12 performance. In Warhammer II, however, the matter appears to have been dropped: DX12 mode is still marked as beta and also shows a performance regression for both vendors.
It's unfortunate because Creative Assembly themselves have acknowledged the CPU-bound nature of their games, and with re-use of game engines as spin-offs, DX12 optimization would have continued to provide benefits, especially if the future of graphics in RTS-type games will lean towards low-level APIs.
There are now three benchmarks with varying graphics and processor loads; we've opted for the Battle benchmark, which appears to be the most graphics-bound.
Total War: Warhammer II - Average FPS (1920x1080 / 2560x1440 / 3840x2160)
At 1080p, the cards quickly run into the CPU bottleneck, which is to be expected with top-tier video cards and the CPU-intensive nature of RTSes. The Founders Edition power and clock tweaks prove less useful here at 4K, but the models otherwise fall into the expected 1-2-3 lineup of 2080 Ti, 2080, and 1080 Ti, with the latter two roughly on par and the 2080 Ti pushing further ahead.
337 Comments
dustwalker13 - Sunday, September 23, 2018 - link
Way too expensive. If these cards were the same price as the 1080 / Ti, it would be a generational change. This is actually a regression: the price has increased out of all proportion, even if you were to completely ignore that newer generations are expected to be a lot faster than the older ones for essentially the same price (plus a bit of inflation at most).
Paying 50% more for a 30% increase is a simple ripoff. No one should buy these cards... sadly, a lot of people will let themselves get ripped off once more by NVIDIA.
And no: ray tracing is not an argument here. The feature is not supported anywhere yet, and by the time it is adopted (if ever), years will have gone by and these cards will be old and obsolete. All of this is just marketing and hot air.
mapesdhs - Thursday, September 27, 2018 - link
There are those who claim buying RTX to get the new features is sensible for future-proofing; I don't understand why they ignore that the performance of said features on these cards is so poor. NVIDIA spent years pushing gamers towards high refresh rate monitors, 4K, and VR; now they're trying to flip everyone around the other way and pretend that sub-60Hz 1080p is ok. And btw, it's a lot more than 50%. Where I am (UK), the 2080 Ti is almost 100% more than the 1080 Ti launch price. It's also crazy that the 2080 Ti does not have more RAM; it should have had at least 16GB. My guess is NVIDIA knows that by pushing gamers back down to lower resolutions, they don't need as much VRAM. People, though, have gotten used to the newer display tech, and those who've adopted high refresh displays physically cannot go back.
milkod2001 - Monday, September 24, 2018 - link
I wonder if NV hiked prices so much on purpose so they don't have to lower existing GTX prices as much. There is still a ton of GTX stock.
poohbear - Friday, September 28, 2018 - link
Why the tame conclusion? Do us a favor, will you? Just come out and say you'd have to be mental to pay these prices for features that aren't even available yet, and that you'd be better off buying previous-generation video cards that are currently heavily discounted.
zozaino - Thursday, October 4, 2018 - link
i really want to use it
Luke212 - Wednesday, October 24, 2018 - link
Why does the 2080ti not use the tensor cores for GEMM? I have not seen any benchmark anywhere showing it working. It would be a good news story if Nvidia secretly gimped the tensor cores.
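For anyone wanting to probe this themselves, below is a minimal sketch of how a tensor-core GEMM could be timed through cuBLAS on a Turing card. The matrix size, the FP16-input / FP32-accumulate layout, and the choice of CUBLAS_GEMM_DEFAULT_TENSOR_OP are illustrative assumptions for a CUDA 10-era toolkit, not measurements from this review.
```c
// Hypothetical sketch: timing a tensor-core GEMM via cuBLAS (CUDA 10-era API).
// Build (assumed): nvcc tc_gemm.cu -lcublas -o tc_gemm
// Error checking and warm-up iterations are omitted for brevity.
#include <stdio.h>
#include <cuda_runtime.h>
#include <cuda_fp16.h>
#include <cublas_v2.h>

int main(void) {
    const int n = 4096;                     // assumed square matrix size
    const float alpha = 1.0f, beta = 0.0f;

    __half *A, *B;                          // FP16 inputs are what the tensor cores consume
    float  *C;                              // FP32 accumulate/output
    cudaMalloc((void **)&A, (size_t)n * n * sizeof(__half));
    cudaMalloc((void **)&B, (size_t)n * n * sizeof(__half));
    cudaMalloc((void **)&C, (size_t)n * n * sizeof(float));

    cublasHandle_t handle;
    cublasCreate(&handle);
    cublasSetMathMode(handle, CUBLAS_TENSOR_OP_MATH);   // opt in to tensor-core math

    cudaEvent_t start, stop;
    cudaEventCreate(&start);
    cudaEventCreate(&stop);

    cudaEventRecord(start);
    cublasGemmEx(handle, CUBLAS_OP_N, CUBLAS_OP_N,
                 n, n, n, &alpha,
                 A, CUDA_R_16F, n,
                 B, CUDA_R_16F, n, &beta,
                 C, CUDA_R_32F, n,
                 CUDA_R_32F,                             // FP32 accumulation
                 CUBLAS_GEMM_DEFAULT_TENSOR_OP);         // request tensor-core kernels
    cudaEventRecord(stop);
    cudaEventSynchronize(stop);

    float ms = 0.0f;
    cudaEventElapsedTime(&ms, start, stop);
    // A square GEMM is 2*n^3 FLOPs; TFLOPS = 2*n^3 / (seconds * 1e12)
    printf("%dx%d GEMM: %.2f ms, %.1f TFLOPS\n", n, n, ms,
           2.0 * n * (double)n * n / (ms * 1e9));

    cublasDestroy(handle);
    cudaFree(A); cudaFree(B); cudaFree(C);
    return 0;
}
```
Re-running the same call with CUBLAS_DEFAULT_MATH and CUBLAS_GEMM_DEFAULT would give a non-tensor-core baseline; a large gap between the two throughput numbers would indicate the tensor cores are in fact being dispatched for GEMM.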