The NVIDIA GeForce RTX 2080 Ti & RTX 2080 Founders Edition Review: Foundations For A Ray Traced Future
by Nate Oh on September 19, 2018 5:15 PM EST
The 2018 GPU Benchmark Suite & the Test
Another year marks another update to our GPU benchmark suite. This time, however, it is more of a maintenance update than a complete overhaul. Although we've done some extended compute and deep learning benchmarking in the past year, and even some HDR gaming impressions, our compute and synthetic lineup remains largely the same. But before getting into the details, let's start with the bulk of the benchmarking, and the biggest reason for these cards anyhow: games.
Joining the 2018 game list are Far Cry 5, Wolfenstein II, Final Fantasy XV, and Middle-earth: Shadow of War. We are also bringing in F1 2018 and Total War: Warhammer II. Returning from last year are Battlefield 1, Ashes of the Singularity: Escalation, and Grand Theft Auto V. All in all, these games span multiple genres, differing graphics workloads, and contemporary APIs, with a nod towards modern and relatively intensive games.
AnandTech GPU Bench 2018 Game List

| Game | Genre | Release Date | API(s) |
|------|-------|--------------|--------|
| Battlefield 1 | FPS | Oct. 2016 | DX11 (DX12) |
| Far Cry 5 | FPS | Mar. 2018 | DX11 |
| Ashes of the Singularity: Escalation | RTS | Mar. 2016 | DX12 (DX11, Vulkan) |
| Wolfenstein II: The New Colossus | FPS | Oct. 2017 | Vulkan |
| Final Fantasy XV: Windows Edition | JRPG | Mar. 2018 | DX11 |
| Grand Theft Auto V | Action/Open world | Apr. 2015 | DX11 |
| Middle-earth: Shadow of War | Action/RPG | Sep. 2017 | DX11 |
| F1 2018 | Racing | Aug. 2018 | DX11 |
| Total War: Warhammer II | RTS | Sep. 2017 | DX11 (DX12) |
That said, Ashes as a DX12 trailblazer may not be as hot and fresh as it once was, especially considering that the pace of DX12 and Vulkan adoption in new games has waned. The circumstances are worth an investigation of their own, but right now the learning curve of modern low-level APIs may not be justified by the returns. As a more general remark, most developers and publishers no longer advertise or document DX12 support as much as they used to, nor is it clearly labelled in game specifications, as DX11 is often the unmentioned default.
Particularly for NVIDIA and GeForce RTX, pushing DXR and ray tracing means pushing DX12, of which DXR is a component. The API has a backstop in the form of Xbox consoles and Windows 10. And if multi-GPU is to make a comeback, whether via compatible workloads (VR), flexible usage (ray tracing workload topologies), or just the plain old slowing of Moore's Law, it will likely come through DX12's explicit multi-adapter capabilities. So this is unlikely to be the slow end of DX12.
In terms of data collection, measurements were gathered either with built-in benchmark tools or with AMD's open-source Open Capture and Analytics Tool (OCAT), which is itself powered by Intel's PresentMon. OCAT reports 99th percentiles natively; elsewhere we calculated them from the per-frame data. In general, we prefer 99th percentiles over minimums, as they more accurately represent the gaming experience and filter out artificial outliers.
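To illustrate why a 99th percentile beats a raw minimum, here is a minimal sketch of the calculation. The `percentile_fps` helper and the frame time values are our own illustration, not part of the review pipeline; PresentMon-style CSVs expose per-frame times in a `MsBetweenPresents` column, which is the kind of data this operates on.

```python
def percentile_fps(frametimes_ms, pct=99):
    """Return the FPS corresponding to the pct-th percentile frame time.

    The slowest 1% of frames sit above the 99th-percentile frame time, so
    converting that time to FPS gives a floor on the experienced framerate
    that, unlike a raw minimum, ignores one-off outliers.
    """
    xs = sorted(frametimes_ms)
    # Nearest-rank percentile: smallest value covering pct% of samples.
    idx = max(0, int(len(xs) * pct / 100.0 + 0.5) - 1)
    return 1000.0 / xs[idx]

# Mostly 60 FPS frames, plus three slow frames (one a severe outlier).
frames = [16.6] * 97 + [25.0, 33.3, 50.0]
avg_fps = 1000.0 / (sum(frames) / len(frames))
p99_fps = percentile_fps(frames)
print(round(avg_fps, 1), round(p99_fps, 1))  # 58.2 30.0
```

Note how the single 50 ms hitch barely moves the average, while the 99th percentile reflects the worst sustained behavior without being dominated by that one frame.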
We've also swapped out Blenchmark, which seems to have been abandoned in terms of updates, in favor of a BMW render from the Blender Institute Cycles Benchmark, and a more recent one from a Cycles benchmark developer on Blenderartists.org. There were concerns with Blenchmark's small tile size, which is not very applicable to GPUs, and in terms of usability we also ran into some GPU detection errors which were linked to inaccurate Blenchmark Python code.
Otherwise, we are also keeping an eye on a few trends and upcoming developments:
- MLPerf machine learning benchmark suite
- Blender Benchmark
- Futuremark's 3DMark DirectX Raytracing benchmark
- DXR and Vulkan raytracing extension support in games
Another point is that we do not have a permanent HDR monitor for our testbed, which would be necessary to incorporate HDR game testing in the near future; five games on our list actually support HDR. And as we look at technologies that enhance or alter image quality (e.g. HDR, Turing's DLSS), we will want to find a better way of comparing differences. This is particularly tricky with HDR, as screenshots are inapplicable and even accurate photographs will most likely be viewed on an SDR screen. With DLSS, there is a built-in reference quality based on 64x supersampling, which in deep learning terms is the 'ground truth'; an intuitive solution would be a neural network based method of analyzing quality differences, but that is likely beyond our scope.
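For SDR screenshots, at least, simple objective metrics exist. As a minimal sketch (the `psnr` helper and the toy pixel values are ours, not part of our testing), peak signal-to-noise ratio against a reference frame, such as DLSS output versus the 64x supersampled ground truth, can quantify gross differences, though no pixel-level metric captures perceptual quality, which is exactly why neural approaches are attractive.

```python
import math

def psnr(reference, test, max_val=255.0):
    """Peak signal-to-noise ratio in dB; higher = closer to the reference.

    Inputs are flattened pixel-value sequences of equal length (8-bit here).
    """
    assert len(reference) == len(test)
    mse = sum((r - t) ** 2 for r, t in zip(reference, test)) / len(reference)
    if mse == 0:
        return math.inf                 # identical images
    return 10.0 * math.log10(max_val ** 2 / mse)

reference = [128, 200, 64, 255, 0, 90]  # flattened 8-bit pixel values
upscaled  = [130, 198, 60, 255, 2, 95]  # hypothetical reconstructed frame
print(round(psnr(reference, upscaled), 1))  # 38.7
```

This cannot be applied to HDR output at all, and identical PSNR scores can look very different to the eye, so it is at best a coarse sanity check.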
The following tech demos and test applications were provided via NVIDIA:
- Star Wars 'Reflections' Demo (includes real time ray tracing and DLSS support)
- Final Fantasy XV Official Benchmark (includes DLSS support)
- Asteroids Demo (features mesh shading and variable LOD)
- Epic Infiltrator Demo (features DLSS)
The Testbed
Because NVIDIA is not productizing any other reference-quality GeForce RTX 2080 Ti and 2080 card besides the Founders Editions, which are non-reference by specification, we've gone ahead and emulated the true reference specifications by downclocking the cards 90MHz and lowering the TDP by roughly 10W. This keeps comparisons standardized and apples-to-apples, as we always look at reference-to-reference results.
In a classic case of Murphy's Law, our usual PSU started malfunctioning around the time of the review, and given the time constraints we couldn't do a 1:1 replacement in time. As it is a digital PSU, we had begun using it for PCIe power readings to augment system measurements, but for now we will have to stick with power draw measured at the wall. In the meantime, we've swapped it out for another high-quality, high-wattage PSU.
| Component | Details |
|-----------|---------|
| CPU | Intel Core i7-7820X @ 4.3GHz |
| Motherboard | Gigabyte X299 AORUS Gaming 7 (F9g) |
| Power Supply | EVGA 1000 G3 |
| Hard Disk | OCZ Toshiba RD400 (1TB) |
| Memory | G.Skill TridentZ DDR4-3200 4 x 8GB (16-18-18-38) |
| Case | NZXT Phantom 630 Windowed Edition |
| Monitor | LG 27UD68P-B |
| Video Cards | AMD Radeon RX Vega 64 (Air Cooled)<br>NVIDIA GeForce RTX 2080 Ti Founders Edition<br>NVIDIA GeForce RTX 2080 Founders Edition<br>NVIDIA GeForce GTX 1080 Ti Founders Edition<br>NVIDIA GeForce GTX 1080 Founders Edition<br>NVIDIA GeForce GTX 980 Ti<br>NVIDIA GeForce GTX 980 |
| Video Drivers | NVIDIA Release 411.51 Press<br>AMD Radeon Software Adrenalin Edition 18.9.1 |
| OS | Windows 10 Pro (April 2018 Update) |
| Spectre/Meltdown Mitigations | Yes, both |
337 Comments
Hixbot - Friday, September 21, 2018 - link
I'm not sure how midrange 2070/2060 cards will sell if they're not a significant value in performance/price compared to 1070/1060 cards. If AMD offer no competition, Nvidia should still compete with itself.

Wwhat - Saturday, September 22, 2018 - link
It's interesting that every comment I've seen says a similar thing, and that nobody thinks of uses outside of gaming. I would think that for real ray tracers, and for Adobe's graphics and video software for instance, the tensor and RT cores would be very interesting.
I wonder, though, whether open source software will be able to successfully use the new hardware, or whether Nvidia is too closed for it to get the advantages you might expect.
And apart from ray tracers and such, there is also the software science students use.
And with the current interest in AI among students and developers, it might also be an interesting offering.
Although that again relies on Nvidia playing ball a bit.
michaelrw - Wednesday, September 19, 2018 - link
"where paying 43% or 50% more gets you 27-28% more performance"

The 1080 Ti can be bought in the $600 range, whereas the 2080 Ti is $1200, so I'd say that's more than a 43-50% price increase. At a minimum we're talking a 71% increase, at worst 100% (launch MSRP for the 1080 Ti was $699).
V900 - Wednesday, September 19, 2018 - link
Which is the wrong way of looking at it. NVIDIA didn't just increase the price for shits and giggles; the Turing GPUs are much more expensive to fab, since you're talking about almost 20 BILLION transistors squeezed into a few hundred mm².
Regardless: Comparing the 2080 with the 1080, and claiming there is a 70% price increase, is bogus logic in the first place, since the 2080 brings a number of things to the table that the 1080 isn't even capable of.
Find me a 1080ti with DLSS and that is also capable of raytracing, and then we can compare prices and figure out if there’s a price increase or not.
imaheadcase - Wednesday, September 19, 2018 - link
It brings them to the table... on paper, more like it. You literally listed the two things that are not really shown AT ALL.

mscsniperx - Wednesday, September 19, 2018 - link
No, actually YOUR logic is bogus. Find me a DLSS or ray tracing game to bench. You can't, and there is a reason for that. Ray tracing will require a massive FPS hit; Nvidia knows this and is delaying you from seeing it as damage control.

Yojimbo - Wednesday, September 19, 2018 - link
There are no ray tracing games because the technology is new, not because NVIDIA is "delaying them". As for DLSS, I think those games will appear faster than ray tracing.

Andrew LB - Thursday, September 20, 2018 - link
Coming soon:

Darksiders III from Gunfire Games / THQ Nordic
Deliver Us The Moon: Fortuna from KeokeN Interactive
Fear The Wolves from Vostok Games / Focus Home Interactive
Hellblade: Senua's Sacrifice from Ninja Theory
KINETIK from Hero Machine Studios
Outpost Zero from Symmetric Games / tinyBuild Games
Overkill's The Walking Dead from Overkill Software / Starbreeze Studios
SCUM from Gamepires / Devolver Digital
Stormdivers from Housemarque
Ark: Survival Evolved from Studio Wildcard
Atomic Heart from Mundfish
Dauntless from Phoenix Labs
Final Fantasy XV: Windows Edition from Square Enix
Fractured Lands from Unbroken Studios
Hitman 2 from IO Interactive / Warner Bros.
Islands of Nyne from Define Human Studios
Justice from NetEase
JX3 from Kingsoft
Mechwarrior 5: Mercenaries from Piranha Games
PlayerUnknown’s Battlegrounds from PUBG Corp.
Remnant: From The Ashes from Arc Games
Serious Sam 4: Planet Badass from Croteam / Devolver Digital
Shadow of the Tomb Raider from Square Enix / Eidos-Montréal / Crystal Dynamics / Nixxes
The Forge Arena from Freezing Raccoon Studios
We Happy Few from Compulsion Games / Gearbox
Funny how the same people who praised AMD for being the first to bring full DX12 support, even though only 15 games used it in the first two years, are the same people sh*tting on nVidia for bringing a far more revolutionary technology that's going to be in far more games in a shorter time span.
jordanclock - Thursday, September 20, 2018 - link
Considering AMD was the first to bring support to an API that all GPUs could support, DLSS is not a comparison. DLSS is an Nvidia-only feature, and Nvidia couldn't manage to have even ONE game on launch day with DLSS.

Manch - Thursday, September 20, 2018 - link
AMD spawned Mantle, which then turned into Vulkan, and also pushed MS to develop DX12, as it was in both their interests. These APIs can be used by all. DLSS, while potentially very cool, is, as Jordan said, proprietary. Like HairWorks and other crap, it will get light support, but when it comes to feature sets devs will spend most of their effort building to common ground. With consoles being AMD GPU based, guess where that will be.
It will be interesting to see how AMD ultimately responds, i.e. G-Sync/FreeSync, CUDA/OpenCL, etc.
As Nvidia has stated, these features are designed to work with how current game engines already function so the devs don't have to reinvent the wheel. Ultimately this means the integration won't be very deep, at least not for a while.
For consumers the end goal is always better graphics at the same price point when new releases happen.
Not that these are bad cards, just expensive, and two very key features are unavailable, which sucks. Hopefully the situation will change sooner rather than later.