The NVIDIA GeForce RTX 2080 Ti & RTX 2080 Founders Edition Review: Foundations For A Ray Traced Future
by Nate Oh on September 19, 2018 5:15 PM EST
Battlefield 1 (DX11)
Battlefield 1 returns from the 2017 benchmark suite with a bang, as DICE brought gamers the long-awaited AAA World War 1 shooter a little over a year ago. With detailed maps, environmental effects, and pacy combat, Battlefield 1 provides a generally well-optimized yet demanding graphics workload. The next Battlefield game from DICE, Battlefield V, completes the nostalgia circuit with a return to World War 2, but more importantly for us, it is one of the flagship titles for GeForce RTX real time ray tracing, although that support isn't ready at this time.
The Ultra preset is used with no alterations. As these benchmarks are from single player mode, our rule of thumb for multiplayer performance still applies: multiplayer framerates generally dip to half of our single player framerates. Battlefield 1 also supports HDR (HDR10, Dolby Vision).
[Charts: Battlefield 1 Average FPS and 99th Percentile framerates at 1920x1080, 2560x1440, and 3840x2160]
At this point, the RTX 2080 Ti is fast enough to touch the CPU bottleneck at 1080p, but it keeps its substantial lead at 4K. Nowadays, Battlefield 1 runs rather well on a gamut of cards and settings, and in well-optimized high-profile games like this one, the 2080 in particular needs to make sure that the veteran 1080 Ti doesn't edge too close. Here, the Founders Edition specs are enough to place the 2080 Founders Edition firmly ahead of the 1080 Ti Founders Edition.
The outlying low 99th percentile reading for the 2080 Ti occurred on repeated testing, and we're looking into it further.
337 Comments
beisat - Thursday, September 20, 2018 - link
Very nice review, by far the best one I've read. Thanks for that. How likely do you think the launch of another generation is in 2019 from Nvidia and/or something competitive from AMD based on 7nm?
I currently have a GTX 970, skipped the Pascal generation, and was waiting for Turing. But I don't like being an early adopter and feel that for pure rasterisation, these cards aren't worth it. Yes, they are more powerful than the 10 series I skipped, but they also cost more - so performance per $$$ is similar, and I'm not willing to pay the same amount of $$$ for the same performance as I would have 2 years ago.
Guess I'll just have to stick it out with my 970 at 1080p?
dguy6789 - Thursday, September 20, 2018 - link
RTX 2080 Ti and 2080 are highly disappointing.

V900 - Thursday, September 20, 2018 - link
That’s a rather debatable take that most hardware sites and tech journalists would disagree with. But what do they know, amirite?
dguy6789 - Friday, September 21, 2018 - link
Just about every review of these cards states that right now they're disappointing and we need to wait and see how ray tracing games pan out to see if that will change. We waited this many years for the smallest generation-to-generation performance jump we have ever seen. The price went way up too. The cards are hotter and use more power, which makes me question how long they will last before they die.
The weird niche Nvidia "features" these cards have will end up like PhysX.
The performance you get for what you pay for a 2080 or 2080 Ti is simply terrible.
dguy6789 - Friday, September 21, 2018 - link
Not to mention that Nvidia's stock was just downgraded due to the performance of the 2080 and 2080 Ti.

mapesdhs - Thursday, September 27, 2018 - link
V900, you've posted a lot of stuff here that was itself debatable, but that comment was just nonsense. I don't believe for a moment you think most tech sites consider these cards a worthy buy. The vast majority of reviews have been generally or heavily negative. I therefore conclude troll.

hammer256 - Thursday, September 20, 2018 - link
Oof, still on the 12nm process. Frankly, it's quite remarkable how much rasterization performance they were able to squeeze out while putting in the tensor and ray tracing cores. The huge dies are not surprising in that regard. In the end, architectural efficiency can only go so far, and the fundamental limit is still the transistor budget. With that said, I'm guessing there's going to be a 7nm refresh pretty soon-ish? I would wait...
V900 - Thursday, September 20, 2018 - link
You might have to wait a long time then. I don’t see a 7nm refresh on the horizon. Maybe in a year, probably not until 2020.
*There isn’t any HP/high density 7nm process available right now. (The only 7nm product shipping right now is the A12, and that’s a low power/mobile process.) The 7nm HP processes are all in various forms of pre-production/research.
*Price. 7nm processes are going to be expensive. And the Turing dies are gigantic, and already expensive to make on their current node. That means Nvidia will most likely wait on a 7nm Turing until prices have come down and the process is more mature.
*And then there’s the lack of competition: AMD doesn’t have anything even close to the 2080 right now, and won’t for a good 3 years if Navi is a mid-range GPU. As long as the 2080 Ti is the king of performance, there’s no reason for Nvidia to rush to a smaller process.
Zoolook - Thursday, September 27, 2018 - link
The Kirin 980 has been shipping for a while and should be in stores in two weeks. We know that at least Vega was sampling in June, so it depends on the allocation at TSMC; it's not 100% Apple.

Antoine. - Thursday, September 20, 2018 - link
The assumption under which this article operates, that the RTX 2080 should be compared to the GTX 1080 and the RTX 2080 Ti to the GTX 1080 Ti, is a disgrace. It allows you to be overly satisfied with performance evolutions between GPUs with vastly different price tags! It just shows that you completely bought the BS renaming of the Titan into the Ti. Of course the next gen Titan is going to perform better than the previous generation's Ti! Such a gullible take on these new products cannot be by sheer stupidity alone.