Compute & Synthetics

Moving on to the low-level compute guts of the cards, we take a look at compute and synthetic results, starting with tensor core accelerated GEMM.

Compute: General Matrix Multiply Single Precision (SGEMM)

While these tests use binaries compiled for Volta, Turing is backwards compatible with them, as it belongs to the same compute capability family (sm_75, versus Volta's sm_70). In terms of compute resources, the RTX 2080 Ti's 544 tensor cores and 1545MHz boost clock are not far off the Titan V's 640 tensor cores and 1455MHz boost clock, so the latest Turing-optimized binaries should better reflect the RTX 2080 Ti's raw GEMM acceleration capabilities. Likewise for the 368 tensor core RTX 2080, whose tensor-accelerated HGEMM performance in TFLOPS is somewhere around 20% less than that of the RTX 2080 Ti.
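As a back-of-the-envelope check on those figures, theoretical tensor core FP16 throughput can be estimated as cores × 64 FMAs per clock × 2 FLOPs per FMA × clock speed. A minimal sketch (the 1710MHz RTX 2080 boost clock used below is an assumed reference value, not a figure from this review):

```python
def tensor_peak_tflops(tensor_cores: int, boost_mhz: float) -> float:
    """Rough theoretical peak for tensor-core FP16 GEMM.

    Each Volta/Turing tensor core performs 64 FMAs per clock,
    and each FMA counts as 2 floating-point operations.
    """
    return tensor_cores * 64 * 2 * boost_mhz * 1e6 / 1e12

rtx2080ti = tensor_peak_tflops(544, 1545)  # clock as quoted above
rtx2080 = tensor_peak_tflops(368, 1710)    # assumed reference boost clock

print(f"RTX 2080 Ti: {rtx2080ti:.1f} TFLOPS")  # ~107.6 TFLOPS
print(f"RTX 2080:    {rtx2080:.1f} TFLOPS")    # ~80.5 TFLOPS
print(f"2080 vs Ti:  {rtx2080 / rtx2080ti - 1:+.0%}")
```

The theoretical gap works out to roughly 25%, in the same ballpark as the measured ~20% HGEMM difference; real GEMM efficiency falls short of these peaks, so the exact ratio depends on kernel and matrix size.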

Compute: CompuBench 2.0 - Level Set Segmentation 256

Compute: CompuBench 2.0 - N-Body Simulation 1024K

Compute: CompuBench 2.0 - Optical Flow

Compute: Folding @ Home Single Precision

Compute: Geekbench 4 - GPU Compute - Total Score

Synthetic: TessMark, Image Set 4, 64x Tessellation



337 Comments

  • mapesdhs - Thursday, September 27, 2018 - link

    It also glosses over the huge pricing differences and the fact that most gamers buy AIB models, not reference cards.
  • noone2 - Thursday, September 20, 2018 - link

    Not sure why people are so negative about these and the prices. Sell your old card and amortize the cost over how long you'll keep the new one. So maybe $400/year (less if you keep it longer).

    If you're a serious gamer, are you really not willing to spend a few hundred dollars per year on your hardware? I mean, the performance is there and it's somewhat future-proofed (assuming things take off for RT and DLSS).

    A bowling league (they still have those?) probably costs more per year than this card. If you only play Minecraft I guess you don't need it, but if you want the highest setting in the newest games and potentially the new technology, then I think it's worth it.
  • milkod2001 - Friday, September 21, 2018 - link

    Performance is not there. Around 20% actual performance boost is not very convincing, especially at a much higher price. How can you be positive about it?
    The future tech promise doesn't add that much, and it is not clear if game developers will bother.
    When one spends $1000 on a GPU it has to deliver perfect 4K gaming with everything maxed, and NV charges ever more. This is a joke; NV is just testing how much they can squeeze out of us until we simply don't buy.
  • noone2 - Friday, September 21, 2018 - link

    The article clearly says that the Ti is 32% better on average.

    The idea about future tech is you either do it and early adopters pay for it in hopes it catches on, or you never do it and nothing ever improves. Game developers don't really create technology and then ask hardware producers to support it/figure out how to do it. DICE didn't knock on Nvidia's door and pay them to figure out how to do ray tracing in real time.

    My point remains though: If this is a favorite hobby/pastime, then it's a modest price to pay for what will be hundreds of hours of entertainment, and the potential that ray tracing and DLSS and whatever else catches on and you get to experience it sooner rather than later. You're saying this card is too expensive, yet I can find console players who think a $600 video game is too expensive too. Different strokes for different folks. $1100 is not terrible value. You're talking hundreds of dollars here, not tens of thousands of dollars. It's a drop in the bucket in the scope of life.
  • mapesdhs - Thursday, September 27, 2018 - link

    So Just Buy It then? Do you work for toms? :D
  • TheJian - Thursday, September 20, 2018 - link

    "Ultimately, gamers can't be blamed for wanting to game with their cards, and on that level they will have to think long and hard about paying extra to buy graphics hardware that is priced extra with features that aren't yet applicable to real-world gaming, and yet only provides performance comparable to previous generation video cards."

    So, I guess I can't read charts, because I thought they said 2080ti was massively faster than anything before it. We also KNOW devs will take 40-100% perf improvements seriously (already said such, and NV has 25 games being worked on now coming soon with support for their tech) and will support NV's new tech since they sell massively more cards than AMD.

    Even the 2080 vs. 1080 is a great story at 4k as the cards part by quite a margin in most stuff.
    IE, Battlefield 1, 4K test: the 2080 FE scores 78.9 vs. 56.4 for the 1080 FE. That's a pretty big win; calling it comparable is misleading at best, correct? Far Cry 5, same story: 57 for the 2080 FE vs. 42 for the 1080 FE. Again, a pretty massive gain for $100. Ashes, 74 to 61fps (2080 FE vs. 1080 FE). Wolf2, 100fps for the 2080 FE vs. 60 for the 1080 FE...LOL. Well, 40% is, uh, "comparable perf"...ROFL. OK, I could go on but whatever dude. Would I buy one if I had a 1080 Ti? Probably not unless I had cash to burn, but for many that usually do buy these things, they just laughed at $100 premiums...ROFL.

    Never mind what these cards are doing to the AMD lineup. No reason to lower cards, I'd plop them on top of the old ones too, since they are the only competition. When you're competing with yourself you just do HEDT like stuff, rather than shoving down the old lines. Stack on top for more margin and profits!

    $100 for future tech and a modest victory in everything or quite a bit more in some things, seems like a good deal to me for a chip we know is expensive to make (even the small one is Titan size).

    Oh, I don't count that fold@home crap, synthetic junk as actual benchmarks because you gain nothing from doing it but a high electric bill (and a hot room). If you can't make money from it, or play it for fun (game), it isn't worth benchmarking something that means nothing. How fast can you spit in the wind 100 times. Umm, who cares. Right. Same story with synthetics.
  • mapesdhs - Thursday, September 27, 2018 - link

    It's future tech that cannot deliver *now*, so what's the point? The performance just isn't there, and it's a pretty poor implementation of what they're boasting about anyway (I thought the demos looked generally awful, though visual realism is less something I care about now anyway; games need to be better in other ways). Fact is, the 2080 is quite a bit more expensive than a new 1080 Ti for a card with less RAM and no guarantee these supposed fancy features are going to go anywhere anyway. The 2080 Ti is even worse; it has the speed in some cases, but the price completely spoils the picture. Where I am, the 2080 Ti is twice the cost of a 1080 Ti, with no VRAM increase either.

    NVIDIA spent the last 5 years pushing gamers into high frequency displays, 4K and VR. Now they're trying to do a total about-face. It won't work.
  • lenghui - Thursday, September 20, 2018 - link

    Thanks for rushing the review out. BTW, the auto-play video on every AT page has got to stop. You are turning into Tom's Hardware.
  • milkod2001 - Friday, September 21, 2018 - link

    They are both owned by Purch, the marketing company responsible for those annoying auto-play videos and the lowest crap possible in the "From the web" sections. They go with the motto: ad clicks over everything. Don't think it will change anytime soon. Anand sold his soul twice: to Apple, and his site to Purch.
  • mapesdhs - Thursday, September 27, 2018 - link

    One can use uBlock Origin to block those JW Player videos.
