Gaming Performance

Sure, compute is useful. But be honest: you came here for the 4K gaming benchmarks, right?

Battlefield 1 - 3840x2160 - Ultra Quality

Battlefield 1 - 99th Percentile - 3840x2160 - Ultra Quality

Ashes of the Singularity: Escalation - 3840x2160 - Extreme Quality

Ashes: Escalation - 99th Percentile - 3840x2160 - Extreme Quality

From Battlefield 1 (DX11) and Ashes (DX12) alone, we can already see that Titan V is not a monster gaming card, though it is still faster than Titan Xp. This is not unexpected, as Titan V's focus lies much further from gaming than that of the previous Titan cards.

Doom - 3840x2160 - Ultra Quality

Doom - 99th Percentile - 3840x2160 - Ultra Quality

Ghost Recon Wildlands - 3840x2160 - Very High Quality

Deus Ex: Mankind Divided - 3840x2160 - Ultra Quality

Grand Theft Auto V - 3840x2160 - Very High Quality

Grand Theft Auto V - 99th Percentile - 3840x2160 - Very High Quality

Total War: Warhammer - 3840x2160 - Ultra Quality

Despite being generally ahead of Titan Xp, it's clear that Titan V is suffering from a lack of gaming optimization. And for that matter, the launch drivers definitely have gaming-related bugs. On Titan V, Deus Ex exhibited small black box artifacts during the benchmark, Ghost Recon Wildlands experienced sporadic but persistent hitching, and Ashes occasionally suffered from fullscreen flickering.

And despite the impressive triple-digit framerates in the Vulkan-powered DOOM, the card actually falls behind Titan Xp in 99th percentile framerates. At such high average framerates, even a 67fps 99th percentile can reduce perceived smoothness. Meanwhile, running Deus Ex and Total War: Warhammer under DX12 resulted in lower performance on Titan V. But with immature gaming drivers, it is too early to say whether these results are representative of low-level API performance on Volta itself.
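For readers wondering how these percentile figures relate to the raw data, here's a minimal sketch of the arithmetic using hypothetical frame times (not our actual benchmark tooling or results): the 99th percentile framerate comes from the frame time that 99% of frames beat, so a small number of slow frames can pull it far below the average.

```python
import numpy as np

# Hypothetical per-frame render times in milliseconds from one benchmark run
# (synthetic placeholder data, not measured Titan V numbers).
frame_times_ms = np.random.lognormal(mean=np.log(8.0), sigma=0.3, size=10_000)

# Average framerate: inverse of the mean frame time.
avg_fps = 1000.0 / frame_times_ms.mean()

# 99% of frames render faster than this time; inverting it gives the
# 99th percentile framerate, which exposes stutter that averages hide.
p99_time_ms = np.percentile(frame_times_ms, 99)
p99_fps = 1000.0 / p99_time_ms

print(f"Average: {avg_fps:.0f} fps | 99th percentile: {p99_fps:.0f} fps")
```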

Overall, the Titan V averages out to around 15% faster than the Titan Xp, excluding 99th percentiles, but with the aforementioned caveats. Titan V's high average framerates in DOOM and Deus Ex are somewhat marred by stagnant 99th percentiles and minor but noticeable artifacting, respectively.
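As an aside, a single "around 15% faster" figure is typically produced by averaging the per-game performance ratios, for instance with a geometric mean; a rough sketch with made-up ratios (purely illustrative, not our results table):

```python
import math

# Hypothetical Titan V / Titan Xp average-FPS ratios per game
# (placeholder values for illustration only).
ratios = {
    "Battlefield 1": 1.12,
    "Ashes: Escalation": 1.10,
    "DOOM": 1.22,
    "GTA V": 1.14,
}

# The geometric mean keeps one outlier title from dominating the average.
geomean = math.exp(sum(math.log(r) for r in ratios.values()) / len(ratios))
print(f"~{(geomean - 1) * 100:.0f}% faster on average")
```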

So as a pure gaming card, our preview results indicate that this would not be the best purchase at $3000. Typically, an $1800 premium for around 10 to 20% faster gaming over the Titan Xp wouldn't be enticing, but it seems there are always some who insist.

112 Comments

  • Ryan Smith - Wednesday, December 20, 2017

    The "But can it run Crysis" joke started with the original Crysis in 2007. So it was only appropriate that we use it for that test. Especially since it let us do something silly like running 4x supersample anti-aliasing. Reply
  • crysis3? - Wednesday, December 20, 2017

    ah
  • SirPerro - Wednesday, December 20, 2017

    They make it pretty clear everywhere this card is meant for ML training.

    It's the only scenario where it makes sense financially.

    Gaming is a big NO at 3K dollars per card. Mining is a big NO with all the cheaper specific chips for the task.

    On ML it may mean halving the training time, or cutting it by a factor of four, on a workstation, and if you have it running 24/7 for hyperparameter tuning it pays for itself compared to the accumulated costs of Amazon or Google cloud machines.

    An SLI of Titans and you can train huge models in under a day on a local machine. That's a great thing to have.
  • mode_13h - Wednesday, December 27, 2017

    The FP64 performance indicates it's also aimed at HPC. One has to wonder how much better it could be at each, if it didn't also have to do the other.

    And for multi-GPU, you really want NVlink - not SLI.
  • npz - Wednesday, December 20, 2017

    Can you include more integer benchmarks in the future? I see you have INT8 texture fillrate but I'd like to see more general int32 and int64 compute outside of the graphics rendering pipeline.

    I'm also interested in threading and branching performance. It would be interesting to see how the scheduler handles the workload. I have never seen anyone test this before.
  • Ryan Smith - Wednesday, December 20, 2017

    There are very few GPU integer benchmarks. If you do have any specific items you'd like to see, definitely drop me a line and I can look into it.
  • Klimax - Monday, December 25, 2017

    Generally, mining SW at least, from memory-intensive to shader-bound. IIRC some BOINC-based integer applications support GPU too.
  • mode_13h - Wednesday, December 27, 2017

    Try to get inferencing running with TensorRT. It can optimize your models to use int8. Also, I've read Caffe2 supports int8.

    I haven't tried either.
  • takeshi7 - Wednesday, December 20, 2017

    Will game developers be able to use these tensor cores to make the AI in their games smarter? That would be cool if AI shifted from the CPU to the GPU.
  • DanNeely - Wednesday, December 20, 2017

    First and foremost, that depends on whether mainstream Volta cards get tensor cores.

    Beyond that, I'm not sure how much it'd help there directly; AFAIK what Google et al. are doing with machine learning and neural networks is very different from typical game AI.
