Compute

Shifting gears, let’s take a look at compute performance on GTX 1080 Ti.

Starting us off for our look at compute is LuxMark 3.1, the latest version of the official benchmark of LuxRender. LuxRender’s GPU-accelerated rendering mode is an OpenCL-based ray tracer that forms part of the larger LuxRender suite. Ray tracing has become a stronghold for GPUs in recent years, as ray tracing maps well to GPU pipelines, allowing artists to render scenes much more quickly than with CPUs alone.

Compute: LuxMark 3.1 - Hotel

The OpenCL situation for NVIDIA right now is a bit weird. The company is in the middle of rolling out OpenCL 2.0 support to their video cards – something that I had actually given up hope on until it happened – and as a result their OpenCL drivers are in a state of flux as the company continues to refine the updated driver. The end result is that OpenCL performance has dipped a bit compared to where the GTX 1080 launched back in May, with said card dropping from 4138 points to 3648 points. Not that the GTX 1080 Ti is too fazed, mind you – it’s still king of the hill by a good degree – but the point is that once NVIDIA gets their drivers sorted out, there’s every reason to believe their OpenCL performance will improve further.

For our second set of compute benchmarks we have CompuBench 1.5, the successor to CLBenchmark. CompuBench offers a wide array of different practical compute workloads, and we’ve decided to focus on face detection, optical flow modeling, and particle simulations.

Compute: CompuBench 1.5 - Face Detection

Compute: CompuBench 1.5 - Optical Flow

Compute: CompuBench 1.5 - Particle Simulation 64K

Like LuxMark, CompuBench shows some minor performance regressions on the GTX 1080 as compared to the card’s launch. Nonetheless, this doesn’t do anything to impede the GTX 1080 Ti’s status as the fastest of the GeForce cards. It dominates every sub-benchmark, including Optical Flow, where the original GTX 1080 was unable to pull away from AMD’s last-generation Radeon R9 Fury X.

Comments

  • MrSpadge - Thursday, March 9, 2017 - link

    An HBM2 equipped vega(n) rabbit?
  • eek2121 - Thursday, March 9, 2017 - link

    Before you do that though, you should test Ryzen with the Ti. Reviewers everywhere are showing that for whatever reason, Ryzen shines with the 1080 Ti at 4k.
  • just4U - Friday, March 10, 2017 - link

I did a double take there as I actually thought you typed pull a rabbit out of my a... Was like ... wait, what?? (..chuckle) Anyway, good review Ryan. I read about the Ti being out soon.. didn't realize it was here already.
  • Drumsticks - Thursday, March 9, 2017 - link

    Nice review Ryan.

    I can't wait to see what Vega brings. I'm hoping we at least get a price war over a part that can sit in between the 1080 and Ti parts. I would love to see Vega pull off 75% faster than a Fury X (50% clock speed boost, 20% more IPC?) but wow that would be a tough order. Let's just hope AMD can bring some fire back to the market in May.
  • MajGenRelativity - Thursday, March 9, 2017 - link

I'm also extremely interested in seeing what Vega brings. My wallet is ready to drop the bills necessary to get a card in this price range, but I'm waiting for Vega to see who gets my money.
  • ddriver - Thursday, March 9, 2017 - link

    It will bring the same thing as ever - superior hardware nvidia will pay off most game developers to sandbag, forcing amd to sell at a very nice price to the benefit of people like me, who don't care about games but instead use gpus for compute.

    For compute amd's gpus are usually 2-3 TIMES better value than nvidia. And I have 64 7950s in desperate need of replacing.
  • MajGenRelativity - Thursday, March 9, 2017 - link

    That's a lot of 7950s. What do you compute with them?
  • A5 - Thursday, March 9, 2017 - link

    Fake internet money, I assume. And maybe help the power company calculate his bill...
  • ddriver - Thursday, March 9, 2017 - link

    Nope, I do mostly 3D rendering, multiphysics simulations, video processing and such. Cryptocurrency is BS IMO, and I certainly don't need it.
  • MajGenRelativity - Thursday, March 9, 2017 - link

    I'm assuming you do that for your job? If not, that's an expensive hobby :P
