GPU Performance

Last but certainly not least, we have GPU performance. As we mentioned earlier, the Snapdragon 810 introduces Qualcomm's Adreno 430, the latest member of the Adreno 400 GPU family. Qualcomm's own performance estimates call for a 30% increase over the Adreno 420, with a final GPU clock of 600MHz, identical to the GPU clock speed of the Snapdragon 805 (Adreno 420).

From an architectural standpoint, Adreno continues to be something of a black box for us. Other than being a modern OpenGL ES 3.1/AEP design, we don't know too much about how the GPU is laid out, with Qualcomm's ongoing legal battle with NVIDIA likely not helping matters. In any case, Qualcomm has indicated that the Adreno 430 is not just a simple extension of the Adreno 420, so we may be looking at an architectural change such as wider shader blocks.

For today's benchmarks, as we mentioned before, we only had a limited amount of time with the Snapdragon 810 and ran into issues with BaseMark X. As a result we've had to pare down our GPU benchmarks to just 3DMark 1.2 and GFXBench 3.0. Once we get final hardware in, we will be able to run a wider array of graphics benchmarks on the Snapdragon 810.

3DMark 1.2 Unlimited - Overall

3DMark 1.2 Unlimited - Graphics

3DMark 1.2 Unlimited - Physics

Starting off with 3DMark, compared to the Snapdragon 805 reference platform the actual graphics performance advantage is even greater than 30%, coming in at closer to 65%. However, since drivers play a big role here, a more recent Snapdragon 805 platform such as the Nexus 6 may be a better comparison point, in which case the gain is 33%, just a hair over Qualcomm's own baseline performance estimate. We also find that the Snapdragon 810 oddly struggles with physics performance here, underperforming Snapdragon 805 devices, something the Exynos 5433 didn't have trouble with. As a result, overall performance is only slightly improved over the Nexus 6.
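For reference, the relative gains quoted throughout this page are simple score ratios. A minimal sketch, using placeholder scores rather than the article's actual 3DMark results:

```python
# Illustrative sketch of how the relative gains above are computed.
# The scores used here are placeholders, not actual benchmark results.

def percent_gain(new_score: float, old_score: float) -> float:
    """Relative performance advantage of new_score over old_score, in percent."""
    return (new_score / old_score - 1.0) * 100.0

# A part scoring 26,600 against a 20,000 baseline shows a 33% gain.
print(f"{percent_gain(26600, 20000):.0f}%")  # 33%
```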

Continuing with GFXBench, we look at purer GPU loads. Note that the MDP/T employs a 4K screen resolution, so the on-screen results will likely suffer for it.
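To see why a 4K panel drags down onscreen numbers, consider the purely fill-bound case, where frame rate scales inversely with pixel count (GFXBench's offscreen tests render at 1080p). This is a simplification for illustration, and the frame rate used is hypothetical; real workloads are only partly fill-bound:

```python
# Hypothetical fill-bound scaling: onscreen fps estimated from an offscreen
# (1080p) result by the ratio of pixel counts. Real results will deviate,
# since workloads are never purely pixel-throughput bound.

def fill_bound_estimate(offscreen_fps, native_res=(3840, 2160),
                        offscreen_res=(1920, 1080)):
    pixel_ratio = (offscreen_res[0] * offscreen_res[1]) / (native_res[0] * native_res[1])
    return offscreen_fps * pixel_ratio

# A 4K panel pushes 4x the pixels of 1080p, so a fill-bound 40fps offscreen
# result would fall to roughly 10fps onscreen.
print(fill_bound_estimate(40.0))  # 10.0
```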

GFXBench 3.0 Manhattan (Onscreen)

GFXBench 3.0 Manhattan (Offscreen)

GFXBench 3.0 T-Rex HD (Onscreen)

GFXBench 3.0 T-Rex HD (Offscreen)

Under GFXBench 3.0's full rendering tests of Manhattan and T-Rex, the Snapdragon 810 continues to show considerable performance gains over the Snapdragon 805. Ignoring the onscreen results for now since the Snapdragon 810 reference platform runs at such a high resolution, the offscreen results show the 810 outperforming the 805 by 33% in Manhattan and 16% in T-Rex. The former is again well in line with Qualcomm's performance estimate, while the older T-Rex benchmark doesn't show the same gains, possibly indicating that the Adreno 430's biggest gains are going to come in shader-bound scenarios.

GFXBench 3.0 ALU Test (Onscreen)

GFXBench 3.0 ALU Test (Offscreen)

GFXBench 3.0 Alpha Blending Test (Onscreen)

GFXBench 3.0 Alpha Blending Test (Offscreen)

GFXBench 3.0 Fill Rate Test (Onscreen)

GFXBench 3.0 Fill Rate Test (Offscreen)

Meanwhile GFXBench's synthetic tests continue to put the Adreno 430 and the Snapdragon 810 in a good light. ALU performance in particular shows very large gains - 46% better than the Snapdragon 805 and Adreno 420 - while our blending and fill rate tests show almost no gain over the Snapdragon 805. This adds further credence to our theory that Qualcomm has widened or otherwise improved Adreno's shader blocks for the 430, as other elements of the GPU are not showing significant performance changes.
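The bottleneck reasoning above can be sketched as a simple comparison of per-subtest gains: whichever synthetic shows the largest generational uplift points at the GPU block that was improved. The gain figures below are illustrative, not measured results:

```python
# Sketch of the bottleneck inference: the synthetic subtest with the largest
# generational uplift suggests which GPU block was improved.
# The gain figures below are illustrative placeholders.

def likely_improved_block(gains: dict) -> str:
    """Return the name of the synthetic subtest with the largest gain."""
    return max(gains, key=gains.get)

gains = {"ALU": 46.0, "alpha blending": 1.0, "fill rate": 2.0}  # percent
print(likely_improved_block(gains))  # ALU
```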

GFXBench 3.0 Driver Overhead Test (Onscreen)

GFXBench 3.0 Driver Overhead Test (Offscreen)

GFXBench 3.0 Quality/Accuracy Test (Medium Precision)

GFXBench 3.0 Quality/Accuracy Test (High Precision)

Finally, GFXBench's driver overhead and accuracy tests are more or less what we would expect for the Snapdragon 810. In the case of driver overhead, a combination of newer drivers and a much faster CPU has reduced the CPU cost of driver overhead. Meanwhile, with the underlying GPU architecture being unchanged, there are no material changes to quality/accuracy.

Overall, then, the performance gains for the Adreno 430 and Snapdragon 810 seem to be almost exclusively focused on shader performance, but in those cases where rendering workloads are shader bound, Qualcomm's 30% estimate is on the mark. Real-world performance gains meanwhile are going to depend on the nature of the workload; games and applications that are similarly shader-bound should see good performance gains, while anything that's bottlenecked by pixel throughput, texturing, or front-end performance will see much smaller gains. Thankfully for Qualcomm, most high-end workloads are indeed shader bound, and this is especially the case when pushing high resolutions, as Qualcomm is trying to do with their 4K initiative for the Snapdragon 810. However, in the case of 4K, while the Adreno 430 offers improved performance, it's still slow enough that it's going to struggle to render any kind of decently complex content at that resolution.

As for the Adreno 430 versus the competition, Qualcomm has narrowed much of the gap between themselves and NVIDIA/Apple, but they haven't closed it. Apple's Imagination GX6850 and NVIDIA's Tegra K1 GPUs continue to hold a performance advantage, particularly in GFXBench's Manhattan and T-Rex full rendering tests. Both Apple and NVIDIA have invested significant die space in graphics, and while we don't know how much Qualcomm has invested in the Adreno 430 with the Snapdragon 810, it's safe to say right now that they would need to invest even more if they want to beat the graphics performance of NVIDIA and Apple's tablet SoCs.

Comments

  • tipoo - Thursday, February 12, 2015 - link

    Unless you have information we don't, we still have no sweet clue about the TDP of the X1. So I'll give that a [citation needed].
  • kron123456789 - Friday, February 13, 2015 - link

    Well, there is one clue about that from Nvidia — they claimed that the Tegra X1 consumes 10W while running the Elemental demo (which, considering the frame drops, is a full load on the GPU)
  • tipoo - Friday, February 13, 2015 - link

    Exactly. Way too high for a phone. They'd have to drop wattage by nearly *triple*, so I'm not sure I believe that simply clocking it lower would have them lead on performance per watt.

    And I hope the 1tflop bogus number wasn't part of ops calculus.
  • kron123456789 - Saturday, February 14, 2015 - link

    You say "drop wattage by nearly *triple*" like other SoCs consume no more than 3-3.5W.
    And I think this 1TFLOP isn't bogus, it's just in FP16 mode.
  • serendip - Friday, February 13, 2015 - link

    I assume Intel and Nvidia are still behind Qualcomm and Samsung when it comes to integrating LTE capability into their SOCs. Then again, the article mentioned that the power saving from having integrated LTE isn't much compared to other components.

    Any idea why Samsung went with Intel modems on some Exynos variants? The proliferation of so many LTE bands creates a mess of SKUs. It's interesting that some Galaxy S5 Snapdragon variants have access to TD-LTE, FDD-LTE, CDMA2000, WCDMA and GSM in one device.
  • hlovatt - Thursday, February 12, 2015 - link

    Really liked all the RF info and as you said this RF performance is just as important in overall phone performance as CPU and GPU. Now all we need to know is how it performs in an actual phone.
  • Gunbuster - Thursday, February 12, 2015 - link

    Maybe now that we can see this is not blowing other SoCs out of the water, the big players can get some good pricing from Qualcomm. Perhaps Microsoft could make a real affordable flagship this time around... (or make the weak ass S4XX affordable flagship actually affordable at $200)
  • tipoo - Thursday, February 12, 2015 - link

    Any plans on throttling tests? That was the big controversy, with Samsung rumoured to not use it in the upcoming GS6 because of overheating.
  • JoshHo - Thursday, February 12, 2015 - link

    We intend on doing deep testing of the first S810 phone we get in order to get to the bottom of the story.
  • tipoo - Friday, February 13, 2015 - link

    Good to know, looking forward to you guys getting to the bottom of it. I've been wondering if Samsung was just saying that to hype up their own Exynos, or if the other phone manufacturers are going to have problems with S810.
