The Minor Issue of Overzealous Marketing

As mentioned earlier in the piece, the most common numbers from Huawei and Honor about the new technology follow the same pattern: GPU Turbo is said to offer up to 60% extra performance and 30% better power consumption. Since launch, out of all the marketing materials we have seen, there is exactly one instance where either company expands on these figures. It is in the footnotes of the Honor Play's English global product page, explaining the context of the 60%/30% numbers:

Honor Play's Product Website GPU Turbo Explanation

Here is what that tiny bullet point says:

*2 The GPU Turbo is a graphics processing technology that is based on Kirin chips and incorporates mutualistic software and hardware interaction. And it supports some particular games. 
Results are based on comparison with the previous generation chip, the Kirin 960.

This is a big red flag. Normally, when quantifying the benefit of a new technology, the performance difference should be quoted between its off and on states. So it should not be hard to see why using the Kirin 960 as the baseline is a massive issue: the marketing materials are conflating their claims, mixing values that are explicitly attributed to GPU Turbo, a software technology, with the silicon improvements between two generations of chipsets.

The honest comparison would be the Kirin 970 with GPU Turbo off against the Kirin 970 with GPU Turbo on. Instead, the baseline is the Kirin 960 with no GPU Turbo, compared against the newer Kirin 970 with GPU Turbo on.
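To make the baseline problem concrete, here is a minimal sketch using the Kirin 960 (Mate 9) and Kirin 970 (Mate 10) GFXBench figures from the power efficiency tables further down this page. Neither measurement involves GPU Turbo, so the deltas it prints are purely the chip-to-chip improvement that gets folded into the marketing claim:

```python
# Peak GFXBench results from the efficiency tables below (no GPU Turbo in either case):
# each entry is (average fps, average system active power in W)
results = {
    "Manhattan 3.1": {"Kirin 960": (32.49, 8.63), "Kirin 970": (37.66, 6.33)},
    "T-Rex":         {"Kirin 960": (99.16, 9.51), "Kirin 970": (127.25, 7.93)},
}

for test, socs in results.items():
    fps_960, w_960 = socs["Kirin 960"]
    fps_970, w_970 = socs["Kirin 970"]
    perf_gain = fps_970 / fps_960 - 1   # performance gain from the new silicon alone
    power_cut = 1 - w_970 / w_960       # power reduction from the new silicon alone
    print(f"{test}: +{perf_gain:.0%} performance, {power_cut:.0%} less power")

# Manhattan 3.1: +16% performance, 27% less power
# T-Rex:         +28% performance, 17% less power
```

In other words, a sizeable chunk of any Kirin 960 versus Kirin 970 comparison is simply the chipset upgrade, which is exactly why it should not be rolled into a figure attributed to GPU Turbo itself.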

For our readers unfamiliar with the generational improvements of the new Kirin 970 chipset, I recommend referring back to our in-depth review of the chipset published back in January. In terms of advancements, the Kirin 970 brings a new Mali-G72MP12 GPU running at 746MHz, manufactured on a newer TSMC 10nm process. This represented quite an improvement over the 16nm Kirin 960, which featured a Mali-G71MP8 running at up to 1037MHz.

AnandTech        Kirin 970                 Kirin 960
Mfg. Process     TSMC 10FF                 TSMC 16FFC
CPU              4x A73 @ 2.36 GHz         4x A73 @ 2.36 GHz
                 4x A53 @ 1.84 GHz         4x A53 @ 1.84 GHz
GPU              Mali-G72MP12 @ 746 MHz    Mali-G71MP8 @ 1037 MHz
NPU              Yes                       No
Modem            LTE Cat 18/13             LTE Cat 12/13
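As a rough back-of-the-envelope check on what those GPU specifications imply, the sketch below compares the raw shader throughput of the two configurations. It assumes equal per-core, per-clock throughput between the G71 and G72, which is only an approximation (the G72 also carries per-core architectural improvements):

```python
# Raw GPU throughput proxy: shader core count x clock speed (MHz)
kirin_970 = 12 * 746    # Mali-G72MP12 @ 746 MHz
kirin_960 = 8 * 1037    # Mali-G71MP8 @ 1037 MHz

print(f"Kirin 970 vs Kirin 960 raw GPU throughput: {kirin_970 / kirin_960 - 1:+.0%}")
# -> roughly +8%: the 970 goes wider and slower, trading clock speed for efficiency
```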

Furthermore, the Kirin 960's GPU performance and efficiency were extremely problematic, showcasing some of the worst behavior we have seen in any smartphone. We're not going to rehash why this happened here, but it was a competitive blow to the Kirin 960.

Now the Kirin 970 improved on these low figures, as we've shown in our reviews. But the 60% performance and 30% power improvements quoted for GPU Turbo, while they might sound impressive in isolation, are far less so once we know what they are based on. Being measured relative to the poorly performing Kirin 960 completely changes their meaning: users who enable GPU Turbo on their devices will not experience a 60%/30% difference in performance.

This also flies in the face of Huawei's own data presented throughout the lifetime of GPU Turbo. Quoting 60%/30% makes for impressive headlines (regardless of how honest they are), yet even Huawei's own analysis shows that the 60%/30% figures are wildly optimistic:

Ultimately, Huawei presented the 60%/30% figures as a differential between GPU Turbo on and off. Anyone expecting that kind of jump on their device would be sorely disappointed. The fact that the companies obfuscated the crucial comparison point of the Kirin 960 is almost unreal in that respect.

Also on the image above, we have to heavily criticize the bar charts for misrepresenting the gains: the 3 FPS gain in PUBG is drawn as a roughly 25% longer bar. Companies misrepresent growth like this because it makes for a more impressive graph, rather than adhering to the standard practice of starting the axis at zero.
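As a simple illustration of how a truncated axis inflates a small gain, consider the sketch below. The frame rates and axis start are hypothetical placeholders, not Huawei's actual numbers; only the 3 FPS delta is taken from the slide:

```python
# Hypothetical example: a 3 fps uplift on an assumed ~37 fps baseline
before_fps, after_fps = 37.0, 40.0
axis_start = 28.0   # a bar chart axis that starts well above zero

true_gain = after_fps / before_fps - 1                                  # what the user gets
visual_gain = (after_fps - axis_start) / (before_fps - axis_start) - 1  # how much longer the bar looks

print(f"True gain:   {true_gain:.0%}")     # ~8%
print(f"Visual gain: {visual_gain:.0%}")   # ~33% longer bar
```

The further the axis origin creeps toward the baseline value, the larger the apparent difference between the bars, even though the underlying gain is unchanged.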

Why Using The Kirin 960 Is An Issue: Starting With A Low Bar

Going back to our GPU power efficiency tables measured in GFXBench Manhattan 3.1 and T-Rex, we put the two chipsets back into context:

GFXBench Manhattan 3.1 Offscreen Power Efficiency (System Active Power)
Device (SoC)                     Mfg. Process    FPS      Avg. Power (W)    Efficiency
Galaxy S9+ (Snapdragon 845) 10LPP 61.16 5.01 11.99 fps/W
Galaxy S9 (Exynos 9810) 10LPP 46.04 4.08 11.28 fps/W
Galaxy S8 (Snapdragon 835) 10LPE 38.90 3.79 10.26 fps/W
LeEco Le Pro3 (Snapdragon 821) 14LPP 33.04 4.18 7.90 fps/W
Galaxy S7 (Snapdragon 820) 14LPP 30.98 3.98 7.78 fps/W
Huawei Mate 10 (Kirin 970) 10FF 37.66 6.33 5.94 fps/W
Galaxy S8 (Exynos 8895) 10LPE 42.49 7.35 5.78 fps/W
Galaxy S7 (Exynos 8890) 14LPP 29.41 5.95 4.94 fps/W
Meizu PRO 5 (Exynos 7420) 14LPE 14.45 3.47 4.16 fps/W
Nexus 6P (Snapdragon 810 v2.1) 20Soc 21.94 5.44 4.03 fps/W
Huawei Mate 8 (Kirin 950) 16FF+ 10.37 2.75 3.77 fps/W
Huawei Mate 9 (Kirin 960) 16FFC 32.49 8.63 3.77 fps/W
Huawei P9 (Kirin 955) 16FF+ 10.59 2.98 3.55 fps/W
GFXBench T-Rex Offscreen Power Efficiency (System Active Power)
Device (SoC)                     Mfg. Process    FPS      Avg. Power (W)    Efficiency
Galaxy S9+ (Snapdragon 845) 10LPP 150.40 4.42 34.00 fps/W
Galaxy S9 (Exynos 9810) 10LPP 141.91 4.34 32.67 fps/W
Galaxy S8 (Snapdragon 835) 10LPE 108.20 3.45 31.31 fps/W
LeEco Le Pro3 (Snapdragon 821) 14LPP 94.97 3.91 24.26 fps/W
Galaxy S7 (Snapdragon 820) 14LPP 90.59 4.18 21.67 fps/W
Galaxy S8 (Exynos 8895) 10LPE 121.00 5.86 20.65 fps/W
Galaxy S7 (Exynos 8890) 14LPP 87.00 4.70 18.51 fps/W
Huawei Mate 10 (Kirin 970) 10FF 127.25 7.93 16.04 fps/W
Meizu PRO 5 (Exynos 7420) 14LPE 55.67 3.83 14.54 fps/W
Nexus 6P (Snapdragon 810 v2.1) 20Soc 58.97 4.70 12.54 fps/W
Huawei Mate 8 (Kirin 950) 16FF+ 41.69 3.58 11.64 fps/W
Huawei P9 (Kirin 955) 16FF+ 40.42 3.68 10.98 fps/W
Huawei Mate 9 (Kirin 960) 16FFC 99.16 9.51 10.42 fps/W

So while the Kirin 970 is an advancement and improvement over the 960, in the context of the competition it still has trouble keeping up with this generation's Exynos and Snapdragon parts.
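For reference, the efficiency column in the tables above is simply the average frame rate divided by the average system active power. A minimal sketch reproducing two of the Manhattan 3.1 entries:

```python
# Perf/W efficiency = frames per second / average system active power (W)
def efficiency(fps: float, avg_power_w: float) -> float:
    return fps / avg_power_w

# Values taken from the Manhattan 3.1 table above
print(round(efficiency(37.66, 6.33), 2))   # Mate 10 (Kirin 970): ~5.95 fps/W
print(round(efficiency(32.49, 8.63), 2))   # Mate 9 (Kirin 960):  ~3.76 fps/W
# Small differences versus the table come from rounding in the published fps/power values.
```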

The key point I'm trying to make here, in the context of the GPU Turbo claims, is that the 60%/30% figures are very much unrealistic and extremely misleading to users. If Huawei and Honor are not clear about their baseline comparisons, the companies' own numbers cannot be trusted in future announcements.

How Much Does GPU Turbo Actually Provide?

On Friday, Huawei CEO Richard Yu announced the new Kirin 980, and some of the presentation slides addressed the new mechanism, showcasing a more concrete figure for the effect of GPU Turbo on the newly announced chipset:

Here the actual performance improvement is rather minor, because the workload is V-sync capped and the GPU has no trouble reaching that cap; the power improvement, however, should still be representative. The actual power improvement was 10%, which is a far more reasonable and believable gain to attribute to software.
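To put that into perspective, when the frame rate is pinned at the V-sync cap, a power reduction translates directly into lower energy per frame. The sketch below uses hypothetical wattages; only the 10% reduction comes from Huawei's slide:

```python
fps_cap = 60.0                  # V-sync capped frame rate, the same with GPU Turbo on or off
power_off_w = 4.0               # hypothetical average power without GPU Turbo
power_on_w = power_off_w * 0.9  # the slide claims roughly 10% lower power

mj_per_frame_off = power_off_w / fps_cap * 1000   # millijoules per frame
mj_per_frame_on = power_on_w / fps_cap * 1000

saving = 1 - mj_per_frame_on / mj_per_frame_off
print(f"Energy per frame: {mj_per_frame_off:.1f} mJ -> {mj_per_frame_on:.1f} mJ ({saving:.0%} less)")
# -> Energy per frame: 66.7 mJ -> 60.0 mJ (10% less)
```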


64 Comments


  • Ian Cutress - Tuesday, September 4, 2018 - link

    In the past, those 'cheats' were often from not rendering parts of the scene. This is still doing the full render that any Mali GPU does, but in a more power efficient way. The key to benchmarking is to test across several titles regardless, which is going to be important moving forward.
  • Manch - Wednesday, September 5, 2018 - link

    Does Mali or any mobile GPUs do culling of unseen objects? If not, can that be implemented to further reduce load?
  • The Hardcard - Tuesday, September 4, 2018 - link

    That isn't a quandary, it solves the problem. The problem before was that the makers showed benchmark performance that they didn't feel the device could handle in normal user apps. If this pans out and users can have it in everyday apps, it means no harm, no foul.

    Having it be a special mode for apps that can use it, while turning it off when it is not necessary is exactly what is needed and what everyone is trying to do and should do.

    If they do it properly, then it is on the developers to use it. Sure, older, unupdated apps will be left behind. That is the nature of advancing technology.
  • melgross - Tuesday, September 4, 2018 - link

    A benchmark cheat is just for benchmarks. There’s a reason for that, and it has to do with the fact that the SoC, and the device, as a whole, can’t perform at that level commercially, otherwise something negative will happen, such as overheating, and battery failure.

    So, no, they can’t extend cheating to regular apps, and that’s the entire point to the cheat. If they could, then they would, and it wouldn’t be a cheat. This cheating is different from the turbo mode the article is about.
  • s.yu - Monday, September 10, 2018 - link

    The only way this is working is the apparent popularity of MMO games. They only plan on catering to low-end customers who only play whatever "everybody else" plays. I for one avoid them like the plague; IAP-rigged games are cheap stimulation, too cheap.
  • tipoo - Tuesday, September 4, 2018 - link

    Reminds me of the good old ATI vs Nvidia days when there were notable differences in render quality, usually with the edge to ATI. That all but went away at least as far back as the 8800, maybe before. Now for mobile to repeat that process.
  • Ian Cutress - Tuesday, September 4, 2018 - link

    Just to make sure you're aware, that's kind of orthogonal to GPU Turbo. It's Mali behaviour right now, which explains some of the perf differences, but GPU Turbo is something separate.
  • Lord of the Bored - Wednesday, September 5, 2018 - link

    Not ALWAYS to ATI, though. Sometimes they got a little aggressive in their "optimizations" too.
    QUAFF3 NEVER FORGET!

    https://techreport.com/review/3089/how-ati-drivers... for the kiddos that never saw this one. Back when men were men, and PC gaming was the exclusive domain of nerds that knew what IRQ and DMA meant (but probably not PCMCIA. No one could remember PCMCIA).
  • Holliday75 - Friday, September 7, 2018 - link

    I recently found a PCMCIA 10mb NIC in one of my file cabinets and a 28.8k modem. I looked at them a second like wtf then remembered what they were.
  • nils_ - Friday, September 7, 2018 - link

    People Can't Memorize Computer Industry Acronyms
