Mea-Culpa: It Should Have Been Caught Earlier

Section By Andrei Frumusanu

As stated on the previous page, I had initially seen the effects of this behaviour back in January when I was reviewing the Kirin 970 in the Mate 10. The numbers I originally obtained showed worse-than-expected performance from the Mate 10, which was being beaten by the Mate 9. When we discussed the issue with Huawei, they attributed it to a firmware bug and pushed me a newer build which resolved the performance issues. At the time, Huawei never discussed what that 'bug' was, and I didn't push the issue, as performance bugs do happen.

For the Kirin 970 SoC review, I went through my testing and published the article. Later on, in the P20 reviews, I observed the same lower performance again. As Huawei had told me before it was a firmware issue, I had also attributed the bad performance to a similar issue, and expected Huawei to 'fix' the P20 in due course.

Looking back in hindsight, it is pretty obvious there has been some less than honest communication on Huawei's part. The newly detected performance issues were not actually issues; they were the real representation of the SoC's performance. As the results were somewhat lower, and Huawei was insisting that the chip was highly competitive, I never suspected that these numbers were the genuine ones.

It's worth noting here that I naturally test with our custom benchmark versions, as they enable us to gather more data from the tests than just a simple FPS value. It never crossed my mind to test the public versions of the benchmarks to check for any discrepancy in behaviour. Suffice it to say, this will change in our testing in the future, with numbers verified on both versions.
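As an illustration of what that cross-check could look like in practice, here is a minimal sketch; the test names, FPS figures, and 10% tolerance are purely illustrative assumptions and not our actual tooling:

```python
# Hypothetical sketch of a public-vs-custom benchmark cross-check.
# Test names, FPS figures, and the 10% tolerance are illustrative
# assumptions, not actual tooling or data.

TOLERANCE = 0.10  # flag results diverging by more than 10%

def flag_discrepancies(public_scores, custom_scores, tolerance=TOLERANCE):
    """Return tests where the public build scores suspiciously higher
    than the renamed custom build of the same benchmark."""
    flagged = []
    for test, custom_fps in custom_scores.items():
        public_fps = public_scores.get(test)
        if public_fps is None or custom_fps <= 0:
            continue
        delta = (public_fps - custom_fps) / custom_fps
        if delta > tolerance:
            flagged.append((test, public_fps, custom_fps, delta))
    return flagged

# Illustrative example run:
public = {"Manhattan 3.1 Offscreen": 66.0, "T-Rex Offscreen": 127.0}
custom = {"Manhattan 3.1 Offscreen": 35.0, "T-Rex Offscreen": 66.0}
for test, pub, cus, delta in flag_discrepancies(public, custom):
    print(f"{test}: public {pub} fps vs custom {cus} fps (+{delta:.0%})")
```

A divergence well beyond the tolerance on the same device and firmware is exactly the kind of discrepancy that would have flagged this behaviour earlier.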

Analyzing the New Competitive Landscape

With all that being said, our past published results for Kirin 970 devices were mostly correct - we had used a variant of the benchmark that wasn’t detected by Huawei’s firmware. There is one exception however, as we weren't using a custom version of 3DMark at the time. I’ve now re-tested 3DMark, and updated the corresponding figures in past reviews to reflect the correct peak and sustained performance figures.

As far as I could tell in my testing, the cheating behaviour has only been introduced in this year’s devices. Phones such as the Mate 9 and P10 were not affected. If I’m to be more precise, it seems that only EMUI 8.0 and newer devices are affected. Based on our discussions with Huawei, we were told that this was purely a software implementation, which also corroborates our findings.
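For readers wondering what a purely software-side mechanism can look like, the sketch below shows the general shape of app-name-based detection driving a power policy. The package names, policy names, and the foreground-app hook are hypothetical illustrations, not Huawei's actual implementation:

```python
# Hypothetical illustration of benchmark detection in a power policy.
# Package names, policy names, and the hook are assumptions for
# illustration only; this is not Huawei's actual code.

BENCHMARK_WHITELIST = {
    "com.example.gfxbench.public",  # hypothetical public benchmark build
    "com.example.3dmark.public",    # hypothetical public benchmark build
}

def select_power_policy(foreground_package: str) -> str:
    """Choose a power/thermal policy based on the foreground app."""
    if foreground_package in BENCHMARK_WHITELIST:
        # A recognised benchmark: lift the usual power and thermal caps
        # so the SoC runs at an unsustainable peak for the test's duration.
        return "unconstrained_performance"
    # Anything else, including a renamed benchmark build or a real game,
    # stays on the default, thermally constrained policy.
    return "default"
```

The crux is that a renamed custom build of the same benchmark falls through to the default policy, which is why our own versions showed the lower, genuine numbers.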

Here is the competitive landscape across our whole mobile GPU performance suite, with updated figures where applicable. We are also including new figures for the Honor Play, as well as the newly introduced GFXBench 5.0 Aztec tests across all of our recent devices:

3DMark Sling Shot 3.1 Extreme Unlimited - Graphics 

3DMark Sling Shot 3.1 Extreme Unlimited - Physics 

GFXBench Aztec Ruins - High - Vulkan/Metal - Off-screen

GFXBench Aztec Ruins - Normal - Vulkan/Metal - Off-screen

GFXBench Manhattan 3.1 Off-screen 

GFXBench T-Rex 2.7 Off-screen

Overall, the graphs are very much self-explanatory. The Kirin 960 and Kirin 970 are lacking in both performance and efficiency compared to almost every device in our small test here. This is something Huawei is hoping to address with the Kirin 980, and with features such as GPU Turbo.

Comments

  • Andrei Frumusanu - Tuesday, September 4, 2018 - link

    As noted, the Aztec labels have been fixed.
  • eastcoast_pete - Tuesday, September 4, 2018 - link

    @Andrei and Ian: I also wonder how the apparent superiority of highly customized GPUs in Apple and QC SoCs reflects a conundrum that ARM faces with their Mali designs: Basically, Mali GPUs have to work in configurations ranging from as few as 1-2 to as many as 20 (or 24) units and with widely varying CPU cores (dual-core A53 or A55 at the low end to now A76 or Mongoose octacores). In contrast, Apple GPUs and QC's Adrenos appear to be a lot more "tailored" to the SoCs they end up in, which, together with optimized drivers, probably gives them a leg up. Andrei, as you are truly a seasoned expert in this field, I wonder if you could comment on this, maybe even in some detail in a dedicated article in the future?
  • mekpro - Wednesday, September 5, 2018 - link

    Qualcomm also needs to ship smaller versions of Adreno for their lower-end SoCs like the Snapdragon 625. While Anandtech hasn't covered the efficiency of lower-end Snapdragon SoCs, my experience confirmed that the Snapdragon 625 had much better GPU efficiency than the Kirin 960 (lower temp at the same frame rate).
  • vladx - Tuesday, September 4, 2018 - link

    What about Honor 9, is it affected after upgrading to EMUI 8?
  • nfriedly - Tuesday, September 4, 2018 - link

    > So Honor is trying to promote the Honor Play as a gaming-centric phone, making bold marketing claims about its performance and experience. This is a quite courageous marketing strategy given the fact that the SoC powering the phone is currently the worst of its generation when it comes to gaming.

    Do you mean "courageous" in the Apple sense?
  • V900 - Tuesday, September 4, 2018 - link

    I’m shocked and stunned, nay SHOCKED AND STUNNED that Chinese Smartphone/SOC vendors have been caught cheating in benchmarks.

    Such a surprising development from a country that’s known for being the high watermark in ethical and honest business practices.

    (Industrial espionage and stealing stealth fighter designs through hacking notwithstanding.)
  • Allan_Hundeboll - Tuesday, September 4, 2018 - link

    I know OnePlus used to do something similar, but they did it for benchmarks and games, which isn't cheating in my book, because end users do get the performance shown in benchmarks.
    Is OnePlus still doing this? The OnePlus 6 seems to benchmark faster than other SD845-based phones, so this would kind of explain how.
  • zodiacfml - Tuesday, September 4, 2018 - link

    Is this click bait or something?
    One cannot simply add more power/speed to a device without it throttling or shutting down. If one runs a gaming benchmark for more than 10 minutes, it levels out. It is a different story with browser benchmarks, where burst speeds for a few seconds are valuable.

    It would have been cheating if the phone started dimming or turning off the display during a benchmark run.

    I wouldn't put much value on battery efficiency when running games or benchmarks, because it will be difficult to regulate. Everyone will have differing opinions about it.
    When you turbo/boost (any chip), it is the least efficient anyway.
  • unixfg - Wednesday, September 5, 2018 - link

    No, not Huawei.

    https://www.engadget.com/2018/08/20/huawei-caught-...
  • mekpro - Wednesday, September 5, 2018 - link

    I always need extra cooling in order to make PUBG playable on my poor Mate 9.
    Sometimes I use a wet tissue pasted on the back of the chassis, and sometimes I just put ice on it and let it melt to cool the phone down (which usually takes less than 3 minutes to melt completely). Yes, I know this phone is not water-resistance certified, but come on! It's just a hot potato that needs to be cooled, and I don't expect a flagship phone to behave this way. The irony is that I bought a Xiaomi Redmi Note 4X with a Snapdragon 625 for $150, and that device plays PUBG much better than Huawei's flagship.

    After reading this article, I choose not to believe Huawei's marketing anymore. I don't believe that the Kirin 980 can close the GPU performance gap with the Snapdragon 845, let alone deliver acceptable performance.
