Final Words

Qualcomm tends to stagger the introduction of new CPU and GPU IP, and Snapdragon 805 ultimately serves as the introduction vehicle for its Adreno 420 GPU. The performance gains over Adreno 330/Snapdragon 801 can be substantial, particularly at high resolutions and/or higher quality settings. Excluding 3DMark, we saw a 20-50% increase in GPU performance compared to Snapdragon 801. Adreno 420 is a must-have if you want to drive a higher resolution display at the same performance as an Adreno 330/1080p display combination. With OEMs contemplating higher-than-1080p screens in the near term, Snapdragon 805 may make sense for those designs.
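
To put the resolution argument in numbers: 2560x1440 pushes roughly 78% more pixels per frame than 1080p, so GPU throughput has to grow by a similar factor just to hold frame rates steady. A quick sketch of the arithmetic (plain Python, using standard display resolutions):

# Fill-rate demand scales roughly linearly with pixels rendered per frame.
resolutions = {
    "720p":  (1280, 720),
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
}
base = resolutions["1080p"][0] * resolutions["1080p"][1]
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.2f} MP, {pixels / base:.2f}x the pixels of 1080p")
# 1440p works out to ~1.78x the pixels of 1080p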

The gains on the CPU side are far more subtle. At best we noted a 6% increase in performance compared to a 2.5GHz Snapdragon 801, but depending on thermal/chassis limitations of shipping devices you may see even less of a difference.

Qualcomm tells us that some of its customers will choose to stay on Snapdragon 801 until the 810 arrives next year, while others will release products based on the 805 in the interim. Based on our results here, if an OEM is looking to specifically target the gaming market, I can see Snapdragon 805 making a lot of sense. For most OEMs that just launched Snapdragon 801 based designs, however, I don't know that there's a huge reason to release a refresh in the interim.

I am curious to evaluate the impact of the ISP changes and to dive deeper into 4K capture and H.265 decode, but that will have to wait until we see shipping designs. The other big question is just how power efficient Adreno 420 is compared to Adreno 330. Qualcomm's internal numbers are promising, citing a 20% reduction in power consumption at effectively the same performance in GFXBench's T-Rex HD onscreen test.
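
Taken at face value, equal performance at 20% lower power works out to a 1.25x gain in performance per watt. A trivial sketch of that arithmetic (the 20% figure is Qualcomm's internal claim, not our measurement):

# Qualcomm's claim: same GFXBench T-Rex HD onscreen frame rate at 20% lower power.
relative_perf = 1.00     # same frame rate as Adreno 330
relative_power = 0.80    # 20% less power (vendor-supplied figure)
print(f"Implied perf/W improvement: {relative_perf / relative_power:.2f}x")  # 1.25x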

Comments

  • phoenix_rizzen - Friday, May 23, 2014 - link

    Snapdragon S4 Pro didn't have an integrated modem (APQ80xx) and it sold quite well into phones like the LG Optimus G / Nexus 4.
  • jerrylzy - Thursday, May 22, 2014 - link

    The memory interface of Snapdragon 805 should be 2 x 64-bit instead of 4 x 32-bit (see the bandwidth sketch after the comments)...
  • Zaydax - Thursday, May 22, 2014 - link

    Just noticed: First page of the review says 32 nm. Aren't these all 28nm?
    As always, great review!
  • Zaydax - Thursday, May 22, 2014 - link

    Never mind, I totally read that wrong. It said 32-bit...
  • jjj - Thursday, May 22, 2014 - link

    At least it's not just a rebrand like the 801; we want 20nm already!
    On the flip side, really looking forward to the A53 SoCs about to arrive from Qualcomm and Mediatek, the Allwinner A80 quad A15, the Rockchip RK3288 with its quad A17 (not sure it's not A12, we'll see soon I guess) and the quad A17 MediaTek MT6595. The more budget side should be getting a nice perf boost.
    And while we're at it, have you guys heard anything about Intel investing in/collaborating with Rockchip?
  • hahmed330 - Thursday, May 22, 2014 - link

    I actually found very detailed power consumption figures for the Jetson Tegra K1... I am quite surprised how low power it is, and surprised Anand missed it... Here is the link...

    http://developer.download.nvidia.com/embedded/jets...

    Also for even more geeky details including the schematics...

    https://developer.nvidia.com/jetson-tk1-support
  • hahmed330 - Thursday, May 22, 2014 - link

    660 mW at idle... 3660 mW running at iPhone 5S speeds... At full load at 950MHz, 6980 mW...

    These numbers are SoC+DRAM...
  • henriquen - Thursday, May 22, 2014 - link

    "Tegra K1 performance measured on Jetson TK1 platform running LINUX"
  • Ryan Smith - Thursday, May 22, 2014 - link

    Memory bandwidth! Sweet, sweet memory bandwidth!
  • bradleyg5 - Thursday, May 22, 2014 - link

    Hope this chipset doesn't just come on 2560x1440 screens, because the bump in performance isn't going to match the increased demands of that resolution.

    The only test you did that reflects real-world gaming (in my experience) is 3DMark, so it's pretty shocking how poorly it does.

    40% improved performance might as well mean nothing because the current chips rarely even run at full power. So basically the 20% power efficiency bump is the only thing that will actually be realized.

    X-Com will drain my Note 3's entire battery in about 2 1/2 hours when the 800 runs at full power.

    The better shader support is by far the biggest news; DirectX 11 effects are sorely needed, as shaders in games currently look like DirectX 7. So that's a massive leap.

    All the visual fidelity currently comes from running games at very high resolutions with high poly counts. I wonder if in the next generation we'll see developers run games below native resolution; think of what you could do with shaders running at 720p instead of 1440p. Now that you have all the advanced lighting and shadowing effects, why would you want to burn up all your performance on super high resolutions?

    The Xbox One and PS4 don't even run most games at 1080p; it's sort of crazy that mobile games do. You aren't going to get PS3-level graphics at 1440p, but you could at 720p.
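
On the memory-interface question raised in the comments above: whether the bus is described as 2 x 64-bit or 4 x 32-bit, the aggregate width is 128 bits, and peak theoretical bandwidth is just bus width times transfer rate. A quick sketch, assuming Snapdragon 805's published 128-bit LPDDR3-1600 configuration:

# Peak theoretical bandwidth = bus width (in bytes) * transfer rate.
bus_width_bits = 128        # 2 x 64-bit or 4 x 32-bit: 128 bits either way
transfer_rate_mts = 1600    # LPDDR3-1600, i.e. 800MHz DDR
bandwidth_gbs = bus_width_bits / 8 * transfer_rate_mts / 1000
print(f"Peak bandwidth: {bandwidth_gbs:.1f} GB/s")  # 25.6 GB/s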
