Final Words

Qualcomm tends to stagger the introduction of new CPU and GPU IP, and Snapdragon 805 ultimately serves as the company's introduction vehicle for its Adreno 420 GPU. The performance gains over Adreno 330/Snapdragon 801 can be substantial, particularly at high resolutions and/or higher quality settings. Excluding 3DMark, we saw a 20 - 50% increase in GPU performance compared to Snapdragon 801. Adreno 420 is a must-have if you want to drive a higher-resolution display at the same performance as an Adreno 330/1080p display combination. With OEMs contemplating higher-than-1080p resolution screens in the near term, leveraging Snapdragon 805 may make sense there.
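
To put the resolution argument in rough numbers, here's a back-of-the-envelope sketch (the panel resolutions are illustrative assumptions, not devices from this review): it simply compares pixel counts to estimate how much extra GPU throughput is needed to hold per-pixel performance constant above 1080p.

    # Extra GPU throughput needed to keep per-pixel load constant when
    # moving up from a 1080p panel. Resolutions are assumptions for
    # illustration, not tested devices.
    resolutions = {
        "1080p": (1920, 1080),
        "1440p (QHD)": (2560, 1440),
        "2160p (4K)": (3840, 2160),
    }

    base = resolutions["1080p"][0] * resolutions["1080p"][1]

    for name, (w, h) in resolutions.items():
        ratio = (w * h) / base
        print(f"{name}: {w * h:,} pixels, {ratio:.2f}x 1080p "
              f"(~{(ratio - 1) * 100:.0f}% more throughput for the same per-pixel load)")

A QHD panel works out to roughly 1.78x the pixels of 1080p and 4K to 4x, so treat those ratios as the ceiling on the extra GPU work a higher-resolution panel demands; real games won't scale perfectly with pixel count.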

The gains on the CPU side are far more subtle. At best we noted a 6% increase in performance compared to a 2.5GHz Snapdragon 801, but depending on thermal/chassis limitations of shipping devices you may see even less of a difference.

Qualcomm tells us that some of its customers will choose to stay on Snapdragon 801 until the 810 arrives next year, while others will release products based on Snapdragon 805 in the interim. Based on our results here, if an OEM is looking to target the gaming market specifically, I can see Snapdragon 805 making a lot of sense. For most of the OEMs that just launched Snapdragon 801-based designs, however, I don't know that there's a huge reason to release a refresh in the interim.

I am curious to evaluate the impact of the ISP changes, as well as to dive deeper into 4K capture and H.265 decode, but that will have to wait until we see shipping designs. The other big question is just how power-efficient Adreno 420 is compared to Adreno 330. Qualcomm's internal numbers are promising, citing a 20% reduction in power consumption at effectively the same performance in GFXBench's T-Rex HD onscreen test.
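
As a sanity check on what that vendor claim implies (simple arithmetic on Qualcomm's stated figure, not an independent measurement), equal performance at 20% lower power corresponds to roughly a 25% gain in performance per watt:

    # Implied perf/W gain from Qualcomm's claim: same T-Rex HD onscreen
    # performance at 20% less power. Arithmetic on the vendor's number only.
    power_ratio = 0.80   # Adreno 420 power relative to Adreno 330
    perf_ratio = 1.00    # effectively identical performance
    print(f"Implied perf/W improvement: {perf_ratio / power_ratio - 1:.0%}")  # ~25%

We'll obviously want to verify that on shipping hardware.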

Comments

  • akdj - Thursday, May 22, 2014 - link

    "{"Here even NVIDIA's Shield with Tegra 4 cooled by a fan can't outperform the Adreno 420 GPU"}
    Anandtech needs to stop makings dumb NV statements. It's a YEAR old device and won't even be in this race, not to mention it's not getting a ton from the fan anyway which is really there for longevity and temps in the hands for hours.... Stop taking AMD checks to be their portal site, and you can get back to making unbiased reporting without the little BS NV digs."

    Reading. Comprehension. Even YOU, taking the time to quote and post the comment, specifically relating to an OBJECTIVE benchmark. nVidia isn't ON the market! Unless you buy THEIR pad....and that's extremely niche right now..."gaming for hours?" Who does THAT on their phone? Even their tablet?? An hour, ok...I see that. But there's a FAN for a reason. Not to just keep your hands cool. You said it yourself. Longevity. Doesn't that DIRECTLY relate to 'cooling' the SoC? The guts? So it can 'live longer?' It wasn't a dig. It certainly wasn't an AMD stamp, WhateverTH that is. Certainly possible it was I that missed the innuendo there. It's a fact though, bro! nVidia, Intel, AMD...ALL late to the 'mobile game'. Intel has the resources to jump into the fray...head first. nVidia doesn't. They're being very careful while maintaining their, again...slowly but certainly 'niche' dedicated GPU activity. With Intel's iGPU performance envelope and TDP, along with its kinda close association with the CPU ;)...increasing demand for smaller, faster and more portable computing is going to destroy nVidia if the K1 project isn't accepted by more mobile vendors and OEMs. There's a LOT of money in the R&D of these SoCs and to date, the nVidia 'Tegra' solution scared a LOT of OEMs using or considering using their silicon graphically. I think you owe Anand and his crew an apology. I'd be interested as to what your contribution to the world of technology is...it's got to be something incredible! I'm all ears!!! Seriously, for you to disrespect the author of the article as you did...you owe at least an apology. Then, feed the spider. Leave mom's basement. Get a job. Stop playing games all day. And don't 'pick a winner!' You'll NEVER win. It's called gambling. That's why the lights are on in Vegas. It's cool to be a fan of theirs but to post such a silly comment disrespecting one of the MOST respected and intelligent employees, or Anand himself, is bad juju. Take it back. Get off the 'net for a couple of days. Get some sunshine. Good for ya!
  • phoenix_rizzen - Friday, May 23, 2014 - link

    I've played games on my phone (LG G2) for over 4 hours at a time (Puzzle Quest 2 is damned addictive). Once for over 6 hours, although I had to plug it in near the end. :) Anytime I get a new, interesting RPG onto my phone, I'll go through bouts of playing it for 4-6 hours at a time.

    And my daughter has played games on our tablet (2012 Nexus 7) for multiple hours at a time, including some Netflix and local video watching. The battery on that thing tends to only last about 4 hours, though. :(

    Just because YOU can't see a reason to play games on your phone for over an hour doesn't mean nobody does that.
  • Alexey291 - Tuesday, May 27, 2014 - link

    Hear hear - I play my PSP games emulated on the phone these days. The PSP is too much of a pain to carry around (and too old tbh) but some of the old RPGs on it are ossum and yes I can play them for hours on end.
  • kron123456789 - Thursday, May 22, 2014 - link

    Actually, it's 30fps in Manhattan Offscreen))
    http://gfxbench.com/device.jsp?benchmark=gfx30&...
  • sachouba - Thursday, May 22, 2014 - link

    Having amazing scores in benchmarks is good, but Nvidia's SoCs still aren't compatible with a lot of apps...
  • kron123456789 - Thursday, May 22, 2014 - link

    What apps, for example?
  • tviceman - Thursday, May 22, 2014 - link

    So Qualcomm will continue to have the better phone SoC in the 805, while Nvidia will have the better tablet, set-top box, and Chromebook SoC in the TK1.
  • ArthurG - Thursday, May 22, 2014 - link

    Without an integrated modem in the S805, I'm not sure it's better than TK1 for super phones. Let's wait for power consumption figures...
  • testbug00 - Thursday, May 22, 2014 - link

    How many phones used T4 (not T4i, which is a good product!) again? One.

    Nvidia either cannot, or does not offer a compelling solution in phones.

    I would say why, but you would scream "that is not true", as the only evidence is in how OEMs have acted and in design win counts.
  • fivefeet8 - Thursday, May 22, 2014 - link

    Market comparisons aside, the K1 is quite different from Tegra 4 as far as GPU hardware goes. It should have the chance to be used in super phones and mobile devices, if only to put pressure on Qualcomm to fix their OpenGL drivers.
