Final Words

Qualcomm tends to stagger the introduction of new CPU and GPU IP, and Snapdragon 805 ultimately serves as Qualcomm's introduction vehicle for its Adreno 420 GPU. The performance gains over Adreno 330/Snapdragon 801 can be substantial, particularly at high resolutions and/or higher quality settings. Excluding 3DMark, we saw a 20-50% increase in GPU performance compared to Snapdragon 801. Adreno 420 is a must-have if you want to drive a higher resolution display at the same performance as an Adreno 330/1080p display combination. With OEMs contemplating a move to higher-than-1080p screens in the near term, Snapdragon 805 may make sense for those designs.
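To put that in perspective, here's a quick back-of-the-envelope sketch (ours, not Qualcomm's) of the pixel counts involved. The resolutions are common panel options, and the ratio is simply raw pixels, ignoring any scene- or engine-specific scaling:

```python
# Rough pixel-count math: how much more throughput a GPU needs to hold
# per-pixel performance constant at higher panel resolutions.
# The resolutions here are illustrative panel choices, not specific devices.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K": (3840, 2160),
}

base = resolutions["1080p"][0] * resolutions["1080p"][1]
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.2f} MP, {pixels / base:.2f}x the pixels of 1080p")
```

Driving a 1440p panel takes roughly 1.78x the pixel throughput of 1080p (and 4K takes 4x), so a 20-50% GPU uplift closes much, but not all, of that gap on raw fill rate alone.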

The gains on the CPU side are far more subtle. At best we noted a 6% increase in performance compared to a 2.5GHz Snapdragon 801, but depending on thermal/chassis limitations of shipping devices you may see even less of a difference.

Qualcomm tells us that some of its customers will choose to stay on Snapdragon 801 until the 810 arrives next year, while others will release products based on the 805 in the interim. Based on our results here, if an OEM is looking to specifically target the gaming market, I can see Snapdragon 805 making a lot of sense. For most of the OEMs that just launched Snapdragon 801-based designs, however, I don't know that there's a huge reason to release a refresh in the interim.

I am curious to evaluate the impact of ISP changes as well as dive deeper into 4K capture and H.265 decode, but that will have to wait until we see shipping designs. The other big question is just how power efficient Adreno 420 is compared to Adreno 330. Qualcomm's internal numbers are promising, citing a 20% reduction in power consumption at effectively the same performance in GFXBench's T-Rex HD onscreen test.
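For reference, a 20% power reduction at equal performance translates into a 25% gain in performance per watt; a trivial sketch of that arithmetic (our normalization, not Qualcomm's data):

```python
# Iso-performance comparison: 20% lower power implies 1/0.8 = 1.25x perf/W.
perf = 1.0        # normalized T-Rex HD onscreen score (same for both GPUs)
power_330 = 1.0   # Adreno 330 power, normalized
power_420 = 0.8   # Adreno 420 at 20% lower power (Qualcomm's internal figure)

gain = (perf / power_420) / (perf / power_330) - 1
print(f"perf/W improvement: {gain:.0%}")  # -> 25%
```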

Comments

  • ams23 - Thursday, May 22, 2014 - link

    You just don't get it. Tegra is focused on automotive, embedded, consumer, and gaming products. Mainstream smartphones are NOT a focus now. Tegra will make its way into some high-end differentiated smartphone products in the future, but the lion's share outside of Apple and Samsung will go to Qualcomm and Mediatek. Qualcomm is able to attractively bundle their modem with their SoC, and certain large carriers have legacy WCDMA networks that require Qualcomm's modem tech. Mediatek is the lowest-cost provider. That's life, and it takes nothing away from Tegra K1, which is still a revolutionary product for the ultra mobile space.
  • fteoath64 - Saturday, May 24, 2014 - link

    QC's lead in mobile chips and their pricing probably account for their leading position, at least until MediaTek and others start chipping away at prices and performance. The failure of Tegra 3 shows where the price/performance point was; Nvidia knows that, and it's the reason they ventured into automotive and other products, because those markets need powerful, higher-power GPU chips as opposed to mobile. Except for rendering video in 10-bit, and possibly 120fps video encode, there is no real need for the 805 in a phone. The S5 shows that the 801 is more than capable of all things mobile while still delivering acceptable battery life. The K1 is a beast in itself, being able to do vision graphics and VR stuff. Not that the 805 can't do those things, but the K1 is probably better at them in a competitively priced package. Nvidia's Icera 500 modem isn't popular either; it has gone through carrier certification yet is hardly in any handsets commercially. Nvidia knows this up front, too.
  • Alexey291 - Tuesday, May 27, 2014 - link

    What's the focus then? Testbed devices? It can be as "revolutionary" as you claim (or, more likely, it's just a downclocked desktop part).

    And what sort of revolution will a device with no OEM wins cause? I mean, we know there are faster parts in the hardware market as a whole. We also know that some of them used 250 watts of power. So why does a part with high power usage and higher performance surprise anyone? :)
  • Ghost0420 - Wednesday, May 28, 2014 - link

    It was NV's 1st LTE integration attempt. Carrier qualification takes a long time, and since it's the 1st NV silicon with integrated LTE, it probably took longer. If NV can continue to develop its LTE and not have any IP issues with QC, I'm sure NV would give QC a run for their $$. Think of it this way: QC has been in the game for a while... NV showed up about 5 years ago and was able to give TI enough competition to leave the phone market (NOT saying NV should take credit for this). And now it has the K1 GPU.
  • hahmed330 - Friday, May 23, 2014 - link

    Tegra 2 & 3 were both subsidised. Tegra 4 wasn't, and it was delayed as well; that's why there were fewer design wins. There's also the fact that it didn't have an integrated modem, but not because integration lowers the power required. Integrating an LTE modem doesn't lower power consumption (Apple's iPhone 5s doesn't have an integrated modem); it reduces OEM costs instead.
  • Ghost0420 - Wednesday, May 28, 2014 - link

    More than likely, QC has a stranglehold on LTE, as they're not likely to license out their tech to a competitor. They've been in the phone game longer, so OEMs probably have it easier on the software side. QC SD SoCs run hot too, just as hot as any other SoC. I've had Tegra devices and SD devices, and both run at similar temps to the touch, except the T4 devices don't lag as much as SD devices (this could be due to stupid TouchWiz).
  • Flunk - Thursday, May 22, 2014 - link

    If they don't get the actual production hardware out there, it doesn't mean much.
  • ArthurG - Thursday, May 22, 2014 - link

    For you:
    http://en.miui.com/thread-22041-1-1.html
    5 hours of heavy 3D gaming on a 6700mAh battery means the TK1 runs at ~3W,
    and 11 hours on video,
    so excellent numbers when taking performance into account (see the power-budget sketch after the comments)
  • testbug00 - Thursday, May 22, 2014 - link

    The GPU is running in the mid-600 MHz range (down from the 950MHz or so Nvidia touted), and the CPU is certainly also downclocked.

    Do you have performance numbers for that game? How about speed/power usage on competitor chips? There's not enough knowledge to draw large conclusions... Still, it's really odd how Nvidia is not talking about the clockspeeds in the tablet... You'd think they would talk up how highly clocked and efficient their chip is...
  • kron123456789 - Thursday, May 22, 2014 - link

    "The GPU is running in the mid-600 Mhz range" — How do you know that? Where is proof?
