Qualcomm Snapdragon 805 Performance Preview
by Anand Lal Shimpi on May 21, 2014 8:00 PM EST - Posted in
- Tablets
- Snapdragon
- Qualcomm
- Mobile
- SoCs
- Snapdragon 805
GPU Performance
3DMark
3DMark, our first GPU test, doesn't do much to show Adreno 420 in a good light. It isn't the most GPU-intensive test we have, and here we see only marginal increases over Snapdragon 800/Adreno 330. Since performance doesn't really change, I'd be interested to see whether there are any improvements on the power consumption front.
Basemark X 1.1
Basemark X 1.1 starts to show a difference between Adreno 420 and 330. At medium quality settings we see a 25% increase in performance over the Snapdragon 801-based Adreno 330 devices. Move to higher quality settings and the performance advantage increases to over 50%. Here even NVIDIA's Shield, with its fan-cooled Tegra 4, can't outperform the Adreno 420 GPU.
GFXBench 3.0
Manhattan continues to be a very stressful test, but the onscreen results are pretty interesting: Adreno 420 can drive a 2560 x 1440 display at the same frame rate that Adreno 330 could drive a 1080p display.
In an apples-to-apples comparison at the same resolution, Adreno 420 is over 50% faster than Adreno 330. It's also faster than the PowerVR G6430 in the iPad Air.
Once again we see an example where Adreno 420 is able to drive the MDP/T's panel at 2560 x 1440 at the same performance Adreno 330 can deliver at 1080p.
At 1080p, the Adreno 420/S805 advantage grows to 45%.
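To put those resolution-normalized comparisons in perspective, 2560 x 1440 is roughly 78% more pixels per frame than 1920 x 1080, so matching Adreno 330's 1080p frame rate at 1440p implies a comparable increase in effective pixel throughput. A minimal sketch of that arithmetic (the Python below is illustrative, not part of any benchmark):

```python
# Pixel-count arithmetic behind the onscreen comparisons above.
qhd_pixels = 2560 * 1440   # MDP/T panel resolution
fhd_pixels = 1920 * 1080   # typical 1080p phone panel

ratio = qhd_pixels / fhd_pixels
print(f"2560x1440 pushes {ratio:.2f}x the pixels of 1080p "
      f"({(ratio - 1) * 100:.0f}% more per frame)")
# -> 1.78x (~78% more pixels), so equal frame rates at the higher
#    resolution imply ~78% higher effective pixel throughput.
```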
I've included all of the low-level GFXBench tests below if you're interested in digging any deeper. It's interesting that we don't see a big increase in the ALU test, but we do see far larger increases in the alpha blending and fill rate tests.
149 Comments
ams23 - Thursday, May 22, 2014
You just don't get it. Tegra is focused on automotive, embedded, consumer, and gaming products. Mainstream smartphones are NOT a focus now. Tegra will make its way into some high end differentiated smartphone products in the future, but the lion's share outside of Apple and Samsung will go to Qualcomm and MediaTek. Qualcomm is able to attractively bundle their modem with their SoC, and certain large carriers have legacy WCDMA networks that require Qualcomm's modem tech. MediaTek is the lowest cost provider. That's life, and it takes nothing away from Tegra K1, which is still a revolutionary product for the ultra mobile space.
fteoath64 - Saturday, May 24, 2014
QC's lead in mobile chips and their pricing probably account for their leading position until MediaTek and others start chipping away on price and performance. The failure of Tegra 3 shows where the price/performance point was; Nvidia knows that, and it's the reason they ventured into automotive and other products, because those need powerful, higher-power GPU chips as opposed to mobile ones. Except for rendering video in 10-bit, and possibly 120fps video encode, there is no real need for the 805 in a phone. The S5 shows that the 801 is more than capable of all things mobile while still getting acceptable battery life. The K1 is a beast in itself, being able to do vision graphics and VR stuff. Not that the 805 cannot do those things, but the K1 is probably better at them in a competitive price package. Nvidia's Icera 500 modem is not popular either; it has gone through carrier certification yet is hardly in any handsets commercially. Nvidia knows this up front as well.
Alexey291 - Tuesday, May 27, 2014
What's the focus then? Testbed devices? It can be as "revolutionary" as you claim (or more likely it's just a downclocked desktop part). And what sort of a revolution will a device with no OEM wins cause? I mean, we know there are faster parts in the hardware market as a whole. We also know that some of them use 250 watts of power. So why does a part with high power usage and higher performance surprise anyone? :)
Ghost0420 - Wednesday, May 28, 2014
It was NV's first LTE integration attempt. Carrier qualification takes a long time, and since it's the first NV silicon with integrated LTE, it probably took longer. If NV can continue to develop its LTE and not have any IP issues with QC, I'm sure NV would give QC a run for their $$. Think of it this way: QC has been in the game for a while... NV showed up about 5 years ago and was able to give enough competition for TI to leave the phone market (NOT saying NV should take credit for this). And now with the K1 GPU
hahmed330 - Friday, May 23, 2014
Tegra 2 & 3 were both subsidised. Tegra 4 isn't, and it was delayed as well; that's why there were fewer design wins. There's also the fact that it didn't have an integrated modem. Not because integration lowers the power required; integrating an LTE modem doesn't lower power consumption (Apple's iPhone 5s doesn't have an integrated modem). Integrating the modem reduces OEM costs instead.
Ghost0420 - Wednesday, May 28, 2014
More than likely QC has a stranglehold on LTE, as they're not likely to license out their tech to a competitor. They've been in the phone game longer, so OEMs probably have it easier on the software side. QC SD SoCs run hot too, just as hot as any other SoC. I've had Tegra devices and SD devices, and both run at similar temperatures to the touch, except the T4 devices don't lag as much as SD devices (this could be due to stupid TouchWiz).
If they don't get the actual production hardware out there, it doesn't mean much.
ArthurG - Thursday, May 22, 2014
For you: http://en.miui.com/thread-22041-1-1.html
5 hours of heavy 3D gaming on a 6700mAh battery means that TK1 runs at ~3W,
and 11 hours on video,
so excellent numbers when taking performance into account.
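A quick back-of-the-envelope check of those figures (a sketch only; the ~3.7V nominal cell voltage and the ~2W display/system allowance are my assumptions, not ArthurG's):

```python
# Sanity check of the battery-life claim above.
CAPACITY_MAH = 6700
NOMINAL_V = 3.7  # assumed typical Li-ion nominal cell voltage

energy_wh = CAPACITY_MAH / 1000 * NOMINAL_V  # ~24.8 Wh

for scenario, hours in [("heavy 3D gaming", 5), ("video playback", 11)]:
    platform_w = energy_wh / hours
    print(f"{scenario}: ~{platform_w:.1f} W average platform power")

# Gaming works out to ~5.0 W for the whole platform; allowing ~2 W
# for the display and the rest of the system leaves the SoC near
# the ~3 W figure quoted above.
```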
testbug00 - Thursday, May 22, 2014
The GPU is running in the mid-600MHz range (down from the ~950MHz Nvidia touted) and the CPU is certainly also downclocked. Do you have performance numbers for that game? How about speed and power usage on competitor chips? Not enough knowledge to draw large conclusions... Still, it's really odd how Nvidia is not talking about the clockspeeds in the tablet... You'd think they would talk up how highly clocked and efficient their chip is...
kron123456789 - Thursday, May 22, 2014
"The GPU is running in the mid-600 Mhz range" — How do you know that? Where is proof?