GPU Performance

Snapdragon 800 features Qualcomm's Adreno 330 GPU. Qualcomm hasn't stated publicly how Adreno 330 compares to the Adreno 320 featured in Snapdragon 600, but it's almost certainly a larger GPU. The 8974 implementation in LG's G2 clocks the Adreno 330 at a maximum of 450MHz, yet we see better performance than the 450MHz Adreno 320 in Snapdragon 600, lending credibility to the idea that Adreno 330 has more execution resources. There's also an 8974AB variant that includes a 100MHz bump in GPU clock, up to 550MHz.
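As a rough sanity check on what that bin split means, the clock delta alone bounds how much extra GPU performance the AB part can deliver over the standard 8974. Here's a minimal sketch of the arithmetic, assuming throughput scales linearly with GPU clock and nothing else changes; anything beyond roughly 22% would have to come from somewhere other than the clock bump:

```c
#include <stdio.h>

int main(void) {
    /* GPU clocks for the two Snapdragon 800 bins (MHz) */
    const double msm8974_gpu_mhz   = 450.0;  /* standard 8974, as in the G2 */
    const double msm8974ab_gpu_mhz = 550.0;  /* 8974AB bin */

    /* Assumes throughput scales linearly with clock and nothing else
       (memory bandwidth, thermals) gets in the way. */
    double uplift = (msm8974ab_gpu_mhz / msm8974_gpu_mhz - 1.0) * 100.0;
    printf("Theoretical GPU uplift from the AB bin: %.1f%%\n", uplift);  /* ~22.2% */
    return 0;
}
```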

3DMark

3DMark for Android features the Ice Storm benchmark and uses OpenGL ES 2.0. Ice Storm is divided into two graphics tests and a physics test. The first graphics test is geometry heavy, while the second is more pixel shader intensive. The physics test, as you might guess, is CPU bound and multithreaded. The overall score takes into account both the graphics and physics tests. The benchmark is rendered to an offscreen buffer at 720p/1080p and then scaled up to the native resolution of the device being tested. This is very similar to the approach we've seen game developers take to avoid rendering at native resolution on some of the ultra high resolution tablets. The beauty of 3DMark's approach is that all results are comparable regardless of a device's native resolution. The downside is that we don't get a good idea of how some of the ultra high resolution tablets would behave with these workloads running at their native (> 1080p) resolutions.
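Conceptually, the render-offscreen-then-upscale approach looks something like the OpenGL ES 2.0 sketch below. This is only an illustration of the technique, not Futuremark's actual code; it assumes a working EGL context, and draw_scene()/draw_fullscreen_quad() are hypothetical helpers standing in for the benchmark's own rendering.

```c
#include <GLES2/gl2.h>

/* Hypothetical helpers: draw_scene() renders the benchmark content,
   draw_fullscreen_quad() draws a textured quad covering the viewport. */
void draw_scene(void);
void draw_fullscreen_quad(GLuint texture);

void render_frame(int native_w, int native_h)
{
    static GLuint fbo, color_tex, depth_rb;
    const int off_w = 1280, off_h = 720;   /* fixed offscreen resolution */

    if (fbo == 0) {
        /* Color texture the scene is rendered into */
        glGenTextures(1, &color_tex);
        glBindTexture(GL_TEXTURE_2D, color_tex);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, off_w, off_h, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, NULL);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

        /* Depth renderbuffer so the offscreen pass can depth test */
        glGenRenderbuffers(1, &depth_rb);
        glBindRenderbuffer(GL_RENDERBUFFER, depth_rb);
        glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT16, off_w, off_h);

        glGenFramebuffers(1, &fbo);
        glBindFramebuffer(GL_FRAMEBUFFER, fbo);
        glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                               GL_TEXTURE_2D, color_tex, 0);
        glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                                  GL_RENDERBUFFER, depth_rb);
    }

    /* Pass 1: render the workload at 720p regardless of the panel */
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glViewport(0, 0, off_w, off_h);
    draw_scene();

    /* Pass 2: scale the 720p result up to the device's native resolution */
    glBindFramebuffer(GL_FRAMEBUFFER, 0);
    glViewport(0, 0, native_w, native_h);
    draw_fullscreen_quad(color_tex);
}
```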

For these benchmarks we stuck with the default presets (720p, normal quality).

3DMark performance generally fell very close to Qualcomm's MSM8974 MDP/T, with one exception: the CPU bound physics test had the G2 far lower down the list than I would've expected. Given that test is mostly a multithreaded CPU benchmark, it's entirely possible that the G2's thermal/frequency governors are set more conservatively there. The performance gains elsewhere over Snapdragon 600/Adreno 320 are huge, but 3DMark can be heavily influenced by CPU performance, so it's not clear how much of this advantage comes from Adreno 330 and how much from Krait 400.

3DMark - Demo

3DMark - Graphics

3DMark - Graphics Test 1

3DMark - Graphics Test 2

3DMark - Ice Storm

3DMark - Physics

GFXBench 2.7

GFXBench (formerly GLBenchmark) gives us some low level insight into these platforms. As usual, we'll start with the low level tests and move on to the game simulation benchmarks:

GLBenchmark 2.7 - Fill Test

GLBenchmark 2.7 - Fill Test (Offscreen 1080p)

GLBenchmark 2.7 - Triangle Throughput

GLBenchmark 2.7 - Triangle Throughput (Offscreen 1080p)

GLBenchmark 2.7 - Triangle Throughput, Fragment Lit

GLBenchmark 2.7 - Triangle Throughput, Fragment Lit (Offscreen 1080p)

GLBenchmark 2.7 - Triangle Throughput, Vertex Lit

GLBenchmark 2.7 - Triangle Throughput, Vertex Lit (Offscreen 1080p)

The low level tests put the G2 closer in performance to some of the Snapdragon 600 based devices than to the MDP/T, again pointing to early software at work here. The T-Rex HD performance looks pretty good, putting the G2 between the S600 devices and the S800 MDP/T.

GLBenchmark 2.7 - T-Rex HD

GLBenchmark 2.7 - T-Rex HD (Offscreen 1080p)


GLBenchmark 2.7 - Egypt HD

GLBenchmark 2.7 - Egypt HD (Offscreen 1080p)

Basemark X

Basemark X is a new addition to our mobile GPU benchmark suite. There are no low level tests here, just some game simulation tests run at both onscreen (device resolution) and offscreen (1080p, no vsync) settings. The scene complexity is far closer to GLBenchmark 2.7 than to the new 3DMark Ice Storm benchmark, so frame rates are pretty low:

Basemark X - Off Screen

Basemark X performance tracks with what we saw in the GFXBench T-Rex HD test. Performance is clearly higher than on any other device, but not quite up to MDP/T levels. I wonder how much closer the final device will get.

Basemark X - On Screen
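As an aside on why the fixed-resolution offscreen runs above matter: onscreen numbers are heavily influenced by panel resolution, so the same GPU can post very different frame rates on a 720p phone and a 1080p phone. The toy model below illustrates the effect, assuming a purely fill-bound workload; the throughput figure is made up and not representative of any specific device.

```c
#include <stdio.h>

int main(void) {
    /* Toy model: a purely fill-bound GPU that can shade a fixed number of
       pixels per second posts very different onscreen FPS depending on the
       panel, which is why fixed-resolution offscreen runs are needed for
       cross-device comparison. The throughput number is hypothetical. */
    const double pixels_per_second = 60e6;
    const double res[][2] = { {1280, 720}, {1920, 1080} };
    for (int i = 0; i < 2; i++) {
        double fps = pixels_per_second / (res[i][0] * res[i][1]);
        printf("%4.0fx%-4.0f -> %.1f fps\n", res[i][0], res[i][1], fps);
    }
    return 0;
}
```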

Epic Citadel

Epic's Citadel benchmark gives us a good indication of lighter-workload, v-sync limited performance at native resolution. At 1080p, the Snapdragon 800 MDP/T offers over 50% better performance than the Snapdragon 600 based platforms. Granted, we're comparing to smartphones here, so there's some thermal advantage working in the 800's favor.
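For reference, v-sync limited simply means frame delivery is capped at the panel's refresh rate; at 60Hz each frame has roughly a 16.7ms budget, so once a device hits that cap, extra GPU speed shows up as idle headroom rather than higher numbers. A quick back-of-the-envelope calculation of that budget:

```c
#include <stdio.h>

int main(void) {
    const double refresh_hz = 60.0;                  /* typical smartphone panel */
    const double frame_budget_ms = 1000.0 / refresh_hz;
    printf("Frame budget at %.0f Hz: %.2f ms\n", refresh_hz, frame_budget_ms);
    /* A GPU that finishes a frame faster than ~16.67 ms simply waits for the
       next v-sync, so reported FPS flattens out near the refresh rate. */
    return 0;
}
```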

Epic Citadel - Ultra High Quality, 100% Resolution

Comments

  • Krysto - Sunday, September 8, 2013

    Cortex A9 was great efficiency wise, and had better perf/Watt than what Qualcomm had available at the time (S3 Scorpion), but Nvidia still blew it with Tegra 3. So no, that's not the only reason. Nvidia can do certain things like moving to a smaller node, or keeping GPU clock speeds low while adding more GPU cores, and so on, to increase efficiency and performance/Watt. But they aren't doing any of that.
  • UpSpin - Sunday, September 8, 2013

    You mean they could and should have released more iterations of Tegra 3, adding more and more GPU cores to improve at least the graphics performance, rather than waiting for A15 and Tegra 4.

    I never designed an SoC myself :-D so I don't know how hard it is, but I did lots of PCBs, which is practically the same except on a much larger scale :-D If you add some parts you have to increase the die size, thus move other parts on the die around, reroute the stuff, etc. So it's still a lot of work. The main bottleneck of Tegra 3 is memory bandwidth, so adding more GPU cores without addressing the memory bandwidth most probably would not have made any sense.

    They probably expected to ship Tegra 4 SoCs sooner, so they saw no need to release a thoroughly improved Tegra 3 and focused on Tegra 4 instead.

    And if you compare Tegra 4 to Tegra 3, then they did exactly what you wanted, moving to a smaller node, increasing the number of GPU cores, moving to A15 while maintaining the power efficient companion core, increasing bandwidth, ...
  • ESC2000 - Sunday, September 8, 2013

    I wonder whether it is more expensive to pay to license ARM's A9, A15, etc. (I thought they were doing an A12 as well?) or to develop your own core like Qualcomm does. Obviously QCOM isn't starting from scratch every time, but R&D adds up fast.

    This isn't a perfect analogy at all but it makes me think of the difference between being a pharmaceutical company that develops your own products and one that makes generic versions of products someone else has already developed once the patent expires. Of course now in the US many companies that technically make their own products from scratch really just take a compound already invented and tweak it a little bit (isolate the one useful isomer, make the chiral version, etc), knowing that it is likely their modified version will be safe and effective just as the existing drug hopefully is. They still get their patent, which they can extend through various manipulations like testing in new populations right before the patent expires, but the R&D costs are much lower. Consumers therefore get many similar versions of drugs that rely on one mechanism of action (see all the SSRIs) and few other choices if that mechanism does not work for them. Not sure how I got off into that but it is something I care about and now maybe some Anandtech readers will know haha.
  • krumme - Sunday, September 8, 2013

    Great story mate :), I like it.
  • balraj - Saturday, September 7, 2013

    My first comment on Anandtech.
    The review was cool... I'm impressed by the G2's battery life and camera...
    Wish Anandtech could have a UI section.
    Also, can you guys confirm whether LG will support the G2 with at least 2 years of software updates?
    That's gonna be the deciding factor in choosing between the G2 and the Nexus 5 for most of us!
  • Impulses - Saturday, September 7, 2013

    Absolutely nobody can guarantee that; even if an LG exec came out and said so, there's no guarantee they wouldn't change their mind or that a carrier wouldn't delay/block an update... If updates are that important to you, then get a Nexus, end of story.
  • adityasingh - Saturday, September 7, 2013

    @Brian could you verify whether the LG G2 uses Snapdragon 800 MSM8974 or MSM8974AB?

    The "AB" version clocks the CPU at 2.3Ghz, while the standard version tops out at 2.2Ghz.. However you noted in your review that the GPU is clocked at 450Mhz.. If I recall correctly, the "AB" version runs the GPU at 550Mhz.. while the standard is 450Mhz

    So in this case the CPU points to one bin, but the GPU points to another. Can you please confirm?
    Nice "Mini Review" otherwise. Looking forward to the full review soon. Please include a throttling analysis like the one from the Moto X. It would be nice to see how long the clocks stay at 2.3GHz :)
  • Krysto - Sunday, September 8, 2013

    He did mention it's the former, not the latter.
  • neoraiden - Saturday, September 7, 2013

    Brian, could you comment on how the Lumia 1020 compares to a cheap ($150-200) camera? I was impressed by the difference in colour in the video comparison, even if the OIS wasn't the best.

    I currently have a Note 2, but the camera quality in low light conditions is just too bad, and the inability to move apps to my memory card has been annoying. I have an upgrade coming up in January I think, but I might try to change phones before then. I was wondering whether you could comment on whether the Lumia 1020 is worth the jump from Android due to picture quality, or will an HTC One or Nexus 5 (if similar to the G2) suffice? I was considering the Note 3 as I like everything else about it, but it still doesn't have OIS. Or would the Note 3 with a cheap compact be better, even given the inconvenience of having to carry a camera?

    The main day to day use of my phone is news apps, Internet, and email, some of it threaded (which I hear is a problem for Windows Phone).
  • abrahavt - Sunday, September 8, 2013

    I would wait to see what camera the Nexus 5 will have. An alternative is to get the Sony QX100, and you would get great pictures irrespective of the phone.
