Performance Benchmarks

In the world of laptops where I come from, we’re fast reaching the point – if not well beyond it – where talking about raw performance only matters to a small subset of users. Everything with a Core i3 and above is generally “fast enough” that users don’t really notice or care. For tablets, the difference in speed between a budget and a premium device is far more dramatic. I’ve included numbers from the Dell Venue 8, which I’ll be providing a short review of in the near future. While the Venue 8’s price isn’t bad, the two Samsung tablets feel substantially snappier – as they should. We’ll start with the CPU/system benchmarks and then move to the GPU/graphics tests.

[Charts: SunSpider 1.0, Mozilla Kraken (Stock Browser), Google Octane v2, and WebXPRT Overall Score]

In terms of CPU speed, the Apple A7 chips still take the lead in all of our tests, though not always by a large margin. We’re also using different browsers (Chrome vs. Safari), and JavaScript benchmarks aren’t always the greatest way of comparing CPU performance. I ran some extra benchmarks on the two Samsung tablets to see if I could glean additional information; the table below has results for AndeBench, Basemark OS II, and Geekbench 3.

CPU/System Benchmarks of Samsung Galaxy Tab Pro
Benchmark        Subtest          Tab Pro 10.1    Tab Pro 8.4
AndeBench        Native           13804           17533
AndeBench        Java             708             790
Basemark OS II   Overall          865             1062
Basemark OS II   System           1542            1529
Basemark OS II   Memory           419             503
Basemark OS II   Graphics         1032            2555
Basemark OS II   Web              838             724
Geekbench 3      Single           942             910
Geekbench 3      Multi            2692            2847
Geekbench 3      Integer Single   1028            967
Geekbench 3      Integer Multi    3135            3343
Geekbench 3      FP Single        897             848
Geekbench 3      FP Multi         3106            3080
Geekbench 3      Memory Single    860             922
Geekbench 3      Memory Multi     982             1391

Interestingly, the Samsung Exynos 5 Octa 5420 just seems to come up short versus the Snapdragon 800 in most of these tests. However, there’s a bit more going on than you might expect. We checked for cheating in the benchmarks, and found no evidence that either of these tablets was doing anything unusual in terms of boosting clock speeds. What we did find is that the Pro 10.1 is frequently not running at maximum frequency – or anywhere near it – in quite a few of our CPU tests.

Specifically, SunSpider, Kraken, and AndeBench had the Pro 10.1’s cores hitting a maximum of 1.1-1.3GHz, while the Pro 8.4 would typically hit its maximum 2.3GHz clock speed. The result, as you might imagine, is that the 8.4 ends up being faster, sometimes by a sizeable margin. Basemark OS II, Geekbench 3, and AnTuTu on the other hand didn’t show any such odd behavior, with the Pro 10.1 often hitting 1.8-1.9GHz (on one or more cores) during testing, and when that happened the 10.1 often ended up slightly faster than the Pro 8.4.
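
For the curious, this kind of clock behavior is easy to spot-check by polling the cpufreq sysfs nodes over adb while a benchmark runs. Below is a minimal sketch, assuming a device with USB debugging enabled and the standard Linux cpufreq layout; the polling interval and duration are arbitrary values, and it’s meant as an illustration rather than the exact tooling behind these results.

```python
# Minimal sketch: poll per-core CPU frequencies over adb while a benchmark runs.
# Assumes a device with USB debugging enabled and standard Linux cpufreq sysfs
# nodes; the 0.5s interval and 60s duration are arbitrary illustrative values.
import subprocess
import time

def read_core_freqs():
    """Return the current frequency (MHz) of every online core."""
    cmd = ["adb", "shell",
           "cat /sys/devices/system/cpu/cpu*/cpufreq/scaling_cur_freq"]
    out = subprocess.run(cmd, capture_output=True, text=True).stdout
    # scaling_cur_freq reports kHz; convert to MHz for readability
    return [int(tok) // 1000 for tok in out.split() if tok.isdigit()]

if __name__ == "__main__":
    start = time.time()
    while time.time() - start < 60:   # sample for one minute
        freqs = read_core_freqs()
        stamp = time.time() - start
        print(f"{stamp:6.1f}s  " + "  ".join(f"{f:4d}MHz" for f in freqs))
        time.sleep(0.5)
```

Watching the per-core output while something like SunSpider runs is enough to show whether the cores ever approach their rated clocks.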

Which benchmark results are more "valid"? Well, that’s a different subject, but as we’re looking at two Samsung tablets running more or less the same build of Android, we can reasonably compare them and say that the 8.4 has better overall performance. Updated drivers or tweaking of the power governor on the 10.1 might change things down the road, but we can only test what exists right now.
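
If you’re comfortable rooting a tablet, that sort of governor tweaking is also easy to experiment with yourself. Here’s a minimal sketch of the idea, assuming adb access running as root and the standard cpufreq sysfs layout; node paths and available governors vary by kernel, so treat it as illustrative rather than a supported procedure.

```python
# Minimal sketch: report the cpufreq governor each core is using and, on a
# rooted device, pin every core to "performance" so the cores stay at maximum
# clocks for a benchmark run. Assumes adb running as root and the standard
# cpufreq sysfs layout; paths and governor names vary by kernel.
import subprocess

CPUFREQ = "/sys/devices/system/cpu/cpu{n}/cpufreq"

def adb_shell(cmd):
    """Run a single shell command on the attached device and return its output."""
    return subprocess.run(["adb", "shell", cmd],
                          capture_output=True, text=True).stdout.strip()

# Count the CPU directories (cpu0, cpu1, ...) exposed by the kernel.
num_cores = len(adb_shell("ls -d /sys/devices/system/cpu/cpu[0-9]*").split())

for n in range(num_cores):
    path = CPUFREQ.format(n=n)
    print(f"cpu{n}: {adb_shell(f'cat {path}/scaling_governor')}")
    # Writing the governor requires root; without it the write simply fails.
    adb_shell(f"echo performance > {path}/scaling_governor")
```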

Overall, both systems are sufficiently fast for a modern premium tablet, so I wouldn’t worry too much about whether or not you’re getting maximum clock speeds in a few benchmarks – in normal use you likely won’t notice one way or the other. But we’re only talking about the CPU performance when we say you won’t see the difference; let’s move over to the graphics benchmarks to see how the Galaxy Pro tablets fare.

Graphics Performance

[Charts: 3DMark Unlimited (Ice Storm, Physics Score, Graphics Score, Graphics Test 1, Graphics Test 2), Basemark X (Offscreen 1080p and Onscreen), and GLBenchmark 2.7 T-Rex HD (Offscreen and Onscreen)]

Outside of the 3DMark Unlimited Graphics Test 1 result, the Pro 8.4 sweeps the table against its big brother. Playing Angry Birds Go! and other reasonably demanding 3D titles confirms the benchmark results in my experience: the Adreno 330 beats the Mali-T628, end of discussion. I also ran into a few graphics quirks on the Pro 10.1; at one point Plants vs. Zombies 2 stopped rendering fonts properly, and while a reboot fixed the problem, I may have seen one or two other rendering glitches during testing. I have some additional GPU results via GFXBench 3.0 if you’re interested:

Graphics Benchmarks of Samsung Galaxy Tab Pro
Benchmark                Subtest                 Tab Pro 10.1    Tab Pro 8.4
GFXBench 3.0 Onscreen    Manhattan (FPS)         2.9             5.8
GFXBench 3.0 Onscreen    T-Rex (FPS)             14              17.1
GFXBench 3.0 Onscreen    ALU (FPS)               13              59.8
GFXBench 3.0 Onscreen    Alpha Blending (MB/s)   3295            6847
GFXBench 3.0 Onscreen    Fill (MTex/s)           1956            3926
GFXBench 3.0 Offscreen   Manhattan (FPS)         5.5             10.8
GFXBench 3.0 Offscreen   T-Rex (FPS)             22.9            25.9
GFXBench 3.0 Offscreen   ALU (FPS)               25.6            138
GFXBench 3.0 Offscreen   Alpha Blending (MB/s)   3093            7263
GFXBench 3.0 Offscreen   Fill (MTex/s)           1956            3780

The new Manhattan benchmark was one of the other tests where the Pro 10.1 didn’t seem to render things properly, and even then the Pro 8.4 ends up with nearly twice the frame rate. The ALU, Alpha Blending, and Fill rate scores might explain some of what’s going on: in some cases the Pro 8.4 is more than four times as fast. Regardless, if you want maximum frame rates, I’d suggest getting the Pro 8.4 over the 10.1.
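
To put numbers on that gap, here’s a quick sketch that computes the Pro 8.4’s speedup over the Pro 10.1 in each offscreen GFXBench 3.0 subtest, using the figures straight from the table above:

```python
# Speedup of the Galaxy Tab Pro 8.4 over the Pro 10.1 in the offscreen
# GFXBench 3.0 subtests, taken directly from the results table above.
offscreen = {
    # subtest: (Tab Pro 10.1, Tab Pro 8.4)
    "Manhattan (FPS)":       (5.5,   10.8),
    "T-Rex (FPS)":           (22.9,  25.9),
    "ALU (FPS)":             (25.6,  138),
    "Alpha Blending (MB/s)": (3093,  7263),
    "Fill (MTex/s)":         (1956,  3780),
}

for subtest, (pro_101, pro_84) in offscreen.items():
    print(f"{subtest:<22} {pro_84 / pro_101:4.2f}x")
```

The ALU result works out to well over a 5x advantage for the Pro 8.4, with Alpha Blending at a bit over 2x and Fill just under 2x.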

Comments

  • Wilco1 - Monday, March 24, 2014 - link

    What is claimed is that this is CPU performance at maximum frequency, not a latency test of bursty workloads. It would be interesting to see Anand's browsing test reporting both power and performance/latency results, as it seems a reasonable test of actual use. However, SunSpider is not like a real mobile workload.

    The datasets for most of the benchmarks in Geekbench are actually quite large, into the 20-30MByte range. That certainly does not fit into the L2 on any SoC I know of, let alone the L1. So I suggest that Geekbench gives a far better idea of mobile performance than a benchmark that only measures the set of JIT optimization tricks used to get a good SunSpider score.

    Intel doesn't have magic that makes frequency scaling 10-100 times faster - PLLs and voltage regulators all use the same physics (until recently Intel was using the same industry-standard voltage regulators as everybody else). The issue is one of software: the default governor does not recognize repeated patterns of bursty behaviour and keep clocks high for longer when necessary. Intel avoids the Linux governor issues by using a separate microcontroller, and I have no doubt that it has been well tuned to the kind of bursty behaviour that SunSpider exhibits.
  • virtual void - Monday, March 24, 2014 - link

    So you are suggesting that the performance counters in Sandy Bridge are reporting the wrong thing when they report a 97% L1D$ hit rate in Geekbench? They seem to work quite well on "real" programs.

    The performance counters also suggest that Geekbench contains trivial-to-predict branches, while programs developed with dynamic and/or OOP languages usually contain a lot of indirect and even conditional indirect calls that are quite hard to predict. Only the most advanced CPU designs keep history on conditional indirect calls, so a varying branch target on an indirect call will always result in a branch-prediction miss on mobile CPUs.

    The sampling frequency of CPU load, and how aggressively the Linux kernel switches P-states, is based on the reported P-state switch latency. All modern Intel CPUs report a switching latency of 10µs, while I haven't seen any ARM SoC report anything lower than 0.1ms. The _real_ effect of this is that Intel platforms will react about ten times as fast to a sudden burst in CPU load when running the Linux kernel.
  • Wilco1 - Monday, March 24, 2014 - link

    SPEC2006 has a ~96% average L1D hit rate, so do you also claim SPEC has a small working set and runs almost entirely out of L1? The issue is not the correctness of the performance counters but your interpretation of them. The fact that modern CPUs can run at multiple GHz despite DRAM still internally running at ~50MHz is precisely because caches and branch predictors work pretty well.

    C++ and GUI code typically only has a limited number of distinct targets, which are easy to predict on modern mobile CPUs (pretty much any ARM CPU since the Cortex-A8 has had indirect predictors, and since the A15 they support multiple targets). I've never seen conditional indirect calls being emitted by compilers, so I can imagine some CPUs may ignore this case, but it's not in any way hard to predict. The conditional indirect branches you do get in real code are conditional returns (trivial to predict) and switch statements on some ARM compilers.

    Well, if there is such a large difference then there must be a bug - I did once glance over the Samsung cpufreq drivers and they seemed quite a mess. It is essential to sample activity at a high resolution: if you sample at an Nx slower rate then you do indeed react N times slower to a burst of activity, irrespective of how fast the actual frequency/voltage scaling is done.
  • Egg - Monday, March 24, 2014 - link

    Alright, I'll admit I didn't actually read the article. It just seemed you were unaware of what Brian had said previously.
  • UltraWide - Saturday, March 22, 2014 - link

    The Galaxy Note 10.1 2014 has 3GB of RAM.
  • JarredWalton - Sunday, March 23, 2014 - link

    It's not clear if all Note 10.1 2014 models come with 3GB or just the 32GB models, but I'm going to go with 3GB (and hopefully that's correct, considering the cost increase for the Note). I had the Samsung spec pages open when putting together that table, and unfortunately they didn't list RAM for the 16GB 10.1 I was looking at. Weird.
  • Reflex - Saturday, March 22, 2014 - link

    " If you want another option, the Kindle Fire HDX 7” ($200) and Kindle Fire HDX 8.9” ($379) pack similar performance with their Snapdragon 800 SoCs, but the lack of Google Play Services is a pretty massive drawback in my book."

    For many of us that's actually the Kindle line's largest advantage. Android and a good chunk of its app ecosystem, without compromising our privacy and exposing ourselves to all the malware. Plus we got these specs six months ago with the HDX line, and for a lower price in a better package.
  • A5 - Saturday, March 22, 2014 - link

    Yeah, because the best way to avoid malware is to bypass the Play Store and install an APK from a random website to get YouTube to work.

    And you're only fooling yourself if you think Amazon is any better for your privacy than Google.
  • Reflex - Saturday, March 22, 2014 - link

    Have you actually read their privacy policies and compared? Or taken a look at their profit models? There is a significant difference between the two for their approaches to privacy.

    And no, if I really care to get an app like that I can get it from a third party market if I must. There are some that mirror the Play store. But that said, there are very few needs that are not met via apps already available in the Amazon store.
  • R0H1T - Sunday, March 23, 2014 - link

    So you're saying that Amazon has no record of you in their database whatsoever OR that they don't track your browsing history through their Silk browser, using Amazon's own servers, & never target (ads/promos) you based on your buying/browsing history ?

    I'd say you're deluding yourself if you think that Yahoo, Twitter, FB, Bing, or even Amazon are any different than Google when it comes to tracking their users or targeting them with specific ads/promos based on their (recorded) history ):
