CPU Performance

The original Note I played with was based on Qualcomm’s APQ8060 platform with MDM9200 baseband (the so-called Fusion 2 platform) and was for its time a pretty awesome piece of kit, combining LTE and a dual-core SoC. The Note 2 I played with next was based on Samsung’s own Exynos 4412 SoC with a quad-core Cortex A9 at 1.6 GHz and Mali-400MP4 GPU. For the Note 3, I’m looking at a T-Mobile variant (SM-N900T, if you want to be exact about it), which means it includes a Snapdragon 800 SoC, and Samsung has gone for the 2.3 GHz bin (really 2.265 GHz rounded up). Inside are four Krait 400 CPUs running at up to 2.3 GHz and Adreno 330 graphics at up to 450 MHz, all built on TSMC’s 28nm HPM HKMG process.

I should note that this is MSM8974 and not MSM8974AB, which, oddly enough, one of Qualcomm’s customers has already announced (Xiaomi, for the Mi3); the AB part boosts GPU clocks up to 550 MHz and the LPDDR3 memory interface up to 933 MHz, among a few other changes. I’ve confirmed that GPU clocks on the Note 3 are indeed maxing out at 450 MHz, and quite honestly it’s a bit early for 8974AB in the first place, though it wouldn’t surprise me to see Samsung eventually get that faster bin and put it in something.


I should mention that the Note 3 (like many other Android devices - SGS4, HTC One) detects certain benchmarks and ensures CPU frequencies are running at max while running them, rather than relying on the benchmark workload to organically drive DVFS to those frequencies. The max supported CPU frequency is never exceeded in this process; the platform simply primes itself for running those tests as soon as they're detected. The impact is likely small since most of these tests should drive CPU frequencies to their max state regardless (at least on the CPU side), but I'm going to make it a point to call out this behavior whenever I see it from now on. Make no mistake, this is cheating, plain and simple. It's a stupid cheat that most Android OEMs seem to be ok with and honestly isn't worth the effort.

Update: Of our CPU tests, only AndEBench is affected exclusively by Samsung's optimizations; the performance gain appears to be around 4%. Vellamo is gamed by all of the Snapdragon 800 platforms we have here (ASUS, LG and Samsung). None of this is ok and we want it to stop, but I'm assuming it's not going to. In light of that, we're working with all of the benchmark vendors we use to detect and disable any cheats as we find them. We have renamed versions of nearly all of our benchmarks and will have uniquely named versions of all future benchmarks we use. We'll be repopulating our Bench data where appropriate.
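To make the mechanism concrete, here's a minimal sketch, in Java, of how this kind of benchmark detection generally works: a hardcoded list of benchmark package names is checked against the foreground app, and on a match the CPU governor's minimum frequency is pinned to the maximum supported value. The package names, class name and frequency value below are illustrative assumptions; the actual OEM code lives in vendor framework components and isn't public.

```java
// Hypothetical sketch of an OEM "benchmark booster" - not actual Samsung code.
import java.io.FileWriter;
import java.io.IOException;
import java.util.Arrays;
import java.util.List;

public class BenchmarkBooster {
    // Illustrative whitelist of benchmark package names (assumed, not confirmed)
    private static final List<String> BENCHMARK_PACKAGES = Arrays.asList(
            "com.antutu.ABenchMark",
            "com.quicinc.vellamo",
            "com.eembc.coremark");

    // cpufreq sysfs node for CPU0's minimum DVFS frequency (writes require system privileges)
    private static final String MIN_FREQ_NODE =
            "/sys/devices/system/cpu/cpu0/cpufreq/scaling_min_freq";

    /**
     * Called whenever the foreground app changes. If the package matches a known
     * benchmark, the minimum CPU frequency is raised to the maximum supported
     * value so DVFS never drops below it while the test runs.
     */
    public static void onForegroundAppChanged(String packageName) {
        if (!BENCHMARK_PACKAGES.contains(packageName)) {
            return;
        }
        try (FileWriter writer = new FileWriter(MIN_FREQ_NODE)) {
            writer.write("2265600"); // 2.265 GHz expressed in kHz
        } catch (IOException e) {
            // Needs system privileges; in this sketch we simply ignore the failure.
        }
    }
}
```

This is also why simply renaming a benchmark, as we've done with our renamed versions, defeats this sort of detection: the whitelist no longer matches.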

CPU performance is honestly excellent. The Galaxy Note 3 is more or less the fastest Android smartphone we've tested up to this point. In the situations where we can do cross-platform (OS/browser) comparisons, it isn't quite as fast as the iPhone 5s, but in some cases it comes close.

AndEBench - Java

AndEBench - Native

SunSpider Javascript Benchmark 1.0 - Stock Browser

Google Octane Benchmark v1

Mozilla Kraken Benchmark - 1.1

Browsermark 2.0

Vellamo Benchmark - 2.0

Vellamo Benchmark - 2.0

GPU Performance

Samsung definitely likes to win, and the Galaxy Note 3 walks away with the GPU performance crown in literally every single offscreen test we've got here. The onscreen tests are obviously governed by display resolution, but all things being equal the Note 3 manages to get the edge over the PowerVR G6430 in Apple's iPhone 5s. It's also interesting to note that the Galaxy Note 3 appears to outperform all other Snapdragon 800 smartphones we've tested thus far. There are a couple of potential explanations here. First, the Galaxy Note 3 is using newer drivers than any of the other S800 platforms we've tested:

Note 3: 04.03.00.125.077
Padfone: 04.02.02.050.116
G2: 4.02.02.050.141

Second, it's unclear how much the manual CPU DVFS setting upon benchmark launch is influencing things - although I suspect it's significant in the case of something like 3DMark.

Finally, each manufacturer has the ability to define its own thermal limits and governor behavior; it could simply be that Samsung is a bit more aggressive on this front. We honestly haven't had enough time to dig into exactly what's going on here (Samsung gave us less than a week to review 3 devices), but the end result is some incredibly quick scores for the Note 3. If I had to guess, I'd assume it's actually a combination of all three vectors: drivers, high CPU frequencies and more lenient thermals.

Update: GFXBench 2.7 isn't affected by any optimizations here, but Basemark X and 3DMark are. We expect the Note 3's performance is inflated by somewhere in the 3 - 10% range. We're working on neutralizing this optimization across our entire suite.

GLBenchmark 2.7 - T-Rex HD

GLBenchmark 2.7 - T-Rex HD (Offscreen 1080p)

GLBenchmark 2.7 - Egypt HD

GLBenchmark 2.7 - Egypt HD (Offscreen 1080p)

3DMark Unlimited - Ice Storm

Basemark X - On Screen

Basemark X - Off Screen

Epic Citadel - Ultra High Quality, 100% Resolution

NAND & USB 3.0 Performance

Our Galaxy Note 3 review sample posted some incredible storage performance results, at least compared to all other Android smartphones we've tested. Sequential read and write performance are both class-leading - the latter is nearly 2x better than the next-fastest phone we've tested. Random read performance is decent, but it's random write performance that's surprising. Unlike the Moto X, the Galaxy Note 3 doesn't rely on a flash-friendly file system to get great random write performance - this is raw eMMC horsepower (if you can call ~600 IOPS that). The result isn't quite as good as what you get out of the Moto X, but it comes very close. Android 4.3 should bring FSTRIM support to the Galaxy Note 3, so as long as you remember to leave around 20% of your storage as free space, you should enjoy relatively speedy IO regardless of what you do to the phone.

Sequential Read (256KB) Performance

Sequential Write (256KB) Performance


Random Read (4KB) Performance

Random Write (4KB) Performance
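For a sense of scale, ~600 random write IOPS at a 4KB block size works out to roughly 2.4 MB/s of random write throughput. The sketch below shows, under stated assumptions, how a simple random 4KB write test of this kind can be timed from Java; the file location, test region size and iteration count are arbitrary choices for illustration, not the settings of the benchmark used in our charts.

```java
// Minimal random 4KB write microbenchmark sketch - parameters are illustrative only.
import java.io.File;
import java.io.IOException;
import java.io.RandomAccessFile;
import java.util.Random;

public class RandomWriteBench {
    public static void main(String[] args) throws IOException {
        File testFile = new File("/sdcard/iobench.tmp"); // hypothetical test file location
        final long regionSize = 64L * 1024 * 1024;        // 64 MB test region
        final int blockSize = 4 * 1024;                   // 4 KB blocks, matching the chart above
        final int iterations = 2048;
        final int blocks = (int) (regionSize / blockSize);

        byte[] buffer = new byte[blockSize];
        Random rng = new Random(42);

        // "rwd" requests synchronous writes to the device, so we measure the
        // storage rather than the page cache.
        try (RandomAccessFile raf = new RandomAccessFile(testFile, "rwd")) {
            raf.setLength(regionSize);
            long start = System.nanoTime();
            for (int i = 0; i < iterations; i++) {
                long offset = (long) rng.nextInt(blocks) * blockSize; // aligned random offset
                raf.seek(offset);
                raf.write(buffer);
            }
            double seconds = (System.nanoTime() - start) / 1e9;
            System.out.printf("Random 4KB write: %.0f IOPS (%.2f MB/s)%n",
                    iterations / seconds, iterations * (double) blockSize / seconds / 1e6);
        } finally {
            testFile.delete();
        }
    }
}
```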

The Galaxy Note 3 ships with USB 3.0; unfortunately, at least in its current state, it doesn't seem to get any benefit from the interface. Although the internal eMMC is capable of being read at ~100MB/s, sustained transfers from the device over adb averaged around 30MB/s regardless of whether I connected the Note 3 to a USB 2.0 or 3.0 host.

Update: USB 3.0 does work on the Note 3, but only when connected to a Windows PC with USB 3.0. Doing so brings up a new USB 3.0 entry in the "USB Computer Connection" picker. Ticking it warns you that using USB 3.0 might interfere with calls and data, but then switches over. Transfer speeds are indeed faster in this mode as well, as you'd expect.


The option only appears on Windows; my earlier attempts were on OS X, where this popup never appears.

Comments

  • Nathillien - Tuesday, October 1, 2013 - link

    You whine too much LOL (as many others posting here).
  • vFunct - Tuesday, October 1, 2013 - link

    I agree that it's cheating.

    The results don't represent real-world use. Benchmarks are supposed to represent real-world use.

    Geekbench actually runs real programs, for example.
  • Che - Tuesday, October 1, 2013 - link

    Since when do canned benchmarks really represent real world use?

    I don't have a dog in this fight, but benchmarks are very controlled, tightly scripted, and only give you details on the one thing they are measuring. The only way to define real world performance is by..... Using said device in the real world for a period of time.

    I care more for his comments on the actual use of the phone, this will tell you more than any benchmark.
  • doobydoo - Saturday, October 19, 2013 - link

    They are meant to be a way of measuring the relative performance that you'll get with real-world use.

    Whatever the actual benchmark, provided some element of it is similar to something you'll do on the device, the relative performance of different phones should give you a reasonable indication of how they will perform relative to one another in real-world use.

    The problem is when companies specifically enable 'benchmark boosters' to artificially boost the phone above what is normally possible in real-world use, and thus the relative benchmark scores that were previously useful no longer are.
  • darwinosx - Tuesday, October 8, 2013 - link

    So you are a kid that owns a Samsung phone. Yes, it really is that obvious.
  • Spunjji - Tuesday, October 8, 2013 - link

    Handbag.
  • runner50783 - Tuesday, October 1, 2013 - link

    Why is this cheating? It's not as if they are swapping CPUs or anything; the SoC is still running within specification, so get over it.

    What this does is make benchmarks irrelevant, because manufacturers can tweak their kernels just to get better scores that don't reflect daily use.
  • Chillin1248 - Tuesday, October 1, 2013 - link

    No, it is not running within the specification that the consumer will get.

    They raise the thermal headroom and lock the speed to 2.3 GHz (which would normally kill battery life and cause heat issues). Now, if Anand tested battery life while looping the benchmark tests, it would be fine, as the discrepancy would show up. However, he uses a completely different metric to measure battery life.

    Thus, Samsung is able to artificially inflate only their benchmark scores (the "boost" only runs during specific benchmark programs) while hiding the power usage it takes to get those scores.
  • vFunct - Tuesday, October 1, 2013 - link

    It's cheating because the results can't be reproduced in the real world by real users.

    Geekbench uses real-world tests, and they need to represent real use.

    Samsung artificially raises the speed of Geekbench so that, for example, its BZip2 compress speeds can't be reproduced when I run BZip2 compress myself.

    Samsung doesn't allow me to run BZip2 as fast as they run it in benchmarks. Samsung gives the benchmarks a cheat to make them run faster than what the regular user would see.
  • bji - Wednesday, October 2, 2013 - link

    You know, you'd think benchmark authors would figure this stuff out and provide a tool to be used with their benchmark to obfuscate the program so that it can't be recognized by cheats like this. Whatever values the cheaters are keying off of when analyzing the program, just make those things totally alterable by the installation tool. If the benchmark program ends up with a randomized name, it is still usable for benchmarking purposes and the cheaters cannot tell it's the benchmark they are trying to cheat on.

    Seriously why do I have to be the one to always think of all of the obvious solutions to these problems!??! Same thing happens at work! lol
