Conclusions

The Note 3 is an iterative product, that's absolutely true, but the improvements in the Note 3 are pretty dramatic. It really does feel better: thinner and lighter, all while having a bigger, more usable display. The silicon inside is incredibly quick, easily the fastest in the Android camp. It's also good to see Samsung at the forefront of RF technology here, implementing an envelope power tracker alongside Qualcomm's 3rd generation LTE modem. The combination results in a fairly robust, very high-end platform that is modern on both the compute and modem/RF fronts. Given my affinity for the latter, I'm happy.

Battery life benefits from the large chassis and associated battery, as well as Qualcomm's Snapdragon 800 platform, which seems to manage power a lot better than the outgoing Snapdragon 600. I was also impressed by the Galaxy Note 3's IO performance. Although it didn't beat the Moto X in random write IO performance, it came extremely close and absolutely destroyed everything else in sequential write speed. Samsung clearly went all out with the Note 3 and pretty much tried to win all of our tests. The beauty of that approach is that it should lend itself to an awesome user experience.

The S Pen experience continues to improve and I don't really have any major complaints about it on the Note 3. It's a novel addition that I can see resonating very well with the right type of user. Approximating pen/paper is tough and no one has really done a perfect job there, but the S Pen can be good enough in the right situations. The good news is that even if you don't use the S Pen much, it hides away quite unobtrusively and you can go about using the Note 3 just like a large Android device.

There are only three issues I'd like to see addressed with the Note 3. The move to USB 3.0 is interesting and could be a big benefit when it comes to getting large files off of the device (the NAND/eMMC isn't quick enough to make USB 3.0 any faster at putting data on the phone), but the hardware or software implementation of USB 3.0 on the Note 3 doesn't actually deliver any performance advantage (Update: that's under OS X; in Windows you can actually get USB 3.0 working). For whatever reason, 802.11ac performance on the Note 3 wasn't as good as it was on the SGS4 or other 802.11ac devices we've tested. It's not a huge deal, but for an otherwise very well executed device I don't like to see regressions. And finally, I would like to see Android OEMs stop manually overriding DVFS behavior when a benchmark is detected, but that seems to be an industry-wide problem at this point and not something exclusive to the Galaxy Note 3.

Whereas previous Notes felt like a strange alternative to the Galaxy S line, the Galaxy Note 3 feels more like Samsung's actual flagship. It equals the Galaxy S 4 in camera performance, but exceeds it pretty much everywhere else. There's a better SoC, better cellular/RF and even better industrial design. I suppose next year we'll see the Galaxy S 5 play catch up in these areas, but until then it's clear that the Note 3 is the new flagship from Samsung. Although you could argue that the improvements within are incremental, the Note 3 really defines what incremental should be.

302 Comments

  • Nathillien - Tuesday, October 1, 2013 - link

    You whine too much LOL (as many others posting here).
  • vFunct - Tuesday, October 1, 2013 - link

    I agree that it's cheating.

    The results don't represent real-world use. Benchmarks are supposed to represent real-world use.

    Geekbench actually runs real programs, for example.
  • Che - Tuesday, October 1, 2013 - link

    Since when do canned benchmarks really represent real world use?

    I don't have a dog in this fight, but benchmarks are very controlled, tightly scripted, and only give you details on the one thing they are measuring. The only way to gauge real-world performance is by... using said device in the real world for a period of time.

    I care more for his comments on the actual use of the phone; that will tell you more than any benchmark.
  • doobydoo - Saturday, October 19, 2013 - link

    They are meant to be a way of measuring the relative performance that you'll get with real world use.

    Whatever the actual benchmark, provided some element of it is similar to something you'll do on the device, the relative performance of different phones should give you a reasonable indication of how they will perform relative to one another in real-world use.

    The problem is when companies specifically enable 'benchmark boosters' to artificially boost the phone above what is normally possible in real-world use, at which point the relative benchmark scores, which were previously useful, no longer are.
  • darwinosx - Tuesday, October 8, 2013 - link

    So you are a kid that owns a Samsung phone. Yes, it really is that obvious.
  • Spunjji - Tuesday, October 8, 2013 - link

    Handbag.
  • runner50783 - Tuesday, October 1, 2013 - link

    Why is this cheating? It's not as if they are swapping CPUs or anything; the SoC is still running within specification, so get over it.

    What this does is make benchmarks irrelevant, because manufacturers can tweak their kernels just to get better scores that do not reflect daily use.
  • Chillin1248 - Tuesday, October 1, 2013 - link

    No, it is not running under the specification that the consumer will get.

    They raise the thermal headroom and lock the speed to 2.3 GHz (which would normally kill battery life and cause heat issues). Now, if Anand were to test battery life while looping the benchmark tests, it would be fine, as the discrepancy would show up. However, he uses a completely different metric to measure battery life.

    Thus, Samsung is able to artificially inflate only their benchmark scores (the only time the "boost" runs is during specific benchmark programs) while hiding said power usage to get those scores.
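The detection-and-boost behavior described above can be sketched roughly as follows. This is a minimal illustration of the general technique, not Samsung's actual code: the package names, frequency values, and thermal limits here are assumptions chosen for the example.

```python
# Hypothetical sketch of name-based benchmark detection: when a known
# benchmark package comes to the foreground, pin the CPU to its maximum
# frequency and relax the thermal limit. All values are illustrative.

BENCHMARK_PACKAGES = {
    "com.primatelabs.geekbench",
    "com.antutu.ABenchMark",
    "com.glbenchmark.glbenchmark27",
}

def select_governor(foreground_package: str) -> dict:
    """Return a DVFS policy based on which app is in the foreground."""
    if foreground_package in BENCHMARK_PACKAGES:
        # Benchmark detected: lock the max clock and raise the point at
        # which thermal throttling kicks in.
        return {"governor": "performance", "max_freq_mhz": 2300,
                "thermal_limit_c": 90}
    # Ordinary apps get the normal on-demand scaling policy.
    return {"governor": "ondemand", "max_freq_mhz": 2300,
            "thermal_limit_c": 70}

print(select_governor("com.primatelabs.geekbench")["governor"])  # performance
print(select_governor("com.android.chrome")["governor"])         # ondemand
```

Because the policy keys purely off the foreground package name, only whitelisted benchmarks ever see the boosted state, which is why a normal battery-life test never observes its power cost.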
  • vFunct - Tuesday, October 1, 2013 - link

    It's cheating because the results can't be reproduced in the real world by real users.

    Geekbench uses real-world tests, and they need to represent real use.

    Samsung artificially raises the speed of Geekbench so that, for example, its BZip2 compression speeds can't be reproduced when I run BZip2 compression myself.

    Samsung doesn't allow me to run BZip2 as fast as they run it in benchmarks. Samsung gives the benchmarks a cheat to make them run faster than what the regular user would see.
  • bji - Wednesday, October 2, 2013 - link

    You know, you'd think benchmark authors would figure this stuff out and provide a tool to be used with their benchmark to obfuscate the program so that it can't be recognized by cheats like this. Whatever values the cheaters are keying off of when analyzing the program, just make those things totally alterable by the installation tool. If the benchmark program ends up with a randomized name, it is still usable for benchmarking purposes and the cheaters cannot tell it's the benchmark they are trying to cheat on.

    Seriously why do I have to be the one to always think of all of the obvious solutions to these problems!??! Same thing happens at work! lol
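The countermeasure bji describes boils down to randomizing whatever identifier the cheat keys off of. A minimal sketch of the idea, assuming (hypothetically) that the detector matches on the app's package name:

```python
# Sketch of an install-time randomizer: give the benchmark a throwaway
# package name so a name-based whitelist can't recognize it. The naming
# scheme and prefix are assumptions for illustration only.
import random
import string

def randomized_package_name(prefix: str = "com.bench") -> str:
    """Generate a fresh package name unlikely to appear on any whitelist."""
    suffix = "".join(random.choices(string.ascii_lowercase, k=12))
    return f"{prefix}.{suffix}"

# A new name each install, e.g. something like "com.bench.qhzkfwopxjme".
print(randomized_package_name())
```

Scores measured under the randomized name would reflect the device's normal DVFS behavior, so any gap versus the published name would expose the boost.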
