  • maglito - Tuesday, July 24, 2012 - link

    It would be nice to see MX Player with its FFDshow-based codec packs and/or VLC nightly performance.
    720p (modest bitrate), 1080p (decent bitrate), and full-bitrate 1080p Blu-ray rips à la MakeMKV. Testing I've done on the Nexus 7 and Galaxy Nexus (same guts as the Nexus Q) shows this is a big area of needed improvement in next-gen SoCs.
    Consider testing off of the SD card and also playing back from a NAS over Wi-Fi.
  • ViRGE - Wednesday, July 25, 2012 - link

    If I'm understanding this right, wouldn't this be bypassing all of the hardware acceleration features of the SoC? Why would you possibly want to do that? The whole point of having those features on the SoC is to take a CPU/power-hungry task and run it in fixed function hardware at a much lower power cost.
  • Guspaz - Wednesday, July 25, 2012 - link

    It's a decent test of CPU power, and not every codec is supported by hardware decode. Heck, it's not uncommon to come across H.264 video that uses some feature, level, or profile not supported by the hardware decoder.

    Doing this sort of thing in software on a smartphone would be impractical since it would cause a massive battery life problem, but it's not as big a deal on tablets, where you've got an enormous battery and the biggest power drain is the screen backlight. Qualcomm's test platform here is a tablet as well.

    On the other hand, the OP could also be asking for benchmarks of the hardware decoding support. Not all hardware h.264 decoders are created equal.
  • maglito - Wednesday, July 25, 2012 - link

    Correct. Especially with Qualcomm purchasing the HQV benchmark (Hollywood Quality Video) last year, I was very curious as to how well the new Adreno 320 decodes video.
    This type of testing seems long overdue.
  • maglito - Wednesday, July 25, 2012 - link

    Both MX Player and the VLC nightlies support hardware-accelerated decoding to varying degrees. I'd be interested in either hardware or software results (obviously hardware should have an advantage if properly supported). In my testing on various current SoCs, 720p sometimes works without dropping too many frames (and very often the content will fail/crash too), but nothing can play back decent- or full-bitrate 1080p content at all. This is trying both software- and hardware-accelerated playback methods in various media players.

    With all new high-end tablets being 1080p (technically 1200) and almost every other tablet being 720p (800), asking such a device to play at least 720p content off a NAS without choking shouldn't be too big a deal, but very often they fail. This is an area the press very much overlooks when reviewing these SoCs/tablets, and I'm hoping AnandTech will lead the way in future reviews, as its coverage of such topics in the HTPC articles is excellent.
  • Brian Klug - Thursday, July 26, 2012 - link

    I'm actively looking for more tests - I've played with MX Player before, but does it show any decode stats that we could present, like number of dropped frames, load, etc.? Without that I could only really report a sort of boolean pass/fail and a subjective remark or two.
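    (For what it's worth, even raw dropped-frame counts would go a long way. Here's a purely hypothetical sketch of how they could become a chartable stat; the function name and the 1% threshold are my own invention, not anything MX Player or Android actually reports.)

```python
def playback_report(total_frames, dropped_frames, drop_threshold_pct=1.0):
    """Turn raw frame counts into a chartable stat plus a pass/fail verdict.

    Hypothetical sketch: the 1% dropped-frames threshold is an assumption,
    not a figure from MX Player or any real decode-stats API.
    """
    drop_pct = 100.0 * dropped_frames / total_frames
    return {
        "drop_pct": round(drop_pct, 2),
        "verdict": "pass" if drop_pct <= drop_threshold_pct else "fail",
    }

# e.g. a 2-minute 24 fps clip (2880 frames) with 12 dropped frames
print(playback_report(2880, 12))  # {'drop_pct': 0.42, 'verdict': 'pass'}
```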

  • jleach1 - Monday, August 13, 2012 - link

    Have you seen the tablet reviews over at Tom's Hardware? To test screen response they use some sort of high-speed camera. That'd be hard to acquire, but I don't think it's out of bounds given the quality and depth of the long-form technical reviews here. I'd like to have all the information I require here, but the reality is that the combination of a few sites yields the best results. Still, I get 75% of my information here; a testament to the quality of the writers.

    Nonetheless, a simple 1-10 subjective quality test would be better than nothing. Probably in a few areas, since the software decode (ESPECIALLY in MX Player) has virtually identical quality regardless of the source file (extreme downgrades aside)... pixelation could be shown in screenshots, and color (compared to native playback), frame rate, and overall delight would be useful. We trust you guys, and only you guys, I might add, to make these types of subjective judgments and report back, at least until you become billionaires and make your own app suite / benchmark for Android.
  • lilmoe - Tuesday, July 24, 2012 - link

    I'm confused. Why aren't you guys including Samsung's latest Exynos 4 Quad SoC in the benchmark tests? Exynos WILL be powering Samsung's next tablet, so I guess it's a viable comparison.

    It would be interesting to see how both the A5X and Snapdragon S4 (PowerVR and Adreno 320) perform against Exynos (ARM's Mali-400)... You never included "triangle tests" or "fill rates" in your "brief" preview of the Galaxy S3... A full review is way overdue...
  • teiglin - Wednesday, July 25, 2012 - link

    +1. I'd also like to see Exynos 4210 performance listed alongside the rest of your benchmarks, as its Mali-400 was far ahead of anything else in its generation aside from the A5's SGX 543MP2, to the point that it would be at the very least competitive with Tegra 3. While my own use case of the Galaxy Tab 7.7 is admittedly niche, there are millions of people out there with international/AT&T GS2s, or international Galaxy Notes, who'd probably like to see if Adreno 320 provides enough of a GPU bump to be worthy of consideration.
  • rd_nest - Wednesday, July 25, 2012 - link

    I take AT as the most definitive tech site for reviews. But they seem to ignore most international Samsung products, or are very late in reviewing them (case in point: the international SGS2).

    I can say one thing from experience: the US Galaxy-series models just don't measure up to the international models. I'm not referring to cores, but to the overall smoothness or optimization of the device. The Exynos-based models seem definitely smoother compared to the Qualcomm-based models.

    Samsung has actually made quite good improvements on the driver front for the Mali-400. Check the GLBenchmark Egypt 720p offscreen score for the SGS2 from last year and compare it to today. It was 42 last year; now it's around 65 (same as Tegra 3 or Adreno 225). About a 50% increase in fps in a year purely from driver optimization sounds great to me.
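    (A quick sanity check on that number, using just the two scores quoted above; 42 to 65 fps actually works out closer to a 55% gain:)

```python
# GLBenchmark Egypt 720p offscreen scores quoted above (fps)
old_fps, new_fps = 42, 65
gain_pct = 100.0 * (new_fps - old_fps) / old_fps
print(f"{gain_pct:.1f}% faster")  # 54.8% faster
```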

    I have a feeling that Adreno 320 will have a tough time against Mali-T604. We will see, won't we??
  • MantasPakenas - Wednesday, July 25, 2012 - link

    I was trying to come up with an argument along the lines that maybe it's hard for them to review international models due to "peculiarities" of the US wireless market (those handsets might not be suitable for daily use on their carriers, etc.), or that their need to cater to a US audience makes this too low a priority. But then I remembered they reviewed the Medfield-based Lava Xolo X900, and I can't think of any other reason to justify the discrimination.
  • Stuka87 - Wednesday, July 25, 2012 - link

    As I recall, the X900 was tested by one of the guys in Europe, not the US.

    But my guess is that it comes down to time, and that the bulk of the readers are in the US. That's just a guess, really.
  • ltcommanderdata - Tuesday, July 24, 2012 - link

    I wonder when we will start getting details on OpenGL ES 3.0. I believe it's based on OpenGL 3.2, which should be pretty equivalent to DX10. However, the Adreno 320 only supports DirectX feature level 9_3 which is based on DX9.0b/SM2.0b, not even DX9.0c/SM3.0 much less DX10. It'd be interesting to see the feature differences between OpenGL ES 3.0, DX10/OpenGL 3.x, OpenGL ES 2.0, and DX9/OpenGL 2.x.
  • B3an - Tuesday, July 24, 2012 - link

    I thought DirectX feature level 9_3 was basically DX9.0c, and that 9_2 was equal to DX9.0b? Either way it's disappointing. I hope they upgrade this before it's used in Win8/RT tablets.

    It's confusing how the article says "Adreno 320 is Direct X 11 feature level 9_3" when it's not DX11 at all. Mistake?
  • ltcommanderdata - Tuesday, July 24, 2012 - link

    9_3 only requires VS2.0a and PS2.0b. I believe this resulted from the differences in ATI and nVidia's implementation of SM3.0 where there were arguments over what was and was not required for SM3.0 compliance, notably ATI not implementing vertex shader texture fetch in their X1000 series. So 9_3 is a slightly relaxed SM3.0 or an enhanced SM2.0b encompassing what everyone could agree on.

    "DirectX 11 feature level 9_3" is the correct terminology, since 9_3 is available through the DX11 API, bringing some of the stability and speed improvements of DX11, though obviously not the new features. Presumably true DX9.0c is still available separately (maybe not on Windows RT?) for compatibility.
  • Ryan Smith - Wednesday, July 25, 2012 - link

    My expectation is that we won't get an approachable rundown of features until OpenGL ES 3.0 is finalized. Khronos's operations are largely in the open, so we could put together a list based on draft revisions, but after the craziness that was OpenGL 3.0, it's proven not to be a good idea to assume anything about a new OpenGL standard until that standard is finalized.
  • aryonoco - Tuesday, July 24, 2012 - link

    "It's unclear how much of this performance increase over the dual-core S4 is due to the added cores vs. software optimizations to the MDP/T's browser."

    This keeps coming up in your reviews of Android devices. Considering that going forward I doubt you'll be testing many (any) Android devices running Gingerbread or earlier, Chrome should now be available on every Android device you test. And since the version of Chrome downloaded from the Play Store comes without OEM modifications, you could take this software-optimization variable out of SoC comparisons by switching to Chrome.
  • lilmoe - Tuesday, July 24, 2012 - link


    Or they can use the new Firefox browser and compare results with Chrome... The latest Firefox browser for Android is VERY fast.
  • jleach1 - Monday, August 13, 2012 - link

    I agree, which was odd, because Firefox was GARBAGE before the UI and PX improvements came.

    But an .apk of the Chrome browser would be needed, because update rates are through the roof for Google apps, and a new version would likely be out by the time the next architecture or SoC review comes around.

    Bottom line, Anandtech really can't rely on its own software. Relative performance is important, but looking at a chart of numbers representing a browser that has 2x the JavaScript performance due to an update and comparing it to the old version is a pain.

    It'd also be relevant to have updated, native browser data and benchmarks as well.

    Sorry for the typos, and grammatical errors, I'm typing this on a tablet.
  • tipoo - Wednesday, July 25, 2012 - link

    Agreed, or really any non-default browser. Use the same software to isolate the hardware.
  • MantasPakenas - Wednesday, July 25, 2012 - link

    +1. Thanks for saving me a couple of minutes to write exactly this :) The next commenter also took care of the other non-default browser point.

    The only thing I still wanted to add in support of Chrome is that including anything with Android <4 would be pointless anyway, as it's no longer a question of software optimizations but rather different browsers altogether. Additionally, it's already the default browser on the Nexus 7, and will likely replace the stock browser in the next version of Android anyway.

    Also, they could at least sideload the stock browser .apk (or just use ICS+ from the Play Store, I guess) and skip the excuses...
  • mosmov - Tuesday, July 24, 2012 - link

    You've tested only the Standard configuration, which uses lower-quality settings than the High configuration.
    Since we see over 100 fps in offscreen mode, we could expect the High mode to run as smoothly as the Standard mode.
  • tviceman - Tuesday, July 24, 2012 - link

    When are products with this CPU likely to be out on the market and in stock to buy? It's looking pretty good, but timing is important. I imagine that when they actually start shipping, NVIDIA will strategically start letting Tegra 4 info out.
  • antef - Tuesday, July 24, 2012 - link

    This is neat and all, but I have to wonder how exciting it really is... recent phones are already plenty fast for what people use them for. Is this really that exciting except for looking at benchmarks? What will make use of this except for some non-existent games? I could maybe see future docking scenarios where you use your phone's power for computing on a bigger screen, but honestly, who wants that? It's still going to be full of compromises.
  • Death666Angel - Wednesday, July 25, 2012 - link

    Did you ask those questions when the first quad-core CPUs appeared? Or when SLI/CF was introduced? Developers and manufacturers will always find ways to use the power given to them. If you don't need it, simple: don't buy it.
  • torp - Wednesday, July 25, 2012 - link

    If they cut out 1/4 of this SoC and put it into a phone, will it increase my phone's standby from 2-3 days to 6-7 days? :)
  • torp - Wednesday, July 25, 2012 - link

    Hmm, this is a Big Brother-like moment... I registered just now specifically to post the above comment. At no point did I ever upload a photo, yet this drawing (which is mine, but I'm pretty sure I didn't have it on this particular computer I post from) shows up on my posts. Where did you steal it from, Anandtech?
  • Ryan Smith - Wednesday, July 25, 2012 - link

    You'd have to ask Gravitar - that's where it comes from. I'd have to assume you signed up for their service at some point.
  • torp - Wednesday, July 25, 2012 - link

    Not consciously. Please delete this account from your servers and I'll see about deleting my info from this "gravitar".
  • torp - Wednesday, July 25, 2012 - link

    Oh wait. How do I find them? Nothing at and all I get with Google is some arcade game.
  • Ryan Smith - Wednesday, July 25, 2012 - link

    Sorry, I misspelled that. It's
  • Rocket321 - Wednesday, July 25, 2012 - link

    I think your power conclusion is a bit misleading, or at least should be clarified. In the case of a tablet, things are spot-on. However, you mentioned in the article that this SoC may make its way into high-end phones coupled with an MDM chip to provide the air interfaces. That would likely come at a significant power hit over currently shipping Krait phones, which have the air interfaces integrated.
  • maxsteve - Wednesday, July 25, 2012 - link

    Dear Anand, I've been an avid reader of your website for a long time, but I never felt like posting anything, as you cover almost everything in your reviews.

    This morning, though, I just felt like asking you for a few things. I hope you'll consider them.

    1. I'm looking for reviews of the following:

    Allwinner A10, Cortex-A8, Mali-400 x2 - cost $4 per 1000 pcs
    Amlogic MX3L, dual-core A9, Mali-400 x2 - cost $6 per 1000 pcs
    Rockchip RK3066, dual-core A9, Mali-400 x4 - cost $9 per 1000 pcs
    Freescale i.MX6 quad, Adreno 320 - cost unknown
    Huawei K3 quad - cost unknown
    MTK 6575, A9, integrated cellular, dual SIM dual standby - cost $4 per 1000 pcs
    MTK 6577, dual-core A9, integrated cellular, dual SIM dual standby - cost $7 per 1000 pcs
    Telechips 8925, A8 - cost $3 per 1000 pcs
    Telechips 8965, dual-core A9 - cost $4 per 1000 pcs

    Upcoming by Christmas on real products:

    MTK QUAD 6585 28nm Mali 543mp4
    Allwinner A10II QUAD 32nm Mali450x4
    Amlogic QUAD 32NM Mali450x4
    Rockchip4066 QUAD 32 Mali543mp4

    Huawei +1 series QUAD
    ZTE unnamed QUAD

    The point of listing all this: TIMES HAVE CHANGED.

    Quality processors at cheap prices; the Chinese are doing it well. So they also deserve some mainstream love from websites like this one.

    And I'm sure there are a lot of people out there who want to see benchmarks of these el-cheapos giving the big names a run for their money.

    2. Headers on Anandtech
    CPUs Motherboards SSD/HDD GPUs Mobile IT/Datacenter
    Smartphones Memory Cases/Cooling/PSU(s) Displays Mac Systems Linux Trade Shows Guides

    My apologies, but again, times have changed. Tablets are going mainstream and will soon replace the laptop for the casual crowd. (Not talking about you :))

    My suggestion is to add a "Tablets" header. The time of tablets has come; they have arrived. So please acknowledge it and give them their place with due respect.

    Also, the "Mobile" and "Smartphones" placements should be swapped, as it's easier on the eyes and less confusing to see "Tablets" and "Smartphones" sitting beside each other.

    3. This is an extra, complementary request: please add a "Processors" header under which processors are listed by their features and category. It could also link to their respective benchmarks, etc.

    I know I'm being too greedy, but trust me, all your loyal readers are greedy for more and more info. It's all your fault; you have spoiled them. :)

    Apologies for the off-topic post.

    Have a great day and good luck.
  • barn25 - Wednesday, July 25, 2012 - link

    Time and time again, Anand has shown a disdain for using real benchmarks rather than what the maker provides. It's really sad, actually, because he is clearly an all-around intelligent guy. Sites like X-bit Labs, Real World Technologies, and Phoronix use benchmarks without bending over backwards for Intel or anybody else.
  • Arnulf - Friday, July 27, 2012 - link

    This thing is finally suitable for the non-x86 PCs that IT analysts are predicting will emerge en masse by 2015. Enough horsepower for all desktop work, internet browsing, multimedia, and OUYA-class games. Now if only VIA could get this onto their "VAB" board in place of those lame 800/1000 MHz chips and sell it at a reasonable price...
