
39 Comments


  • RaistlinZ - Thursday, June 19, 2014 - link

    I'm throwing money at my screen but nothing is happening. Reply
  • skiboysteve - Thursday, June 19, 2014 - link

    Hahaha Reply
  • soccerballtux - Friday, June 20, 2014 - link

    try screaming the specs as loudly as you can Reply
  • nunya112 - Friday, June 20, 2014 - link

    haha that's funny :) Reply
  • Frenetic Pony - Friday, June 20, 2014 - link

    I guess that's cool... but it's a small upgrade to an existing phone that's not even a year old, so isn't this just Samsung going "First Post!"? Reply
  • lmcd - Friday, June 20, 2014 - link

    GPU ain't a small upgrade, as the post noted. Reply
  • Gondalf - Friday, June 20, 2014 - link

    Umm, yes, and I am not surprised by this. To my knowledge the Adreno 330 currently throttles down by around 30% under a sustained workload. The most laughable Qualcomm claim is a 20% power reduction on Adreno 420 vs. 330. Obviously it is only marketing because the process is the same (28nm) and all the available tricks to reduce power consumption are already widely adopted in the older Mali GPUs. It's funny, but Qualcomm is the only company in the world for which the laws of physics are not valid. Another boring throttle festival for brainless customers.
    It is strange that AnandTech does not stress the top-tier phones with a sustained GPU bench to reveal all the weaknesses of these absurd tablet-minded SoCs. But is it strange for real? Or are site contact numbers more relevant......
    Reply
  • UpSpin - Friday, June 20, 2014 - link

    You said:
    "Obviously it is only marketing because the process is the same (28nm) and all the available tricks to reduce power consumption are already widely adopted in the older Mali GPUs."

    So according to you, there is no way to further increase efficiency without changing the process?
    So according to you, there's no reason to do further research on GPUs, because they are already as efficient as possible, and all anyone has to do is wait for TSMC or others to finalize a smaller process?

    That's wrong. I don't have a link for GPUs, but here is an article about the improvements across Cortex A9 revisions: http://www.anandtech.com/show/6787/nvidia-tegra-4-...
    Reply
  • pepone1234 - Friday, June 20, 2014 - link

    And he is also forgetting the nvidia maxwell example. Reply
  • Gondalf - Friday, June 20, 2014 - link

    The Maxwell example is only a broader use of techniques that are already very common in mobile CPUs/GPUs. The topic here is not a high-power GPU for PCs but IP blocks that are already heavily optimized for handset SoCs. With Maxwell, Nvidia has only done what it had not yet done, hoping for a faster node shrink. Reply
  • Gondalf - Friday, June 20, 2014 - link

    But yes, it is always possible to get better computational strength at a given clock speed (more IPC or throughput). Still, this has a cost in power consumption, because the units (ALUs, FP blocks, etc.) are more active, and this draws power even if the units are more advanced (better caches, internal buses, fewer bottlenecks). It's a matter of transistor switching: the more transistors that switch on a given process, the higher the power consumption. Obviously better power management can drop the passive power (leakage), but..... these days nearly everything has been done to drop power in mobile SoCs; we are in the era of minor incremental enhancements.
    The real story is that all the dust is hidden by nonexistent TDP figures from Qualcomm and by aggressive throttling after two or three minutes of full load, just in time to make the body of your beloved high-end phone hot (not so high-end after all).
    Reply
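The "throttle after two or three minutes of full load" behavior described above can be sketched with a toy thermal model (every number here is invented purely for illustration, not measured from any real SoC):

```python
def simulate_throttle(seconds, p_full=4.0, p_throttled=2.8,
                      t_ambient=25.0, t_limit=45.0,
                      heat_rate=0.1, cool_rate=0.01):
    """Toy model: temperature integrates heating minus cooling;
    once it crosses t_limit the GPU clock drops ~30% and stays there."""
    temp, clock_pct, history = t_ambient, 100, []
    for _ in range(seconds):
        power = p_full if clock_pct == 100 else p_throttled
        temp += heat_rate * power - cool_rate * (temp - t_ambient)
        if temp >= t_limit:
            clock_pct = 70  # the ~30% sustained-load drop claimed above
        history.append(clock_pct)
    return history

run = simulate_throttle(300)
# with these made-up constants the clock starts at 100% and is
# throttled to 70% after roughly a minute of sustained load
```

With these invented constants the crossing happens around the 70-second mark; the point is only the shape of the behavior, a benchmark that looks fine for the first minute and then settles at a permanently lower clock.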
  • UpSpin - Friday, June 20, 2014 - link

    I doubt you know what you're talking about.
    In benchmarks the units are as active as possible; how on earth can they become more active in a new revision?
    The more transistors switch, the higher the clock speed. But we're talking about performance at a fixed, identical clock speed, so the 'more switching' argument is nonsense.
    Power management has little to do with leakage current. Leakage current is a matter of the process: the smaller the process, the higher the leakage current. To handle this you can power-gate specific parts. Power management is more about how responsively, quickly, and efficiently the GPU can transition between states, so it can ramp up only when needed and stay there only as long as needed.
    Reply
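For reference, the dynamic-power relation the two posters are arguing over is the standard CMOS switching formula, P = α·C·V²·f, where α is the activity factor (fraction of transistors switching per cycle). A minimal sketch, with capacitance, voltage, and activity numbers made up for illustration:

```python
def dynamic_power(alpha, c_eff, volts, hz):
    """Classic CMOS dynamic (switching) power: P = alpha * C * V^2 * f."""
    return alpha * c_eff * volts ** 2 * hz

# Same process, same voltage, same clock -- only the activity factor
# differs, so power scales linearly with how many transistors switch.
p_busy = dynamic_power(alpha=0.5, c_eff=1e-9, volts=0.9, hz=500e6)
p_idle = dynamic_power(alpha=0.1, c_eff=1e-9, volts=0.9, hz=500e6)
```

This captures both halves of the argument: at a fixed clock and voltage, activity still moves dynamic power, but leakage is a separate term that this formula does not model at all.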
  • R0H1T - Friday, June 20, 2014 - link

    "It is strange that AnandTech does not stress the top-tier phones with a sustained GPU bench to reveal all the weaknesses of these absurd tablet-minded SoCs" ~ Just take a wild guess which SoCs will throttle the most; hint: it starts with an I and ends with an L!

    P.S. Sorry for yet another rant (-:
    Reply
  • Homeles - Friday, June 20, 2014 - link

    Do you have any facts to back up your claim? Or are you just being an ignorant shill for AMD, as usual? Reply
  • R0H1T - Friday, June 20, 2014 - link

    Funny you say that, when in fact you also come to the defense of Intel seemingly anywhere and everywhere on this planet. Guess what that makes you? Or is it the usual double standards from your side of the fence? Reply
  • Homeles - Friday, June 20, 2014 - link

    I actually don't. I'll defend anyone, but your head is too far up your own rear to realize this.

    Since you are far too uneducated about hardware in general, the designs that would be most subject to thermal throttling are the ones that produce the most heat.

    AMD would easily be the worst, seeing as they can't even scale their designs down to smartphone levels, and are barely even tablet worthy. They'd be followed by Nvidia and Apple, in no particular order, that have higher power targets than their competition.
    Reply
  • R0H1T - Friday, June 20, 2014 - link

    The fact that you brought up AMD in a topic involving Adreno and Intel is no surprise to me, but there is no AMD SoC (yet) in tablets, so your extrapolation of "fun facts" also tells me your arse is not in the right place. FYI, before Maxwell AMD had the most efficient GPU in the HD 7790. What AMD is doing right now is slowly but surely cutting down APU power consumption with a top-down strategy heavily reliant on node shrinks and GPU optimization, unlike the other major vendors, who've developed SoCs from the bottom up. The new ARM-based AMD cores will show where AMD stands in the league of established players. But hey, carry on ranting wherever I mention the "I" word in any of my posts; since I already know where you and most of the Intel apologists on this site come from, I do take pleasure in winding up the so-called "neutral" fans! Reply
  • 787b - Friday, June 20, 2014 - link

    I second that. Almost every contemporary Android flagship uses a version of Snapdragon 800, but there are packaging differences between models, so the thermals are bound to be different. Sony, for example, uses thermal paste between the SoC and a metal chassis in the Z1 Compact, but I have no idea if it works in the real world.

    I propose a freezer test, similar to the nexus 4 throttling investigation, just with more phones. Also a review of a Sony smartphone would be really nice.
    Reply
  • Morawka - Saturday, June 21, 2014 - link

    The problem is, this SoC IS on a smaller fabrication process (20nm) than all the others you mention. Your argument is entirely invalidated. Reply
  • ruzveh - Friday, June 20, 2014 - link

    It would be interesting to see full HD 1080p display performance on a 20nm SoC. It would either bring more performance or a drastic reduction in power consumption. Frankly, who needs QHD on phones? Reply
  • heffeque - Friday, June 20, 2014 - link

    AFAIK, there are a lot of people on tech sites who claim to have better eyesight than what's scientifically proven to be possible for a human being.

    Science creates these amazing phones, but apparently science has no clue about something as simple as how much pixel density human eyes/brains can distinguish at a certain distance!

    People on tech sites know a lot more about this than scientists, that's for sure.
    Reply
  • UpSpin - Friday, June 20, 2014 - link

    It's less about an urgent desire to have QHD on phones, and more about progress in general.
    Whatever you develop must result in profit in some way. Because smartphone sales are gigantic, you can be sure that anything implemented in smartphones is profitable and can get mass-produced, which results in a lower price per part.

    Just look at the price of MEMS sensors (accelerometers/gyroscopes). Prior to the smartphone era they were priced at >$50 a piece. Thanks to the huge mass production required for smartphones, they are cheap and can be used in tons of less successful products with low quantities (quadcopters, Oculus, watches, ...).

    A smartphone might not need QHD, but the Oculus, for example, needs it. The quantity of Oculus units produced is too small to justify the research and production of such high-density displays. Thanks to QHD smartphones, however, those displays will become affordable.

    So you should be happy if the latest and greatest tech gets integrated into smartphones, even if it looks like overkill or you don't have a use for it right now. Smartphones make this technology affordable and thus make a lot of other great products possible.
    Reply
  • akdj - Saturday, June 21, 2014 - link

    More PPI equals better character recognition as well. Asian countries don't use the alphanumeric system we use in the West (26 easily formed letters, or variations thereof; Spanish, French, German et al.). Cantonese, traditional Chinese, and Japanese 'letters' and words are significantly easier to read (and write) at HiDPI. They're coming; it's all the rage. Weird that Amazon missed the transition from the CES '13 3D craze to '14's HiDPI 'fad'... that one will keep its traction. Reply
  • tuxRoller - Friday, June 20, 2014 - link

    That's wrong. There was even an article on this site that spoke of the theoretical benefits of increasing DPI up to, IIRC, around 600. Beyond that, there is no more advantage to going higher. Reply
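A ceiling in that neighborhood falls out of a back-of-the-envelope visual-acuity calculation: assume a viewer resolves roughly one arcminute of angle (a common rule of thumb for 20/20 vision; the distances below are illustrative), and compute the pixel density at which individual pixels become indistinguishable:

```python
import math

def max_discernible_ppi(viewing_distance_in, arcmin=1.0):
    """Pixels-per-inch beyond which a viewer with the given angular
    resolution (in arcminutes) can no longer resolve individual pixels."""
    pixel_pitch = viewing_distance_in * math.tan(math.radians(arcmin / 60.0))
    return 1.0 / pixel_pitch

ppi_12in = max_discernible_ppi(12)  # ~286 ppi at a typical 12" distance
ppi_6in = max_discernible_ppi(6)    # ~573 ppi with the phone held close
```

Holding a phone unusually close (or assuming better-than-20/20 acuity) is what pushes the useful limit toward the ~600 figure mentioned in the comment above; at arm's length the number is far lower.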
  • Klug4Pres - Friday, June 20, 2014 - link

    "While the CPU revisions are minor, the GPU is fast enough to have the same level of performance at 1440p as an Adreno 330 at 1080p"

    What are you, a mouthpiece for the mobile phone industry?

    You might just as well have said: "In spite of a significant increase in potential graphics performance from Adreno 330, the move to a 1440p screen resolution (from 1080p) completely negates that. On top of that, almost nobody will be able to discern any improvement in the screen from the increased resolution, maximum brightness is lower than it could have been at 1080p, and the screen uses x% more power."
    Reply
  • jlabelle - Friday, June 20, 2014 - link

    I hope that the MacLaren Lumia-Microsoft flagship will not have a QHD screen as I am waiting to upgrade my Lumia 1020 and I find those screen reader with PPI above 400-500 utterly silly. Reply
  • jlabelle - Friday, June 20, 2014 - link

    Screen race I meant. Reply
  • maroon1 - Friday, June 20, 2014 - link

    Why are you assuming that all Android games will run at native resolution?

    Some games, like Record of Agarest War, only support 720p max resolution. Not all games run at native resolution.
    Reply
  • akdj - Saturday, June 21, 2014 - link

    Hmmm. Reading comprehension. You said almost EXACTLY what the author did, in different terms. He could've also said: what's up with the LG G3? 1440p and the 330? Now we've got a GPU that can handle that many pixels without evident stuttering and possibly consistent UI refresh rates @ 60Hz. I'm not following you, other than your attempt at some sort of 'journalism criticism'. Can you link us to your blog? Site? Contributions to technology? Reply
  • Klug4Pres - Tuesday, June 24, 2014 - link

    Is this a reply to me? I don't follow you either. Reply
  • iwod - Friday, June 20, 2014 - link

    And rumors were that Qualcomm was not happy with TSMC due to Apple getting all the 20nm capacity?
    Well, Qualcomm is shipping 20nm parts NOW.
    Reply
  • AndreiLux - Friday, June 20, 2014 - link

    There is no evidence that it's actually made by TSMC. Qualcomm has multi-sourced modems to Samsung in past years. Reply
  • JoshHo - Saturday, June 21, 2014 - link

    The MDM9x35 was announced as the first 20SOC product from Qualcomm. It's a reasonable assumption that these modems are fabricated at TSMC. Reply
  • Notmyusualid - Friday, June 20, 2014 - link

    $1395 USD on a well known auction site.

    Hmm, no thanks.

    Still dying to see that screen though, and see how battery life is.
    Reply
  • ACA777 - Friday, June 20, 2014 - link

    Now that they've baited the early birds into buying the "Galaxy S 4.5", Sammy is bringing out the real S5. Or should we say what the S5 should've been in the first place. Well played Sammy! Reply
  • identity - Friday, June 20, 2014 - link

    Only released in Korea. Learn to read and comprehend. Reply
  • Marthisdil - Friday, June 20, 2014 - link

    Hopefully they will use the same stuff in the Note 4 :) Reply
  • vortmax2 - Friday, June 20, 2014 - link

    This is their Note 4 test bed, kind of. Reply
  • vortmax2 - Friday, June 20, 2014 - link

    Maybe it's an SOC test bed for the Note 4... Reply
