33 Comments

  • blueboy11 - Saturday, January 28, 2012 - link

    I give it until Q4 2013 before we see true "console" gaming on a phone, but my thing would be a wireless controller along with output to an HDTV (tablets already kinda do this, I know, but with real local gaming, not streaming like OnLive). Then I can safely say we'll see performance like a high-end netbook, but definitely not like a Core i7 notebook. I'm just glad the tech has advanced, albeit very slowly of course.
  • ssj4Gogeta - Monday, January 30, 2012 - link

    The fact that smartphone SoCs are approaching console graphics performance just tells you how outdated current consoles are.
  • Shadowmaster625 - Monday, January 30, 2012 - link

    There is no point in having console-level graphics in a phone, because it requires the thing to be plugged into a wall anyway, lest you drain your battery in an hour. What good is a phone with a dead battery? I never could get a straight answer to that. So if you're going to be tethered to a wall socket, why not just play games on a frickin console?
  • freezervv - Monday, January 30, 2012 - link

    Because you're doing one of two things:

    A) Driving "console-level graphics" on a cell phone screen
    In which case your resolution is relatively minimal, so it's not outside of the realm of possibility that power efficiency will rise far enough to make this doable with current battery technology. In any case, it's not as though electrical sockets are as rare as hydrogen stations. ;)

    B) Driving "console-level graphics" on an HD television
    In which case your cell phone is already plugged in. Power delivery is simply a matter of connectors.

    Not seeing an insurmountable problem here...
  • B3an - Saturday, January 28, 2012 - link

    The Adreno 225 still isn't good enough to surpass the PowerVR SGX543MP. I know this benchmark was running at a slightly higher res than the Galaxy S2's Mali-400, but this still won't be good enough to even match the SGX543MP. Also keep in mind that the CPU in the S4 is faster than the A5 in Apple's toys, and this GPU still fails. Not good enough IMO. Let's just hope it's only this single benchmark it does poorly in.
  • jrocks84 - Saturday, January 28, 2012 - link

    They need to get real-world performance closer to theoretical. The theoretical processing power of the Adreno GPUs is very high, but they lack efficiency because of their VLIW architecture, derived from AMD/ATI's GPUs. This is the last Qualcomm GPU based on the current architecture though, so hopefully they'll be better soon.
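A rough way to see the theoretical-vs-real gap described above (all numbers are hypothetical, not actual Adreno specs):

```python
# VLIW GPUs advertise peak throughput assuming every slot of every
# instruction word is filled each cycle. Real shader code rarely
# fills them all, so achieved throughput falls short of theoretical.
# Slot counts and GFLOPS below are illustrative only.

def achieved_gflops(peak_gflops, slots_per_word, avg_slots_filled):
    """Achieved throughput given average VLIW slot occupancy."""
    utilization = avg_slots_filled / slots_per_word
    return peak_gflops * utilization

# A hypothetical 5-wide VLIW part averaging 3 filled slots delivers
# only 60% of its headline number:
assert achieved_gflops(25.0, 5, 3) == 15.0
```

The point of the sketch: peak figures assume full slot occupancy, which shader compilers can rarely extract from real workloads.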
  • MrMilli - Monday, January 30, 2012 - link

    While Adreno is indeed VLIW-based, it's not ATI-based but was made by BitBoys.
  • zorxd - Saturday, January 28, 2012 - link

    Most people do not care about GPU performance on phones at this point, so I think Qualcomm will aim for the 99% of the market that isn't made up of hardcore mobile gamers.

    Just like on PCs, many people are fine with an integrated GPU or a low end add-on video card.
  • B3an - Sunday, January 29, 2012 - link

    It's not just about games. The whole UI is GPU accelerated on Android 4.0 (finally). These future SoCs will also be driving 1080p tablets, and possibly even higher resolutions late this year, like 2560x1600.

    And many people do care about the GPU. Even if they don't know much about the technical side, they will certainly care when they can't run even relatively simple games on their new 1080p tablet.
  • Exodite - Sunday, January 29, 2012 - link

    2D acceleration is trivial though.

    My 20-year-old Amiga 1200 can handle that perfectly fine, even at monstrous resolutions; I'm pretty sure it won't be an issue for any modern SoC.
  • B3an - Sunday, January 29, 2012 - link

    Look at Android on Tegra 2: it's a laggy mess. Anand has said a few times that he believes it's partly down to the GPU. And that's with just 1280x720.
  • Alexvrb - Sunday, January 29, 2012 - link

    The hardware is only half the equation. Look at the shoddy Adreno 200 in first-gen WP7 devices: those phones are smooth as butter, and the Adreno 200 is a DOG. Future revisions of Android (especially vanilla, non-carrier-mangled setups) should be decent on the UI front regardless of whether they have the most powerful GPU, as we're hitting the "good enough" point.

    Although in terms of mobile graphics for gaming purposes, there's a reason Sony went with a decently clocked SGX543MP4 with dedicated VRAM: it smokes anything Nvidia could show them at the time they were designing it, especially for the power/thermal envelope they were targeting. I look forward to seeing what Series 6-powered chips can do.
  • zorxd - Monday, January 30, 2012 - link

    Exactly.
    Also note the crappy SGX535 in the iPhone 4. Much slower than a Tegra 2.
  • B3an - Sunday, January 29, 2012 - link

    BTW, I seriously doubt your Amiga can handle 1080p at a smooth 60 FPS (which a UI needs to be). No chance.
  • mwarner1 - Monday, January 30, 2012 - link

    While I am a big Amiga fan, AGA's highest resolution is 1440x580, and that's interlaced. Impressive for its time, but not exactly high resolution in this era.
  • zorxd - Sunday, January 29, 2012 - link

    Current GPUs are fast enough for the GUI, just like integrated graphics are fast enough for Windows Aero.
  • shompa - Sunday, January 29, 2012 - link

    The GPU will become more and more important, since Android 4's UI is accelerated with OpenGL.

    One of the reasons Apple's A-class SoCs are so much faster in real-world testing is that Apple can accelerate its OS and programs with NEON SIMD extensions. Android's "open" approach makes this impossible, since some hardware doesn't even have NEON. Their workaround is OpenGL.

    Apple has 10+ years of experience accelerating its OS and programs with Quartz Extreme and AltiVec. Instead of using brute force, they like "elegance". SIMD is always faster than brute-force CPU code.
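The SIMD idea being argued about here can be shown with a toy sketch (conceptual Python only, not real NEON/AltiVec code; the lane width and function names are made up):

```python
# Toy illustration of SIMD: instead of one multiply per element, a
# SIMD unit applies the same operation to a whole lane group (here 4
# elements) per "instruction". Conceptual sketch, not real intrinsics.

LANES = 4  # e.g. four 32-bit values per 128-bit vector register

def scale_scalar(values, gain):
    # One multiply at a time: the "brute force" CPU loop.
    return [v * gain for v in values]

def scale_simd_style(values, gain):
    # Process LANES elements per step, like one vector instruction.
    out = []
    for i in range(0, len(values), LANES):
        chunk = values[i:i + LANES]          # "load" a vector register
        out.extend(v * gain for v in chunk)  # one "vector multiply"
    return out

# Same answer either way; the win is fewer instructions, not a
# different result.
assert scale_simd_style([1, 2, 3, 4, 5], 3) == scale_scalar([1, 2, 3, 4, 5], 3)
```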
  • zorxd - Monday, January 30, 2012 - link

    SIMD isn't "always faster" (not all programs can be SIMD'd), and it's even less relevant when you start moving your stuff to the GPU. The GPU is a much larger SIMD unit.
  • metafor - Tuesday, January 31, 2012 - link

    AltiVec (and the SIMD instructions Quartz used) are CPU features...
  • Sttm - Saturday, January 28, 2012 - link

    I have been trying to play GTA 3 and some other games on my Droid, but using just a touchscreen to play is awful. I don't know if there's a way to make touchscreen controls for a full-3D, player-controlled-movement game actually fun. So while GPU power might double or triple, the types of games that play well on touchscreen hardware might not change. Maybe the Xperia Play successor could fix this a bit, or Microsoft could get an Xbox WP8 phone out there with triggers and joysticks.
  • B3an - Sunday, January 29, 2012 - link

    Android already supports many gamepads, and some phones have HDMI out. The upcoming 720p phones will also output at 720p, which is what most X360 and PS3 games run at. All that's needed is for a developer to support a gamepad, and that's it.
  • Ilfirin - Sunday, January 29, 2012 - link

    You're seriously going to carry around a gamepad with your phone everywhere?
  • Solidstate89 - Saturday, January 28, 2012 - link

    I believe that by the time a portable media device such as a phone or tablet has "console quality graphics", we'll already be on the next generation of consoles, making the comment I keep hearing every time there's a new generation of mobile GPUs quite moot.

    And of course I don't see it ever reaching PC-graphics levels, that's for sure.
  • MonkeyPaw - Saturday, January 28, 2012 - link

    Well, that's a bit of a silly comment. Of course it won't match PC-quality graphics (I assume you mean high end); we're talking about hundreds of watts consumed by the GPU alone on the desktop. The comparison to consoles is valid, as millions deem console graphics acceptable, based on their success alone. It also makes for a good comparison, since a benchmark doesn't really say much to the average person.

    What would be rather cool is a 10" tablet syncing up with a wireless 360 controller. If you could play full-blown console games on the go, that would be impressive and appealing. That's the thing, it takes real games to leverage the value of this. If all we have to play are the simple $2 touch apps of today, all this graphics power is meaningless.
  • abhaxus - Sunday, January 29, 2012 - link

    I agree about needing better games. That's part of the reason I bought GTA for Android. I really have very little desire to play it, but it would be nice to see more big-name games coming to Android.
  • tipoo - Sunday, January 29, 2012 - link

    That's exactly it. The chips in these tablets and phones should in theory easily rival last-generation consoles, and close in on the current generation in 2-3 years, but as long as app prices are capped we will never see a well-funded game come out for them. It's all about whether the developer thinks they can comfortably make back the development cost, and going from charging 60 dollars for a new console game to a max of what, 6 or 7 dollars on Android, or less on iOS, means big sacrifices.
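The pricing gap described here is easy to put in numbers (a back-of-the-envelope sketch; the development budget is hypothetical, and a flat 30% store cut is assumed for both platforms):

```python
# Break-even math for the console-vs-mobile pricing gap. All figures
# are illustrative, not real sales data.

def break_even_units(dev_cost, price, store_cut=0.30):
    """Units needed to recoup dev_cost at a given sticker price,
    after the store/platform takes its cut."""
    net_per_unit = price * (1 - store_cut)
    return dev_cost / net_per_unit

# A hypothetical $20M console-class budget at $60 retail vs. $6
# mobile pricing:
console_units = break_even_units(20_000_000, 60)  # ~476k units
mobile_units = break_even_units(20_000_000, 6)    # ~4.76M units

# Ten times the price means one tenth the units needed to break even.
assert round(mobile_units / console_units) == 10
```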
  • badmiker - Sunday, January 29, 2012 - link

    Very fancy speeds, though this is somewhat irrelevant to the real world at the moment. The real potential of this chip is its power efficiency, thanks to the 28nm process. A quick GPU with good battery life would be truly amazing.
  • skydrome1 - Sunday, January 29, 2012 - link

    I would like to see the Adreno 320 (Q4 2012 or Q1 2013) in the next Xperia Play. Either that, or the G6400.

    I think the G6400 would be faster though. Backwards compatibility with the PS Vita would be an added bonus!

    With regard to the 225 found in the MSM8960, it's still based on the same design as the 220, with added bandwidth, a few tweaks, and better drivers. Given that AnandTech did a comparison previously (I forget which article, and I'm a bit too lazy to dig it up now; I think it's the iPad 2 review or the 4S review) and found that the 220 should have the same compute performance as the SGX543MP2, this could tie the SGX543MP2.

    It follows that this would beat Tegra 3's ULP GeForce too, but lose out to the upcoming A6. This also definitely loses to the Vita's 543MP4+.

    Good times! Coincidentally, my contract's up in Q4 this year. So maybe, just maybe, my next phone would negate a need for a PS Vita! Please, Sony, make it happen! :)
  • tipoo - Sunday, January 29, 2012 - link

    Tegra 3 in Egypt Pro offscreen gets 78 FPS; the Mali-400 gets 67. However, is that difference that big, considering the latter is paired with an older dual-core CPU? How do we know the higher score in the former isn't down to core count, and that a quad-core A9 CPU wouldn't do better with a Mali-400?

    http://images.anandtech.com/graphs/graph5163/42749...

    http://images.anandtech.com/graphs/graph4951/41613...

    I'll be waiting for benchmarks of the Adreno 225 in an offscreen test like this, where resolution won't skew results. On paper the Adreno looks great, but historically it hasn't been able to stack up in real-world results due to lower efficiency.
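To see why resolution skews onscreen numbers the way tipoo describes, the pixel arithmetic is enough (example screen resolutions only, not any specific device's benchmark settings):

```python
# Onscreen benchmark scores can't be compared across devices because
# the pixel workload scales with screen resolution. Offscreen runs
# render at one fixed resolution so GPUs face the same workload.

def pixels_per_second(width, height, fps):
    """Total pixels the GPU must shade per second at a given frame rate."""
    return width * height * fps

wvga = pixels_per_second(800, 480, 60)    # e.g. a WVGA phone screen
hd720 = pixels_per_second(1280, 720, 60)  # e.g. a 720p phone screen

# The 720p device pushes 2.4x the pixels for the same frame rate, so
# a lower onscreen FPS there says little about the GPU itself.
assert hd720 / wvga == 2.4
```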
  • Lucian Armasu - Sunday, January 29, 2012 - link

    Egypt only measures the GPU, does it not?
  • zorxd - Monday, January 30, 2012 - link

    The CPU seems to make a little difference too.
  • metafor - Tuesday, January 31, 2012 - link

    It depends. If the GPU is fast enough, there are a few scenes in Egypt these days where the limitation is the CPU's ability both to set up the scene and to run the driver that compiles the GPU code. You have to keep in mind that the driver for the GPU runs on the CPU.

    This is mostly because Egypt is, by modern standards, fairly dated and that future games will likely use heavier programmable resources available with OpenGL ES 2.5 or 3.0. At that point, the GPU will likely be the sole bottleneck again.
  • dgingeri - Sunday, January 29, 2012 - link

    While the specs of Tegra 3's GPU sound pretty good, it's restricted to a single 32-bit memory channel, so that performance is going to suck. This GPU looks like where it's at.
