
  • krazyfrog - Friday, November 02, 2012 - link

    Not quite the 2x performance over the A5X that Apple promises, but impressive nonetheless. Curious to know what the heating situation is like with the A6X.
  • ltcommanderdata - Friday, November 02, 2012 - link

    Well compared to the A5X they've moved from a 45nm process to a 32nm process and the clock speed increase is very modest over the A6 in the iPhone 5 despite the much larger tablet form factor, so presumably things are under control.
  • Alexvrb - Saturday, November 03, 2012 - link

    Well in terms of raw compute performance it really IS a 2x increase. So for compute-bound situations... well, just look at the Egypt HD results above. But as always, real-world performance in a lot of situations isn't going to net the same results. Doubling the compute performance won't normally equate to twice the frames, especially since memory bandwidth and fill rate didn't double as well.

    Still, I'm glad to see the 554 being used in an actual chip design. I think even a 2 or 3 core variant would be great in a phone. It performs very well and makes me look forward to Series 6 even more. PowerVR has consistently put out good mobile designs, that's for certain. I would like to see more cutting-edge PowerVR designs outside of Apple devices, of course.
  • Achtung_BG - Friday, November 02, 2012 - link

    Do you know how big this chip's die size is in mm2?
  • ltcommanderdata - Friday, November 02, 2012 - link

    The A6X is 123 mm2, 30% larger than the A6.
  • ltcommanderdata - Friday, November 02, 2012 - link

    When the iPad 3 increased the pixel count by 4x with only a 2x increase in GPU power, the concern was whether it'd be fill rate limited. Now with the iPad 4, the ALU performance has theoretically increased more than 2x due to the doubled ALU count and higher clock speed, putting it above the iPad 2 on a per-pixel basis. However, the fill rate has only increased modestly with clock speed since the ROP count remains the same, putting it still below the iPad 2 on a per-pixel basis. The Egypt HD results suggest that the relative fill rate deficit doesn't limit performance at native Retina resolution compared to the iPad 2 at native resolution. Do we expect this to be the case in real-world high-end games as well?
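    For what it's worth, the per-pixel argument can be sketched numerically. The 10% clock bump below is my own placeholder (the actual A6X GPU clock isn't confirmed); the core/ALU/ROP ratios follow the comparison above.

```python
# Per-pixel GPU resources, iPad 4 (SGX554MP4) relative to iPad 2 (SGX543MP2).
# Assumes: 2x cores, 2x ALUs per core vs the SGX543, same ROPs per core,
# and a hypothetical ~10% clock increase.
pixel_ratio = (2048 * 1536) / (1024 * 768)  # 4.0x the pixels

clock_bump = 1.10                  # assumed, not a confirmed spec
alu_ratio = 2 * 2 * clock_bump     # raw ALU throughput: ~4.4x
rop_ratio = 2 * clock_bump         # raw fill rate: ~2.2x

per_pixel_alu = alu_ratio / pixel_ratio    # ~1.1 -> above iPad 2 per pixel
per_pixel_fill = rop_ratio / pixel_ratio   # ~0.55 -> below iPad 2 per pixel
print(per_pixel_alu, per_pixel_fill)
```

    Which matches the conclusion above: per-pixel ALU throughput edges past the iPad 2 while per-pixel fill rate stays well below it.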
  • Kevin G - Friday, November 02, 2012 - link

    The issue I think we're running into isn't fill rate directly but rather memory bandwidth. It is evident that the iPad 4 keeps the 128-bit wide memory interface, but the clock speed of that bus is still unknown to my knowledge. The A6 brought some much needed efficiency improvements in the memory controller that should translate well to the A6X. I'm really curious what bandwidth figures the A6X can produce.

    I'm also kinda surprised that Apple is only going with a 1.4 GHz core clock in the iPad 4. I would have suspected something a bit higher, at least for a turbo mode. Then again, they doubled GPU performance and it looks like most of the power headroom with the tablet form factor went there.
  • ltcommanderdata - Friday, November 02, 2012 - link

    Apple went from LPDDR2-800 to LPDDR2-1066 for the A5 vs the A6 so I expect the same bump occurred between the A5X and A6X.
  • DERSS - Friday, November 02, 2012 - link

    Since the Egypt test, even in its HD version, is old and doesn't really use advanced shaders and other fancy technologies, it is not an indication of the real high-end games you speak of.

    There is no real game-like test, unfortunately. Maybe someone could do such a test based on a scene from the upcoming Infinity Blade 3 or something, but the Egypt test does not tell us much.
  • Krysto - Friday, November 02, 2012 - link

    I wonder when Kishonti will release their GLBenchmark 3.0 with tests for OpenGL ES 3.0. They were supposed to release it by the end of the year. Adreno 320 already has it, and Mali T604 should get it soon. Will they just wait until Apple releases their PowerVR Series 6 iPad in spring, or whenever, to do it?
  • ltcommanderdata - Friday, November 02, 2012 - link

    As far as I know, the OpenGL ES 3.0 conformance tests aren't even finalized yet, so there are no official OpenGL ES 3.0 drivers. Adreno 320 and Mali-T604 may be the first OpenGL ES 3.0 capable GPUs on the market, but it'll likely go unused until next year.
  • BoyBawang - Friday, November 02, 2012 - link

    The speed of the previous iPad processor was already far beyond any Android out there and now they made it even faster! That's just insane!
  • Krysto - Friday, November 02, 2012 - link

    Mali T604 seems to be slightly faster than the A5X, and now the A6X overtook it by 60%. If Apple hadn't needed to upgrade the iPad's GPU because of poor performance with the Retina display, we would've waited until next spring to see an increase in speed over Mali T604.

    I wonder when Mali T624 or Mali T658 are supposed to appear. I'm not entirely sure they will come a year from now. At least one of them (probably T624) should appear in the Galaxy S4 along with a big.LITTLE setup.
  • froggyperson - Friday, November 02, 2012 - link

    It's really sad that the T604 doesn't blow everything out of the water. I was eagerly anticipating its release since last year, and now it comes out with a noticeable improvement over the 400 but nothing close to the iPad's GPU.
  • powerarmour - Friday, November 02, 2012 - link

    I think Nvidia should be able to beat this with Tegra 4, especially if it's Kepler based, and should have the software/driver support to go with it.
  • Wurzelsepp - Friday, November 02, 2012 - link

    I don't think so; Tegra 2 and Tegra 3 have been disappointing in terms of graphics performance. According to NVIDIA, Tegra 4 is going to be about 3 times faster than its predecessor, and even if these claims are true it would still be slower than the A6X.
  • Krysto - Friday, November 02, 2012 - link

    When did they say it will be 3x faster? Do you mean those old 2011 charts? Those are outdated now. The Tegra 4 that will arrive next year is more like a Tegra 5 or Tegra 4.5, with a new GPU architecture (somehow based on Kepler, according to rumors).

    One of the rumors said it will have 64 GPU cores, compared to the 12 in Tegra 3 right now. Assuming the cores are no more powerful than Tegra 3 (they probably are) then it should be 5.3x faster than Tegra 3, which is the least Nvidia needs to even be competitive in 2013.

    However, another rumor also says there will be a 32 core version as well, and if the cores are no more powerful than the ones in Tegra 3, then I hope that's just a chip version for $200 tablets or something, because otherwise it would be pretty disappointing if it doesn't launch straight with the 64 core one this spring.
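    The scaling arithmetic being assumed here can be sketched quickly (treating per-core throughput as unchanged from Tegra 3, which is the caveat already given above):

```python
# Rumored Tegra 4 GPU core counts relative to Tegra 3's 12 cores.
# Assumes per-core throughput is unchanged -- the comment's own assumption.
tegra3_cores = 12

for label, cores in [("64-core rumor", 64), ("32-core rumor", 32)]:
    speedup = cores / tegra3_cores
    print(f"{label}: {speedup:.1f}x Tegra 3")  # prints 5.3x and 2.7x
```

    So the 64-core rumor is the ~5.3x figure quoted, while a 32-core part would only manage about 2.7x.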
  • augiem - Friday, November 02, 2012 - link

    Not that I think all the information on Wikipedia is correct, but it says:

    Wayne - About 10 times faster than Tegra 2. Q1 2013.
    Logan - About 50 times faster than Tegra 2. To be released in 2013 (no specific Q).
    Stark - About 75 times faster than Tegra 2. To be released in 2014.

    Now I'm not sure of the speed differences between Tegra 2 and Tegra 3, but right now, at the distance Tegra 3 trails the A6, I think there's very little chance Wayne will beat it if the 10x Tegra 2 spec is accurate. Logan, probably, but with 2 launches in a year, I'd expect that Q4 or thereabouts. However, PowerVR is gearing up to launch their 6xx series Rogue GPU in Q1 2013, I believe. That gives Apple time to put a Rogue in their next iPad, which I would guess will be launched in Q4 2013. I personally don't see anything coming to really challenge Imagination earlier than 2014.
  • melgross - Friday, November 02, 2012 - link

    The one thing we do know is that Tegras are never as good as Nvidia says they are. This has become tradition.
  • Alucard291 - Friday, November 02, 2012 - link

    But they get soooo much hype it's almost silly :)
  • CBone - Saturday, November 03, 2012 - link

    The problem with Nvidia is that they obsolete their latest GPU by talking about the next one before anything even uses the current one.

    Does it get beaten in benches? Yes. Badly. But it's faster than crap in games, and no one will do as much heavy lifting as Nvidia to help optimize a game for their chips. I'll take that trade. Funny that the PowerVR designs are stomping through benches in Apple stuff, but the same games on both platforms look better with Tegra.
  • CeriseCogburn - Monday, November 05, 2012 - link

    I wondered what all the insane babbling idiots were talking about on 1st page of comments... LOL

    It sure is a lot of hot air when the real world GAMES and the like leave Jobs' PR monkeys sucking dirt and gagging on their words.

    I find it amazing how large the emotional hyper ditzy girl sensational self happy deception baggage is.

    Then half of them tell others they don't know what they're talking about....
    (roll eyes again, rinse and repeat, often)
  • steven75 - Monday, November 05, 2012 - link

    U jelly?
  • CeriseCogburn - Tuesday, November 06, 2012 - link

    Disappointed. We have another whole segment of babbling retards.
    Clueless idiots, the best we have in "tech" at the "most comprehensive and in depth" review site.

    Ashamed is the answer, ashamed that so many are so stupid and so estrogen doused that the feed of lies and fantasies is 10X longer than the simple and short facts.
  • mavere - Tuesday, November 06, 2012 - link

    "estrogen doused"

    Well then.
  • andsoitgoes - Saturday, November 10, 2012 - link

    Yeah. That was. Was. Probably the most interesting thing I've seen trolled on a site.

    I applaud the crazy. The crazy in that one is SSSSSTRONG.
  • djgandy - Monday, November 05, 2012 - link

    Yawn, Nvidia fanboys, even in the mobile market where they are a complete non-player. Tegra's thermal characteristics are terrible. Nvidia always talks big, but they can't walk the walk.

    I find it hilarious that you think roadmaps can just be "outdated" and products magically moved forward. You clearly lack understanding of how complex hardware design is and how long that cycle lasts. People are already sitting in rooms discussing requirements for what will be in your smartphone in 2017.
  • CeriseCogburn - Tuesday, November 06, 2012 - link

    Not sure what you're babbling about, but I did go look up the tegra 3, because I recalled reading here how the Appl competition at the time FAILED against it.

    Looks to me like you have a big chip on your shoulder... all sour about how you perceive nVidia to be talking...

    Another sad, biased, angry fanboy, with emotional issues.
  • djgandy - Thursday, November 08, 2012 - link

    Oh a video, lovely. What is it exactly we are looking at, a nice Nvidia Marketing demo of how their chip is better just like they have been doing for 10 years now?

    You didn't refute my point either. Look at T3 power consumption; the thing is only any use if it's connected to the mains. Hardly meets the requirements for a mobile chip.

    Quad core CPU though, how useful, I've always wanted to do Folding@WhereverIAm over 3G.
  • B3an - Friday, November 02, 2012 - link

    I don't understand why Android and Win RT devices don't use these PowerVR GPUs. They're nothing to do with Apple in any way and are clearly much better than the competition.
  • augiem - Friday, November 02, 2012 - link

    I'm incredulous. Why is it so hard for them to understand this? Sony did it with the PS Vita. Intel and Apple do own small stakes in the company, but the chips are not exclusive. Ugh.
  • andsoitgoes - Saturday, November 10, 2012 - link

    I'm impressed with Sony, they know how to make stuff look DAMN good.

    I'm very intrigued to see what the power this means for games on the iPad going forward. I've seen the vita, and if the iPad can power even a fraction of that, what could that mean for the visuals?

    The biggest, most unbelievably frustrating issue is pricing and controls. The iPad sucks with regards to the controls. Sucks. And the fact that game manufacturers can't put out higher than normal priced games without flopping financially. It's beyond frustrating to think what this device CAN do, but how in turn it has in essence limited itself.

    I'm no game elitist, but a serious game does Infinity Blade not make.
  • Peanutsrevenge - Friday, November 02, 2012 - link

    It's had me wondering as well.
    Reasons I can think of:

    PowerVR refuse to open source their drivers.

    The performance isn't the same on Android due to the Dalvik handicap.

    PowerVR charge a high price.

    I'm not overly clued up on such things, so purely speculating, but the iEtc devices have consistently had a massive GPU advantage, though TBH, I don't think it really matters at the moment as I've not found GPU performance to be a problem on Android, but I don't game on mine.

    Perhaps a topic for this weekend's podcast (which I'm sure (and hope) will be at least 6 hours long) ;) ;) ;)
  • mavere - Friday, November 02, 2012 - link

    I assume that Apple began to focus on GPUs simply as a way to maintain GUI smoothness. Along the way, gaming capability became such a strong marketing tool that the company emphasized graphics performance even more.
  • augiem - Friday, November 02, 2012 - link

    The thing is, everyone assumes GPUs are only used for gaming, but they're not. As displays go higher and higher res, a powerful GPU is important for keeping the experience smooth. That's why I find it really silly that Google went with a higher-res display than the iPad and a weaker GPU.
  • Krysto - Friday, November 02, 2012 - link

    But Mali T604 also seems to be a lot more efficient than the A5X and probably A6X GPUs too.

    Nexus 10 uses a 22% smaller battery, and powers 33% more pixels, and yet it only has 6% lower battery life than the iPad. If you normalize those numbers, then the iPad's GPU is 50% more inefficient (or Mali T604 is 30% more efficient than iPad's GPU).
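    Taking those quoted figures at face value (and setting aside whether the battery test actually loaded the GPU), the normalization can be sketched as:

```python
# Normalization of the quoted figures, all relative to the iPad 4:
# Nexus 10 battery capacity 78%, pixels driven 133%, runtime 94%.
nexus_battery = 0.78
nexus_pixels = 1.33
nexus_runtime = 0.94

# Relative energy drawn per pixel-hour (iPad = 1.0).
nexus_per_pixel = nexus_battery / (nexus_runtime * nexus_pixels)  # ~0.62

# How much more the iPad draws per pixel-hour than the Nexus 10.
ipad_excess = 1 / nexus_per_pixel - 1  # ~0.60, i.e. roughly 60% more
print(nexus_per_pixel, ipad_excess)
```

    Which lands in the same ballpark as the percentages above; the exact figure depends on rounding, and on whether the whole-device battery test says anything about the GPU at all.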
  • Krysto - Friday, November 02, 2012 - link

    My point is that Google (and ARM) went more for higher performance per watt in GPUs, the same way Apple did with their Swift CPU core (but not in GPUs).
  • mavere - Friday, November 02, 2012 - link

    As our discussion and your conclusion revolve around the GPUs, you cannot use those battery-life figures as evidence because the WiFi tests do not assess GPU efficiency.
  • ltcommanderdata - Friday, November 02, 2012 - link

    Is that based on the web browsing result from Anand's Nexus 10 preview? Because that tells you very little about GPU efficiency, since the GPU is hardly used. You'd need to run GLBenchmark loops to actually test GPU battery life under load.
  • CeriseCogburn - Tuesday, November 06, 2012 - link

    In this former review, when APPL claimed 4x the performance, it was noted the glbench gave APPL a large advantage, but the CPU tests gave nVidia just as large a win.

    So they ran a game (also reviewed here, with nVidia declared the winner).

    So this glbench is a way to skew real life performance toward APPL in that case, and I'd bet this one as well.

    Not like it's realistic, as the same thing before felled APPL down to "equal in real life", so skepticism instead of blind allegiance is required here.

    Enjoy the link that exposes the former lies and spin, likely directly related to this opening look.
  • andsoitgoes - Saturday, November 10, 2012 - link

    "It's worth noting that neither Shadowgun nor Riptide has yet been optimized for the A5X chip, so it's possible that developers will be able to squeeze more power from the A5X in a later update."

    It's like running an Xbox game on an Xbox and an Xbox 360. This wasn't remotely a comparison that would show anything at all.

    The games were tested almost at the launch of the Retina iPad, with a processor and GPU that were just enough to push the screen without providing the same kind of gains that we get now. Now if we were to have that game optimized for the chipset it is running on, then we would see some real numbers.

    But as of now, like I said, you've got a game designed to support all manner of much older devices, hell the game probably runs great on the iPhone 4 or the iPad 2. That's the thing with the game developers, they are amazing at refining the game so it will run like butter across the platforms.

    It's interesting, Shadowgun was updated to support the bees knees of the chip in April. Take both of those systems and run the game then. Riptide was updated 11 days after that.

    Until that's done, it's a fartshoot.
  • augiem - Friday, November 02, 2012 - link

    That's great from a manufacturing standpoint. They can use a smaller battery. Ultimately, that benefits the consumer by allowing the tablets to be lighter and somewhat cheaper. But it doesn't help those chasing the high end of performance. And unfortunately, overall the Nexus 10's battery life isn't that competitive with the iPad 4's even with more efficiency. (8.? hours vs 11.? I think it said)
  • CeriseCogburn - Tuesday, November 06, 2012 - link

    That's pretty close actually, so what's the RECHARGE TIME differences ?

    Does the APPL take forever to recharge the giant battery in comparison ?

    LOL - very important - as downtime is critical.

    Might be wise to know that answer before declaring 11 big battery is better than 8 small battery.
  • andsoitgoes - Saturday, November 10, 2012 - link

    But if your downtime is slightly longer while your run time is dramatically longer, it's 6 of one, a fart in a bucket in the other.

    The thing with the new iPad is the 12W power adapter. It takes 6 hours from dead to give 11-12 hours of run time.

    Actually some other people have reported just a smidge over 5 hours.

    Is the nexus faster? What charger did it include with it? I can't find charge numbers for the 10 so...
  • doobydoo - Saturday, November 03, 2012 - link

    'Nexus 10 uses a 22% smaller battery, and powers 33% more pixels, and yet it only has 6% lower battery life than the iPad. If you normalize those numbers, then the iPad's GPU is 50% more inefficient (or Mali T604 is 30% more efficient than iPad's GPU).'

    Um, no, because when the iPad is powering those 33% more pixels it's doing so at least 50% faster.
  • andsoitgoes - Saturday, November 10, 2012 - link

    *slow clap of awesomeness*

    Thank you for being smarter than me and saying those awesome words.
  • djgandy - Monday, November 05, 2012 - link

    Nexus 10 has a much dimmer display too. You missed that bit.
  • BlendMe - Monday, November 05, 2012 - link

    Not only that. The Nexus has an AMOLED screen as opposed to the iPad's IPS panel.
  • Aenean144 - Monday, November 05, 2012 - link

    All of the current Nexii have LCDs. The Nexus 4 and 7 have IPS displays while the Nexus 10 has a PLS display. They are all LCD technology, not AMOLED tech.

    As with everything else, the display is the number one user of power in a tablet or phone. For 10-inch sizes, it's probably 70% to 80% of the power consumption.
  • robinthakur - Wednesday, November 07, 2012 - link

    That's very interesting, but due to the fact that there is nothing in the Play store worth playing on a Nexus 10 (or any Android tablet), I'll be buying an iPad 4, thanks, just as there is nothing worth playing on my Galaxy 3. I'm sure the relative efficiency of the device based on your incomplete knowledge will not bother me unduly.
  • CeriseCogburn - Saturday, November 10, 2012 - link

    What's unique to the iPad 4 that's worth playing?
  • andsoitgoes - Saturday, November 10, 2012 - link

    I mention a few below, especially if you're a board game fan. I have easily a few hundred dollars in board games, very few of which are available on Android, and fewer that are designed for the tablet (Catan, a HUGELY popular game, isn't).

    Ghost trick
    The Walking Dead
    Junk Jack (a 2D minecraft better, in my opinion, than Minecraft Pocket)
    Sid Meier's Pirates
    Infinity Blade
    Groove Coaster
    Tilt to Live
    Multiple Gamebook Adventures
    Dead Space (barely functional on Android)
    Ticket to Ride (worth mentioning outside of "board games")
    ****Magic 2013 the name says it all
    ****King of Dragon Pass (one of the most amazing games EVER)

    And while other games often aren't permanently exclusive to iOS, many are hugely delayed before they port to Android, if at all.

    There's more, but I feel like paying more attention to my movie :)

    Simply put, depending on your tastes, iOS is the bees knees.
  • andsoitgoes - Saturday, November 10, 2012 - link

    But yet everyone ignores this. 99.9% of my games are either available in an iPad optimized app, or... No, that's that.

    Everything just works and with the retina iPad, 2x retina iPhones games look just stellar.

    There's no weird stretched displays. No oddly placed controls or shapes or distorted anything.

    Because Apple has bucked the "normal" tech standards of matching the various screen sizes, it's given developers confidence to spend the time creating an iPad optimized app/game. They know Apple is being consistent, and while some people may prefer widescreen tablets, they don't realize that by making that move, it's giving developers more and more reasons not to develop on the platform when they've got to try to work their games to fit this slew of screen sizes. With Apple, the iPad 1 screen is just a scaled down version of the Retina screen. Yes, it still takes work to optimize it to work on the higher quality display, but it's scaling versus complete redesign of the whole package.

    The sheer lack of tablet optimized products on Android, plus the really crappy store, is what will prevent the platform from being a serious threat to Apple. With how far Android is coming, and how awesome this hardware is, I'd LOVE to see them put Apple's butt under the fire! It's only good for the competition.

    But for me, I honestly don't think I could change regardless of what Android does. Not that I don't like the hardware or the UI, they're fine, albeit not really for me now (a few years ago when I was into building my own custom Roms and such, sure) but it comes down to how ensconced I am in the ecosystem.

    I have 4 iPads and 3 iPhones. If I replace any of those devices with an Android option, I will lose an easy 3/4 of the apps and games that my family uses on a regular basis. And those I don't lose, I will have to purchase again. For that reason alone, Android has so far to go in order to compel me to the point that I want to jump ship.

    Any cost savings for a current Apple owner is lost when you think about all you will pay to outfit this new ecosystem to match what you currently have.

    It makes you realize just from that perspective how daunting a task it is trying to convert existing Apple users. How can you look at someone who already owns and has used a product for a lengthy time and say

    "Hey, hey you. Yeah you, look here. I've got this awesome device! The screen is better, yeah. Look, look at how pretty it is! It's cheaper and, look, you can expand storage for almost nothing! Yeah baby! You want this hotness!"

    But then when your push has to turn into some twisted commercial for GERD medication, and now you have to list the side effects, and one happens to be seizures that cause your brain to melt and convert into a pile of jello.

    "So while this device is pretty, all those apps you've bought... You own what, 300? Well, here's the thing. We don't have about 150 (Ed. I'm being generous here) of those that you use, and of the 150 you do use, well only 90 of them are designed to run properly on the screen. Those that aren't will sort of look like you squished up a phone app and stuffed it onto the device. Oh, and uh... The last thing is that for those 80 apps you own that you had to pay for, well... You'll have to buy those again here. Sorry"

    For a user that watches movies, listens to music, browses the web and only plays super popular cross platform games/uses only super popular cross platform apps and mostly uses free or incredibly cheap games/apps, for them it's worth the jump.

    For those like me?

    * Almost every single learning/educational app my kids use is iOS only
    * games like Sword and Sworcery (reasons enough for owning a mobile device) take forever to port to Android
    * games like welder, junk jack, punch quest (hell, anything from rocket cat), the multitude of board games that just squeaked through the "we made enough money to justify making the game!" Barrier that will never come to any other platform...
    * apps unique to iOS, too many to name. Unique and specialized apps like Awareness, my various photo editing and processing apps, my various (again) family oriented chore apps, cookbooks and such.
    * high priced apps, like my ssh terminal, file browser. RDP/VNC viewer, my central password system, pocket informant. These apps all combined would set me back $50 - $60, if not more, and I rely on them.

    Simply put, Android would have to be able to take care of "all my needs and desires" (if you know what I mean, WINK) for me to even consider it. Basically it would need to come with Kate Upton for me to be able to justify it. A very needy, willing Kate Upton ;)

    But seriously, I do think those are issues that can't be easily addressed by Android. And while I'm an exception, most people will have at least quite a few things that either aren't available on android or will cost them a significant amount when having to switch to repurchase various productivity or other types of software/games.

    God. This was TL;DR to a LOTR level. I need an editor.
  • dagamer34 - Friday, November 02, 2012 - link

    Intel's Clover Trail chips use a PowerVR SGX543MP2 @ 533Mhz. Also, you are asking the wrong question. It's not "why can't they use these chips?" but "who is going to pay for these chips?" because you have to remember that the VAST majority of the market is made up of phones that don't even have GPUs as good as the iPhone 4. Larger dies = more cost, and when you're trying to build a $100 phone, you can't stick a fancy GPU in there built on the latest fab process.

    Apple's economies of scale work greatly in its favor because they sell tens of millions of devices with the same chip for several years, meaning it will pay for itself more and more over time. nVidia can't honestly sell a Tegra 2 device today without being laughed at, and yet the iPad Mini came out with the very similar A5.
  • ssj4Gogeta - Friday, November 02, 2012 - link

    What $100 phones? Maybe you get them for $100 after carrier subsidy in the US (but it isn't really a subsidy, as you more than pay it back through much higher per-month charges during your contract).
    The high-end phones are easily $600-800 unlocked and the mid-range ones are $400.
  • 1droidfan - Friday, November 02, 2012 - link

    It's well known Apple signs exclusivity arrangements with PowerVR for their GPUs. So Android and WinRT could get a lower spec PowerVR GPU, or wait for a better one to reach production, but what good would that do? Apple has the biggest muscle by far in mobile; nobody can compete with them on hardware. Too bad their OS sucks.
  • darkcrayon - Friday, November 02, 2012 - link

    Good thing their OS gets out of the way and lets you run some of the best mobile *applications* around... The same ones using that good ol' GPU.
  • CeriseCogburn - Monday, November 05, 2012 - link

    Not in APPL hyper fanville it isn't, and once it is, the APPL freak forgets that instantly, while they worship their dead Master.

    Then they squeal why can no one else do this ?!
    That is not an exaggeration.

    It's the sad state of the "finest techy minds".
  • winterspan - Wednesday, November 07, 2012 - link

    Source?? TI uses the PowerVR SGX 5 series, as does Intel, Sony, Marvell, and others...
  • Urizane - Wednesday, November 07, 2012 - link

    It's worth noting that PowerVR does not manufacture GPUs; they license IP to chip makers. Scarcity of virtual goods only serves to hurt PowerVR.
  • KoolAidMan1 - Friday, November 02, 2012 - link

    You get what you pay for.

    Most tablets cut corners with the SoCs they use. A good GPU costs money. Apple manages both high performance and better profit margins at similar prices to the competition by selling many times more of each tablet and phone model than other companies.

    If Android and WinRT devices were to use comparable GPUs you'd either see something even more expensive than an iPad, or they'd make less profit per device sold.
  • augiem - Friday, November 02, 2012 - link

    Finally some Nexus 10 benchmarks. Mali T604 performance is utterly abysmal. I'm shocked, yet I'm not. Par for the course. And it's got to push more pixels than the iPad with a squirrel under the hood.

    All Android and Windows phone and tablet makers are absolutely negligent, not only for continuing to let Apple's massive graphics lead go on for a 6th year, but for allowing the gap to become even wider than it's ever been.

    Nobody but Imagination knows squat about making a GPU. It's really frustrating as hardware enthusiast to see this year after year.
  • WaltFrench - Friday, November 02, 2012 - link

    Let's not overstate the case: nVidia has been a leader in graphics chips since its founding in 1993.

    But the economics are quite different here: aside from Samsung, virtually no Android OEM is able to charge a high enough price to justify putting in the advanced GPU chips, and the extra battery, memory interface, etc., that high-performance graphics need.

    Is this likely to change? Certainly, hard-core gamers and tech enthusiasts care. But somebody who just wants a tablet for casual browsing and email will be happy to have a lower-cost device instead. And Google's incentive is NOT to get lots of very happy gamers, but rather a huge number of eyeballs for its ads. If the equation gets so lopsided that sales suffer, then they'll aim for more fps.

    Or just wait for Moore's Law to catch up with high-performance game standards. As these tests show, iPads are now within spitting distance of “more than good enough,” the point at which you do NOT throw more money at a function.
  • augiem - Friday, November 02, 2012 - link

    Yes, that's true. For general purpose needs, you don't HAVE to have the fastest GPU on the market. But GPUs aren't only used for games. When you're talking about ~4 megapixel screens, the GPU is going to come into play throughout the entire usage experience. I believe the original purpose of Apple's focus on GPU power was not gaming as much as it was their push toward higher desktop resolutions. Ultimately, users will care if the usage experience is marred by stuttering. It's not a great formula for Google to put a higher-res screen than the iPad's in and then skimp on the engine powering that display. And the average user does play games. There's a reason Nintendo's portable division, which was their bread and butter for the last 20 years, is seriously suffering, showing something like a 2/3 loss in revenue this year vs last.

    Waiting for Moore's law to catch up doesn't do anything for the present. Apple pretty much has a monopoly on mobile graphics performance.
  • CeriseCogburn - Tuesday, November 06, 2012 - link

    Isn't it also true that in these devices the CPU has a broad effect on screen stuttering? And in the case of the CPUs present, Apple has been losing.

    So you have the Apple A5X vs. the NVIDIA Tegra 3, and despite a huge GLBenchmark win for Apple, actual usage and games show the Tegra 3 equivalent or winning, thanks to the strong four-core NVIDIA CPU contained within.

    So it's not just the GPU in these devices that dictates the screen experience.
  • augiem - Friday, November 02, 2012 - link

    >> iPads are now within spitting distance of “more than good enough,”

    I don't think that will ever be the case. People have been buying hardware on performance for 30+ years now and it never stops. People always want more. Look at CG in movies. By around 1999 the CG was good enough to look really good if they spent enough money on it. Fast forward 13 years and everyone's still obsessed with it. Every blockbuster that comes out has more advanced CG than the last. And if it's not more advanced, it's just plain got more of it. It's only human nature to chase complexity. You can see it just by looking at the evolution of society. I don't think there really is such a thing as "good enough." We'll always get used to it and want something better.
  • ssiu - Friday, November 02, 2012 - link

    At first I was excited to hear about this, since "new tech == better", right?

    But then I realized: in the beginning, when everyone thought "no GPU change", it implied the A6X GPU somehow managed to run at twice the frequency of the A5X (not sure how that's accomplished without killing battery life) to achieve "up to twice the GPU performance", but that would imply close to 2x performance on **almost every GPU operation**.

    Now with this new GPU core, it truly means **up to** twice the GPU performance, i.e. anywhere from close to 2x on some operations ("ALU-heavy workloads") down to no improvement on others (or x% if there is an x% GPU frequency increase, for some small x).
  • ssiu - Friday, November 02, 2012 - link

    P.S. I don't mean to bad-mouth the A6X GPU; yes, it is still king-of-the-hill (even the A5X is great as GPUs go). Just feeling a bit anti-climactic, as my subject says. Reply
  • MykeM - Saturday, November 03, 2012 - link

    The A5X is decent, but unlike the A6 or the A6X, it still uses the older 45nm process. Apple even moved the iPad 2's A5 to the smaller and more efficient die (I believe the A5 used in the still-in-production iPhone 4S remains 45nm). The move to 32nm should explain why the iPad 4 retains pretty much the same if not better battery life despite clocking higher. I've never actually owned a past iPad, but with the 4 I'm rather surprised at how cool (as in temperature, not factor) the whole unit feels. I heard the iPad 3 runs rather warm. Reply
  • gus6464 - Friday, November 02, 2012 - link

    This isn't the new PowerVR "Rogue" chip that is still slated for next year, right? The power of their GPUs is pretty amazing. Intel has licensed their new chip for the next Atom, right? Reply
  • Aenean144 - Friday, November 02, 2012 - link


    Rogue chips will be labeled PowerVR G6200 or G6400 or something close to that, like G6230. The PowerVR 554, in an MP4 config in the A6X, is the same Series5 architecture as the 543, 540, etc. It just has more ALUs, as AnandTech says.

    Next year's Atom SoC was slated to have a single-core PowerVR 554 I think, possibly dual core, an MP2 config.
  • Kevin G - Friday, November 02, 2012 - link

    Nope. The PowerVR Rogue line carry the G6200 and G6400 model numbers. Reply
  • eddman - Friday, November 02, 2012 - link

    Atom Valleyview is supposed to have Ivy Bridge graphics.
  • augiem - Friday, November 02, 2012 - link

    This will be one to watch. We'll FINALLY get to see some comparison between desktop and mobile SoC GPUs. That should be fun. Reply
  • augiem - Friday, November 02, 2012 - link

    Of course this will be late 2013 - 2014 according to the article, so more waiting. And Anand's not sure which Intel part they're using...

    "The big news are the four integrated Gen7 graphics engines, which I can only assume refer to Intel's EUs (Intel's HD 2500 has 6 EUs, while HD 4000 has 16 EUs). "

    I see the slide says "4 Intel Gen 7 Graphics Engines", so am I to guess this is a cut down version of HD 4000 with 1/4 the power? Assuming EU (execution unit?) is equivalent to Graphics Engine. If that's the case, I'm not that excited anymore.
  • Penti - Saturday, November 03, 2012 - link

    At least driver-wise they will not have to rely on Img Tec. Plus, their performance today with the SGX540 is worthless. It will be pretty sweet for the embedded field anyway, though it will not compete graphics-wise with faster tablet/phone-oriented SoCs. Not even a 16-EU GPU would be awesome for games. Reply
  • leomax999 - Saturday, November 03, 2012 - link

    Mobile SoC variants will have Rogue. Gen 4 elsewhere. Reply
  • eddman - Saturday, November 03, 2012 - link

    Any sources? I'd like to read it. Reply
  • Achtung_BG - Friday, November 02, 2012 - link

    A 3.1 MP display, and the iPad 4 has only 1GB of RAM by Elpida; at what frequency? What part of the RAM will be used for video? Reply
  • tipoo - Friday, November 02, 2012 - link

    Usually Apple's performance numbers were right on the money before this; I wonder how they got to 2x? Reply
  • Alucard291 - Friday, November 02, 2012 - link

    New management - new directive - new hype. Reply
  • darkcrayon - Friday, November 02, 2012 - link

    Or maybe it actually is "up to" 2x as fast as they say on the website... Reply
  • tipoo - Saturday, November 03, 2012 - link

    They said the same thing with the A5X IIRC, the difference being that one actually scaled somewhat close to 2x and sometimes exceeded it, the A6X averages around 1.53x and never goes over double. Reply
  • DavidKl - Saturday, November 03, 2012 - link

    In Egypt HD the A5X scores 21 FPS and the A6X almost 42 FPS, and offscreen the A5X scores 25 FPS and the A6X almost 52... Reply
  • Tangey - Monday, November 05, 2012 - link

    Wrong... the overall GLBenchmark 2.5 figure is 1.95x that of the iPad 3: 5850 vs. 3018. Reply
  • AP27 - Friday, November 02, 2012 - link

    Imagination Tech has always put out GPUs with badass performance. I'm surprised they aren't more widespread. Apple has used them in every iteration of the iPhone and iPad (I think), and Samsung used them in the Galaxy S1 (which was a powerhouse back then) before moving to Mali.
    The only other chipsets that I know of that use PowerVR GPUs are the TI OMAP 4xxx and 5xxx, and TI was talking about shutting down their SoC operations. Adreno, Mali, and Tegra have all been behind PowerVR for a while now.
  • Krysto - Friday, November 02, 2012 - link

    Probably because they are not very energy efficient. Reply
  • melgross - Friday, November 02, 2012 - link

    It isn't efficiency. It's performance. If something is twice as powerful and uses twice the energy, it's just as efficient. Until recently, Android devices have been pretty inefficient. Some are now pretty good. But it's cost. I would imagine (ahem!) that Imagination's IP costs more, and other manufacturers are selling on price. Reply
  • darkcrayon - Friday, November 02, 2012 - link

    Odd, since iOS devices are usually at or near the top of battery life for devices in their class (and even higher when you consider battery size). Obviously there are a million other factors that affect battery life, but if these GPUs are unusually poor at performance per watt compared to competitors I'd be surprised considering this. Reply
  • djgandy - Monday, November 05, 2012 - link

    Hilarious. PowerVR is the most energy efficient and has the least thermal issues. Unlike Tegra, which overheats, and Adreno, which has to throttle because it gets too hot.

    Mali, Tegra, and Adreno together are why Samsung has to put 9 Wh batteries in their phones while Apple only needs 6 Wh.
  • KitsuneKnight - Friday, November 02, 2012 - link

    It's quite disappointing indeed. Fortunately, the Adreno 320 (at least in the Optimus G, not the Nexus 4) appears to actually be pretty damn good (unlike the new Mali), so maybe there'll be a bit of competition on the GPU front... maybe we'll have a nice GPU battle raging (in addition to the CPU battle) by the time Intel arrives in full force on the mobile landscape.

    Or maybe it's just a fluke. Even in flagship Android phones, it seems like the manufacturers aren't really taking things all that seriously (witness the bajillion different SoCs in the SGS3).
  • AnotherHariSeldon - Saturday, November 03, 2012 - link

    Intel and Samsung use IMG IP, as does MediaTek, the fastest-growing global smartphone SoC manufacturer.

    Samsung uses IMG in the form of TI OMAP.
  • iwod - Friday, November 02, 2012 - link

    The A15 is approaching the needed desktop computing performance. Where are we in terms of graphics performance?

    Say the A57 is a Core 2 Duo-class CPU (not a fact, I am just guessing and giving examples here).

    What is a PowerVR SGX 554 MP4? Ivy Bridge G3000? Radeon 6370?
  • Zodiark1593 - Saturday, November 03, 2012 - link

    A 6370 (80 stream processors) approaches roughly 130-140 GFLOPS at 750 MHz. Not including specific optimizations on either part, I'd say the iPad's GPU performance should be roughly half, maybe slightly less.

    Considering many PC games are made with much stronger GPUs in mind (such GPUs rate 800+ GFLOPS), I'd estimate visuals to be worse on a 6370 on average than in a high-end, well-optimized game made for the SGX554MP4.

    And then, consider that the 6370 is a little over half as powerful as the Xbox 360 GPU (similar compute power to the 6450); tablets still have a little ways to go before hitting console performance levels, not counting additional quirks like the eDRAM in Xenos.

    What I would love to see, though, is AMD getting into the mobile GPU game as well. Even an 80 SP Radeon part in the 250 MHz range would shake things up.
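As a rough check on the numbers thrown around here: peak GFLOPS figures like these come from a simple multiply of ALU count, FLOPs per clock (typically 2, for a multiply-add), and clock speed. A minimal sketch in Python, using the approximate figures quoted in this comment rather than official specs:

```python
# Back-of-the-envelope peak GFLOPS: ALUs x FLOPs per ALU per clock x clock (GHz).
# The counts and clocks below are the rough figures quoted in the thread,
# not official specifications.
def peak_gflops(alus, ghz, flops_per_alu=2):
    """2 FLOPs/clock assumes one multiply-add per ALU per cycle."""
    return alus * flops_per_alu * ghz

radeon_6370 = peak_gflops(80, 0.75)  # 80 stream processors at 750 MHz
print(radeon_6370)  # 120.0 GFLOPS, in the same ballpark as the ~130-140 quoted
```

Real games rarely reach these peaks; memory bandwidth, ROPs, and driver overhead all cap what the ALUs can actually deliver.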
  • ananduser - Saturday, November 03, 2012 - link

    AMD was in the mobile GPU game. They sold that division to Qualcomm due to financial difficulties. It was called Imageon, and Qualcomm rebranded it Adreno. Reply
  • Penti - Saturday, November 03, 2012 - link

    If we only go by GFLOPS, it's roughly equivalent to HD 2000 or HD 2500 graphics from Intel. It's slower than the average integrated GPU nowadays. Or about as fast as a GeForce 210 or 310, and a third of a GT 620M notebook chip. Roughly, that is in the ballpark of an old X1600 or X800/X850 graphics card. It's close to the integrated Radeon HD 6320 and HD 6310 in the AMD Zacate E-350 and E-450 APUs. Here it will largely depend on other factors and drivers, though. Mobile GPUs will always be bandwidth limited, so it doesn't make much sense to put too large a GPU in here; that is also why Apple uses quad-channel memory for their higher-performing chips. Reply
  • krumme - Saturday, November 03, 2012 - link

    The APU E-350/E-450 is what, around 70 mm² on 40nm? And this A6X is around 130 mm² on 32nm.

    If performance is around the same for the GPU part, and say the CPU also, it means the old-generation Bobcat is at least 4 times as effective in perf/mm². Of course, much less so at lower voltages.

    Efficiency will improve very nicely with the new Jaguar with GCN.

    I doubt there will be that much difference in power/perf. If true, AMD/Intel still have a huge leg up in designing CPUs/APUs, and especially drivers on the GPU side (AMD).

    We need some comparisons of hardware on the same software platform.

    All these comparisons across different software are nice, but it's a mess to evaluate the hardware if it's not within the same family and platform. It's very difficult to do proper benchmarking here.

    Look how the SGX performs on the Intel platform, the 32nm Atoms. It's pathetic; it doesn't even work most of the time.
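The perf-per-mm² claim can be roughed out numerically. This is only a sketch using die sizes quoted in this thread; the "at least 4x" figure additionally assumes a correction for the 40nm-vs-32nm node difference, which is not modeled here:

```python
# Raw area ratio between the two chips, using figures quoted in the thread:
# A6X: 123 mm^2 on 32nm (per ltcommanderdata above, rounded to ~130 in the comment);
# E-350/E-450 APU: ~70 mm^2 on 40nm.
a6x_mm2 = 123.0
bobcat_mm2 = 70.0
area_ratio = a6x_mm2 / bobcat_mm2
print(round(area_ratio, 2))  # ~1.76: the A6X uses ~1.76x the area, on a smaller node
```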
  • Zodiark1593 - Saturday, November 03, 2012 - link

    We can't go by flops alone, though that's the only comparison I've seen.

    The 6370 is limited to a single 64-bit bus, so its advantage over the iPad 4 dwindles sharply, especially if equipped with DDR3 instead of GDDR3. 4 ROPs and 8 texture units finish off the specs. TDP comes in at roughly 15 watts.

    Apart from raw shader power and clock speed, there isn't a whole heck of a lot in the 6370's favor vs. mobile GPUs.
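For context on the bus-width point: peak memory bandwidth is just bus width times effective transfer rate. A sketch, with assumed (not confirmed) memory configurations:

```python
# Peak bandwidth (GB/s) = bus width in bytes x effective rate in MT/s / 1000.
# Both configurations below are illustrative assumptions, not confirmed specs.
def bandwidth_gbs(bus_bits, mtps):
    return (bus_bits / 8) * mtps / 1000

hd6370_ddr3 = bandwidth_gbs(64, 1600)    # 64-bit DDR3-1600
ipad4_lpddr2 = bandwidth_gbs(128, 1066)  # quad-channel (128-bit) LPDDR2-1066
print(hd6370_ddr3, round(ipad4_lpddr2, 1))  # 12.8 vs ~17.1 GB/s
```

On these assumed numbers the tablet actually has the wider pipe, which is the comment's point about the 6370's advantage dwindling.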
  • Penti - Sunday, November 04, 2012 - link

    I know we can't. It's all about the capabilities and chip-specific strengths, drivers, and whatever they use to overcome the limited bandwidth. Scaling down GeForce and AMD GPUs into these sizes is just horrible; they require much more bandwidth to perform well. AMD's old tile-based line, which they sold off and Qualcomm now uses, does quite well though. Reply
  • AnotherHariSeldon - Saturday, November 03, 2012 - link

    Just look at the amount of die area dedicated to the GPU in the A6X.

    I expect Samsung will have to revert to IMG Rogue (from the current ARM Mali) to remain competitive with Apple in the next iteration of product launches.

    Expecting to see ray-tracing tech becoming a factor as well...
  • UpSpin - Saturday, November 03, 2012 - link

    As long as we don't know what the Exynos 5 Dual looks like, it's hard to say that Mali is worse than PowerVR. What we know is that the Exynos 5 Dual outperforms the A5X and A6 in 'real world' gaming benchmarks, but gets beaten by the A6X.

    I doubt that the Mali T604 cores occupy as much die space on the Exynos 5 Dual as the PowerVR cores in the A6X do.
    I haven't found anything about the Exynos 5 Dual's die size, transistor count, or any other sort of analysis. As long as we don't have that information, you can't judge the GPU.
  • Tangey - Monday, November 05, 2012 - link

    5250 has not been implemented in a phone, so we have no idea what its performance is relative to the phone-only A6.

    I would not be surprised if a phone based 5250 clocks substantially lower than its tablet implementation.
  • Krysto - Saturday, November 03, 2012 - link

    It seems to me Egypt HD is the only benchmark that matters, because it's a complete graphics test at a high resolution. The others only test for specific things, which, even if they show higher numbers, might be bottlenecked by other components in the system, so it could be irrelevant that a chip scores 1000 more points than another.

    Running Egypt HD is exactly like running a game. So it seems Mali T604 is about 30% faster than the A5X GPU and the A6 GPU (iPhone 5), and about 35% slower than the A6X GPU (65% of the 554MP4). If Apple doesn't come out with an iPad 5 in spring (which would be pretty crazy so soon), then I expect Mali T624 or T628 to take the crown again in the first half of next year.

    But that's just raw performance. In terms of energy efficiency, Mali T604 seems to be about 30% more efficient than the 554MP4. Normalized for energy efficiency, which Samsung went for here (they wanted a smaller battery than Apple's to help undercut the iPad by $100), the 554MP4 is only about 10% faster than Mali T604 at the same power consumption level, maybe less than that.
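Taking the percentages in this comment at face value, the iso-power arithmetic actually lands a bit above the ~10% quoted. A sketch (all inputs are the comment's own estimates, not measured data):

```python
# Relative-performance chain from the comment above, taken at face value.
t604_vs_554mp4 = 0.65    # Mali T604 at ~65% of the 554MP4's raw Egypt HD performance
efficiency_bonus = 1.30  # T604 assumed ~30% more efficient per watt
t604_iso_power = t604_vs_554mp4 * efficiency_bonus
print(round(t604_iso_power, 3))      # 0.845
print(round(1 / t604_iso_power, 2))  # ~1.18: 554MP4 ~18% faster at equal power
```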
  • AnotherHariSeldon - Saturday, November 03, 2012 - link

    "In terms of energy efficiency, Mali T604 seems to be about 30% more efficient than 554MP4, so normalized for energy efficiency."

    I'm not sure you're quite understanding that right ;)
  • djgandy - Monday, November 05, 2012 - link

    Do you work for ARM? Ex-NVIDIA employee maybe? You seem to have a case of nonsense diarrhea. Reply
  • qwerty0722 - Sunday, November 04, 2012 - link

    I think the result is really great, but as we can see, the iOS platform almost always scores higher than the Android platform. So if we put the S4 Pro on iOS it could be just like an SR-71, but if we ran the A6X on Android it could produce horrible speeds (like a Ford Mustang GT, etc.)

    What I want to say is: when we see different platforms put into the same test, we should realize it's not that platform A is faster than platform B; they're just different platforms. Maybe iOS is a great environment for OpenGL, and Android maybe not.

    In the real world, when you play a game, the resolutions on Android (2560x1600, 1920x1200, 1280x768 or 1280x720) are often higher than on iOS (1136x640 or 2048x1536),

    and it is also smooth on Android. So I think this test is just like running the same software, like Resident Evil 6, on PS3 vs. Xbox 360 (THEY ALL PERFORM WELL!!)
  • djgandy - Monday, November 05, 2012 - link

    Drivers may be better on one platform, yes. However, that is often what benchmarks are for: they are much simpler than games and can therefore target specific parts of the GPU. Fill rate is a metric which can easily be calculated with a piece of paper.

    If platform A is missing its expected fill rate with chip A but platform B is hitting it, then you have a driver issue on platform A. I don't think we've seen much evidence of this, though. Android benchmarks generally deliver the expected performance; look at the SGX540 (as old as it is), whose results are in line with the capabilities of that chip.
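The paper calculation mentioned above is just pixels written per clock (ROPs) times core clock. A sketch; the ROP count and clock here are assumptions for illustration, since vendors rarely publish both:

```python
# Theoretical fill rate (Mpixels/s) = pixels written per clock (ROPs) x clock (MHz).
# ROP count and clock are assumed values, not published specs.
def fillrate_mpix(rops, mhz):
    return rops * mhz

print(fillrate_mpix(8, 300))  # 2400 Mpixels/s for a hypothetical 8-ROP part at 300 MHz
```

A benchmark's measured fill rate can then be compared against this theoretical figure to spot a driver problem, as the comment describes.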
  • qwerty0722 - Monday, November 05, 2012 - link

    OK, thanks for your reply, but I still don't get it ; )

    Did you mean the SGX554MP4 really is a high-performance GPU that makes such a difference, and it's not just the iOS driver?

    Because my personal question is: why can Apple's dual-core + SGX543MP3/SGX554MP4 run this benchmark so far ahead of Android (quad-core + Adreno 320)? Is this an architecture problem? Or...?

    Can you make your point simpler (sorry for my comprehension)?

    And thanks again for your reply~
  • ShAdOwXPR - Monday, November 05, 2012 - link

    What are the A6X's CPU GFLOPS, and combined? And will the next-gen Apple CPU/GPU catch up to current consoles? (P.S. I know next-gen consoles will be 600 GFLOPS-1.5 TFLOPS)

    I found a few console CPU/GPU GFLOPS numbers:
    Xbox | CPU: 1.5 GFLOPS | GPU: 5.8 GFLOPS | Combined: 7.3 GFLOPS
    Xbox360 | CPU: 115 GFLOPS | GPU: 240 GFLOPS | Combined: 355 GFLOPS
    Dreamcast | CPU: 1.4 GFLOPS | GPU: 0.1 GFLOPS | Combined: 1.5 GFLOPS
    Wii | CPU: 60 GFLOPS | GPU: 1 GFLOPS | Combined: 61 GFLOPS
    PS2 | CPU: 6 GFLOPS | GPU: 0 GFLOPS | Combined: 6 GFLOPS
    iPad 4 | CPU: --- GFLOPS | GPU: 78 GFLOPS | Combined: --- GFLOPS
  • ShAdOwXPR - Monday, November 05, 2012 - link

    For both the Xbox 360 and Wii U, the real GFLOPS number is the first one, not the theoretical; apparently the real and theoretical are very different...

    Xbox | CPU: 1.5 GFLOPS | GPU: 5.8 GFLOPS | Combined: 7.3 GFLOPS
    Xbox360 | CPU: ???-115 GFLOPS | GPU: 77-240 GFLOPS | Combined: 355 GFLOPS
    Dreamcast | CPU: 1.4 GFLOPS | GPU: 0.1 GFLOPS | Combined: 1.5 GFLOPS
    Wii | CPU: 60 GFLOPS | GPU: 1 GFLOPS | Combined: 61 GFLOPS
    PS2 | CPU: 6 GFLOPS | GPU: 0 GFLOPS | Combined: 6 GFLOPS
    Wii U | CPU: 50-260 GFLOPS | GPU: ???-600 GFLOPS | Combined: ??? GFLOPS
    iPad 4 | CPU: ??? GFLOPS | GPU: 78 GFLOPS | Combined: ??? GFLOPS
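For what it's worth, the iPad 4 GPU figure in the table above can be reconstructed from the SGX554MP4's ALU layout. The pipe count and clock here are commonly cited estimates, not Apple-published numbers:

```python
# SGX554MP4 peak GFLOPS = cores x USSE2 pipes per core x FLOPs per pipe per clock x GHz.
# All four inputs are estimates; Apple does not publish clocks for its SoCs.
cores, pipes_per_core, flops_per_pipe = 4, 8, 8
clock_ghz = 0.300
gflops = cores * pipes_per_core * flops_per_pipe * clock_ghz
print(round(gflops, 1))  # 76.8, close to the ~78 listed in the table
```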
  • tipoo - Monday, November 05, 2012 - link

    Do you think some more extensive and real world default browser testing would be possible with the Surface, the iPad, and whatever flagship Android tablet (Nexus 10 preferred)? I think that would be pretty cool. I'm more curious about the multi tab performance than synthetics. Clearly they optimize around benchmarks, I'd rather see page load times, how they render background tabs, how many tabs it takes to make them stutter, etc. Reply
  • Walkop - Wednesday, November 07, 2012 - link

    I have to say, I was actually quite disappointed to find that the Exynos chip under-performs compared to the A6X. That thing is a BEAST of a chip to beat out the best Exynos yet.

    However, Android users (and Apple) are forgetting one thing: the Nexus 10 looks to be the most powerful Android Tablet on the planet at release. Sure, the iPad is faster with the A6X and that is impressive, but the Exynos powers all those pixels while still beating out all Android-based competition in almost every category. Considering that new Android phones are super-smooth anyway, especially Nexus devices (and this thing BLOWS away all previous Nexus devices) we have the best Android tablet yet. Something that can really compete with the iPad, even with its faster SoC.

    Again, great job and a great win for Apple, but this shows that Android makers do have the potential to catch up to Apple eventually.
