A9's GPU: Imagination PowerVR GT7600

With so much time spent talking about A9 from the perspective of its manufacturing process and its Twister CPU, it’s all too easy to forget that Apple has been working on far more under the hood than just CPU performance. As has been the case for generations now, Apple continues to focus on GPU performance, laying the groundwork for significant performance improvements with every generation.

Going all the way back to the first iPhone and its Samsung-developed SoC, Apple has been a patron of Imagination Technologies and their PowerVR GPUs. This has been a productive relationship for both parties, and for A9 this hasn’t changed. To no surprise then, the GPU in the A9 is another design in Imagination’s PowerVR Rogue family, the GT7600.

Briefly, while Apple continues not to disclose the GPU used in their designs – referring to the A9’s GPU only as iOS GPU Family 3 v1 – a look at the iOS Developer Library makes it clear what GPU family is being used. Apple still uses tile-based deferred rendering GPUs (a description only PowerVR’s designs fit), so the only real questions are which family is in use and how many cores are present.

With A8 and its GX6450, there was a pretty clear smoking gun to identify the GPU family via the inclusion of ASTC support, a feature only available on Series 6XT and newer GPUs. There aren’t any such smoking guns on the A9, but the Metal feature tables indicate that there are a handful of new low-level features which are indicative of a newer revision of the PowerVR Rogue architecture. Coupled with the fact that Imagination announced PowerVR Series 7 nearly a year ago and that Apple has proven able to implement a new PowerVR design in under a year, it’s a safe bet that A9 is using a Series 7 design.

As for the configuration, the A9 die shot quickly answers that one. There are 6 distinct GPU cores on the A9 die, divided up into 3 pairs with a shared texture unit in between them. So it may have taken Apple a generation longer than I initially expected, but with A9 we’re finally looking at a 6 core GPU design for the iPhone.

From a feature and design standpoint then, the GT7600 is not a significant departure from the GPUs in the A8 and A7 SoCs; however, it does have some notable improvements along with optimizations to boost performance across the board. Notably, relative to the GX6450 it features a geometry tessellation co-processor as a base feature, a function that was merely optional on Series 6XT and, at least in Apple’s case, went unused. Unfortunately, looking through Apple’s developer documentation it does not appear that tessellation support has been added to Metal, so even assuming for the moment that Apple hasn’t stripped this hardware out, there is currently no API support for it.

Otherwise the bulk of Imagination’s focus has been on small tweaks to improve the Rogue architecture’s overall efficiency. Among these, the Special Function Units can now natively handle FP16 operations, saving power versus the all-FP32 SFUs of Series 6XT. SFU operations can now also be co-issued with ALU operations, which improves performance whenever SFU operations are in flight (which, in Imagination’s experience, happens more often than expected). Finally, the Vertex Data Master (geometry frontend), Compute Data Master (compute frontend), and the Coarse Grain Scheduler have all been updated to improve their throughput, and in the scheduler’s case to better keep the USCs from stalling on tile interdependencies.

Looking at the broader picture, after initially being surprised that Apple didn’t jump to a 6 core design with A8, with A9 it makes a lot of sense why they’d do it now. GPUs have been, and continue to be, the biggest consumers of memory bandwidth in high-performance SoCs, to the point where Apple has outfitted all of their tablet-class SoCs with a wider 128-bit memory bus in order to feed those larger GPUs. Conversely, a 64-bit memory bus with LPDDR3 has always represented a memory bandwidth limit that would bottleneck a more aggressive GPU design. With the move to LPDDR4, however, Apple has doubled their memory bandwidth, which, coupled with the larger L3 cache, means that they now have the means to effectively feed a larger 6 core GPU.
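As a rough sanity check on the bandwidth doubling, peak theoretical bandwidth is just bus width times transfer rate. The transfer rates below (LPDDR3-1600 for A8, LPDDR4-3200 for A9) are assumed typical configurations on my part, not Apple-published specifications:

```python
# Theoretical peak memory bandwidth: bus width (bytes) x transfer rate (MT/s).
# Transfer rates are assumed, not Apple-confirmed figures.
def peak_bandwidth_gbps(bus_width_bits, megatransfers_per_sec):
    """Return theoretical peak bandwidth in GB/s."""
    return (bus_width_bits / 8) * megatransfers_per_sec / 1000

lpddr3 = peak_bandwidth_gbps(64, 1600)   # A8-style: 64-bit LPDDR3-1600
lpddr4 = peak_bandwidth_gbps(64, 3200)   # A9-style: 64-bit LPDDR4-3200

print(lpddr3, "GB/s ->", lpddr4, "GB/s")  # 12.8 GB/s -> 25.6 GB/s
```

Which works out to 12.8 GB/s versus 25.6 GB/s, the 2x jump described above, with Apple’s 128-bit tablet SoCs doubling those figures again.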

Overall then, between the 50% increase in the number of GPU cores, Imagination’s architectural efficiency improvements, Apple’s own implementation optimizations, and what I don’t doubt is at least a decent increase in GPU clockspeed, Apple is promising that A9 should see an impressive 90% increase in GPU performance relative to A8. And as we’ll see in our performance benchmarks, they are more than capable of delivering on that promise.
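Backing the implied per-core gain out of those figures is simple arithmetic; the 50% and 90% numbers come from the text above, while the split between clockspeed and architectural gains is unknown:

```python
# If total GPU performance rises 90% while core count rises 50%,
# the residual per-core gain (clockspeed + architecture combined) is:
core_scaling = 1.5   # 4 cores -> 6 cores
total_gain = 1.9     # Apple's claimed 90% uplift over A8

per_core_gain = total_gain / core_scaling
print(round(per_core_gain, 2))  # ~1.27, i.e. roughly 27% per core
```

In other words, beyond the extra cores, something on the order of a 27% combined per-core improvement is needed to hit Apple’s claim.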

Mobile SoC GPU Comparison

                              PowerVR SGX 543MP3   PowerVR G6430   PowerVR GX6450   PowerVR GT7600
Used In                       iPhone 5             iPhone 5s       iPhone 6         iPhone 6s
SIMD Name                     USSE2                USC             USC              USC
# of SIMDs                    12                   4               4                6
MADs per SIMD                 4                    32              32               32
Total MADs                    48                   128             128              192
Theoretical GFLOPS @ 300MHz   28.8 GFLOPS          76.8 GFLOPS     76.8 GFLOPS      115.2 GFLOPS
Pixels/Clock                  N/A                  8               8                12
Texels/Clock                  N/A                  8               8                12
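The GFLOPS row in the table follows directly from the MAD counts, since each MAD (multiply-add) counts as two floating point operations per clock:

```python
# Theoretical FP32 throughput: MADs x 2 FLOPs per MAD x clock (GHz) = GFLOPS.
def gflops(total_mads, clock_ghz=0.3):
    return total_mads * 2 * clock_ghz

# Reproduce the table's 300MHz column for each GPU's total MAD count.
for name, mads in [("SGX 543MP3", 48), ("G6430/GX6450", 128), ("GT7600", 192)]:
    print(name, round(gflops(mads), 1), "GFLOPS")
```

The 300MHz clock here is only a normalization point for comparison; Apple does not disclose the actual GPU clockspeed.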
531 Comments


  • akdj - Tuesday, November 3, 2015 - link

    Hi Josh and Ryan,
    Many, MANY thanks for the insight and in-depth review. I've just finished my second read (it was late last night when I noticed the review and read through) and your experiences mimic mine, with a single exception. I'm a business owner, have been for over 26 years now, and use phones for business and personal. I also outfit employees so I have a chance to stay 'ambidextrous', keeping a foot in Android, rest of the body in iOS ... But some things I do enjoy on both my older Note 4 & newer S6. No intrigue with the Note 5 other than its SoC, speed of internal storage and design over my older N4. As an S6 Edge owner I'm well aware of the speeds uninstalling, installing apps, opening them, the 'feel' of the newer 2015 Sammy phones as well as the exceptional speed of the Exynos processor. That said, you made a remark I don't quite agree with:
    "The second generation of TouchID isn’t quite as life-changing, but it’s a welcome improvement nonetheless. Again, this is a case where there was friction in the user experience that wasn’t really noticeable until it was gone. Obviously, Apple is no longer the only one at this level of user experience with fingerprint scanners but they are keeping up."
    I'm not sure which phone you've found that parallels the iPhone 6s/6s+ for fingerprint registration. As it's certainly not the S6/S6+/Note 5 or LG (I've got one of their freebie 8" LG tabs from AT&T running LP). I'm hoping anyway lol. My silly S6 is just finally starting to correctly register 50% of the time with the 5.1 update. The previous six months I was lucky to have my thumb recognized 1 of 5 times. And it's registered as FOUR different 'fingers'.
    I'm also an owner of the 6s+ and even checking the time or setting an alarm, turning the flashlight on, etc...it's so damn quick, I'm automatically on the home screen. It's ...pardon the pun, lightning fast and immediate. I guess I'm curious as to which OEM Apple is keeping up with, as I had the 5s and 6+ standard as well. The Note 4 is a useless implementation and the S6, while better, is a LONG way off from 'keeping up with...' Apple again IMHO. Genuinely curious as to the OEM making better or even similar performing and 'protective' measures than Apple.
    Other than that silly nitpick, I agree completely and haven't enjoyed an iPhone as much since the iPhone 4 and its HiDPI display. If I recall, another 'first', wasn't it? (Like the 5s FP reader, actually able to 'read' an FP ;)). Maybe it's my aging mid-40s eyes but the higher resolutions and larger displays have literally kept pace with my deteriorating vision!

    Once again, many thanks for the perfectly balanced nerd/everyday 'Joe n Jane' subjective review of 'real world use'. Always refreshing to hear... I mean read your reviews, un-rushed to keep up with the herds the day after release or a week post NDA, minus the carrying around and using ...or simple resolution, 100% 'chart n number' reviews.
    Loved it. And I'm loving the iPhone 6s+. It's truly a computer in my pocket. I know you briefly touched on the expanded radios both WiFi and LTE, another maybe at first unnoticeable unless ...again as you mention an iCloud restoration of significant size, but a HUGE end user boon. These are incredibly fast, seemingly more 'stable' in 5MHz mode. (Maybe a bad word, stable but hard to put my finger on it, as older modems on the iPhone with AC/5MHz or is it GHz? Now I'm lost. This one seems faster, more efficient and stable than earlier versions )
    My wife has an identical iPhone 6s+, 128GB. Hers is Sammy, mine TSMC. Neither has shown any significant difference in battery draw from the other. Mine measures 2238/4437 in GBench, hers 2242/4405 after six runs ...that's the mean. Power and efficiency are nearly identical; after a weekend at our cabin we both had single-digit %'s and used them nearly the same the entire weekend.
    Very VERY great phone
    J
  • MarcSP - Wednesday, November 4, 2015 - link

    Thanks for your explanation :-). Still, I think there must be something else. I mean, most Samsung phones also use AMOLED and did not get such a low score in browsing, and the Snapdragon 800 is not a very slow SoC. Even today there are many low-end and even mid-range phones sold with weaker SoCs.
  • zodiacfml - Wednesday, November 4, 2015 - link

    I don't like Apple, but their engineering and design is very impressive. I wonder how the new CPU compares to a Core M.
  • tharun118 - Wednesday, November 4, 2015 - link

    The best phone? Seriously? I've seen a lot of people saying iPhone is THE BEST phone, but AnandTech? Come on.. I believe that there can never be a "THE BEST phone". Yes, iPhone 6s+ has a very good SoC, reliable camera, 3D Touch, etc, but like every flagship phone, there are compromises and drawbacks. For me, I choose a smartphone based on 4 major aspects. First, the screen. I know Apple lovers always defend their 320+ PPI screen saying that's more than enough and they don't need anything more. But the truth is, they are far behind Samsung and that will likely change in 7 or 7s. Second, the camera: this is purely subjective, there are people who'd prefer photos from an iPhone and there are people who'd prefer photos from 2015 Android flagships (S6, Note 5, G4, 6P, etc). Third, battery and performance: Apple is better here by a tiny margin due to their vertical integration. I think Android phones will never reach the exact smoothness in performance and efficiency in power consumption of the latest iPhone, due to fragmentation. Fourth, customisation: No comments here, but I understand there are lots of people who'd happily use their phone the way their manufacturer tells them to. I'm definitely not one among them. I try to balance all these 4 aspects and my choice this year was a Galaxy S6. Of course, there are bonus features such as wireless charging, quick charge (very useful), IR port, etc. But still, I wouldn't call the S6 THE BEST. Neither is an iPhone 6s+.
  • Vincog - Wednesday, November 4, 2015 - link

    I got an iPhone 6s with a Samsung chip here, and my battery will decrease 1% every 5 minutes in use or 1% every 15 minutes on standby... (take note: all background refresh off, location off, only Hey Siri on) ..Even my iPhone 5s is better than this one!! 😭😭😭😭
  • Tigran - Wednesday, November 4, 2015 - link

    ***
    Looking at GFXBench, which is an infinite loop of the T-Rex on-screen benchmark to approximate intensive video gaming we see that the iPhone 6s doesn’t last very long either, but the performance throughout the test is incredible. Due to 1334x750 display resolution and strong GPU, the iPhone 6s manages to last the entire test without any notable throttling, and effectively pegged at the refresh rate of the display.
    ***
    Why is V-Sync (which limits T-Rex on-screen to 60 fps) ignored? And what about this throttling evidence (20-22% in GFXBench off-screen):
    http://forums.anandtech.com/showpost.php?p=3772777...
  • blackcrayon - Wednesday, November 4, 2015 - link

    They mentioned that the 6s+ throttled slightly due to the higher resolution, so it stands to reason that the 6s would also throttle when rendering at a higher resolution offscreen. But it's nowhere near the throttling of any of the competitors; games remain playable throughout a reasonable gaming session.
  • Tigran - Wednesday, November 4, 2015 - link

    You don't get it. It's not about resolution - it's about T-Rex on-screen, which limits performance to 60 fps. Without this limit iPhone 6s performance would be much higher, so it is incorrect to cite T-Rex on-screen when discussing iPhone 6s throttling. If there is throttling, it can decrease from 100 to 70 fps, but you will see only 60 fps during the whole test - because of V-Sync. And there is evidence of throttling in Manhattan (which doesn't reach the 60 fps limit) actually - see my link above (20-22% throttling). I can add that a popular Russian laboratory (overclockers.ru) tested throttling of the iPhone 6s via Basemark Metal, and they found enormous throttling there - from 911 down to 525 (74%).
  • zhiliangh - Wednesday, November 4, 2015 - link

    Thank you! I have been waiting for your review before upgrading any phone this year. This is a must-read iphone review.
  • Spunjji - Wednesday, November 4, 2015 - link

    I have a bit of a gripe regarding the conclusions in the camera section. The LG G4 is clearly providing better images at night than the iPhone 6s and 6s Plus - granted there is "less motion blur" in the Apple images, but they're also quite clearly underexposed by at least a stop. It therefore seems odd to conclude that a product which produces grainier, less-detailed and murkier images than the competition is better. You could produce similarly non-blurry results on the G4 by adjusting exposure compensation and then have the best of both worlds!
