A8’s GPU: Imagination Technologies’ PowerVR GX6450

Last but not least on our tour of the A8 SoC is Apple’s GPU of choice, Imagination’s PowerVR GX6450.

When Apple first announced the A8 SoC as part of their iPhone keynote, they told us to expect a nearly 50% increase in graphics performance. Based on that information and the fact that Apple was moving to a denser 20nm process, we initially believed that Apple would be upgrading from A7’s 4-core PowerVR design to a 6-core design, especially in light of the higher resolution displays on the iPhone 6 and iPhone 6 Plus.

Instead our analysis with Chipworks found that only four GPU cores were present on A8, which ruled out a 6-core design and considerably narrowed the field. Based on that information and, more importantly, Apple’s Metal Programming Guide, we have been able to pin this down to a single GPU: the PowerVR GX6450.

The GX6450 is the immediate successor to the G6430 first used in the A7 and is based on Imagination’s PowerVR Series6XT architecture. Imagination first announced PowerVR Series6XT to the public at CES 2014, and now just a short eight months later we are seeing the first Series6XT hardware reach retail.

We have already covered the PowerVR Series6/Series6XT architecture in some detail earlier this year so we won’t go through all of it again, but we would encourage anyone who is interested to take a look at our architectural analysis for additional information. Otherwise we will be spending the bulk of our time looking at how GX6450 differs from G6430 and why Apple would choose this specific GPU.

From a technical perspective Series6XT is a direct evolution of Series6, and GX6450 is likewise a direct evolution of G6430. In a 4-core configuration there are only a limited number of places where GX6450 outright has more hardware than G6430 (e.g. additional ALUs); instead Series6XT focuses on adding features and improving performance over Series6 through various tweaks and optimizations to the architecture. Series6 is actually over two years old at this point – it was first introduced to the public at CES 2012 – so a lot has happened in the mobile GPU landscape in the intervening years.

The closest thing to a marquee feature on Series6XT is support for Adaptive Scalable Texture Compression (ASTC), a next-generation texture compression technology that is slowly making its way into GPUs from a number of manufacturers. Developed by Khronos, the consortium responsible for OpenGL ES, ASTC is designed to offer better texture compression (with finer grained quality options) than existing texture compression formats while also serving as a universal format supported across GPU vendors. In Apple’s case they have always used PowerVR GPUs – and hence all of their products support PVRTC and more recently PVRTC2 – but exposing ASTC allows them to take advantage of the quality improvements while also making game development and porting from other platforms easier.
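For developers the practical side of this is a simple runtime check. The following is a minimal sketch (not Apple sample code) of how an iOS 8 app might pick a compressed texture format under Metal, assuming MTLFeatureSet.iOS_GPUFamily2_v1 – the feature set associated with A8-class GPUs – as the gate for ASTC, with PVRTC as the fallback on older PowerVR parts.

```swift
import Metal

// Sketch: choose a compressed pixel format based on what the GPU exposes.
guard let device = MTLCreateSystemDefaultDevice() else {
    fatalError("Metal is not supported on this device")
}

let compressedFormat: MTLPixelFormat
if device.supportsFeatureSet(.iOS_GPUFamily2_v1) {
    compressedFormat = .astc_4x4_ldr      // A8 (GX6450) and later: ASTC is available
} else {
    compressedFormat = .pvrtc_rgba_4bpp   // A7 (G6430) and earlier: stick with PVRTC
}

// The chosen format would then be used when creating texture descriptors, e.g.
// MTLTextureDescriptor.texture2DDescriptor(pixelFormat:width:height:mipmapped:).
print("Using compressed texture format: \(compressedFormat.rawValue)")
```

The appeal for cross-platform titles is that the ASTC branch is the same asset pipeline already used on other ASTC-capable GPUs, with PVRTC relegated to a legacy path.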

Less visible to users but certainly important to Apple, Series6XT also includes new power management capabilities to reduce power consumption under idle and light workloads. Through finer grained power gating technology that Imagination dubs “PowerGearing G6XT”, GX6450 can now have its shading clusters (USCs) powered down individually, allowing only as many of them as are necessary to be fired up. As Apple continues to min-max their designs, being able to idle at a lower power state can be used to improve battery life and/or increase how often and how long the A8’s GPU uses higher power states, improving overall efficiency.


[Chart: Apple iPhone GPU Performance Estimate Over The Years]

And perhaps most importantly, Series6XT incorporates a series of under-the-hood optimizations intended to improve overall performance. We only have limited details from Imagination on how the internals of PowerVR architectures operate, so in some areas we know quite a bit about what Imagination has been up to, while in others their architectures remain something akin to a black box. At any rate, Imagination’s goal for Series6XT was to improve performance by up to 50% – this appears to be where Apple’s 50% performance improvement claim comes from – though as we’ll see, the gains in real world applications are not going to be quite as potent.

What we do know about Series6XT is that Imagination has made some changes to the structure of the USCs themselves. Series6XT still uses a 16-wide SIMD design, but in each pipeline Imagination has added another set of medium/half-precision (FP16) ALUs specifically to improve FP16 performance. Whereas each Series6 pipeline had 2 FP16 ALUs capable of 3 FLOPs each (6 FP16 FLOPs per clock), Series6XT moves to 4 FP16 ALUs capable of 2 FLOPs each (8 FP16 FLOPs per clock). This is the only outright increase in shader hardware in going from Series6 to Series6XT, and on paper it improves FP16 throughput by 33% at equivalent clock speeds.

The focus on FP16 is interesting, though for iOS it may be misplaced. Half-precision floating point operations are an excellent way to conserve bandwidth and power by not firing up the more expensive FP32 ALUs, but the tradeoff is that the numbers they work with aren’t nearly as precise, so their use has to be carefully planned. In practice, while FP16 operations do see some use, they are by no means the predominant type of floating point operation on the GPU, so the extra FP16 ALUs only deliver their 33% gain in cases where performance is actually constrained by FP16 throughput.

FP32 performance meanwhile remains unchanged. Each USC pipeline contains two FP32 ALUs, good for up to 4 FP32 FLOPs per pipeline per clock, or to use our typical metric for the complete 4-cluster GPU, 128 MADs (Multiply-Adds) per clock.
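To make the relationship between these per-pipeline figures and the GFLOPS numbers in the table below explicit, here is a quick sketch of the arithmetic. The 300MHz figure is only the reference clock we use for comparisons, not a confirmed A8 GPU clock.

```swift
// Peak-throughput arithmetic for a 4-cluster Rogue GPU (G6430/GX6450).
let clusters = 4                 // USCs
let pipelinesPerCluster = 16     // 16-wide SIMD per USC
let fp32MADsPerPipeline = 2      // 2 FP32 MAD ALUs per pipeline
let flopsPerMAD = 2              // a multiply-add counts as 2 FLOPs

let madsPerClock = clusters * pipelinesPerCluster * fp32MADsPerPipeline      // 128
let referenceClockGHz = 0.3                                                  // 300MHz reference
let peakFP32GFLOPS = Double(madsPerClock * flopsPerMAD) * referenceClockGHz  // 76.8

// Series6XT's 8 FP16 FLOPs per pipeline per clock, under the same assumptions.
let peakFP16GFLOPS = Double(clusters * pipelinesPerCluster * 8) * referenceClockGHz  // 153.6

print("FP32: \(madsPerClock) MADs/clock -> \(peakFP32GFLOPS) GFLOPS @ 300MHz")
print("FP16: \(peakFP16GFLOPS) GFLOPS @ 300MHz (paper peak)")
```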

The rest of Series6XT’s optimizations are found at the front end and back end, which handle geometry processing and pixel fill respectively. Imagination has not told us exactly what has been done here, but both areas have been targeted to improve sustained polygon throughput and pixel fillrate. These more generic optimizations stand to benefit performance across the board, though by how much we cannot say.

One final optimization we want to point out for Series6XT is that Imagination has made additional under-the-hood changes to improve GPU compute performance. We have not talked about GPU compute on iOS devices thus far, as until now Apple has not exposed any APIs suitable for it (e.g. OpenCL is not available on iOS). With iOS 8 Apple is releasing their Metal API, which is robust enough to be used for both graphics and compute. How developers put this capability to use remains to be seen, but for compute workloads GX6450 should fare even better relative to G6430.
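To give a sense of what compute through Metal looks like, below is a minimal, hedged sketch of dispatching a compute kernel; the kernel name "scale_values" and the buffer contents are ours for illustration, not anything Apple ships.

```swift
import Metal

// Sketch: run a (hypothetical) compute kernel over 1024 floats via Metal.
guard let device = MTLCreateSystemDefaultDevice(),
      let queue = device.makeCommandQueue(),
      let library = device.makeDefaultLibrary(),
      let kernel = library.makeFunction(name: "scale_values") else {
    fatalError("Metal compute setup failed")
}

let pipeline = try! device.makeComputePipelineState(function: kernel)

var input = [Float](repeating: 1.0, count: 1024)
let buffer = device.makeBuffer(bytes: &input,
                               length: input.count * MemoryLayout<Float>.stride,
                               options: [])!

let commandBuffer = queue.makeCommandBuffer()!
let encoder = commandBuffer.makeComputeCommandEncoder()!
encoder.setComputePipelineState(pipeline)
encoder.setBuffer(buffer, offset: 0, index: 0)

// One threadgroup per 64 elements; real code would tune this per GPU.
let threadsPerGroup = MTLSize(width: 64, height: 1, depth: 1)
let groups = MTLSize(width: input.count / 64, height: 1, depth: 1)
encoder.dispatchThreadgroups(groups, threadsPerThreadgroup: threadsPerGroup)
encoder.endEncoding()

commandBuffer.commit()
commandBuffer.waitUntilCompleted()
```

The notable point is that this is the same command queue and command buffer machinery used for rendering, which is what makes mixing graphics and compute work in a single frame practical.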

Mobile SoC GPU Comparison
GPU                  Used In                  SIMD Name  # of SIMDs  MADs per SIMD  Total MADs  GFLOPS @ 300MHz  Pixels/Clock  Texels/Clock
PowerVR SGX 543MP2   iPad 2/iPhone 4S         USSE2      8           4              32          19.2             N/A           N/A
PowerVR SGX 543MP3   iPhone 5                 USSE2      12          4              48          28.8             N/A           N/A
PowerVR SGX 543MP4   iPad 3                   USSE2      16          4              64          38.4             N/A           N/A
PowerVR SGX 554MP4   iPad 4                   USSE2      32          4              128         76.8             N/A           N/A
PowerVR G6430        iPad Air/iPhone 5s       USC        4           32             128         76.8             8             8
PowerVR GX6450       iPhone 6/iPhone 6 Plus   USC        4           32             128         76.8             8             8

The one wildcard when talking about performance here is going to be clock speeds. Apple doesn’t expose these and they aren’t easy to test for (yet), though in the long term Metal offers some interesting possibilities for nailing that down, or at least getting a better idea of relative clock speeds.
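As a rough illustration of the idea, one could run a fixed, ALU-bound compute workload (built as in the earlier dispatch sketch) on devices with identical ALU configurations and compare wall-clock times; the ratio gives a crude estimate of relative GPU clocks. This is our own back-of-the-envelope sketch, not a rigorous methodology – thermals, drivers, and scheduling all add noise.

```swift
import QuartzCore  // CACurrentMediaTime

// Hypothetical helper: builds, commits, and waits on a fixed ALU-bound compute
// workload, e.g. the dispatch shown earlier repeated `iterations` times.
func runFixedWorkload(iterations: Int) {
    for _ in 0..<iterations {
        // ... encode and commit a command buffer as in the earlier sketch,
        //     then waitUntilCompleted() ...
    }
}

let start = CACurrentMediaTime()
runFixedWorkload(iterations: 100)
let elapsed = CACurrentMediaTime() - start

// Comparing `elapsed` between an A7 and an A8 device (same ALU configuration)
// gives a crude, noisy estimate of their relative GPU clock speeds.
print("Fixed GPU workload took \(elapsed) seconds")
```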

In any case, we’ll take a look at our GPU benchmarks in depth in a bit. Overall GPU performance compared to A7 and its G6430 is consistently better, but the exact gain depends on the test at hand. Some tests come very close to the 50% mark while others improve by just 15-20%. The deciding factor generally seems to be whether the test is ALU-bound; because the USC has not changed significantly from G6430 to GX6450 outside of those additional FP16 ALUs, tests that lean heavily on the FP32 ALUs show less of an improvement, while more balanced tests (or at least tests more defined by pixel fillrate) show greater gains. In general we should be looking at a 30-35% performance improvement.

Why Four Cores?

Admittedly, the revelation that A8 uses a 4-core PowerVR design surprised us; we had figured a 6-core design would be a shoo-in for A8, especially since Apple was on the receiving end of the density improvements from TSMC’s 20nm process. But upon further reflection, an additional two cores are likely more than Apple needed or wanted.

The biggest factor here is that coming from G6430 in the A7, performance has seen a solid improvement despite sticking to only four GPU cores. Between the performance improvements of the Series6XT architecture and any clock speed increases from Apple, A8 gets quite a bit more GPU performance to play with. The increased resolutions of the iPhone 6 and 6 Plus screens in turn require more performance if Apple wants to keep native resolution performance from significantly regressing, and GX6450 is capable of delivering that. Never mind the fact that G6430 also drove the iPad Air and its much larger 2048x1536 pixel display.

PowerVR Series6/6XT "Rogue"
GPU      # of Clusters  FP32 Ops per Cluster per Clock  Total FP32 Ops per Clock  Optimization
G6200    2              64                              128                       Area
G6230    2              64                              128                       Performance
GX6240   2              64                              128                       Area
GX6250   2              64                              128                       Performance
G6400    4              64                              256                       Area
G6430    4              64                              256                       Performance
GX6450   4              64                              256                       Performance
G6630    6              64                              384                       Performance
GX6650   6              64                              384                       Performance

These performance improvements in Series6XT have a cost as well, and that cost is suitably reflected in the estimated die sizes for each GPU. The G6430 was 22.1mm² on the 28nm A7, while the GX6450 is 19.1mm² on A8. Though GX6450 is smaller overall, it’s nowhere near the roughly 11.1mm² a pure and perfect die shrink of G6430 would occupy. Limited area scaling aside, GX6450’s additional functionality and additional performance require more transistors, and at the end of the day Apple doesn’t see a significantly smaller GPU because of this. In other words, the upgrade from G6430 to GX6450 has delivered much of the performance (and consumed much of the die space) we initially expected to be allocated to a 6-core GPU.

Overall, the choice of GX6450 seems to be a case of picking the GPU best suited for a phone, a role G6430 already proved effective in with A7. As a step below Imagination’s 6-core PowerVR designs, GX6450 delivers a better balance between performance and power than a larger GPU would, which in turn is clearly a benefit to Apple. On the other hand, this means A8 is not going to have the GPU performance to compete with the fastest SoCs specifically designed for tablets, though what this could mean for the obligatory iPad update remains to be seen.
