GPU Performance Using Unreal Engine 3

In our iPad 2 review I called the PowerVR SGX 543MP2 Apple's gift to game developers. Apple boasted a roughly 9x improvement in raw GPU performance going from the A4 to the A5, an increase that came from both more execution resources and a higher GPU clock. The A5 in the iPhone 4S gets the same GPU, simply clocked lower than in the iPad 2. Apple claims the iPhone 4S can deliver up to 7x the GPU performance of the iPhone 4, down from the 9x quoted for the iPad 2 vs. iPad 1. Why the delta?

The iPad 2 has both a larger battery and a higher resolution display. There are 28% more pixels to push on the iPad 2 than on the iPhone 4S, and 9x vs. 7x also works out to roughly a 28% difference. The lower clocked GPU goes along with the lower clocked CPU in the 4S' version of the A5: it keeps power consumption in check, and the platform simply doesn't need as much GPU performance as the iPad 2 with its higher resolution display.
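For reference, the math behind those two 28% figures works out as follows (using the two displays' native resolutions):

    iPad 2:    1024 x 768 = 786,432 pixels
    iPhone 4S:  960 x 640 = 614,400 pixels

    786,432 / 614,400 = 1.28  (28% more pixels on the iPad 2)
    9x / 7x           = 1.29  (~28% higher claimed GPU scaling)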

Mobile SoC GPU Comparison
                    | Adreno 225 | PowerVR SGX 540 | PowerVR SGX 543 | PowerVR SGX 543MP2 | Mali-400 MP4 | GeForce ULP | Kal-El GeForce
SIMD Name           | -          | USSE            | USSE2           | USSE2              | Core         | Core        | Core
# of SIMDs          | 8          | 4               | 4               | 8                  | 4 + 1        | 8           | 12
MADs per SIMD       | 4          | 2               | 4               | 4                  | 4 / 2        | 1           | ?
Total MADs          | 32         | 8               | 16              | 32                 | 18           | 8           | ?
GFLOPS @ 200MHz     | 12.8       | 3.2             | 6.4             | 12.8               | 7.2          | 3.2         | ?
GFLOPS @ 300MHz     | 19.2       | 4.8             | 9.6             | 19.2               | 10.8         | 4.8         | ?
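The GFLOPS rows fall straight out of the MAD counts: a multiply-add counts as two floating point operations per clock, so peak throughput is simply total MADs x 2 FLOPS x clock speed. Two examples from the table:

    PowerVR SGX 543MP2: 32 MADs x 2 x 200MHz = 12.8 GFLOPS
    Mali-400 MP4:       18 MADs x 2 x 200MHz =  7.2 GFLOPS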

GLBenchmark continues to be our go-to guy for GPU performance under iOS. While there are other reputable 3D benchmarks, GLBench remains the only good cross-platform (iOS and Android) solution we have today.

The performance gains live up to Apple's expectations (Update: our original 4S results for Egypt/Pro were incorrect. We had two sets of graphs, one internal and one external; the latter had incorrect data. We have since updated the charts to reflect the 4S' actual performance. Sorry for the mixup!):

GLBenchmark 2.1 - Egypt - Offscreen

GLBenchmark 2.1 - Pro - Offscreen

GLBenchmark gets around vsync by rendering offscreen, so the 4S is allowed to run as fast as it can. Here we see a 6.46x higher frame rate compared to the iPhone 4.

It's obvious that GLBenchmark is designed first and foremost to be bound by shader performance rather than memory bandwidth; otherwise these gains would be capped at roughly 2x, since that's the improvement in memory bandwidth from the 4 to the 4S. Note that the offscreen tests render 50% more pixels than the 4S' display (1280 x 720 vs. 960 x 640), which is hardly a realistic workload for a shipping game, and even then we're clearly not memory bandwidth bound. Most games won't be shader bound; instead they should be more limited by memory bandwidth.
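To put the bandwidth argument in rough numbers, here's a back-of-the-envelope estimate of color buffer traffic alone, assuming a 32-bit framebuffer at the 4S' native resolution and ignoring depth, overdraw and texture fetches:

    960 x 640 pixels x 4 bytes/pixel = ~2.3 MB per full-screen pass
    ~2.3 MB x 60 fps                 = ~140 MB/s per pass

A real game adds a depth buffer, several layers of overdraw, blending and plenty of texture reads on top of that, which is how memory bandwidth can quickly become the limiting factor even when shader throughput is plentiful.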

At the iPhone 4S introduction Epic was on stage showing off Infinity Blade 2, which will include visual enhancements only present on the 4S thanks to its faster GPU. Thus far Epic has used GPU performance improvements to make its games look better rather than simply run faster (although they do that too), since the target is playability on all supported devices. What I wanted, however, was a true apples-to-apples comparison using Epic's engine, as it is arguably the best looking engine available for iOS game development today.

Epic offers Unreal Engine 3 free of charge for non-commercial use. If you want to sell your UE3 based iOS game, you don't have to pay a large sum up front to license Epic's engine. Instead you toss Epic $99 and pay a 25% royalty on any revenue beyond the first $50K. It's a great deal for aspiring game developers: you get access to one of the best 3D engines around without needing any additional startup capital. If your game is a hit Epic gets a cut, but you're still making money so all is good in the world.
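To make the royalty terms concrete with an entirely hypothetical example: a UE3 based game that brings in $250,000 in revenue would owe Epic the $99 license fee up front plus 25% of the $200,000 earned beyond the first $50K:

    25% x ($250,000 - $50,000) = $50,000 in royalties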

The process starts with UDK, the Unreal Development Kit. Epic offers a great deal of documentation on developing with UDK, making the whole process extremely easy. The freely available UDK can target Windows, Mac OS X and iOS. If you want Android support you'll unfortunately have to pay to license the dev kit. Given how successful Infinity Blade has been on iOS, I suspect this is partially a move designed to keep Apple happy. It's also possible the Android UE3 dev kit simply isn't as far along as the iOS version.

Along with every UDK download, Epic now provides the full source code to its well known iOS Citadel demo. With access to Citadel's source code and Epic's excellent (and freely available) development tools I put together a real-world GPU test for iOS.


What's that? A frame counter in iOS? Huzzah!

The test shows us frame rate over the course of a flythrough of Epic's Citadel demo. This is simply the standard Citadel guided tour, but with UE3's frame recording statistics enabled. Once again, UDK gave me the tools needed to accurately profile what was going on. For developers this is helpful in tuning the performance of an app, but for me it provided the one thing I've been hoping for: average frame rate in a UE3 game on iOS.
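For the curious, here's a minimal sketch (not Epic's code, and with made-up frame times) of how a trace of per-frame render times reduces to the average frame rate reported below:

```cpp
#include <cstdio>
#include <vector>

int main() {
    // Hypothetical per-frame render times in milliseconds, stand-ins for the
    // values UE3's frame recording statistics log during the flythrough.
    std::vector<double> frameTimesMs = {16.7, 16.7, 18.2, 16.7, 24.5, 16.7};

    double totalMs = 0.0;
    for (double t : frameTimesMs) totalMs += t;

    // Average fps = frames rendered / total elapsed time. Note this is not the
    // mean of the per-frame instantaneous fps values, which would overweight
    // the fast frames.
    double avgFps = frameTimesMs.size() / (totalMs / 1000.0);
    std::printf("Average frame rate: %.1f fps\n", avgFps);
    return 0;
}
```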

The raw data looks like this, a graph of frame render times:


iPhone 4S frame time

You're looking at frame render time in ms, so lower numbers mean better performance. Notice how the iPhone 4S graph remains mostly flat for the majority of the benchmark run? That's because it's limited by vsync. At 60Hz the frame render time is capped at 16.7ms, which is approximately where the 4S' curve flattens out. The 4S could likely run through this demo even quicker (or maintain the same speed with a heavier graphical workload) if we had a way to disable vsync in iOS.


iPhone 4 frame time

On the iPhone 4, however, frame times are significantly higher - more than 2x on average. You also see significant spikes in frame time, indicating periods where the frame rate drops considerably. Not only does the 4S offer better average performance here, but its performance is far more consistent, hugging vsync rather than wildly bouncing around.

The chart below summarizes the two graphs above by looking at the average frames rendered per second throughout the benchmark:

AnandTech UE3 Performance Test

The iPhone 4S averages 2.3x the frame rate of the iPhone 4 throughout our test. I believe this is a more realistic figure than the 6x we saw in GLBenchmark. A major cause of the difference is the vsync limit imposed on all iOS apps that render to the screen. On top of that, while we're obviously not completely limited by memory bandwidth, it's clear that memory bandwidth plays a larger role here than it does in GLBenchmark.

The Citadel demo by default increases rendering quality on the iPhone 4, but a quick look at the game's configuration files didn't show any new features enabled for the 4S. Chances are the version of Citadel included with the UDK was built prior to the 4S being available. In other words, the 4 and 4S should be rendering the same workload in our benchmark. To confirm I also grabbed a couple of screenshots to ensure the two devices were running at the same settings:


iPhone 4


iPhone 4S

This is actually the most stressful scene in the level; it causes even the 4S to drop below 30 fps. With the camera stationary in roughly the same position, I saw a 74% increase in performance on the 4S vs. the iPhone 4.

Most game developers still target the iPhone 3GS, but the 4S allows them to significantly ramp up image quality without any performance penalty. Because of the lower hardware target for most iOS games and the forced vsync, I wouldn't expect to see 2x increases in frame rate for the 4S over the 4 in most games out today or in the near future. You can expect a smoother frame rate and better looking games if developers follow Epic's lead and simply enable more eye candy on the 4S.

Comments

  • metafor - Tuesday, November 1, 2011 - link

    Fair enough. But that really doesn't take away from the fact that the A5 is a relatively large chip and from the UV-scans of it, looks to use quite a bit of that die area for the GPU.

    I don't know if a similar scan has been done of Exynos, but one can safely say both chips are far bigger than SoC's traditionally used in this space.

    Though that trend appears to be moving forward with MSM8960 and Tegra 3.....
  • PeteH - Tuesday, November 1, 2011 - link

    That leads to an interesting question: will Apple always have the largest SoCs, and thus (most likely) the highest performance in the mobile space?

    The reason I could see this happening is that Apple doesn't have to sell their SoC's at a profit, so they're paying closer to cost for the chips (excluding the fab mark up). Other manufacturers (like NVIDIA) need to make a profit on their chips.
  • name99 - Thursday, November 3, 2011 - link

    "I'm not entirely sure why they had to use such a powerful GPU, though. "

    And you know EXACTLY how Apple use the GPU do you?
    Does Siri run some of its workload on the GPU? Does the faster camera stuff (eg fast HDR) run on the GPU? Does Apple already have OpenCL running (for internal use) on iOS?
  • doobydoo - Friday, December 2, 2011 - link

    He must be an Android fan.

    Android's new marketing campaign will offer a revolutionary 'new' feature - the ability to have a slower GPU than other phones!!!

    Magical.
  • InternetGeek - Monday, October 31, 2011 - link

    They might give AMD and nVidia a run for their money if they ever tried creating desktop products...
  • sprockkets - Monday, October 31, 2011 - link

    Kyro 2 was a good chip, but they obviously stopped focusing on the desktop market.
  • tipoo - Tuesday, November 1, 2011 - link

    Maybe, but there's a reason such crossovers usually take so long. Look at Intel trying to get into this space, I don't doubt they will be good at it but it takes years of development. Imagination specializes in low power, it would take lots of development effort to get into the high power desktop game.
  • _tangent - Tuesday, November 1, 2011 - link

    I think this might be intentionally ironic given they got out of that game a long time ago :P

    On point though, anyone could give AMD and nVidia a run for their money with the right up-front cash and expertise. I imagine the barrier to entry into that market is truly colossal though. Point is, the SGX543 MP2 is no evidence one way or the other.
  • lurker22 - Monday, October 31, 2011 - link

    Before buying, many people who got a 4S on AT&T told me how much better it was than their prior AT&T iPhones.

    Anand, thanks for confirming and explaining the reasons.
  • LordSojar - Monday, October 31, 2011 - link

    Can't we have reviews as detailed as this for the really big name Android phones? They are always far less detailed and lack a lot of the testing put into this.... thing....

    Apple makes a few adjustments, tweaks a few things, adds in the same processor that's in the iPad 2, and we have a highly detailed, scientific review that covers every single aspect, even if said aspects are the same. Samsung releases a new phone that has overall better features, a faster CPU, faster NAND, a different and arguably better (or at least equal) screen, and mum's the word?

    The bias is getting a bit out of hand at this point... We get that you're big time Apple fans, but c'mon... At least do a major review of this caliber for the Droid RAZR and the Samsung Galaxy Nexus and the Galaxy S2 Skyrocket (LTE on AT&T!). Even if you combine them into one review, just make it THIS detailed for once instead of giving Apple the huge, super detailed ultra review!
