The Mali-400

Now that we've settled the issue of what type of GPU it is, let's talk about the physical makeup of the Mali-400. The Mali-400 isn't a unified shader architecture; it has discrete execution hardware for vertex and fragment (pixel) processing. ARM calls the Mali-400 a multicore GPU, with configurations available with one to four cores. When ARM refers to a core, however, it's talking about a fragment (pixel shader) processor, not an entire GPU core. This is somewhat similar to NVIDIA's approach with Tegra 2, although NVIDIA counts each vertex and fragment processor as an individual core.

In its simplest configuration the Mali-400 features a single combined geometry front end and vertex processor alongside a single fragment processor. The 400 is also available in two- and four-core versions, both of which still have only a single vertex processor; the two-core version has two fragment processors and the four-core version has four. Note that ARM decided to scale fragment shading performance with core count while keeping vertex performance static. This is likely the best decision given current workloads, but a risky one. NVIDIA, on the other hand, standardized on a 1:1 ratio of fragment to vertex processors, compared to ARM's 4:1 on a four-core Mali-400. The four-core Mali-400 MP4 is what Samsung uses in the Exynos 4210.
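The scaling described above can be summarized in a quick sketch (the dictionary structure and names here are purely illustrative, not from ARM documentation):

```python
# Mali-400 configurations: the vertex processor count stays fixed at one
# while fragment processors scale with the advertised "core" count.
configs = {
    "Mali-400 MP1": {"vertex": 1, "fragment": 1},
    "Mali-400 MP2": {"vertex": 1, "fragment": 2},
    "Mali-400 MP4": {"vertex": 1, "fragment": 4},  # used in Samsung's Exynos 4210
}

for name, c in configs.items():
    print(f'{name}: {c["fragment"]}:{c["vertex"]} fragment-to-vertex ratio')
```

Note how the fragment-to-vertex ratio grows from 1:1 to 4:1 as cores are added, which is exactly the bet ARM is making on fragment-heavy workloads.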

ARM, like Qualcomm, isn't particularly interested in having the details of its GPUs available publicly. Unfortunately this means that we know very little about the makeup of each of these vertex and fragment processors. I suspect that both companies will eventually learn to share (just as AMD and NVIDIA did) but as this industry is still in its infancy, it will take some time.

Earlier documentation on Mali revealed that the GPU is a VLIW architecture, meaning each processor is actually a collection of multiple parallel execution units capable of working on vector data. Unfortunately, there's no public documentation indicating how wide each processor is, but we can make some educated guesses.

We know from history that AMD felt a 5-wide VLIW architecture made sense for DX9-class games, later moving down to a 4-wide architecture for DX11 games. AMD didn't have the die constraints that ARM and other SoC GPU suppliers do, so a 5-wide unit is likely out of the question, especially considering that Imagination settled on a VLIW4 architecture. Furthermore, pixels have four color elements (RGBA), making VLIW4 an ideal choice.

Based on this, as well as some internal information, we can assume that a single Mali fragment shader is a 4-wide VLIW processor. The vertex shader is a big unknown as well, but knowing that vertex processing happens on two coordinate elements (U & V), Mali's vertex shader is likely a 2-wide unit.
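To picture what a 4-wide VLIW unit does, here's a minimal sketch assuming one MAD per lane per clock on an RGBA vector (the function name and framing are our own illustration, not ARM's actual instruction set):

```python
def vliw4_mad(a, b, c):
    """Issue one MAD (a*b + c) on each of four lanes in a single 'clock'.

    Models a hypothetical 4-wide VLIW unit operating on RGBA vectors;
    an illustrative sketch, not ARM's real ISA.
    """
    assert len(a) == len(b) == len(c) == 4
    return tuple(ai * bi + ci for ai, bi, ci in zip(a, b, c))

# A typical use: scale an RGBA pixel by per-channel weights and add a bias,
# all four channels completing in one issue slot.
pixel = vliw4_mad((0.5, 0.25, 0.0, 1.0), (0.5, 0.5, 0.5, 1.0), (0.1, 0.1, 0.1, 0.0))
print(pixel)
```

The point of the VLIW arrangement is that all four channel operations issue together, so a full RGBA MAD costs one slot rather than four.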

Thus far every architecture we've looked at has been able to process one FP16 MAD (multiply+add) per execution unit per clock. If we make another assumption about the Mali-400 and say it can do the same, we get the following table:

Mobile SoC GPU Comparison

                  PowerVR   PowerVR   PowerVR   PowerVR      Mali-400   GeForce   Kal-El
                  SGX 535   SGX 540   SGX 543   SGX 543MP2   MP4        ULP       GeForce
SIMD Name         USSE      USSE      USSE2     USSE2        Core       Core      Core
# of SIMDs        2         4         4         8            4 + 1      8         12
MADs per SIMD     2         2         4         4            4 / 2      1         ?
Total MADs        4         8         16        32           18         8         ?
GFLOPS @ 200MHz   1.6       3.2       6.4       12.8         7.2        3.2       ?
GFLOPS @ 300MHz   2.4       4.8       9.6       19.2         10.8       4.8       ?
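The arithmetic behind the table's estimates is simple: each MAD counts as two floating-point operations (a multiply and an add) per clock, so peak GFLOPS is just total MADs × 2 × clock. The Mali-400 MP4 figures below follow this article's assumptions (4-wide fragment units, a 2-wide vertex unit), not ARM-published numbers:

```python
def peak_gflops(total_mads, clock_mhz):
    """Peak throughput assuming one MAD (2 FLOPs) per execution unit per clock."""
    return total_mads * 2 * clock_mhz / 1000.0

# Mali-400 MP4: four 4-wide fragment processors + one 2-wide vertex processor
mali_mp4_mads = 4 * 4 + 1 * 2  # = 18 MADs/clock
print(peak_gflops(mali_mp4_mads, 200))  # 7.2, matching the table
print(peak_gflops(mali_mp4_mads, 275))  # 9.9 at the targeted 275MHz clock
```

Running the same formula against the SGX 543MP2's 32 MADs reproduces its 12.8 GFLOPS table entry at 200MHz.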

Based on this estimated data alone, it would appear that a four-core Mali-400 has the shader compute power of a PowerVR SGX 543. In other words, it has half the compute horsepower of the iPad 2's GPU, or over twice the compute of any smartphone GPU today. The Mali-400 is targeted at 275MHz operation, so its peak figures at shipping clocks are likely even higher relative to the competition. Although MADs are quite common in shader execution, they aren't the end-all, be-all; we need to look at application performance to really see how it stacks up.

Comments

  • dagamer34 - Sunday, September 11, 2011 - link

    On pg 15, Galaxy S uses a SGX 540 GPU, not 530. Other than that, great review!
  • Synaesthesia - Sunday, September 11, 2011 - link

    Staggering review, you really are the most comprehensive and scientific reviewer around, bravo!

    Samsung have really impressed with this phone, in terms of how much effort they have invested in the hardware and software. One thing still stands out for me: the battery life. While good, it still doesn't hold a candle to the iPhone 4's, as shown on the charts.
  • LostViking - Saturday, September 17, 2011 - link

    What do you mean?
    It's about 30% worse when web browsing (mostly because of the much larger screen, I reckon), but better in the other tests.

    If you are one of those old timers who actually use the phone for talking, the SGSII is about 30% better ;)
    When I am low on battery and don't have access to a charger, that's usually what I would prioritize.
  • xdrol - Sunday, September 11, 2011 - link

    For me the mentioned Cat 5 limit looks reasonable - you don't get user-level 2.0 Mbps because of the overhead of the PDCP/RLC/MACd protocols (about 15% -> 2.0 Mbps is 1.7 Mbps for IP).
  • wilky76 - Sunday, September 11, 2011 - link

    A lot of people who have the Samsung Galaxy S2 are suffering from framerate problems when recording either 720p or 1080p video in low light, myself included.

    What basically happens is that when the camera tries to focus in low light, the framerate drops to around 13fps, then jumps back up to around 30fps, basically making any HD video recording useless in low light because of the stuttering. The only known fix is to drop the exposure to -2, as this stops the stuttering, or to use 480p when indoors or in poor light.

    Some folks have returned their SGS2 because of this problem, only to receive another with the same problem.

    There have been a couple of camera firmware updates on Samsung's own app site, which to this date still haven't sorted the problem out, and in some cases people who weren't suffering from this problem now have it after updating the camera firmware.

    Can any of you guys at AnandTech test your SGS2 in low light with either 720p or 1080p to see if the unit you received for review also suffers from this problem?

    What is strange is that not everybody has the framerate problem, so it could be down to which sensor you get with your SGS2, and could probably be sorted with a firmware update eventually.

    Anyway, people with this problem (and there are a few) can be found in this post over at XDA:

    http://forum.xda-developers.com/showthread.php?t=1...
  • DrSlump - Monday, September 12, 2011 - link

    Hi, I have exactly the same problem with my Samsung Galaxy S2.
    I get occasional stuttering (a frame loss) in normal light conditions and severe stuttering in low light conditions.
    As soon as the firmware raises the sensor gain to match the detected light, the framerate drops to 25fps, and when autofocus occurs the framerate drops to 13fps, then returns to 25fps once the autofocus is finished.
    I also noticed that when I try to frame a TV or a monitor, severe banding occurs, and the same happens when taking a video where the light source is a TV or a monitor. It seems like the ISP isn't able to compensate for the frequency of the light source.
    In a lot of situations it's impossible to take a video due to the severe stuttering :(
    Does anyone else have these problems? How can they be solved?

    I would like to ask the author:
    did you notice any problems with the display? There is a thread on the xda-developers forum about yellow tinting or a faded-out left side of the screen. Please can you report on this problem?
  • B3an - Sunday, September 11, 2011 - link

    Why don't you just admit it's the best phone around, hands down? :) Not just the best Android phone. It's clearly miles superior to the outdated iPhone 4.

    Shame you yanks have had to wait forever to get it, only to get 3 different versions that don't even look as good and have ridiculous names. I've been using a GSII since April and it's just unmatched.
  • ph00ny - Sunday, September 11, 2011 - link

    This yank got it on the UK launch day and I've been enjoying it since.
  • steven75 - Sunday, September 11, 2011 - link

    It's not better in battery life, audio quality, display resolution and sharpness, or the many ways that iOS is better (AirPlay, app selection AND quality, immediate OS updates, etc.).

    Great Android phone, though, for those interested in 4.3" displays, which definitely isn't everyone. Personally, I'd wait for the Prime.
  • steven75 - Sunday, September 11, 2011 - link

    Oh, and outdoor display brightness, which even at 100% isn't a match for the iPhone, but then it's even capped at 75% for temperature reasons.
