The Mali-400

Now that we've settled the issue of what type of GPU it is, let's talk about the physical makeup of the Mali-400. The Mali-400 isn't a unified shader architecture; instead, it has discrete execution hardware for vertex and fragment (pixel) processing. ARM calls the Mali-400 a multicore GPU, with configurations available with one to four cores. When ARM refers to a core, however, it's talking about a fragment (pixel shader) processor, not an entire GPU core. This is somewhat similar to NVIDIA's approach with Tegra 2, although NVIDIA counts each vertex and fragment processor as an individual core.

In its simplest configuration the Mali-400 features a single combined geometry front end and vertex processor, plus a single fragment processor. The Mali-400 is also available in two- and four-core versions, both of which still have only a single vertex processor: the two-core version has two fragment processors and the four-core version has four. Note that ARM decided to scale fragment shading performance with core count while keeping vertex performance static. This is likely the right decision given current workloads, but a risky one. NVIDIA, on the other hand, standardized on a 1:1 ratio of fragment to vertex processors, compared to ARM's 4:1 on a four-core Mali-400. The four-core Mali-400 MP4 is what Samsung uses in the Exynos 4210.
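
To make that scaling decision concrete, here's a minimal sketch (the dictionary layout is mine, not ARM's) of how the processor mix changes across the three configurations:

```python
# Mali-400 configurations per ARM's public materials: the "MPn" core
# count refers to fragment processors only; every configuration keeps
# a single combined geometry front end / vertex processor.
MALI_400_CONFIGS = {
    "MP1": {"vertex": 1, "fragment": 1},
    "MP2": {"vertex": 1, "fragment": 2},
    "MP4": {"vertex": 1, "fragment": 4},  # used in Samsung's Exynos 4210
}

for name, cfg in MALI_400_CONFIGS.items():
    # Fragment throughput scales with core count; vertex throughput doesn't.
    print(f"Mali-400 {name}: {cfg['fragment']}:{cfg['vertex']} fragment-to-vertex ratio")
```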

ARM, like Qualcomm, isn't particularly interested in having the details of its GPUs available publicly. Unfortunately this means that we know very little about the makeup of each of these vertex and fragment processors. I suspect that both companies will eventually learn to share (just as AMD and NVIDIA did) but as this industry is still in its infancy, it will take some time.

Earlier documentation on Mali revealed that the GPU is a VLIW architecture, meaning each processor is actually a collection of multiple parallel execution units capable of working on vector data. Unfortunately there's no public documentation indicating how wide each processor is, but we can make some educated guesses.

We know from history that AMD felt a 5-wide VLIW architecture made sense for DX9-class games, later moving down to a 4-wide architecture for DX11 games. AMD didn't face the die constraints that ARM and other SoC GPU suppliers do, so a 5-wide unit is likely out of the question, especially considering that Imagination settled on a VLIW4 architecture. Furthermore, pixels have four color components (RGBA), making VLIW4 an ideal choice.

Based on this, as well as some internal information, we can assume that a single Mali fragment shader is a 4-wide VLIW processor. The vertex shader is a big unknown as well, but knowing that vertex processing happens on two coordinate elements (U & V), Mali's vertex shader is likely a 2-wide unit.
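
To illustrate what that means in practice, here's a minimal Python sketch of a 4-wide VLIW MAD (purely illustrative; the operation and lane layout are assumptions, not ARM's actual ISA). All four RGBA channels of a pixel are multiplied and accumulated in a single issue slot:

```python
Pixel = tuple  # (R, G, B, A): one value per VLIW lane

def vliw4_mad(a: Pixel, b: Pixel, c: Pixel) -> Pixel:
    """One VLIW4 issue slot: a * b + c across all four channels.
    Real hardware runs the four lanes in parallel in one clock;
    the comprehension here just models the lanes."""
    return tuple(a[i] * b[i] + c[i] for i in range(4))

# Example: modulate a texel by a tint color, then add an ambient term.
texel   = (0.5, 0.25, 1.0, 1.0)
tint    = (1.0, 0.5,  0.5, 1.0)
ambient = (0.1, 0.1,  0.1, 0.0)
print(vliw4_mad(texel, tint, ambient))  # (0.6, 0.225, 0.6, 1.0)
```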

Thus far, every architecture we've looked at has been able to process one FP16 MAD (multiply-add) per execution unit per clock. If we assume the Mali-400 can do the same, we get the following table:

Mobile SoC GPU Comparison

|                 | PowerVR SGX 535 | PowerVR SGX 540 | PowerVR SGX 543 | PowerVR SGX 543MP2 | Mali-400 MP4 | GeForce ULP | Kal-El GeForce |
|-----------------|-----------------|-----------------|-----------------|--------------------|--------------|-------------|----------------|
| SIMD Name       | USSE            | USSE            | USSE2           | USSE2              | Core         | Core        | Core           |
| # of SIMDs      | 2               | 4               | 4               | 8                  | 4 + 1        | 8           | 12             |
| MADs per SIMD   | 2               | 2               | 4               | 4                  | 4 / 2        | 1           | ?              |
| Total MADs      | 4               | 8               | 16              | 32                 | 18           | 8           | ?              |
| GFLOPS @ 200MHz | 1.6             | 3.2             | 6.4             | 12.8               | 7.2          | 3.2         | ?              |
| GFLOPS @ 300MHz | 2.4             | 4.8             | 9.6             | 19.2               | 10.8         | 4.8         | ?              |

Based on this estimated data alone, it would appear that a four-core Mali-400 has the shader compute power of a PowerVR SGX 543: in other words, roughly half the compute horsepower of the iPad 2's GPU, or over twice the compute of any smartphone GPU today. The Mali-400 is also targeted at 275MHz operation, so its real-world figures are likely even higher than these equal-clock comparisons suggest. Although MADs are quite common in shader execution, they aren't the be-all and end-all; we need to look at application performance to really see how it stacks up.
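
As a sanity check on the table, here's a minimal sketch of the arithmetic behind it (assuming, as above, one FP16 MAD per execution unit per clock, with each MAD counting as two FLOPs):

```python
def gflops(total_mads: int, clock_ghz: float) -> float:
    """Peak shader throughput: each MAD is 2 FLOPs (a multiply and an add)."""
    return total_mads * 2 * clock_ghz

# Mali-400 MP4: 4 fragment cores x 4 lanes + 1 vertex core x 2 lanes = 18 MADs/clock
mali_mp4 = 4 * 4 + 1 * 2

print(gflops(mali_mp4, 0.200))  # 7.2  GFLOPS, matching the table
print(gflops(mali_mp4, 0.275))  # 9.9  GFLOPS at the 275MHz design target
print(gflops(32, 0.200))        # 12.8 GFLOPS for the SGX 543MP2 (iPad 2)
```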

Comments

  • kreacher - Monday, September 12, 2011 - link

    I was disappointed to see that there is no mention of the screen's inability to display 24-bit gradients, while Samsung claims it's a screen capable of displaying 16M colors.
  • supercurio - Monday, September 12, 2011 - link

    See my answer for Astri.

    Maybe you only checked a gradient in the Web Browser or in a specific app that forces 16-bit surfaces.
    Each app has the ability to choose how its rendering is done in this regard.
    Internally, the Super AMOLED controller works with much more than 24-bit precision in order to perform complex color-space conversions between the digital frame buffer and the analog, very-wide-dynamic-range OLEDs.
  • lemmo - Monday, September 12, 2011 - link

    Great review, and incredible detail on the audio quality. Shame the SGS2 has taken a step backwards on audio. Any info on the likely spec for the audio on the Samsung Nexus Prime?

    Also, any recommendations on reviews of best smartphones for audio quality? cheers :)
  • supercurio - Monday, September 12, 2011 - link

    Thanks for the feedback!
    I really need it in order to improve the next review.

    Organize it differently for better readability, and maybe evaluate other aspects as well (like recording).

    I have no info about what's in the Nexus Prime but I'd like to :P
    If somebody can send me a report: https://market.android.com/details?id=org.projectv... I'll study it.

    Best devices I know in terms of audio quality:
    - with Voodoo Sound: Nexus S, Galaxy S family, Galaxy Tab 7".
    - with or without Voodoo sound: Asus Transformer
    - soon with Voodoo Sound: Galaxy Tab 10.1 (incredible power stage for the headphone amp)
    - iPhones/iPad: clean DAC but boring headphone amp (unable to drive many cans to adequate levels)

    And.. many I don't know! (yet?)
  • lemmo - Wednesday, September 14, 2011 - link

    Thanks supercurio, very helpful. I'll keep looking for more info on the Nexus Prime.

    Shame none of the other phones/tablets you mention have got the right spec for me.

    In practice, do you think the 'average user' will notice the poorer audio quality on the SGS2?
  • yellowchilli - Monday, September 12, 2011 - link

    A very, very good read, thank you.
    I've owned the SGS2 since its EU launch. It's interesting to see the slight differences/improvements Samsung has put into the US release (e.g. the power button, camera UI).
  • mcquade181 - Monday, September 12, 2011 - link

    I've had my SGS2 here in Australia for two months now, and on a recent snow trip noticed some deficiencies compared to my friend's Nokia N8. We both use the same provider (Telstra 3G on 850MHz):
    1. Whilst travelling there were periods where I completely lost reception whereas the N8 still had a signal and was able to make calls. This suggests that the SGS2 is a bit lacking in cellular sensitivity (and note that the N8 is not all that flash either when compared to the old Nokia N95).
    2. In our snow accommodation I could not get a reliable WiFi signal from the local hotspot, whereas the N8 could (it was marginal, but it did work).
    3. Bluetooth on the SGS2 is unreliable with some devices. It keeps disconnecting after a few minutes.

    That said, I do like my SGS2, and it's better in many other ways than the Nokia N8 - in particular earphone volume and call clarity, where the N8 is deficient. Of course Android has a much wider selection of available apps than the Nokia does, although surprisingly ALL my favourite apps are also available for the Nokia.

    Regards from down under, Graham Rawolle.
  • willstay - Monday, September 12, 2011 - link

    What a coincidence. After two Androids, I actually bought an N8 and later sold it to get the SGS2. Before Belle, Swype was only available in landscape, and my must-have apps aren't there for Symbian.
  • jcompagner - Tuesday, September 13, 2011 - link

    Yes, that's the only drawback I can find with the SGS2 as well.
    WiFi reception is really not up to standard.
  • kmmatney - Monday, September 12, 2011 - link

    I'm not happy with the battery tests - they don't show real-life usage. I'd still like to know what happens with the battery if you just leave the phone in your pocket for most of the day, or what happens if you leave it in standby overnight. All of my co-workers complain about battery life with their Android phones, and all want to get iPhones the next time around. The batteries seem to drain excessively with the phones doing nothing, and they are often dead when they go to use them. Who cares if you can browse the web for 7 hours or whatever...I just want the phone to be ready to use if it's been sitting on my desk for half the day, or if I forget to charge it overnight. This is way more important - at least for someone like me who travels. (Actually, I work a lot in wafer fabs around the world, and crappy reception in the fabs often drains the battery quickly.)

    I guess it will depend on what apps are installed and whether you use push notifications, but it would be useful to have a test where you charge the phone, let it sit for 8 hours doing nothing, and then report the remaining battery life. The older Android phones seemed terrible at this, while my iPhone 3GS is great.

    This phone looks awesome, but I would need this information before I would consider buying it.
