The Fastest Smartphone SoC Today: Samsung Exynos 4210

Samsung has been Apple's sole application processor supplier since the release of the original iPhone. It's unclear how much Samsung contributes to the design process, especially with later SoCs like the A4 and A5 carrying the Apple brand. It's possible that Samsung is now no more than a manufacturing house for Apple.

Needless to say, the past few years of supplying SoCs for the iPhone and iPad have given Samsung a good idea of what the market wants from an application processor. We first got the hint that Samsung knew what it was doing with its Hummingbird SoC, used in the Galaxy S line of smartphones.

Hummingbird featured a 1GHz ARM Cortex A8 core and an Imagination Technologies PowerVR SGX 540 GPU. Although those specs don't seem very impressive today, Hummingbird helped Samsung ship more Android smartphones than any of its competitors in 2010. At a high level, Hummingbird looked a lot like Apple's A4 used in the iPad and iPhone 4. Its predecessor looked a lot like Apple's 3rd generation SoC used in the iPhone 3GS.

Hummingbird's successor, however, is Samsung's first attempt at something different. This is the Exynos 4210 application processor:

We first met the Exynos back when it was called Orion at this year's Mobile World Congress. Architecturally, the Exynos 4210 isn't too far from Apple's A5, NVIDIA's Tegra 2 or TI's OMAP 4. It uses the same dual-core ARM Cortex A9 CPU configuration as all of the aforementioned SoCs, with a twist: while the A5, Tegra 2 and OMAP 4 all run their cores at 1GHz, Exynos pushes the default clock speed up to 1.2GHz. Samsung is able to hit higher clock speeds either through higher than normal voltages or as a result of its close foundry/design relationship.


Exynos 4210 with its PoP LPDDR2
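
That 1.2GHz ceiling is easy to sanity check in software through Linux's standard cpufreq interface, which Android kernels on these SoCs expose. The sketch below is a minimal illustration assuming a stock cpufreq-enabled kernel; the sysfs path is the standard one for core 0:

```c
/* Print the kernel-reported maximum frequency of CPU0.
 * Assumes a Linux/Android kernel exposing the standard cpufreq sysfs node. */
#include <stdio.h>

int main(void)
{
    FILE *f = fopen("/sys/devices/system/cpu/cpu0/cpufreq/cpuinfo_max_freq", "r");
    if (!f) {
        perror("cpufreq node not available");
        return 1;
    }

    unsigned long khz = 0;
    if (fscanf(f, "%lu", &khz) == 1)
        printf("Max CPU frequency: %lu MHz\n", khz / 1000); /* value is reported in kHz */
    fclose(f);
    return 0;
}
```

A stock Galaxy S 2 should report 1200000 (the value is in kHz), while the 1GHz Tegra 2 and OMAP 4430 phones report 1000000.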

ARM's Cortex A9 has configurable cache sizes. To date, all of the A9 implementations we've seen use 32KB L1 caches (a 32KB instruction cache + a 32KB data cache), and Samsung's Exynos is no exception. The L2 cache size is also configurable, however we haven't seen any variance there either: Apple, NVIDIA, Samsung and TI have all standardized on a full 1MB L2 cache shared between both cores. Only Qualcomm is left with a 512KB L2 cache, but that's for a non-A9 design.
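
Cache sizes like these can be observed indirectly: time chains of dependent loads over progressively larger working sets and the per-load latency steps up as you spill out of the 32KB L1 and again past the 1MB L2. The sketch below is a rough illustration of the technique rather than a calibrated tool; it ignores TLB effects and assumes a reasonably idle system:

```c
/* Rough cache latency probe: time chains of dependent loads over working sets
 * of increasing size. Per-load latency steps up once the set no longer fits in
 * the 32KB L1 data cache, and again once it exceeds the 1MB L2. */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

static double ns_per_load(const size_t *ring, long iters)
{
    struct timespec t0, t1;
    size_t idx = 0;
    clock_gettime(CLOCK_MONOTONIC, &t0);
    for (long i = 0; i < iters; i++)
        idx = ring[idx];                 /* each load depends on the previous one */
    clock_gettime(CLOCK_MONOTONIC, &t1);
    volatile size_t sink = idx; (void)sink;
    return ((t1.tv_sec - t0.tv_sec) * 1e9 + (t1.tv_nsec - t0.tv_nsec)) / iters;
}

int main(void)
{
    for (size_t kb = 4; kb <= 4096; kb *= 2) {
        size_t n = kb * 1024 / sizeof(size_t);
        size_t *ring = malloc(n * sizeof(size_t));
        if (!ring) return 1;
        for (size_t i = 0; i < n; i++) ring[i] = i;
        /* Sattolo's shuffle: one random cycle, so hardware prefetchers can't help. */
        for (size_t i = n - 1; i > 0; i--) {
            size_t j = rand() % i;
            size_t tmp = ring[i]; ring[i] = ring[j]; ring[j] = tmp;
        }
        printf("%5zu KB: %.1f ns per load\n", kb, ns_per_load(ring, 5 * 1000 * 1000L));
        free(ring);
    }
    return 0;
}
```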

Where we have seen differences in A9 based SoCs are in the presence of ARM's Media Processing Engine (NEON SIMD unit) and memory controller configuration. Apple, Samsung and TI all include an MPE unit in each A9 core. ARM doesn't make MPE a requirement for the A9 since it has a fully pipelined FPU, however it's a good idea to include one given most A8 designs featured a similar unit. Without MPE support you run the risk of delivering an A9 based SoC that occasionally has lower performance than an A8 w/ NEON solution. Given that Apple, Samsung and TI all had NEON enabled A8 SoCs in the market last year, it's no surprise that their current A9 designs include MPE units.

NVIDIA on the other hand didn't have an SoC based on ARM's Cortex A8. At the same time it needed to be aggressive on pricing to gain some traction in the market. As a result of keeping die size to a minimum, the Tegra 2 doesn't include MPE support. NEON code can't be executed on Tegra 2. With Tegra 3 (Kal-El), NVIDIA added in MPE support but that's a discussion we'll have in a couple of months.
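
In practice this means Android NDK code that wants to use NEON has to check for it at runtime and fall back to a scalar path on Tegra 2. A minimal sketch using the NDK's cpufeatures helper library follows; the filter_neon/filter_scalar functions are hypothetical stand-ins for whatever kernel you've actually written:

```c
/* Pick a NEON or scalar code path at runtime; Tegra 2 ends up on the scalar
 * path. Uses the Android NDK's cpufeatures helper library (cpu-features.h).
 * filter_neon()/filter_scalar() are hypothetical kernels, not real APIs. */
#include <cpu-features.h>

void filter_neon(float *dst, const float *src, int n);   /* NEON build of the kernel */
void filter_scalar(float *dst, const float *src, int n); /* plain C fallback */

void filter(float *dst, const float *src, int n)
{
    if (android_getCpuFamily() == ANDROID_CPU_FAMILY_ARM &&
        (android_getCpuFeatures() & ANDROID_CPU_ARM_FEATURE_NEON))
        filter_neon(dst, src, n);
    else
        filter_scalar(dst, src, n);   /* e.g. Tegra 2: Cortex A9 without MPE */
}
```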

Qualcomm's Snapdragon, although built around Qualcomm's own CPU design rather than the Cortex A9, includes NEON support as well. Qualcomm's NEON engine is 128 bits wide vs. 64 bits wide in ARM's standard implementation. Samsung lists the Exynos 4210 as supporting both 64-bit and 128-bit NEON, however given this is a seemingly standard A9 implementation I believe the MPE datapath is only 64 bits wide. In other words, 128-bit operations can be executed, just not at the same throughput as 64-bit operations.
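
To make the 64-bit vs. 128-bit distinction concrete: NEON exposes both double-word (64-bit) and quad-word (128-bit) register types, and both forms run on any NEON-capable core. The open question above is whether a quad-word operation moves through the datapath in one pass or is internally split into two 64-bit halves. A small illustration using standard NEON intrinsics is below (tail elements are ignored for brevity); if the MPE datapath really is 64 bits wide, the quad-word version shouldn't be meaningfully faster per element than the double-word one.

```c
/* The same vector add written against 128-bit (quad-word) and 64-bit
 * (double-word) NEON types. Both are legal on any NEON-capable core; only
 * throughput differs if the underlying datapath is 64 bits wide. */
#include <arm_neon.h>

void add_q(float *dst, const float *a, const float *b, int n)
{
    for (int i = 0; i + 4 <= n; i += 4) {
        float32x4_t va = vld1q_f32(a + i);     /* four floats per load */
        float32x4_t vb = vld1q_f32(b + i);
        vst1q_f32(dst + i, vaddq_f32(va, vb)); /* four adds per intrinsic */
    }
}

void add_d(float *dst, const float *a, const float *b, int n)
{
    for (int i = 0; i + 2 <= n; i += 2) {
        float32x2_t va = vld1_f32(a + i);      /* two floats per load */
        float32x2_t vb = vld1_f32(b + i);
        vst1_f32(dst + i, vadd_f32(va, vb));   /* two adds per intrinsic */
    }
}
```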

The same designs that implemented MPE also implemented a dual-channel memory controller. Samsung's Exynos features two 32-bit LPDDR2 memory channels, putting it on par with Apple's A5, Qualcomm's Snapdragon and TI's OMAP 4. Only NVIDIA's Tegra 2 features a single 32-bit LPDDR2 memory channel. 

ARM Cortex A9 Based SoC Comparison

                       | Apple A5            | Samsung Exynos 4210 | TI OMAP 4           | NVIDIA Tegra 2
Clock Speed            | Up to 1GHz          | Up to 1.2GHz        | Up to 1GHz          | Up to 1GHz
Core Count             | 2                   | 2                   | 2                   | 2
L1 Cache Size          | 32KB/32KB           | 32KB/32KB           | 32KB/32KB           | 32KB/32KB
L2 Cache Size          | 1MB                 | 1MB                 | 1MB                 | 1MB
Memory Interface       | Dual Channel LPDDR2 | Dual Channel LPDDR2 | Dual Channel LPDDR2 | Single Channel LPDDR2
NEON Support           | Yes                 | Yes                 | Yes                 | No
Manufacturing Process  | 45nm                | 45nm                | 45nm                | 40nm

Like most of its competitors' designs, Samsung's memory controller allows for some flexibility in memory type. In addition to LPDDR2, the Exynos 4210 supports standard DDR2 and DDR3. The maximum data rate is limited to 800MHz regardless of memory type.
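
Two 32-bit channels at an 800MHz data rate work out to a theoretical peak of roughly 6.4GB/s, twice what Tegra 2's single channel can manage at the same data rate; sustained numbers are of course far lower. A crude copy-bandwidth probe in the spirit of STREAM (a rough sketch, not a calibrated benchmark) looks something like this:

```c
/* Crude sustained memory bandwidth probe: time repeated large copies. The
 * buffers are far larger than the 1MB L2, so the traffic has to hit DRAM. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <time.h>

int main(void)
{
    const size_t bytes = 64 * 1024 * 1024;    /* 64MB source + 64MB destination */
    char *src = malloc(bytes), *dst = malloc(bytes);
    if (!src || !dst) return 1;
    memset(src, 1, bytes);                    /* touch every page up front */
    memset(dst, 0, bytes);

    const int reps = 10;
    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);
    for (int r = 0; r < reps; r++)
        memcpy(dst, src, bytes);
    clock_gettime(CLOCK_MONOTONIC, &t1);

    double secs = (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) / 1e9;
    /* Each copy reads and writes the full buffer, so count 2x bytes per rep. */
    printf("~%.2f GB/s copy bandwidth\n", 2.0 * bytes * reps / secs / 1e9);
    free(src); free(dst);
    return 0;
}
```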

Based on everything I've said thus far, the Exynos 4210 should be among the highest performing SoCs on the market today. It offers the same clock-for-clock performance as Apple's A5, NVIDIA's Tegra 2 and TI's OMAP 4430, but Samsung surpasses those designs by delivering a 20% higher operating frequency, which should be tangible in typical use.

To find out, let's turn to our CPU performance suite. We'll start with our browser benchmarks, SunSpider and BrowserMark:

SunSpider Javascript Benchmark 0.9

Rightware BrowserMark

Despite the 20% clock speed advantage, the Galaxy S 2 isn't any faster than Motorola's Droid 3, based on a 1GHz TI OMAP 4430. Unfortunately this doesn't tell us too much since both benchmarks take into account browser efficiency as well as total platform performance. While the Galaxy S 2 is clearly among the fastest smartphones we've ever reviewed, it looks like Motorola's browser may actually be a bit more efficient at JavaScript execution.

Where we do see big gains from the Exynos' higher clock speed is in our Linpack tests. The single-threaded benchmark actually shows more scaling than clock speed alone would account for, indicating that there are other (possibly software) factors at play. Either way, it's clear that the 20% increase in clock speed can surface as a tangible gain if the conditions are right (a sketch of the sort of floating point kernel Linpack times follows the charts):

Linpack - Single-threaded

Linpack - Multi-threaded
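
Linpack ultimately measures dense double-precision floating point throughput (an LU factorization), so the results depend on the FPU, the code the VM or compiler generates, and how well the second core is used, not just on clock speed. As a rough stand-in for the kind of kernel being timed, here is a naive single-threaded matrix multiply with MFLOPS derived from the 2·N³ operation count; this is only an illustrative sketch, not the benchmark itself:

```c
/* Naive double-precision matrix multiply timed and reported in MFLOPS; a
 * stand-in for the sort of FP work Linpack does, not the benchmark itself. */
#include <stdio.h>
#include <time.h>

#define N 400

static double a[N][N], b[N][N], c[N][N];   /* c starts zeroed (static storage) */

int main(void)
{
    for (int i = 0; i < N; i++)
        for (int j = 0; j < N; j++) {
            a[i][j] = (double)(i + j);
            b[i][j] = (double)(i - j);
        }

    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);
    for (int i = 0; i < N; i++)
        for (int k = 0; k < N; k++) {
            double aik = a[i][k];
            for (int j = 0; j < N; j++)
                c[i][j] += aik * b[k][j];
        }
    clock_gettime(CLOCK_MONOTONIC, &t1);

    double secs = (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) / 1e9;
    double flops = 2.0 * N * N * N;          /* one multiply + one add per step */
    printf("%.1f MFLOPS on a single thread\n", flops / secs / 1e6);
    return 0;
}
```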

A clock speed advantage today is nice but it's something that Samsung's competitors will be able to deliver in the not too distant future. Where Samsung chose to really differentiate itself was in the graphics department. The Exynos 4210 uses ARM's Mali-400 MP4 GPU.

Shipping in smartphones today we have GPUs from three vendors: Qualcomm (Adreno), Imagination Technologies (PowerVR SGX) and NVIDIA (GeForce). Of those vendors, only Qualcomm and NVIDIA produce SoCs - Imagination simply licenses its technology to SoC vendors.

Both Apple and Intel hold significant stakes in Imagination, presumably to prevent an eager SoC vendor from taking control of the company.

ARM also offers GPU IP in addition to its CPU designs, however we've seen very little uptake until now. Before we get to Mali's architecture, we need to talk a bit about the different types of GPUs on the market today.

Comments

  • Mugur - Tuesday, September 13, 2011 - link

    Well, for most Android devices I've tried (I currently own 3), if you just leave them doing nothing overnight (even with WiFi on for some of them, but no 3G/HSDPA, no GPS, etc.) the battery drain is about 2-3%. Of course, if some app, push email or an updating widget wakes them, the drain could reach 20-25%.

    You just have to play a bit with the phone and find out what is mostly consuming your battery, or even get one of the "green" apps on the Market. Through experimentation, I'm sure most people (excluding the really heavy users) will get 50% more battery life.
  • wuyuanyi - Monday, September 12, 2011 - link

    This must be the answer to my pending problem. My GS II has this issue and it has been very annoying: the CPU current produces EMI on the output circuit, since a Bluetooth earphone DOESN'T play such hiss and noise. I appreciate having the problem explained rather than having to suspect it was just my own unit. But the next question is how to solve it? Can we manually fix the shielding, or generate a noise that cancels it - with an inverted wave?
    hehe

    sorry for my poor English
  • awesomedeleted - Monday, September 12, 2011 - link

    This is a fresh copy of my current phone...Samsung Infuse 4G...which came out in May. I hate the newer Galaxy S round home button thingy too. What's so special, the name?
  • awesomedeleted - Monday, September 12, 2011 - link

    Although I now notice a few small differences in hardware, such as a 1.2GHz dual-core A9 vs. my Infuse's 1.2GHz single-core A8, and the 1GB of RAM.
  • supercurio - Monday, September 12, 2011 - link

    Infuse 4G is a Galaxy S "repackaged" with a Galaxy S II look, screen and probably camera sensor for AT&T.
  • bmgoodman - Monday, September 12, 2011 - link

    So I understand that the audio quality of this phone is a step down from the original Galaxy. My question is how big a step down? For a non-"audiophile" who just wants to connect the headphone jack into the AUX port on his OEM car stereo to listen to his variable bit rate MP3 (~128 kbps IIRC) music collection, is this something that's likely to disappoint? Is it a notable shortcoming for a more typical music fan?
  • supercurio - Monday, September 12, 2011 - link

    No doubt cars are in general a noisy environment.
    Furthermore, it's very rare to find cars with good speakers and a good implementation; the result is a far from linear frequency response, left/right imbalance, resonance in other materials, etc. :P

    Trained ears or sensitive people are capable of detecting subtle differences in sound like nobody can imagine ^^ but I don't think the Galaxy S II DAC issues described here will make a noticeable difference for most people listening to music while driving a car.

    Note: I have no idea how the original Samsung Galaxy phone fared in this regard, but it's a regression compared to the Galaxy S.

    Headphones... that's something else, because even cheap ones (price doesn't matter) can provide low distortion levels and let you perceive fine details.
  • Deusfaux - Monday, September 12, 2011 - link

    It is there and does work, speaking from experience with a Nexus S.
  • Deusfaux - Monday, September 12, 2011 - link

    An HTC phone I used did it best though, integrating the feature right into the browser settings. No special URL strings needed to access the functionality.
  • aNYthing24 - Monday, September 12, 2011 - link

    But isn't there a version of the Tegra 2 that is clocked at 1.2 GHz? It's going to run at that clock speed in the Fusion Grid tablet.
