Original Link: http://www.anandtech.com/show/4686/samsung-galaxy-s-2-international-review-the-best-redefined



SGS2 Intro

The road to our Galaxy S 2 review has been a long one. The first time we saw the device was at Mobile World Congress, where it was initially announced. There, Anand and I played with and hurriedly benchmarked one, and we came away more than a bit disappointed with performance. I set my expectations based on that initial experience and left the conference prepared to be underwhelmed when the device launched internationally and stateside. There was never any doubt in my mind that the device would be a runaway success just like the first one, but I still came away disappointed.

Boy was I wrong. The device that launched internationally is completely different, in the positive sense.

It took a long time for us to get an international Galaxy S 2 in our hands, but we finally got one, and for the past few weeks I’ve been using it as my primary device here in the US on AT&T. It’s no exaggeration to say that we’ve received more requests for a Samsung Galaxy S 2 (henceforth SGS2) review than for any other smartphone, by at least an order of magnitude. The tomes of information already written about this phone have made it all the more daunting to dive head-first into a comprehensive exploration of the device, but we’ve tried to do our best.

Physical Impressions

First things first, how does the phone feel? I keep going back to MWC because that’s where first impressions were made, and thankfully in-hand feel hasn’t changed at all since then. The SGS2 eschews the smooth, plasticky back of the previous SGS in favor of a textured battery door which snaps on. The new battery cover doesn’t wrap around to the front, instead snapping onto the back alone. There’s a notch at the top right that you can get a thumbnail into to pry the battery cover off.

Moving to a textured surface instead of the SGS’ oft-derided slick plastic back gives the device a much-needed boost in in-hand feel. When I first encountered the phone at MWC I was shocked by how much of a difference this simple change made. The edges are still lipped in smooth glossy plastic, but on the whole the device feels much more competent than the SGS1. It appears to be the same material, but now there’s much less propensity for scratching or slipping around. The design language of the original SGS is still here, but it’s all grown up.

It’s impossible to go any further without noting just how thin the SGS2 is. Samsung has developed something of an obsession with holding the thinness crown for its products, a crown the SGS2 did indeed hold for some time at 8.49 mm at its thinnest point. Like the other SGS phones, there’s a bulge at the very bottom which throws a bit of a wrench into the device having a completely uniform profile. It’s here that Samsung locates the antennas and the loudspeaker, though the real purpose for the bulge seems to be at least somewhat ergonomic. The bulge is around 10.1 mm thick, 1.61 mm thicker than the rest of the phone. For comparison, the old SGS was around 10.0 mm thick all over. As a result, the SGS2 is on the whole noticeably thinner, all while dramatically increasing power and features, which we’ll get to in a moment.

Next up is the relocation of the microUSB port. The original SGS’ microUSB port placement drew a lot of discussion, as it’s at top left and behind a small door on the international variant and most regional variants. The SGS2 however relocates the microUSB port to dead center of the bottom face of the device, and just right of it is the primary microphone port. I got used to seeing the microUSB port up top and understood that choice, but having things at the bottom just feels more natural.

The volume rocker and power button sit in basically the same positions on the SGS2 as on the previous generation. Power is about three quarters of the way up the right side of the device, and the volume rocker is on the left side, positioned with its top in line with the top of the power button. Thankfully these buttons are as communicative and clicky as ever; I can’t find any fault with them.

There’s a notch just above the volume rocker which looks like the slot for a hand strap. There’s no microphone attached to this area on the inside, so it’s a bit unclear to me what it’s for if not a hand strap.

At top is another microphone port, used for noise cancellation during calls by a discrete solution from Audience. Right next to it is the standard 3.5mm headphone jack.

The front of SGS2 is one unbroken glass surface, just like you’d expect for a top-tier smartphone launching this generation. Starting at the top is the 2 MP front facing camera, and right next to it is the proximity sensor and ambient light sensor. Of course, the real centerpiece is the 4.3" WVGA Super AMOLED+ display (henceforth SAMOLED+) which we’ll talk more about in a moment.




The button arrangement on the SGS2 continues the trend set by the international version of the original SGS, eschewing the search button in favor of (left to right) menu, home, and back. Of course, regional variants are going to have different button arrangements, but this three-button approach seems to be a mainstay of the international market. When I hand the phone to most people, there’s usually a bit of confusion about what the home button does, and many mistake it for an optical or capacitive trackpad. It’s just a button. The buttons are backlit, and there are options in the stock ROM to define the backlighting behavior - when in the dark, for a few seconds, and so forth.

Update: You can alternatively search by pressing and holding menu. Thanks everyone!

I have to be honest that the continued shunning of the search button confuses me - not just because its absence means we can’t run kwaak3 and get to the console without lots of work, but because going without it made me realize how much I use it. Thankfully, almost everywhere I’d use the search button there’s a contextual shortcut: menu, then search. It’s just an added button press in the occasional spot, which can feel alien if you’re used to having the dedicated button.

As I mentioned earlier, the battery cover is one piece of plastic which pries off and is held on with clips. It isn’t particularly sturdy, but on the upside that means getting the battery cover off isn’t a harrowing experience. Underneath are the SGS2’s large 6.11 Whr battery, microSD slot, and SIM slot. The microSD card can’t be accessed without a battery pull, and the card clicks in and clicks out. You can get the SIM out without a battery pull, however, and word has it you can even change SIMs without rebooting despite the prompts. At the very top is a ribbon antenna which is pretty evident, and below that is the camera module with adjacent LED flash.

There’s really not much else to say about the phone with the battery cover off; everything is laid out sensibly, and it’s clear just how much of the device’s internal volume is dedicated to that relatively large 6.11 Whr battery.

Overall the SGS2’s in-hand feel is much better than its predecessor’s - it’s incredible what a difference a new back texture and 1.6 mm of reduced waistline can make in how a phone feels. Where I waver back and forth is the weight department. The competition has largely gone in a design direction that employs metal and thus results in heavier devices. As a result, the SGS2’s light weight seems to imply a certain level of cheapness where really there is none. I guess that’s the problem - even though the SGS2 has metal internally for structure, the exterior is entirely plastic, and that’s ultimately the material that sets user perception. The good news is that though it feels light, the SGS2 has solid build quality.

There are no rattles when the vibrator is going, no flimsy parts that might snap off or break (like the old microUSB door), and few places where dirt can encroach. There’s also very little flex. It’s impressively solid after you get over the hurdle that is its light weight.

Physical Comparison

|         | Apple iPhone 4 | HTC Sensation | Samsung Galaxy S | Samsung Galaxy S 2 |
|---------|----------------|---------------|------------------|--------------------|
| Height  | 115.2 mm (4.5") | 126.3 mm (4.97") | 122.4 mm (4.82") | 125.3 mm (4.93") |
| Width   | 58.6 mm (2.31") | 65.5 mm (2.58") | 64.2 mm (2.53") | 66.1 mm (2.60") |
| Depth   | 9.3 mm (0.37") | 11.6 mm (0.46") | 9.9 mm (0.39") | 8.49 mm (0.33") |
| Weight  | 137 g (4.8 oz) | 148 g (5.22 oz) | 119 g (4.20 oz) | 115 g (4.06 oz) |
| CPU     | Apple A4 @ ~800MHz | 1.2 GHz Dual Core Snapdragon MSM8260 | 1.0 GHz Hummingbird S5PC110 Cortex A8 | 1.2 GHz Exynos 4210 Dual Core Cortex A9 |
| GPU     | PowerVR SGX 535 | Adreno 220 | PowerVR SGX 540 | ARM Mali-400 |
| RAM     | 512MB LPDDR1 (?) | 768 MB LPDDR2 | 512 MB LPDDR2 | 1 GB LPDDR2 |
| NAND    | 16GB or 32GB integrated | 4 GB NAND with 8 GB microSD Class 4 preinstalled | 16 GB NAND with up to 32 GB microSD | 16 GB NAND with up to 32 GB microSD |
| Camera  | 5MP with LED Flash + Front Facing Camera | 8 MP AF/Dual LED flash, VGA front facing | 5 MP AF, VGA front facing | 8 MP AF/LED flash, 2 MP front facing |
| Screen  | 3.5" 640 x 960 LED backlit LCD | 4.3" 960 x 540 S-LCD | 4.0" 800 x 480 SAMOLED | 4.27" 800 x 480 SAMOLED+ |
| Battery | Integrated 5.254 Whr | Removable 5.62 Whr | Removable 5.92 Whr | Removable 6.11 Whr |

 



Software

Before we go much further, I think it’s important to go over the software running on the international version of the SGS2 we’ve been loaned. I make explicit mention of the fact that the device is a loaner because that makes testing things with custom and leaked ROMs somewhat delicate. For the most part, carriers and OEMs don’t care as long as everything makes it back to them in exactly the state it shipped out, but it’s always a grey area. For that reason, I’ve been testing and using the device exclusively with the latest ROM offered for the phone in Samsung Kies. As of this writing, that’s still Android 2.3.3 and firmware XWKF3. I realize there’s a leaked 2.3.4 ROM; however, we’ve opted to stick with the official release at this point.

The original SGS started Samsung’s trend of adding UI skins to high-end devices, and drew a firestorm of criticism. Thankfully it seems as though Samsung has heard those complaints and lightened things up considerably this go-around with TouchWiz 4.0, which runs on the SGS2. Where TouchWiz 3.0 (from the SGS1) looked like a strange attempt at making Android 2.1 and 2.2 look like iOS, TouchWiz 4.0 is a much cleaner, less claustrophobic, and considerably less garish experience.

 

Starting with the lock screen, TouchWiz 4.0 continues the tradition of changing things here. Unlocking is achieved by moving the large graphic and clock off the screen, unless of course you’ve defined a custom lock pattern or PIN. Alerts such as new SMSes can be handled by sliding the notification ribbon across the screen. This background is customizable and separate from the main background as well. There’s really not much more to say beyond that I’m still surprised TouchWiz didn’t take a nod from HTC’s Sense 3.0 and add shortcut functionality to this screen.

 

The main application launcher and home screens are what make or break a skin, and here I think there’s more positive than negative with TouchWiz 4.0. To start, home screen one is at the far left, not the center. Switching between screens is accomplished either by swiping back and forth or by dragging on the dots at the bottom. The animation is extremely fluid - I get the impression that the entire TouchWiz 4.0 experience leverages the GPU for composition, and as a result it feels very speedy.

 

There’s a contextual menu as well where new widgets, shortcuts, and folders can be added. In fact, most of the home screen customization takes a similar screen-on-top, menu-on-bottom approach, which makes a lot more sense than stock Android’s popup bubble scheme. Tapping widgets gives you a long list of available widgets which tilt as you scroll through them. Just like other UI skins, there’s an assortment of skin-specific widgets that support resizing.

For the most part, I find that TouchWiz 4.0 moves away from the social-hub-augmented-with-weird-widgets motif set by the last generation of UI skins. That’s definitely a good thing, because most of the time that generation failed to deliver social experiences that came close to the true first-party ones.

TouchWiz 4.0 does still keep the bottom row of applications, another throwback to iOS, and like other skins puts the application launcher shortcut at the far right.

 

By default the application launcher presents icons in a 4x4 paginated layout, though you can toggle a list view as well and just scroll up and down. Menu, then edit brings you to a view just like the home screen customization page, where you can move icons around and also change the bottom row of shortcuts. There’s folder support for organization, too. One major plus is that icons no longer have the chiclet-like background colors that made everything square and applications difficult to identify quickly. With that gone, the result feels far less tacky than the previous iteration.

Just like on the home screen, you can change between pages of applications by tapping on a page number dot, or scroll back and forth quickly by sliding along the bar, which produces the same animated sliding view the home screen shows. I guess that’s one positive thing this go-around with TouchWiz: if nothing else, you can’t criticize it for being inconsistent. Honestly, the launcher and home screen components of TouchWiz are for the most part pretty tolerable.

Just like in the past, the notification shade drop-down includes toggles for WiFi, Bluetooth, GPS, Sound, and Auto Rotate. It all works just like you’d expect. One small extra: if you’re in manual brightness mode, pressing and holding on the notification bar and dragging left or right changes brightness along the scale. It’s a quick way to get analog control over brightness when you’re in manual mode.



Keyboards

By default, the SGS2 comes with Swype and the Samsung keypad preinstalled. I’ve moved away from Swype in recent months and taken quite a liking to the default Gingerbread keyboard, so it’s odd to see that Samsung has removed it from its stock ROMs. For me, that keyboard was one of the major enhancements that came with 2.3, and it’s puzzling how many OEMs choose to purposefully exclude it and instead ship their own strange keyboard in its place.

 

The Samsung keypad is honestly less than ideal and feels like it belongs back in the Android 2.1 world from whence it came, which is likely why Swype is set as the default. It lacks autocorrect functionality out of the box and generally just looks drab.

 

Getting autocorrection enabled requires diving into the menus and enabling it for your given language, and even then it isn’t that great. I’m still confused why Samsung would elect not to include the excellent 2.3 keyboard and instead force users to install the APK themselves.

Messaging

SMS is one of those things that each phone needs to do perfectly, and I think it’s especially worth taking a formal look at when an OEM moves away from the stock Android application. Bring up messaging and you get a list of ongoing conversations sorted by last activity, just like you’d expect.

 

Tapping new gives you a nice, clean composition page complete with character count. The conversation view is threaded and in large speech bubbles, complete with date and time stamps on each message.

Honestly I can find no fault with the Samsung messaging application. It doesn’t make the mistake other OEMs have made of using an overly huge font or letting decorations eat into usability and vertical space, though the composition box could stand to be a row shorter so more of the thread is visible. In addition, I spent a lot of time hammering on the SGS2’s messaging stack to try to make it slow down dramatically, as I’ve seen a few other Android phones do - no such lag took place, even after a few weeks without deleting anything, which is a great sign.

Browser

Like the original Galaxy S, the SGS2 includes Samsung enhancements to the browser that dramatically increase smoothness. At the time of the original we could only explain the performance increase by shrugging and calling it GPU accelerated. We know a bit more now about what’s required to make browsing smooth in this fashion, and the answer lies in a backing store. A backing store is essentially a nice way of saying cache; in this case what’s being cached is the rendered page itself, either as a texture or as some intermediary a step above final rendering.

A backing store is what makes iOS’ browser so smooth, and you can see it render into the texture (or, if you overscroll beyond the rendered region, where it hasn’t yet) as those little grey rectangles. Render into a big texture, and it becomes a relatively free GPU operation to transform and clip that texture as the user scrolls around the page, though zooming requires a re-draw. Until Android 3.x, however, the stock Android browser didn’t have a backing store, which is why translating around feels choppy. As a result, it has been the burden of OEMs to make their browsers feel snappy by incorporating their own backing stores. HTC works with Qualcomm to bring an appropriate level of smoothness to its devices, Android 3.x has one as mentioned (which will no doubt carry over to Ice Cream Sandwich), and Samsung again has one in the SGS2 just like it did in the original SGS.
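To make the backing-store idea concrete, here’s a toy sketch in Python. Every name in it (`BackingStore`, `scroll_to`, and so on) is invented for illustration and doesn’t correspond to Samsung’s or Android’s actual code; it just shows why scrolling is cheap (a blit from the cached texture) while zooming is expensive (a full re-render):

```python
# Toy model of a browser backing store (illustrative only; all names are
# invented, not Samsung's or Android's real API).

class BackingStore:
    def __init__(self, page_w, page_h):
        self.page_w, self.page_h = page_w, page_h
        self.texture = None          # cached render of the whole page
        self.renders = 0             # count of expensive layout/paint passes

    def _render_page(self):
        # Expensive: lay out and paint the full page into one big texture.
        self.renders += 1
        self.texture = ("texture", self.page_w, self.page_h)

    def scroll_to(self, x, y, view_w, view_h):
        # Cheap: just clip/translate the cached texture on the GPU.
        if self.texture is None:
            self._render_page()
        return ("blit", x, y, view_w, view_h)

    def zoom(self, factor):
        # Zooming changes the rendered scale, so the cache must be redrawn.
        self.texture = None
        self.page_w = int(self.page_w * factor)
        self.page_h = int(self.page_h * factor)


store = BackingStore(800, 4000)
store.scroll_to(0, 0, 480, 800)      # first scroll triggers the one render
store.scroll_to(0, 600, 480, 800)    # further scrolls reuse the texture
assert store.renders == 1
store.zoom(2.0)
store.scroll_to(0, 600, 480, 800)    # zooming forced a re-draw
assert store.renders == 2
```

The same tradeoffs fall out of even this toy: the big texture costs GPU memory (hence 16-bit color and occasional white screens on huge pages), while scroll stays essentially free.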

 

So how good is the SGS2’s browser backing store? Very good. This is far and away the smoothest Android 2.x browsing experience, by a large margin. The only downside is that the browser renders in 16-bit color, again undoubtedly to make the page an easy texture for the GPU to manipulate. I’ve also noticed once or twice that the browser will show a white screen instead of the content after a page loads, which to me indicates that keeping the backing store working perfectly with a big page can be a challenge - perhaps GPU memory is at a real premium when this happens. I’m told this is fixed in newer firmware. That said, the tradeoff is well worth it, as zooming, translating, just about everything is buttery smooth. Browser smoothness is finally basically at parity with iOS.

What’s very impressive is that Samsung even manages to keep Flash 10.3 plugins animated while panning and scrolling around, something HTC currently halts temporarily while translating around in its browser. It’s hard to communicate just how smooth and fluid the SGS2 browser is, and I’d encourage interested parties to watch our video demonstrating it.

Finally, there’s one last semi-hidden browser feature - custom user agents. Enter “about:useragent” into the URL bar and you can pull up a menu to select between a number of different user agents, masquerading the SGS2 as an iPhone, Galaxy S, Desktop (OS X 10.5.7 Safari), Nexus One, Lismore, or a custom string. This is another thing I wish the stock Android browser offered similar control over.



Applications

Next up is a quick enumeration of the skinned or custom applications that come bundled with the SGS2 as part of TouchWiz 4.0. I’ve taken some screenshots of the default application bundle and some of the apps and tossed them into a gallery, and for the most part there isn’t much to talk about in detail.

Contacts takes you into Samsung’s dialer application, which thankfully is smart-dial enabled, just like HTC’s.

 

Among the extras are a voice recorder, task manager, FM radio app, and of course Kies air. The voice recorder gets the job done and is pretty basic, as it should be. The TouchWiz task manager is snappy and has some handy ‘kill everything’ buttons to free up RAM. The FM radio app supports multiple regions, RDS, and auto search, and it has a nifty analog-feeling manual tuner too.

 

There’s also a video editing and photo editing application bundled. Photo editor lets you make some basic changes like crop, saturation, and some filters. It’s actually pretty decent.

 

Video editor does what you’d expect and seems to be a rather basic facsimile of iMovie for iOS, complete with a few themes and basic editing. The interface does a surprisingly good job of letting you trim and combine video clips, complete with transitions and stills. The live preview runs at a fairly low framerate, which surprises me, though my source material was 1080p video captured on the camera. Export is limited to 720p and takes a while.

Storage

Our SGS2 was the 16 GB unit, which came partitioned as follows:

Filesystem             Size   Used   Free   Blksize
/dev                   418M    76K   418M   4096
/mnt/asec              418M     0K   418M   4096
/mnt/obb               418M     0K   418M   4096
/mnt/usb               418M     0K   418M   4096
/app-cache               7M     4M     2M   4096
/system                503M   456M    47M   4096
/cache                  98M     4M    94M   4096
/efs                    19M     8M    11M   4096
/data                    1G   402M     1G   4096
/mnt/sdcard             11G     1G    10G   32768
/mnt/sdcard/external_sd     7G   977M     6G   32768

What’s a bit curious is that the SGS2 is well known to have 2 GB of internal storage, yet the /data partition above clearly shows only 1 GB. Apparently this is a known rounding error in the version of df in the firmware we’re running, and newer leaked 2.3.4 images report 2 GB for /data appropriately.
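Purely to illustrate how a df bug like this can arise - this is an assumption about the mechanism, not a disassembly of the shipped binary - a human-readable size printer that truncates instead of rounding at each unit boundary will happily report a nominal 2 GB partition (which loses some space to filesystem overhead) as "1G":

```python
def human_size(kbytes):
    # Truncate (rather than round) at each unit boundary, as a minimal
    # df implementation might; this is where the "1G" report can come from.
    for unit in ("K", "M", "G", "T"):
        if kbytes < 1024:
            return f"{int(kbytes)}{unit}"
        kbytes //= 1024
    return f"{int(kbytes)}P"

# ~1.9 GB usable on a nominal 2 GB /data partition...
print(human_size(1_900_000))   # → "1G": truncation hides almost half the space
print(human_size(2_097_152))   # → "2G": an exact 2 GiB survives intact
```

Whatever the actual bug in this firmware’s df, the takeaway is the same: the partition itself is fine, only the reporting is off.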

Either way, having 2 GB is more than enough for application storage and shouldn’t result in anyone running out of space - this isn’t the 150 MB or so that early Android 2.x devices offered. Of course you can also add a microSD card for additional external storage and move apps to it, like I’ve done above as shown in the sdcard/external_sd mount. What’s really good, however, is that RFS is gone right out of the box, and in its place is EXT4:

/dev/block/mmcblk0p9 /system ext4 ro,relatime,barrier=1,data=ordered 0 0
/dev/block/mmcblk0p7 /cache ext4 rw,nosuid,nodev,noatime,barrier=1,data=ordered 0 0
/dev/block/mmcblk0p1 /efs ext4 rw,nosuid,nodev,noatime,barrier=1,data=ordered 0 0
nil /sys/kernel/debug debugfs rw,relatime 0 0
/dev/block/mmcblk0p10 /data ext4 rw,nosuid,nodev,noatime,barrier=1,data=ordered,noauto_da_alloc,discard 0 0
/dev/block/mmcblk0p4 /mnt/.lfs j4fs rw,relatime 0 0

The result is none of the filesystem lag that plagued the original SGS; it looks like Samsung has learned its lesson here.

Software Conclusions

There are a bunch of other small touches in TouchWiz 4, including the ability to change the system font (which is becoming a pretty common feature) and motion-based gestures in some places. Probably the most subtle extra I’m grateful for is screenshot functionality - screenshots can be taken by holding home and quickly pressing power.

For the most part, the experience is pretty pleasant and Samsung does make some welcome additions that improve browser and UI smoothness in Android 2.3 that likely won’t be part of mainline until Ice Cream Sandwich.



Display

One of the highlights of the SGS2 is its 4.3" SAMOLED+ display, which we’ve seen before on phones like the Droid Charge, and in a 4.5" version on the Samsung Infuse 4G. Though the panel is the same as what we’ve seen in the past, the controller and software are different.

As a quick refresher, Samsung has now passed through three variants of AMOLED. The first was straight-up AMOLED, which we saw on phones like the Nexus One and Incredible in a 3.7" WVGA format with RGBG PenTile. Next was Super AMOLED, which was 4.0" WVGA with PenTile and adorned the Galaxy S. The main improvements with Super were integration of the digitizer with the top glass and the use of optically transparent adhesive to reduce air gaps and the subsequent Fresnel reflections that add glare and reduce transmissivity. The net effect was improved outdoor readability and potentially some power savings from losing less light to back reflections.

 
Left: Super AMOLED Plus, Right: Super AMOLED

So now we’re up to Super AMOLED Plus (SAMOLED+) - what does this add? First, size is now 4.3" or 4.5" (depending on what tickled a given carrier’s fancy), and resolution is still WVGA (800x480), but the big change is that PenTile RGBG is gone. In its place is a standard RGB stripe. I’ve been rather critical of RGBG PenTile in the past purely because it emulated higher effective resolutions using fewer subpixels (2 per logical pixel) and as a result had a characteristic grain in some circumstances. On AMOLED especially it wound up being more distracting than novel, and on 4" displays the subpixels seemed visible to the naked eye at average visual acuity. Furthermore, there were some issues with an offset pattern like RGBG and the UI direction Android was taking: single-pixel-wide UI elements, some text, and solid primary colors were the main places where RGBG could, without considerable scrutiny, look characteristically grainy.

So why is it gone now? The big reason is probably that a move to a larger panel increases the size of those subpixels, and no doubt 4.3" WVGA with PenTile would look even grainier despite having the same "effective" resolution. Four inches was pushing it for a grid that started life at 3.7", and 4.3" probably was a step too far. In addition, subpixels are correspondingly larger in the 4.3" RGB stripe (and the process mature enough now) that certain color subpixels being more prone to failure than others (and thus needing to be sized appropriately) should no longer be a concern. Samsung also claims that power drain has been reduced in SAMOLED+ by almost 20% from the previous generation, no doubt thanks in part to fabrication maturity and changes that come with better understanding of the process.
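The pixel-density arithmetic behind that argument is easy to check. Assuming the nominal panel sizes discussed here (WVGA at 3.7", 4.0", and 4.3"), a quick calculation gives the logical pixel density and total subpixel counts, with PenTile RGBG at 2 subpixels per logical pixel versus 3 for an RGB stripe:

```python
import math

def ppi(w_px, h_px, diagonal_in):
    # Logical pixels per inch along the panel diagonal.
    return math.hypot(w_px, h_px) / diagonal_in

# WVGA (800x480) at the panel sizes discussed above.
for diag, layout, sub_per_px in [(4.0, "PenTile RGBG", 2),
                                 (4.3, "PenTile RGBG", 2),
                                 (4.3, "RGB stripe", 3)]:
    density = ppi(800, 480, diag)
    subpixels = 800 * 480 * sub_per_px
    print(f'{diag}" {layout}: {density:.0f} PPI, {subpixels:,} subpixels')
```

Density drops from about 233 PPI at 4.0" to about 217 PPI at 4.3", while moving to a full RGB stripe raises the subpixel count from 768,000 to 1,152,000 - exactly the combination that masks the grain PenTile was showing.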

The same benefits apply with SAMOLED+ as with the previous generations, though - absolutely black blacks, because subpixels in the off state emit no light, and potentially super-vibrant colors (if calibrated properly). Unfortunately, the few issues we saw with SAMOLED+ on other phones continue here as well - a white point that varies with brightness level, a chance of overheating, and a bit of lingering sharpening.

Let’s start with the first one, which a lot of users have dubbed ‘yellowing’. For a while now we’ve been gathering white point data at various brightness levels. Obviously we did the same thing with SGS2.

White Point Tracking

I’ve measured brightness (full-screen white and black) and white point at six brightness levels on the SGS2. Before measuring, I noted that subjectively the most visible change in color temperature happens after you dip below the 50% brightness mark, so I took more measurements below that halfway point. I also tossed in the Samsung Infuse 4G (which we received but didn’t formally review), whose 4.5" SAMOLED+ display is no doubt identical to what’s headed to the Sprint and T-Mobile SGS2 variants, though with a different display controller. Finally, I included the Samsung Galaxy S 4G as a SAMOLED data point and the Nexus One as an AMOLED data point, so you can see how things have changed over the now three generations of AMOLED panels Samsung has shipped.
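For the curious, the correlated color temperatures (CCT) quoted in this section can be derived from measured CIE 1931 (x, y) chromaticity with McCamy's well-known cubic approximation. This is the generic textbook formula, not necessarily what our measurement software does internally:

```python
def cct_mccamy(x, y):
    # McCamy's approximation of correlated color temperature from CIE 1931
    # chromaticity; accurate to roughly +/-50 K for near-white sources.
    n = (x - 0.3320) / (0.1858 - y)
    return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33

# Sanity check: the D65 white point (x=0.3127, y=0.3290) should land
# near 6500 K.
print(round(cct_mccamy(0.3127, 0.3290)))
```

Smaller Kelvin numbers mean a warmer (yellower) white, which is why the "yellowing" complaint maps onto the lower-brightness end of the curves below.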

The data bears out the effect that numerous subjective observers have noticed - the SGS2’s display gets warmer at lower brightnesses, varying between 7328K at 0% brightness and 8600K at 100% brightness. That’s enough of a delta in white point to be, unfortunately, very visible to the naked eye. There’s also an interestingly large amount of variance between the three SAMOLED+ phones we’ve measured: the same-shaped curve is simply translated around for the Infuse, while the Charge appears to be very blue everywhere. Bear in mind again that the SGS2 uses a different display controller than the previous generation of devices.



The other part of the story is Samsung’s mobile Digital Natural Image engine, or mDNIe, profile set on the SGS2. Numerous people have noticed that under Display -> Background Effect lurks a page with a sample image and three presets - Dynamic, Standard, and Movie.

On previous Galaxy S devices there was a box in the camera app marked ‘outdoor viewing’ which increased brightness and contrast. I always wondered how that worked, and the answer is mDNIe profiles. Inside /system/etc/ are a bunch of files prefixed with ‘mdnie_tune’ followed by more text, for example ‘mdnie_tune_camera_outdoor_mode’ and ‘mdnie_tune_standard_mode’. These files are how the various settings are defined, and there are a bunch of them.

Inside are settings which control sharpening, saturation, and other things which are governed by mDNIe. For example, the mdnie_tune_ui_standard_mode file looks like this:

//start
0x0001,0x0000,  //
0x002c,0x0fff,  //DNR bypass 0x003C
0x002d,0x1900,  //DNR bypass 0x0a08
0x002e,0x0000,  //DNR bypass 0x1010
0x002f,0x0fff,  //DNR bypass 0x0400
0x003A,0x000d,  //HDTR DE_off CS : de on = d , de off = 9
0x003B,0x0001,  //DE SHARPNESS(0~1023)  off
0x003C,0x0000,  //NOISE LEVEL
0x003F,0x001e,  //CS GAIN 30
0x0042,0x0030,  //DE TH (MAX DIFF)
0x0028,0x0000,  //Register Mask
//end

Movie and Standard differ only in CS (Chroma Saturation) Gain (30 versus 50), while Dynamic boosts that to 300 along with another field whose purpose I’m not certain of. I’m told by Francois that Dynamic also changes the white point through mDNIe by clamping, and thus costs some dynamic range. Unfortunately there’s no ‘everything off’ mode with no sharpening or chroma gain to make colors less oversaturated out of the box, though with root you can obviously change and experiment with these files. For the record, all measurements I’ve done on the SGS2 were in Standard mode.
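Since the tune files are just register/value pairs with C-style comments (as in the excerpt above), pulling them into something readable for comparison takes only a few lines. A sketch, assuming the format shown in the excerpt holds across the files:

```python
import re

def parse_mdnie(text):
    # Each data line looks like "0x003F,0x001e,  //CS GAIN 30"; extract the
    # register, value, and trailing comment, skipping //start, //end, etc.
    entries = []
    for line in text.splitlines():
        m = re.match(r'\s*(0x[0-9a-fA-F]+),(0x[0-9a-fA-F]+),\s*//\s*(.*)', line)
        if m:
            reg, val, note = m.groups()
            entries.append((int(reg, 16), int(val, 16), note.strip()))
    return entries

sample = """
0x003F,0x001e,  //CS GAIN 30
0x0042,0x0030,  //DE TH (MAX DIFF)
"""
for reg, val, note in parse_mdnie(sample):
    print(f"reg 0x{reg:04x} = {val:4d}  ({note})")
```

Diffing two parsed profiles (say, `mdnie_tune_standard_mode` against the movie variant) then makes it obvious that only the CS gain register changes between them.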

Now what about brightness across the spectrum of user-selected intensity percentages?

Bright SAMOLED

It’s redundant to show black brightness since each device measures 0 nits due to black pixels not emitting any light, so AMOLED remains super contrasty, even if brightness is about the same with SAMOLED+ as it was with SAMOLED. Thankfully the curve is nice and linear.

Display Brightness

On the big display graph though, SAMOLED+ still isn’t as bright as the competition, though again having infinite contrast does make the display subjectively awesome indoors.

Outdoors, SAMOLED+ is about the same as the previous generation. It isn’t very easy to see the display contents outside in direct sunlight, but then again, what phone looks as good outside as it does inside? As mentioned earlier, SAMOLED+ still gets the optical bonding benefits (fewer reflections) that SAMOLED brought, so if you were pleased with viewability there, expect much of the same with this update.

The only major issue outdoors is something else entirely. I noticed pretty quickly with the Infuse 4G and Droid Charge that outside in my climate’s environment (~100+F outdoor temps, lots of sunlight) that the phones would clamp brightness to about 75% to prevent overheating. This is in part a measure to protect the display panel and of course other internal components. I set out to find out whether SGS2 implements the same thermal restrictions, and it does.

I broke out my contactless IR thermometer, went out into the midday sun on my patio, and set the phone down. Getting hot enough for the display brightness clamp doesn’t take long in this climate - about 5 to 10 minutes will do it. At around 115F (~45C) surface display temperature, you’re clamped to 75% maximum brightness until the temperature drops. Subjectively, I actually don’t think the SGS2 is as prone to overheating as the Charge or Infuse.
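The observed behavior maps onto a very simple thermal governor: above a trip temperature, cap brightness at 75% until the panel cools. Here’s a hedged sketch of that policy - the ~45C trip point and 75% cap come from our measurements, but the firmware’s actual logic (and any hysteresis it applies) is an assumption:

```python
def governed_brightness(requested_pct, panel_temp_c, trip_c=45.0, cap_pct=75):
    # Above the trip temperature, clamp the user's requested brightness to
    # the cap; below it, pass the request through unchanged.
    if panel_temp_c >= trip_c:
        return min(requested_pct, cap_pct)
    return requested_pct

assert governed_brightness(100, 30.0) == 100   # cool: full brightness allowed
assert governed_brightness(100, 46.0) == 75    # hot: clamped to 75%
assert governed_brightness(50, 46.0) == 50     # requests below the cap pass
```

A real governor would almost certainly add hysteresis (a lower release temperature) so the brightness doesn’t flicker right at the threshold, which is consistent with the multiple trigger levels visible in the dmesg output below.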

Some other people have reported the SGS2 crashing or encountering a thermal shutoff after a certain point, so I braved the heat and stayed outside even longer using the device until it hit well over 140F (60C), and still no system shutdown or overheating happened. That's not to say it isn't possible, as the SGS2 clearly does have thermal monitoring; for example, the following lines from dmesg suggest thermal trip points, though I crossed these boundaries with no ill effects:

<6>[    0.047638] thr_low: 83, thr_high: 98  warn_low: 97 c warn_high 106
<6>[    0.047715] tq0_signal_handle_init
<6>[    0.047751] tmu_initialize: te_temp = 0x00000048, low 8bit = 72, high 24 bit = 0
<6>[    0.047765] Compensated Threshold: 0x7d
<6>[    0.098087] Cooling: 82c  THD_TEMP:0x80:  TRIG_LEV0: 0x89 TRIG_LEV1: 0x99 TRIG_LEV2: 0xa0

Back to the display, next up are viewing angles, which the SGS2 thankfully preserves from the previous generation. I tossed the SGS2, SGS4G, and Optimus 2X in the lightbox and took pictures at various extreme angles. I realize the Sensation is a comparison point people are interested in, unfortunately that went back a while ago.

Viewing angles are awesome on all three - the SGS4G’s SAMOLED display (left), SGS2’s SAMOLED+ (middle), and Optimus 2X’s IPS display (right).

Another small thing about the SGS2's SAMOLED+ is that high contrast images can persist for a few seconds. It isn't burn-in, but rather an image persistence that remains visible briefly before fading, and it can be very noticeable. For example, after leaving the Android keyboard up (which is black, grey, and white) and dragging the notification shade down, a shadow of the keyboard remains visible until it fades a few seconds later. This happens in other applications as well, and I can only hope it doesn't become permanent if a static image is left up too long.

Wrapping up SAMOLED+ is difficult, because whether or not you like it over traditional LCD alternatives is ultimately a very subjective (and as I’ve learned in discussions, sensitive) matter. We’ve codified the differences between SAMOLED+ and previous generations, and other IPS displays, but really it’s impossible to communicate every subtle difference.

Personally, I prefer higher PPI IPS-LCD displays, though at 4.3" SAMOLED+'s WVGA (800x480) isn't a slouch, and the change from RGBG PenTile to an RGB stripe helps matters. Where WVGA starts to become a problem is at 4.5". Scaling up the area and increasing the diagonal by 0.2" doesn't sound like a problem, but r^2 is a bitch, and at that size both the Android UI elements and subpixels look absurdly huge. Luckily, the international SGS2's 4.3" is completely tolerable with WVGA.
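
For reference, the pixel density penalty from that extra 0.2" is easy to quantify:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch from panel resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# WVGA (800x480) at the two panel sizes discussed
print(round(ppi(800, 480, 4.3)))  # ~217 PPI on the international SGS2
print(round(ppi(800, 480, 4.5)))  # ~207 PPI at 4.5 inches
```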

MHL

Last but not least, the SGS2 supports HDMI out through USB MHL. For those that haven't encountered the term before, MHL (Mobile High-definition Link) is just a way of getting HDMI out through a low pin-count port while simultaneously supplying power. So far, all the MHL implementations I've seen work over microUSB, but other connectors may carry MHL in the future as well.

I had a Samsung MHL adapter lying around from a Samsung Infuse 4G; it has a microUSB port on the side for connecting a charger, a full size HDMI port, and a microUSB connector which plugs into the host device. With all this connected, you get HDMI mirroring, which works on the SGS2.

I connected the SGS2 over HDMI up to an ASUS PA246Q and saw it negotiate a 1080i link and do HDMI mirroring flawlessly. Landscape is also supported, thankfully, and seems to work just like it should.



Camera - UI and Video Quality

The cameras on the SGS2 get a sizable upgrade from the previous generation. To start, SGS1 shipped with a 5 MP rear facing camera with autofocus and a VGA front facing camera (though the USA variants only got a front camera when the SGS4G packed one), and devices that are essentially SGS at their core have shipped with similar cameras since, sometimes adding a flash.

SGS2, however, includes an 8 MP rear facing camera with LED flash and autofocus, and a 2 MP front facing camera. Thus far it appears as though things are going to be considerably more consistent across SGS2 variants, with all three USA-bound variants including the same specs, on paper at least.

I suppose as good a place as any to start is the camera UI, which gets a significant revamp in the SGS2. The UI is much cleaner and looks more mature, with less of the bubbly rounded edges and more of a clean square look. There's also no toggling UI elements on and off for a 'lite' view; you get this and this only. The bottom or right bar mirrors iOS' camera application, with a toggle for photo and video, preview on the other side, and a rectangular capture button in the center.

Tapping anywhere in the preview does an autofocus for that region, though auto exposure is either done by center metering, spot, or a matrix - tapping doesn’t change that.

You can long-press on either of the two top left icons (switch cameras, change flash) and add two more shortcuts. The settings button on the bottom left brings up an overlay with more capture options, of which there is a wealth.

Self shot functionality, flash, shooting modes, scene modes, exposure, focus modes (auto, macro, or face), resolution, white balance, ISO, metering, outdoor visibility (an mDNIe toggle), anti-shake (electronic), auto contrast, blink detection, image quality, GPS, and storage location - phew, anyone still with me? That's about everything, and I'd encourage checking out the gallery for a tour of all of it.

Switching to video mode keeps much the same settings, just with the differences you'd expect to accommodate video. Video capture resolutions include VGA, 480p, 720p, and 1080p, flash is either on or off, and there are fewer shooting modes. For some reason, the SGS2 uses 480p by default instead of 720p or 1080p; honestly, I don't know why anyone would use anything but those two higher settings.

The UI also correspondingly goes transparent to accommodate the 16:9 aspect ratio of these modes, though it doesn’t disappear or go away fully.

I suppose that’s as good a time as any to talk about video quality on SGS2. The device has continual autofocus, which you can see working in our test videos. We’ve done the usual thing and taken videos at the bench location with a reference Canon Vixia HF20 alongside the phone-under-test on a dual-camera bracket. I’ve taken comparison video from the camcorder and the SGS2 and uploaded the lot to YouTube, in addition to putting zipped up copies on the server (415.1 MB) for interested parties to download and see without the YouTube transcode.

In 1080p mode, the SGS2 records 1080p30 video in H.264 High Profile with one reference frame at 17.0 Mbps. That's 2 Mbps above the Droid 3's 15 Mbps High Profile 1080p video which we were a fan of, and it now appears that Exynos 4210 has just as competent a hardware encoder as OMAP 4430, supporting High Profile features and delivering high bitrate at the same time. Audio, however, is just single channel AAC at 60 Kbps, which is disappointing considering the SGS2 has two microphones, though it appears the top mic is used exclusively for Audience noise cancelation.
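
As a quick aside, these bitrates add up fast on storage; a back-of-the-envelope sketch:

```python
def mb_per_minute(video_mbps, audio_kbps):
    """Approximate recorded file size in MB (10^6 bytes) per minute,
    ignoring container overhead."""
    bits_per_sec = video_mbps * 1e6 + audio_kbps * 1e3
    return bits_per_sec * 60 / 8 / 1e6

print(round(mb_per_minute(17.0, 60)))  # 1080p preset: ~128 MB/min
print(round(mb_per_minute(12.0, 60)))  # 720p preset: ~90 MB/min
```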

Subjectively, the 1080p30 video shot on the SGS2 looks like the best we've seen so far: there's no blocking in the dark regions, great high spatial frequency detail, and really nothing to complain about. Exynos also supports 16x16, 8x8, 4x4, and 8x16 DCTs, but only encodes with backward references and one reference frame (much like OMAP4's encoder). The point is that there's still room for even better encoder efficiency, which would come from encoders that use 2-4 reference frames and forward references. Sadly, such encoders probably won't be around for a while.

The 720p30 preset records at 12.0 Mbps with the same encoding features as 1080p, meaning its encoding is of similar quality at this preset.

There's a difference in crop factor when switching between 720p and 1080p shooting modes. 1080p clearly puts the sensor in a mode where it only reports a native 1920x1080 region back, whereas 720p appears to perhaps use 2x2 binning, and 480p or lower resolutions appear to just decimate the full sensor output. The result is that as you move to lower video resolutions, you get a wider field of view.
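
A rough sketch of why that native 1080p crop narrows the view, taking the 1.4µm pixel pitch and 4.15 mm focal length discussed elsewhere in this review at face value (treat the outputs as illustrative, not measured):

```python
import math

PIXEL_UM = 1.4    # assumed pixel pitch from the sensor specs
FOCAL_MM = 4.15   # assumed focal length from the module specs

def horizontal_fov_deg(active_width_px):
    """Horizontal field of view for a given active sensor width in pixels."""
    width_mm = active_width_px * PIXEL_UM / 1000.0
    return math.degrees(2 * math.atan(width_mm / (2 * FOCAL_MM)))

print(round(horizontal_fov_deg(3264), 1))  # full 8 MP sensor width
print(round(horizontal_fov_deg(1920), 1))  # 1080p crop: noticeably narrower
```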

Samsung Galaxy S 2 - 720p Sample
 
Samsung Galaxy S 2 - 1080p Sample
 
Canon Vixia HF20 - Reference Sample

Again, video recording quality on the SGS2 is decidedly awesome in 1080p mode. Though 720p could be better with better encode settings, the device shoots some of the best video we've seen out of a smartphone to date.



Behind SGS2's Camera - Still Quality

Now the next subject is still image capture on the SGS2. Before I go any further, I think now is as good a time as any to talk about what sensors are in the device. Getting to the bottom of this took some poking around, and where I started was the camera firmware. Usually finding what sensors are used in a given device is pretty straightforward - look for driver messages in dmesg when the kernel boots, and then see which ones correspond to cameras. However, on the SGS2 the sensor is hidden behind a custom ISP that talks over I2C to Exynos, which didn't lead me much further than just finding out what particular ISP is onboard.

I opened the camera firmware (from /system/etc/firmware ) in a hex editor and fired away. There are a number of interesting things which pop up. First up is this:

Softune REALOS/FR is Realtime OS for FR Family, based on micro-ITRON
COPYRIGHT(C) FUJITSU LIMITED 1994-1999

So we know that the ISP is Fujitsu. Then there’s a line like this:

Copyright (c) 2005-2008 by FotoNation. All rights reserved.
Face Detection Library v.1.2.58.7

and finally:

OBED04 Fujitsu M5MOLS

all strewn among a bunch of padded bits and compiled code incorporated into the SGS2's "camera" firmware. So what's the real story? Well, the SGS2 uses a Fujitsu Milbeaut M-5MO ISP paired with one of two cameras. To find out which camera the SGS2 uses, I took a look in Francois' SGS2 kernel repo under the actual M5MO C driver file. Inside, there's a block like this inside a function named "m5mo_camera_type_show":

if (state->exif.unique_id[1] == 'B') {
    strcpy(type, "SONY_IMX105PQ_M5MOLS");
} else if (state->exif.unique_id[1] == 'C') {
    strcpy(type, "SLSI_S5K3H2YX_M5MOLS");
} else {
    cam_warn("cannot find the matched camera type\n");
    strcpy(type, "SONY_IMX105PQ_M5MOLS");
}

So we now know that inside the SGS2 is either a Sony IMX105 or a Samsung S5K3H2YX sensor. This is basically the same camera lottery situation that the MyTouch 4G Slide is in, as it in fact lists the same two exact sensors, though with F/2.2 optics. Both are basically the same on paper and should offer similar performance - 1/3.2" size, 1.4µm backside illuminated pixels, and 8.13 MP (3264 x 2448). The front-facing camera uses a Samsung S5K5BAF 2 MP sensor sized 1/5" with 1.75µm square pixels.

Interestingly enough, I believe I was able to find the actual module which Samsung uses inside the SGS2 on a Samsung fiber optics website, using the Sony IMX105 module. Take note of the appearance of this module, as it’s virtually identical to what I saw inside the device as I’ll show in a moment.

Having two sensor suppliers isn't anything new: Apple has done it (and will continue to do so), HTC is doing it, and now Samsung is doing it too. With the same on-paper sensor performance and the same autofocus and optical system module, things should all work out and photos should look the same no matter what sensor is inside.

Other specs about the camera module are that EXIF reports an F/2.7 aperture and 4.0 mm focal length. This is a bit odd to me, since F/2.8 is on the typical full-stop scale (2*sqrt(2)) and F/2.4 is the next half-stop down, and I'm only aware of IMX105 coming in F/2.4 and F/2.8 modules. It just goes to show that sometimes EXIF data is weird. The module is most definitely the F/2.8, f=4.15 mm variant with a 28.1 degree horizontal field of view and 4 plastic aspheric lenses.
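
The stop arithmetic here is simple enough to show: each full stop multiplies the f-number by sqrt(2), and half-stops sit at quarter-powers of two.

```python
import math

def f_number(stops):
    """F-number 'stops' full stops down from F/1.0 (each stop = sqrt(2))."""
    return math.sqrt(2) ** stops

print(round(f_number(3), 2))    # F/2.83, conventionally written F/2.8
print(round(f_number(2.5), 2))  # F/2.38, the F/2.4 half-stop below it
# The EXIF-reported F/2.7 falls between the two and matches neither scale value.
```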

As an aside, if this whole system sounds familiar, it’s because the Sony IMX105 module with F/2.4 optics is the oft-rumored camera going into the next iPhone.

So that brings me to the infamous magenta circle issue which numerous people have reported seeing on their SGS2s. The last time we saw this was with the iPhone 4, where a green circle is readily apparent under certain light conditions or when photographing a homogeneous color or texture. Some users have reported seeing a similar magenta circle on the SGS2 camera when photographing under similar conditions, so I set out to replicate it.

The closest I can get to the magenta circle

For better or worse, I can't see the magenta circle on the SGS2 we were given, though I don't doubt that some devices do show it. It doesn't take much to extrapolate and conclude that this is in part due to what's becoming a CMOS lottery - now not only is there a display lottery (like what notebook buyers have been dealing with for a long time), but a CMOS lottery for sensors as well.

The magenta circle I see on this SGS2 is faint and nowhere near as pronounced as the green iPhone 4 circle, nor the SGS2-captured images I’ve seen online. Further, I haven’t been able to devise a method to tell which of the two possible sensors are inside this particular SGS2. I’ve taken some photos of completely white objects at a variety of focus positions and under different lighting conditions for your own perusal.

We’ve done the usual thing too and taken photos with the SGS2 inside our lightbox test scene, with the lights on and lights off. With the lights on, the SGS2 has a hard time nailing white balance with the test illuminated in auto mode, and in manual mode (set appropriately) it still has the wrong color temperature. This is just a bit unfortunate since otherwise sharpness is excellent, there’s little noise, and little to no chromatic fringing at the edges. I’m very impressed with camera performance here and would encourage viewing those images 1:1.

The front facing camera delivers performance that looks surprisingly good - so good that it could pass for the rear facing camera quality (resolution notwithstanding) of some previous generation devices.

With the lights out, the SGS2’s single LED flash illuminates the test scene nicely and gives good color temperature. SGS2 also does the right thing and fires up the LED for autofocus in the dark.

Next, we took photos with the SGS2 at the usual test locations, and it’s here that SGS2’s camera really shines. As a reminder, test locations 3, 4, 6, and 7 are the only ones remaining that I can visit, so skip 1, 2, and 5. SGS2 just really has great well-corrected sharpness and performance even out at the edges where aberrations take off, good colors without insane saturation, and great dynamic range.

Finally, I captured a large number of miscellaneous photos with the SGS2's rear facing camera as well. I think in these real-world scenarios we get to see a better example of the SGS2's camera performance, which is extremely good among the smartphones we've seen so far. Samsung also doesn't make the mistake of putting the outermost element of the camera system behind a piece of plastic integrated into the battery cover. Instead, the module juts out through the battery cover in a way that doesn't allow dirt and dust to collect.



Inside SGS2

There have been a number of teardowns of the SGS2, and for the most part there isn't a need to open up devices unless the FCC photos don't suffice. This time around I still had a number of questions about what component choices had been made for the SGS2, so it went under the screwdriver (and emerged unscathed) just so we could get a glimpse at the goods.

The SGS2 comes apart easily enough, with a few Phillips screws (no Torx bits) on the back and then a couple of prods with a plastic tool to get the snaps off. Construction in this regard is very similar to the original SGS. After that, you have access to the PCB, backside, and frame.

You can see how SGS2 achieves its thinness by looking at the layout. The PCB doesn’t run underneath the battery - the majority of device thickness is defined by the SAMOLED+ panel plus battery thickness.

In addition, the SGS2 locates the cellular antenna at bottom in a modular speaker plus antenna module that snaps in and out of the plastic backside. The depth of SGS2 at its thickest seems governed by this speaker and its resonating chamber. Off to the other side is the bottom microphone, cellular feed cable, and gold contacts for getting that connected to the silver antenna.

There's a second antenna on the opposite side of the module, which is for Bluetooth and WiFi.

If we turn our attention to the PCB we can see the rest of SGS2’s interesting bits.

The EMI cans thankfully snap off easily, and underneath we can see right next to the microSD card slot is the Infineon/Intel X-Gold 626 HSPA+ (HSDPA Cat. 14 - 21 Mbps / HSUPA Cat. 7 11.5 Mbps) baseband.

On the opposite side is the Audience 1026 noise cancelation IC, MAX8997 PMIC, and Yamaha YMU823 audio codec.

On the same side further down is the GPS that SGS2 uses, which is a SiRF GSD4T GPS. That particular die is absolutely tiny and difficult to photograph. More on the SGS2 GPS in a moment, however.

The other side of the PCB is much more interesting.

With the cans off, first we get a shot of Exynos 4210 with its PoP memory. This particular part has two mobile LPDDR2 die in the stack. Next to it, a Samsung combo NAND + DRAM part, with 16 GB of NAND and 64 MB of RAM, no doubt for the Infineon baseband.

Moving right is the Infineon Smarti UE2 RF transceiver marked 5712, and the large IC below that marked RFMD RF6260 is a quad-band, multi-mode power amp, which is a bit interesting. It works between 1710 and 1980 MHz, and 824-915 MHz, somewhat supplanting the need for individual power amps for each band.

Down on the long and skinny part of the PCB is a large package which I believe probably houses the BCM4330 WLAN module (more on that later), and next to it is a button cell battery, which seems curious.

Now remember that camera part I mentioned earlier? Well, at the top of the board you can see a ZIF slot, a relatively large IC, and then the camera module.

Compare these two, and it seems pretty obvious that this is exactly that same camera module from earlier, and most likely the large IC with Korea written on it is the Fujitsu M5MO ISP which controls it.

Also up at the very top is another gold connector which meets up with the SGS2’s third antenna, whose purpose is either for WLAN/GPS or another Rx finger for the cellular baseband.

The circular thing with a foam backside is the SGS2’s vibration motor, and other than that there really isn’t much more to talk about. Heat gets carried away from ICs through the EMI cans which double as heatsinks, and on the backside of the plastic back you can see a metal region and small thermal pad.



Cellular

So we've already mentioned that the SGS2 contains Intel/Infineon's latest and greatest X-Gold 626 baseband, which supports HSDPA 21.1 Mbps (Category 14) and HSUPA 11.5 Mbps (Category 7), though the SGS2 actually only supports HSUPA 5.76 Mbps (Category 6) according to Samsung. Of course, this international edition includes quadband UMTS and GSM support.
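
For reference, the peak rates of the UE categories that come up in this section can be collected into a small lookup (values from the 3GPP category tables; only the categories discussed here are included):

```python
# Peak rates in Mbps for the 3GPP UE categories mentioned in this review
HSDPA_MBPS = {14: 21.1}
HSUPA_MBPS = {2: 1.46, 3: 1.46, 5: 2.00, 6: 5.76, 7: 11.5}

def link_caps(hsdpa_cat, hsupa_cat):
    """Peak downstream/upstream rates for a device's category pair."""
    return HSDPA_MBPS[hsdpa_cat], HSUPA_MBPS[hsupa_cat]

# What the X-Gold 626 silicon supports (Cat 14 / Cat 7)
# versus what the SGS2 actually enables (Cat 14 / Cat 6)
print(link_caps(14, 7))  # (21.1, 11.5)
print(link_caps(14, 6))  # (21.1, 5.76)
```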

I used the SGS2 on AT&T in the USA, and my particular market only has PCS 1900 MHz support, meaning both the GSM and WCDMA carriers sit in the PCS 1900 MHz band. I remember that one of the first things I did with the SGS2 at MWC was check whether all of the same excellent dialer codes worked, and thankfully they do.

  

Samsung continues to have the absolute best field test / engineering menus of any handset vendor, and on the SGS2 dialing *#0011# gives you access to information about the current connected carrier, band, RRC state (what signaling state you're in), and signal (Ec/Io and RSCP at the bottom). There's a field marked HSPA+ used which I think has confused some people - this shows 1 when data is being transacted (DCH state). I should also mention that I'm incredibly grateful that the SGS2 shows all the correct and proper status indicators for network connectivity at the top - 3G, H, and H+ as appropriate - instead of following the US trend of calling every UMTS connectivity state "4G" - ugh. As an aside, it's normal to see 3G when in the idle state, and then a negotiation up to H+ when in the DCH (Dedicated CHannel) state if you're on an HSPA+ network. I haven't seen H+ show when in the FACH (Forward Access CHannel) state.

Samsung Galaxy S 2 - Network Support
GSM/EDGE Support: 850 / 900 / 1800 / 1900 MHz
UMTS/HSDPA/HSUPA Support: 850 / 900 / 1900 / 2100 MHz
HSDPA/HSUPA Speeds: 21 Mbps / 5.76 Mbps
Baseband Hardware: Infineon/Intel X-GOLD 626 HSPA+

I ran 318 speedtests on the SGS2 using the Ookla Speedtest.net application, and did our usual thing and came up with a histogram showing throughput for those tests. Again, this is more indicative of AT&T speed than what the SGS2 is capable of, given that other SGS2 users are seeing much faster speeds on other WCDMA networks - I'm insanely jealous of all of you. I tested throughout my 1900 MHz market in Tucson, in Phoenix, and on the positively dreadful 850 / 1900 WCDMA network in Las Vegas, which remains completely unusable even when CES or any other conference isn't happening. But I digress.

First up is downstream, which develops a nice little normal distribution when you run enough tests like we’ve done here.

Downstream Performance

Again, this is really more indicative of what you're going to see in the markets I've tested in with AT&T. Speeds top out at 7 or 8 Mbps if you're very lucky, with performance most of the time between 2 and 4 Mbps. The average here is 3.11 Mbps, with a standard deviation of 1.56 Mbps. That sounds about right to me given how many of these things I run when I'm not even testing a phone.
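
For the curious, the summary statistics quoted above are just the sample mean and standard deviation of the throughput runs; a minimal sketch, with hypothetical sample values standing in for the real 318-test dataset:

```python
import statistics

def summarize(mbps_samples):
    """Sample mean and standard deviation, as quoted for the histogram."""
    return statistics.mean(mbps_samples), statistics.stdev(mbps_samples)

# Hypothetical throughput samples (Mbps), not the actual measurements
samples = [2.1, 3.4, 2.8, 4.0, 1.9, 3.2, 5.1, 2.6]
mean, sd = summarize(samples)
print(f"{mean:.2f} Mbps mean, {sd:.2f} Mbps std dev")
```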

I’m also aware of the whole AT&T HSDPPB (“4G” unlimited data) versus DPPB (3G unlimited data) SOC code thing and the corresponding difference in APN. I used them interchangeably for a week or so and honestly didn’t see any difference.

Upstream is next, where AT&T continues to employ lots of artificial shaping, limiting upstream to at maximum 1.7 Mbps.

Upstream Performance

 

I've heard speculation that AT&T is limiting the HSUPA category to 2 or 3 (which is 1.46 Mbps) or category 5 (2.00 Mbps), but neither of those lines up nicely with the artificial-looking wall that seems to exist on AT&T at 1.7 Mbps. I'm very positive, however, that there's shaping going on here; the last remaining question is whether it's enforced by only allowing a certain HSUPA category, or by shaping somewhere else in the network. The latter of the two would make sense to me. It's disappointing because there's definitely potential for much speedier upstream than what I see here.

Last is latency, which looks pretty typical, though there are some outliers in the data, entirely from the abysmal Las Vegas performance tests:

Latency

Average latency works out to be 147 ms, which is pretty par for UMTS as far as I’m concerned, unless you’re lucky enough to be somewhere with much better backhaul and a flatter IP-based network architecture.

For the most part, I'm very pleased with the SGS2's cellular connectivity situation, though there's a bit more to talk about. I noticed that sometimes cellular connectivity will stop and become unresponsive for anywhere from a few seconds to minutes at a time, requiring a battery pull or lots of patience before working again. Toggling airplane mode doesn't work when that happens, and usually the problem manifests as the data-type indicator disappearing. I'm not sure what the story is here, but lately it seems like I've seen a lot of Samsung phones having data sessions randomly lock up and then come back after a while.

In addition, Samsung makes the mistake of going with a signal bar visualization with very compressed dynamic range. Since the whole iPhone 4 debacle, I’ve seen something of a trend towards a strict linear scale (which makes more sense), but SGS2 definitely doesn’t go that route. It’s not a huge deal however, just something to be aware of. I’m willing to overlook that issue considering that getting the real story on connectivity is no harder than dialing *#0011# and looking at the real number.

I’ve also read a bunch of accounts which claim that the SGS2 has iPhone 4-like deathgrip, which needless to say piqued my interest. Of course, I’ve been religiously measuring unintended signal attenuation on every device I’ve encountered ever since, so the SGS2 doesn’t get spared that treatment.

Signal Attenuation Comparison in dB - Lower is Better
  Cupping Tightly Holding Naturally Holding in Case On an Open Palm
Samsung Galaxy S 2 18.4 5.9 - 12.2
Droid 3 16.0 11.3 - 5.0
HTC Sensation 15.0 10.0 8.0 0.0
Samsung Droid Charge 10.0 10.0 5.0 0.0
HTC Thunderbolt - LTE 5.3 2.5 - 4.4
HTC Thunderbolt - EVDO 6.5 0.8 - 7.2
Verizon iPhone 4 16.5 15.5 9.0 7.9
LG Optimus 2X 13.7 9.3 - 5.9
Nexus S 13.3 6.1 - 4.3
Droid 2 11.5 5.1 - 4.5
BlackBerry Torch 15.9 7.1 - 3.7
Dell Streak 14.0 8.7 - 4.0
Droid X 15.0 5.1 - 4.5
AT&T iPhone 4 24.6 19.8 7.2 9.2
iPhone 3GS 14.3 1.9 3.2 0.2
HTC Nexus One 17.7 10.7 7.7 6.7

The data is actually quite interesting, with the SGS2 showing more than the ~15 dB average worst-case attenuation, and an unusually high open-palm result as well. If you go back to the disassembly and look at that antenna module, you can start to see why. It's located right in the plastic bulge, and the active region of the antenna printed on the plastic is separated from the exterior by less than a millimeter. The result is that although there's obviously no galvanic contact (there's a plastic insulating layer between), there is still some coupling and attenuation in the near field right there.

I honestly don’t think it’s an iPhone 4-level problem at ~18 dB in this worst case (which I’ll remind you literally involves both hands clasped around the device as close as possible), but it’s still more than average.
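
To put those logarithmic figures in perspective, converting attenuation in dB back to a raw power ratio makes the differences concrete:

```python
def db_to_power_ratio(db):
    """How many times weaker the received signal power is after attenuation."""
    return 10 ** (db / 10)

print(round(db_to_power_ratio(18.4)))  # SGS2 worst case: ~69x less power
print(round(db_to_power_ratio(24.6)))  # AT&T iPhone 4 worst case: ~288x
print(round(db_to_power_ratio(15.0)))  # the ~15 dB typical worst case: ~32x
```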



WiFi, GPS

The SGS2 has both 2.4 and 5 GHz WiFi connectivity courtesy of Broadcom's newest combo chip, the BCM4330. It's the logical successor to the BCM4329, which we saw adopted almost universally in the previous generation, from the iPhone 4 to SGS1. BCM4330 is still a single spatial stream combo solution, but what's different is that alongside Bluetooth 4.0+HS support there's an on-chip power amp for 5 GHz WLAN in addition to last gen's 2.4 GHz power amp (or another version which had both). There's also still FM receive and transmit support.

That puts the SGS2 on an incredibly short list of smartphones that include 5 GHz WLAN support, which is critical going forward as the 2.4 GHz ISM band gets even more crowded. I'll spare you my usual rant about how 2.4 GHz turns into a completely unusable nightmare at every conference and trade show.

As with every other radio, we have to do the receive sensitivity dance and make sure nothing is broken. I tested the SGS2 alongside an SGS 4G at my house with both an Airport Extreme (5th Gen) and WRT54G-TM boosted to 184 mW. SGS2 WiFi reception on 2.4 GHz is darn near identical to the previous generation.

There's something deceptive about this, however: although Samsung has chosen to go the usual compressed-dynamic-range route with the cellular bars, the WLAN bars seem to be more linearized. Thus, where I'm used to seeing every other smartphone show maximum bars (until you're right about to fall off the network), the SGS2 actually doesn't lie to me and shows fewer bars. Until I ran around and looked at RSSI in dBm, I suspected the SGS2 had WLAN sensitivity issues where there don't appear to be any. One small thing I did notice is that the SGS2 (and BCM4330) seems to only connect at the long guard interval (e.g. 65 Mbps maximum for single stream, 20 MHz channels), where SGS1 and BCM4329 connected at 72 Mbps with the short guard interval.
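
Those 65 vs. 72 Mbps figures fall straight out of the 802.11n PHY rate math for MCS 7 (single stream, 20 MHz); the only difference is the guard interval appended to each OFDM symbol:

```python
def phy_rate_mbps(guard_interval_us):
    """802.11n MCS 7 PHY rate: 1 stream, 20 MHz, 64-QAM, rate-5/6 coding."""
    data_subcarriers = 52            # 20 MHz HT channel
    bits_per_subcarrier = 6 * 5 / 6  # 64-QAM carrying 6 bits, 5/6 coding
    symbol_us = 3.2 + guard_interval_us
    return data_subcarriers * bits_per_subcarrier / symbol_us

print(round(phy_rate_mbps(0.8), 1))  # long GI: 65.0 Mbps (what SGS2 negotiates)
print(round(phy_rate_mbps(0.4), 1))  # short GI: 72.2 Mbps (what SGS1 managed)
```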

In practice, actually using the 5 GHz radio on SGS2 is a bit challenging, since Android 2.x has no proper prioritization for 5 GHz over 2.4 GHz when presented with the same SSID running on both bands. I’d obviously like to see the less-crowded 5 GHz band used before the more crowded 2.4 GHz band.

WiFi Performance

Performance on 2.4 GHz with BCM4330 in SGS2 is scorching, at 34.6 Mbps when downloading a 100+ MB PDF over the local network. On 5 GHz performance drops a bit for some reason.

GPS

To say that SGS1’s GPS was a disaster is a bit of an understatement, at least on the variants that I got my hands on. What’s worse, for a lot of those phones, GPS is still broken to this day. Thankfully Samsung learned from that experience and didn’t make the same mistake twice, and SGS2 has a different GPS entirely and much better time to first fix as a result.

I mentioned it earlier, but the SGS2 uses a SiRFstarIV GSD4t GPS this time around. It's a bit interesting that Samsung is using a discrete GPS considering that Exynos 4210 has its own integrated GPS baseband.

 

I measured time to a warm fix at around 3–5 seconds with the AGPS data already downloaded, which is pretty in line with modern devices. From a cold start, it’s anywhere between 10–15 seconds, though sometimes faster. I’ve seen faster on some other phones I won’t name, but GPS works this time around, and works well. I took the SGS2 on a 7-hour long road trip with me and used its GPS continually with no issues.

Call Audio

Inside the SGS2 is an Audience A1026 voice processor, which rejects noise that is common to the primary microphone at the bottom of the SGS2 and the secondary microphone at the top. Discrete noise canceling solutions are pretty par for the course lately, and it's good to see the SGS2 isn't excluded from that trend.

Inside the SGS2's excellent ServiceMode menu is an option to enable and disable Audience processing, which naturally we explored. I recorded a call placed from the SGS2 in the presence of very loud background noise with Audience turned on and off, running the same test each time, and you can hear the difference between the two. Only at the most extreme ambient volume level is background noise noticeable on the SGS2.

Samsung Galaxy S II - Noise Rejection with Audience A1026 by AnandTech

Samsung Galaxy S II - Noise Rejection with Audience A1026 Disabled by AnandTech

Call quality on the SGS2 is a bit more interesting. I placed a test call to the local ASOS and recorded it over line-in as we've done before. Inside the ServiceMode menu, the device will even show what type of voice coder is used given present network conditions, which is AMR-NB on AT&T.

Samsung Galaxy S 2 - AMR-NB on AT&T by AnandTech

Here the SGS2 doesn’t sound quite as good as other phones I’ve recorded on UMTS, unfortunately.

Speakerphone Volume

Speakerphone is the last thing on the list, and unfortunately at maximum volume during a voice call, the SGS2 doesn't measure extremely well. This is puzzling, since for navigation and other system sounds, the SGS2 is very loud. Clearly something isn't set properly, and the SGS2 has the potential to be louder on speakerphone for calls with appropriate tweaking.

Speakerphone Volume



Intro

We've asked Francois Simond (supercurio), creator of the very popular project-voodoo and voodoo-sound improvement packages and an Android hacker focused on sound, video, and image, to set the record straight on the Samsung Galaxy S 2's sound quality. In addition, Francois will help us test smartphone and mobile-device sound quality and continue contributing as it quickly becomes an important industry focus.

Context

Galaxy S II comes with a lot of expectations in the audio department. Samsung's previous flagship family, Galaxy S (aka Vibrant, Captivate, Fascinate, Galaxy S 4G, and Epic 4G in the US), set the bar high using a good quality implementation of the Wolfson Micro WM8994 codec. Existing custom mods which tuned WM8994 usage have even been able to push the quality, as well as the headphone output levels, higher than most expected.


Yamaha YMU823 - Encircled in Red Above

For the Galaxy S II, however, Samsung changed audio IC suppliers, turning to the popular Japanese brand Yamaha, which is making a big entry into the low power smartphone codec market. The exact chip used in Galaxy S II devices is named C1-YMU823 (also referred to as MC-1N2).

Its datasheet is not public, but it’s a chip designed to compete with the latest Wolfson and TI offerings, and probably a custom product built to Samsung’s requirements.

As growing reader interest and the recent HTC and Beats by Dre strategic alliance show, smartphone audio capabilities are constantly gaining importance. Of course, solid voice call performance remains a major concern, but many smartphone owners now use their device as a primary music player, sometimes with high-end headphones.

Audio Performance

Music

Demanding enthusiasts expected the Galaxy S II, as a music player, to sound at least as good as its Apple competitor, the iPhone 4, and to surpass the performance of its older brother, the Galaxy S.

Unfortunately, it fails at both.

As a music player, the Galaxy S II’s performance can be described as:

  • Average for a smartphone.
  • Below average for a high end smartphone.

Yamaha’s MC-1N2 codec has some nice theoretical specs, but the promised sound fidelity turns into a boring rendering affected by several outstanding issues, keeping Samsung’s flagship far from the audiophile category. Worse, only some of those issues can be remedied with additional equipment like an attenuator or an active headphone amplifier.

The Galaxy S II’s audio output as a music player has several issues

Audible CPU and Radio noise

Today most listeners enjoy music with isolating earphones, which are useful for listening to music or podcasts in loud environments without having to pump the volume up and introduce listening fatigue. Most in-ear gear is highly sensitive; combined with low impedance and 20dB of isolation, hiss and other noise quickly become noticeable.

The Galaxy S II is not recommended for driving sensitive in-ears directly, because you’ll easily hear the CPU working. Fixing the CPU frequency at its maximum (rooted phones only) doesn’t prevent this annoying noise, which reminds us of cheap integrated audio codecs from a dozen years ago. Admittedly, the hiss and noise levels of the Galaxy S II’s headphone output are a lot lower than those were, but today’s standard mobile headphones reveal them easily.

If you’re using sensitive in-ear headphones, GSM / EDGE radio noise is just as audible, indicating a probable hardware design flaw in the codec or on the board. The culprit is poor EMI shielding.

Description

Galaxysii-cpu-edge-noise-volume1-volume0 by AnandTech

  • 0:00 to 0:02: sound card noise only
  • 0:02: pop on codec power up, music starts to play. EDGE active, volume 1/15
  • 0:02 to 0:35: music playing; you can hear GSM and CPU noises underneath the signal
  • 0:35: volume set from 1/15 to 0/15. Music keeps playing in the background but is silent
  • 1:03: WiFi enabled, EDGE automatically disabled. Moving some UI elements

Due to its bursty nature, this flaw is hard to expose in measurements. Fortunately it is less audible with medium sensitivity headphones, and not audible at all with low sensitivity ones.

You can work around this issue with an attenuator or amplifier: maximize the digital Android output level and adjust the volume to your needs with the amp.

DAC Distortion

With today’s Android audio implementation, all kinds of media are sent to the DAC as a 44100Hz / 16-bit / stereo stream. Despite the use of a fixed rate, Yamaha’s codec is not able to provide a very clean output.

The Galaxy S II’s DAC output quality is limited by several kinds of distortion. So far, no firmware update has been able to fix them, despite early Korean updates describing “audio clarity improvements”.

When playing music, these artifacts are perceived as a “lack of clarity”, “reduction of stereo separation”, “loss of detail” and “lifeless sound” (the opposite of lively).


From 20 Hz to 20 kHz, dB: -0.42, +0.04
Galaxy S II Frequency response: no load (line in)

Not the best ever, but reasonably flat. The slight oscillation in frequency response starting at 1kHz gives a clue about what we’ll see in the next graphs.


Noise Levels
RMS power (A-weighted), dB: -95.1, -95.6
Peak level, dB FS: -71.7, -74.4


Dynamic Range

Dynamic range (A-weighted), dB: +95.4, +95.9

On the Noise Levels and Dynamic Range graphs, I added measurements of the Apple iPad and a reference sound card for comparison purposes.

What we see here is good performance. In theory, and measured under ideal conditions, the Galaxy S II has low noise levels and very good dynamic range. However, while noise levels are remarkably low at high frequencies, they increase at lower frequencies, which is neither expected nor a good sign, as those frequencies are more likely to be heard.


Total Harmonic Distortion
THD, %: +0.0036, +0.0035
THD + Noise, %: +0.0390, +0.0388
THD + Noise (A-weighted), %: +0.0425, +0.0424
THD + Noise (A) equivalent: -67.4 dB
Same Graph adding iPad results

THD is calculated by measuring the harmonics generated by the electronic circuits when a 1kHz signal is played. As you can guess from this graph’s shape, there’s an issue here.

In terms of sound and perception, harmonics add color to the sound, sometimes pleasingly, like the kind of distortion tube amps add.
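The calculation itself is easy to sketch. The following Python snippet (a simplified illustration, with invented harmonic levels standing in for a real DAC’s distortion) synthesizes a 1kHz tone plus small 2nd and 3rd harmonics, measures each harmonic’s level against the fundamental, and reports the ratio as THD:

```python
import math

RATE = 44100   # sample rate (Hz), matching Android's fixed output rate
F0 = 1000      # fundamental test tone (Hz)
N = 4410       # a tenth of a second: every tone below spans whole periods

# Synthetic test signal: a 1kHz sine plus small 2nd and 3rd harmonics,
# standing in for the distortion a real DAC would add (levels invented).
signal = [math.sin(2*math.pi*F0*n/RATE)
          + 0.001*math.sin(2*math.pi*2*F0*n/RATE)
          + 0.0005*math.sin(2*math.pi*3*F0*n/RATE)
          for n in range(N)]

def amplitude(samples, freq):
    """Magnitude of one DFT bin, via direct correlation."""
    re = sum(s*math.cos(2*math.pi*freq*n/RATE) for n, s in enumerate(samples))
    im = sum(s*math.sin(2*math.pi*freq*n/RATE) for n, s in enumerate(samples))
    return 2*math.hypot(re, im)/len(samples)

fund = amplitude(signal, F0)
harmonics = [amplitude(signal, k*F0) for k in range(2, 6)]
thd = math.sqrt(sum(h*h for h in harmonics))/fund
print(f"THD: {100*thd:.4f}%")   # → THD: 0.1118%
```

A real analyzer does the same thing with an FFT over the recorded output, which is why high noise between the harmonic bins can make the headline THD figure misleading.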

This graph shows all kinds of distortion introduced by Yamaha’s DAC, i.e.:

  • Jitter-like frequencies distributed around 1kHz
  • All sorts of unexpected spikes at higher frequencies

As the noise and other distortions are high, the THD measurement itself becomes somewhat irrelevant. Unfortunately, the value of 0.0036% is not representative of the actual performance here.

Other reviews might have used this THD value as the basis for invalid conclusions.


InterModulation Distortion 
IMD + Noise (A-weighted), %: +0.0655, +0.0655
IMD + Noise (A) equivalent: -63.7dB

What IMD + Noise means needs an explanation: I believe it’s safe to describe it as “all kinds of noise and distortion happening when you play a signal”, as opposed to the signal itself.

The importance of IMD + Noise is often underrated, as if it were useless because we already have another distortion value (THD). Still, it’s often more representative of general performance and of the perceived sound quality.

No matter how low the theoretical noise floor is, Yamaha’s MC-1N2 DAC has issues when playing signals, as you can guess from this latest comparison graph and the number of spikes indicating sound that shouldn’t be there. The iPad’s DAC is not perfect either, but it still provides a much cleaner sound. The reference DAC shows how the graph should look.

The -63.7dB level for “noise and distortions” is far from the performance expected from a last-gen audio IC.


InterModulation distortion + noise (swept freqs)
IMD + Noise at 5000 Hz: 0.0108, 0.0109
IMD + Noise at 10000 Hz: 0.0108, 0.0109
IMD + Noise at 15000 Hz: 0.0108, 0.0109

This test consists of a single sine sweep going from a very low frequency up to 22kHz.

Actually, when playing something as simple as a single frequency at a time, the result is not so bad; but music is rarely made of simple sines.


Stereo Crosstalk
dB: -82.3

My guess is that the Yamaha codec’s internal behavior is perturbed by a jittery clock source (the Exynos AP’s PLL clock). If not, it would mean the DAC design is flawed. The first hypothesis is more likely: implementation on a small, very low power board is always tricky.

You cannot avoid the distortions described here by using an external amplifier or any other equipment.

Exploring sound with spectrograms

A spectrogram lets you “see” sound, which is why I’ll use this colorful presentation to show you some examples of the Galaxy S II’s audio output.

Udial


Galaxy S II udial output sampled at 96kHz (FLAC sample)

Reference, re-sampled to 96kHz with sox
For comparison: Apple iPad udial output (FLAC sample)

udial is a very interesting sample that has been circulating in forums for more than ten years. I didn’t manage to find its author to thank him for his clever idea. udial is a very efficient stress test that makes it easy to check for clipping, re-sampling artifacts and some types of jitter.

The Galaxy S II’s performance is not terrible but not good either. Artifacts are audible and can be seen in this spectrogram. I must admit some are a mystery to me, like the “delayed” ones.

Bass sines

Simple bass sines allow checking a few things: clipping, unwanted EQ, bass boost or dynamic range compression, but also buggy DC servo setups.

The sample used here contains 7 tones: 100Hz, 80Hz, 60Hz, 50Hz, 40Hz, 30Hz, 20Hz.


Galaxy S II bass sine waves output (FLAC sample)


For comparison, Apple iPad bass sine waves output (FLAC sample)

Once again the Galaxy S II exhibits artifacts. If you download the associated FLAC recording of the phone’s output, you may be able to hear the artifacts that appear as lines on the spectrogram (more noticeable on the last two notes).

But it’s not all bad: the spectral representation of Samsung’s phone exposes a nice hardware feature called the Digital Noise Gate.

The DNG analyzes the signal being played and quickly shuts down parts of the codec, which helps reduce perceived hiss and trims power consumption a little.

Eventually (after about 2 seconds of silence) the entire audio hardware is shut down by the Android OS, but the best part is that the Digital Noise Gate is extremely effective as an anti-pop measure.

The MC-1N2’s performance is class-leading in this regard.
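As a rough software analogue of what a noise gate does (the threshold and hold values below are illustrative, not Yamaha’s), here is a minimal Python sketch that mutes the output once the signal has stayed below a threshold for long enough:

```python
import math

THRESHOLD = 1e-4      # gate closes when the signal stays below this level
HOLD = 100            # samples of near-silence tolerated before muting

def noise_gate(samples):
    """Mute output after HOLD consecutive below-threshold samples."""
    out, quiet = [], 0
    for s in samples:
        quiet = 0 if abs(s) > THRESHOLD else quiet + 1
        out.append(0.0 if quiet > HOLD else s)
    return out

# Loud burst followed by low-level hiss: the hiss gets muted.
burst = [math.sin(0.1*n) for n in range(200)]
hiss = [1e-5]*500
gated = noise_gate(burst + hiss)
print(any(x != 0 for x in gated[:200]), all(x == 0 for x in gated[-300:]))
```

The hardware version goes further than muting: it powers down stages of the codec entirely, which is where both the hiss reduction and the power savings come from.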

Headphone amp

The Galaxy S II’s built-in headphone amp is able to drive in-ears or full size cans to satisfying levels, and I know for some people mobile devices are never LOUD enough. Samsung’s flagship is louder than the iPhone, the iPad, and the first-gen Galaxy S.

This part may become a dedicated article but here are some facts already:

At Max level (15/15):

AC voltage, no load (line-in): 0.703V

Driving Sennheiser HD 650 (300 Ω)

  • AC voltage under load: 0.621 V
  • Power (left+right): 2.11 mW

Driving Head-Direct RE0 (64 Ω per the specs; mine measured at 58 Ω)

  • AC voltage under load: 0.381 V
  • Power (left+right): 4.14 mW

The question is: is it loud enough? When driving the HD 650 it reaches comfortable listening levels but won’t reach “loud” levels. For the RE0 and most consumer headphones, yes, it’s loud enough.

Remember that isolating headphones are recommended in loud environments, rather than pumping the volume up too high and damaging your hearing.

To me the quality of the amp itself is average-to-okay. It’s hard to say much about it, as most of the time it’s amplifying the signal sent by a flawed DAC.

As the amount of distortion rises at the highest levels, the MC-1N2 amp’s power stage implementation on the Galaxy S II board probably does not have much headroom. More measurements would be welcome here.

High output impedance

One characteristic of the Galaxy S II’s headphone driver is its relatively high output impedance, higher than competing chips. Power efficiency diminishes as output impedance grows, with energy lost as heat, which is why Yamaha’s design choice is surprising.

I measured mine at 49 Ω.

A notable side effect of this characteristic is that the headphone output becomes less loud with low impedance cans. 

Output impedance doesn’t have much effect on 300 Ω gear, but on common 16 Ω earphones:

  • Gain is lowered and hiss level is reduced: a nice bonus
  • The frequency response shifts: less bass, more highs. Okay if the tiny speakers were too bassy, terrible if the headphones were already bright.

As a result, the Galaxy S II might play well with some equipment but reveal the worst side of other gear with a harsh and aggressive rendering.

This output impedance is why opinions about the Galaxy S II’s audio are so contradictory in forums: the experience can differ vastly depending on your choice of headphones.
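The loudness side effect is plain voltage divider math: the output impedance and the headphone form a divider, so the fraction of the signal that actually reaches a low impedance load shrinks. A quick sketch using the 49 Ω figure measured above:

```python
import math

Z_OUT = 49.0   # measured output impedance of the Galaxy S II (ohms)

def level_into_load(z_load, z_out=Z_OUT):
    """Voltage reaching the headphone, in dB relative to open-circuit."""
    return 20*math.log10(z_load/(z_load + z_out))

for z in (300, 64, 16):
    print(f"{z:>3} ohm load: {level_into_load(z):6.2f} dB")
```

Roughly -1.3dB into 300 Ω but more than -12dB into 16 Ω, which matches the observation that low impedance cans play quieter. The frequency response shift follows from the same math once you account for a headphone whose impedance varies with frequency: the divider ratio then varies with frequency too.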

Appreciation note

There are many issues and flaws listed and demonstrated in this article. However, that doesn’t mean the Galaxy S II is unable to play music.

Compared to the average Android phone it probably sounds better already. These explanations are meant to show where there’s headroom for improvement in the next generation of devices.

We're more than willing to discuss audio testing methodology with manufacturers to help improve the next generation of phones.

Samsung Music Player

Samsung updated the look and feel of its replacement for the stock Google music player with TouchWiz 4.0.

It still supports FLAC natively, which is a nifty feature for audiophiles, and its simple, efficient interface will satisfy most users.

This pre-installed music player still lacks a much desired gapless playback feature, which means listening to mixes, concert recordings or classical music won’t be as pleasing as it should be.

You’ll find an 8 band graphic EQ in the settings, as well as EQ presets and additional sound effects adding reverberation, spatialization or stereo enhancements. Some effects are interesting if you’re not looking for hi-fi accurate sound reproduction. However, I would suggest staying away from the EQ presets as much as possible:

The EQ engine’s suboptimal DRC implementation makes constant volume changes audible.

If you really need EQ, one workaround is using the graphic EQ manually instead: only attenuate, never boost, the frequencies you find too prominent. This is in general the best way to use an EQ to find a more balanced response and correct headphone or speaker behavior.

This tip will avoid triggering the DRC.
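The attenuation-only approach can be expressed in a couple of lines: shift the whole desired EQ curve down so its highest band sits at 0dB. The relative shape, and therefore the perceived correction, is unchanged, but no band is boosted (the band values below are just an example curve):

```python
def attenuation_only(gains_db):
    """Shift an EQ curve so its peak band is 0 dB: same shape, no boost."""
    peak = max(gains_db)
    return [g - peak for g in gains_db]

wanted = [0, 3, 6, 2, 0, -2, 1, 0]   # naive 8-band preset with boosts
safe = attenuation_only(wanted)
print(safe)   # → [-6, -3, 0, -4, -6, -8, -5, -6]
```

The overall level drops by the former peak gain, but since nothing exceeds 0dB there is no clipping risk and no reason for the DRC to engage; you simply turn the volume up to compensate.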


Typical graphic EQ setup using only negative gains for frequency response correction

As usual with Android OS, you’ll be able to set an alternative music player as the default one without any restriction, adding gapless playback, support for more codecs, or a different UI.



The Fastest Smartphone SoC Today: Samsung Exynos 4210

Samsung has been Apple's sole application processor supplier since the release of the original iPhone. It's unclear how much Samsung contributes to the design process, especially with later SoCs like the A4 and A5 carrying the Apple brand. It's possible that Samsung is now no more than a manufacturing house for Apple.

Needless to say, the past few years of supplying SoCs for the iPhone and iPad have given Samsung a good idea of what the market wants from an application processor. We first got the hint that Samsung knew what it was up to with its Hummingbird SoC, used in the Galaxy S line of smartphones.

Hummingbird featured a 1GHz ARM Cortex A8 core and an Imagination Technologies PowerVR SGX 540 GPU. Although those specs don't seem very impressive today, Hummingbird helped Samsung ship more Android smartphones than any of its competitors in 2010. At a high level, Hummingbird looked a lot like Apple's A4 used in the iPad and iPhone 4. Its predecessor looked a lot like Apple's 3rd generation SoC used in the iPhone 3GS.

Hummingbird's successor however is Samsung's first attempt at something different. This is the Exynos 4210 application processor:

We first met the Exynos back when it was called Orion at this year's Mobile World Congress. Architecturally, the Exynos 4210 isn't too far from Apple's A5, NVIDIA's Tegra 2 or TI's OMAP 4. This is the same CPU configuration as all of the aforementioned SoCs, with a twist. While the A5, Tegra 2 and OMAP 4 all have a pair of ARM Cortex A9 cores running at 1GHz, Exynos pushes the default clock speed up to 1.2GHz. Samsung is able to hit higher clock speeds either through higher than normal voltages or as a result of its close foundry/design relationship.


Exynos 4210 with its PoP LPDDR2

ARM's Cortex A9 has configurable cache sizes. To date all of the A9 implementations we've seen use 32KB L1 caches (32KB instruction cache + 32KB data cache) and Samsung's Exynos is no exception. The L2 cache size is also configurable, however we haven't seen any variance there either. Apple, NVIDIA, Samsung and TI have all standardized on a full 1MB L2 cache shared between both cores. Only Qualcomm is left with a 512KB L2 cache but that's for a non-A9 design.

Where we have seen differences in A9 based SoCs are in the presence of ARM's Media Processing Engine (NEON SIMD unit) and memory controller configuration. Apple, Samsung and TI all include an MPE unit in each A9 core. ARM doesn't make MPE a requirement for the A9 since it has a fully pipelined FPU, however it's a good idea to include one given most A8 designs featured a similar unit. Without MPE support you run the risk of delivering an A9 based SoC that occasionally has lower performance than an A8 w/ NEON solution. Given that Apple, Samsung and TI all had NEON enabled A8 SoCs in the market last year, it's no surprise that their current A9 designs include MPE units.

NVIDIA on the other hand didn't have an SoC based on ARM's Cortex A8. At the same time it needed to be aggressive on pricing to gain some traction in the market. As a result of keeping die size to a minimum, the Tegra 2 doesn't include MPE support. NEON code can't be executed on Tegra 2. With Tegra 3 (Kal-El), NVIDIA added in MPE support but that's a discussion we'll have in a couple of months.

Although based on Qualcomm's own design, the Snapdragon cores include NEON support as well. Qualcomm's NEON engine is 128-bits wide vs. 64-bits wide in ARM's standard implementation. Samsung lists the Exynos 4210 as supporting both 64-bit and 128-bit NEON however given this is a seemingly standard A9 implementation I believe the MPE datapath is only 64-bits wide. In other words, 128-bit operations can be executed but not at the same throughput as 64-bit operations.

The same designs that implemented MPE also implemented a dual-channel memory controller. Samsung's Exynos features two 32-bit LPDDR2 memory channels, putting it on par with Apple's A5, Qualcomm's Snapdragon and TI's OMAP 4. Only NVIDIA's Tegra 2 features a single 32-bit LPDDR2 memory channel. 

ARM Cortex A9 Based SoC Comparison
  Apple A5 Samsung Exynos 4210 TI OMAP 4 NVIDIA Tegra 2
Clock Speed Up to 1GHz Up to 1.2GHz Up to 1GHz Up to 1GHz
Core Count 2 2 2 2
L1 Cache Size 32KB/32KB 32KB/32KB 32KB/32KB 32KB/32KB
L2 Cache Size 1MB 1MB 1MB 1MB
Memory Interface Dual Channel LP-DDR2 Dual Channel LP-DDR2 Dual Channel LP-DDR2 Single Channel LP-DDR2
NEON Support Yes Yes Yes No
Manufacturing Process 45nm 45nm 45nm 40nm

Like most of its competitors, Samsung's memory controller does allow for some flexibility when choosing memory types. In addition to LPDDR2, the Exynos 4210 supports standard DDR2 and DDR3. Maximum data rate is limited to 800MHz regardless of memory type.
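For a sense of what the channel count means in raw numbers, here's a quick calculation of peak theoretical bandwidth, assuming the quoted 800MHz figure is the LPDDR2 data rate (800MT/s), as is conventional:

```python
# Peak theoretical memory bandwidth, assuming (our reading, not a quoted
# spec) that the 800MHz maximum refers to the LPDDR2 data rate in MT/s.

def peak_bandwidth_gbps(channels, bus_width_bits, data_rate_mtps):
    bytes_per_transfer = bus_width_bits // 8
    return channels * bytes_per_transfer * data_rate_mtps / 1000  # GB/s

exynos = peak_bandwidth_gbps(channels=2, bus_width_bits=32, data_rate_mtps=800)
tegra2 = peak_bandwidth_gbps(channels=1, bus_width_bits=32, data_rate_mtps=800)
print(f"Exynos 4210: {exynos:.1f} GB/s, Tegra 2: {tegra2:.1f} GB/s")
```

Real-world sustained bandwidth is of course lower, but the 2x gap between a dual and single channel configuration at the same data rate carries over.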

Based on everything I've said thus far, the Exynos 4210 should be among the highest performing SoCs on the market today. It has the same clock for clock performance as an Apple A5, NVIDIA Tegra 2 and TI OMAP 4430. Samsung surpassed those designs by delivering a 20% higher operating frequency, which should be tangible in typical use.

To find out let's turn to our CPU performance suite. We'll start with our browser benchmarks: SunSpider and BrowserMark:

SunSpider Javascript Benchmark 0.9

Rightware BrowserMark

Despite the 20% clock speed advantage the Galaxy S 2 isn't any faster than Motorola's Droid 3 based on a 1GHz TI OMAP 4430. Unfortunately this doesn't tell us too much since both benchmarks take into account browser performance as well as total platform performance. While the Galaxy S 2 is clearly among the fastest smartphones we've ever reviewed it looks like Motorola's browser may actually be a bit more efficient at javascript execution.

Where we do see big gains from the Exynos' higher clock speed is in our Linpack tests. The single-threaded benchmark actually shows more scaling than just clock speed, indicating that there are other (possibly software?) factors at play. Either way it's clear that the 20% increase in clock speed can surface as tangible if the conditions are right:

Linpack - Single-threaded

Linpack - Multi-threaded

A clock speed advantage today is nice but it's something that Samsung's competitors will be able to deliver in the not too distant future. Where Samsung chose to really differentiate itself was in the graphics department. The Exynos 4210 uses ARM's Mali-400 MP4 GPU.

Shipping in smartphones today we have GPUs from three vendors: Qualcomm (Adreno), Imagination Technologies (PowerVR SGX) and NVIDIA (GeForce). Of those vendors, only Qualcomm and NVIDIA produce SoCs - Imagination simply licenses its technology to SoC vendors.

Both Apple and Intel hold significant amounts of Imagination stock, presumably to prevent an eager SoC vendor from taking control of the company.

ARM also offers GPU IP in addition to its CPU designs, however we've seen very little uptake until now. Before we get to Mali's architecture, we need to talk a bit about the different types of GPUs on the market today.



Understanding Rendering Techniques

It's been years since I've had to describe the differences in rendering techniques but given the hardware we're talking about today it's about time for a quick refresher. Despite the complexities involved in CPU and GPU design, both processors work in a manner that's pretty easy to understand. The GPU fundamentally has one function: to determine the color of each pixel displayed on the screen for a given frame. The input the GPU receives however is very different from a list of pixel coordinates and colors.

A 3D application or game will first provide the GPU with a list of vertex coordinates. Each set includes the coordinates for three vertices in space; these describe the size, shape and position of a triangle. A single frame is composed of hundreds to millions of these triangles. Literally everything you see on screen is composed of triangles:

Having more triangles (polygons) can produce more realistic scenes but it requires a lot more processing on the front end. The trend in 3D gaming has generally been towards higher polygon counts over time.

The GPU's first duty is to take this list of vertices and convert them into triangles on a screen. Doing so results in a picture similar to what we've got above. We're dealing with programmable GPUs now, so it's possible to run code against these vertices to describe interactions or effects on them. An explosion in an earlier frame may have caused the vertices describing a character's elbow to move. The explosion will also impact lighting on our character. There's going to be a set of code that describes how the aforementioned explosion impacts vertices and another snippet of code that describes which vertices it impacts. These code segments run and modify details of the vertices at this stage.

With the geometry defined the GPU's next job is rasterization: figure out what pixels cover each triangle. From this point on the GPU stops dealing in vertices and starts working in pixel coordinates.
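Conceptually, the coverage test can be sketched in a few lines. This toy Python rasterizer (a simplification: real hardware adds fill rules, clipping and sub-pixel precision) uses edge functions to decide which pixel centers fall inside a triangle:

```python
def edge(a, b, p):
    """Signed area term; positive when p is to the left of edge a->b."""
    return (b[0]-a[0])*(p[1]-a[1]) - (b[1]-a[1])*(p[0]-a[0])

def rasterize(v0, v1, v2, width, height):
    """Return the pixels whose centers lie inside a CCW-wound triangle."""
    covered = []
    for y in range(height):
        for x in range(width):
            p = (x + 0.5, y + 0.5)          # sample at the pixel center
            w0, w1, w2 = edge(v1, v2, p), edge(v2, v0, p), edge(v0, v1, p)
            if w0 >= 0 and w1 >= 0 and w2 >= 0:   # inside all three edges
                covered.append((x, y))
    return covered

pixels = rasterize((0, 0), (8, 0), (0, 8), 8, 8)
print(len(pixels), "pixels covered")   # → 36 pixels covered
```

The three edge values double as barycentric weights, which is how the hardware later interpolates depth, colors and texture coordinates across the triangle.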

Once rasterized, it's time to give these pixels some color. The color of each pixel is determined by the texture that covers that pixel and/or the pixel shader program that runs on that pixel. Similar to vertex shader programs, pixel shader programs describe effects on pixels (e.g. flicker bright orange at interval x to look like fire).

Textures are exactly what they sound like: wallpaper for your polygons. Knowing pixel coordinates the GPU can go out to texture memory, fetch the texture that maps to those pixels and use it to determine the color of each pixel that it covers.

There's a lot of blending and other math that happens at this stage to deal with corner cases where you don't have perfect mapping of textures on polygons, as well as dealing with what happens when you've got translucency in your textures. After you get through all of the math however the GPU has exactly what it wanted in the first place: a color value for every pixel on the screen.

Those color values are written out to a frame buffer in memory and the frame buffer is displayed on the screen. This process continues (hopefully) dozens of times per second in order to deliver a smooth visual experience.

The pipeline I've just described is known as an immediate mode renderer. With a few exceptions, immediate mode renderers were the common architectures implemented in PC GPUs over the past 10+ years. These days pure immediate mode renderers are tough to find though.


IMRs render the full car and the tree, even though part of the car is occluded

Immediate mode renderers (IMRs) brute force the problem of determining what to draw on the screen. They take polygons as they receive them from the CPU, manipulate and shade them. The biggest problem here is that although data for every polygon is sent to the GPU, some of those polygons will never be displayed on the screen. A character with thousands of polygons may be mostly hiding behind a pillar, but a traditional immediate mode renderer will still put in all of the work necessary to plot its geometry and shade its pixels, even though they'll never be seen. This is called overdraw. Overdraw unfortunately wastes time, memory bandwidth and power - hardly desirable when you're trying to deliver high performance and long battery life. In the old days of IMRs it wasn't uncommon to hear of 4x overdraw in a given scene (i.e. drawing 4x the number of pixels actually visible to the user). Overdraw becomes even more of a problem as scene complexity increases.
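A toy model makes the waste easy to count: treat each polygon as the set of pixels it covers and shade everything in submission order, as an IMR would (the pixel sets below are invented for illustration):

```python
def imr_shading_work(polygons):
    """Each polygon is fully shaded in arrival order, occluded or not."""
    work = {}
    for pixels in polygons:
        for p in pixels:
            work[p] = work.get(p, 0) + 1   # every covering polygon is shaded
    return work

car  = {(x, y) for x in range(4) for y in range(4)}      # drawn first
tree = {(x, y) for x in range(2, 6) for y in range(4)}   # occludes part of it
work = imr_shading_work([car, tree])
wasted = sum(n - 1 for n in work.values())
print(f"{wasted} of {sum(work.values())} shaded pixels never reach the screen")
```

Here 8 of 32 shaded pixels are overdraw from a single overlap; with several layers of geometry the wasted fraction grows quickly.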

Tile Based Deferred Rendering

On the opposite end of the spectrum we have tile based deferred rendering (TBDR). Immediate mode renderers work in a very straightforward manner. They take vertices, create polygons, transform and light those polygons and finally texture/shade/blend the pixels on them. Tile based deferred renderers take a slightly different approach.

TBDRs subdivide the scene into smaller tiles on the order of a few hundred pixels. Vertex processing and shading continue as normal, but before rasterization the scene is carved up into tiles. This is where the deferred label comes in. Rasterization is deferred until after tiling and texturing/shading is deferred even longer, until after overdraw is eliminated/minimized via hidden surface removal (HSR).

Hidden surface removal is performed long before we ever get to the texturing/shading stage. If the frontmost surface being rendered is opaque, there's absolutely zero overdraw in a TBDR architecture. Everything behind the frontmost opaque surface is discarded by performing a per-pixel depth test once the scene has been tiled. In the event of multiple overlapping translucent surfaces, overdraw is still minimized: only surfaces above the farthest opaque surface are rendered. HSR is performed one tile at a time; only the geometry needed for a single tile is depth tested to keep the problem manageable.

With all hidden surfaces removed then, and only then, is all texture data fetched and all pixel shader code executed. Rendering (or more precisely texturing and shading) is deferred until after a per-pixel visibility test is passed. No additional work is expended and no memory bandwidth wasted. Only what is visible in the final scene is rasterized, textured and shaded on each tile.
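The deferred approach can be sketched as a toy model too: within a tile, depth-test every polygon first, then shade exactly one fragment per visible pixel (pixel sets and depths are invented for illustration):

```python
def tbdr_shading_work(polygons, tile):
    """Depth-test all geometry in a tile first, then shade only survivors."""
    front = {}                           # pixel -> (depth, polygon id)
    for pid, (depth, pixels) in enumerate(polygons):
        for p in pixels & tile:          # only geometry touching this tile
            if p not in front or depth < front[p][0]:
                front[p] = (depth, pid)  # keep the nearest surface so far
    return len(front)                    # one shading pass per visible pixel

car  = (5.0, {(x, y) for x in range(4) for y in range(4)})
tree = (2.0, {(x, y) for x in range(2, 6) for y in range(4)})
tile = {(x, y) for x in range(8) for y in range(8)}
print(tbdr_shading_work([car, tree], tile), "pixels shaded, zero overdraw")
```

The same opaque scene costs 24 shading operations here versus 32 for the immediate mode approach, and the gap widens as depth complexity rises.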

The application doesn't need to worry about the order polygons are sent for rendering when dealing with a TBDR, the hidden surface removal process takes care of everything.

In memory bandwidth constrained environments TBDRs do incredibly well. Furthermore, the efficiencies of a TBDR really shine when running applications and games that are more shader heavy rather than geometry heavy. As a result of the extensive hidden surface removal process, TBDRs tend not to do as well in scenes with lots of complex geometry.

What's In Between Immediate Mode and Deferred Rendering?

These days, particularly in the mobile space, many architectures refer to themselves as "tile based". Unfortunately these terms can have a wide variety of meanings. The tile based deferred rendering architecture I described above really only applies to GPUs designed by Imagination Technologies. Everything else falls into the category of tile based immediate mode renderers, or immediate mode renderers with early-z.

These GPUs look like IMRs but they implement one or both of the following: 1) scene tiling, 2) early z rejection.

Scene tiling is very similar to what I described in the section on TBDRs. Each frame is divided up into tiles and work is done on a per-tile basis at some point in the rendering pipeline. The goal of dividing the scene into tiles is to simplify the problem of rendering and better match the workload to the hardware (e.g. since no GPU is a million execution units wide, you make the workload more manageable for your hardware). Also, by working on small tiles, caches behave a lot better.

The big feature that this category of GPUs implements is early-z rejection. Instead of waiting until after the texturing/shading stage to determine pixel visibility, these architectures implement a coarse test for visibility earlier in the pipeline.

Each vertex has a depth value and using those values you can design logic to find out what polygons (or parts of polygons) are occluded from view. GPU makers like ATI and NVIDIA introduced these early visibility tests years ago (early-z or hierarchical-z are some names you may have heard). The downside here is that early-z techniques only work if the application submits vertices in a front-to-back order, which does require extra work on the application side. IMRs process polygons in the order they're received, and you can't reject anything if you're not sure if anything will be in front of it. Even if an application packages up vertex data in the best way possible, there are still situations where overdraw will occur.
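The order dependence is easy to demonstrate with a toy z-buffer sketch: the same two polygons cost twice the shading work when submitted back-to-front (values invented for illustration):

```python
def shades_with_early_z(polygons):
    """Count shaded fragments when depth is tested before shading."""
    zbuf, shaded = {}, 0
    for depth, pixels in polygons:
        for p in pixels:
            if p not in zbuf or depth < zbuf[p]:   # early depth test
                zbuf[p] = depth
                shaded += 1                        # only survivors get shaded
    return shaded

near = (1.0, {(x, y) for x in range(4) for y in range(4)})
far  = (9.0, {(x, y) for x in range(4) for y in range(4)})  # fully occluded

print("front-to-back:", shades_with_early_z([near, far]))   # → 16
print("back-to-front:", shades_with_early_z([far, near]))   # → 32
```

Submitted front-to-back, the occluded polygon is rejected entirely before shading; submitted back-to-front, every one of its fragments is shaded and then overwritten, which is exactly why these architectures ask applications to sort geometry.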

The good news is you get some of the benefits of a TBDR without running into trouble should geometry complexities increase. The bad news is that a non-TBDR architecture will still likely have higher amounts of overdraw and be less memory bandwidth efficient than a TBDR.

Most modern PC GPUs fall into this category. Both NVIDIA's Fermi and AMD's Cayman GPUs do some amount of tiling although they have their roots in immediate mode rendering.

The Mobile Landscape

Now that we understand the difference between IMRs, IMRs with early-z, TBRs and TBDRs, where do the current ultra mobile GPUs fall? Imagination Technologies' PowerVR SGX 5xx is technically the only tile based deferred renderer that allows for order independent hidden surface removal.

Qualcomm's Adreno 2xx and ARM's Mali-400 both appear to be tile based immediate mode renderers that implement early-z. This is particularly confusing because ARM lists the Mali-400 as featuring "advanced tile-based deferred rendering and local buffering of intermediate pixel states". The secret is in ARM's optimization documentation that states: "One specific optimization to do for Mali GPUs is to sort objects or triangles into front-to-back order in your application. This reduces overdraw." The front-to-back sort requirement is necessary for most early-z technologies to work properly. These GPUs fundamentally tile the scene but don't perform full order independent hidden surface removal. Some aspects of the traditional rendering pipeline are deferred but not to the same extent as Imagination's design.

NVIDIA's GeForce ULP in the Tegra 2 is an IMR with early-z. NVIDIA has long argued that its design is the best for future games with increasing geometry complexities as a result of its IMR design.

Today there's no real benefit to not building a TBDR in the ultra mobile space. Geometry complexities aren't very high and memory bandwidth does come at a premium. Moving forward however, the trend is likely going to mimic what we saw in the PC space: towards more polygon heavy games. There is one hiccup though: Apple.

In the evolution of the PC graphics industry the installed base of tile based deferred renderers was extremely small. Imagination's technology surfaced in two discrete GPUs: STMicro's Kyro and Kyro II, but neither was enough to stop NVIDIA's momentum at the time. Since immediate mode renderers were the norm, games simply developed around their limitations. AMD and NVIDIA both eventually implemented elements of tiling and early-z rejection, but TBDRs never took off in PCs.

In the ultra mobile space Apple exclusively uses Imagination Technologies GPUs, which I mentioned above are tile based deferred renderers. Apple also happens to be a major player, if not the biggest, in the smartphone/tablet gaming space today. Any game developer looking to put out a successful title is going to make sure it runs well on iOS hardware. Game developers will likely rely on increasing visual quality through pixel shader effects rather than ultra high polygon counts. As long as Imagination Technologies is a significant player in this space, game developers will optimize for TBDRs.



The Mali-400

Now that we've settled the issue of what type of GPU it is, let's talk about the physical makeup of the Mali-400. The Mali-400 isn't a unified shader architecture, it has discrete execution hardware for vertex and fragment (pixel) processing. ARM calls the Mali-400 a multicore GPU with configurations available with 1 - 4 cores. When ARM refers to a core however it's talking about a fragment (pixel shader) processor, not an entire GPU core. This is somewhat similar to NVIDIA's approach with Tegra 2, although NVIDIA counts each vertex and fragment processor as an individual core.

In its simplest configuration the Mali-400 features a single combined geometry front end and vertex processor and a single fragment processor. The 400 is also available in 2 and 4 core versions, both of which still have only a single vertex processor. The two core version has two fragment processors and the four core version has four fragment processors. Note that ARM decided to scale fragment shading performance with core count while keeping vertex performance static. This is likely the best decision given current workloads, but a risky one. NVIDIA on the other hand standardized on a 1:1 ratio between fragment and vertex processors compared to ARM's 4:1 on a 4-core Mali-400. The 4-core Mali-400 MP4 is what Samsung uses in the Exynos 4210.

ARM, like Qualcomm, isn't particularly interested in having the details of its GPUs available publicly. Unfortunately this means that we know very little about the makeup of each of these vertex and fragment processors. I suspect that both companies will eventually learn to share (just as AMD and NVIDIA did) but as this industry is still in its infancy, it will take some time.

Earlier documentation on Mali revealed that the GPU is a VLIW architecture, meaning each processor is actually a collection of multiple parallel execution units capable of working on vector data. There's no public documentation indicating how wide each processor is unfortunately, but we can make some educated guesses.

We know from history that AMD felt a 5-wide VLIW architecture made sense for DX9 class games, later moving down to a 4-wide architecture for DX11 games. AMD didn't have the die constraints that ARM and other SoC GPU suppliers do so a 5-wide unit is likely out of the question, especially considering that Imagination settled on a VLIW4 architecture. Furthermore pixels have four color elements (RGBA), making a VLIW4 an ideal choice.

Based on this as well as some internal information we can assume that a single Mali fragment shader is a 4-wide VLIW processor. The vertex shader is a big unknown as well, but knowing that vertex processing happens on two coordinate elements (U & V) Mali's vertex shader is likely a 2-wide unit.
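To make the VLIW idea concrete, here's a toy model (not ARM's actual instruction set) of a 4-wide unit retiring one multiply+add per lane, one lane per RGBA channel:

```python
def vliw4_mad(a, b, c):
    # One VLIW4 "instruction": four independent multiply-adds issue and
    # retire together, one per vector lane (the R, G, B, A channels).
    return tuple(ai * bi + ci for ai, bi, ci in zip(a, b, c))

# Example: modulate a texel by a light color and add ambient in one issue slot.
texel   = (0.5, 0.25, 1.0, 1.0)
light   = (2.0, 2.0, 2.0, 1.0)
ambient = (0.1, 0.1, 0.1, 0.0)
lit = vliw4_mad(texel, light, ambient)
```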

Thus far every architecture we've looked at has been able to process one FP16 MAD (multiply+add) per execution unit per clock. If we make another assumption about the Mali-400 and say it can do the same, we get the following table:

Mobile SoC GPU Comparison
| | PowerVR SGX 535 | PowerVR SGX 540 | PowerVR SGX 543 | PowerVR SGX 543MP2 | Mali-400 MP4 | GeForce ULP | Kal-El GeForce |
|---|---|---|---|---|---|---|---|
| SIMD Name | USSE | USSE | USSE2 | USSE2 | Core | Core | Core |
| # of SIMDs | 2 | 4 | 4 | 8 | 4 + 1 | 8 | 12 |
| MADs per SIMD | 2 | 2 | 4 | 4 | 4 / 2 | 1 | ? |
| Total MADs | 4 | 8 | 16 | 32 | 18 | 8 | ? |
| GFLOPS @ 200MHz | 1.6 GFLOPS | 3.2 GFLOPS | 6.4 GFLOPS | 12.8 GFLOPS | 7.2 GFLOPS | 3.2 GFLOPS | ? |
| GFLOPS @ 300MHz | 2.4 GFLOPS | 4.8 GFLOPS | 9.6 GFLOPS | 19.2 GFLOPS | 10.8 GFLOPS | 4.8 GFLOPS | ? |

Based on this estimated data alone, it would appear that a four-core Mali-400 has roughly the shader compute power of a single PowerVR SGX 543 - in other words, about half the compute horsepower of the iPad 2's GPU, or over twice the compute of any smartphone GPU shipping today. The Mali-400 is targeted at 275MHz operation, so its real figures likely land between the two clock rows above. Although MADs are quite common in shader execution, they aren't the end-all be-all; we need to look at application performance to really see how it stacks up.
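The GFLOPS rows in the table follow from simple arithmetic: count each MAD as two floating point operations and multiply by clock speed. A quick check of the Mali-400 MP4 estimate (the 18 MAD figure is our assumption from above, not an ARM-published number):

```python
def gflops(mads_per_clock, clock_hz):
    # One MAD = one multiply + one add = 2 FLOPs per clock.
    return mads_per_clock * 2 * clock_hz / 1e9

# 4 fragment processors x 4-wide VLIW + 1 vertex processor x 2-wide = 18 MADs
mali_mads = 4 * 4 + 1 * 2
print(gflops(mali_mads, 200e6))  # 7.2, matching the table
print(gflops(mali_mads, 275e6))  # ~9.9 at the 275MHz target clock
```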



GLBenchmark 2.1 Solves Our Resolution Problems

Modern Android smartphones either run at 800 x 480 (WVGA) or 960 x 540 (qHD). The iPhone 4 features a 960 x 640 (DVGA) display, while the iPad 2 has a 1024 x 768 (XGA) panel. To complete the confusion Honeycomb tablets run at 1280 x 800 (WXGA). While measuring 3D performance at native resolution is useful in determining how well games will run on a device, it's not particularly useful in comparing GPUs. Fill rate and memory bandwidth requirements increase with pixel count. Even just between Android devices, those with a qHD display have 35% more pixels to render than their WVGA counterparts.
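The 35% figure is just a ratio of pixel counts; for reference, the resolutions mentioned above work out as follows:

```python
resolutions = {
    "WVGA (Android phone)": (800, 480),
    "qHD (Android phone)":  (960, 540),
    "DVGA (iPhone 4)":      (960, 640),
    "XGA (iPad 2)":         (1024, 768),
    "WXGA (Honeycomb)":     (1280, 800),
}
# Total pixels each GPU has to fill per frame at native resolution.
pixels = {name: w * h for name, (w, h) in resolutions.items()}
extra = pixels["qHD (Android phone)"] / pixels["WVGA (Android phone)"] - 1
print(f"qHD renders {extra:.0%} more pixels than WVGA")  # 35%
```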

Unfortunately, not all benchmarks give us the ability to run tests at a common resolution. To make matters worse, not all devices are even capable of running at the resolutions we'd want to test. BaseMark ES2, the rebranded 3DMarkMobile, does allow us to specify display resolution, which we've done in previous reviews: for smartphones we standardize on 640 x 480, and for tablets on 1024 x 768. GLBenchmark, however, hadn't given us that option until recently.

GLBenchmark 2.1 now includes the ability to render the test offscreen at a resolution of 1280 x 720. This isn't as desirable as being able to set custom resolutions, since 720p is a bit high for smartphones, but it's better than nothing. The content remains unchanged from GLBench 2.0: there are still two primary tests that measure overall OpenGL ES 1.0 and 2.0 performance, in addition to a number of specific synthetic feature tests.

We'll start with some low level tests to give us an idea of what we're looking at. First up is a raw triangle throughput test:

Triangle Throughput - GLBenchmark 2.1 Triangle Test

GLBenchmark 2.1 made some changes to the fill rate and triangle throughput tests, so these numbers aren't comparable to the 2.0 results. The Nexus S' single core CPU, older drivers, and lower clocked GPU put it at the bottom of the list, while the LG Optimus 3D turns in the best showing for the PowerVR SGX 540, perhaps thanks to better drivers or a higher GPU clock; even so, the SGX 540 in the LG phone ends up at around half the peak triangle rate of the iPad 2. Here we see the true limitation of ARM's 4:1 fragment to vertex shader ratio: the Mali-400 barely outperforms the Nexus S and offers around a third of the triangle rate of the Optimus 3D's SGX 540. The Adreno 220 does well here, ending up at around 2x the performance of the Mali-400.

Triangle Throughput - GLBenchmark 2.1 Textured, Vertex Lit Triangle Test

As we move to a more complex triangle test, the PowerVR SGX 540 in the Optimus 3D is now only 85% faster than the Mali-400, while the Nexus S' performance, despite using the same GPU, is simply abysmal. The Adreno 220's lead drops to only 37% over the Mali-400. No matter how you slice it, the 4-core Mali-400 just can't compete with today's GPUs in geometry performance. Luckily for ARM, most mobile games aren't geometry bound; what really matters is pixel processing power, and that's something the Mali-400 delivers quite well.

Fill Rate - GLBenchmark 2.1 Texture Fetch

GLBenchmark 2.1's fill test paints a different picture for the Mali-400. Here the SGX 540 is less than half the Mali-400's speed, while the iPad 2's SGX 543MP2 is about twice as fast. The Mali-400's texturing performance is very solid; no GPU currently shipping in a smartphone can touch it.

What about in a game-like workload? For that we turn to the standard GLBenchmark game tests: Egypt and Pro.

GLBenchmark 2.1—as its name implies—tests OpenGL ES 2.0 performance on compatible devices. The suite includes two long benchmarking scenarios with a demanding combination of OpenGL ES 2.0 effects - texture based and direct lighting, bump, environment, and radiance mapping, soft shadows, vertex shader based skinning, level of detail support, multi-pass deferred rendering, noise textures, and ETC1 texture compression.

GLBenchmark 2.1 is the best example of an even remotely current 3D game running on this class of hardware—and even then this is a stretch. If you want an idea of how the Mali-400 stacks up to the competition however, GLBenchmark 2.1 is probably going to be our best bet (at least until we get Epic to finally release an Unreal Engine benchmark).

First let's look at the 1280 x 720 results from 2.1:

GLBenchmark 2.1 - Egypt - Offscreen

GLBenchmark 2.1 - Pro - Offscreen

Despite huge disadvantages in geometry performance the Mali-400 does extremely well in the Egypt test, outpacing most of its competitors by a factor of 2. Only the iPad 2 is faster but that's to be expected based on the raw horsepower of its GPU. Given current workloads, ARM's Mali-400 is clearly the fastest GPU available on a smartphone today.

RightWare Basemark ES 2.0 V1 - Taiji

RightWare Basemark ES 2.0 V1 - Hoverjet

The dominance continues in the Basemark ES 2.0 tests: the Galaxy S II consistently delivers frame rates more than 2x those of its competitors. It's a shame that 3D gaming isn't a bigger deal on Android today, because it'd be nice to see ARM's high end GPU get a chance to flex its muscle on a regular basis.

For comparison to our older phones we've got our standard GLBenchmark 2.0 graphs below:

GLBenchmark 2.0 - Egypt

GLBenchmark 2.0 - PRO

Scrolling Performance

The Galaxy S II is by far the smoothest scrolling Android device we've ever reviewed. Architecturally it has all of the right components to deliver a buttery smooth UI: gobs of memory bandwidth and a very high speed GPU. The software appears to complement it very well. Once again we turn to Qualcomm's Vellamo benchmark to quantify scrolling performance on the Galaxy S II:

Qualcomm Vellamo Benchmark - Scrolling Performance Tests
| Device (WVGA unless otherwise noted) | Ocean Flinger | Image Flinger | Text Flinger |
|---|---|---|---|
| HTC EVO 3D (Adreno 220 - qHD) | 68.98 | 26.03 | 41.79 |
| Motorola Photon 4G (GeForce ULP) | 62.07 | 17.64 | 35.21 |
| Samsung Galaxy S 4G (PowerVR SGX 540) | 55.98 | 26.27 | 31.83 |
| Samsung Galaxy S 2 (Mali-400 MP4) | 91.02 | 35.14 | 51.19 |

Vellamo produces its scores directly from frame counters, so what you're looking at is a direct representation of how fast these devices scroll through the three web tests above. The Galaxy S II is 22 - 35% faster than the EVO 3D and 45 - 99% faster than the Photon 4G. We simply have no complaints here.
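Since the scores are raw frame counts, relative performance is a straight ratio of the table entries; a quick computation over the data above:

```python
scores = {  # (Ocean, Image, Text) Flinger scores from the table above
    "Galaxy S 2": (91.02, 35.14, 51.19),
    "EVO 3D":     (68.98, 26.03, 41.79),
    "Photon 4G":  (62.07, 17.64, 35.21),
}
sgs2 = scores["Galaxy S 2"]
for rival in ("EVO 3D", "Photon 4G"):
    gains = [s / r - 1 for s, r in zip(sgs2, scores[rival])]
    print(rival, [f"+{g:.0%}" for g in gains])
# SGS2 leads the EVO 3D by roughly 22-35% and the Photon 4G by 45-99%
```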

Flash Performance

Thus far NVIDIA's Tegra 2 has delivered the best overall GPU accelerated Flash experience of any SoC on the market. With the latest update to Flash enabling NEON support on OMAP 4, both it and the Exynos 4210 now match what NVIDIA delivers here:

Flash Performance

Until we hit 2012 and meet NVIDIA's Kal-El in smartphones (tablet release in 2011) and Qualcomm's first Krait designs, Samsung's Exynos 4210 looks like the best SoC for Android smartphones.

 



Battery

Even though it comes near the end of our review, battery life is hugely important, and how the SGS2 measures up against the competition is a large part of what makes this such a positive story for the device. As a reminder, we measure battery life by having the browser load through a few dozen pages, with brightness set at 200 nits, until the phone dies, on both WiFi and cellular (WCDMA). The SGS2 has a capacious 6.11 Whr battery, making it one of a small number of devices I’ve seen that ship with over 6 Whr by default.

Smartphone Web Browsing Battery Life

WiFi Web Browsing Battery Life

3G Talk Time Battery Life

The SGS2 outperforms its predecessors pretty handily; I’ve highlighted in orange the results from the Galaxy S 4G and Fascinate. When you factor in that the SGS 4G has the same 6.11 Whr battery capacity, it’s clear how much of the gain comes from the combination of SAMOLED+ efficiency and the dual core SoC.

WiFi Hotspot Battery Life Time

In the WiFi hotspot test, the SGS2 tops everything else I’ve seen thus far as well, edging out the Inspire 4G. As a reminder, that test consists of running four instances of the page load test alongside a 128 kbps MP3 stream, with the display off, until the phone dies.

The last thing to talk about with respect to battery life is the infamous “AOS bug,” where AOS refers to the Android OS line item in the battery use window. I’ve read just about everything I could find on this bug, and believe it to simply be related to how Android reports this metric based on the CPU time a process and its children use. Some have speculated it’s something that showed up with dual core SoCs. To be completely honest, I don’t put much stock in the line-item breakdown of battery use to begin with; what I look at is the graph view. Either way, the battery numbers above speak for themselves, and SGS2 battery life is definitely superior to its predecessor’s, AOS issue or not.

Conclusions and Final Thoughts

It’s always difficult to sum up a device like the SGS2, because this is such a major launch and so much has already been written and discovered about the phone. I find myself again thinking back to how long it’s been since we first played with the SGS2 at MWC and just how far the device has come. It literally is a completely different device today than what Anand and I played with chained to a table in Barcelona.


From back at MWC in Barcelona

There’s no doubt in my mind that the SGS2 is the most powerful smartphone out right now, both in synthetics and in subjective feel. That’s thanks in large part to the Exynos 4210’s dual core Cortex A9s at 1.2 GHz and ARM’s Mali-400 GPU. The end result is an experience that’s buttery smooth and rarely shows any sign of wanting for more power. The Mali-400 alone is twice as fast as any other smartphone GPU out right now, and the Exynos 4210 seems likely to vie for the performance crown in Android-land until the start of 2012.

The original Galaxy S was a hugely popular Android phone, and thankfully the few issues that plagued that generation have been ironed out the second time around. The result is a device that is better in almost every category. Battery life is longer than its predecessor’s. Performance is much higher. Super AMOLED+ uses the much more readable RGB stripe. GPS works this time. Camera stills and video are awesome. The list goes on.


Some Photos Courtesy Sarah Trainor

That said, there are still a few lingering areas in which the SGS2 wavers. Audio quality from the SGS2’s Yamaha codec isn’t up to the level the Wolfson was capable of, and we ran into some potentially frustrating baseband instability issues as well. There’s also the notable omission of NFC in all but the Korean version of the SGS2, and it looks as though only certain variants coming to the USA will have it.

The international market is a whole lot more efficient than the situation we have to deal with here in the USA. Phones launch in largely the form the manufacturer originally intended them to, and as a result there’s a single target for both enthusiast ROM modders and the handset vendor to build and test software on. More and more, it’s really that kind of long-term support that makes a handset valuable, and SGS2 is such a huge success already that it isn’t likely to be obsolete in just a few months, even with Kal-El phones and a new Nexus looming on the horizon.

I really have to admit that I went into this review expecting to be massively underwhelmed by the Galaxy S 2. Here at the end, though, I find my thoughts about the device completely changed. Even taking into account the near term Android roadmap, the Galaxy S 2 is the Android smartphone I’d absolutely buy today.
