In recent times, Samsung has seen its dominance of the Android ecosystem erode. The reasons are many, but at least some of the criticism has focused on the products themselves. At the high end, the industrial and material design of Galaxy S and Note devices has drawn frequent criticism, and outside of the hardware itself, TouchWiz has been widely criticized for performance issues and poor design. This brings us to the Galaxy S6 and S6 edge, which represent a fundamental shift in how Samsung designs and builds its phones. While we've seen hints of this change in the Galaxy A line and the Galaxy Note 4, the Galaxy S6 is the first phone Samsung has made from the ground up with a focus on industrial and material design.

This focus on design is immediately apparent, as the Galaxy S6 is the first Galaxy S phone with a unibody design. There are no visible seams or screws, and there is no apparent gap between the glass back and the metal frame of the phone. In person, the design of the Galaxy S6 is genuinely striking when compared to the Galaxy S5 or any previous Galaxy S device. A great deal of thought and care has clearly gone into every aspect of the device. The metal frame is far from a simple curve: it is somewhat rounded along the top and bottom, but flattens out along the sides for better grip. The edges of the frame are slightly chamfered as well, in order to make edge swipes off of the display smooth and natural. The display is centered on the device, with symmetrical top and bottom bezels and thin side bezels. The back cover is clean, with a glass back that has little in the way of distractions beyond the camera, the LED flash module, and a Samsung logo. Overall, the Galaxy S6 is a massive departure from everything that has come before it.

While the design is one aspect of the Galaxy S6, the specs are another. While it's often popular to repeat that specs don't matter, they form the foundation of the entire user experience. To start, we've placed the usual spec sheet below to cover the basics.

| | Samsung Galaxy S5 | Samsung Galaxy S6 | Samsung Galaxy S6 edge |
|:--|:--|:--|:--|
| SoC | 2.45 GHz Snapdragon 801 (MSM8974ACv3) | Exynos 7420 (2.1 GHz A57 / 1.5 GHz A53) | Exynos 7420 (2.1 GHz A57 / 1.5 GHz A53) |
| RAM / NAND | 2GB LPDDR3, 16/32GB NAND + microSD | 3GB LPDDR4-1552, 32/64/128GB NAND | 3GB LPDDR4-1552, 32/64/128GB NAND |
| Display | 5.1" 1080p SAMOLED HD | 5.1" 1440p SAMOLED | 5.1" 1440p SAMOLED, Dual Edge |
| Network | 2G / 3G / 4G LTE (Qualcomm MDM9x25, UE Category 4) | 2G / 3G / 4G LTE (Category 6) | 2G / 3G / 4G LTE (Category 6) |
| Dimensions | 142 x 72.5 x 8.1 mm, 145 grams | 143.4 x 70.5 x 6.8 mm max, 138 grams | 142.1 x 70.1 x 7.0 mm max, 132 grams |
| Rear Camera | 16MP (5312 x 2988), 1.12 µm pixels, 1/2.6" CMOS, 31 mm (35mm effective), f/2.2 | 16MP (5312 x 2988) w/ OIS, f/1.9, object tracking AF | 16MP (5312 x 2988) w/ OIS, f/1.9, object tracking AF |
| Front Camera | 2MP | 5MP, f/1.9 | 5MP, f/1.9 |
| Battery | 2800 mAh (10.78 Whr) | 2550 mAh (9.81 Whr) | 2600 mAh (10.01 Whr) |
| OS | Android 4.4 w/ TouchWiz | Android 5 (64-bit) w/ TouchWiz | Android 5 (64-bit) w/ TouchWiz |
| Connectivity | 2x2 802.11a/b/g/n/ac + BT 4.0, USB 3.0, GPS/GNSS, MHL, DLNA, NFC | 2x2 802.11a/b/g/n/ac + BT 4.1, USB 2.0, GPS/GNSS, NFC | 2x2 802.11a/b/g/n/ac + BT 4.1, USB 2.0, GPS/GNSS, NFC |
| Wireless Charging | N/A | WPC 1.1 (4.6W) & PMA 1.0 (4.2W) | WPC 1.1 (4.6W) & PMA 1.0 (4.2W) |
| Fingerprint Sensor | Swipe | Touch | Touch |
| SIM Size | MicroSIM | NanoSIM | NanoSIM |

As one can see, there are a few key highlights of note in the Galaxy S6. The Exynos 7420 is the first SoC to be built on Samsung's 14nm FinFET process. While this isn't directly comparable to Intel's 14nm process due to the use of a 20nm metal interconnect, there are density improvements in areas that aren't gated by interconnect pitch. While the transistors themselves are where we see significant improvements to clock speed and power consumption, the metal interconnects influence performance as well, both through power dissipated by resistance in the interconnects and through limits on clock speed from RC delay. This means that Intel continues to hold a significant process lead, as shrinking interconnect pitch becomes dramatically more difficult past the 20nm node, where resistance and capacitance issues increase sharply. The benefits of the new transistors are still substantial, though: on the GPU side, for example, the voltage drop is huge, averaging 200mV and reaching as much as 300mV lower at the 700MHz state compared to the previous 20nm process. Overall, this move to 14nm should dramatically reduce power consumption, as FinFETs largely eliminate the impact of leakage.
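
To put that voltage drop in rough perspective, dynamic power scales with the square of voltage (P ≈ C·V²·f). Below is a minimal back-of-the-envelope sketch, treating the 825mV peak quoted in the next paragraph as the new operating point and inferring the old one from the stated 200-300mV drop; both baselines are assumptions, not measured Exynos figures.

```python
# Back-of-the-envelope dynamic power scaling: P ~ C * V^2 * f.
# The 1.025 V and 1.125 V baselines are inferred from the quoted
# 200-300 mV drop down to 825 mV at the 700 MHz GPU state; they are
# assumptions, not measured values.
v_new = 0.825  # volts, quoted top GPU voltage on 14nm

for v_old in (1.025, 1.125):
    ratio = (v_new / v_old) ** 2  # frequency and capacitance held constant
    print(f"{v_old:.3f} V -> {v_new:.3f} V: dynamic power falls to {ratio:.0%}")
```

At constant clocks, that works out to roughly a 35-46% reduction in dynamic GPU power from the voltage drop alone.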

Outside of the process node, the Exynos 7420 is a fairly standard big.LITTLE SoC, with four Cortex A57s at 2.1 GHz and four Cortex A53s at 1.5 GHz. However, the Exynos 7420 is the first Exynos SoC with full AArch64 support in software, unlike the Exynos 5433. The GPU is upgraded to a Mali T760MP8 running at up to 772MHz, with 700MHz as the secondary maximum state, a huge improvement given that voltages top out at 825mV; we should see very impressive battery efficiency gains from the 14nm process. The SoC supports LPDDR4 running at 1552MHz, and Samsung has equipped the Galaxy S6 with a UFS 2.0 storage solution. It remains to be seen whether this is a major point of differentiation this year, but in practice the Galaxy S6 felt smooth. Areas like the multitasking interface were noticeably faster to open and close, though there were still some scenarios with slight frame drops, likely due to the pre-release software.
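
For context on what LPDDR4 at 1552MHz implies for memory bandwidth, here is a quick sketch. It assumes a 2x32-bit (64-bit total) interface with two transfers per clock; the bus width is our assumption rather than a confirmed spec.

```python
# Theoretical peak LPDDR4 bandwidth at a 1552 MHz I/O clock.
# Assumes a 2x32-bit (64-bit total) memory interface and DDR
# signaling (two transfers per clock); the bus width is an assumption.
clock_hz = 1552e6
transfers_per_clock = 2
bus_bytes = 64 // 8

peak = clock_hz * transfers_per_clock * bus_bytes
print(f"Peak bandwidth: {peak / 1e9:.1f} GB/s")  # ~24.8 GB/s
```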

Another major area of focus for the S6 was refining the camera. While the sensor remains the same Sony sensor we saw in the Galaxy Note 4, Samsung has improved the optics to a maximum aperture of f/1.9, compared to f/2.2 on the Galaxy S5, and added an IR sensor to improve white balance detection. OIS also comes to the Galaxy S lineup this generation, and in practice the stabilization is as effective as the Galaxy Note 4's. Samsung strongly prioritized shooting speed and general camera responsiveness this generation, introducing object tracking AF, a double-tap camera launch gesture, and further refinement of the PDAF system. The object tracking AF is similar to what we've seen on phones like the Huawei Ascend Mate 7, but double-tapping the home button to launch the camera was nearly instant, faster than pretty much any other method I've seen. The long start-up time I saw with the Galaxy S5's camera application has also mostly disappeared, and in general the gallery and camera applications are much faster than before. The front-facing camera is now a 5MP unit, an upgrade over the Galaxy Note 4's, and shares the f/1.9 aperture. Unfortunately, I wasn't able to properly test the camera, but it should be a reasonable improvement over the Galaxy Note 4's.
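
As a quick illustration of what the wider aperture buys, light gathered scales with the inverse square of the f-number, so the move from f/2.2 to f/1.9 should work out to roughly a third more light at the sensor:

```python
# Relative light gathering of the f/1.9 optics vs. the old f/2.2,
# holding sensor and focal length constant; exposure scales as 1/N^2.
old_f, new_f = 2.2, 1.9

gain = (old_f / new_f) ** 2
print(f"f/{new_f} gathers {gain:.2f}x the light of f/{old_f}")  # ~1.34x
```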

For the S6, Samsung has also improved the AMOLED display, both in quality and in pixel density/resolution. Although we have relatively little detail on this, Samsung claims 600 nits of luminance, up from the claimed 500 nits of the Galaxy S5, which means this isn't the same panel we saw in the GS5 LTE-A. In addition, Samsung has built wireless charging support for both the WPC 1.1 and PMA 1.0 standards into every Galaxy S6. The new fingerprint sensor is also a dramatic improvement over the Galaxy S5's, and works about as well as Apple's Touch ID system. Samsung is pairing it with Samsung Pay, which uses the fingerprint sensor for payment authentication; while not available at launch, Samsung Pay will also support legacy magstripe terminals on the Galaxy S6. Samsung has also improved the speaker dramatically from the Galaxy S5, with significantly better sound quality and volume, along with improved placement on the bottom of the phone. TouchWiz remains relatively similar in design, but there's a big reduction in the number of pre-installed applications. Some Microsoft applications such as OneNote, OneDrive, and Skype come preinstalled, but in general there's almost no bloat to speak of.
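
For a rough sense of what those wireless charging figures mean for the 9.81 Whr battery, a sketch of the idealized minimum charge times follows. Real charging tapers near full and loses energy in the coils, so these are optimistic lower bounds:

```python
# Idealized wireless charge times for the Galaxy S6's 9.81 Whr battery.
# Ignores charge taper and coil losses, so real times will be longer.
battery_whr = 9.81

for standard, watts in (("WPC 1.1", 4.6), ("PMA 1.0", 4.2)):
    print(f"{standard} at {watts} W: ~{battery_whr / watts:.1f} h minimum")
```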

Overall, the Galaxy S6 seems quite promising. Although the design appears somewhat inspired by other devices on the market, the industrial and material design is a massive step forward from everything we've seen from Samsung before. In general, Samsung seems to have put together a device that can truly compete with phones like the One M9 and iPhone 6 in every respect, although it'll take a full review to see how it shapes up against the competition. The Samsung Galaxy S6 and S6 edge will be available globally starting April 10th in 32, 64, and 128 GB storage SKUs. Both will be available in white, black, and gold, but the blue color will be limited to the Galaxy S6 and the emerald green to the Galaxy S6 edge.

Comments

  • Solandri - Monday, March 2, 2015 - link

    Right. It's AMOLED, not AMOLED+, so it's probably RGBG pentile. That is, each "pixel" is an alternating RG or BG set of just 2 subpixels. The effective layout (of R to R, or B to B subpixels) is diagonal*, so it's not really comparable to RGB. But if you did compare, it'd be:

    1080p RGB stripe = 5760x1080 = 6.22 million subpixels
    1080p RGBG pentile = 3840x1080 = 4.15 million subpixels
    1440p RGBG pentile = 5120x1440 = 7.37 million subpixels

    So in terms of subpixel density (luminosity resolution), 1440p pentile is only slightly higher than 1080p RGB.
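
    Those totals are easy to reproduce, counting 3 subpixels per pixel for RGB stripe and 2 for RGBG pentile:

    ```python
    # Subpixel totals for the three layouts above.
    layouts = [
        ("1080p RGB stripe",   1920, 1080, 3),  # R, G, B per pixel
        ("1080p RGBG pentile", 1920, 1080, 2),  # alternating RG / BG pairs
        ("1440p RGBG pentile", 2560, 1440, 2),
    ]

    for name, w, h, per_px in layouts:
        print(f"{name}: {w * h * per_px / 1e6:.2f} million subpixels")
    ```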

    I don't usually like to characterize screens this way because it emphasizes subpixel count, and leads to the misleading conclusion that RGB is superior. It's not. RGB allocates an equal number of subpixels to each color, but your color vision is much better in green than in red or especially blue. So by the time you have enough G subpixels to fool 20/20 vision, you've got way more R and B subpixels than you need. Pentile's RGBG better approximates your eyes' color resolution, so it can achieve the same threshold (fooling 20/20 vision) using fewer subpixels.
    http://nfggames.com/games/ntsc/visual.shtm

    * This diagonal symmetry is actually why pentile RGBG is so attractive for phones and tablets. With the right layout, RGBG is symmetric horizontally and vertically. The same subpixel rendering algorithm can be used in landscape and portrait mode. In contrast, a subpixel rendering algorithm in RGB no longer works if you rotate the screen 90 degrees. Meaning the software needs to be aware of subpixel orientation on the panel. To the best of my knowledge tablets and phones using RGB don't even bother trying to do subpixel rendering, which is a shame since you're leaving potential resolution gains on the table.
  • Xenonite - Wednesday, March 4, 2015 - link

    Solandri, I simply cannot understand why you would willingly spread such obvious FUD about the little technological progress that we actually do achieve these days. The absolute last thing that we need is more people pushing for stagnation in the "good-enough" computing age.
    I also want to give my opinion on this announcement and the display resolution race in general, but first I would like to take this opportunity to correct several points from your post which are, quite simply, either factually inaccurate or grossly misinterpreted accounts of the truth.

    1) Fiction: "So in terms of subpixel density (luminosity resolution), 1440p pentile is only slightly higher than 1080p RGB."

    Truth: In the best-case scenario this is fairly accurate; however, for a display, pentile's "black-and-white" resolution is actually a bit worse than that of 1080p RGB panels (especially when displaying colourless text at 1-pixel line widths).

    Explanation: The subpixel density of an additive colour display (i.e. where each subpixel is part of a set of mostly independent primary colours) does not equal the density of the resulting image's luma plane. To differentiate between luma and luminance it is necessary to compute the gamma correction values for the primaries your display uses, and then use those corrected values in the calculation of each pixel's luminance. To simplify my explanation, I am not going to refer to gamma correction's relatively minor contribution (in comparison to the error I am trying to point out).
    The actual "colourless" or "black-and-white" resolution of a display is equal to the number of pixels for which distinct "brightness" values can be obtained. Since a pixel's luminance is equal to the weighted sum of all of the primary colour values associated with that pixel, it would be reasonable to assume that the resulting luma plane would have a resolution equal to the number of complete 3-subpixel pixels in that display.
    While this is generally true for content with a spatial resolution significantly below the Nyquist limit of the sampling grid (this is why a Bayer filter works on a high-end, high-resolution camera), it starts breaking down very quickly once your image data approaches the Nyquist limit. The reason for this breakdown is, of course, that each neighbouring pixel is no longer guaranteed to have approximately the same intensity (or "brightness"); consequently one cannot, for example, use a neighbouring pixel's "red" primary to complete the computation of the current pixel's intensity value (when it has only a "green" and a "blue" primary subpixel).
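
    For concreteness, the weighted sum in question is typically something like the Rec. 709 luma formula sketched below; note that these particular weights embody exactly the green-heavy assumptions questioned in point 2:

    ```python
    # One common choice of "brightness" weights: Rec. 709 luma.
    # These weights assume green dominates perceived brightness, which
    # is the very assumption questioned in point 2 below.
    def luma709(r, g, b):
        return 0.2126 * r + 0.7152 * g + 0.0722 * b

    print(luma709(1.0, 1.0, 1.0))  # white -> 1.0
    print(luma709(0.0, 1.0, 0.0))  # green alone -> 0.7152
    ```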

    2) Fiction: "your color vision is much better in green than in red"

    Truth: "The distributions turned out to be largely random but with the ratios of red to green cones varying by a factor of 40" (image url: http://www.laserfocusworld.com/content/dam/lfw/pri... Reprinted from Vision Research, 51, David R. Williams, "Imaging single cells in the living retina," 1379–1396, 2011

    Explanation: On average, a normal, healthy eye (with no colour blindness) has many more (about 3x as many) red than green cones. Image comparing the Bayer distribution with an average retinal mosaic (image url: http://ivrg.epfl.ch/files/content/sites/ivrg/files... Meylan, D. Alleysson and S. Susstrunk, A Model of Retinal Local Adaptation for the Tone Mapping of Color Filter Array Images, Journal of the Optical Society of America A (JOSA A), Vol. 24, Nr. 9, pp. 2807-2816, 2007
    This is partly responsible for the horrible colour imagery obtained from such Bayer filters; however, it also casts doubt on any compression technique which makes use of the so-called "luminosity function" to separate an image's luma and chroma components. (image url: http://upload.wikimedia.org/wikipedia/commons/a/a0...
    The reason why this does not reliably work, for example in image compression, is the fact that the curve's mathematical derivation assumed that there are more green than red cones in the retina, thus assigning green light a higher "brightness" weight than the same "amount" of red light receives. (Although red and green light share a similar absolute spectral sensitivity; image url: http://upload.wikimedia.org/wikipedia/commons/8/8f...
    This also explains why some people (most likely individuals that have many more green than red cones) so convincingly defend that colour subsampling techniques, such as pentile and the similar Bayer filter (also "chroma sub-sampling" for film), are visually lossless, while others (those with a more normal distribution of more red than green cones) find the images so produced to be severely lacking in quality and detail.
    In conclusion, the distribution of and relationship between red and green cones in a normal human retina varies too much for any generalised statement to be made about the visual data thereby rendered redundant. Also, all the mathematical models we use to convert and compress images (such as those referenced above) are based on subjective measurements and do not denote any hard, objective fact about our physiology.
    As a side note: looking at the colour matching functions and going by the exact meaning of your post, "your color vision" seems to be much better (more efficient) in blue than in either red or green. Blue vision is, however, of very low spatial resolution, especially close to the fovea...

    3) Fiction: "So by the time you have enough G subpixels to fool 20/20 vision, you've got way more R and B subpixels than you need"

    Truth: Normal, healthy, well-developed young eyes are limited to a general acuity of about 20/8; however, certain tasks can be performed with a precision exceeding even the size of an individual cone receptor cell, for example Vernier acuity, which can exceed 20/2.6.

    Explanation: This roughly means that normally sighted people will still be able to see certain artifacts of the display process (such as the regular pixel spacing leading to the "screen door" effect and various moiré effects) even when the pixel density meets the 20/8 resolution limit of a normal retinal receptor. A spatial dithering technique (such as random dithering) would eliminate most of these cases without any substantial negative effects, assuming that the display keeps track of where the pixels are actually positioned and interpolates the signal they receive accordingly, and that the pixel density exceeds the 20/8 limit of good vision at the distance you normally view the display.
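
    As a toy sketch of the random-dithering idea, here is an 8-bit ramp quantized to a few levels, with noise added before quantization to trade banding for grain:

    ```python
    import random

    # Quantize an 8-bit grayscale ramp to 4 levels, with and without
    # random dithering; the noisy version trades hard banding steps
    # for fine grain.
    LEVELS = 4
    STEP = 255 / (LEVELS - 1)

    def quantize(v):
        return round(v / STEP) * STEP

    banded = [quantize(v) for v in range(256)]
    dithered = [quantize(min(255, max(0, v + random.uniform(-STEP / 2, STEP / 2))))
                for v in range(256)]
    ```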

    In conclusion, I sincerely hope that I have dispelled some lingering FUD about the tangible improvements of increased pixel densities together with proper tri-chromatic subpixel structures. I also want to add that even though there is still quite a lot of progress to be made by improving the resolution of displays, there are some other areas of display performance that have been so badly neglected that their shortcomings are much more prevalent and annoying (at least to me) than the blurry or aliased images that a display optimized for 20/20 vision produces. If I had to choose only one, I would gladly pick improved refresh rates (144Hz still looks way too discontinuous, jumpy and unnatural), greater colour gamut and bit-depth (how can we approach the spatial resolution limits of the eye and still not be able to display more than half of the colours that our eyes can perceive, or even produce a realistically smooth colour transition without gross banding or dithering "grain"?), or even just an aspect ratio that more closely matches a human's field of vision (like the "unfashionable" 4:3 ratio) over a further increase in display resolution.

    That does not mean I would like engineers to simply give up on improving display resolutions (no, they should still improve those to the level of "real" perfect human vision); I still want to see a future where a real-world holodeck is at least closer to reality than it is today.
  • editorsorgtfo - Thursday, March 5, 2015 - link

    Epic comment. There are so many subtleties of human vision that can make a display look like shit, even if the conventional wisdom says you can't see more than 24 fps, or that if you can't make out a single black pixel on a white background then there's no way pentile can make your eyes burn.
  • lilmoe - Tuesday, March 3, 2015 - link

    That's not accurate at all. Samsung improves the power efficiency of their OLED panels with each iteration, in addition to the SoC.

    But I just think it would have been better if they had stuck with 1080p on the GS6. It would still have been the best screen at that size, with the added benefits of even faster rendering and lower power consumption. The benefits go beyond gaming; rendering text, webpages and other assets takes a hit when the SoC has to do nearly 80% more work and doesn't rush to idle as fast.
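
    (A one-line check of that ratio:)

    ```python
    # Pixel-count ratio behind the "more work" figure above.
    print(f"1440p pushes {2560 * 1440 / (1920 * 1080):.2f}x the pixels of 1080p")  # ~1.78x
    ```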

    Again, the GS6 should be more efficient **overall** compared to last year's GS5, BUT it could have been **even better** if they had stuck with 1080p, **in addition** to the added benefits of the _newer panel_.
  • JoshHo - Monday, March 2, 2015 - link

    That isn't quite accurate; Samsung relies upon gains in emitter efficiency to offset the decreases from higher pixel density.
  • Morawka - Tuesday, March 3, 2015 - link

    Most mobile games are not GPU bound but memory bandwidth bound. This is the case across almost all mobile SoCs.
  • ol1bit - Sunday, March 1, 2015 - link

    You are wrong. I use my HTC One with a mount on my bicycle for commuting to and from work. It sounds awesome, there are no extra bulky speakers to mount, and it's much safer than headphones because I can still hear traffic, people, etc. around me.
  • lilmoe - Sunday, March 1, 2015 - link

    Totally agreed on the 1440p madness... A *perfected* 1080p panel (perfect color accuracy, better pixel arrangement, brighter and more power efficient) would have been much better for performance, battery life, and overall user experience.
    Those who can tell 441ppi and 577ppi apart are superhuman, and most people aren't.
    Oh, and screw VR........
  • TrojMacReady - Sunday, March 1, 2015 - link

    The Note 4's 1440p screen was already close to perfection in terms of color accuracy, brightness, contrast, viewing angles, etc.
    To think that a higher resolution automatically means lower gaming performance is a bit outdated too. There are several free apps that let you set the resolution (per app if needed) at or below the screen's native resolution, to fit your desired balance between quality and framerates. It therefore adds choice, and a high (gaming) framerate is still part of that choice.
  • AnnonymousCoward - Sunday, March 1, 2015 - link

    Wrong... no app can change the native res that fills your entire screen. You have to either run at a non-native res and stretch pixels, or display 1:1 with black bars on all four sides.
