Exynos 7420: First 14nm Silicon In A Smartphone

More than any other in recent memory, this generation has seen significant movement in the SoC space. We were aware of the Exynos 7420 well before the Galaxy S6 was announced, but I still expected to see the Snapdragon 810 in at least a few Galaxy S6 variants. It was a bit surprising to see Samsung drop Snapdragon SoCs entirely this generation, and judging by the Galaxy S6's battery life, Samsung had its reasons for doing so.

For those unfamiliar with the Exynos 7420, this SoC effectively represents the culmination of Samsung's efforts in semiconductor manufacturing and integrated circuit design. On the foundry side, Samsung is leveraging its vertical integration to build the first SoC on its 14nm LPE (Low Power Early) process, which appears to be reserved for Systems LSI's own designs until internal demand can no longer fill all of the available production capacity.

We previously noted that Samsung's 14nm process would not bring a significant die shrink due to its almost unchanged metal interconnect pitch, but that comparison was against the 20nm LPM process, from which 14nm LPE borrows its BEOL (back end of line). Contrary to our assumption, the Exynos 5433 was manufactured on the 20LPE process, which uses a considerably larger metal pitch. The result is a significant die shrink for the 7420: according to Chipworks it measures only 78mm², a reduction of roughly 31% from the Exynos 5433's 113mm². That is considerable even when factoring in that the new SoC adds two GPU shader cores. Beyond the swap from an LPDDR3 memory controller to an LPDDR4-capable one, the only other immediately noticeable functional overhaul is that the dedicated HEVC decoder block has been removed, with HEVC encode and decode capability merged into Samsung's MFC (Multi-Function Codec) media hardware acceleration block.
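
As a quick sanity check of that shrink, here is a minimal sketch of the arithmetic, using only the Chipworks die measurements quoted above:

    # Die area comparison, using the Chipworks measurements cited in the text.
    area_5433_mm2 = 113.0   # Exynos 5433, 20nm LPE
    area_7420_mm2 = 78.0    # Exynos 7420, 14nm LPE

    reduction = 1 - area_7420_mm2 / area_5433_mm2
    print(f"Die area reduction: {reduction:.1%}")   # -> ~31.0%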


Galaxy S6 PCB with SoC and modem in view (Source: Chipworks)

The move from a planar process to FinFET is crucial. Although this is covered in more detail in previous articles, the short explanation is that planar processes suffer from increasing leakage at smaller process nodes, as the bulk of the silicon becomes large relative to the gate that is supposed to control the flow of current, and the gate progressively loses control of the channel, degrading power efficiency. FinFET addresses this by isolating the transistor from the bulk of the silicon wafer and wrapping the gate around the channel on three sides, so that it retains far stronger control over the flow of current than a planar transistor design.

The effective voltage drop allowed by the new process can be substantial. Below are the supply voltages for some frequencies common to both the Exynos 5433 and the Exynos 7420:

Exynos 5433 vs Exynos 7420 Supply Voltages

                      Exynos 5433   Exynos 7420   Difference
A57 1.9GHz (ASV2)     1287.50mV     1056.25mV     -231.25mV
A57 1.9GHz (ASV9)     1200.00mV      975.00mV     -225.00mV
A57 1.9GHz (ASV15)    1125.00mV      912.50mV     -212.50mV
A57 800MHz (ASV2)      950.00mV      768.75mV     -181.25mV
A57 800MHz (ASV9)      900.00mV      687.50mV     -212.50mV
A57 800MHz (ASV15)     900.00mV      625.00mV     -275.00mV
A53 1.3GHz (ASV2)     1200.00mV     1037.50mV     -162.50mV
A53 1.3GHz (ASV9)     1112.50mV      950.00mV     -162.50mV
A53 1.3GHz (ASV15)    1062.50mV      900.00mV     -162.50mV
A53 400MHz (ASV2)      862.00mV      743.75mV     -118.25mV
A53 400MHz (ASV9)      787.50mV      656.25mV     -131.25mV
A53 400MHz (ASV15)     750.00mV      606.25mV     -143.75mV
GPU 700MHz (ASV2)     1125.00mV      881.25mV     -243.75mV
GPU 700MHz (ASV9)     1050.00mV      800.00mV     -250.00mV
GPU 700MHz (ASV15)    1012.50mV      750.00mV     -262.50mV
GPU 266MHz (ASV2)      875.00mV      750.00mV     -125.00mV
GPU 266MHz (ASV9)      800.00mV      668.75mV     -131.25mV
GPU 266MHz (ASV15)     762.50mV      606.25mV     -156.25mV

The ASV (Adaptive Scaling Voltage) groups represent different chip quality bins, with a lower number indicating a worse bin and a higher number a better one. Group 2 should be the lowest found in the wild, group 15 represents the best possible bin, and group 9 is the median that should be found in most devices. As the table shows, some A57 and GPU frequencies see voltage drops of 250mV or more. As a reminder, dynamic power scales with the square of voltage, so the drop from 1287.50mV to 1056.25mV at the worst-bin 1.9GHz A57 state should, for example, result in a considerable 33% reduction in dynamic power. The Exynos 7420 uses this headroom to clock slightly higher than the 5433, but we still expect its power consumption to come in well below what we saw on the Note 4.
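
To illustrate how that figure falls out of the voltage relationship, here is a minimal sketch assuming the usual simplified dynamic power model (P proportional to C·V²·f, with switched capacitance and frequency held constant):

    # Relative dynamic power under the simplified model P ~ C * V^2 * f,
    # with capacitance and frequency held constant. Illustrative only.
    def dynamic_power_ratio(v_new_mv: float, v_old_mv: float) -> float:
        """New dynamic power as a fraction of the old, at the same C and f."""
        return (v_new_mv / v_old_mv) ** 2

    # Worst-bin (ASV2) A57 at 1.9GHz: Exynos 7420 vs Exynos 5433
    ratio = dynamic_power_ratio(1056.25, 1287.50)
    print(f"{ratio:.3f} of the original dynamic power (~{(1 - ratio) * 100:.0f}% lower)")
    # -> 0.673 of the original dynamic power (~33% lower)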

On the design side, Systems LSI has also done a great deal to differentiate the Exynos 7420 from the 5433. Although the CPU architectures are shared, the A53 cluster is now clocked at 1.5GHz instead of 1.3GHz, and the A57 cluster at 2.1GHz rather than 1.9GHz. The memory controller is new and supports LPDDR4 running at 1555MHz. This gives the Galaxy S6 almost double the theoretical memory bandwidth of the Exynos variant of the Galaxy Note 4: 24.88GB/s versus the 5433's 13.20GB/s. We still need to test how these figures translate to practical performance in a future deep dive, as effective bandwidth and latency can vary considerably depending on a vendor's memory settings and the SoC's bus architecture.
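
Those theoretical peak figures follow directly from the memory clock, the double data rate, and the bus width. A minimal sketch of the arithmetic, assuming a 64-bit-wide LPDDR interface (e.g. 2x32-bit) for both SoCs:

    # Theoretical peak bandwidth = clock * transfers per clock * bytes per transfer.
    def peak_bandwidth_gbs(clock_mhz: float, bus_width_bits: int = 64,
                           transfers_per_clock: int = 2) -> float:
        return clock_mhz * 1e6 * transfers_per_clock * (bus_width_bits / 8) / 1e9

    print(peak_bandwidth_gbs(1555))   # Exynos 7420, LPDDR4 @ 1555MHz -> 24.88 GB/s
    print(peak_bandwidth_gbs(825))    # Exynos 5433, LPDDR3 @ 825MHz  -> 13.2 GB/s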

Outside of the memory controller, LSI has also updated the 7420 to use a more powerful Mali T760MP8 GPU. Although the Exynos 5433 used a Mali T760 as well, it had two fewer shader cores, which means that reaching a given level of performance requires higher clock speeds, and therefore higher voltages to overcome circuit delay. The new GPU is also clocked a bit higher, at 772MHz compared to the 700MHz of the Exynos 5433's GPU. We see the same two-stage maximum frequency scaling mechanism we discovered in our Note 4 Exynos review, with less ALU-biased loads now limited to 700MHz as opposed to the 5433's 600MHz. There is also evidence of an unused 852MHz clock state, suggesting Samsung was prepared to clock higher to compete with other vendors. Unfortunately, testing this SoC in greater depth isn't possible at this time, as doing so would require disassembling the phone.
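
To put the wider, slightly faster GPU in rough perspective, here is a minimal sketch that treats peak throughput as shader cores times clock; this is only a first-order proxy and ignores bandwidth and other bottlenecks:

    # First-order proxy for peak GPU throughput: shader core count * clock.
    def relative_peak(cores: int, clock_mhz: float) -> float:
        return cores * clock_mhz

    mp6_5433 = relative_peak(6, 700)   # Exynos 5433: Mali T760MP6 @ 700MHz
    mp8_7420 = relative_peak(8, 772)   # Exynos 7420: Mali T760MP8 @ 772MHz
    print(f"~{mp8_7420 / mp6_5433:.2f}x peak throughput")   # -> ~1.47x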

Comments

  • CrazyElf - Friday, April 17, 2015 - link

    I just hope the Note 5 comes with expandable storage and a removable battery.

    Equally annoying is that it has lost waterproofing. On paper at least, it should be easier to waterproof a phone with no removable battery. That and accidents happen. The glass back too is form over function. Give us a metal back or something like carbon fibre.
  • whiteiphoneproblems - Friday, April 17, 2015 - link

    Playing with the Edge, I noticed that the curves of its screen create undesirable light reflections. Any comments on this?
  • JoshHo - Friday, April 17, 2015 - link

    I noticed interference effects as well, which are discussed in the display section. Overall there are some notable compromises with the Edge and relatively little benefit. Combined with the steep price increase, I find it hard to justify buying one over a 64 GB GS6.
  • whiteiphoneproblems - Saturday, April 18, 2015 - link

    Thanks - sorry to make you repeat yourself.
  • nyonya - Friday, April 17, 2015 - link

    Great review! Any chance you guys will get a Verizon or Sprint variant? Would love to see the battery life tests with the Qualcomm modem in those.
  • victorson - Friday, April 17, 2015 - link

    Nothing about the front facing camera?
  • Ammaross - Friday, April 17, 2015 - link

    It's a selfy-cam. Don't like it? Take pictures in the bathroom mirror like the rest of them. I doubt Skype et al pixelated video chat will care about slight distortions at the edge of the FoV or slight aliasing for striped objects, etc. :P
  • johnnohj - Friday, April 17, 2015 - link

    The S5 (and Note 4?) used to have a serious problem with edge distortion on the front-cam. I wonder if it's present on the S6.
    See this thread for examples http://forums.androidcentral.com/samsung-galaxy-s5...
  • akdj - Sunday, April 26, 2015 - link

    I've got the Note 4, can't comment on the S5. I've never used one. But the Note 4, if so inclined allows you to take 'selfies' with the front (main) camera. You set your phone where you want it (actually according to where you want you;)) -- it looks slick. I've never used it. You frame/focus/lock exposure where you'll be 'posing' yourself or with a group. Take position. It recognizes a face. Flashes a light, counts down a couple seconds and snaps.
    Regardless, they're using a nice wide aperture and high megapixel selfie cam (that also shoots decent 1080p video for conferencing). It's definitely a step 'up' from most of the competition including the iPhone (I'm ambidextrous, the 6+ is my personal phone. I run our business with the Note4). Note 4's 'selfie' cam definitely beats up on my iPhone's. But then again, FaceTime is extremely cool, more reliable than Skype and convenient that Voice. Be cool if one of the three would open their face 'facing' software as open source/X-Platform, secure and not subsidized by data mining/search dollars or near trillion dollar company servers like Apple's.

    More n more fills are using this camera, not necessarily for selfies but conferencing and team meetings. Between the two I've got, while the Note's is a better face cam IMHO, it's slight. And that's for both front and rear. They're both phenomenal in comparison to the 2007 iPhone I owned, the '08 Android, and any iPad or Xoom/Nexus I've owned --- and with a ten year old son, going through Google's Drive photos/Picassa and iCloud, both of which I was using pre 2007 for email and DSLR & visual storage or transfer, I'm now able to watch my son grow up in front of the computer.
    So much different than my mom's photo albums of my three younger brothers ( all of us married, with kids now) & I.
    These cameras and their storage software/data management subsystems have grown in leaps and bounds in the past couple years ...ita going to be interesting in thirty years to see what my son's 'photo albums' look like. If you were born in the last decade, your entire LIFE will be online and documented photographically
    Practice Safe Selfies! I've got stories from friends about watching 'slide shows' with their teenagers or college kids that are hilarious! I'm not so sure they need to improve selfies significantly -- beyond today's capabilities. There's a fine line between too much detail and improved clarity on a wide angle closely focused, and hence distorted facial or grainy 'length' shots. I think nearly all selfie cams suffer not only edge distortion but soft corners/vignette, low resolution, tiny sensors and bad skin tones. They're more than fine for casual web shots but I don't want to see the pores of the race of the person I'm chatting with. Too distracting!
  • akdj - Sunday, April 26, 2015 - link

    Ugh!
    *Edit* paragraph two is supposed to say 'More folks' not fills.
    **Edit 2** last, main sentence '..the (f)ace of the person I'm chatting with!'

    J
