Exynos 7420: First 14nm Silicon In A Smartphone

More than any other in recent memory, this generation has been a time of significant movement in the SoC space. We were aware of the Exynos 7420 well before it was announced in the Galaxy S6, but for the most part I expected to see the Snapdragon 810 in at least a few variants of the Galaxy S6. It was a bit surprising to see Samsung drop Snapdragon SoCs completely this generation, and judging by the battery life of the Galaxy S6, it seems Samsung had their reasons for doing this.

For those that are unfamiliar with the Exynos 7420, this SoC effectively represents the culmination of Samsung's efforts in semiconductor manufacturing and integrated circuit design. On the foundry side, Samsung is leveraging its vertical integration to make the first SoC on its 14nm LPE (Low Power Early) process, which seems to be reserved solely for Systems LSI until internal demand no longer consumes all production capacity.

We previously mentioned that Samsung's 14nm process would generally lack any significant die shrink due to an almost unchanged metal interconnect pitch, but this assumption was made in comparison to their 20nm LPM process, from which the 14nm LPE process borrows its BEOL (back end of line). Contrary to what we thought, the Exynos 5433 was manufactured on the 20LPE process, which uses a considerably larger metal pitch. The result is a significant die shrink for the 7420: according to Chipworks, it measures only 78mm², a 31% reduction from the Exynos 5433's 113mm². This is considerable even when factoring in that the new SoC adds two GPU shader cores. Beyond the swap from an LPDDR3 memory controller to an LPDDR4-capable one, the only other immediately noticeable functional overhaul on the SoC seems to be that the dedicated HEVC decoder block has been removed, with HEVC encoding and decoding capability merged into Samsung's MFC (Multi-Function Codec) media hardware acceleration block.
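As a quick sanity check on the die figures, the shrink follows directly from Chipworks' two area measurements (a trivial worked example):

```python
# Die areas as measured by Chipworks (mm^2)
exynos_5433_area = 113.0  # 20nm LPE
exynos_7420_area = 78.0   # 14nm LPE

# Relative shrink of the new die versus the old one
shrink = (exynos_5433_area - exynos_7420_area) / exynos_5433_area
print(f"Die shrink: {shrink:.0%}")  # → Die shrink: 31%
```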


Galaxy S6 PCB with SoC and modem in view (Source: Chipworks)

The move from a planar to a FinFET process is crucial. Although this is covered in more detail in previous articles, the short explanation is that planar processes suffer from increasing power leakage at smaller process nodes, as the bulk of the silicon becomes relatively more massive than the gate that controls the flow of current; power efficiency decreases because the bulk begins to act as a gate of its own. FinFET solves this problem by isolating the transistor channel from the bulk of the silicon wafer and wrapping the gate around the channel, ensuring that the gate retains much stronger control over the flow of current than in a planar transistor design.

The effective voltage drop allowed by the process can be substantial. We can take a look at the supply voltages for frequencies common to both the Exynos 5433 and 7420:

Exynos 5433 vs Exynos 7420 Supply Voltages
  Exynos 5433 Exynos 7420 Difference
A57 1.9GHz (ASV2) 1287.50mV 1056.25mV -231.25mV
A57 1.9GHz (ASV9) 1200.00mV 975.00mV -225.00mV
A57 1.9GHz (ASV15) 1125.00mV 912.50mV -212.50mV
A57 800MHz (ASV2) 950.00mV 768.75mV -181.25mV
A57 800MHz (ASV9) 900.00mV 687.50mV -212.50mV
A57 800MHz (ASV15) 900.00mV 625.00mV -275.00mV
A53 1.3GHz (ASV2) 1200.00mV 1037.50mV -162.50mV
A53 1.3GHz (ASV9) 1112.50mV 950.00mV -162.50mV
A53 1.3GHz (ASV15) 1062.50mV 900.00mV -162.50mV
A53 400MHz (ASV2) 862.50mV 743.75mV -118.75mV
A53 400MHz (ASV9) 787.50mV 656.25mV -131.25mV
A53 400MHz (ASV15) 750.00mV 606.25mV -143.75mV
GPU 700MHz (ASV2) 1125.00mV 881.25mV -243.75mV
GPU 700MHz (ASV9) 1050.00mV 800.00mV -250.00mV
GPU 700MHz (ASV15) 1012.50mV 750.00mV -262.50mV
GPU 266MHz (ASV2) 875.00mV 750.00mV -125.00mV
GPU 266MHz (ASV9) 800.00mV 668.75mV -131.25mV
GPU 266MHz (ASV15) 762.50mV 606.25mV -156.25mV

The ASV (Adaptive Scaling Voltage) numbers represent the different chip bins, with a lower value representing a worse-quality bin and a higher value a better one. Group 2 should be the lowest found in the wild, with group 15 representing the best possible bin and group 9 the median that should be found in most devices. As the table shows, some A57 and GPU frequencies see voltage drops of up to roughly 250mV. As a reminder, dynamic power scales quadratically with voltage, so the drop from 1287.50mV to 1056.25mV seen at the worst-bin 1.9GHz A57 state should, for example, result in a considerable 33% reduction in dynamic power. The Exynos 7420 uses this headroom to clock slightly higher than the 5433 - but we expect total power to still be considerably lower than what we've seen on the Note 4.
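The quadratic relationship can be sketched numerically, using the worst-bin 1.9GHz A57 voltages from the table (clock speed is held constant, so only the V² term changes):

```python
def dynamic_power_ratio(v_new_mv, v_old_mv):
    """Relative dynamic power at a fixed frequency: P ∝ f * V^2."""
    return (v_new_mv / v_old_mv) ** 2

# Worst-bin (ASV2) A57 voltages at 1.9GHz: 5433 vs 7420
ratio = dynamic_power_ratio(1056.25, 1287.50)
print(f"Dynamic power drop: {1 - ratio:.0%}")  # → Dynamic power drop: 33%
```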

On the design side, Systems LSI has also done a great deal to differentiate the Exynos 7420 from the 5433. Although the CPU architectures are shared, the A53 cluster is now clocked at 1.5GHz instead of 1.3GHz, and the A57 cluster at 2.1GHz rather than 1.9GHz. The memory controller is new and supports LPDDR4 running at 1555MHz. This gives the Galaxy S6 almost double the theoretical memory bandwidth of the Exynos variant of the Galaxy Note 4: 24.88GB/s versus the 5433's 13.20GB/s. We still need to see how these claims translate to practical performance in a future deep dive article, as effective bandwidth and latency can vary depending on the vendor's memory settings and the SoC's bus architecture.
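The theoretical figures fall straight out of the memory configuration (a sketch assuming a 64-bit-wide interface, which is what these bandwidth numbers imply, with double data rate signalling):

```python
def peak_bandwidth_gbs(clock_mhz, bus_width_bits, ddr=True):
    """Theoretical peak bandwidth in GB/s: transfers per second times bytes per transfer."""
    transfers_per_sec = clock_mhz * 1e6 * (2 if ddr else 1)  # DDR moves data on both clock edges
    return transfers_per_sec * (bus_width_bits / 8) / 1e9

print(peak_bandwidth_gbs(1555, 64))  # Exynos 7420, LPDDR4-3110 → 24.88
print(peak_bandwidth_gbs(825, 64))   # Exynos 5433, LPDDR3-1650 → 13.2
```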

Outside of the memory controller, LSI has also updated the 7420 to use a more powerful Mali T760MP8 GPU. Although the Exynos 5433 had a Mali T760 GPU as well, it had two fewer shader cores, which means that achieving a given level of performance required higher clock speeds and thus higher voltages to overcome circuit delay. The new GPU is also clocked a bit higher, at 772MHz compared to the 700MHz of the GPU in the Exynos 5433. We see the same two-stage maximum frequency scaling mechanism discovered in our Note 4 Exynos review, with less ALU-biased loads limited to 700MHz as opposed to the 5433's 600MHz. There's also a suspicion that Samsung was ready to go higher to compete with other vendors, as we can see evidence of an unused 852MHz clock state. Unfortunately, deeply testing this SoC isn't possible at this time, as doing so would require disassembling the phone.
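The two-stage ceiling described above can be sketched as a simple governor rule. This is purely hypothetical pseudologic: the driver's actual metric and threshold for classifying a workload as ALU-heavy are not public, so `alu_utilization` and `threshold` here are illustrative placeholders, not real driver parameters.

```python
# Hypothetical sketch of the Exynos 7420's two-stage GPU DVFS ceiling:
# ALU-heavy workloads may use the full 772MHz state, while other
# workloads are capped at 700MHz (the 5433 capped them at 600MHz).
FULL_MAX_MHZ = 772
LIMITED_MAX_MHZ = 700

def max_gpu_freq(alu_utilization, threshold=0.8):
    """Pick the frequency ceiling based on how ALU-bound the workload is."""
    return FULL_MAX_MHZ if alu_utilization >= threshold else LIMITED_MAX_MHZ

print(max_gpu_freq(0.9))  # ALU-heavy load → 772
print(max_gpu_freq(0.3))  # everything else → 700
```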


306 Comments

  • twtech - Friday, April 17, 2015 - link

    There's nothing compelling about the S6 that makes me want it over the S5. Sure, it's a bit faster, but there is a loss of functionality, and it's easier to break.

    Realistically, we are at a point where speed doesn't matter that much anymore unless you play games on your phone, and frankly if I wanted an iPhone, I'd get an iPhone.
  • gnx - Friday, April 17, 2015 - link

    The promise of Samsung was always the best hardware performance coupled with the most functions. Design was secondary. It just needed to not be a deal-breaker.

    It really took off with the S2 because its Exynos and graphics performance were markedly above others (when Android was still behind), it had the most vibrant screens (Super AMOLED was central to its marketing), and added some software tricks (like toggles on the notification screen). SD-card and removable batteries were just one part of the appeal (other manufacturers offered it too).

    S3 largely delivered, but S4 and S5 failed because there wasn't enough differentiation on those fronts. They used the same Snapdragon as others, or their Exynos was no better, their screens were portrayed as too saturated, and software additions were criticized as gimmicky. SD cards and removable batteries became the only lasting meaningful difference. Design started to become stale and a hindrance.

    Looks like S6 returns to the original promise: blazing hardware performance with the new Exynos, amazing screen with the maturation of AMOLED, and software that at least doesn't get in the way. Plus additional features that really make a difference: arguably the best camera (Sammy cameras were never bad, just never the best) which appeals to the masses and the Gear VR which appeals to the geeks. And a design that is not a deal-breaker. (S6 does not fall behind, and S6 Edge might even have an advantage design-wise)

    I'm not getting one for sure. I'm too wed to the Nexus-line. But this time round, I'd be happy to recommend it to tech-minded friends and tech-ignorant family.
  • generalako - Saturday, April 18, 2015 - link

    The S5 wasn't "too saturated", it was the best display of any smartphone during its release, and still holds that title for any non-Samsung smartphone display -- 1 year after its release.
  • akdj - Sunday, April 26, 2015 - link

    "
    S3 largely delivered, but S4 and S5 failed because there wasn't enough differentiation on those fronts. They used the same Snapdragon as others, or their Exynos was no better, their screens were portrayed as too saturated, and software additions were criticized as gimmicky. SD cards and removable batteries became the only lasting meaningful difference. Design started to become stale and a hindrance."

    General beat me to part of your comment but it's almost like you own an S3, is that correct? Have you managed to 'stay away' from the Internet for three years when it came to technology too? If so, good on ya bro! Wish I could!

    The S4 was a Grand Slam. Hence the 'letdown' with S5 sales figures. That said, the two of them are significant improvements on their predecessor. ESPECIALLY AMOLED's technology, and now we're seeing the fruits of Sammys work on Exynos, the internal 'speed' of the storage and memory as well as camera, incredibly quick wifi and LTE speeds and the display, man the display. As an owner of both the iPhone 6+ & Note 4 (biz/personal), I've seen the improvements first hand. In all facets of smartphone usage. Speed, software, displays and cameras and their abilities, the fluency of the OS (I'm weird as I like TouchWiz and iOS, stock Android and OS X, even Win 8....1. Hated '8';)
    Point being, the difference between the three you compared is night and day. My Note 1's subsidy couldn't end quick enough. Impulse purchase at the time and I hated it. It would time out before an app could present the dialog box to accept permission, Gingerbread w/TW was a mess, as the SoC couldn't cut through the peanut butter (code). Android wasn't perfect yet, but the difference between my Note 1 (same guts as the S3) & original Xoom was unreal. TW killed that phone. The S3 was much better as the stylus software wasn't killing it, but the Note 2/S4 update was HUGE! From the SoC to the camera, the display (AMOLED has matured EVERY year. So much so DisplayMate pre iPhone 6+, could be the same I've not looked ...has the Note 4 as one of, if not THE best display on the market ...and most reviews pegged the Note 4 as the Android phone to beat as an all rounder, IF you can handle the size (it feels much smaller than it is, I've owned 1, 3 and 4 my wife the 1, 2 and 4). The S5 was a 'lot' of half baked but cool ideas. It was a killer display. Phenomenal processing and memory, decent camera in most situations (forget low light), and an extraordinary amount of 'features' added by the OEM. Fingerprint reading and all the Galaxy apps that brought the engine's power to its knees. From S Voice to S Note, S Finder to Smart Stay added to AT&T's plethora of crap, it never really had a chance to 'spread its wings'. All other phones using the SD 800-805 are beasts, including the Note 4, an even further improvement to a near perfect display (consumer) calibration, with Samsung dropping many of the heavy code or worthless crap found on the S5. Also an extra GB of RAM, quick internal storage and a healthy quick MicroSD, fast as hell radios both wifi and cellular provided an enjoyable experience and with the LP 5.01 update I received a week ago, it's the first Android phone over a ½ year later that not only hasn't slowed, but has become MUCH faster, more responsive.
    Design wasn't EVER a hindrance. Quite the opposite. As an iPhone owner and two phone daily carrier, the design allowed for fast, safe access to battery and external storage. That's not a hindrance. Now I'd agree it didn't 'help' Sammy that metals and other plastics well formed and nicely crafted by other OEMs started to become the norm. I've always enjoyed the craftsmanship and 'design' from both Samsung and Apple and they've been extremely different.
    While reviews would talk about the faux stitching, mock leather, plastic flimsy backs...I don't recall ever reading about the necessity of that type of manufacture. I've never had a Sammy back 'break' precisely because of its flexibility. Made it easy to access internals and not break it after a few open and closes and the stitching or rather textured plastic exterior made the type of material used easy to grip, tough to slip. I like aluminum too. Both have benefits. Both have drawbacks. But I wouldn't call battery replacement a detriment. We've got several extra Note batteries around the house and other than backpacking (GPS only, airplane. No cell service allowed, it's good to shut it down!) and I can easily get a couple days of use from the Note or iPhone these days. Need more juice, there's many hundreds of car chargers, 'power packs' and now like the Note 4, USB 2 Quick Adaptive Charging (I owned the Note 3 and USB 3 while present wouldn't work at USB 3 speeds for transfer and ultimately no benefit to speed of transfers between computer and phone. It was picky on which computers it would even 'tell' it was '3' and showed as '2'). The QC cable and adaptor need to be used but 30 minutes 0-50% and about 90 to get to 100% is phenomenal. As with most OEMs not sporting the Nexus badge, it's lame relying on the OEM to push the latest version of Android 'out' OTA (another reason USB 3 isn't necessary! Who's plugging in anymore? MicroSD can be easily plugged in for faster media transfer from the computers but who needs the cord? Other than charging, either with Android or iOS? Neither require computers nor have they since, huh, ironically enough, the S3 era;). And Kies sucks! iTunes is the Mona Lisa of software in comparison.
    As far as your SoC comments, just to add...Apple's the only OEM to successfully design their own chip and its low-level programming and architecture. They've got a single phone to worry about typically (different this year after the 5s/c test run) and they're able to control the Eco system relatively easily with only their 'own' to worry about) and a tablet they've used the same chip with certain upgrades or bulking up the SoC or increasing RAM. Samsung indeed has used the same Snapdragon processors as others but they used only the very best, top shelf parts in their flagships. They've never been shy about RAM, more cores or bigger, better and brighter displays.
    Their efforts have paid off with Exynos. A company doesn't build a billion or two transistor piece of silicon on a die the size of your thumbnail overnight. They've done a damn fine job 'keeping up' with Qualcomm IMHO ...to the extent when the 810 exhibited bizarre heat/throttling issues, they were able to immediately slide their own 64bit SoC in for ALL markets (it's been in the S3, 4, & 5 ...too lazy to research but I don't think Sammy used nVidia's Tegra at any point) quite nimbly for such a massive release.
    The 'saturation' issue very much went the way of the dodo two years ago or so with the S4, the Note 2. The S5/N3 were another HUGE step up and as General and I've echoed, the N-4 was even better and besting most of the flagship LCD panels. With a display setting the user can control you can 'over' saturate things if you'd like but go back and compare how good these transitions actually were objectively. Here and at DisplayMate.

    You missed the actual displacement of LCD's dominance by Super AMOLED during the exact time period you specify. Apple is using it in their watch. It's gorgeous and the S5 software additions, we do agree, were or at least leaned 'gimmicky'.
    The improvement was evident with the Note 4 six months later and even more so with the S6.

    I am curious though, has Samsung dropped the lollipop update for the S5 like the Note 4 in America? Or European countries with the Snapdragon 801/5? If so, and you've downloaded it, has Sammy remedied that bloat by a significant margin? I was happy with 4.4.4 on the N4, but 5.0.1 is like lightning (Nova launch 70%/Google Launch 30%)

    I can't imagine there's a beast trapped in there. The Note 4 has the same motor albeit a 50% bump in RAM. Seems like a simple S/W update as the sales stalled would've helped a company and millions of end users. I think they're trying (HARD) to optimize T/W and abide by Google's rules as much as feasibly possible when you're using the OS with things like SPen, fingerprint scanning genesis, touch to share NFC or '(S)urround(S)ound' @ the party!
  • chizow - Friday, April 17, 2015 - link

    Nicely written, fair and balanced review.

    I was disappointed when I heard Samsung went with a unibody design and removed the option for removable storage and replaceable battery, but I understand the decision and ultimately the market has spoken and agreed with this decision. Personally, the changes would've been unremarkable to me as I would throw a Spigen case on the phone anyways, so the least striking changes to the face would've been the only thing I would've noticed over my S4.

    At the same time, I realized without these features, there was really much less reason to go with a Samsung/Android device, so along with the option to BYOD for work, I went ahead and got an iPhone 6 Plus. I guess I was ready to go with a phablet and there was a number of annoyances I had with Android (just too bloaty and too many hidden CPU drainers leading to awful battery life). My work iPhone on the other hand would go DAYS without needing to be charged.

    Samsung also has an awful track record of supporting their existing products as they are always rushing towards "The Next Big Thing"; this has held true for a number of their products from SSDs, to phones, to Smart TVs. They just don't care once they have your money, they figure bad support is just forced obsolescence and a way to get more of your money in 2-3 years.

    My iPhone 6 Plus hasn't been perfect and there are some oddball bugs I am running into (like Pandora burning CPU/battery randomly), but overall I'm happy with the decision. We'll see where things stand in a few years when I am ready for another phone upgrade.
  • Drumsticks - Friday, April 17, 2015 - link

    The Galaxy S4 just received the lollipop update. That makes for at least 2 years of updates, which is much better than they used to be.
  • chizow - Saturday, April 18, 2015 - link

    But that's just another example of Android's disjointed hodgepodge support model. Only more recent hardware supports their latest updates and then it is still up to the OEM and then the carrier's discretion to push the OTA update. End result is late updates that are already borderline irrelevant or no update at all.

    iOS isn't perfect either and has had some bunk updates but I got the iOS 8.3 update just three days after it was covered here.
  • Ammaross - Friday, April 17, 2015 - link

    "...annoyances I had with Android (just too bloaty and too many hidden CPU drainers leading to awful battery life)"
    "..some oddball bugs I am running into (like Pandora burning CPU/battery randomly)"

    Trade one demon for a devil. If I had to deal with the same thing either way, I'd go with the one you can actually have a chance at fixing yourself rather than just having to deal with it. :P
  • chizow - Friday, April 17, 2015 - link

    Well at least with iOS it is a well-documented solution that just involves closing and re-opening Pandora, vs. the solution to bloaty Android CPU suckage, where the answer is to root your phone and become your own 24/7 tech support. :p
  • Peichen - Saturday, April 18, 2015 - link

    chizow is right. I tried to use a Note 3 to replace my iPhone 5 before iPhone 6 came out and after spending 2 weeks rooting, loading launchers, mod and so on I decided I don't want to waste any more time tweaking a phone. I had enough of that from high school and college days overclocking my water cooled computer. My time is a lot more valuable now.
