At Intel's Investor Day today, CEO Bob Swan and engineering chief Murthy Renduchintala spoke about the company's manufacturing capabilities. Intel has historically been strong at executing on its process technology, but the delays to its 10nm process have raised question marks, and have done so for several years. The two executives went into some detail about what Intel did in the interim, and what it has learned from the issues.

Back in 2013, Intel envisioned 10nm succeeding 14nm with a 2.7x density improvement, enabled by new technologies such as Self-Aligned Quad Patterning (SAQP), Contact Over Active Gate (COAG), and cobalt interconnects, along with new packaging technologies such as EMIB and Foveros. Intel admits that this was an ambitious plan: the goals were not clearly defined across the teams, and the effort was ultimately overly complex and not managed in an ideal way.
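As a rough sanity check on that 2.7x figure: Intel's published logic transistor density for its 14nm process is roughly 37.5 million transistors per mm². Taking that number as an approximation, the implied 10nm density works out as follows:

```python
# Illustrative arithmetic for Intel's claimed 2.7x density jump.
# The 14nm baseline (~37.5 MTr/mm^2) is Intel's published logic
# transistor density; treat both figures as approximations.
DENSITY_14NM = 37.5   # million transistors per mm^2
SCALING = 2.7         # Intel's stated 10nm density target

density_10nm = DENSITY_14NM * SCALING
print(f"Implied 10nm density: {density_10nm:.1f} MTr/mm^2")  # ~101.2
```

That lands close to the ~100 MTr/mm² figure commonly quoted for Intel's 10nm, which is consistent with the 2.7x claim.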

This ended up pushing 10nm into a later time frame. Intel moved 10nm out to 2019 (technically it shipped Cannon Lake in small quantities on 10nm in 2017, however that part is little more than a curio in the timeline of semiconductors), and filled the gap with its 14+ and 14++ processes.

Intel's 14+ and 14++ processes have extracted more than 20% additional performance (from Broadwell to Whiskey Lake) since the node's inception. As a result, Intel is not only preparing for future intra-node optimizations, but is actually adjusting its roadmap to account for them. Murthy made it clear that Intel wants to introduce a Moore's Law-like gain at the beginning of a new process, and another similar gain by the end of the process.
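The per-step breakdown below is hypothetical (Intel only quoted the cumulative 20%+ figure from Broadwell to Whiskey Lake), but it sketches how modest intra-node gains compound across successive "+" revisions:

```python
# Hypothetical per-step gains for 14nm intra-node revisions.
# Intel stated only the cumulative result (>20%); the split here
# is an assumption purely to show the compounding effect.
steps = {"14nm+": 1.07, "14nm++": 1.07, "further tuning": 1.06}

total = 1.0
for name, gain in steps.items():
    total *= gain
print(f"Cumulative uplift: {(total - 1) * 100:.1f}%")  # ~21.4%
```

Three single-digit steps are enough to clear the 20% mark, which is why planning these optimizations into the roadmap up front matters.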

Intel has stated that its 10nm product family (beyond Cannon Lake) will start to be available from the middle of this year (2019), with Ice Lake on client platforms (notebooks).

Intel will be launching multiple 10nm products through 2019 and 2020, including server based 10nm in the first half of 2020:

In the above slide, Intel states that it will have 7nm in production, and launching a product, in 2021. That sounds very aggressive for a company that has had such issues with 10nm. It even shows in Intel's roadmap, with 10nm (and 10+ and 10++) having a much shorter life cycle than the 14nm family of processes.

With this in mind, Intel's 7nm is going to combine what Intel has learned from its 14nm and 10nm families. Intel still wants that 2x scaling (Moore's Law), but with intra-node optimizations planned as part of the roadmap. Intel is also reducing its number of design rules, which should help with execution. 7nm will also be where Intel intersects with EUV lithography, and where it introduces next-generation Foveros and EMIB packaging.
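For context on that 2x target: node names no longer track literal feature sizes, but if they did, a 10nm-to-7nm shrink would on its own imply roughly a doubling of density. A back-of-the-envelope sketch:

```python
# Idealized geometric scaling: density goes with the square of the
# linear shrink. Node names are marketing labels, not measurements,
# so this is an illustration of the "2x per node" target, not a
# statement about actual Intel dimensions.
linear_shrink = 10 / 7
ideal_density_gain = linear_shrink ** 2
print(f"Ideal density gain: {ideal_density_gain:.2f}x")  # ~2.04x
```

In practice the real gain comes from a mix of pitch scaling, cell-height reduction, and design-rule changes rather than a uniform shrink.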

Intel provided this slide, which shows a monolithic PC-Centric die with a multi-die Data-Centric chip built on both Foveros and EMIB. This corroborates our discussion with Intel's chiplet and packaging team, who also stated that we would see Foveros and EMIB on a combined product - specifically the GPU.

Intel announced that its lead 7nm product (lead as in top, or lead as in first?) would be its new GPGPU, built on the Xe graphics architecture. Intel has stated that its Xe product stack will feature two different microarchitectures from mobile client up to GPGPU, one of which is called Arctic Sound. Technically, Intel will launch its first discrete GPU in 2020 according to its press release; the 7nm GPGPU will follow in 2021.

More information is coming out of Intel's event; more to follow.

Source: Intel

237 Comments


  • Korguz - Thursday, May 9, 2019 - link

    only IF you do.. but you wont... cause you cant... case in point.. the post about amd and the cray super computer
  • Xyler94 - Friday, May 10, 2019 - link

    If Intel does something worth criticizing, then they deserve it. And they wholeheartingly deserve it for stagnating innovation since the Core series began. And AMD pushed Intel to make the huge leap with the Core Arch, if it wasn't for Athlon X2, there'd be no amazing Core series.

    That's what competition breeds. Innovation. Why don't you praise AMD for their accomplishments also? Because that would make Intel look bad, right?
  • HStewart - Thursday, May 9, 2019 - link

    I completely understand, I wish people would leave AMD out Intel discussions. There is no question that Intel has made mistakes with 10nm process - but that is not causing them to go down - Intel is involving from a primary desktop market to more mobile market. AMD is actually good for Intel, because it keep them on their feet and moving forward.

    What would be ideal is think about what this means to future, this obvious means that Intel knows they have been hit hard by this 10nm failures of Cannon lake and also Spectre/Meltdown - but did they fold there hands and just cry - no instead they hire new people, got rid obsolete labs and corrected there mistakes.

    I have 30 years experience in computers and know where Intel comes from. I would just be upset if someone company wrote a cpu that was ARM compatible without ARM's permutation. Did Apple do that - not sure maybe.
  • Manch - Thursday, May 9, 2019 - link

    So Arbie and HStewart are the same person. Got it
  • Manch - Thursday, May 9, 2019 - link

    You promote Intel regardless. I only have one pc with amd cpus but good lord youre full of shit. If you have had 30 yrs experience as you say then you know the x86 has been a shared experience since the 70's and a majority of the x64 code base is amd, not Intel. People dont like you or your comments because ypure blantantly dishonest. Mosg of us lile competition between the two.bc it pushes both to put out a better product. Ypu in the meantime shill for intel regardless of facts and you spam the forums. JSTFU and go away.
  • sa666666 - Friday, May 10, 2019 - link

    Exactly. I don't think most people care about which is better _at the moment_; Intel or AMD. I have systems based on both architeCtures, and will continue buying both as their merits are presented. Right now AMD is better in quite a few areas. Intel has been better in the past, and will likely recover and come back again.

    To be clear, this isn't an Intel vs. AMD debate. It's a "we're all sick of the dishonesty and bullshit coming out of your mouth" debate. It is all about you, HStewart; you are annoying, and I believe intentionally so (aka, a troll).
  • Korguz - Thursday, May 9, 2019 - link

    " I have 30 years experience in computers " but yet.. you cant spell architecture correctly....
  • Lord of the Bored - Friday, May 10, 2019 - link

    "I would just be upset if someone company wrote a cpu that was ARM compatible without ARM's permutation. Did Apple do that - not sure maybe."

    But AMD has Intel's "permutation", and has going back to the original 8086. And AMD was even nice enough to give Intel permission to copy THEIR processors too. Adopting AMD64 doubtless really hurt Intel, but it was a strong sign of how dire the straits they'd found themselves in.

    On the other hand, Intel is about the only company on Earth that DIDN'T clone AMD's very popular AM2900 series of parts. So that's a nice thing.

    And on the third hand, if Intel hadn't licensed the 8086 to someone for second-source'ing, we wouldn't care about that device's descendants because the IBM 5150 would have used something from TI's 9900 series or Motorola's then-new 68000 instead. And both of those were actually better architectures than the 8086. So perhaps there's some fairness in being mad, because AMD cheated us out of a TMS9900-based IBM PC.
  • Targon - Friday, May 10, 2019 - link

    Don't forget Zilog, who was huge back in the 1970s into early 1980s. Z80 assembly was actually very elegant compared to 6502 or even 65816.
  • Lord of the Bored - Saturday, May 11, 2019 - link

    I thought 16-bit was one of the goalposts for the 5150 design team, so they weren't going to use an 8-bit microprocessor.
