High-performance computing chip designs have been pushing ultra-high-end packaging technologies to their limits in recent years. The industry's answer to extreme bandwidth requirements has been a shift towards large designs integrated onto silicon interposers, directly connected to high-bandwidth memory (HBM) stacks.

TSMC has been evolving its CoWoS-S packaging technology over the years, enabling designers to create bigger and beefier designs with larger logic dies and ever more HBM stacks. One limitation for such complex designs has been the reticle limit of lithography tools.

Recently, TSMC has been raising its interposer size limit, going from 1.5x to 2x to a projected 3x reticle size with up to 8 HBM stacks for 2021 products.

As part of TSMC’s 2020 Technology Symposium, the company has now teased a further evolution of the technology, projecting 4x reticle-size interposers in 2023, housing up to 12 HBM stacks.

Although by 2023 we’re sure to have much faster HBM memory, a 12-stack implementation with the currently fastest HBM2E, such as Samsung's Flashbolt 3200MT/s or SK Hynix's newest 3600MT/s modules, would represent at least 4.92TB/s to 5.53TB/s of memory bandwidth, several times more than even the most complex designs today.
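For reference, those figures follow from HBM's standard 1024-bit per-stack interface: per-stack bandwidth is simply transfer rate times bus width. A quick sketch of the arithmetic (function name is ours, and it uses decimal GB/TB, as bandwidth figures conventionally do):

```python
def hbm_bandwidth_tbps(transfer_rate_mts, stacks, bus_width_bits=1024):
    """Aggregate HBM bandwidth in TB/s (decimal units)."""
    # Per-stack GB/s: MT/s * bits per transfer / 8 bits-per-byte / 1000
    per_stack_gbs = transfer_rate_mts * bus_width_bits / 8 / 1000
    return per_stack_gbs * stacks / 1000  # GB/s -> TB/s

print(hbm_bandwidth_tbps(3200, 12))  # Samsung Flashbolt, ~4.92 TB/s
print(hbm_bandwidth_tbps(3600, 12))  # SK Hynix 3600MT/s, ~5.53 TB/s
```

A single 3200MT/s stack thus delivers 409.6GB/s, which is where the 4.92TB/s aggregate for twelve stacks comes from.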

Carousel image credit: NEC SX-Aurora TSUBASA with 6 HBM2 Stacks

34 Comments

  • jeremyshaw - Tuesday, August 25, 2020 - link

    How will this scale vs wafer-sized SRAM?
  • azfacea - Tuesday, August 25, 2020 - link

    It's not just Intel that TSMC is destroying, it's us. Humans are finished. In 5 years an Xbox could be a "deeper mind" than a human. Elon warned us. No one listened.
  • Drkrieger01 - Tuesday, August 25, 2020 - link

    Just remember, this is only the hardware portion of said equation. We still need developers to release products that can harness this power... for good or evil ;)
  • azfacea - Tuesday, August 25, 2020 - link

    Are you suggesting training will take a long time? What's to stop a 1 GW supercomputer from doing the training and programming?

    We are years, not decades, away from being no more useful than monkeys. Maybe that's a good thing, maybe it's not. Maybe it means infinite prosperity and well-being for everyone; maybe it means we'll be cleansed out and deemed too dangerous. But we are for sure not going to be useful anymore.
  • Dizoja86 - Tuesday, August 25, 2020 - link

    Put down the ayahuasca, friend. We're still a long way away from a technological singularity. Listening too seriously to Elon Musk might be part of the problem you're facing.
  • Spunjji - Wednesday, August 26, 2020 - link

    The funniest bit is that Elon hasn't even said anything new - he's just repeating things other people were saying a long time before him.

    If he ever turns out to have been right, it will be incidentally so. A prediction isn't any use at all without a timeline.
  • Santoval - Wednesday, August 26, 2020 - link

    Exactly. Others like Stephen Hawking started sounding the alarm much earlier.
  • edzieba - Wednesday, August 26, 2020 - link

    Computers are very dumb, very quickly. 'Deep Learning' is very dumb, in vast parallel.

    While the current AI boom is very impressive, it is fundamentally implementing techniques from many decades ago (my last Uni course was a decade ago and the techniques were decades old THEN!) but just throwing more compute power at them to make them commercially viable. The problem is always how to train your neural networks, and 'Deep Learning' merely turned that from a tedious, finicky and slow task into a merely tedious and finicky one.

    Or in other words: if you want your kill-all-humans skynet AI, you're going to have to find someone who wants to make a robust and wide coverage killable-human-or-friendly-robot training dataset, and debug why it decides it wants to destroy only pink teapots.
  • azfacea - Wednesday, August 26, 2020 - link

    So you are saying what evolution did in humans was impossible, because it should've never gotten past pink teapots? Got it.

    Who cares if the techniques were old if you lacked the processing power to use them? Ramjets were conceived of 50 years before the first turbojet flew.
  • melgross - Wednesday, August 26, 2020 - link

    You don’t understand how this works. Life isn’t science fiction. While some things are just steady in their progress, such as electronic and mechanical systems, until we have a far better understanding of how our brain works, we won’t be able to have a machine equal it. Processing speed and scads of memory aren’t enough. Even neural networks aren’t close to being enough.
