17 Comments
prophet001 - Thursday, September 15, 2011 - link
that is really neat. I hope to see this become an affordable solution for home computing.
douglaswilliams - Thursday, September 15, 2011 - link
Anand,
So, in the future we could not only have a choice from Intel of which processor line, how fast and how much cache, but also how much RAM is built into it?
Doug
douglaswilliams - Thursday, September 15, 2011 - link
Even after re-reading the story I'm still confused as to what is being talked about.
Is this a CPU with on-package RAM? Similar to mobile SoCs?
Or is it just traditional RAM stacked on top of each other with some logic in-between to sort out the layers?
Doug
WeaselITB - Thursday, September 15, 2011 - link
Both and neither.
From what I've read, it's designed as a replacement for traditional DIMM sticks of RAM. Instead of buying a 4GB DDR3 stick from Corsair, you'd buy a 4GB HMC from Micron. The innovation here is they're using new ways of addressing the individual memory bits (with a logic processor) in order to speed up access across the entire section of memory.
As awesome as this sounds, I don't foresee much market traction unless Intel/Micron can get a standards body like JEDEC behind it, especially with the just-announced details of DDR4.
Further details:
http://www.micron.com/innovations/hmc.html
http://blogs.intel.com/research/2011/09/hmc.php
http://www.jedec.org/news/pressreleases/jedec-anno...
-Weasel
name99 - Thursday, September 15, 2011 - link
Current DRAM burns a huge amount of the power in laptops and phones.
You may think this has no traction. I think Apple will be all over it.
And once again, let's predict how this will play out.
The peanut gallery will complain that Apple is shipping devices that have no customizability --- which may be true. But having traditional RAM slots carries a huge cost in power, because of having to drive signals across the noisy interface between the slot and the DIMM. The tighter the RAM can be integrated with the CPU, the lower the power --- at the cost of having to decide how much RAM you want when you buy the device, with no opportunity to later change your mind.
We've seen it over and over again: one part or another of the traditional PC becomes non-replaceable in order to get (always) lower power and (sometimes) a smaller footprint --- non-replaceable CPUs, then batteries, now SSDs. Memory is simply next on the list.
But, of course, the peanut gallery has no concept of tradeoffs, and refuses to accept that certain laws of physics exist. And so we will hear another round of choruses about how Apple is doing this in their next machines (first laptops, then the mini and iMacs) because they want to screw their users over and charge them higher prices, and because Apple hates freedom. And then, when the PC manufacturers follow two years later, a deafening silence.
Meanwhile, whiners, how about that IE 10 and no Flash, huh? Could it possibly be that Apple were driven by something more than an insane need to control every aspect of their users' lives?
piiman - Friday, September 16, 2011 - link
"Meanwhile, whiners, how about that IE 10 and no Flash huh? Could it possibly be that Apple were driven by something more than an insane need to control every aspect of their user's lives? "what about it? and no
Since when has Apple been the first to use new tech? They still use old crap in their new products and price it like it's new tech. But thanks for showing your fanboyism. In case you missed it, the article is about RAM, not the magic of Apple.
minijedimaster - Friday, September 16, 2011 - link
How do you take this article and make it about Apple? Seriously? Pretty cool tech, though.
menting - Thursday, September 15, 2011 - link
Weasel,
Micron is trying to get another large DRAM manufacturer on board so they can go to JEDEC and try to make it a standard.
Currently, it's not intended for mass market, but for specialized servers only.
prophet001 - Friday, September 16, 2011 - link
Help me understand. Isn't 1 terabit per second a phenomenal data transfer rate? I understand that there's probably not much need to consume that much data in any system at this point, but part of the article is about opening the door for future technology. I thought it was a rather substantial advancement.
Klinky1984 - Friday, September 16, 2011 - link
Actually, 1 terabit/sec and greater speeds are already being used by high-end GPUs, though it's not done as "efficiently" as this tech could possibly do it.
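For rough context, a back-of-envelope conversion in Python; the GPU figure below is an illustrative 2011-era spec supplied for comparison, not a number from the article.

    # Convert 1 Tbit/s to GB/s and compare it with a contemporary high-end GPU.
    TBIT_PER_S = 1e12                      # 1 terabit per second, in bits/s
    hmc_gb_per_s = TBIT_PER_S / 8 / 1e9    # = 125 GB/s

    # A GeForce GTX 580 is specified at roughly 192 GB/s of memory bandwidth,
    # i.e. about 1.5 Tbit/s, so the raw rate alone is not unprecedented.
    gtx580_gb_per_s = 192.0
    gtx580_tbit_per_s = gtx580_gb_per_s * 8 / 1000

    print(f"1 Tbit/s = {hmc_gb_per_s:.0f} GB/s")
    print(f"GTX 580  = {gtx580_gb_per_s:.0f} GB/s = {gtx580_tbit_per_s:.2f} Tbit/s")

The promise is less the headline rate itself and more how efficiently, in power and pin count, that bandwidth gets delivered.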
DanNeely - Friday, September 16, 2011 - link
Stuff I've read indicates that Intel intends to put it in their next CPUs as video RAM. The silicon connect layer means it has to be an SoC-style package, because you can't customize the layout like you can with a PCB for each vendor.
Saltticus - Thursday, September 15, 2011 - link
All this new technology gets me so excited!
PS: I believe the word 'parallel' is misspelled in the middle of the first paragraph.
mlkmade - Thursday, September 15, 2011 - link
Why no news on SB-E? We hardly know anything "official" about it and it's supposed to come out in 2 months.
Usually, at IDFs in the past, Intel has always spilled the beans on the new/impending uArchs that will launch shortly after the show.
MODist - Thursday, September 15, 2011 - link
Finally, 3D technology is getting exciting.
toyotabedzrock - Friday, September 16, 2011 - link
I read this and can't help but notice the similarities to FB-DIMM and RDRAM.
They need to just face up to the fact that they will have to pay the licensing fee for DDR4 and DDR5.
stephenbrooks - Sunday, September 25, 2011 - link
Does this logic layer tie in well with the idea of an optical CPU-RAM path? E.g., the silicon lasers could be put on the RAM's logic layer in the end.
DDR4 - Thursday, February 02, 2012 - link
The memory chips will each run at the same voltage as on a non-stacked module, so the stacked package draws more power.
Will this type of module require a heat sink?
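To put rough numbers on that concern, here is a sketch under the comment's own premise that each stacked die dissipates about what a planar die would; the per-die wattage, stack height, and footprint are assumptions for illustration, not figures from the article.

    # Thermal back-of-envelope under the comment's premise (all values assumed).
    watts_per_die = 0.5        # assumed active power of one DRAM die
    dies_per_stack = 8         # assumed number of DRAM dies above the logic die
    package_area_cm2 = 1.5     # assumed footprint of the stacked package

    package_watts = watts_per_die * dies_per_stack      # total power in one package
    power_density = package_watts / package_area_cm2    # W per cm^2

    print(f"{package_watts:.1f} W in {package_area_cm2} cm^2 = {power_density:.1f} W/cm^2")
    print("versus the same total power spread across a much larger conventional DIMM")

Even if the energy per bit improves, concentrating the dissipation in one small package raises the local power density, which is why the heat-sink question is a fair one.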