Sean Maloney is getting his speaking practice in; he was back in the afternoon for the second set of keynotes at IDF. This one is a bit more interesting.

Sean started by tackling Pat Gelsinger's old playground: Servers. Nehalem-EX (8-core Nehalem) was the primary topic of discussion, but he also demonstrated the new 32nm Westmere-EP processors due out next year.

Westmere-EP is the dual-socket workstation/server version of Westmere (aka 32nm Xeon). The feature-set is nearly identical to Westmere on the desktop, so you get full AES acceleration for improved encrypt/decrypt performance. This is particularly useful for enterprise applications that do a lot of encryption/decryption.

The AES-NI instructions get added to x86 with Westmere. The x86 ISA will be over 700 instructions at that point.
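On Linux, the easiest way to see whether a shipping CPU exposes AES-NI is the kernel's feature-flag list. Here's a minimal Python sketch that checks for the `aes` flag in /proc/cpuinfo; it's Linux/x86-specific and purely illustrative, not a portable detection method:

```python
def has_aes_ni() -> bool:
    """Return True if the Linux kernel reports the 'aes' CPU flag.

    Linux/x86-only sketch: parses /proc/cpuinfo rather than issuing
    CPUID directly. Returns False on any other platform.
    """
    try:
        with open("/proc/cpuinfo") as f:
            return any("aes" in line.split()
                       for line in f if line.startswith("flags"))
    except OSError:
        return False

print("AES-NI available:", has_aes_ni())
```

Real software would use the hardware instructions through a crypto library rather than checking the flag itself; this just shows where the capability is advertised.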

The other major change to the Xeon platform is networking support. The Westmere-EP platforms will ship with Intel's 82599 10GbE controller.

Power consumption should be lower on Westmere and you can expect slightly higher clock speeds as well. If Apple sticks to its tradition, you can expect Westmere-EP to be in the next-generation Mac Pro.

What if you built a Core i7 using 1000nm transistors?

Intel has been on an integration rampage lately. Bloomfield integrated the memory controller and Lynnfield brought the PCIe controller on-die. Sean held up an example of what would happen if Intel had stopped reducing transistor size back in the 386 days.

Here's an image of what the Core i7 die would look like built using ~1000nm transistors instead of 45nm:

That's what a single Core i7 would look like on the 386's manufacturing technology

Assuming it could actually be built, the power consumption would be over 1000W with clock speeds at less than 100MHz.
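The die-size part of that claim is easy to sanity-check. Assuming area scales with the square of the feature size, and using Bloomfield's often-quoted ~263 mm² die (my assumption, not a figure from the keynote), a quick back-of-envelope calculation gives a chip roughly a third of a meter on a side:

```python
# Back-of-envelope: scale a 45nm Core i7 die up to a ~1000nm process.
# Assumes area scales with the square of the feature size, and uses
# Bloomfield's roughly 263 mm^2 die -- illustrative only.

die_area_45nm_mm2 = 263           # Core i7 (Bloomfield) die, approx.
scale = (1000 / 45) ** 2          # linear shrink factor, squared

area_mm2 = die_area_45nm_mm2 * scale
side_cm = (area_mm2 ** 0.5) / 10  # mm -> cm, assuming a square die

print(f"area: {area_mm2 / 100:.0f} cm^2, ~{side_cm:.0f} cm per side")
```

That lines up with the wafer-sized mock-up Sean held on stage, even before you account for the power and clock penalties.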

Next he held up an Atom processor built on the same process:

And that's what Sean Maloney shaking an Atom built on the 386's manufacturing process would look like

Power consumption is a bit more reasonable at 65W, but that's just for the chip; you could drive a few modern-day laptops off the same power.

Useful comparisons? Not really. Interesting? Sure. Next.

Larrabee Demo

Larrabee is of course at IDF, but its presence is very muted. The chip is scheduled for a release in the middle of next year as a discrete GPU.

Bill Mark, a Senior Research Scientist at Intel, ran a raytracing demo using content from Enemy Territory Quake Wars on a Gulftown system (32nm 6-core) with Larrabee.

The demo was nothing more than proof of functionality, but Sean Maloney did officially confirm that Larrabee's architecture would eventually be integrated into a desktop CPU at some point in the future.

Larrabee rendered the image above using ray tracing; it's not running anywhere near full performance

Clarkdale: Dual Core Nehalem

Clarkdale is the desktop dual-core Nehalem due out by the end of this year with widespread availability in Q1 2010.

Clarkdale will be the ideal upgrade for existing dual-core users as it adds Hyper-Threading and aggressive turbo modes. There's also on-package 45nm Intel graphics, which I've heard referred to as finally "good enough" graphics from Intel.

Two cores but four threads, that's Clarkdale

Jasper Forest

Take Nehalem with three DDR3 memory channels, add PCIe 2.0 and RAID acceleration and you've got Jasper Forest. Due out in Q1 2010, this is a very specific implementation of Nehalem for embedded and storage servers.

Nehalem was architected to be very modular; Jasper Forest is just another example of that. Long term, you can expect Nehalem cores (or a similar derivative) to work alongside Larrabee cores, which is what many believe Haswell will end up looking like.

Gulftown: Six Cores for X58

I've mentioned Gulftown a couple of times already, but Intel re-affirmed support for the chip that's due out next year.

Built on the 32nm Westmere architecture, Gulftown brings all of the Westmere goodness in addition to packing six cores on a single die.

Compatibility is going to be the big story with Gulftown: it will work in all existing X58 motherboards with nothing more than a BIOS update.




  • TA152H - Wednesday, September 23, 2009 - link

    For me, stuff like this is more like reading about a Porsche. It's fun to read about, but it's not something I would realistically consider.

    You'd really pay $1000 for a processor? These days, with obsolescence being so rapid, it doesn't make a lot of sense for home use, and you'd find it difficult to see the difference.

    It's changed a lot. For example, even in 1987, IBM released an 8086 (PS/2 Model 30). The chip was created 1978. To put that in perspective, that would be like releasing a 1 GHz Pentium III now. Then again, maybe it's not so different with the Atom.

    I personally prefer ILP, to TLP, and would prefer two killer cores to six slower clocked ones. But, certainly, they both have their uses.

    The Bloomfield can easily handle memory at higher clock speeds than 1066. Intel could tomorrow say it handles 1333 MHz memory, with zero change to the processor. I don't think it's very meaningful.
  • GourdFreeMan - Thursday, September 24, 2009 - link

    No, but I would pay $300-$400 for the lowest clocked Gulftown. If I needed to, I could then overclock it beyond the stock specs of the $1000 version... though I would probably leave it at stock clocks for stability and longevity reasons most of the time. See my comments later in this thread about multi-socket systems, and how rapid architectural improvements have changed my spending habits.

    There is no reason why ILP and TLP cannot be improved in parallel, pardon the pun. Clock speeds, however, have hit a brick wall. Software desperately needs to go parallel, and there is only so much parallelism in the instruction stream that can be obtained without intentionally redesigning to make your software overtly parallel.

    Benchmarks tend to agree with your assertion that memory speeds aren't a pressing issue at current clocks, with current architectural designs, and with existing desktop software. In the server space there are already Xeons with official support for DDR1333, and I am sure most if not all desktop Nehalems can have their memory controller overclocked to utilize DDR3 speeds well in excess of 1333 MHz. I agree it is a non-issue. However, I also understand the OP's desire to know if there are any improvements to Gulftown beyond the number of cores. The answer to which is that there are eleven new instructions added to the Westmere instruction set -- six for doing AES and five carry-less multiplication instructions.
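[For readers unfamiliar with the operation mentioned above: carry-less multiplication is ordinary binary long multiplication with XOR in place of addition, i.e. polynomial multiplication over GF(2), and it underpins AES-GCM and some CRC computations. A pure-Python sketch of the operation itself — not of how the hardware instruction is invoked — looks like this:]

```python
def clmul(a: int, b: int) -> int:
    """Carry-less (GF(2) polynomial) multiply of two integers.

    Same shape as schoolbook binary multiplication, but partial
    products are combined with XOR, so no carries ever propagate.
    """
    result = 0
    while b:
        if b & 1:
            result ^= a   # XOR instead of add
        a <<= 1
        b >>= 1
    return result

# (x^2 + 1) * (x + 1) = x^3 + x^2 + x + 1 over GF(2)
print(bin(clmul(0b101, 0b11)))  # -> 0b1111
```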
  • CrimsonFury - Thursday, October 8, 2009 - link

    I doubt you will see a Gulftown in the $300-$400 range. The cheapest one is expected to launch at $1,000. Gulftown is mainly designed as a DP server chip that they happen to be offering as a high-end desktop part as well. Maybe the gen after Gulftown, when they shrink to 22nm, you could see that sort of pricing, or if Intel releases a 4-core/8-thread 32nm part later on.
  • goinginstyle - Wednesday, September 23, 2009 - link


    When is your next copy-and-paste article going to be printed at Tom's? Just wondering what kind of insight you will be offering, since your comments around here continue to be "brain damaged". By the way, still waiting on your article comparing the 920 to the 860 to prove all your statements. Since all of the websites disagree with you, I am guessing it is hard to copy and paste information to support your lost cause.

  • TA152H - Wednesday, September 23, 2009 - link

    You know, for a loser like you to be casting insults is almost amusing. Almost, because you're too stupid to be witty.

    I'll be working on an article on the original IBM PC when I have time. With the different technologies, market positioning, good/bad, things borrowed, etc. Thanks for asking. If you want, I can send you an advance, signed copy, since you are a fan.

    It does take soooooo much time. And your idiot remarks about copy and paste just show what a hypocrite you are. Show an example. In reality, I rewrote those pages so many times, I was almost afraid to reread them for fear I would spend another hour or two getting it just right. I wish I had the talent to write it once and get it right, but I didn't even attempt it. Finally, when I was relatively happy with the content and style, they edited it and dumbed it down so people like you could understand the vocabulary used. If you want me to put up the original, just let me know and I'll send you a link. You are a fan, after all, so, for you, I will.

    Actually, I am very satisfied with Anand's tests showing how much faster, 3.5%, the Bloomfield is. Well, I'm not really, since he had the uncore on the Bloomfield faster. I don't know why these guys don't get it right, it's like they don't understand you have to get things precisely accurate, instead of saying it doesn't matter much. That's my frustration.

    If they want to send me the equipment, I'll be happy to test it for them. I swear I'll send the stuff back too. Really, I will. Well, I will for the Lynnfield. The Bloomfields, well, I do have a large cat, and occasionally some raccoons visit the house, and I can't be responsible if the raccoons somehow take the Bloomfields. Naturally, even they have the sense not to want the brain-damaged Lynnfields, so I'm not so worried about them.
  • Lifted - Thursday, September 24, 2009 - link

    Doesn't surprise me they post articles from you at TH. So far, in 2 posts, you've managed to spew out such inane tripe as

    "limitations, of which you probably have many."

    "You're too stupid"
    "jackasses like you"
    "for a loser like you"
    "you're too stupid"
    "your idiot remarks"

    Do you really expect anyone to take you seriously?
  • FATCamaro - Tuesday, September 22, 2009 - link

    Why the Lynnfield hate? Get a grip... I'd like a 2-core/4-thread machine that has a GPU on package for sure.
  • wumpus - Wednesday, September 23, 2009 - link

    The hate is having to pay a premium for graphics that suck. Intel has been claiming it would take over graphics since before 3dfx departed from the scene. The idea of not only losing a core or two, but losing it to an Intel GPU, will make all but the most committed fanboys run to AMD (or possibly current i7s and NVIDIA).

    On the other hand, Intel might realize this (actually, I assume most of them do; the question is what happens to the poor saps who say the emperor has no clothes). If most home Dells come with some sort of Larrabee thrown in, even if it's no more than a few extra x86 cores, it will allow software to be written for a least common denominator that can run plenty of (low-power) threads. Eventually, this will make a big difference.

    Just don't expect to ever use that GPU (and get anything but "minimum suggested hardware" performance).
  • TA152H - Tuesday, September 22, 2009 - link

    The same reason the market seems to hate it. It doesn't make a lot of sense for a lot of the market.

    I don't hate it, I just don't think it's a useful product in a lot of instances.

    There are sites like this that built it up a lot, and then didn't want to admit they were wrong, but, so far, it's not doing so well. Why would it?

    I think the technology will really take off, though, when they can integrate the graphics. The brain-damaged design right now seems to be because it's just an interim chip. Probably even Intel knew it sucked and wouldn't have much impact, but when they get the video on it, everything will change. The inferior PCIe implementation all of a sudden stops being bad, since you have video on board. The price goes down a lot; power use, already good, goes down a lot (considering you need a discrete card now); and all of a sudden you have an inexpensive, low-power system with outrageous performance in that segment.

    It's like looking at Florida in College Football. They're great for a college team, but if they play in the pros, they'd be abused. As soon as the Lynnfield, or really what follows it, finds its proper place as a low cost, low power system (Celeron), it's going to be extremely competitive, and I think almost unavoidable for a lot of the market. As it is, it's just not a useful platform for a lot of people. Anyone with brains will get a 920 and overclock it, and get better performance, better flexibility, and better upgradeability. Anyone without any will not know that the Lynnfield is faster than the Athlon, and will not want to pay as much for a 2.66 GHz chip when they can get a 3.2 GHz chip from AMD for the same price.

    Yes, people who know more will understand, but they are going to get Bloomfield. So, where is the market? The i5 750, sure, it's a nice price point, and a decent processor. But, really, it's not a big segment. Once they get onboard video, I think it's going to make Core 2 instantly obsolete, and, because it's 32nm, and they are only using one chip for the chipset, they probably can sell it inexpensively.

    So, yes, I think Lynnfield is basically crap. I think it's an interim chip without much use as it is, at the price it is (except for the i5 750). But I think the next iteration that uses this technology and adds video will address a large segment very well. But that's the future. For the present, I don't think Lynnfield really matters much. I think AMD outdid them with the $99 quad Athlon.
  • tim851 - Wednesday, September 23, 2009 - link

    How do you know that the market hates it? Lynnfield has only been available for 2 weeks. You sure seem to feel the need to defend your 920 purchase. Yes, you paid a couple of bucks more for the same performance; just get over it.

    And I wouldn't call AMD's $99 quad-core "outdoing" anybody. What they are doing is 1) ruining themselves some more and 2) killing their price/performance for years to come. I know they don't have options, but they're right back where they were with the last of the K6s - they're the buy-'em-if-you-need-it-cheap company.
