NVISION 08 - Jen-Hsun Talks Larrabee, Mobility, VIA and More
by Anand Shimpi & Larry Barber on August 26, 2008 3:00 PM EST
Posted in: Trade Shows
The Larrabee Question
Let's talk about Larrabee.
NVIDIA pointed out that Larrabee's x86 isn't binary compatible with other Intel x86 processors (since it doesn't support any of the SSE extensions) - so there's no compatibility advantage there.
Honestly, x86 today is a burden for Larrabee, not a boon; it isn't the most desirable ISA from anything other than a compatibility standpoint. The real difference between G2xx and Larrabee is in the programming model, not the ISA. The developer complaints about G2xx are really about its threading model.
NVIDIA says that it simply takes a new approach to development - focusing on data in and data out, rather than conventional top-to-bottom function coding. The issue is that programmers don't like to change the way they work.
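To make that contrast concrete, here's a minimal sketch of the "data in, data out" style NVIDIA is describing, written against the public CUDA runtime API. Nothing in it comes from the article itself; the kernel, array size, and scale factor are hypothetical and chosen only to show how the GPU version describes what happens to a single element while the conventional CPU version walks the whole array from top to bottom.

```cuda
#include <cuda_runtime.h>
#include <stdio.h>

// Conventional "top to bottom" CPU version: one thread walks the whole array.
void scale_cpu(const float* in, float* out, float k, int n) {
    for (int i = 0; i < n; ++i)
        out[i] = in[i] * k;
}

// "Data in, data out" CUDA version: the kernel only describes the
// transformation of one element; the GPU runs one thread per element.
__global__ void scale_gpu(const float* in, float* out, float k, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        out[i] = in[i] * k;
}

int main() {
    const int n = 1 << 20;                 // hypothetical problem size
    size_t bytes = n * sizeof(float);

    float *h_in  = (float*)malloc(bytes);
    float *h_out = (float*)malloc(bytes);
    for (int i = 0; i < n; ++i) h_in[i] = (float)i;

    float *d_in, *d_out;
    cudaMalloc((void**)&d_in,  bytes);
    cudaMalloc((void**)&d_out, bytes);
    cudaMemcpy(d_in, h_in, bytes, cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover all n elements.
    int threads = 256;
    int blocks  = (n + threads - 1) / threads;
    scale_gpu<<<blocks, threads>>>(d_in, d_out, 2.0f, n);

    cudaMemcpy(h_out, d_out, bytes, cudaMemcpyDeviceToHost);
    printf("out[10] = %f\n", h_out[10]);   // expect 20.0

    cudaFree(d_in); cudaFree(d_out);
    free(h_in); free(h_out);
    return 0;
}
```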
The real question is: when Larrabee ships, will its threaded programming model be significantly easier than G2xx's? At this point it's simply too early to tell. Intel thinks it will be, and many of the developers I've spoken to agree, but NVIDIA keeps arguing that Larrabee's programming model will be just as different as CUDA, and that NVIDIA has the inherent advantage here thanks to the experience it has gained building GPUs over the past 15 years.
Would NVIDIA Integrate a CPU?
David Kirk summarized, quite well, his thoughts on whether NVIDIA would ever pursue putting a CPU on die next to one of its GPUs.
Kirk's view is that at the low end there's a place for a single-chip CPU/GPU; he views integration (rightly so) as a low-cost play. "None of our customers ask us for less performance, why would we ever take away part of our GPU and put a CPU in it?"
NVIDIA currently competes in the low end of the GPU market with its sub-$75 GPUs and IGP chipsets. The integrated CPU/GPU stands a real chance of eating into NVIDIA's highest-volume market, and it doesn't look like NVIDIA has much of a chance of competing there - at least in x86 desktops and notebooks. Why would you pay more for an NVIDIA chipset with integrated graphics if you already get integrated graphics with every single CPU you buy?
We've got a future where AMD/Intel ship these hybrid CPU/GPUs on the low end, GPUs like the RV770 and Larrabee at the high end, and NVIDIA is already being pushed out on the chipset side (neither Intel nor AMD wants to be the #2 manufacturer of chipsets for their own CPUs). In the worst case scenario, if NVIDIA gets squeezed by everything I just mentioned over the next few years, what is NVIDIA's strategy going forward? Jen-Hsun actually highlighted one possible direction...