The Tenderloin and the Two Buck Chuck

As for the idea of Intel integrating a GPU onto their CPUs, NVIDIA painted a rather distasteful picture of mixing together something excellent with something incredibly subpar. The first analogy Jen-Hsun pulled out was one of someone's kid topping off a decanted bottle of '63 Chateau Latour with an '07 Robert Mondavi. In NVIDIA's view, the idea of Intel combining its very well engineered CPUs with its barely passable integrated graphics is an aberration to be avoided at all costs.

This isn't to say that CPUs and GPUs shouldn't work together, but that Intel should stick to what it knows. In fact, NVIDIA heavily pushed the idea of heterogeneous computing while decrying the notion that taking a system block diagram and drawing a box around the CPU and GPU would actually accomplish anything useful. NVIDIA clearly wants its hardware to be the manycore floating point compute engine paired with Intel's multicore general purpose processors, and it tried to paint a picture of a world where both are critical to any given system.

Certainly both CPUs and GPUs are needed today, and unless Intel can really pull out some magic, that won't change for the foreseeable future. NVIDIA made a big deal of relating the pair to Star Trek technology: you need both your impulse engines and your warp drive. Neither is useful for the task the other is designed for: short range navigation can't be done with a warp drive, and impulse engines aren't suitable for long distance travel requiring faster than light speeds. The bottom line is that hardware should be designed and used for the task that best suits it.

Again, this says nothing about what happens if Intel brings a competitive manycore floating point solution to market. Maybe the hardware Intel designs will be up to the task, and maybe it won't. But Jen-Hsun really wanted to get across the idea that the current incarnation of the CPU and the current incarnation of Intel's GPU technology are nowhere near sufficient to handle anything like what NVIDIA's hardware enables.

Coming back to the argument that it's best to stick with what you know, Jen-Hsun stated his belief that "you can't be a great company by doing everything for everybody;" that Intel hardware works fine for running operating systems and for applications where visualization is not a factor at all: what NVIDIA calls Enterprise Computing (in contrast to Visual Computing). Going further, he postulated that "the best way for Google to compete against Microsoft is not to build another operating system."

Making another backhanded comment about Intel, Jen-Hsun later defended NVIDIA's recent loss of market share in low-end notebook graphics. He held that the market just wasn't worth competing in and that other companies offered solutions that fit it better. Defending NVIDIA's absence from this segment, he doesn't say to himself: "Jen-Hsun, when you wake up in the morning, go steal somebody else's business," but rather "we wake up in the morning saying, 'ya know, we could change the world.'"

Comments

  • segerstein - Saturday, April 12, 2008 - link

    I read the article, but I wasn't wholly convinced by the arguments made by the CEO. As we have seen with the Eee PC and other low cost computers, current technology has been about serving the first billion people. Most people still don't have computers because they are too expensive for them.

    Nvidia, not fully addressing even the first billion because of its expensive discrete solutions, will see its market share shrink. Besides, there are many consumer electronics devices that would benefit from a low powered "system-on-a-chip".

    Intel has Atom plus a chipset, and AMD bought ATI precisely because it wants to offer a low powered "system-on-a-chip" (but also multicore high performing parts).

    It would only make sense for Nvidia to buy VIA; the VIA Isaiah processor seems promising. This way they could cater to a smaller high-end market with discrete solutions and to a growing market for low cost integrated solutions.
  • BZDTemp - Saturday, April 12, 2008 - link

    Seems Nvidia does not like to be on the receiving end.

    I do remember Nvidia spreading lies about PowerVR's Kyro 3D cards back when it looked like they might have a chance to become the third player in 3D gaming hardware.

    With ATI/AMD in crisis I think it's great that Nvidia and Intel are starting to compete, even though I sincerely hope ATI/AMD comes back strong and kicks both their asses. After all, I can't recall the red/green guys using unfair tactics, and I'd like to see integrity rewarded.

    Finally, I would like AnandTech to be more critical when reporting from such venues. Try googling Kyro, Nvidia, and pdf to find the old story, or just check out the pdf directly: ftp://ftp.tomshardware.com/pub/nvidia_on_kyro.pdf
  • duron266 - Saturday, April 12, 2008 - link

    "Jensen is known as a very passionate, brilliant and arrogant guy but going against Intel on a frontal full scale might be the worst thing that they ever decided. Nvidia went from close to $40 to current $19.88 which means that the company has to do something to fix this but this is simply too much."
  • duron266 - Friday, April 11, 2008 - link

    NVIDIA...too high-profile,

    if they were going to vanish,

    Jen-Hsun would be the number one to blame...
  • anandtech02148 - Friday, April 11, 2008 - link

    There's a huge difference between audio being processed on a manycore CPU like Intel's and on a standalone PCI card.
    With the PCI card in, you can feel the CPU being less bogged down, and the motherboard chipset generates less heat.

    An integrated GPU, audio, and many cores doesn't solve the problem; there will be bandwidth issues too.
    Nvidia should hit Intel hard with a low powered, high performance GPU to prove a point.

  • epsilonparadox - Friday, April 11, 2008 - link

    NVidia will never be able to compete with Intel in the low power arena. Intel just has a better process and fabs for that process; NVidia has other companies build its chips. Plus, graphics chips don't move to a new process the way CPUs do.
  • poohbear - Friday, April 11, 2008 - link

    Very nice article, but how many of us are gonna understand the analogy:

    "someone's kid topping off a decanted bottle of '63 Chateau Latour with an '07 Robert Mondavi."

    wtf is that?!? I'm guessing he's talking about wine, but whatever.
  • kb3edk - Friday, April 11, 2008 - link

    Well of course it's a wine reference; consider the audience: institutional investors. These are people much more likely to spend $500 of disposable income on a bottle of Chateau Something-Or-Other than on a GeForce 9800 GTX.

    Also note the Mondavi reference: the winery is in Napa, just on the other side of San Fran from nVidia HQ in Silicon Valley.

    And it's still a bit odd seeing such strong words from nVidia against Intel considering that nVidia/Intel is the main enthusiast platform out there these days (as opposed to an all-AMD solution).

  • Khato - Friday, April 11, 2008 - link

    Really quite enjoyed this, makes me all the more confident in the products Intel is currently developing.

    I mean really, how similar does NVIDIA's ranting sound compared to AMD's back when they were on top? No question that they're more competent than AMD, but they've done just as good a job at awakening a previously complacent beast at Intel. Heh, and they've never had to compete with someone that has a marked manufacturing advantage before...
  • tfranzese - Sunday, April 13, 2008 - link

    Intel is no beast in these parts. Their track record in the discrete segment and in drivers to this day is complete failure. Until they execute on both the hardware and the software, both monumental tasks, they'll continue to be right where they are in the discrete market (i.e. nowhere).
