Jen-Hsun Talks about "Bad Chips"

You may have heard that NVIDIA has had issues with higher-than-expected GPU failure rates in notebooks. The parts that seem to be impacted are G84/G86 based GPUs, and the failure appears to be related to the physical manufacturing of the GPU itself.

A reporter from The Inquirer asked Jen-Hsun to clarify which other GPUs might be affected by the manufacturing issue and, more broadly, to state publicly what users can expect.

Jen-Hsun said that he was the first to admit that this problem existed, having set aside $200M to deal with any potential repairs that had to be made. He characterized the issue as follows:

"We know that there are some failures that are associated with our chips. We know that it's related to specific combinations of the chip, the design of the notebook...depending on the design of the thermal solution...and all of the software that goes on top of it...sometimes it will fail. Most of the notebooks are fine...certain notebooks have this problem."

There isn't an official recall, but if your notebook fails and it's got a GPU in it that NVIDIA agrees may be problematic, your OEM should repair it for you.

The question Jen-Hsun didn't answer is which specific GPUs this problem impacts, and whether or not it extends to the entire line of G8x GPUs, desktop and mobile.

Jen-Hsun did mention that the problem is very specific and can crop up over a long period of time. While NVIDIA's competitors are aware of what caused the problem, none of them appear to be impacted by it (even those that manufacture at the same facilities as NVIDIA); it seems to be something exclusive to NVIDIA.

Who is to blame? According to Jen-Hsun both NVIDIA and its OEM partners are to blame for the issue, although he would not go as far as to blame TSMC, its manufacturing partner.

What about Lucid?

What does NVIDIA think about Lucid and Hydra? Well, some good things and some bad things. Of course, no one I asked gave me anywhere near an answer about how they thought SLI would be impacted if Lucid met their goals. And there's really no good way to dodge that question either; every attempt ends in a trailing off into other topics to change the subject.

Some people were a little more willing to talk about the technology itself, even if they didn't go near how it would impact their platform position.

Jen-Hsun flat out said he thought it is naive of Lucid to believe that their solution can divide the workload effectively and achieve linear scaling while only taking the API into account (i.e., regardless of the application). He believes that you must tune for specific applications, and that even high compatibility is difficult to achieve, let alone multi-GPU scaling.

David Kirk had a similar take on things but went into a little more depth. His issue is that even if you can split up the workload, there are resources that every GPU working on a scene will need in order to render it properly. Things rendered to textures, cube maps, and other "funny buffer games" mean that rendering isn't cleanly separable. Some work simply has to happen on all GPUs, and that keeps performance from scaling linearly.
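Kirk's argument is essentially Amdahl's law applied to frame rendering: any pass that must run on every GPU (render-to-texture, cube maps) caps the achievable speedup. A minimal sketch of that reasoning, with purely illustrative numbers (the shared fraction here is an assumption, not a measured figure):

```python
# Toy model of the "funny buffer games" objection: if some fraction of
# a frame's work (render-to-texture passes, cube maps) must be executed
# on every GPU, total speedup is bounded Amdahl's-law style.

def frame_speedup(shared_fraction, gpu_count):
    """Speedup when 'shared_fraction' of frame time runs on every GPU
    and the remaining work divides evenly across 'gpu_count' GPUs."""
    return 1.0 / (shared_fraction + (1.0 - shared_fraction) / gpu_count)

# With 20% of the frame in shared passes, two GPUs land well under 2x,
# and adding GPUs shows rapidly diminishing returns.
for gpus in (1, 2, 3, 4):
    print(gpus, round(frame_speedup(0.20, gpus), 2))
```

With no shared work the model scales perfectly linearly, which is exactly the scenario Kirk argues real games don't match.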

Which makes sense. But the Lucid guys still keep saying they can get linear scaling regardless of the game on any platform that uses their hardware. Which sounds nice, even if it sounds a little far fetched. They did show off two games at IDF, but we would really like to see more and to have the opportunity to test its epic scaling claims.

It's tough to tell whether NVIDIA is just being cocky and writing off a company it doesn't think can pull something off, or if NVIDIA's reasoning is rock solid enough that it really doesn't have to worry. Time will tell, of course.

Personally, I like to dream big. But it is easy to be skeptical about Lucid because their claims are so dramatic. Like they say about anything that sounds too good to be true ... But really, I still want someone to tell me what happens to SLI and the NVIDIA chipset business if Lucid's product really delivers on its promises.

Final Words

So far we haven't been too impressed with NVISION, but access to folks like Jen-Hsun and David Kirk has thus far been worth it.

We're off to hear Epic's Tim Sweeney speak about Unreal Engine 3, check back later for more updates from NVISION.

Comments (18)

  • Gary Key - Wednesday, August 27, 2008 - link

    End of next week or Monday 9/8 for 790GX plus 780a comparison/update, retested with the 8.8 drivers this week and they changed the scope and tone of the story we had almost completed. The G45 will be up right before it, just to show a comparison on where Intel is at this point, which honestly is not far considering the driver and repeater problems.
  • Theunis - Thursday, August 28, 2008 - link

    Don't forget 790GX and G45 on Linux tests! Man I'm getting worried about Linux being left in the dark when it comes to hardware decoding for H.264 :(
  • tayhimself - Tuesday, August 26, 2008 - link

    Flat out stating that it wasn't too interesting. Nvidia are in a difficult position and playing their cards very close to their chest.
  • DigitalFreak - Tuesday, August 26, 2008 - link

The only reasons Nvidia is being forced out of the chipset market are:

    1) They're being assholes when it comes to SLI compatibility with non Nvidia chipsets. Neither Intel nor AMD need Nvidia chipsets anymore. Both have well designed products to cover their entire markets.

    2) Their chipset products are buggy as hell. When's the last time Nvidia released a chipset that didn't cause some type of data corruption? Nforce4? Nforce2?
  • DigitalFreak - Tuesday, August 26, 2008 - link

    As far as Lucid goes, do you really think Intel would be dumping boatloads of cash into this outfit if they didn't think the technology held promise? It's not going to cure world hunger, but it sounds like the Nvidia PR machine is spinning up 'cause they're getting worried.
  • Griswold - Wednesday, August 27, 2008 - link

Right. That's why Intel dumped billions and billions on the HUGELY successful Itanium, which was intended to eventually replace x86 in the consumer space as well. And it did! No, wait - it didn't...
  • JarredWalton - Tuesday, August 26, 2008 - link

    $50M isn't really "boatloads" to Intel - I think that's the value I heard in one of the reports? R&D is expensive, and if Hydra/Lucid ends up going nowhere Intel won't worry too much - they'll probably still get some patents and other interesting info from the whole process.