NVIDIA’s Position

The mood around NVIDIA's offices the day of the AMD/ATI announcement was one of joy.  There was no tone of worry or words of concern; you just got the feeling that every NVIDIA employee you talked to had a grin on their face that day.  As far as NVIDIA is concerned, the move is ATI admitting defeat: coming to terms with the fact that it cannot compete with NVIDIA and needs to get out.  Obviously, we are talking about ATI's competitor here, and there is a certain amount of PR spin going on, but let's take a closer look at the NVIDIA perspective.

The Graphics Market is Ours

NVIDIA basically feels ATI is ceding control of the GPU market.  It believes AMD will not be as interested in cutthroat competition with NVIDIA as ATI was.  If the deal goes through, NVIDIA will become the sole independent provider of GPUs to the industry. 

To NVIDIA, this merger doesn't have anything to do with on-die GPUs, as AMD didn't need to buy a company to accomplish that.  Intel has been manufacturing "good enough" integrated graphics chipsets for years, and it also tried to go down the path of an on-die GPU with the failed Timna project.  For those of you who don't remember, Timna was going to be Intel's system-on-a-chip, complete with on-die graphics, designed to target the sub-$600 PC market.  However, just before Timna's release, the plug was pulled as the average selling prices of PCs were in free fall with no bottom in sight.  The Timna team was then tasked with creating an efficient mobile processor called Banias, which eventually led to the development of Intel's latest Core 2 processors. 

From NVIDIA's standpoint, the acquisition also has nothing to do with gaining the ability to use more custom logic or being able to manufacture GPUs in house.  For starters, both ATI and NVIDIA can produce their own custom-logic-driven GPUs if they put in the resources; AMD isn't the only one that can design custom logic.  And from the manufacturing standpoint, since the fabs will always give preference to CPUs and leave GPUs to the older process lines, there's not really any disadvantage to simply manufacturing at a third-party foundry like TSMC. 

What if Intel Makes GPUs?

From NVIDIA's perspective, the last time it checked, none of its GPU designers had been hired by Intel, so unless that changes, NVIDIA will continue to be a dominant leader in the GPU business regardless of whether or not Intel decides to enter the discrete graphics market.  While Intel can integrate very low-end graphics cores with its chipsets, it's not as easy to do the same with mid-range or high-end graphics. 

The AMD/ATI acquisition leaves NVIDIA as the only independent GPU/platform company that provides technology to both AMD and Intel.  In the words of NVIDIA's Dan Vivoli, Executive VP of Marketing: "We're comfortable competing with Intel in the GPU market.  It's our home court."  Vivoli added, "Last I checked, none of our nForce engineers are part of the merger.  We will have to continue to build leading MCP technology just like before." 

Vivoli's attitude towards the situation is quite pragmatic.  Basing corporate direction on the potential threat of a company like Intel, which has never been able to produce a successful high-performance GPU, would be a bit panicky.  He added in closing, "Likewise, nothing has changed on the Intel front.  We will continue to have to innovate in order to offer better value than Intel.  It's a world we are comfortable and familiar with." 

Two Wrongs don't make a Right

The way the market currently stands is this: Intel and NVIDIA are the stronger marketing companies, and they have a history of better execution than their competitors.  The acquisition combines the two runners-up in their respective industries; to expect the outcome to be one tremendous power is wishful thinking at best. 

NVIDIA has been building great technology for years now, and none of its engineers went to ATI/AMD, so what is fueling the belief that AMD/ATI can somehow build and execute this perfect chipset?  Remember that AMD, unlike Intel, is far more open with its partners, making NVIDIA's job of designing chipsets for AMD CPUs much less difficult than it is on the Intel side of things.  AMD has publicly stated that the ATI acquisition will not change how it works with NVIDIA, so as long as that holds true, NVIDIA should be able to continue to build world-class products for AMD. 

AMD Still Needs Us (and we need AMD)

Currently NVIDIA ships many more AMD chipsets than Intel chipsets, but that may begin to change with the release of Intel's Core 2 processors.  In the future, competition for AMD platforms may end up resembling the desktop GPU market, with ATI and NVIDIA each taking roughly 50% of AMD chipset sales. 

How do you like our Brands?

Not only does AMD still need NVIDIA, but Intel does too.  NVIDIA has a number of very strong brands that face little real competition from anyone else: GeForce, nForce, Quadro and SLI are all names NVIDIA is well recognized for, and that will continue for the foreseeable future.  Unless Intel can come out with its own high-end graphics and multi-GPU offerings, it needs NVIDIA's support.  NVIDIA also understands that dining with Intel is much like dining with the devil: the food may be great, but you never know what else is cooking in the kitchen.  NVIDIA will surely have a plan of its own to remain competitive in the worst-case scenario, and partnering with Intel isn't in the cards. 

Comments

  • johnsonx - Thursday, August 3, 2006

    Yep, you two are both old. Older than me. Heath H8? I didn't think selling candy bars would pay for college. You actually had to build candy bars from a kit back then? Wow. ;)

    Mostly the 'kids' comment was directed at your esteemed CEO, and maybe Kubicki too (who I'm well aware is with Dailytech now), and was of course 99.9% joke. Anand may be young, but he's already accomplished a lot more than many of us ever will.
  • Gary Key - Thursday, August 3, 2006

    where is the edit button... led to
  • PrinceGaz - Wednesday, August 2, 2006

    Well, according to ATI's investor relations webby and also Wikipedia, they were founded in 1985 and started by making integrated graphics chips for the likes of IBM's PCs, and by 1987 had started making discrete graphics cards (the EGA Wonder and VGA Wonder).

    Yes, they quite obviously do predate the 3D revolution by many years. VGA graphics date from 1987, and no doubt the VGA Wonder was one of the first cards supporting it. I imagine the EGA Wonder card they also made in 1987 would have had the 9-pin monitor connection you mention, as that is the EGA standard (I've never used it, but that's what the Wiki says).

    All useless information today really, but a bit of history is worth knowing.
  • johnsonx - Wednesday, August 2, 2006

    Yep, I stuck quite a few EGA and VGA Wonder cards in 386's and 486's back then. They were great cards because they could work with any monitor. Another minor historical point: monochrome VGA was common in those days too - better graphics ability than old Hercules Mono, but hundreds of dollars less than an actual color monitor.
  • yacoub - Wednesday, August 2, 2006

    Your comment should get rated up b/c you correctly state that ATI has been around for some time. Let us also not forget that NVidia bought 3dfx; 3dfx did not simply disappear. And Matrox, while mostly focused on the graphic design / CAD market with its products, has also survived its forays into the gaming market with products like the G200 and G400. Perhaps something about basing your graphics card company in Canada is the trick? :)
  • johnsonx - Wednesday, August 2, 2006

    Well, 3dfx was dead. NVidia was just picking at the carcass. Matrox survives only because they make niche products for professional applications. Their 3D products (G200/G400/G450, Parhelia) were hotly anticipated at the time, but quickly fell flat (late to market, surpassed by the competition by the time they arrived, or very shortly after).
  • mattsaccount - Wednesday, August 2, 2006

    >>NVIDIA also understands that dining with Intel is much like dining with the devil: the food may be great but you never know what else is cooking in the kitchen.

    The food in Intel's cafeteria is actually quite good :)
  • stevty2889 - Wednesday, August 2, 2006

    Not when you work nights... it really sucks then.
  • dev0lution - Thursday, August 3, 2006

    But the menu changes so often you don't get bored ;)
  • NMDante - Wednesday, August 2, 2006

    Night folks get shafted with cafe times.
    That's probably why there are so many 24 hr. fast food offerings around the RR site. LOL
