Introduction

Hot on the heels of the launch of their 9800 series products, NVIDIA is holding a Financial Analyst Day. These are generally not filled with the technical glitz and glitter of an Editors' Day, but the announcements and material covered are no less important to NVIDIA as a company. NVIDIA has an unusually high institutional ownership rate at 84% (versus 79% and 66% for AMD and Intel respectively), so the company holds these Analyst Days in part to keep its institutional investors happy and well informed about the company's progress.

As far as we members of the press are concerned, however, Analyst Days are a valuable chance to learn about the GPU market: anything that could impact the bottom line helps us understand NVIDIA's direction, motivation, and even the reasoning behind some of the engineering decisions the company makes. Today saw a lot of posturing for battles to come, and we were not disappointed.

Waking up the Beast

Most of the morning was dedicated to NVIDIA taking some time to do a little PR damage control. They've stepped out to defend themselves against the doom and gloom statements of other players in the industry. With Intel posturing for a move into the graphics market and proclaiming the downfall of rasterization and discrete graphics at the same time, NVIDIA certainly has reason to address the matter.

And we aren't talking about some standard press release boilerplate filled with fluffy marketing speak. This time, Jen-sun Huang, the man himself, stepped out front and addressed some of the concerns others in the industry have put forth. And he was out for blood. We don't get the chance to hear from Jen-sun too often, so when he speaks, we are more than happy to listen.

One of the first things that Jen-sun addressed (though he didn't spend much time on it) is the assessment by Intel's Pat Gelsinger that rasterization is not scalable and won't suit future demands. He largely dismissed the statement as "wrong and pointless to argue about," but the aggregate of the arguments made over the day all relate back to it. The bottom line seems to be that Intel's current approach to graphics can't scale fast enough to meet the demands of future games, but that says nothing about NVIDIA's and AMD's solutions, which are at least one if not two orders of magnitude faster than Intel graphics right now. In fact, at one point Jen-sun said: "if the work that you do is not good enough … Moore's law is your enemy."
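The arithmetic behind that quip is worth spelling out. A minimal sketch, with an assumed 100x starting gap and an 18-month doubling period (both illustrative figures, not measured ones): if both vendors ride the same process-scaling curve, the relative gap never closes on its own, so the trailing party needs architectural gains above and beyond Moore's law.

```python
# Back-of-the-envelope sketch of the "Moore's law is your enemy" point:
# if both vendors ride the same scaling curve, a large starting
# performance gap never closes on its own. The 100x gap and 18-month
# doubling period are illustrative assumptions, not measured figures.
def perf_after(years, base=1.0, doubling_years=1.5):
    """Relative performance after `years`, doubling every `doubling_years`."""
    return base * 2 ** (years / doubling_years)

gap = 100.0  # assume a ~two-orders-of-magnitude starting deficit
for years in (0, 3, 6):
    ratio = perf_after(years, base=gap) / perf_after(years)
    print(f"after {years} years the gap is still {ratio:.0f}x")
```

Both sides quadruple their performance every three years, so the ratio between them stays fixed at 100x throughout.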

This seems as good a time as any to address the tone of the morning. Jen-sun was very aggressive in his rebuke of the statements made against his company. Many times he talked about how inappropriate it is for larger companies to pick on smaller ones through the use of deceptive marketing tactics (ed: Intel is 11.5 times as large as NVIDIA by market cap). To such attacks, he says "It's just not right!" and "we've been taking it, every single fricking day… enough is enough!" NVIDIA, Jen-sun says, must rely on the truth to carry its message in the absence of massive volumes of marketing dollars.

Certainly, things can be true even if they paint a picture slightly different from reality, but for the most part what Jen-sun said made a lot of sense. Of course, it mostly addresses reality as it is today and doesn't speculate about what may happen when Larrabee hits the scene or if Intel decides to really go after the discrete graphics market. And rightly enough, Jen-sun points out that many of Intel's comments serve not only to spread doubt about the viability of NVIDIA, but will also have the effect of awakening the hearts and minds of one of the most tenaciously competitive companies in computing. Let's see how that works out for them.

Tackling the Market Share Myth
Comments

  • Wiz33 - Wednesday, June 4, 2008 - link

Intel has no bargaining power in the gamer circle. Even if they withheld licensing for the next-gen platform, gamers would just stay with the current-gen chipset for NVIDIA SLI, since games are usually much more GPU-bound than CPU-bound.

In my case, I'm a serious gamer (but FPS-lite). I just clocked over 40 hours on Mass Effect PC since installing it last Thursday evening. With my current setup, an E6750 and 8800 GTS, I still have tons of upgrade path in both CPU and GPU without moving onto the next Intel platform.
  • sugs - Sunday, May 11, 2008 - link

    As an IC designer, I can tell you right away that 3D graphics on the scale of the products that NVidia/ATI produce is not easy. Just look at the demise of Matrox, S3 and others.

I think Intel is going to have problems getting the performance of their offerings to a competitive level in the near future, but they do have a lot of resources and it might be different 5 years down the line.
  • kenour - Tuesday, April 15, 2008 - link

    Dear Jen-sun,

All Intel wants is SLI on their chips (AS DO A LOT OF GAMERS)... so neck up, you arrogant little prick, and license it to them! Don't come out with your little chest puffed out playing the tough guy! If you licensed SLI technology to Intel so their high-end chipsets would support SLI (officially, without having to use hacked drivers) for say $50 US, and Intel SLI-enabled all their X38/X48 boards, imagine the money that would come in. But you're too busy trying to hold on to the pathetic market share of your pathetic chipsets. There are so many gamers like me out there who would gladly purchase a second high-end NVIDIA card and SLI them, but won't, because there is no way we would use an NVIDIA chipset... I would pay a $50 US premium on a mobo to have SLI on an Intel chipset, and then I would buy another high-end card. So put your pride aside and give them (AND US) what they want! More money for you, better gaming platform for us.

    Lots of Love,

    Kenour.

p.s. Yes, I'm still pissed off about the rumour that SLI would be available on the X38 :P It was reported here and on Tom's from memory, then retracted a week later... That was the happiest week of my life :P (well, in regards to the PC world).
  • ielmox - Wednesday, April 16, 2008 - link

    I think nVidia is holding on to SLI as a marketing gimmick, because SLI doesn't make economic sense except for an extremely small market of wealthy and elitist gamers. I don't see any real value to SLI aside from the bragging rights of somewhat increased performance at a huge cost, and I think nVidia's strategy is guided by this knowledge.

SLI uses a lot more power, generates much more heat, is buggier, and is harder to set up, all while offering diminishing returns compared with a dual-GPU or even single-GPU card. In fact, unless you're SLI'ing the latest and greatest cards, you are better off with a non-SLI setup. Realistically, only a very tiny minority of gamers would ever go for an SLI setup, so I'm guessing nVidia understands there is not much potential for financial gain.

    SLI is a bit of a white elephant to most people.
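The diminishing-returns argument above is easy to put in numbers. A quick sketch, where the $300 card price, 60fps baseline, and ~1.6x SLI scaling factor are illustrative assumptions rather than benchmarked figures:

```python
# Illustrative cost-per-frame arithmetic behind the diminishing-returns
# argument against SLI. The $300 card price, 60fps baseline, and ~1.6x
# SLI scaling factor are assumptions for the sketch, not measured data.
def cost_per_frame(price_usd, fps):
    """Dollars spent per frame-per-second of performance."""
    return price_usd / fps

single = cost_per_frame(300.0, 60.0)     # one high-end card
sli = cost_per_frame(600.0, 60.0 * 1.6)  # two cards, imperfect scaling
print(f"single: ${single:.2f}/fps, SLI: ${sli:.2f}/fps")
```

Under these assumptions, doubling the spend buys only 60% more frames, so each frame costs noticeably more in the SLI configuration.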

  • gochichi - Monday, April 14, 2008 - link

    The intel/nvidia combo is totally the "it" combo in computer gaming and has been for some time. AMD is working on "tidy-little-packages" with their new integrated graphics platform that can just about "game" right now, not in 2010.

NVIDIA, not Intel, is the company that needs to be working on an Intel platform equivalent in the integrated sector.

I am glad to be an NVIDIA customer, and I am also glad to see they're not taking cheap shots at AMD. They even came out kind of defending AMD, which is understandable; both are smaller companies and both respect each other's products.

    I can just picture it now: AMD laptops with synergy for $500 or less and no equivalent Intel solution due to a lack of cooperation with Nvidia.
  • perzy - Monday, April 14, 2008 - link

Well, the thing is, I think that Intel has no choice. The x86 CPU is DEAD. The heat wall keeps the frequency down (seen any 4GHz chips lately?),
and they can't keep adding another core forever. Intel is in dire PANIC, believe me. They must branch out, and the GPU, PPU, and maybe a little audio PU are the chips with any development years left in them.
And no, there are no quantum or laser chips yet...
Come on, if a blond guy from Sweden like me can understand this, why don't you spell it out for everybody?

  • Galvin - Monday, April 14, 2008 - link

Actually, hitting 4GHz would be easy for Intel. Hell, a lot of people get those things to 4GHz on air.

So yeah, they could do 4GHz if they wanted to :)
  • perzy - Tuesday, April 15, 2008 - link

So do you think that Intel is content and everything is going according to plan? We should be at 10GHz now according to that plan, and using the NetBurst architecture...
The 3.8GHz P4 was so hot that Intel had to ship it with expensive thermal paste. Otherwise it would throttle constantly.
It's strange to me that everybody (hardware sites, for example) seems to think this heat thing is a little snag, a bump in the road. It isn't!
'Oh lookey, now I get 2 cores for the price of one. How nice!'
The chipmakers are trying to hide the crisis they're in. (Stock prices...)
Why else do they buy GPU and PPU makers?
  • Galvin - Tuesday, April 15, 2008 - link

I don't think Intel has a leg to stand on in the graphics market.

The point I was making is that if Intel wants to sell Core Duo at 4GHz, it's very doable, since people can clock these to 4GHz today on air cooling. That's the only point I was making.

  • Galvin - Sunday, April 13, 2008 - link

I listened to the whole presentation.
NVIDIA has a whole computer on a chip. I didn't even know they had this. I was impressed; this will be nice for mobile devices. We'll have to wait and see where this goes.

CUDA — I've known about it for as long as anyone else. I can't wait till compressors for zip, encoding, etc. all become real-time. That's something no CPU will ever pull off.

We all know Intel is weak in graphics, and Intel has tons of cash. I don't think NVIDIA is going anywhere, and they'll most likely get bigger in time.

There are only 2 companies in the world that can make this kind of graphics technology: AMD and NVIDIA. To claim that Intel can just magically make a GPU to compete in a few years is crazy, IMO.
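The real-time compression hope in the comment above rests on data-parallelism: split the input into independent blocks so many workers can compress them simultaneously. A CPU-side sketch of that idea, where zlib and the 64KB block size are illustrative stand-ins rather than anything NVIDIA has announced (on a GPU, thousands of threads would play the role of the pool workers):

```python
# A CPU-side sketch of the data-parallel idea behind GPU-accelerated
# compression: split the stream into independent blocks so many workers
# can compress them at once. zlib and the 64KB block size are
# illustrative stand-ins, not an actual CUDA implementation.
import zlib
from concurrent.futures import ThreadPoolExecutor

def compress_blocks(data, block_size=64 * 1024):
    """Compress fixed-size chunks of `data` in parallel."""
    blocks = [data[i:i + block_size] for i in range(0, len(data), block_size)]
    with ThreadPoolExecutor() as pool:
        return list(pool.map(zlib.compress, blocks))

def decompress_blocks(compressed):
    """Reassemble the original stream from compressed chunks."""
    return b"".join(zlib.decompress(c) for c in compressed)

payload = b"analyst day coverage " * 20000  # ~420KB of sample data
print(decompress_blocks(compress_blocks(payload)) == payload)
```

The trade-off in this scheme is that each block is compressed independently, so the ratio is slightly worse than a single-stream compressor; the win is that throughput scales with the number of workers.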
