Final Words

Is NVIDIA in trouble? In the short term there are clearly causes for worry. AMD’s Eric Demers often tells me that the best way to lose a fight is by not showing up. NVIDIA effectively didn’t show up to the first DX11 battles, and that’s going to hurt. But as I said in the “things get better next year” section, they do get better next year.

Fermi devotes a significant portion of its die to features that are designed for a market that currently isn’t generating much revenue. That needs to change in order for this strategy to make sense.

NVIDIA told me that we should see exponential growth in Tesla revenues after Fermi, but what does that mean? I don’t suspect that the sort of customers buying Tesla boards and servers will be lining up on day 1. I’d say best case scenario, Tesla revenues should see a bump one to two quarters after Fermi’s launch.

Nexus, ECC, and better double precision performance will all make Fermi more attractive in the HPC space than Cypress. The question is how much revenue that will generate in the short term.
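To see why double precision matters so much to the HPC crowd, consider how quickly single-precision floats run out of resolution. The following is a minimal illustrative sketch (not from the article, stdlib only), using Python's native 64-bit float as a stand-in for the GPU's double-precision path:

```python
import struct

def to_f32(x: float) -> float:
    """Round a Python float (IEEE-754 double) to the nearest float32."""
    return struct.unpack('f', struct.pack('f', x))[0]

# float32 carries a 24-bit significand: beyond 2**24 it cannot represent
# consecutive integers, so a +1 update is silently rounded away.
big = float(2**24)                     # 16777216.0
print(to_f32(big + 1.0) == big)        # True: the +1 vanished in single precision

# In double precision (Python's native float) the update survives,
# which is why long-running scientific codes insist on 64-bit arithmetic.
print((big + 1.0) == big)              # False: 16777217.0 is representable
```

An accumulator drifting like this over billions of iterations is exactly the failure mode that pushes HPC buyers toward hardware with strong double-precision throughput.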


Nexus enables full NVIDIA GPU debugging from within Visual Studio. Not so useful for PC gaming, but very helpful for Tesla.

Then there’s the mobile space. NVIDIA could do very well with Tegra. NVIDIA is an ARM licensee, and that takes care of the missing CPU piece of the puzzle. Unlike the PC space, x86 isn’t the dominant player in the mobile market. NVIDIA has a head start in the ultra mobile space, much like it does in the GPU computing space. Intel is a bit behind with its Atom strategy. NVIDIA could use this to its advantage.

The transition needs to be a smooth one. The bulk of NVIDIA’s revenues today come from PC graphics cards. There’s room for NVIDIA in the HPC and ultra mobile spaces, but it’s not revenue that’s going to accumulate overnight. The changes in focus we’re seeing from NVIDIA today are in line with what it’d have to do in order to establish successful businesses outside of the PC industry.

And don’t think the PC GPU battle is over yet either. It took years for NVIDIA to be pushed out of the chipset space, even after AMD bought ATI. Even if the future of PC graphics is Intel and AMD GPUs, it’s going to take a very long time to get there.

Comments

  • Zool - Thursday, October 15, 2009 - link

    And what is so novel about Nvidia that others don't have? Oh wait, maybe the PR team, or the shiny, flashy Nvidia page that lets you believe that even the most useless product is the customer's neverending dream.
    I have to admit that AMD is way behind in those areas.
    I wouldn't say it's novelty, but it seems it's working for the shareholders.
  • jasperjones - Wednesday, October 14, 2009 - link

    Anand, while I generally agree with your comment, I believe there is one area where Nvidia has a competitive advantage: drivers and software.

    Two examples:
    - the GPGPU market. On the business side, double-precision (DP) arithmetic is of tremendous importance to scientists. GT200 is vastly inferior in DP arithmetic to R700, yet people bought Nvidia due to better software (ATI is also better in DP performance if you compare its enterprise GPUs to similarly-priced Nvidia GPUs). On the consumer side, look at the sheer number of apps that use CUDA vs the tiny number of apps that use Stream. If I look at other things (OpenCL or C/C++/Fortran support), I also see Nvidia ahead of AMD/ATI.

    - Linux drivers. AMD has stepped things up but Nvidia drivers are still vastly superior to ATI drivers.

    I know they're trading blows in some other areas closely related to software (Nvidia has PhysX, AMD has DX11), but my own experience still is that Nvidia has the better software.
  • medi01 - Friday, October 16, 2009 - link

    Pardon me, but isn't something like 95+% of the gaming market Windows, not Linux?
  • AtwaterFS - Wednesday, October 14, 2009 - link

    Where is Silicon Doc? Is he busy shoving a remote up his a$$ as Nvidia takes it up theirs? MUAHAHAHAHA!

    Seriously tho, I need Nvidia to drop price on 58xx cards so I can buy one for TES 5 / Fallout 4 - WTF!
  • tamalero - Sunday, October 18, 2009 - link

    he was banned, you forgot?
  • Transisto - Wednesday, October 14, 2009 - link

    I never play games but spend a lot of money (to me) and time on a folding farm.

    Winter is setting in and I plan on heating the whole house with these: 9600, 9800, and 260. So the timing is now!

    To me this is a bad time to invest into folding GPUs because the decisions are taking too much precious brainwidth.

    My worries are:

    Will there still be demand for low-end 9600 and GTS 250, even G200 cards, once the G300 comes out?

    And more importantly, will these antique G92 and G200 cards still be price-efficient at research computing?

    I was actually purchasing more GTX 260s because these could still sell after I upgrade to something better. But these days I'm more into buying used G92s on the cheap.

    Thank You.
  • The0ne - Wednesday, October 14, 2009 - link

    This doesn't come as a shock to the "few" of us who had believed NVidia was in trouble, wouldn't have competitive products, and would most likely leave the high-end video card business. I've even posted a link in a Dailytech article, although from an ironically named website.

    What I'm concerned about is why there aren't reports on the issues around NVidia's business/management. They have been undergoing restructuring for at least a year now, and workers have been laid off here and there. Does Anandtech have any info on this matter? I haven't checked lately, but the CA-based company has been struggling.

    Personally, I think some "body" has gone financial on the business and NVidia is suffering. Planning and strategies are either nonexistent, poor, or poorly communicated/executed. I think they're going to fail and go the way of Matrox. It's too bad, since they were doing well. I've been around too many companies like this not to know that upper management just got greedy and lazy. Again, that's just personal opinion from what I've read and know.

    Does anyone have any direct info regarding my concerns?
  • iwodo - Wednesday, October 14, 2009 - link

    Where did you hear that they've laid people off? As far as I'm concerned, they're actually hiring MORE engineers to work on their products.
  • frozentundra123456 - Wednesday, October 14, 2009 - link

    I have to say that I am somewhat disappointed with AMD's new lineup, even though I would prefer AMD to do well. They sure need it.

    If nVidia could switch to GDDR5 and keep a larger bus, it seems like they could really outperform AMD. Even the current generation of cards from nVidia is relatively competitive with AMD's new cards, except of course for DX11. The "problem" is that nVidia seems to be trying to do too much with the GPU instead of just focusing on good graphics performance.

    Anyway, let's hope that the new generation from nVidia is at least competitive enough to push down prices on AMD's new lineup, which seems overpriced for the performance, especially the 5770 and to a lesser extent the 5750.
  • MonkeyPaw - Wednesday, October 14, 2009 - link

    The problem is, it's all about making a profit. The GT200 was in theory a great product, but AMD essentially matched it with RV770, vastly eroding nVidia's profits on their entire top line. GT300 might be fantastic, but let's be realistic. The economy kinda sucks, and people are worried about becoming unemployed (if they aren't already). Being the best at all costs only works in prosperous times, but being good enough at the best price means much more in times like these. And considering that it takes 3 monitors to stress a 5870 on today's titles, how much better does the GT300 need to be?
