Final Words

Is NVIDIA in trouble? In the short term there are clearly causes for worry. AMD’s Eric Demers often tells me that the best way to lose a fight is by not showing up. NVIDIA effectively didn’t show up to the first DX11 battles, and that’s going to hurt. But as I said in the "things get better next year" section, they do get better next year.

Fermi devotes a significant portion of its die to features that are designed for a market that currently isn’t generating much revenue. That needs to change in order for this strategy to make sense.

NVIDIA told me that we should see exponential growth in Tesla revenues after Fermi, but what does that mean? I don’t expect the sort of customers buying Tesla boards and servers to be lining up on day 1. Best case scenario, I’d say Tesla revenues should see a bump one to two quarters after Fermi’s launch.

Nexus, ECC, and better double precision performance will all make Fermi more attractive in the HPC space than Cypress. The question is how much revenue that will generate in the short term.


Nexus enables full NVIDIA GPU debugging from within Visual Studio. Not so useful for PC gaming, but very helpful for Tesla

Then there’s the mobile space. NVIDIA could do very well with Tegra. NVIDIA is an ARM licensee, and that takes care of the missing CPU piece of the puzzle. Unlike the PC space, x86 isn’t the dominant player in the mobile market. NVIDIA has a head start in the ultra mobile space, much like it does in GPU computing, and Intel is a bit behind with its Atom strategy. NVIDIA could use this to its advantage.

The transition needs to be a smooth one. The bulk of NVIDIA’s revenues today come from PC graphics cards. There’s room for NVIDIA in the HPC and ultra mobile spaces, but that revenue isn’t going to accumulate overnight. The changes in focus we’re seeing from NVIDIA today are in line with what it’d have to do in order to establish successful businesses outside of the PC industry.

And don’t think the PC GPU battle is over yet either. It took years for NVIDIA to be pushed out of the chipset space, even after AMD bought ATI. Even if the future of PC graphics is Intel and AMD GPUs, it’s going to take a very long time to get there.

Comments

  • medi01 - Saturday, October 17, 2009 - link

    Since when are sales based only on product quality?
    How many buyers are actually aware of performance details?
    How many of those who are aware are NOT loyal to one brand?

    To me it rather seems that nVidia, despite not having an answer to the 5800 series for a few months, will still successfully sell overpriced cards. It's AMD that will continue to make huge losses, despite having good products and a "better value" pricing policy.

    Customers should be interested in healthy competition. "I buy the inferior product from a company that already dominates the market" will simply kill the underdog, and then it'll show us... :(
  • kashifme21 - Wednesday, October 21, 2009 - link

    By supporting consoles, they might have gotten sales in the short term. However, in the long term it has been a disaster for both of them.

    Console sales, even putting the Xbox 360 and PS3 together, do not exceed 60 million, which IMO is not much of an amount for these two companies.

    What has happened now is that the focus of developers has shifted to the consoles, which is why jumps in graphics have become stagnant. The result is that even PC users don't need constant upgrades the way they used to, which means fewer sales from the PC market for NVIDIA and ATI.

    Also, as a note: previously a PC user would need to upgrade about every 2 years to stay up to date with the latest games; now, since games are developed with consoles in mind, the cycle has become 7 years. So where a PC user previously needed to upgrade once every 2 years, it will only be once every 7 years now; there simply won't be games out to take advantage of the hardware.

    Also, the console market ATI and NVIDIA thought to cater to is in the same situation: a console user will simply buy one console and then make no hardware purchases for the rest of the generation unless the console fails.

    IMO going for console sales might have given them a sale in the short term, but in the long term it's been bad for all the PC hardware makers, be it CPU, GPU, RAM, chipsets, etc. As time goes on this will get worse, especially if another console generation is supported.
  • KhadgarTWN - Sunday, October 25, 2009 - link

    On the console part, I have a slightly different thought.
    For a very short period, consoles boost GPU sales; a bit longer out, consoles devour PC gaming and affect hardware sales, that's true.

    But a little longer out? If selling GPUs to consoles was a mistake, could they "fix" it?
    Think about it: if AMD (ATI)/nVidia had refused to develop the graphics part of the consoles and Larrabee failed, where would the next-gen console be?

    In the PS2/DC/Xbox years, that was no big deal: Sony had its EE + GS, and Nintendo had its own parts. Maybe the Xbox would never have been born, but no big deal, consoles would still be alive, strong and prosperous.
    Now, though: no AMD, no XB360; no nVidia, no PS3. And the Nintendo part? Doubtful they could make do with their own parts.

  • msroadkill612 - Wednesday, October 28, 2009 - link

    I'm trying to make amends for being a bit of a leech on geek sites and not contributing. A bit off topic, but I hope some of you will find the post below useful.

    There is a deluge of requests around the geek forums re what graphics card to buy, but never have I seen a requester specify whether they plan an always-on PC vs. "on only when in use".

    A$0.157/kWh - Sydney, Australia - Oct 09 (A$ = ~US$0.92 now)

    USA prices link:

    http://www.eia.doe.gov/cneaf/electricity/epm/table...
    (so Sydney prices are very similar to California - 15.29 US¢/kWh)

    so:

    in an always-on PC, take a graphics card which draws 10 watts less at IDLE than an alternative card.
    (My logic here: if you're having fun at FULL LOAD, then who cares what the load power draw/cost is; in most cases that's a small portion of the week and can logically be ignored - IF, and this is a big IF, your power save / sleep settings are set and working correctly.)


    0.157 $/kWh ÷ 1000 × 10 W = $0.00157 per hour (substitute your local cost - always increasing, though, which negates you bean counters' net present value objections)

    × 24 × 365 hours

    = A$13.75 p.a. for each extra 10 W of idle draw (0.00157 × 8,760 = 13.75, so the math checks out)

    If you use air conditioning all day all year, you can theoretically double this cost.

    If, however, you use electric bar radiators for heating all day all year, then I'm afraid, my dear Watson, the "elementary" laws of physics dictate that you must waste no further time on this argument. It does not concern you, except to say that you are in the enviable position of soon being able to buy a formerly high-end graphics card (am thinking dual-GPU NVIDIA here) for about the cost of a decent electric radiator, and getting quite decent framerates thrown in free with your heating bill.

    Using the specs below and the prices above, an HD 4870 (90 W idle) costs US$84.39 more to idle all year in California than a 5850/5870 (27 W idle) at current electricity prices. In about 18 months, the better card should repay the premium paid for it.
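
    For anyone who wants to plug in their own numbers, here is a minimal Python sketch of the arithmetic above; the tariffs and idle wattages are just the ones quoted in this post (assumptions, not vendor-verified figures):

        # Annual electricity cost of a graphics card's idle draw in an always-on PC.
        # Tariffs and wattages below are the ones quoted in this post (assumptions,
        # not official figures); substitute your own.

        HOURS_PER_YEAR = 24 * 365  # 8,760 hours for an always-on machine

        def annual_idle_cost(idle_watts: float, price_per_kwh: float) -> float:
            """Yearly cost of the given idle draw, in the tariff's currency."""
            return idle_watts / 1000.0 * HOURS_PER_YEAR * price_per_kwh

        # Sydney, Oct '09, A$0.157/kWh: each extra 10 W of idle draw
        print(f"A${annual_idle_cost(10, 0.157):.2f} per year")         # ~A$13.75

        # California, US$0.1529/kWh: HD 4870 (90 W) vs HD 5850/5870 (27 W) at idle
        print(f"US${annual_idle_cost(90 - 27, 0.1529):.2f} per year")  # ~US$84.38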

    Hope this helps some of you cost-justify your preferred card to those who must be obeyed.



    ATI HD 5870 & 5850 idle power: 27 W
    ATI Radeon HD 4870 idle: 90 W
    5770: 18 W
    5750: 16 W

    780G chipset mobo - AMD claims idle power consumption of the IGP is just 0.95 W!

    Some NVIDIA card idle power specs (GPU / idle watts):

    NVIDIA GeForce GTX 280: 30 W
    Gigabyte GeForce 9800 GX2 GV-NX98X1GHI-B: 85 W
    ZOTAC GeForce 9800 GTX AMP! Edition ZT-98XES2P-FCP: 50 W
    FOXCONN GeForce 9800 GTX Standard OC Edition 9800GTX-512N: 48 W
    ZOTAC GeForce 9800 GTX 512MB ZT-98XES2P-FSP: 53 W
    MSI NX8800GTX-T2D768E-HD OC GeForce 8800 GTX: 76 W
    ZOTAC GeForce 8800 GT 512MB AMP! Edition ZT-88TES3P-FCP: 33 W
    Palit GeForce 9600 GT 1GB Sonic NE/960TSX0202: 30 W
    FOXCONN GeForce 8800 GTS FV-N88SMBD2-OD: 59 W
  • Wolfpup - Friday, October 16, 2009 - link

    Okay, huh? I get it for the low-end stuff, but for the mid-range, does this mean we're going to be wasting a ton of transistors on worthless integrated video that won't be used, taking up space that could have been more cache or another core or two?!

    I have NO use for integrated graphics. Never have, never will.
  • Seramics - Friday, October 16, 2009 - link

    Nvidia is useless and going down. I wouldn't mind at all if they got out of the market.
  • Sandersann - Friday, October 16, 2009 - link

    Even if money is not an issue for you, you want NVIDIA and ATI to do well, or at least stay competitive, because the competition encourages innovation. Without it, you will have to pay more for fewer features and less speed. We might get a taste of that this Christmas and into Q1 2010.
  • medi01 - Friday, October 16, 2009 - link

    http://store.steampowered.com/hwsurvey/
    AMD - 27%
    nVidia - 65% (ouch)

  • shin0bi272 - Friday, October 16, 2009 - link

    Hey, can you post the most common models too? The last time I looked at that survey, the 7800 GTX was the fastest card and most people on the survey were using like 5200s or something outrageous like that.
  • tamalero - Sunday, October 18, 2009 - link

    http://store.steampowered.com/hwsurvey/videocard/

    Most people are on the 8800 series; it doesn't reflect the "current" gen.
    In the last gen, the 48xx series is increasing to 11% while the 260 series is around 4%.
