
There’s a lot to talk about with regard to NVIDIA and no time for a long intro, so let’s get right to it.

At the end of our Radeon HD 5850 Review we included this update:

Update: We went window shopping again this afternoon to see if there were any GTX 285 price changes. There weren't. In fact GTX 285 supply seems pretty low; MWave, ZipZoomFly, and Newegg only have a few models in stock. We asked NVIDIA about this, but all they had to say was "demand remains strong". Given the timing, we're still suspicious that something may be afoot.

Less than a week later, there were stories everywhere about NVIDIA’s GT200b shortages. Fudo said that NVIDIA was unwilling to drop prices low enough to make the cards competitive. Charlie said that NVIDIA was going to abandon the high-end and upper mid-range graphics card markets completely.

Let’s look at what we do know. GT200b has around 1.4 billion transistors and is made at TSMC on a 55nm process. Wikipedia lists the die at 470mm^2; that’s roughly 80% the size of the original 65nm GT200 die. In either case it’s a lot bigger and still more expensive than Cypress’ 334mm^2 40nm die.


Cypress vs. GT200b die sizes to scale

NVIDIA could get into a price war with AMD, but given that both companies make their chips at the same place, and NVIDIA’s costs are higher, it’s not a war that makes sense to fight.
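To put the cost gap in rough numbers, here’s a minimal sketch using the textbook dies-per-wafer approximation. The 300mm wafer size, the formula, and the assumption of equal yield and equal wafer pricing are mine for illustration; 40nm wafers actually cost more than 55nm wafers and yields differ, so treat the ratio as indicative only.

```python
import math

# First-order dies-per-wafer approximation: usable wafer area divided by die
# area, minus an edge-loss term. Ignores yield and defect density entirely.
def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> float:
    r = wafer_diameter_mm / 2.0
    return (math.pi * r ** 2 / die_area_mm2
            - math.pi * wafer_diameter_mm / math.sqrt(2.0 * die_area_mm2))

gt200b = dies_per_wafer(470.0)   # GT200b, 55nm, ~470 mm^2
cypress = dies_per_wafer(334.0)  # Cypress, 40nm, ~334 mm^2

print(f"GT200b candidates per 300mm wafer:  {gt200b:.0f}")   # ~120
print(f"Cypress candidates per 300mm wafer: {cypress:.0f}")  # ~175
print(f"Cypress advantage: {cypress / gt200b:.2f}x")         # ~1.46x
```

Even before yield enters the picture, AMD gets roughly half again as many Cypress candidates out of each wafer as NVIDIA gets GT200b candidates, which is why a price war is a fight NVIDIA doesn’t want.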

NVIDIA told me two things. One, that they have shared with some OEMs that they will no longer be making GT200b based products; that covers everything from the GTX 260 all the way up to the GTX 285. The EOL (end of life) notices went out recently, and OEMs have been asked to submit their allocation requests ASAP; otherwise they risk not getting any cards.

The second was that despite the EOL notices, end users should be able to purchase GeForce GTX 260, 275 and 285 cards all the way up through February of next year.

If you look carefully, neither of these statements directly supports or refutes the two articles above. NVIDIA is very clever.

NVIDIA’s explanation to me was that current GPU supplies were decided on months ago, and in light of the economy, the number of chips NVIDIA ordered from TSMC was low. Demand ended up being stronger than expected and thus you can expect supplies to be tight in the remaining months of the year and into 2010.

Board vendors have been telling us that they can’t get allocations from NVIDIA. Some are even wondering whether it makes sense to build more GTX cards for the end of this year.

If you want my opinion, it goes something like this. While RV770 caught NVIDIA off guard, Cypress did not. AMD used the extra area (and then some) allowed by the move to 40nm to double RV770, not an unpredictable move. NVIDIA knew they were going to be late with Fermi, knew how competitive Cypress would be, and made a conscious decision to cut back supply months ago rather than enter a price war with AMD.

While NVIDIA won’t publicly admit defeat, AMD clearly won this round. Obviously it makes sense to ramp down the old product in expectation of Fermi, but I don’t see Fermi with any real availability this year. We may see a launch with performance data in 2009, but I’d expect availability in 2010.


While NVIDIA just launched its first 40nm DX10.1 parts, AMD just launched $120 DX11 cards

Regardless of how you want to phrase it, there will be lower than normal supplies of GT200 cards in the market this quarter. With higher costs than AMD per card and better performance from AMD’s DX11 parts, would you expect things to be any different?

Things Get Better Next Year

NVIDIA launched GT200 on too old a process (65nm) and was consequently late moving to 55nm. Bumpgate happened. Then we had the issues with 40nm at TSMC and Fermi’s delays. In short, it hasn’t been the best 12 months for NVIDIA. Next year, though, there’s reason to be optimistic.

When Fermi does launch, everything from that point should theoretically be smooth sailing. There aren’t any process transitions in 2010; it’s all about execution at that point and how quickly NVIDIA can get Fermi derivatives out the door. AMD will have virtually its entire product stack out by the time NVIDIA ships Fermi in quantity, but NVIDIA should have competitive products out in 2010. AMD wins the first half of the DX11 race; the second half will be a bit more challenging.

If anything, NVIDIA has proved to be a resilient company. Other than Intel, I don’t know of any company that could’ve recovered from NV30. The real question is how strong Fermi 2 will be. Stumble twice and you’re shaken; do it a third time and you’re likely to fall.

Chipsets: One Day You're In and the Next, You're Out
Comments

  • medi01 - Saturday, October 17, 2009 - link

    Since when are sales based only on product quality?
    How many buyers are actually aware of the performance details?
    How many of those that are aware are NOT loyal to one brand?

    To me it rather seems that nVidia, despite not having an answer to the 5800 series for a few months, will still successfully sell overpriced cards. It's AMD that will continue to make huge losses, despite having good products and a "better value" pricing policy.

    Customers should be interested in healthy competition. "I'll buy the inferior product from the company that already dominates the market" will simply kill the underdog, and then we'll all pay for it... :(
  • kashifme21 - Wednesday, October 21, 2009 - link

    By supporting consoles, they might have gotten sales in the short term. In the long term, however, it has been a disaster for both of them.

    Console sales, even putting the Xbox 360 and PS3 together, don't exceed 60 million, which imo is not much of an amount for these two companies.

    What has happened now is that the focus of developers has shifted to the consoles, which is why jumps in graphics have become stagnant. The result is that even PC users don't need constant upgrades the way they used to, which means fewer sales from the PC market for NVIDIA and ATI.

    Also, as a note: previously a PC user would need to upgrade about every 2 years to stay up to date with the latest games; now that games are developed with consoles in mind, the cycle has become 7 years. So where a PC user used to upgrade once every 2 years, it will be once every 7 years now; there simply won't be games out to take advantage of the hardware.

    The console market ATI and NVIDIA chose to cater to is in the same situation: a console user will simply buy one console and then make no hardware purchase for the rest of the generation unless the console fails.

    imo going for console sales might have given them a sale in the short term, but in the long term it's been bad for all the PC hardware makers, be it CPU, GPU, RAM, chipsets, etc. As time goes on this will get worse, especially if another console generation is supported.
  • KhadgarTWN - Sunday, October 25, 2009 - link

    On the console part, I have a slightly different thought.
    For a very short period, consoles boost GPU sales; a bit longer out, consoles devour PC gaming and hurt hardware sales, that's true.

    But a little longer still? If selling GPUs to consoles was a mistake, could they "fix" it?
    Think about it: if AMD(ATi)/nVidia had refused to develop the graphics parts of the consoles and Larrabee failed, where would the next-gen console be?

    In the PS2/DC/Xbox years, that was no big deal: Sony had its EE + GS, and Nintendo had its own parts. Maybe the Xbox would never have been born, but no big deal, consoles would still be alive and prosperous.
    Now: no AMD, no XB360; no nVidia, no PS3. And the Nintendo part? Doubtful they could build their own.

  • msroadkill612 - Wednesday, October 28, 2009 - link

    Am trying to make amends for being a bit of a leech on geek sites and not contributing. A bit off topic, but I hope some of you will find the post below useful.

    There is a deluge of requests around the geek forums about which graphics card to buy, but never have I seen a requester specify whether they plan an always-on PC vs. one that's on only when in use.

    A$0.157/kWh - Sydney, Australia - Oct 09 (A$ = ~0.92 US$ now)

    USA prices link:

    http://www.eia.doe.gov/cneaf/electricity/epm/table...
    (so Sydney prices are very similar to California's - 15.29 US¢/kWh)

    so:

    in an always-on PC, a graphics card which draws 10 watts less at idle than an alternative card saves you:
    (My logic here is: if you're having fun at FULL LOAD, then who cares what the load power draw/cost is; in most cases that's a small portion of the week and can logically be ignored - IF, and this is a big IF, your power save / sleep settings are set and working correctly)

    $0.157 / 1000 × 10 W = $0.00157 per hour (substitute your local cost; always increasing, though, which negates you bean counters' net-present-value objections)

    × 24 × 365 hours

    = A$13.75 p.a. for each extra 10 W of idle draw (I hope the math is right)

    If you use air conditioning all day all year, you can theoretically double this cost.

    If, however, you use electric bar radiators for heating all day all year, then I'm afraid, my dear Watson, the "elementary" laws of physics dictate that you must waste no further time on this argument. It does not concern you, except to say that you are in the enviable position of soon being able to buy a formerly high-end graphics card (am thinking dual-GPU NVIDIA here) for about the cost of a decent electric radiator, and getting quite decent framerates thrown in free with your heating bill.

    Using the specs below and the prices above, an HD 4870 (90 W) costs US$84.39 more to idle all year in California than a 5850/5870 (27 W). At current prices, the better card should repay its premium in about 18 months.

    Hope this helps some of you cost justify your preferred card to those who must be obeyed.



    ATI HD 5870 & 5850 idle power: 27 W
    ATI Radeon HD 4870 idle: 90 W
    ATI HD 5770: 18 W
    ATI HD 5750: 16 W

    780G chipset mobo - AMD claims idle power consumption of the IGP is just 0.95 W!

    Some NVIDIA card idle power specs:

    NVIDIA GeForce GTX 280: 30 W
    Gigabyte GeForce 9800 GX2 GV-NX98X1GHI-B: 85 W
    ZOTAC GeForce 9800 GTX AMP! Edition ZT-98XES2P-FCP: 50 W
    FOXCONN GeForce 9800 GTX Standard OC Edition 9800GTX-512N: 48 W
    ZOTAC GeForce 9800 GTX 512MB ZT-98XES2P-FSP: 53 W
    MSI NX8800GTX-T2D768E-HD OC GeForce 8800 GTX: 76 W
    ZOTAC GeForce 8800 GT 512MB AMP! Edition ZT-88TES3P-FCP: 33 W
    Palit GeForce 9600 GT 1GB Sonic NE/960TSX0202: 30 W
    FOXCONN GeForce 8800 GTS FV-N88SMBD2-OD: 59 W
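    For anyone who wants to redo this with their own tariff and cards, here is a minimal Python sketch of the same arithmetic. The default tariff (15.29 US¢/kWh), the idle wattages, and the ~$125 premium (back-calculated from the 18-month payback above) are just the example figures from this comment, not measured data; substitute your own.

```python
# Idle-power running cost, mirroring the hand calculation above.
def annual_idle_cost(idle_watts: float, usd_per_kwh: float = 0.1529) -> float:
    """Dollars per year to idle at idle_watts for 24x365 hours."""
    return idle_watts / 1000.0 * 24 * 365 * usd_per_kwh

def payback_months(premium_usd: float, old_idle_w: float, new_idle_w: float,
                   usd_per_kwh: float = 0.1529) -> float:
    """Months until the idle-power savings repay a card's price premium."""
    yearly_savings = (annual_idle_cost(old_idle_w, usd_per_kwh)
                      - annual_idle_cost(new_idle_w, usd_per_kwh))
    return premium_usd / yearly_savings * 12

print(f"Per extra 10 W of idle draw: ${annual_idle_cost(10):.2f}/yr")        # ~$13.39
print(f"HD 4870 (90 W) vs HD 5850 (27 W): "
      f"${annual_idle_cost(90) - annual_idle_cost(27):.2f}/yr")              # ~$84.38
print(f"Payback on a $125 premium: {payback_months(125, 90, 27):.0f} months")  # ~18
```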
  • Wolfpup - Friday, October 16, 2009 - link

    Okay, huh? I get it for the low-end stuff, but for mid-range, does this mean we're going to be wasting a ton of transistors on worthless integrated video that won't be used, taking up space that could have been more cache or another core or two?!?

    I have NO use for integrated graphics. Never have, never will.
  • Seramics - Friday, October 16, 2009 - link

    Nvidia is useless and going down. I don't mind if they get out of the market at all.
  • Sandersann - Friday, October 16, 2009 - link

    Even if money is not an issue for you, you want Nvidia and ATI to do well, or at least stay competitive, because the competition encourages innovation. Without it, you will have to pay more for fewer features and less speed. We might get a taste of that this Christmas and into Q1 2010.
  • medi01 - Friday, October 16, 2009 - link

    http://store.steampowered.com/hwsurvey/
    AMD - 27%
    nVidia - 65% (ouch)

  • shin0bi272 - Friday, October 16, 2009 - link

    hey, can you post the most common models too? The last time I looked at that survey, the 7800 GTX was the fastest card and most people on the survey were using like 5200s or something outrageous like that.
  • tamalero - Sunday, October 18, 2009 - link

    http://store.steampowered.com/hwsurvey/videocard/

    Most people are on the 8800 series; it doesn't reflect "current" gen.
    In the last gen, the 48xx series has increased to around 11% while the 260 series is around 4%.
