Why NVIDIA Did It

To understand the motivation behind NVIDIA's naming and renaming and renaming, we have to once again look at its approach to GPU design. NVIDIA continued to architect very high end GPUs and let that technology trickle down to mid range and lower end market segments over the course of 9 to 12 months. AMD, rather than building a high end GPU, stepped in with a very competitive performance mainstream part, allowing its technology to cascade down to lower price points and market segments faster than NVIDIA could manage this generation.

Let's attach some code names, shall we?

NVIDIA's flagship, the GT200 GPU used in the GTX 295, 285, 280 and 260, isn't available in a cheaper version yet. AMD's flagship, the RV770, is already more affordable and is available in cheaper versions. NVIDIA has to rely on its last generation GPU, the G92b, to compete in the rest of the market while the lower end GT200 derivatives get ready for production. Rather than continue to ship products with old names to vendors and customers, NVIDIA slaps a new name on an old GPU and hopes to at least provide the appearance of being just as agile and competitive as AMD despite being clearly caught off guard this generation.

Of course, NVIDIA has a case to make. This is their current generation of hardware, and it is practical and useful to maintain a consistent nomenclature so that the general public knows what the product positioning actually is. We agree; our solution, though, is top to bottom launches in line with new GPU architectures rather than simply changing the names of old parts so that they look shiny and new.

NVIDIA's take on this is also flawed in that it treats customers like idiots, and it underlines the fundamental issue we have. Do I need a card with a new name on it to believe that it is worthy of my purchase, or can I go read reviews comparing the hardware and learn for myself whether or not any card (regardless of the name) fills my need? Maybe, then, this name change is for people who don't know anything about graphics hardware. In that case, the thing that "sells" the card is the simple fact that NVIDIA has convinced someone that this part is an affordable version of a card from their latest line of products. Saying a name change is needed to keep the naming consistent is essentially admitting that the only reason to change the name is to mislead uninformed buyers.

NVIDIA would love to have 40nm GT200 derivatives out today. Until that day comes, we'll get cards that sound like GT200 based products.

Anyway, we haven't previously tested a 1GB 9800 GTX+, and until this announcement its prices weren't anywhere near reasonable (they're currently up at $200, so the $50 price drop will make a big difference). There is also a slight tweak between the GTS 250 1GB and the 9800 GTX+ 1GB: the memory on the 1GB 9800 GTX+ was underclocked by about 9.1%, and the GTS 250 1GB brings the memory clock back in line with the 512MB 9800 GTX+. So while the 512MB part doesn't perform any differently, moving up to 1GB should no longer degrade performance in games that are memory bandwidth sensitive but don't benefit from the larger frame buffer.
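
To put that 9.1% underclock in perspective, here's some rough back-of-the-envelope math. The bus width and clocks below are assumptions based on the 9800 GTX+'s commonly listed specs (256-bit bus, 2200MT/s effective GDDR3), not figures from this review:

    # Rough memory bandwidth math for the 9.1% underclock.
    # Assumed specs (not from this review): 256-bit bus, 2200MT/s effective GDDR3.
    bus_bytes = 256 // 8                      # bytes moved per transfer
    full_mts = 2200                           # 512MB 9800 GTX+ / GTS 250 1GB
    derated_mts = full_mts * (1 - 0.091)      # ~2000MT/s on the 1GB 9800 GTX+

    full_gbs = full_mts * bus_bytes / 1000        # ~70.4 GB/s
    derated_gbs = derated_mts * bus_bytes / 1000  # ~64.0 GB/s
    print(f"{full_gbs:.1f} GB/s vs {derated_gbs:.1f} GB/s")

If those assumptions hold, the 1GB 9800 GTX+ gave up roughly 6GB/s of memory bandwidth, which is exactly the kind of deficit that shows up in bandwidth-sensitive games.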

Oh, and wide availability won't be until March 10th. Seriously.

Also not explained until now is how the new naming scheme will work going forward. GTX, GTS, GT and G (as far as we can gather) will now indicate the performance segment. The number is the model number, and within a performance segment, higher is better. Essentially, NVIDIA has swapped the meaning of the letters and numbers in its naming. NVIDIA has also told us clearly that names will no longer be tied to GPU architecture, but that vendors may somehow still indicate the architecture on the box if they so choose. If nothing else, the feature list and specifications will be a guide. Here's to requiring that people read the fine print to know what they're buying. A minimal sketch of how the scheme decodes follows below.
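
Since the scheme is easiest to see in action, here's a minimal sketch of how a name would decode under our reading of it. The segment ordering and the decode function are our own illustration, not anything NVIDIA has published:

    # Decoding the new naming scheme as we understand it; the segment
    # ordering reflects our reading of NVIDIA's guidance, not an official spec.
    SEGMENTS = ["GTX", "GTS", "GT", "G"]  # fastest segment first

    def decode(name: str) -> dict:
        prefix, number = name.split()
        return {
            "segment": prefix,
            "segment_rank": SEGMENTS.index(prefix),  # 0 is the top segment
            "model": int(number),  # higher is better within a segment
        }

    print(decode("GTS 250"))  # {'segment': 'GTS', 'segment_rank': 1, 'model': 250}

Note what the name no longer encodes: the underlying GPU architecture. That's the fine print.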

For What It's Worth

Early last week, Charlie over at The Inquirer posted a story saying that a number of reviewers were cut out of the GeForce GTS 250 launch. We felt a bit hurt; by the time the story went up, we hadn't even been asked to attend a briefing on the GTS 250. Cards had already gone out to other reviewers, but we weren't on any lists. Oh, pout.

Magically, a couple of days after Charlie's article, we got invited to an NVIDIA briefing and we had a GTS 250 to test. Perhaps NVIDIA was simply uncharacteristically late in briefing us about a new GPU launch. Perhaps NVIDIA was afraid we'd point out that it was nothing more than a 9800 GTX+ that ran a little cooler. Or perhaps we haven't been positive enough about CUDA and PhysX, and NVIDIA was trying to punish us.

Who knows what went on at NVIDIA prior to the launch; we're here to review the card. But for what it's worth: thank you, Charlie :)

Comments

  • Hrel - Thursday, April 9, 2009

    You should specify when you're being sarcastic and when you're being serious. Also, all that red rooster, red camp, green goblin crap simply doesn't make any sense and makes you sound like a tin-foil-hat-wearing crazy person. Just sayin', dude, lighten up. Do you work for Nvidia? Or do you just really hate AMD?

    Yes, they're both good cores, and yes, it'd be great if Nvidia used GDDR5, but they don't, so they don't get the performance boost from it; that's their fault too. And they DID make the GT200 core too big and expensive to produce; that's why the GTX260 is now being sold at a loss, just to maintain market share.
  • Hrel - Wednesday, March 4, 2009

    Oh, also... I almost forgot: you still didn't include 3D Mark scores:( PLEASE start including 3D Mark scores in your reviews.

    Also, I care WAY more about how these cards perform at 1440x900 and 1280x800 than I do about 2560x1600; I will NEVER have a monitor with a resolution that high. No point.

    It's just that I'm more interested in seeing what happens when a card that's on the border of playable at max settings gets the resolution turned down some than in what happens when the resolution gets turned up beyond what my monitor can even display.

    It's pretty simple really: more on-board RAM means the card won't insta-tank at resolutions above 1680x1050, but the percentage differences between the cards should stay the same. Comparing a bunch of 512MB and 1GB cards at resolutions of 1680x1050 and lower, that extra RAM doesn't really matter, so all we're seeing is how powerful the cards are. It seems like a truer representation of the cards' performance to me.
  • Hrel - Wednesday, March 4, 2009

    I really do mean to stop adding to this; just wanted to clarify.

    When I say that the extra RAM doesn't matter, I mean that the extra RAM isn't necessary just to run the game at your chosen resolution. Of course some newer games will take advantage of that extra RAM even at resolutions as low as 1280x800. I'd just rather see how a card performs in the game based on its capabilities than see one card perform better than another simply because the "other" card doesn't have enough on-board RAM, which has NOTHING to do with how much rendering power the card has and has only to do with on-board RAM.

    I think it would be good to add a fourth resolution, 1280x800, just to show what happens when the cards aren't being limited by their on-board RAM and are allowed to run the game to the best of their abilities, without artificial limitations. There, pretty sure I'm done. Please respond to at least some of this; it took me kind of a long time, relative to how long I normally spend writing comments.
  • SiliconDoc - Wednesday, March 18, 2009

    Hmmm... you'd think you could bring yourself to apply that to the 4850 and the 4870, which have absolutely IDENTICAL CORES and only have a RAM DIFFERENCE.
    Yeah, one would think.
    I suppose the red fan feverish screeding "blocked" your immensely powerful mind from thinking of that.
  • Hrel - Saturday, March 21, 2009

    What are you talking about?
  • Hrel - Wednesday, March 4, 2009

    I'm excited about this; I was kind of wondering what Nvidia was going to do, considering GT200 costs too much to make and isn't significantly faster than the last generation, and I knew there couldn't be a whole new architecture yet; even Nvidia doesn't have that much money.
    However, I'm excited because this is a 9800 GTX+, still a very good performing part, made slightly smaller, more energy efficient and cooler running; not to mention offered at a lower price point! Yay, consumers win! (Why did Charlie at the Inquirer say it was MORE expensive while AnandTech lists lower prices?) I really hope the 512MB version is shorter and only needs one PCI-E connector/lower power consumption; if not, that almost seems like intentional resistance to progress. However, the extra RAM will be great now that the clocks are set right; and at $150, or less if rebates and bundles start being offered, that's a great deal.

    On the whole, Nvidia trying to essentially screw the reviewers... I guess I don't have much to say; I'm disappointed. But Nvidia has shown this type of behavior before; it's a shame, but it will only change with new company leadership.

    Anyway, from what I've read so far, it looks like the consumer is winning: prices are dropping, performance is increasing (before at an amazingly rapid rate, now at a crawl, but still increasing), power consumption is going down, and manufacturing processes are maturing... consumers win!
  • san1s - Wednesday, March 4, 2009

    365? Are you sure about that?
    "even when the 9800 was new... iirc the 4850 was already making it look bad"
    Google "radeon 4850 vs 9800 GTX+" and see the benchmarks... IMO the 9800 was making the brand new 4850 look bad
    "i'd doubt that anyone buying a 9800 today is planning to sli it later"
    what if they already have a 9800? much cheaper to get another one for sli than a new gtx 260
    "hahaha, less power useage relative to"
    read the article
    "name some mainstream cuda and physx uses"
    ever heard of badaboom? folding@home? mirror's edge?
    the gts 250 competes with the 4850, not 4870
    "continually confusing their most loyal customers "
    what's so confusing about reading a review and looking at the price?

    The GTS 250 makes perfect sense to me. Rather than spending $ on R&D for a downgraded GT200 (which would perform more or less the same), why not use an existing GPU whose performance falls between the designated 240 and 260?
    It's a no-win situation: option #1 means wasting money on something that won't perform better than an existing product that can probably be made more cheaply (the G92b is much smaller), and option #2 causes complaints from enthusiasts who are too lazy to read reviews.
    Which option looks better?
  • kx5500 - Thursday, March 5, 2009

    Shut the *beep* up f aggot, before you get your face bashed in and cut
    to ribbons, and your throat slit.
  • SiliconDoc - Wednesday, March 18, 2009

    I saw that more than once in Combat Arms. Have you been playing too long on your computer?
  • rbfowler9lfc - Tuesday, March 3, 2009

    Well, whatever it is, be it a rebadged this or that, it seems like it runs on par with the GTX260 in most of the tests. So if it's significantly cheaper than the GTX260, I'll take it, thanks.
