Why NVIDIA Did It

To understand the motivation behind NVIDIA's naming and renaming and renaming, we have to once again look at its approach to GPU design. NVIDIA continues to architect very high end GPUs and let that technology trickle down to midrange and lower end market segments over the course of 9 to 12 months. AMD instead stepped in and launched a very competitive performance-mainstream part rather than a high end GPU, allowing its technology to cascade down to lower price points and market segments faster than NVIDIA could this generation.

Let's attach some code names, shall we?

NVIDIA's flagship, the GT200 GPU used in the GTX 295, 285, 280 and 260, isn't available in a cheaper version yet. AMD's flagship, the RV770, is already more affordable and is available in cheaper versions. NVIDIA has to rely on its last generation GPU, the G92b, to compete in the rest of the market while the lower end GT200 derivatives get ready for production. Rather than continue to ship products with old names to vendors and customers, NVIDIA slaps a new name on an old GPU and hopes to at least provide the appearance of being just as agile and competitive as AMD despite being clearly caught off guard this generation.

Of course, NVIDIA has a case to make. This is their current generation of hardware, and it is practical and useful to maintain a consistent nomenclature so that the general public knows what the product positioning actually is. We agree, but our solution would be top-to-bottom launches in line with new GPU architectures, rather than simply changing the names of old parts so that they look shiny and new.

NVIDIA's take on this is also flawed in that it treats customers like idiots, and it underlines the fundamental issue we have. Do I need a card with a new name on it to believe that it is worthy of my purchase, or can I go read reviews comparing the hardware and learn for myself whether or not any card (regardless of the name) fills my need? Maybe this name change is for people who don't know anything about graphics hardware, then. In that case, the thing that "sells" the card is the simple fact that NVIDIA has convinced someone that this part is an affordable version of a card from their latest line of products. Saying a name change is needed to keep the naming consistent is essentially admitting that the only reason to change the name is to mislead uninformed buyers.

NVIDIA would love to have 40nm GT200 derivatives out today. Until that day comes, we'll get cards that sound like GT200 based products.

Anyway, we haven't previously tested a 1GB 9800 GTX+, and until this announcement its prices haven't been anywhere near reasonable (currently they're up at $200, so the $50 price drop will make a big difference). There is also a slight tweak between the GTS 250 1GB and the 9800 GTX+ 1GB: the memory on the 1GB 9800 GTX+ was underclocked by about 9.1%, and the GTS 250 1GB brings clock speed back in line with the 512MB 9800 GTX+. So while the 512MB part doesn't perform any differently, moving up to 1GB should no longer degrade performance in games that are memory bandwidth sensitive but don't benefit from the larger frame buffer.
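To put that 9.1% in perspective, here's a back-of-the-envelope peak bandwidth calculation. The 256-bit bus and the 2200 MT/s vs. 2000 MT/s effective memory rates are our assumptions based on commonly listed specs for these cards, not figures from NVIDIA's briefing:

```python
def bandwidth_gbs(bus_width_bits, effective_mts):
    """Peak memory bandwidth in GB/s: bytes per transfer times transfers per second."""
    return bus_width_bits / 8 * effective_mts * 1e6 / 1e9

full = bandwidth_gbs(256, 2200)  # 512MB 9800 GTX+ / GTS 250 1GB (assumed rate)
cut = bandwidth_gbs(256, 2000)   # underclocked 1GB 9800 GTX+ (assumed rate)

print(f"{full:.1f} GB/s vs {cut:.1f} GB/s "
      f"({(full - cut) / full:.1%} deficit)")
```

Under those assumed clocks, the old 1GB card gives up roughly 6 GB/s of peak bandwidth (about 9.1%), which is exactly the kind of deficit that shows up in bandwidth-bound games.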

Oh, also wide availability won't be until March 10th. Seriously.

Also, not explained until now is how the new naming scheme will work going forward. GTX, GTS, GT and G (as far as we can gather) will indicate performance segment, while the number will be the model number; within a performance segment, higher is better. Essentially, NVIDIA has swapped the meaning of letters and numbers in its naming. NVIDIA has also clearly told us that names will no longer be tied to GPU architecture, though vendors may still indicate the architecture on the box if they so choose. If nothing else, the feature list and specifications will be a guide. Here's to requiring that people read the fine print to know what they're buying.
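As a rough illustration of how the new scheme sorts products (the ranking of letter prefixes is our reading of NVIDIA's explanation, not an official list, and the card names are just examples):

```python
# Prefix now indicates performance segment; the number is just a model
# number that ranks cards within a segment (higher is better).
SEGMENT_RANK = {"GTX": 3, "GTS": 2, "GT": 1, "G": 0}

def rank(name):
    prefix, model = name.split()
    return (SEGMENT_RANK[prefix], int(model))

# Sort a mixed bag of names from fastest segment down
cards = ["GT 130", "GTX 285", "G 100", "GTS 250"]
print(sorted(cards, key=rank, reverse=True))
```

Note that the model number alone tells you nothing across segments: a hypothetical GT 300 would still sit below a GTS 250 under this scheme, which is why the fine print matters.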

For What it's Worth

Early last week Charlie over at The Inquirer posted a story saying that a number of reviewers were cut out of the GeForce GTS 250 launch. We felt a bit hurt; by the time the story posted, we hadn't even been asked to be briefed about the GTS 250. Cards had already gone out to other reviewers, but we weren't on any lists. Oh, pout.

Magically, a couple of days after Charlie's article, we got invited to an NVIDIA briefing and we had a GTS 250 to test. Perhaps NVIDIA was simply uncharacteristically late in briefing us about a new GPU launch. Perhaps NVIDIA was afraid we'd point out that it was nothing more than a 9800 GTX+ that ran a little cooler. Or perhaps we haven't been positive enough about CUDA and PhysX and NVIDIA was trying to punish us.

Who knows what went on at NVIDIA prior to the launch; we're here to review the card. But for what it's worth - thank you Charlie :)

103 Comments

  • SiliconDoc - Saturday, March 21, 2009 - link

    Thanks for going completely nutso (already knew you were anyway), and not having any real counterpoint to EVERYTHING I've said.
    Face the truth, and stop spamming.
    A two year old with a red diaper rash bottom can drool and scream.
    Epic fail.
  • Baov - Thursday, March 5, 2009 - link

Does this thing do HybridPower like the 9800 GTX+? Will it completely power down?
  • san1s - Wednesday, March 4, 2009 - link

    "365 wwwwwelll no but how old is the g92 regardless of die size.. g80?
    lol?"
    if you think all the changes that went from G80 to G92b were insignificant, then I guess you'll think that the difference between an Intel X6800 and an E0 stepping E8400 is meaningless too. I mean, they are both around 3GHz right? and they both say Core 2, so that means they're the same. /sarcasm off. I'm not going to continue with this any further; if you don't get it, then you never will. The GPU in the 9800 GTX+ was released last summer, over half a year ago, but not quite a year.

    "at all resoutions?"
    at all the resolutions that an educated person purchasing a midrange video card plays at. Midrange card = midrange monitor. You don't mix high end with low end or midrange components, as that will result in bottlenecking. Anyway, the difference between 8 and 12 FPS @ 2560x1600 is meaningless as neither is playable anyway.

    "i wouldnt say $50 would stop me from getting a 260 it is at least a newer arch. or ahem a 1gb 4870.
    what if they do have a 9800/250... well if they look at the power #'s for sli in this article they'd definately reconsider"
    not everyone has the luxury to overshoot their budget on a single component by $50 and call it insignificant.

    "most people don't care enough to engage in this activity"
    lol. How would they ever get their custom built PCs to work without knowing a bit of background info? Give a normal person a bunch of components and let's see how far they get without knowing anything about PCs. If you don't know your hardware you shouldn't be building computers anyway. I personally wouldn't go out and buy tires by myself if I were to change them myself without researching. I don't have a clue about tire sizes, and I sure as hell won't buy new tires without researching just because I don't care for that activity.

    "and what about option #3 buy ati?"
    That's not what I was talking about. Consumers should support all sides of the competition to drive prices down, not just ATI or just NVIDIA. What I meant was people blaming NVIDIA for their own mistakes. There is a gap in the current line of NVIDIA GPUs, and to fill it, what would be the best way while maintaining performance relative to the price and naming bracket?
  • SiliconDoc - Wednesday, March 18, 2009 - link

    Good response, you aren't a fanboy, but the idiots can't tell. You put the slap on the little fanboys COMPLAINT.
    This is an article about the GTS250, and the whining little fanboy red wailers come on and whine and cry about it.
    To respond to their FUD and straighten out their kookball distorted lies IS NOT BEING A FANBOY.
    You did a good job trying to straighten out the poor ragers noggin.
    As for the other whiners agreeing "fan boys go away" - if they DON'T LIKE the comments, don't read 'em. They both added ZERO to the discussion, other than being the easy, lame, smart aleck smarmers that pretended to be above the fray, but dove into the gutter whining not about the review, but about fellow enthusiasts commenting on it - and I'm GLAD to join them.
    I hope "they go away" - and YOU, keep slapping the whiners about nvidia right where they need it - upside the yapping text lies and stupidity.
    Thank you for doing it, actually, I appreciate it.
  • SiliconDoc - Wednesday, March 18, 2009 - link

    PS - as for the red fanboy that did the review, I guess he thought he was doing a "dual gpu review".
    I suppose "the point" of having all the massive dual gpu scores above the GTS250 - was to show "how lousy it is" - and to COVER UP the miserable failure of the 4850 against it.
    Keep the GTS250 at OR NEAR 'the bottom of every benchmark"...
    ( Well, now there's another hint as to why Derek gets DISSED when it comes to getting NVidia cards from Nvidia - his bias is the same and WORSE than the red fan boy commenters - AND NVIDIA KNOWS IT AS WELL AS I DO.)
    Thanks for the "dual gpu's review".
  • Totally - Thursday, March 5, 2009 - link

    Dear fanboys,

    Go away.

    Love,

    Totally
  • Hxx - Thursday, March 5, 2009 - link

    lol best post gj Totally

    Seriously, 3 main steps to buy the right card:

    1. look at benchmarks
    2. buy the cheapest card with playable fps in the games u play
    3. don't think it's future proof - none of them are.
  • Mikey - Wednesday, March 4, 2009 - link

    Is this even worth the money? In terms of value, would the 4870 be the one to get?

  • Nfarce - Wednesday, March 4, 2009 - link

    Mikey, the 4870 is the way to go in just about all scenarios. Search AnandTech's report from last fall on the 4870 1GB under the Video/Graphics section. The GeForce 260/216 still costs more and performs worse. Normally I'm an Nvidia fanboy, but in this segment where I'm purchasing, it's ATI/AMD hands down, no questions asked.
