Why NVIDIA Did It

To understand the motivation behind NVIDIA's naming and renaming and renaming, we have to once again look at its approach to GPU design. NVIDIA continued to architect very high end GPUs and allow that technology to, over the course of 9 to 12 months, trickle down to mid range and lower end market segments. AMD stepped in and launched a very competitive performance mainstream part instead of a high end GPU, allowing its technology to filter down to lower price points and market segments faster than NVIDIA could this generation.

Let's attach some code names, shall we?

NVIDIA's flagship, the GT200 GPU used in the GTX 295, 285, 280 and 260, isn't available in a cheaper version yet. AMD's flagship, the RV770, is already more affordable and is available in cheaper versions. NVIDIA has to rely on its last generation GPU, the G92b, to compete in the rest of the market while the lower end GT200 derivatives get ready for production. Rather than continue to ship products with old names to vendors and customers, NVIDIA slaps a new name on an old GPU and hopes to at least provide the appearance of being just as agile and competitive as AMD despite being clearly caught off guard this generation.

Of course, NVIDIA has a case to make. This is their current generation of hardware, and it is practical and useful to maintain a consistent nomenclature so that the general public knows what the product positioning actually is. We agree, only our solution is top to bottom launches in line with new GPU architectures rather than simply changing the name of old parts so that they look shiny and new.

NVIDIA's take on this is also flawed in that it treats customers like idiots and underlines the fundamental issue we have. Do I need a card with a new name on it to believe that it is worthy of my purchase, or can I go read reviews comparing the hardware and learn for myself whether or not any card (regardless of the name) fills my need? Maybe this name change is for people who don't know anything about graphics hardware then. In that case the thing that "sells" the card is the simple fact that NVIDIA has convinced someone that this part is an affordable version of a card from their latest line of products. Saying they need a name change to maintain current naming is essentially admitting that the only reason the name needs to be changed is to mislead uninformed people.

NVIDIA would love to have 40nm GT200 derivatives out today. Until that day comes, we'll get cards that sound like GT200 based products.

Anyway, we haven't previously tested a 1GB 9800 GTX+, and until this announcement their prices weren't anywhere near reasonable (currently they're up at $200, so the $50 price drop will make a big difference). There is also a slight tweak between the GTS 250 1GB and the 9800 GTX+ 1GB: the memory on the 1GB 9800 GTX+ was underclocked by about 9.1%, and the GTS 250 1GB brings the memory clock back in line with the 512MB 9800 GTX+. So while the 512MB part doesn't perform any differently, moving up to 1GB should no longer incur a performance penalty in games that don't benefit from the extra memory but are memory bandwidth sensitive.
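As a back-of-the-envelope sketch of what that 9.1% underclock cost: assuming the 512MB 9800 GTX+'s 1100MHz GDDR3 clock on its 256-bit bus, with the 1GB card dropped to roughly 1000MHz (specific figures not stated in the article), the bandwidth math works out like this:

```python
# Rough memory bandwidth arithmetic for the 9800 GTX+ 1GB underclock.
# Assumed figures (not from the article): 256-bit bus, GDDR3 at
# 1100 MHz on the 512MB card / GTS 250 1GB vs ~1000 MHz on the
# original 9800 GTX+ 1GB.

BUS_WIDTH_BITS = 256

def bandwidth_gbps(mem_clock_mhz: float) -> float:
    """GDDR3 is double data rate: 2 transfers per clock."""
    effective_mts = mem_clock_mhz * 2                  # mega-transfers/s
    return effective_mts * BUS_WIDTH_BITS / 8 / 1000   # GB/s

full = bandwidth_gbps(1100)    # 70.4 GB/s
under = bandwidth_gbps(1000)   # 64.0 GB/s

print(f"full:    {full:.1f} GB/s")
print(f"under:   {under:.1f} GB/s")
print(f"deficit: {100 * (full - under) / full:.1f}%")  # ~9.1%
```

That deficit only shows up in titles that are bandwidth bound rather than capacity bound, which is exactly the case the paragraph above describes.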

Oh, also wide availability won't be until March 10th. Seriously.

Also, not explained until now is the way the new naming scheme will go forward. Now, GTX, GTS, GT and G (as far as we can gather) will indicate performance segment. The number will be the model number and within a performance segment, higher is better. Essentially NVIDIA has swapped the meaning of letters and numbers in their naming. They have also clearly told us that naming will no longer be attached to GPU architecture, but that vendors may somehow still indicate architecture on the box if they so choose. If nothing else, the feature list and specifications will be a guide. Here's to requiring that people read the fine print to know what they're buying.

For What it's Worth

Early last week Charlie over at The Inquirer posted a story saying that a number of reviewers were cut out of the GeForce GTS 250 launch. We felt a bit hurt; by the time the story ran we hadn't even been asked to a briefing on the GTS 250. Cards had already gone out to other reviewers, but we weren't on any lists. Oh, pout.

Magically, a couple of days after Charlie's article we got invited to a NVIDIA briefing and we had a GTS 250 to test. Perhaps NVIDIA was simply uncharacteristically late in briefing us about a new GPU launch. Perhaps NVIDIA was afraid we'd point out that it was nothing more than a 9800 GTX+ that ran a little cooler. Or perhaps we haven't been positive enough about CUDA and PhysX and NVIDIA was trying to punish us.

Who knows what went on at NVIDIA prior to the launch; we're here to review the card. But for what it's worth - thank you, Charlie :)

103 Comments


  • VooDooAddict - Tuesday, March 03, 2009 - link

    If trying to decide for purchasing, I would cut that list down to the following:

    8800GTS 512MB - Good bang for the $ but hotter, power hungry GPU
    9800GTX+ 512MB - Die shrink gave more speed and lower temps
    GTS 250 1GB - New board design gives better power usage
    Reply
  • SiliconDoc - Wednesday, March 18, 2009 - link

    VooDoo where the heck can you get an 8800gts anymore ? ebay ?
    The 9800GT ultimate by Asus ? Those are literally gone as well ...
    Which brings me to something... I hadn't thought of yet...
    A LOT of the core g80/g92/g92b cards are GONERS - they're sold out !
    So - nvidia makes a new "flavor".
    Golly Wally, I never thought of that before.
    " That's why you're the Beave. "
    ____________________

    Oh, THEY SOLD OUT - HOW ABOUT THAT.
    Reply
  • erple2 - Wednesday, March 04, 2009 - link

    No, I disagree - the OP has a good point. Compare all of the G92 parts together to see just how much real difference there is. Throwing in the G80 part (8800GTX, I suppose) is an interesting twist as well, to show how the G80 evolved over time.

    nVidia has a crazy number of cards that are all "the same". The evaluation proposed sure would help explain away what was going on.

    I'd definitely be interested in seeing what the results of that were!
    Reply
  • SiliconDoc - Wednesday, March 18, 2009 - link

    Go to techpowerup and see their reviews, for instance on the 4830 - it has LOTS of games and lots of g80/g92/g92b flavors - including the gtx768 (G80) which YES, pulls out some wins even against the 4870x2....
    Check it out at techpowerup.
    Reply
  • emboss - Tuesday, March 03, 2009 - link

    I'd also say for the purpose of comparison to throw a G80 in there (ie: a 8800 GTX or Ultra). It'd be interesting if the extra bandwidth and ROPs of the G80s make a difference in any cases. Reply
  • Casper42 - Tuesday, March 03, 2009 - link

    1) You should have included results for a 9800GTX+ so we could truly see if the results were identical to the "new" card.
    2) If you can, please stick a 9800GTX+ and a GTS 250 512MB into the same machine and see if you can still enable SLI.

    I own a 9800 GTX+ and item #2 is especially interesting to me as it means when I want to go SLI, I may have an easier time finding a GTS 250 rather than hunting on eBay for a 9800 GTX+

    Thanks,
    Casper42
    Reply
  • SiliconDoc - Wednesday, March 18, 2009 - link

    Casper as DEREK said in the article > " Anyway, we haven't previously tested a 1GB 9800 GTX+," (until now)
    THAT'S WHAT THEY USED.
    lol
    Yes, well you still don't have an answer to your question though...
    How about the lower power consumption and better memory and core creation translating into higher overclocks ?
    LOL
    No checking that, either...
    "A 9800gtx+" will do - "bahhhumbug ! I hate nvidia and it's the same ding dang thing ! Forget that I derek said it's better memory, a better made core itteration, and therefore lower power, a smaller pcb make, SCREW all that I can' overclock I don't have the DAM*! CARD I HATE NVIDIA ANYWAY SO WHO CARES! "
    ____________________________________

    Sorry for the psychological profile but it's all too obvious - and it's obvious nvidia knows it as well.
    Hope the endless red fan ragers save the multiple billion dollar charge off losers, ati. I really do. I really appreciate the constant slant for ati, I think it helps lower the prices on the cards I like to buy.
    It's great.
    Reply
  • Mr Perfect - Tuesday, March 03, 2009 - link

    Now that's a good question(number 2 that is). Maybe a 9800GTX+ can be BIOS flashed into a 250 to enable SLI? Reply
  • DerekWilson - Tuesday, March 03, 2009 - link

    GTS 250 can be SLI'd with 9800 GTX+ -- NVIDIA usually disables SLI with different device IDs, but this is an exception.

    If you've got a 9800 GTX+ 512MB you can SLI it with a GTS 250. If you have a 9800 GTX+ 1GB you can SLI it with a GTS 250 1GB. You can't mix memory sizes though.

    Also, the 9800 GTX+ and the GTS 250 are completely identical and there is no reason to put two in a system and test them because they are the same card with a different name. At least until NVIDIA's partners release GTS 250s based on the updated board, but even then we don't expect any performance difference whatsoever.

    These numbers were run with a 9800 GTX+ and named GTS 250 to help show the current line up.
    Reply
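Derek's pairing rule above (same G92b silicon can SLI across the two names, but memory sizes must match) can be sketched as a tiny, purely illustrative check - the card names and rule encoding here are assumptions, not an NVIDIA API:

```python
# Hypothetical sketch of the SLI pairing rule described above:
# GTS 250 and 9800 GTX+ are the same G92b silicon, so NVIDIA allows
# them to pair despite different device IDs - but only when the
# memory sizes match.
PAIRABLE = {"GTS 250", "9800 GTX+"}  # same underlying GPU

def can_sli(card_a: str, mem_a_mb: int, card_b: str, mem_b_mb: int) -> bool:
    same_family = {card_a, card_b} <= PAIRABLE
    return same_family and mem_a_mb == mem_b_mb

print(can_sli("9800 GTX+", 512, "GTS 250", 512))    # True
print(can_sli("9800 GTX+", 1024, "GTS 250", 512))   # False: mixed sizes
```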
  • dgingeri - Tuesday, March 03, 2009 - link

    I noticed that the 512MB version of the 4870 beats the GTS 250 1GB in everything, and yet costs the same - or at least it will soon. Even when the video memory makes a big difference, the 512MB 4870 wins out.

    Doesn't this make the 4870 512MB board a better deal?
    Reply
