Why NVIDIA Did It

To understand the motivation behind NVIDIA's naming and renaming and renaming, we have to once again look at its approach to GPU design. NVIDIA continued to architect very high end GPUs and let that technology, over the course of 9 to 12 months, trickle down to mid range and lower end market segments. AMD stepped in and launched a very competitive performance mainstream part instead of a high end GPU, allowing its technology to cascade down to lower price points and market segments more quickly than NVIDIA could manage this generation.

Let's attach some code names, shall we?

NVIDIA's flagship, the GT200 GPU used in the GTX 295, 285, 280 and 260, isn't available in a cheaper version yet. AMD's flagship, the RV770, is already more affordable and is available in cheaper versions. NVIDIA has to rely on its last generation GPU, the G92b, to compete in the rest of the market while the lower end GT200 derivatives get ready for production. Rather than continue to ship products with old names to vendors and customers, NVIDIA slaps a new name on an old GPU and hopes to at least provide the appearance of being just as agile and competitive as AMD despite being clearly caught off guard this generation.

Of course, NVIDIA has a case to make. This is their current generation of hardware, and it is practical and useful to maintain a consistent nomenclature so that the general public knows what the product positioning actually is. We agree; our solution, though, is top-to-bottom launches in line with new GPU architectures rather than simply renaming old parts so that they look shiny and new.

NVIDIA's take on this is also flawed in that it treats customers like idiots, and that underlines the fundamental issue we have. Do I need a card with a new name on it to believe it is worthy of my purchase, or can I go read reviews comparing the hardware and learn for myself whether any card (regardless of the name) fills my need? Maybe the name change is for people who don't know anything about graphics hardware, then. In that case, the thing that "sells" the card is the simple fact that NVIDIA has convinced someone that this part is an affordable version of a card from its latest line of products. Saying a name change is needed to keep the naming current is essentially admitting that the only reason to change the name is to mislead uninformed buyers.

NVIDIA would love to have 40nm GT200 derivatives out today. Until that day comes, we'll get cards that sound like GT200 based products.

Anyway, we haven't previously tested a 1GB 9800 GTX+, and until this announcement its prices haven't been anywhere near reasonable (currently they're up at $200, so the $50 price drop will make a big difference). There is also a slight tweak between the GTS 250 1GB and the 9800 GTX+ 1GB: the memory on the 1GB 9800 GTX+ was underclocked by about 9.1%, and the GTS 250 1GB brings the memory clock back in line with the 512MB 9800 GTX+. So while the 512MB part doesn't perform any differently, moving up to 1GB should no longer degrade performance in games that don't benefit from the extra memory but are sensitive to memory bandwidth.
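To put that 9.1% in perspective, here's a quick back-of-the-envelope bandwidth check; a minimal sketch in Python. The 1100 MHz GDDR3 clock and 256-bit bus are the commonly reported 9800 GTX+ specs rather than figures from this announcement, so treat them as our assumption; the ~9.1% underclock is from above.

```python
# Peak memory bandwidth sketch. Assumed specs: 1100 MHz GDDR3 on a
# 256-bit bus (commonly reported for the 9800 GTX+, not confirmed here).
BUS_WIDTH_BITS = 256   # memory bus width shared by both cards
DDR_TRANSFERS = 2      # GDDR3 moves data twice per clock

def bandwidth_gb_s(mem_clock_mhz: float) -> float:
    """Peak bandwidth in GB/s: clock * transfers per clock * bytes per transfer."""
    return mem_clock_mhz * 1e6 * DDR_TRANSFERS * (BUS_WIDTH_BITS / 8) / 1e9

full = bandwidth_gb_s(1100)                         # 512MB 9800 GTX+ / GTS 250 1GB
underclocked = bandwidth_gb_s(1100 * (1 - 0.091))   # 1GB 9800 GTX+

print(f"full clock:   {full:.1f} GB/s")         # ~70.4 GB/s
print(f"underclocked: {underclocked:.1f} GB/s")  # ~64.0 GB/s
```

Under those assumptions the underclock costs roughly 6 GB/s of peak bandwidth, which is the sort of gap that can show up in bandwidth-bound titles.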

Oh, also wide availability won't be until March 10th. Seriously.

Also not explained until now is how the new naming scheme will go forward. GTX, GTS, GT and G (as far as we can gather) will indicate performance segment. The number will be the model number, and within a performance segment, higher is better. Essentially NVIDIA has swapped the meaning of letters and numbers in its naming. It has also clearly told us that naming will no longer be attached to GPU architecture, but that vendors may somehow still indicate architecture on the box if they so choose. If nothing else, the feature list and specifications will be a guide. Here's to requiring that people read the fine print to know what they're buying.
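For illustration only, here's a tiny Python sketch of how the new names would parse. The segment labels in the mapping are our own reading of the tiers, not an official NVIDIA table.

```python
# Toy parser for the new GeForce naming scheme as we understand it.
# Segment descriptions are our guesses; NVIDIA has published no such table.
SEGMENTS = {
    "GTX": "enthusiast",
    "GTS": "performance",
    "GT": "mainstream",
    "G": "entry level",
}

def parse_geforce_name(name: str):
    """Split a name like 'GTS 250' into (segment, model number).

    Old scheme: the number carried the family (9800) and the letters the
    tier (GTX+). New scheme: the letters carry the tier, the number is the
    model, and within a segment a higher number is better.
    """
    prefix, number = name.split()
    return SEGMENTS[prefix], int(number)

print(parse_geforce_name("GTS 250"))  # ('performance', 250)
print(parse_geforce_name("GTX 285"))  # ('enthusiast', 285)
```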

For What It's Worth

Early last week, Charlie over at The Inquirer posted a story saying that a number of reviewers were cut out of the GeForce GTS 250 launch. We felt a bit hurt; by the time the story went up, we hadn't even been asked to a briefing about the GTS 250. Cards had already gone out to other reviewers, but we weren't on any lists. Oh, pout.

Magically, a couple of days after Charlie's article, we got invited to an NVIDIA briefing and we had a GTS 250 to test. Perhaps NVIDIA was simply uncharacteristically late in briefing us about a new GPU launch. Perhaps NVIDIA was afraid we'd point out that it was nothing more than a 9800 GTX+ that ran a little cooler. Or perhaps we haven't been positive enough about CUDA and PhysX, and NVIDIA was trying to punish us.

Who knows what went on at NVIDIA prior to the launch; we're here to review the card. But for what it's worth: thank you, Charlie :)

Comments

  • Leyawiin - Tuesday, March 3, 2009 - link

    Good refinement of an already good card. New, more compact PCB, lower power consumption, lower heat, better performance, 1GB. If Nvidia feels that's worthy of a rename, why should anyone get their drawers in a bunch?

    But please, let the conspiracy theories fly if there was a rewrite of the conclusion. Could be it was just poorly done and wasn't edited, but that's not as fun as insinuating Nvidia must have put pressure on AT.
  • Gannon - Tuesday, March 3, 2009 - link

    Because it's lying; the core should always match the original naming scheme. Nvidia is just doing this to get rid of inventory and cause market confusion so that dimwits who don't do their research go for the 'newer' ones, when in fact they're the older ones.

    I hate this practice. Creative did the same thing with some of their Sound Blaster cards; the Sound Blaster PCI, I believe it was. It was some other chipset from a company they had bought out, merely renamed and rebadged as "Sound Blaster".

    Needless to say, I hate the practice of deceiving customers. Imagine you're in a restaurant and you ordered something, but then they switched it on you for something else; you'd rightly get pissed off.

    If people weren't so clueless about technology they wouldn't get away with this shit. This is where the market fails: when your customers are clueless, it's sheep to the slaughter.
  • SiliconDoc - Wednesday, March 18, 2009 - link

    Yeah, imagine: you ordered Coke one day, and the next week you ordered Coca-Cola off the same menu, and they even had the nerve to bring it in a different-shaped glass. What nerve, huh!
    They squirted a little more syrup in the latter mix, and a bit less ice, and you screamed you were deceived and they tricked you, and then you went off wailing away that it's the same thing anyway, but you want the Coca-Cola not the Coke because it tasted just a tiny bit better, and they had darn better come up with some honest names.
    Then ten thousand morons agreed with you - then the cops hauled you out in a straitjacket.
    I see what you mean.
    Coke is Coca-Cola, and it should not be renamed like that - or heck, people might buy it.
    I guess that isn't fair ... because people might buy it. It might even be a different price at a different restaurant, or even be called something else and taste different out of a can vs. a glass - and heck, that ain't "fair".
    You do know I think you're all pretty much whining lunatics, now, right? Just my silly opinion, huh.
    Coke, Coca-Cola, soda, pop, golly - what will people do but listen to the endless whiners SCREAM it's all the same and stop fooling people....
    I guess it was a slow news YEAR.
  • SunnyD - Tuesday, March 3, 2009 - link

    Since NVIDIA really wanted to push PhysX... I'm curious which, if any, of the tested titles have PhysX support, and whether it was enabled in those titles as tested. I'd be really interested to see what kind of performance hit the PhysX "holy grail" takes on this new/old card when trying to compare it to its competition.
  • SiliconDoc - Wednesday, March 18, 2009 - link

    I wonder why they haven't done a Mirror's Edge PhysX extravaganza test - they can use secondary PhysX cards and then use the primary for enabling, turn it on and off and compare - etc.
    But not here - Derek would grind off all his tooth enamel, and Anand can't afford the insurance for him.
  • SiliconDoc - Wednesday, March 18, 2009 - link

    Derek CAN'T include PhysX, and NEVER SAYS whether or not he has it disabled in the nvidia driver panel - although this site USED to say that.
    If they even dare bring up PhysX - ati looks BAD.
    Hence, keeping as absolutely MUM as possible is the best red fan rager course.
    You see of course, Derek the red has to admit that yes, even NVIDIA ITSELF brought this SAD BIAS up to Derek...
    Oh well, once a raging red rooster, always a red rooster - and NOTHING is going to change that. (or so it appears)
    Is that 10 or 15 points of absolute glaring bias now ?
    ____________________________________________________________
    " We're just trying to save the billions losing ati so we have competition and lower prices - so shut up SiliconDoc ! Do you want to pay more, ALL OVER AGAIN FOR NVIDIA CARDS !!?!"
    ____________________________________________________________

    PS, thanks for lying so much, red roosters; you've done a wonderful job of endless bs and fud. Hopefully now Obama can bail out amd/ati, and my nvidia CUDA, Badaboom, low power, game profiles, forced sli, PhysX cards will remain the best buy and continue to utterly dominate with only DDR3 memory in them.

    PPS - Yes, I can hardly wait for nvidia DDR5 - oh, will that ever be fun - be ready to rip the red badges off your puny chests, fellers - I'm sure you'll suddenly find a way to reverse 180 degrees after a few weeks of "hating nvidia for stealing ati ddr5 intellectual property".
    LOL
    Oh it's gonna be a blast.
  • C'DaleRider - Tuesday, March 3, 2009 - link

    Very early this morning, I stumbled upon this article when it was originally put up....and went directly to the conclusions page. Interesting read....and I should have saved that page.

    Subsequently, the entire review went down with this reasoning, "...ust we had some engine issues... missing images and such. I don't have the images or I'd put them on the server and set the article to "live" again. Anand and Derek have been notified; sorry for the delays."

    Well, it's back up and what do you know.....the conclusions have now become somewhat softer, or as a few others on another forum put it who also saw the "original" review...circumcised, censored, and bullied by nVidia.

    Shame that the original conclusion has been redone....would have liked others to actually see AT had some independence. Guess that's a lost ideal now...........
  • strikeback03 - Tuesday, March 3, 2009 - link

    Interesting, you mentioned in the comments in the other article that you didn't get to see any of the review, as when you clicked it went to the i7 system review.
  • JarredWalton - Wednesday, March 4, 2009 - link

    Thanks for the speculation, but I can 100% guarantee that the "pulling" of the article was me taking it down due to missing images. I did it, and I never even looked at the rest of the article, seeing that it was 3AM and I had just finished editing a different article.

    Was the conclusion edited before it was put back up? Yes, but not by me. That's not really unusual, though, since we typically have someone else read over things before an article goes live, and with a bit more discussion the wording can be changed around. It would have changed regardless, and not because of anything NVIDIA said.

    Is the 9800 GTX+ naming change stupid? I certainly think so. However, that doesn't make the current conclusion wrong. The card reworking does have benefits, and at the new price it's definitely worth a look as a midrange option.
  • RamarC - Tuesday, March 3, 2009 - link

    please consider styling the resolution links so they stand out a bit or look button-ish. it took me a minute to realize they were clickable.
