In the beginning there was the GeForce 8800 GT, and we were happy.

Then we got a faster version: the 8800 GTS 512MB. It was more expensive, but we were still happy.

And then it got complicated.

The original 8800 GT became the 9800 GT. Then NVIDIA overclocked the 8800 GTS and it turned into the 9800 GTX. Now this made sense, but only if you ignored the fact that the thing was an 8800 GT to begin with.

The trip gets a little more trippy when you look at what happened on the eve of the Radeon HD 4850 launch. NVIDIA introduced a slightly faster version of the 9800 GTX called the 9800 GTX+. Note that this was the smallest name change in the timeline up to this point, but it was the biggest design change; this mild overclock was enabled by a die shrink to 55nm.

All of that brings us to today where NVIDIA is taking the 9800 GTX+ and calling it a GeForce GTS 250.
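The rebranding timeline above can be summarized in a few lines. This is just an illustrative sketch of the lineage as described in this article, not anything official from NVIDIA:

```python
# Each entry: (old name, new name, what actually changed) -- per the
# rebranding history recounted above.
lineage = [
    ("8800 GT", "9800 GT", "rebrand"),
    ("8800 GTS 512MB", "9800 GTX", "overclock"),
    ("9800 GTX", "9800 GTX+", "55nm die shrink + mild overclock"),
    ("9800 GTX+", "GTS 250", "rebrand, optional new 1GB board"),
]

for old, new, change in lineage:
    print(f"{old} -> {new} ({change})")
```

Note that the chain links up: each card after the first is renamed from the previous step's output, which is exactly why the GTS 250 is, at its core, still a tweaked 8800 GTS.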

Enough about names, here's the card:

You can get it with either 512MB or 1GB of GDDR3 memory, both clocked at 2.2GHz. The core and shader clocks remain the same at 738MHz and 1.836GHz respectively. For all intents and purposes, this thing should perform like a 9800 GTX+.
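Since the memory clock is unchanged, peak memory bandwidth is unchanged too. A quick back-of-the-envelope check, assuming the GTS 250 keeps the 9800 GTX+'s 256-bit memory interface (the article only quotes the 2.2GHz effective GDDR3 data rate):

```python
def memory_bandwidth_gbps(effective_clock_ghz, bus_width_bits):
    """Peak theoretical memory bandwidth in GB/s:
    (effective data rate in GT/s) * (bus width in bytes)."""
    return effective_clock_ghz * bus_width_bits / 8

# 2.2GHz effective GDDR3 on an assumed 256-bit bus
gts250_bw = memory_bandwidth_gbps(2.2, 256)
print(f"{gts250_bw:.1f} GB/s")  # 70.4 GB/s
```

Same clocks, same bus, same bandwidth: one more reason to expect 9800 GTX+ performance.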

If you get the 1GB version, it's got a brand new board design that's an inch and a half shorter than the 9800 GTX+:


GeForce GTS 250 1GB (top) vs. GeForce 9800 GTX+ (bottom)

Unfortunately, the new board design isn't required for the 512MB cards, so chances are that those will simply be rebranded 9800 GTX+s.

The 512MB cards will sell for $129 while the 1GB cards will sell for $149.


While the GPU is still the 55nm G92b, yields have matured considerably since the 9800 GTX+ first launched, and power consumption is lower as a result. With GPU and GDDR3 yields up, board costs can be driven down as well. The components on the board draw a little less power, all culminating in a GPU that will somehow contribute to saving the planet a little better than the Radeon HD 4850.


There's only one PCIe power connector on the new GTS 250 1GB boards

Note that you need the new board design to be guaranteed these power savings, so for now we can only make that claim for the GTS 250 1GB:


These are the biggest gains you'll see from this GPU today. It's still a 9800 GTX+.

