In the beginning there was the GeForce 8800 GT, and we were happy.

Then we got a faster version: the 8800 GTS 512MB. It was more expensive, but we were still happy.

And then it got complicated.

The original 8800 GT became the 9800 GT. Then NVIDIA overclocked the 8800 GTS and it turned into the 9800 GTX. This made sense, but only if you ignored the fact that the whole lineup started life as the 8800 GT.

The trip gets a little more trippy when you look at what happened on the eve of the Radeon HD 4850 launch. NVIDIA introduced a slightly faster version of the 9800 GTX called the 9800 GTX+. Note that this was the smallest name change in the timeline up to this point, but it was the biggest design change; this mild overclock was enabled by a die shrink to 55nm.

All of that brings us to today where NVIDIA is taking the 9800 GTX+ and calling it a GeForce GTS 250.

Enough about names, here's the card:

You can get it with either 512MB or 1GB of GDDR3 memory, both clocked at 2.2GHz. The core and shader clocks remain the same at 738MHz and 1.836GHz respectively. For all intents and purposes, this thing should perform like a 9800 GTX+.
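Those clocks invite a quick sanity check. Assuming the usual G92b configuration of 128 stream processors on a 256-bit memory bus (figures from the chip's public specs, not stated in this article), the theoretical numbers work out as below; this is a back-of-the-envelope sketch, not benchmark data:

```python
# Back-of-the-envelope specs for the GTS 250 / 9800 GTX+.
# Assumptions: 128 stream processors and a 256-bit bus (standard G92b
# configuration); only the clock speeds come from the article itself.

MEM_CLOCK_GHZ = 2.2        # effective GDDR3 data rate (from the article)
BUS_WIDTH_BITS = 256       # assumed G92b memory bus width
SHADER_CLOCK_GHZ = 1.836   # shader clock (from the article)
STREAM_PROCESSORS = 128    # assumed G92b stream processor count

# Memory bandwidth: effective data rate times bus width in bytes
bandwidth_gbps = MEM_CLOCK_GHZ * (BUS_WIDTH_BITS / 8)

# Peak shader throughput: each SP can retire a MAD + MUL (3 FLOPs) per clock
gflops = STREAM_PROCESSORS * 3 * SHADER_CLOCK_GHZ

print(f"Memory bandwidth: {bandwidth_gbps:.1f} GB/s")   # 70.4 GB/s
print(f"Peak shader throughput: {gflops:.1f} GFLOPS")   # 705.0 GFLOPS
```

Identical math applies to the 9800 GTX+, which is the point: same clocks, same chip, same theoretical throughput.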

If you get the 1GB version, it's got a brand new board design that's an inch and a half shorter than the 9800 GTX+:


GeForce GTS 250 1GB (top) vs. GeForce 9800 GTX+ (bottom)

Unfortunately, the new board design isn't required for the 512MB cards, so chances are that those cards will just be rebranded 9800 GTX+s.

The 512MB cards will sell for $129 while the 1GB cards will sell for $149.

 

While the GPU is still a 55nm G92b, yields on the chip are much more mature now than when the 9800 GTX+ first launched, and power consumption is lower as a result. With GPU and GDDR3 yields up, board costs can be driven down as well. The components on the board draw a little less power, all culminating in a GPU that will somehow contribute to saving the planet a little better than the Radeon HD 4850.


There's only one PCIe power connector on the new GTS 250 1GB boards

Note that you need to have the new board design to be guaranteed the power savings, so for now we can only say that the GTS 250 1GB will translate into power savings:


These are the biggest gains you'll see from this GPU today. It's still a 9800 GTX+.


103 Comments


  • Wurmer - Tuesday, March 3, 2009 - link

    Performance aside, Nvidia should get their naming scheme straight. All this renaming and name swapping only contributes to confusing customers. No matter how it's explained, make it simple: higher number, more powerful card! In this regard, I find that ATI has made an effort of late.

    I'll also agree with one of the above posters that Nvidia was taken aback by the release of the 4870 and 4850. ATI hit the nail right on the head and the green team seems to be having a bit of a hard time devising a proper response. Instead of getting their prey in their gun sight, they use a shotgun and pepper the target all around......
  • SiliconDoc - Wednesday, March 18, 2009 - link

    Shotguns usually KILL with just one shot - and ATI has caused another charge off, another BILLION dollar loss for AMD.
    I'm not sure NVidia hit them, (they obviously don't need to) so let's hope they don't, as well.
    I'm also not sure what it means when people keep speculating that ATI "caught them off guard" - it doesn't really mean anything - it's just a stupid way of saying "ATI did better than I expected" ( but it's "cool" to not say that and put down NVidia instead, huh... since so many around the geek spots "taught you to say it that way" ).
    Then after "being caught off guard" NVidia "drops a card" and it's because "they panicked" - right ?
    DEREK - explained it - didn't he.... " NVida released the 9800GTX... "on the eve of the 4850 launch"....
    YES, THAT'S HOW OFF GUARD THEY WERE.... NVIDIA HUH...
    They released a DAY BEFORE ati did....
    And they're STILL USING the 9800 - to battle the 4850 that was released AFTER Nvidia....released their card.
    Oh well....
    DEREK tried to make nvidia sound evil, too - for releasing "on the eve of " the sainted red card 4850 release day - those nasty nvidia people spoiling the launch by releasing on ati's "eve"...
    By golly, it's no wonder Obama was elected my friend.

  • Mr Perfect - Tuesday, March 3, 2009 - link

    Hey, uhm, in that link posted on page two, there is mention that the press review cards are specially picked by Nvidia. Any idea if this is true?
  • SiliconDoc - Wednesday, March 18, 2009 - link

    Doesn't matter when they are stock clocked. The reviewers can do all sorts of things to "compensate" for how they want outcomes, anyway, like editing ini files and choosing the games and the order - heck even using a card that isn't the card they are supposedly reviewing.
  • spinportal - Tuesday, March 3, 2009 - link

    The GTS 250 1GB is barely useful over its 512MB counterpart except for power usage and slot size. The est. street price of $149 is already countered by the AMD 4870 512MB, and tests show it's a hair better. Given the 250 uses less juice than the 4870, it's odd that 250 SLI uses more juice than 260 core 216 SLI, so there goes that benefit.
    NVidia cannot strip down the GT200 core to reduce the power load from two 6-pin connectors to just one for 150W. Perhaps there is something to be said for GDDR5 power reduction.
    Either way, the 250 is a win for NVidia in the mainstream budget segment for less power usage, CUDA, & PhysX at the same price as the 4870 512MB, which runs hotter, noisier (probably) and is less feature rich. Does DX 10.1 matter at this point? PureVideo 2 is a wash vs. AMD's UVD.
    It's distasteful rebadging a G92b in the GT200 naming scheme. This helps NVidia's costs by EOLing the whole 8800GT architecture.
    But by April, who's going to care? New spins are coming. This stopgap is only to reduce bleeding. Hopefully next gen is executed better so performance grows as power demands decrease.
  • kx5500 - Thursday, March 5, 2009 - link


    Shut the *beep* up f aggot, before you get your face bashed in and cut
    to ribbons, and your throat slit.
  • SiliconDoc - Wednesday, March 18, 2009 - link

    Take your finger off the key, you're repeating, you dummy.
  • Wolfpup - Tuesday, March 3, 2009 - link

    I'm none too crazy about the random GPU/CPU naming this industry has always seen, but I disagree that Nvidia somehow needs a budget version of the 260/280 GPUs.

    I mean the 8800 series is basically the same thing. Sure there are some improvements here and there, but basically the 260 is the same thing, but with massively more execution hardware. I don't really see this huge distinction between buying a 250/9800GTX+ and it not being a stripped down 260...I mean it would almost be the same thing anyway.

    And if nothing else, it's great to see how much less this 250 uses in terms of power. I mean this is still a really nice GPU that I'd be glad to have in a system. (I'm on a laptop with a 9650GT, which is yet another 32 processor part...and even this isn't half bad at all!)
  • SiliconDoc - Wednesday, March 18, 2009 - link

    The red fanboys need Nvidia to cut the arms and legs off the GT200 and turn it into a puny 4830 (that has the top core ati has made, can make).
    The PROBLEM for red fanboy red freaks is just that... their great and sainted ATI has their "superwhomper super efficient super technology core! Oh Goddd! it's so greaaattt ! (*red boy orgasm)" - in the lowly 4830 - now compare that to the GTX260/192 and THAT'S where the red fanboys stands (or, really cries, rather).
    Now look at the 4830 top ATI core and compare it to the GTX285... oh that's PAINFUL.
    Now do the reverse...
    Compare the top core 4870 to the top Nvidia core GTX260/192 (lowest iteration) - and golly, it EVEN BEATS the 4870 sometimes...
    So THERE you have "the big ugly problem" for the red fanboys - who keep wailing that Nvidia MUST make a lower GT200 part for them...
    ( their tiny inner red child needs some comfort - it's just not fair with that big evil green GT200 SPANKING the rv770 so badly ! It's "abuse" ! )
    Can we have a GT200 core that is as LOUSY at the 4830 ?!?! please please pretty please!!! we have some really hurting little red fanboy crybaby whining fud propaganda doper diaper babies that need some satisfaction and their little red egos massaged...
    DON'T make them face the truth, EVER , nvidia, you big mean greedy green ...
    ( Yes, dude, they keep begging for it, and THAT'S WHY ! )
    There is no doubt about it.
  • LuxZg - Tuesday, March 3, 2009 - link

    How about making an article where you'd test all those G92 renames, rebrands, overclocks and shrinks?

    So pit these against one another: 8800GT 256MB (137$), 8800GT 512MB (154$), 8800GTS (171$), 9800GT 512MB (148$), 9800GT 1GB (171$), 9800GTX (205$), 9800GTX+ (214$), 9800GTX 1GB (228$), GTS 250 512MB (+230$ ??), GTS 250 1GB (+240$ ??). And if I've missed some variation, please include that too :)

    Then test them all overclocked with their default cooling.

    I just want to see how far they've come from the first revision to these last ones. Prices in brackets are local prices in Croatia, where I live. And yes, you can buy them all.. I even found an 8800GTX 768MB card and an 8800GTS 320MB as well; interestingly, both at the same 171$ price that gets you a "new" 8800GTS (512MB) or 9800GT 1GB :D

    Now, since you can buy all these cards, and most of them are really close in price (some 100$ from the all-new top to the rock-bottom 8800GT 256MB; if you exclude those, then it's just some 60$ difference) - it would make an interesting article. Especially for those aiming at SLI with an old card they bought earlier.

    And while we're at it, I support the above comment - try SLIing these 8800/9800 cards with the new GTS 250. There should be no problem, but check the performance (gains) anyway..
