Even though sub-$100 hardware sells in very high volume, it rarely generates heated debate. People don't usually get excited about mainstream and low-end hardware. The battle over who can run the newest game with the coolest effects at the highest resolution, while not relevant to most people, tends to generate quite a bit of interest. There is a lot of brand loyalty in this industry, and people like to see the horse they backed come out on top. Others, while not siding with a particular company, jump on the bandwagon of whatever company has the fastest part at any given time. I myself am a fan of the fastest hardware out at any given moment. I get excited by how far we've come, and by how much closer the top of the line gets us to the next step. Keeping up with top-of-the-line hardware is more like attending a sporting event or taking in a play: the struggle itself is the entertainment.

For some, knowing what's best does have relevance. For many, many others, it is more important to keep track of hardware that, while cheap, is as capable as possible. And that is where we are today.

At the end of July, NVIDIA released their GeForce 9500 GT. This part (well, the GDDR3 version anyway) is almost a drop-in replacement for the 8600 GT as far as specifications go. In fact, the prices are nearly the same as well.

No, it isn't that exciting. But even these very low-end add-in cards are head and shoulders above integrated graphics solutions. While we'd love to see everything get more performance, the price of the 8600 GT has dropped significantly over time. We still haven't gotten to the point where people who aren't willing or able to spend more than $100 on a graphics card can get a good experience in modern games. At least software and hardware complexity tend to advance in parallel, so the gap between how new a title is and how well cheap hardware can play it isn't getting any worse.

So with so many similarities, why release this part? There won't be an endless supply of G84 hardware going forward, so the G96 comes along with nearly the same specs at the same price. The smaller die of the 65nm G96 (as opposed to the 80nm G84) will also help increase profits for NVIDIA and its board partners while the hardware sells at the same price point. There are rumors that NVIDIA will even move the G96 to 55nm later this year, further increasing savings and possibly enabling passive cooling solutions. But we will have to wait a while yet to find out whether that actually happens.
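To see why a die shrink improves margins at a fixed selling price, here is a rough back-of-the-envelope sketch. The die areas below are purely illustrative assumptions, not official figures for G84 or G96; the formula is a standard first-order gross-die-per-wafer approximation:

```python
import math

def dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float) -> int:
    """First-order approximation: usable wafer area divided by die area,
    minus a correction for partial dies lost around the wafer's edge."""
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(wafer_area / die_area_mm2 - edge_loss)

# Hypothetical die areas for illustration only.
old_part = dies_per_wafer(300, 169)  # assumed 80nm-class die on a 300mm wafer
new_part = dies_per_wafer(300, 121)  # assumed 65nm-class shrink of the same design
print(old_part, new_part)  # the smaller die yields substantially more dies per wafer
```

Since the cost of processing a wafer is roughly fixed, more dies per wafer translates directly into a lower cost per chip, which is exactly the margin improvement a shrink buys at an unchanged price point.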

Before we get into the 9500 GT itself, let's take a look at the state of the industry that brought us to this point.

Mainstream Graphics Today



  • strikeback03 - Tuesday, September 09, 2008 - link

    Is it a failure from the GPU maker side though? Take the 3850. It launched as a ~$170 part, and sold well there. They have probably earned back their development costs, so any profit over the manufacturing cost is gravy now. So if they can convince the buyers to go for the $100 last-gen part instead of the $75 current-gen, they make more money and can spend less in development on the low end. Not great for the consumer, but good for their bottom line.

    What I want to know is how some of these cheaper cards perform outputting video to an HDTV or something. I built a computer for my brother-in-law a few months ago. He had no need for extensive 3D capability, but wanted to be able to run stuff on the TV from the computer. I ended up putting a 9600GT in the system, but couldn't really find any info on these cards in non-gaming scenarios.
  • toyota - Monday, September 08, 2008 - link

    well Jarred, the 9500gt is a completely different core than the 8600gt, but yeah, it's pretty much the same specs. Nvidia loves to have those big numbers. look at most of their very low end parts; they are recycled for several generations.
  • kevinkreiser - Saturday, September 06, 2008 - link

    cards like these are great for htpc owners who need a little bit of graphics performance but not the huge heat and power requirements of a bigger card. i wonder if these new cards play well with the newest htpc motherboards. i just got the asus p5q-em and dropped in an 8800gt to see what would happen. after trying out a billion graphics driver versions i found out that the newest nvidia drivers don't work with that configuration. i had to settle for the 169.02 version drivers. let's hope nvidia debugs the drivers for the htpc crowd by testing on typical htpc mobos like the asus p5q-em.
  • djfourmoney - Sunday, September 07, 2008 - link

    Yeah, but you can get HTPC use out of the upcoming 9400GT or HD4450, which will be out before Thanksgiving...

    As was mentioned before, now that the online media has gotten around to testing this card, it's too late! The HD4670 will be out this coming Wed, and I plan to pick up maybe one or two just for giggles and offer to send one out to a web site if they haven't gotten their boards yet. Or I might upgrade to a Phenom, 790GX and two HD4670's.

    Somebody already Crossfired some engineering samples that Diamond sent him as a reward for a raffle he won. Check out Overclock.net and search "HD4670"


    The HD4670 beats it by at least 40%; it's not even close. If AnandTech was looking for a card that could be a game changer for the PC gaming market, this could be the start. PC programmers should not require somebody to spend $200 on a card just to get good performance. Crysis is the perfect example. Experienced console programmers like Codemasters have done much better with GRID. It will run on midrange hardware, as was proven in the game review on here.

    Even if you turn down the detail, it's no worse than a PS3 or Xbox 360. I ran it at 1920x1200@60Hz and got upper-50s fps, and it was more than playable. I noticed NO slowdown or stutter, glitches, nothing. It could have been a console game, save for it crashing to a BSOD, which only PCs do!

    You could Crossfire two HD4670's and play anything on the market. Maybe not at Ultra or the very highest detail setting, but at the very least at default, which is usually high.
  • wicko - Sunday, September 07, 2008 - link

    Exactly... there is no need for the 9500 when you have the 9400 or 4450, or AMD's 780G platform. If you want a good HTPC, you should be buying one of those, not a 9500GT: much less heat and noise, as well as power consumption. You won't be playing many games on them, but then again, who games on an HTPC?

    You might say, "well, the 9500 is good for media stuff, but then I can also game with it!" Well, just like everyone else is saying, you can get the much faster 4600 when it comes out in your region, or the 3800 now. The 9500GT is definitely pointless, and I think nVidia is hoping people will buy it without doing research.
  • DerekWilson - Saturday, September 06, 2008 - link

    sry -- i had originally left this table out.
  • Finally - Saturday, September 06, 2008 - link

    [quote]It is possible that Larrabee could be a disruptive technology in this market. If Intel is able to deliver a top to bottom launch on day one with volume on all parts, the way graphics hardware is addressed could see a fundamental shift. We might just see the competition realize that they need to change their ways and address the all important low end space with new generations as quickly as possible.[/quote]

    Would you kindly refrain from whipping out your Intel-appreciation crystal ball each time you review hardware that's completely unrelated to Intel's could-bes, might-bes and ifs?
    Thank you.

    PS: Review the cr*p out of it, once it is released, but for this time, if the topic is completely different, why? WHY?

    PPS: Oh, the paycheck... I see.
  • DerekWilson - Saturday, September 06, 2008 - link

    it's just frustrating to see neither nvidia nor amd really pushing performance in this segment. i think after seeing the crappy performance of the 9500 gt in this space that the unknown factor that intel brings to the party might be what we need to get nvidia and amd in line.

    tbh, i don't care so much about how larrabee performs (though it'd be nice to have another solid competitor in the market). what i do care about is nvidia and amd not writing intel off ... i want them to be afraid and to really push the envelope next year.

    my speculation was not for the benefit of intel (they'll sink or swim on their own merit) but for the benefit of the consumer at the response of nvidia and amd to the possibility of competition at the low end.
  • djfourmoney - Sunday, September 07, 2008 - link

    AMD pushed it, wait and see: the fastest card you can buy for under $100, no rebate needed!

    It beats the 9500 by 40% at all the same resolutions they tested. It's also slower than an HD3850, but only by a tiny margin, and given that it needs no external power connector and draws only 75W under load, it's perfect for the pre-built PC and HTPC crowd that might game on occasion. You can run games at 720p and frame rates will be more than acceptable.

  • nubie - Saturday, September 06, 2008 - link


    This card is a failure. Never mind that ATI/AMD can spank it all the way to town and back with an HD3850; its own siblings, the 8800GS/8800GSO, 9600GT and 9600GSO, simply mop the floor with it for around $10-20 more.

    I am so sick of seeing posted in a forum: "I got a new video card and paid $120 for it, but the 8600GT won't let me play [insert any game from the last 2 years] properly".

    The "street" price of these cards is well north of $100. The web price may be in the $50-70 range, but the card is sold retail.

    I wish nVidia would simply give up on pushing this price point. The market is saturated; there's no need to fill a slot that you will need to unload your high-midrange cards into in a few months.

    I don't see a reason for this card. Really. I could be biased, but why spend money on a card that will need a few driver revisions to be as compatible as the 8600GT already is?

    I suppose that you generate 2 sales by releasing this card, but one of them may be to the competition if they are really disgusted.

    I notice that nVidia go to interesting lengths to hide the stream processor counts and memory bus widths of their products. Nowhere on their site are there specifications for their products; you must go to a third party for the information most likely to determine whether a certain product is fast enough.

    Forget educating their customers either. I hope that Intel does shake things up, because this is nuts.
