ATI vs. NVIDIA Once Again: 4670 vs 9500 GT & 9600 GSO

Now we get into the real competition: AMD's newest mainstream card against NVIDIA's two latest entries in the sub-$100 market. The 9600 GSO is essentially a rebadged 8800 GS. Though we didn't compare it here, the 9600 GT is priced right around $100 and offers performance a little better than the 9600 GSO. While we are comparing against the 9500 GT here, it will quickly become apparent that the card doesn't even come close to competing with the 4670.

In our medium quality Crysis test, the 4670 appears frame-limited at about 60 fps, while the 9600 GSO pushes past the 60 fps barrier. At 1280x1024 and above, the 4670 leads the pack by a small margin in this benchmark.

Enemy Territory with 4xAA enabled shows a huge advantage for the 4670 over NVIDIA's more expensive 9600 GSO. Even so, the NVIDIA cards remain playable at 1280x1024, so the practical advantage is somewhat diminished until we get to 1680x1050.

Once again, Oblivion gives us a role reversal when AA is enabled. Without AA, the 4670 falls behind above 1024x768, while with 4xAA enabled it pulls a little ahead. Both the 4670 and 9600 GSO are playable at 1280x1024 with AA, and both might be passable at 1680x1050 as well (Oblivion is a game that still offers a good experience at anything above 25 fps).


The 9600 GSO maintains a steady ~10% lead over the 4670 in Age of Conan. This is an interesting benchmark for NVIDIA to lead, considering how handily the company is clobbered at higher price points by the 4850 and 4870.

Performance in GRID is nearly identical between the 4670 and 9600 GSO.


Throw AA into the mix and a little separation starts to appear, though the 9600 GSO does keep up with the 4670 at lower resolutions even with AA enabled.

In The Witcher, the 4670 leads at lower resolutions, while performance converges at higher ones. Both cards are playable at 1280x1024. We wanted to test with AA here, but the game limits the ability to enable AA based on framebuffer size; as a result, we couldn't enable 4xAA above 1024x768 on the 9600 GSO.

When we hit high quality Crysis, the 4670 leads.

And the lead only grows when we look at medium quality (with high quality shaders) and 4xAA enabled. In this situation the 4670 is borderline playable at 1280x1024, while the 9600 GSO falls well short.

Comments

  • strikeback03 - Thursday, September 11, 2008 - link

    Plus, there is the fact that CRTs blurred everything. The LCD image is so much sharper (at native resolution) that jaggies are much more apparent.
  • razor2025 - Wednesday, September 10, 2008 - link

    4670 looks to be a great low-to-mid range card. I've been wanting to get a slimline PC, but the current choices in low-profile GPUs are still lacking. Sure, there's the 9600GT low-profile, but that requires a 400-watt PSU, and it's already a hot card in full-length form. If there's a 4670 low-profile, I'd buy it in a heartbeat as long as they keep it under $80.

    As for the review itself, it was terribly written. AT articles seem to be on a decline in recent times. Horrible graph choices and questionable writing. Also, an entire page dedicated to talking about a competitor's product. How low can we go, AT? Oh, and when can we have a decent motherboard roundup? You know, the one that's been promised since last summer (when the 690G came out)?
  • Pale Rider - Wednesday, September 10, 2008 - link

    "Now, if we could get a 3870 for about $100 (a 9600 GT fits the bill here, as 3870 cards can't be found for that price),"

    This is just flat out misleading information.
  • djfourmoney - Wednesday, September 10, 2008 - link

    Okay, honesty-is-the-best-policy time:

    When a co-worker wanted to purchase a new PC, he consulted me. I told him to "future-proof" himself and get a quad core. Being a parent of two children and the only one working in the household, he can't afford to upgrade every 2-3 years like hardcore gamers, power users, and overclockers do.

    Now it's your typical $900-1000 Dell, and of course he could have gotten equal performance for much less if he DIY'ed it. Let's be honest, people: when you consider that there's still an intimidation factor with any electronics, let alone PCs, which seem to crash on their own (of course, that's not always true either), you can understand why most people value pre-builts and being able to call up Dell tech support if something happens.

    At least when benchmarking, sites should use not only the most powerful system they can find, but also one that a typical end user would have: slower dual cores and even late-era single-core CPUs like the Pentium D and Athlon 64.

    I bought my Dell back in 2003, and I didn't build a new system until I found I couldn't get any more out of the old one; that was about 8 months ago (2008). So that's five years between systems, and if I wanted to watch HD content the regular way, my old Dell was just fine the way it was; it played HD at 720p without issue.

    This new system is middle of the road in terms of power and crushes most PCs in the $600-700 range available from HP/Compaq or Dell (3.1GHz 5000+ BE).

    My point is that most people don't have PCs with $200-$1200 GPUs and couldn't fathom spending $500 on a video card, or even $200. $100 is the threshold for most people, and that's pushing it; only 20-somethings and teens would even think it's "reasonable" to spend $100 on a card for gaming performance.

    I'll go ahead and do what all these other sites aren't doing: I'll give you a user review of the HD 4670 on a basic system (Dell 530) with a 19" LCD, depending on when I get one. Currently only Newegg has it, and I won't be able to order until Friday.

    Even if you don't have a modern CPU, the card only has a 128-bit bus, so I doubt an older dual core or even a single core would bottleneck performance. It really depends on what games you play. First-person shooters are more GPU-dependent than CPU-dependent. Racing games, with their physics and AI, tend to use a fair amount of CPU power, which is why GRID recommends a 3.0GHz single core; Bit-Tech tried it with both a single core and a dual core, and it clearly ran faster with the dual core.

  • tacoburrito - Wednesday, September 10, 2008 - link

    Seriously, do people really expect a $79 card to perform anywhere near the level of a $180 card like the 4850? AMD would be stupid to allow that. If it happened, who would want to buy the 4850 or 4870? AMD crippled the 4670's performance for a reason: to avoid cannibalizing the sales of its higher-end cards.
  • The0ne - Wednesday, September 10, 2008 - link

    This article could be one that girls/women would avoid. Just seeing the term "epic fail" is already a turn-off for me. It just seems so childish, and in the same terminology, I guess, childish for a review.
  • Laura Wilson - Wednesday, September 10, 2008 - link

    ok i'm a girl/woman and my favorite part of this article was the term "epic fail," but perhaps i'm stunted in my fifth grade humor...
  • Pale Rider - Wednesday, September 10, 2008 - link

    I agree. What are we, in the 5th grade? I bet the neffers in OT love it.
  • Pale Rider - Wednesday, September 10, 2008 - link

    Once again AT has an ATI review with an entire page reserved for nVidia product information. Every ATi review we get from AT seems to have an entire page dedicated to nVidia products; funny how the nVidia reviews NEVER have entire pages dedicated to ATi products.
  • KikassAssassin - Thursday, September 11, 2008 - link

    That's because with every ATI release, nVidia scrambles to put out a new (usually re-hashed) part as an answer to ATI's new product, so the review sites naturally compare the two cards together. ATI doesn't have OCD about putting out a direct answer to every single product their competitor releases like nVidia does, so review articles on nVidia products don't usually have anything new from ATI to talk about.

    You can't blame the review sites for this one. They're just reporting on what the companies are doing. Instead, blame nVidia for saturating the market with a ridiculous number of redundant products.
