8800 GT 256 vs. Radeon HD 3870/3850

Priced at $219 - $229, the 8800 GT 256 lands squarely between the Radeon HD 3850 at $179 and the Radeon HD 3870 at around $250.

Making our job extremely difficult, the 8800 GT 256 performs in between the two Radeon HD 3800 cards in almost all benchmarks, with a couple of exceptions.

Performance in Crysis continues to be an issue for the Radeon HD series, which AMD insists is simply a matter of driver optimizations. The same problem exists in Oblivion, but there's no excuse for a lack of driver optimizations there; Oblivion has been out for a very long time now.

Then there are games like World in Conflict where the 8800 GT 256 performs like a Radeon HD 3850 or worse.

Overall, it seems like the 256MB 8800 GT can justify its price: it's cheaper than the Radeon HD 3870 but slower in most cases, and more expensive than the 3850 but faster. The problem is that the Radeon HD 3870, at $250, isn't that much more expensive and comes equipped with twice as much memory. If AMD could bring Crysis performance on par with NVIDIA's, it'd be an easy recommendation, but instead we're left with these odd caveats.

The Radeon HD 3870 gets the nod from us here: it's not much more expensive than the 256MB 8800 GT, and you get twice the frame buffer and better performance in almost all scenarios. Crysis performance is a big deal, however, and the 256MB 8800 GT is a bit cheaper, so if you want a slightly more affordable alternative to the Radeon HD 3870 but don't want to step down to the 3850, it may not be a bad option.
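To put the pricing argument in concrete terms, the price gaps work out as follows (a quick sketch using the street prices quoted above; the $224 figure for the 8800 GT 256 is simply the midpoint of its $219 - $229 range):

```python
# Street prices quoted above (USD); 224 is the midpoint of the GT 256's range
prices = {"HD 3850": 179, "8800 GT 256": 224, "HD 3870": 250}

def premium_pct(cheaper: str, pricier: str) -> float:
    """Percent price premium of `pricier` over `cheaper`."""
    return 100 * (prices[pricier] - prices[cheaper]) / prices[cheaper]

print(round(premium_pct("HD 3850", "8800 GT 256"), 1))  # 25.1
print(round(premium_pct("8800 GT 256", "HD 3870"), 1))  # 11.6
```

In other words, stepping from the GT 256 up to the 3870 costs less than half the percentage premium of stepping from the 3850 up to the GT 256, which is the crux of the recommendation.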

56 Comments

  • Griswold - Thursday, December 13, 2007 - link

    Also (partly) wrong. It's a good price/performance part and it's short in supply. That's why it's priced higher. And I'm willing to bet the supply shortage is artificial. Look at how the availability of the GTS 512 is - it seems to be much better than that of the GT. It's no surprise. Nvidia's margins on the GT must be abysmal compared to those of higher-priced units (that's a given, but they also rendered almost their complete lineup obsolete for several weeks prior to the launch of the GTS 512), but they needed that horse to compete at the 3850/3870 price point.

    And you really need to stop talking out of your ass about the 3850. It's selling well, and it's selling at MSRP because supply is decent (and you lecture him about fundamentals...). I think The Register claimed 150k units in 3 weeks. Well, that's three times the number of 8800 GT units available in the same timeframe. Speaks for itself.
  • neogodless - Tuesday, December 11, 2007 - link

    Whew... just bought an 8800GT and would like to feel like it was a good buy for a *little while*! Hope it has enough supply to help drive prices down in general though...
  • R3MF - Tuesday, December 11, 2007 - link

    Where are the G92 GTS cards with memory over 2.0GHz?
    Does this presage the entrance of a G92 GTX with memory at 2.4GHz and a higher core clock?

    It isn't rocket science to put some decent-speed memory on a midrange card. Witness the 3870 with 2.35GHz memory, so why haven't any of the so-called "OC" versions of the G92 GTS got overclocked memory?

    At the same time we all want a card that can play Crysis at 1920x1200 at High details and still get around 30FPS. The GTS can get ~30FPS at Medium details........... whoopy-do!

    So, we know it's possible to economically provide more bandwidth and we know it's necessary, but nobody has done so, including the OC'ed versions.

    Is this because there is a G92 GTX product around the corner?

    Yes, I know a G92 GX2 dual-card is rumoured for sometime in January, but how about a non-cack single-card version?

    A card with:
    720MHz core clock
    2000MHz shaders
    2400MHz memory
    1GB of memory

    would absolutely rock, so why haven't we got one?
  • kilkennycat - Tuesday, December 11, 2007 - link

    Memory tweaking of the current series is a tiny marginal benefit with a huge increase in power dissipation. The G92 represents the last gasp of the current G8x/G9x architecture. The shrink was absolutely essential to nVidia's GPU business to get away from the huge, power-hungry and low-yield G80 GPU.

    The true high-end replacement family for the 8xxx-series is coming around Q2 of 2008. It has been in design for at least the past year and is NOT just a tweak of the G8x/G9x architecture. If you really HAVE TO upgrade your system right now, just get a SINGLE 8800GT 512. At this point in time, do not invest in SLI. Keep your hands in your pockets and wait for the next gen. A single copy of the high-end version of the next-gen GPU family from nVidia is likely to have more GPU horsepower than dual 8800GTXs.
  • Griswold - Thursday, December 13, 2007 - link

    "The true high-end replacement family for the 8xxx-series is coming around Q2 of 2008. It has been in design for at least the past year and is NOT just a tweak of the G8x/G9x architecture."

    It's going to be an evolved (note: that's a fair bit more than just tweaked) G80/G92. You don't design a completely new architecture in a year. Remember what Nvidia claimed at the launch of the G80? It had been in the works for several years. They will squeeze every bit of revenue out of this architecture before they launch their true next-generation architecture (on which at least one team must have been working since the launch of the G80).
  • retrospooty - Tuesday, December 11, 2007 - link

    A card with:
    720MHz core clock
    2000MHz shaders
    2400MHz memory
    1GB of memory

    would absolutely rock, so why haven't we got one?

    Ummm.... Wait until the high-end card is released in January and then see what the specs are. It's supposed to be a dual-GPU version like the 7950 GX2 was. So think two 8800 GTs in SLI for performance. The memory won't likely be 2400MHz, but with two GPUs each on their own 256-bit bus it's effectively a 512-bit aggregate interface.
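The memory-bandwidth figures tossed around in the comments above follow from one formula: effective clock times bus width, divided by eight for bytes. A quick sketch (the 256-bit bus is the G92's actual width; the 2400MHz figure is the commenter's wishlist, not a shipping part):

```python
def bandwidth_gb_s(effective_clock_mhz: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: transfers per second times bytes per transfer."""
    return effective_clock_mhz * 1e6 * bus_width_bits / 8 / 1e9

# Wishlist 2400MHz memory on the G92's 256-bit bus:
print(bandwidth_gb_s(2400, 256))  # 76.8 GB/s
# The Radeon HD 3870's 2.35GHz memory on its 256-bit bus:
print(bandwidth_gb_s(2350, 256))  # 75.2 GB/s
```

So the wishlist memory clock would put the G92 roughly on par with the 3870's bandwidth, which is the gap the commenter is complaining about.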
