8800 GT 512MB vs. 256MB

When AMD released the Radeon HD 3800 series, NVIDIA responded by saying that a cheaper 256MB version of the 8800 GT would be on its way, priced below $200. NVIDIA delivered on part of that promise: we do have a 256MB 8800 GT in hand, but it's not a sub-$200 card. The 8800 GT 256 we have is the Alpha Dog Edition XXX from XFX, priced at $229 before a $10 mail-in rebate. That's not too far off the mark, but it's still not less than $200.

The XFX card we have runs at a 650MHz core clock but only has a 1.6GHz memory data rate. The reference 512MB card runs at 600MHz core/1.8GHz memory.

Quake Wars starts off by showing us a trend we'll see quite often with the 256MB 8800 GT: it performs virtually identically to its 512MB brother up to 1600 x 1200, after which there's a sharp drop-off:

The performance hit isn't as pronounced when you turn on AA; instead, you see a 10 - 20% hit across the board:

BioShock shows the same thing: competitive performance up to 1600 x 1200, but at 1920 x 1200 the 512MB card has a 16% advantage, growing to 60% at 2560 x 1600. It's worth noting that neither card is really playable at 2560 x 1600 in BioShock.

World in Conflict moves the choke point to 1600 x 1200; the two cards behave similarly at 1280 x 1024, but the 512MB 8800 GT holds on to at least a 20% advantage at 1600 x 1200 and grows it to 40% at 2560 x 1600.

Older titles like Half-Life 2 and Oblivion show absolutely no difference between the two cards, indicating that the current wave of games, and most likely all those to follow, requires frame buffers larger than 256MB. While 256MB could cut it in the Half-Life 2 and Oblivion days, the same just isn't true anymore.
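As a rough sanity check on that claim, here's a back-of-the-envelope sketch. It assumes 32-bit color and depth/stencil buffers and counts only render targets; actual driver allocation differs, and textures and geometry come on top of this. It shows how quickly resolution and AA eat into a frame buffer:

```python
# Back-of-the-envelope render-target memory estimate.
# Assumptions (not how any driver actually allocates memory):
# 32-bit color, 32-bit depth/stencil, one resolved back buffer;
# textures and geometry consume additional memory on top of this.
def render_target_mb(width, height, msaa=1, bytes_per_pixel=4):
    color = width * height * bytes_per_pixel * msaa   # multisampled color buffer
    depth = width * height * bytes_per_pixel * msaa   # multisampled depth/stencil
    resolved = width * height * bytes_per_pixel       # resolved back buffer
    return (color + depth + resolved) / (1024 * 1024)

for w, h in [(1280, 1024), (1600, 1200), (1920, 1200), (2560, 1600)]:
    print(f"{w}x{h}: {render_target_mb(w, h):.0f}MB (no AA), "
          f"{render_target_mb(w, h, msaa=4):.0f}MB (4x MSAA)")
```

Under these assumptions, render targets alone approach 140MB at 2560 x 1600 with 4x MSAA, leaving well under half of a 256MB card for everything else, which lines up with the sharp drop-offs above.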

What we have here is an 8800 GTS 512 that's $50 more for not much more performance, and a 256MB 8800 GT that's at least $70 cheaper for a lot less performance. If you plan on keeping this card for any length of time, 512MB is the way to go. The frame buffer demands of modern games are only going to increase, and what we're seeing here today indicates that the transition to 512MB as the minimum for high-end gaming performance is officially underway. The 768MB frame buffer of the 8800 GTX still isn't strictly required, but 512MB looks like the sweet spot.

56 Comments

View All Comments

  • AnnonymousCoward - Wednesday, December 12, 2007 - link

    So the GTS 512 vs the Ultra. The GTS does 26/47 watts less. What's the voltage, 1.5V? So the Ultra draws 17/31 amps more? That's a lotta current.
  • TheRealMrGrey - Wednesday, December 12, 2007 - link

    The authors of this review failed to comment on the fact that the 8800 GT 512MB is still understocked and out of stock just about everywhere! Yeah, it's a really great card, but no one can purchase it! So what's the point? Just to make all those people who already have one feel good? Blah!
  • Mgz - Tuesday, December 11, 2007 - link

    So you compare an overclocked version of the 8800 GT 256MB vs. the default, non-OC HD 3850 and HD 3870? At least to make it fair you could compare to an OC version of the HD 3850/3870, or compare the non-XXX version to the default-clocked 3800.

    =(
  • just4U - Tuesday, December 11, 2007 - link

    I didn't realize they were comparing stock to overclocked. If they were, then it's the only oversight in the review. Well done Anand, finally a review of the 8800GT 256Meg I don't take with half a pound of salt...

    ... Maybe just a dash tho! ;)
  • LRAD - Tuesday, December 11, 2007 - link

    My LCD is 1440 x 900 and it is disappointing to see so much concern for the high resolutions only. For instance, would a 256MB solution be fine in the near future for that res? The article beats us over the head with the fact that 256MB is not enough, but at a lower resolution, might it be?
  • redly1 - Tuesday, December 11, 2007 - link

    Thanks for the bar charts at the end. That somehow summed it up for me. Glad to see the power consumption comparison in there too.
  • Spoelie - Wednesday, December 12, 2007 - link

    to be honest i really really like the line graphs more, don't really see what's more clear with the bar graphs

    guess it's a never ending debate
  • Zak - Tuesday, December 11, 2007 - link

    I want a high end $500-600 monster that's at least twice as fast as my current 8800GTX that can play Crysis on 24" screen with reasonable framerates:( I'm thinking about getting another GTX and go SLI but I hear some games, Crysis in particular, don't gain much from SLI. And, of course, the day I shell out $500 on another 8800GTX Nvidia will release 9800GTX or something:( Frustrating....

    Zak
  • Bal - Tuesday, December 11, 2007 - link

    I think every FPS bar chart should have an FPS/$ overlay. You could incorporate it into all your bar charts and allow users to really compare "bang for buck" vs. performance for the games they are interested in without adding more graphs.
  • Bal - Tuesday, December 11, 2007 - link

    dang no edit...that was supposed to be an original post...
