8800 GT 512MB vs. 256MB

When AMD released the Radeon HD 3800 series, NVIDIA responded by saying that a cheaper 256MB version of the 8800 GT was on its way, priced below $200. NVIDIA delivered on part of that promise: we do have a 256MB 8800 GT in hand, but it's not a sub-$200 card. The 8800 GT 256 we have is the Alpha Dog Edition XXX from XFX, priced at $229 before a $10 mail-in rebate. That's not too far off the mark, but it's still not less than $200.

The XFX card we have runs at a 650MHz core clock but only has a 1.6GHz memory data rate. The reference 512MB card runs at 600MHz core/1.8GHz memory.
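As a rough sanity check on those memory clocks (assuming the 256-bit memory interface both 8800 GT variants share), peak memory bandwidth is just the effective data rate times the bus width in bytes. The function name here is ours, purely for illustration:

```python
# Peak memory bandwidth = effective data rate (GT/s) x bus width (bytes).
# Both 8800 GT variants use a 256-bit (32-byte) memory interface.

BUS_WIDTH_BYTES = 256 // 8  # 256-bit bus

def bandwidth_gbps(effective_rate_ghz: float) -> float:
    """Peak memory bandwidth in GB/s for a given effective data rate."""
    return effective_rate_ghz * BUS_WIDTH_BYTES

xfx_256mb = bandwidth_gbps(1.6)  # XFX 256MB card at 1.6GHz: 51.2 GB/s
ref_512mb = bandwidth_gbps(1.8)  # reference 512MB card at 1.8GHz: 57.6 GB/s

print(f"256MB XFX:       {xfx_256mb:.1f} GB/s")
print(f"512MB reference: {ref_512mb:.1f} GB/s")
```

So the XFX card gives up roughly 11% of the reference card's memory bandwidth, on top of its smaller frame buffer, which is worth keeping in mind when reading the high-resolution results below.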

Quake Wars starts off showing a trend we'll see quite often with the 256MB 8800 GT: it performs virtually identically to its 512MB brother up to 1600 x 1200, then there's a sharp drop-off:

The performance hit isn't as pronounced when you turn on AA; instead you get a 10 - 20% hit across the board:

Bioshock shows the same thing: competitive performance up to 1600 x 1200, but at 1920 x 1200 the 512MB card has a 16% advantage, growing to 60% at 2560 x 1600. It is worth noting that neither card is really playable at 2560 x 1600 in Bioshock.

World in Conflict moves the choke point down to 1600 x 1200: the two cards behave similarly at 1280 x 1024, but the 512MB 8800 GT holds on to at least a 20% advantage at 1600 x 1200 and grows it to 40% at 2560 x 1600.

Older titles like Half Life 2 and Oblivion show absolutely no difference between the two cards, demonstrating that this current wave of games, and most likely all those to follow, requires frame buffers larger than 256MB. While 256MB could cut it in the Half Life 2 and Oblivion days, the same just isn't true anymore.

What we have here is an 8800 Ultra that's $50 more for not much more gain, and a 256MB 8800 GT that's at least $70 cheaper for a lot less performance. If you plan on keeping this card for any length of time, 512MB looks like the way to go. The frame buffer demands of modern games are only going to increase, and what we're seeing here today indicates that the transition to 512MB as the minimum for high-end gaming performance is officially underway. The 768MB frame buffer of the 8800 GTX still isn't strictly required, but 512MB looks like the sweet spot.

Comments

  • sliblue - Friday, December 14, 2007 - link

    I'm beginning to wonder... I built a new machine based on the QX9650, ASUS P5E3 Deluxe, and one 8800 GTX (PCIe x16 on PCIe 2.0), with 4GB of RAM and Vista 64. Loaded up Crysis and told it to auto-detect my settings, and lo and behold it spit out the recommendation of Very High for everything. I launched the game and couldn't believe how smooth it was with one card on Very High. I am not overclocking anything and can see a huge difference between the QX9650 and the AMD Black Box 6400 X2.
  • Affectionate-Bed-980 - Friday, December 14, 2007 - link

    The 8800GT review was SOLID, but based on the comparisons you made with the 8800GT, don't you think you should include it here? You did 8800GT vs. GTX in the last article, so don't you think you should do 8800GTS vs. GT vs. GTX? But instead you jump to the Ultra. I guess it's great that we can go BACK to the 8800GT article and then kinda interpolate how the GTS will do against the GTX, but that's what I hate about reviews that don't include more info for our benefit.

    I don't see why a lot of these graphs can't be combined.

    Another issue for me: why do the ATI cards now use so little power? In the ATI review, you showed the 3870 gobbling more power than the 8800GT under load, but now it's a clear winner in power consumption. What's the deal here?
  • Affectionate-Bed-980 - Friday, December 14, 2007 - link

    I think my other gripe with this review is that this is a NEW revision of the GTS. Don't you think it would be wise to compare the old GTSes against this new revision? That's one thing I really wanted to see in the GT review too: how do the 320/640MB GTSes stack up against the GT? What about in this review?
  • afrost - Thursday, December 13, 2007 - link

    Except that to get a decent cooler on the 8800GT you have to spend another $40 at least for an aftermarket cooler.

    I personally prefer the GTS because I can just stick it straight into my box without ripping the stock cooler off, and it's a little bit faster on top of that. I also didn't have Crysis, and mine came with it in the box... so overall a good buy in my particular situation.
  • nubie - Thursday, December 13, 2007 - link

    I am sorry you feel that way; my EVGA 256-P3-N791-AR 8800GT 256 comes with the redesigned heatsink and fan, and all I had to do was pull a slider to get 710/1720/1000 clocks, and it didn't overheat.

    Most of the newer GTs (possibly all the 256MB ones) are coming with a better cooling solution. As for the GTS: yes, great, if you have two slots for cooling; not everyone does. Oh, and a spare $100-150 for only 16 more stream processors? My GT has the same memory bandwidth (64GB/s) when I pull the slider to 1GHz (2GHz effective DDR).

    In a perfect world, of course I choose the absolute best, but on a budget an 8800gt is just fine.
  • ashegam - Thursday, December 13, 2007 - link

    Did I miss this in the article or in the comments, or has no one mentioned that this new-gen card won't support DirectX 10.1?
    And that doesn't bother anyone interested in purchasing this card?
    Should it not be a concern for a potential buyer?
  • Distant - Wednesday, December 12, 2007 - link

    I'll apologize if they acknowledge their mistake and include 8xAA in their tests from now on. I think cards are now powerful enough that it should be happening on mid/high-range cards anyway.

    Why does this matter if most of those frame rates aren't playable? Well, not quite: as you saw, Oblivion and Prey were playable, and that site in particular only really tested the very newest games.

    Don't you play any older games? How about any of Valve's games? TF2 maybe? I take it you do, and I would think most people would want to know that their 8800GT is going to get obliterated when they try higher than 4xAA.

    And what about the implications for SLI/Crossfire? Surely if you have a Crossfire/SLI setup you're going to want to run 8x, and in some cases even 16x AA, on games not named Crysis.
  • Distant - Wednesday, December 12, 2007 - link

    In case you guys are wondering what Nvidia payed anandtech not to show you, take a look at:

    http://www.computerbase.de/artikel/hardware/grafik...

    You can clearly see the 8xAA results in the eight games they tested:

    Anno1701
    Clive Barkers Jericho
    FEAR
    Oblivion
    Prey
    Company of heroes DX9 and 10
    Lost Planet

    In every single one of those, with the exception of FEAR and Company of Heroes in DX9 mode, the 8800GT's frame rates literally drop like a rock; in some cases its performance gets cut by more than half, while the 3870 barely takes a hit at all. Because of this, the 3870 overtakes the 8800GT at this level of AA.

    Now call me crazy, but I think most of us don't have a 24+ inch monitor, and if you do you really need two cards anyway. My point is, looking at the frame rates, in most games you're going to want to run 8xAA, and if you just read this article you wouldn't have known that the 8800GT appears to be garbage at high levels of AA.

  • Zak - Wednesday, December 12, 2007 - link

    What exactly are we supposed to be looking at? Besides the fact that it's in German, I see no graphs and no tables. And actually a lot of gamers play at resolutions higher than 1280x1024, so 8xAA is mostly unrealistic for any game on any card today. I'm happy when I can get playable frame rates with 4x or even 2x on an 8800GTX OC in modern games. Accusing Anandtech of being paid by Nvidia (not payed, by the way) is baseless and, I think, out of order. I'd apologize if I were you...

    Z.
  • nubie - Wednesday, December 12, 2007 - link

    I bought the EVGA 8800GT 256MB from Newegg on Sunday for $215, of course they are long gone by now.

    This card has 1800MHz RAM; where does that fall in the charts? It means it has identical memory bandwidth to the 512MB GT's.

    I hope that prices fall and the EVGA Step-Up program will let me get something nice within the 90 days, after this madness is over.

    Either way my monitor is only 1280x1024, and I am not married to AA, so this seems the best choice, for now.

    I am still waiting for some good drivers; I barely beat my 3DMark01 score (+200) from my old 7900GS at 650MHz. That is pretty sad; I don't think these drivers are properly functional yet.

    Stereoscopic 3D gaming is out the window too; I've had a spontaneous reboot already. Whether this is the fault of the GT, all 8-series cards, the dual-core processor, non-existent stereo drivers, or some wacky combination, I don't know.

    When you guys get the EVGA 1800MHz card to bench, let us know ASAP; they don't list it as a "clocked" card in the line-up, or call out the RAM.

    I got 43/19/115 frame rates in World in Conflict at Very High settings, 1280x1024, on an X2 4600+/2GB DDR2-800/DFI Infinity II/M2; not bad for the money.

    I am still curious about the dual-core CPU effect. I can only hit 2.6GHz on this proc (65W Windsor), and I don't run it there. I am getting a single-core Windsor soon and hope to have it clocked to 3GHz like my Socket 939 chip was; that seemed much faster in-game than the X2 4600+ ever was (obviously with no background apps).
