
  • sliblue - Friday, December 14, 2007 - link

    I'm beginning to wonder. I built a new machine based on the QX9650, an Asus P5E3 Deluxe, one 8800 GTX (PCIe x16 on a PCIe 2.0 slot), 4GB of RAM, and Vista 64. I loaded up Crysis and told it to auto-detect my settings, and lo and behold it spit out a recommendation of Very High for everything. I launched the game and couldn't believe how smooth it was with one card on Very High. I am not overclocking anything and can see a huge difference between the QX9650 and the AMD X2 6400+ Black Edition.
  • DLeRium - Friday, December 14, 2007 - link

    The 8800GT review was SOLID, but based on the comparisons you made with the 8800GT, don't you think you should include it here? You did 8800GT vs. GTX in the last article, so don't you think you should do 8800GTS vs. GT vs. GTX? Instead you jump to the Ultra. I guess it's great that we can go BACK to the 8800GT article and then kind of interpolate how the GTS will do against the GTX, but that's what I hate about reviews that don't include more info for our benefit.

    I don't see why a lot of these graphs can't be combined.

    I think another issue for me is why the ATI cards now use so little power. In the ATI review, you showed the 3870 gobbling more power than the 8800GT under load, but now it's a clear winner in power consumption. What's the deal here?
  • DLeRium - Friday, December 14, 2007 - link

    I think my other gripe with this review is that this is a NEW revision of the GTS. Wouldn't it be wise to compare the old GTSes against this new revision? That's one thing I really wanted to see in the GT review too: how do the 320/640MB GTSes stack up against the GT? What about in this review?
  • afrost - Thursday, December 13, 2007 - link

    Except that to get a decent cooler on the 8800GT you have to spend another $40 at least for an aftermarket cooler.

    I personally prefer the GTS because I can just stick it straight into my box without ripping the stock cooler off, and it's a little bit faster on top of that. I also didn't own Crysis, and mine comes with it, so overall it's a good buy in my particular situation.
  • nubie - Thursday, December 13, 2007 - link

    I am sorry you feel that way. My EVGA 256-P3-N791-AR 8800GT 256MB comes with the redesigned heatsink and fan, and all I had to do was pull a slider to get 710/1720/1000 clocks, and it didn't overheat.

    Most of the newer GTs (possibly all the 256MB ones) are coming with a better cooling solution. As for the GTS: yes, great, if you have two slots for cooling, but not everyone does. Oh, and a spare $100-150 for only 16 more stream processors? My GT has the same memory bandwidth (64GB/s) when I pull the slider to 1GHz (2GHz effective DDR).

    In a perfect world, of course I choose the absolute best, but on a budget an 8800gt is just fine.
  • ashegam - Thursday, December 13, 2007 - link

    Did I miss this in the article or in the comments, or has no one mentioned that this new-gen card won't be supporting DirectX 10.1?
    Doesn't that bother anyone interested in purchasing this card?
    Shouldn't it be a concern for a potential buyer?
  • Distant - Wednesday, December 12, 2007 - link

    I'll apologize if they acknowledge their mistake and include 8xAA in their tests from now on. I think cards are now powerful enough that it should be happening in the mid/high-range cards anyway.

    Why does this matter if most of those frame rates aren't playable? Well, not quite: as you saw, Oblivion and Prey were playable, and furthermore that site in particular only tested the very newest games.

    Don't you play any older games? How about any of Valve's games? TF2 maybe? I take it you do, and I would think most people would want to know that their 8800GT is going to get obliterated when they try higher than 4xAA.

    And what about the implications for SLI/Crossfire? Surely if you have a Crossfire/SLI setup you're going to want to run 8x, and in some cases even 16xAA, on games not named Crysis.
  • Distant - Wednesday, December 12, 2007 - link

    In case you guys are wondering what Nvidia paid AnandTech not to show you, take a look at the linked review.

    You can clearly see it in the eight tests they ran with 8xAA on:

    Clive Barker's Jericho
    Company of Heroes (DX9 and DX10)
    Lost Planet

    In every single one of those, with the exception of FEAR and Company of Heroes in DX9 mode, the 8800GT's framerates drop like a rock, in some cases its performance getting cut by more than half, while the 3870 barely takes a hit at all, and because of this the 3870 overtakes the 8800GT at this level of AA.

    Now call me crazy, but I think most of us don't have a 24+ inch monitor, and if you do, you really need two cards anyway. My point is, looking at the frame rates, in most games you're going to want to game at 8xAA, and if you just read this article you wouldn't have known that the 8800GT appears to be garbage at high levels of AA.

  • Zak - Wednesday, December 12, 2007 - link

    What exactly are we supposed to be looking at? Besides the fact that it's in German, I see no graphs and no tables. And actually a lot of gamers play at resolutions higher than 1280x1024, so 8xAA is mostly unrealistic for any game on any card today. I'm happy when I can get playable framerates with 4x or even 2x on an 8800GTX OC in modern games. Accusing AnandTech of being paid by Nvidia (not "payed", by the way) is baseless and, I think, out of order. I'd apologize if I were you...

  • nubie - Wednesday, December 12, 2007 - link

    I bought the EVGA 8800GT 256MB from Newegg on Sunday for $215, of course they are long gone by now.

    This card has 1800MHz RAM; where does that fall in the charts? It means it has identical memory bandwidth to the GT 512s.
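A quick sanity check of that bandwidth claim. This is a sketch, not from the review: the 256-bit bus width is the 8800GT's published spec, and the helper function name is ours.

```python
def mem_bandwidth_gb_s(effective_mhz, bus_width_bits):
    """Peak bandwidth = effective transfer rate x bus width in bytes."""
    return effective_mhz * 1e6 * (bus_width_bits / 8) / 1e9

# Reference 8800GT 512MB: 900MHz GDDR3 (1800MHz effective) on a 256-bit bus,
# the same effective rate this EVGA 256MB card ships with:
print(mem_bandwidth_gb_s(1800, 256))  # 57.6 (GB/s) either way
```

So both cards land at 57.6 GB/s, which is why the commenter calls them identical.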

    I hope that the prices fall and the EVGA Step-Up program will let me get something nice within the 90-days, after this madness is over.

    Either way my monitor is only 1280x1024, and I am not married to AA, so this seems the best choice, for now.

    I am still waiting for some good drivers. I barely beat my 3DMark01 score (+200) against my old 7900GS at 650MHz; that is pretty sad. I don't think these drivers are properly functional yet.

    Stereoscopic 3D gaming is out the window too, I had a spontaneous reboot already, whether this is the fault of the GT, all 8 series cards, dual-core processor, non-existent stereo drivers or some wacky combination, I don't know.

    When you guys get the EVGA 1800mhz card to bench, let us know asap, they don't claim it as a "clocked" card on the line-up, or in the RAM.

    I got 43/19/115 framerate in World In Conflict Very High settings, 1280x1024 on an x2 4600/2GB DDR2 800/ DFI Infinity II/M2, not bad for the money.

    I am still curious about the dual-core CPU effect, I can only hit 2.6 on this proc (65w Windsor), and I don't run it there. I am getting a single-core Windsor soon and hope to have it clocked to 3ghz like my 939 was, that seemed much faster in game than the x2 4600 ever was (obviously with no background apps.)
  • AnnonymousCoward - Wednesday, December 12, 2007 - link

    So, the GTS 512 vs. the Ultra: the GTS draws 26/47 watts less (idle/load). What's the voltage, 1.5V? So the Ultra draws 17/31 amps more? That's a lotta current.
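The commenter's arithmetic checks out under their stated assumption; note the 1.5V core voltage is their guess, not a measured figure.

```python
# Power delta divided by rail voltage gives the current delta (I = P / V).
# The 1.5V default is the commenter's assumed GPU core voltage.
def extra_amps(watts_delta, volts=1.5):
    return watts_delta / volts

print(round(extra_amps(26), 1))  # idle delta: 17.3 (A)
print(round(extra_amps(47), 1))  # load delta: 31.3 (A)
```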
  • TheRealMrGrey - Wednesday, December 12, 2007 - link

    The authors of this review failed to comment on the fact that the 8800 GT 512MB is still understocked and out of stock just about everywhere! Yeah, it's a really great card, but no one can purchase it! So what's the point? Just to make all those people who already have one feel good? Blah!
  • Mgz - Tuesday, December 11, 2007 - link

    So you compare an overclocked version of the 8800 GT 256MB vs. the default, non-OC HD 3850 and HD 3870? At least to make it fair you could compare it to an OC version of the HD 3850/3870, or compare the non-XXX version to the default-clocked 3800s.

  • just4U - Tuesday, December 11, 2007 - link

    I didn't realize they were comparing stock to overclocked. If they were, then it's the only oversight in the review. Well done, Anand. Finally a review of the 8800GT 256MB I don't take with half a pound of salt...

    ... Maybe just a dash tho! ;)
  • LRAD - Tuesday, December 11, 2007 - link

    My LCD is 1440 x 900, and it is disappointing to see so much concern for the high resolutions only. For instance, would a 256MB solution be fine in the near future for that res? The article beats us over the head with the fact that 256MB is not enough, but at a lower resolution, might it be?
  • redly1 - Tuesday, December 11, 2007 - link

    Thanks for the bar charts at the end. That somehow summed it up for me. Glad to see the power consumption comparison in there too.
  • Spoelie - Wednesday, December 12, 2007 - link

    To be honest, I really like the line graphs more; I don't really see what's clearer about the bar graphs.

    I guess it's a never-ending debate.
  • Zak - Tuesday, December 11, 2007 - link

    I want a high-end $500-600 monster that's at least twice as fast as my current 8800GTX and can play Crysis on a 24" screen with reasonable framerates :( I'm thinking about getting another GTX and going SLI, but I hear some games, Crysis in particular, don't gain much from SLI. And, of course, the day I shell out $500 on another 8800GTX, Nvidia will release a 9800GTX or something :( Frustrating...

  • Bal - Tuesday, December 11, 2007 - link

    I think every FPS bar chart should have an FPS/$ overlay. You could incorporate it on all your bar charts, and it would let users really compare "bang for buck" vs. performance for the games they are interested in without adding more graphs.
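The suggested overlay is just frames per second divided by street price. A sketch with made-up figures (none of these prices or frame rates are from the review):

```python
# Hypothetical cards with illustrative price/FPS numbers.
cards = {
    "8800 GT 512MB":  {"price": 299, "fps": 60},
    "8800 GTS 512MB": {"price": 359, "fps": 66},
}
for name, d in cards.items():
    print(f"{name}: {d['fps'] / d['price']:.3f} FPS/$")
```

With these numbers the GT would come out ahead on value despite the lower raw FPS, which is exactly the kind of comparison the overlay would surface.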
  • Bal - Tuesday, December 11, 2007 - link

    Dang, no edit... that was supposed to be an original post...
  • Lennie - Tuesday, December 11, 2007 - link

    Gotcha!! AT

    J/K :b

    The test system table lists DDR2 as the memory, but the mobo is a P5E3 (a DDR3 board).

    Over and out.
  • Lennie - Tuesday, December 11, 2007 - link

    Man, that was quick. Thank ya.
  • Cygnis - Tuesday, December 11, 2007 - link

    I've been reading these "benchmarks" for a while now, and the hardware is always an Intel CPU with Nvidia chipsets, etc.
    It's a little biased, in my opinion, to run an ATI card on those chipsets.

    It would only be fair, and more realistic, to run the Nvidia cards and ATI cards in two different boxes, cross-manufacturer, to get a true idea.

    After all, you are trying to be fair in the representation of the data, no?

  • strikeback03 - Wednesday, December 12, 2007 - link

    The chipset is an Intel X38. As this can run Crossfire, I'd imagine it is reasonably friendly to AMD graphics cards.
  • pilotofdoom - Tuesday, December 11, 2007 - link

    What happens when the 3850 512MB is compared to the 8800GT 256MB? Right now the 3850 512MB retails around $200, so about $20 more expensive than the 256MB version, but still $15 cheaper than the 8800GT 256MB card, assuming you can find the cards in stock.
  • Viditor - Wednesday, December 12, 2007 - link

    "Right now the 3850 512MB retails around $200"

    Actually, the 3850 is retailing for $169 at NewEgg...
  • kilkennycat - Tuesday, December 11, 2007 - link

    The default fan speed on the 8800GT (512) is 29%, and the speed profile is a joke: the fan speed does not move AT ALL until the GPU reaches ~94 degrees C!! That is not at all funny for long-term reliability. Is TSMC's commercial silicon process rated for military-grade temperatures (>70 degrees C)? I don't think so. And the only control the user has over fan speed without risky video-BIOS sabotage is to use the fixed-fan settings courtesy of nTune. However, these settings are not saved across a system reboot.

    So since the physical design of the ventilation on the 8800GTS 512 has changed from that of the 8800GT, have nVidia taken any steps to change from the ridiculous fan-speed profile of the 8800GT (512) ?? Or given the user any ability to manually control the speed profile and SAVE THE SETTINGS?
  • AnnonymousCoward - Wednesday, December 12, 2007 - link

    No kidding! The last time I tried nTune it would also reset after every reboot. To OC, I set EXPERTOOL to run at startup and then close it manually to free the memory, and the OC sticks. I don't know if RivaTuner lets you do that.
  • jay401 - Tuesday, December 11, 2007 - link

    Most owners are using RivaTuner to let the fan speed be dynamically adjusted by temperature, or to simply set a higher fixed fan speed as the default.
  • kilkennycat - Tuesday, December 11, 2007 - link

    Does the fan-adjust feature of RivaTuner work properly on WinXP with the latest nV drivers, 169.09 beta and above (required for Crysis, etc.)? If so, please specify the version of RivaTuner and point me in the right direction to manipulate the fan settings.
  • chizow - Tuesday, December 11, 2007 - link

    This is probably the first time I've felt an AT review wasn't worth reading, and definitely the first time I've said a review done by Anand wasn't worth reading. The conclusion is correct, but for very different reasons. There is no 10-15% advantage (many would consider that significant enough a reason to pay $50 more); there is NO advantage to getting a G92 GTS over a G92 GT. See the Firing Squad review.

    When looking over this review, pay special attention to:

    Leadtek GeForce 8800 GT Extreme (680MHz core/1.0GHz memory)


    XFX GeForce 8800 GTS 512MB XXX (678MHz core/986MHz memory)

    Almost no difference at all in performance.......
  • Acragas - Tuesday, December 11, 2007 - link

    Did you read all the way to the end of the Firing Squad review? Because at the end, they seem to leave no doubt that the 8800GTS 512 is certainly the superior card. I <3 EVGA's step up program.

    They conclude:

    Given the GeForce 8800 GTS 512MB’s outstanding performance though, this higher price tag is definitely justified. The 8800 GTS 512MB cards blazed through all of our benchmarks, with performance generally falling anywhere between the GeForce 8800 GT and the GeForce 8800 GTX, while a card that’s been overclocked can put up numbers that are higher than the GTX in some cases.

    If you’ve got $400 to spend on a graphics upgrade this Christmas, the GeForce 8800 GTS 512MB is without a doubt the card we recommend. In fact, we wouldn’t be surprised if the GeForce 8800 GTS 512MB ends up stealing sales away from the GeForce 8800 GTX.
  • chizow - Tuesday, December 11, 2007 - link

    Why would I need to read their conclusion when their data lets you come to your own? I'm sure they were blinded by the stark contrast in their pretty graphs, without realizing they showed virtually no difference in performance between the parts at the same clock speed.

    Granted, the dual-slot cooler would let you run at higher clock speeds, but for a $50-100 difference in price, are a better cooler and 16 more SPs and 8 more TMUs/TAUs that yield a 0-2% difference in performance worth it?
  • zazzn - Tuesday, December 11, 2007 - link

    I foolishly also bought an 8800GTS like 4 months ago, and now the GTs are out stomping them, and for cheaper. I feel like a fool, and XFX doesn't offer a step-up program; next time I buy, it's an EVGA for sure...

    I'm so sour right now about the situation, considering I needed a new PSU (from 450W to 600W), which also cost me $150, and I most likely wouldn't have needed it if I'd bought the GT now, since it requires less power.

    How crap is that?

    Can you post the results of an old 8800GTS vs. a new 8800GTS?
  • Kelly - Tuesday, December 11, 2007 - link

    Isn't the power consumption of 3870 vs 8800GT512 a bit odd compared to previous findings?

    Here are the numbers I am wondering about

    8800GT: 146/269 (difference:123)
    3870: 122/232 (difference:110)

    Compare this to the numbers from the earlier review:

    8800GT: 165/209 (difference:44)
    3870: 125/214 (difference:89)

    Or am I not doing the comparison correctly?

    Thanks for a nice review as always!
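The idle/load deltas quoted above are easy to re-derive from the comment's own figures (all numbers are the commenter's, in watts):

```python
# (idle, load) system power readings quoted in the comment, in watts.
readings = {
    "8800GT (this review)": (146, 269),
    "3870 (this review)":   (122, 232),
    "8800GT (earlier)":     (165, 209),
    "3870 (earlier)":       (125, 214),
}
for card, (idle, load) in readings.items():
    print(f"{card}: idle-to-load delta = {load - idle} W")
```

The comparison itself is done correctly; the oddity is that the 8800GT's delta jumps from 44W to 123W between the two reviews.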
  • Spoelie - Tuesday, December 11, 2007 - link

    The original review's results were a bit strange: the gap between the 3850/3870 was way too great for a simple clock bump between them, and GDDR4 should also consume less power than GDDR3. So these values seem more right. The gap between idle and load is bigger because they used a quad-core CPU in this article and a dual-core in the previous one.
  • Khato - Tuesday, December 11, 2007 - link

    Well, the load results from this article, in comparison to the previous one, bring a disturbing fact to light. If the definition of 'load' is a game and we're never CPU-limited, then the performance of the graphics card will scale the CPU's power usage accordingly, giving the impression that faster cards draw far more power than they actually do. On the flip side, if we're CPU-limited (which might have been the case in the previous review), then CPU power is about constant and the higher-end cards are idling more often, giving the impression that they're more efficient than they really are.

    It'd be interesting to see the % CPU utilization for each card.
  • trajan - Tuesday, December 11, 2007 - link

    I promise I'm not paid to say this, but I feel like the new GTS plus EVGA's step up program just saved me a load of cash. I (foolishly?) bought a superclocked EVGA 8800GTS 640mb card almost 3 months ago, right before the 8800GT came out. Yeah, bad timing. But when I checked online I still have 18 days left on my step-up.

    So, very ironically, I am upgrading from a $395 card to a $360 card, paying $10 in shipping each way. I don't get a refund, so I will essentially have paid $420 for a $360 part, but what a huge upside: I got a great card 3 months ago and am now getting a great upgrade almost for free.

    I say "finally" in the subject because switching from the superclocked 8800GTS 640 to a 8800 GT just didn't seem worth it, especially given how much money I'd be losing .. I kept hoping something better would come around even if it cost more, since I can upgrade to any sub-$400 card just by paying shipping..
  • Viditor - Tuesday, December 11, 2007 - link

    My question is this...

    If an 8800 GT 512 is $300-$350, and 2 x HD3850s are a total of $358, how do they compare in performance (in other words, do the Xfired 3850s outperform the 8800GT 512, and if so by how much)?
  • chizow - Tuesday, December 11, 2007 - link

    That's basically what it comes down to with the G92 vs. G80. Another big difference between the G80s and G92s that the review failed to mention is the 24 vs. 16 ROP advantage G80 maintains over G92; a lead which the increased clock speeds can't make up for.

    Anyways, pretty clear the G92 does better in shader intensive games (newer ones) with its massive shader ops/sec advantage over the G80, but falls short when you enable AA or high resolutions where fillrate becomes more important.

    In the end I think the result is better for gamers, but it doesn't look like there's a definitive winner this time around. It's basically trading performance at various settings/games, but for the end user the benefit is that you can get great performance at a much better price by giving up the ultra-high-end settings (1920+ w/AA), which at this point are borderline playable anyway.
  • wordsworm - Tuesday, December 11, 2007 - link

    I recall someone mentioning that a 32-bit OS can only address 4GB of memory, and that this has to cover both the video card's memory and the motherboard's memory. Seeing as AT is running 4GB of memory on the MB and 256, 512, and 768MB on the VC, I can't help but think this would somehow skew the results. Am I missing something?
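The worry can be illustrated with back-of-the-envelope numbers: a 32-bit OS maps the card's VRAM (and other MMIO) into the same 4GiB physical address space as system RAM, so a bigger card leaves less of the installed 4GB visible. This sketch is illustrative only; it ignores the other MMIO reservations and PAE that a real system would have.

```python
ADDRESS_SPACE = 4 * 1024**3       # 32-bit limit: 4 GiB
installed_ram = 4 * 1024**3       # the test rig's 4 GB

for vram_mb in (256, 512, 768):
    aperture = vram_mb * 1024**2  # VRAM mapped into the same address space
    usable = min(installed_ram, ADDRESS_SPACE - aperture)
    print(f"{vram_mb} MB card -> ~{usable / 1024**3:.2f} GiB of RAM visible to the OS")
```

So the 768MB card would indeed leave the least system RAM addressable, though whether that shifts benchmark results is a separate question.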
  • Le Québécois - Tuesday, December 11, 2007 - link

    How about some tests to see how well those cards do with multi-GPU scaling? The 8800GT 512 did pretty well but was somewhat limited by memory at higher resolutions. Since the 8800GTS 512 has the same amount of memory, could we expect the same kind of scaling? What about the 8800GT 256?
  • EateryOfPiza - Tuesday, December 11, 2007 - link

    Seconded!
  • Le Québécois - Tuesday, December 11, 2007 - link

    On page 3 many of the graphs show 8800 GTX Ultra in the legend.

    At the bottom of page 4 "The 8800 GTS Ultra looks to be an average of 10% faster than the 8800 GT, is it worth the $50+ premium it'll command? Not really, the 512MB 8800 GT is still the sweet spot. Moving on..."

    Should be The 8800 GTS 512.
  • sabrewulf - Tuesday, December 11, 2007 - link

    I'm sure you have your reasons, and I know the results would be largely similar, but I would really have preferred to see GTS 512 vs. GTX, as I'm sure there are far more GTX owners than Ultra owners (relatively speaking).
  • Super Nade - Tuesday, December 11, 2007 - link

    Guys, how about posting a few scores with the cards overclocked? After all, that is why we buy these cards, right: to extract every ounce of performance? :)

    Best wishes,

    Super Nade,
  • shabby - Tuesday, December 11, 2007 - link

    From the original $200-249, the 8800GT shot up to almost $300, while the GTS will be reaching for the $400 mark with the overclocked models; even the 256MB GT is priced over $200.
    Why are all these cards so far off from MSRP? It was never like this before with the GTX/Ultra cards.
  • homerdog - Tuesday, December 11, 2007 - link

    The cheapest model on Newegg right now is a stock eVGA one for $359.99, which is just outside the upper end of the MSRP. That is still a good price when viewed from the "it's almost as fast as an Ultra and faster than a GTX" perspective.
  • jay401 - Tuesday, December 11, 2007 - link

    Actually, AnandTech is being revisionist with its pricing history.
    The original 8800GT 512MB article last month stated the MSRP was $200-250; to paraphrase its logical inference, that meant "closer to $200 for reference-clocked 512MB models and closer to $250 for high-end models".

    The 256MB model wasn't even in the shipping channels at that time and had no bearing on the pricing mentioned in the article, which was very specifically regarding the 512MB model, as that was what the review was about - the 8800GT 512MB.

    And in the first two weeks we saw them priced as low as $209 for reference models and $229 for overclocked models, further supporting the reality of that MSRP price range.

    The only reason they're as expensive as you see them today is limited supply & high demand.

    But now Anandtech wants to satisfy NVidia and help them justify maintaining the current pricing even after supply exceeds demand, which would be absurd.

    Just thought you should know the truth.

    And just so you don't think I'm some biased ATI/AMD owner, I picked up my 8800GT for $250 and am very happy with it. But I feel sorry for folks paying $300 (or more!) for a $200-250 card.
  • tshen83 - Tuesday, December 11, 2007 - link

    Actually, you don't understand the fundamentals of supply and demand. The 8800GT 512MB is selling at $30 over MSRP because it is just that good and worth that much; Nvidia priced it too low. The Radeon HD3850 is priced at $179 and not selling over MSRP because there is less demand, due to poorer performance, especially with AA+AF.
  • Griswold - Thursday, December 13, 2007 - link

    Also (partly) wrong. It's a good price/performance part and it's short in supply; that is why it's priced higher. And I'm willing to bet the supply shortage is artificial. Look at the availability of the GTS 512 - it seems to be much better than that of the GT. It's no surprise: Nvidia's margins on the GT must be abysmal compared to those of higher-priced units (that's a given, but they also rendered almost their complete lineup obsolete for several weeks prior to the launch of the GTS 512), yet they needed that horse to compete at the 3850/3870 price point.

    And you really need to stop talking out of your ass about the 3850. It's selling well, and it's selling at MSRP because supply is decent (and you lecture him about fundamentals...). I think The Register claimed 150k units in 3 weeks. Well, that's three times the number of 8800GT units available in the same timeframe. Speaks for itself.
  • neogodless - Tuesday, December 11, 2007 - link

    Whew... just bought an 8800GT and would like to feel like it was a good buy for a *little while*! Hope there's enough supply to help drive prices down in general though...
  • R3MF - Tuesday, December 11, 2007 - link

    Where are the G92 GTS cards with memory over 2.0GHz?
    Does this presage the entrance of a G92 GTX with memory at 2.4GHz and a higher core clock?

    It isn't rocket science to put some decent-speed memory on a midrange card (witness the 3870 with 2.35GHz memory), so why haven't any of the so-called "OC" versions of the G92 GTS got overclocked memory?

    At the same time, we all want a card that can play Crysis at 1920x1200 at High details and still get around 30FPS. The GTS can get ~30FPS at Medium details... whoop-de-do!

    So we know it's possible to economically provide more bandwidth, and we know it's necessary, but nobody has done so, including the OC'ed versions.

    Is this because there is a G92 GTX product around the corner?

    Yes, I know there is rumoured to be a G92 GX2 dual-card sometime in January, but how about a non-cack single-card version?

    A card with:
    720MHz core clock
    2000MHz shaders
    2400MHz memory
    1GB of memory

    would absolutely rock, so why haven't we got one?
  • kilkennycat - Tuesday, December 11, 2007 - link

    Memory tweaking of the current series gives a tiny marginal benefit for a huge increase in power dissipation. The G92 represents the last gasp of the current G8x/G9x architecture; the shrink was absolutely essential to nVidia's GPU business to get away from the huge, power-hungry, low-yield G80 GPU.

    The true high-end replacement family for the 8xxx series is coming around Q2 of 2008. It has been in design for at least the past year and is NOT just a tweak of the G8x/G9x architecture. If you really HAVE to upgrade your system right now, just get a SINGLE 8800GT 512. At this point in time, do not invest in SLI. Keep your hands in your pockets and wait for the next gen. A single copy of the high-end version of the next-gen GPU family from nVidia is likely to have more GPU horsepower than dual 8800GTXs.
  • Griswold - Thursday, December 13, 2007 - link

    "The true high-end replacement family for the 8xxx-series is coming around Q2 of 2008. It has been in design for at least the past year and is NOT just a tweak of the G8x/G9x architecture."

    It's going to be an evolved (note: that's a fair bit more than just tweaked) G80/G92. You don't design a completely new architecture in a year. Remember what nVidia claimed at the launch of the G80? It had been in the works for several years. They will squeeze every bit of revenue out of this architecture before they launch their true next-generation architecture (on which at least one team must have been working since the launch of the G80).
  • retrospooty - Tuesday, December 11, 2007 - link

    A card with:
    720MHz core clock
    2000MHz shaders
    2400MHz memory
    1GB of memory

    would absolutely rock, so why haven't we got one?

    Ummm... wait until the high-end card is released in January and then see what the specs are. It's supposed to be a dual-GPU version, like the 7950 GX2 was, so think two-8800GT-SLI performance. The memory won't likely be 2400MHz, but it will be dual-channel for effectively 512-bit bandwidth.
