Final Words

I don't think we've ever had so many competitive graphics cards available, spaced roughly $50 apart. Starting at $179 with the Radeon HD 3850 and now going up to just under $400 with the GeForce 8800 GTS 512, if you have a specific budget there are plenty of options for a faster graphics card these days.

Honestly, despite the great value offered by cards like the 8800 GT and the Radeon HD 3800 series, there's still a need for even higher-performance GPUs. As our bar graphs show, there are some games where we're still forced to run at Medium Quality settings. Titles like World in Conflict and Crysis simply can't be run at high resolutions with full detail settings, even on the 8800 GTS 512, at least not at reasonable frame rates. We regularly see this seesaw between software and hardware in the 3D gaming space; sometimes the hardware outpaces the software, and other times the software is far ahead of the hardware.

Here's the thing: remember how the 8800 GT came out and made most of NVIDIA's product line obsolete? Well, there's bound to be a G92-based successor to the 8800 Ultra; although the Ultra is still faster than the new GTS 512, it's fundamentally built on old technology and is overdue for a refresh. If you absolutely must have the highest performance and the 8800 GTS 512 won't satisfy you, don't splurge on an 8800 Ultra; we figure you'll regret it within a matter of months. NVIDIA can't go that long without releasing a new ultra-expensive graphics card.

Getting back to reality for a moment, what do we think of the 8800 GTS 512 as an overall buy? It's around 10 - 15% faster than the 8800 GT, with a price tag at least 16% higher. Honestly, in our opinion, the GTS 512 just isn't worth the price premium over the 8800 GT 512MB. While it offers significantly more shader processing power, it has barely any more memory bandwidth, so this isn't a card that's any better suited to high resolution/AA gaming than the 8800 GT.
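
To make the value math explicit, here's a quick sketch in Python using only the figures quoted above (the midpoint of the performance range and the minimum price premium):

    # Is the GTS 512's premium worth it, in raw perf-per-dollar terms?
    perf_gain = 0.125     # midpoint of the 10-15% performance advantage
    price_premium = 0.16  # the "at least 16%" price premium

    # Performance per dollar, normalized so the 8800 GT = 1.0.
    gts_value = (1 + perf_gain) / (1 + price_premium)

    print(f"GTS 512 perf/dollar vs. 8800 GT: {gts_value:.3f}")
    # -> 0.970: about 3% worse value, and that's before the
    #    premium widens beyond its 16% floor.

Even at the midpoint of the performance gap, the GTS 512 delivers slightly worse performance per dollar, which is the crux of the verdict below.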

It looks like our verdict still stands: if you want one of the best gaming cards on the market today, the 8800 GT 512MB is still our choice. It's more expensive than we'd like, but the 256MB version is a little too slow, and the GTS 512 isn't fast enough to justify its price. The 8800 GT 512MB is just right.

Comments

  • chizow - Tuesday, December 11, 2007

    This is probably the first time I've felt an AT review wasn't worth reading, and definitely the first time I've said a review done by Anand wasn't worth reading. The conclusion is correct, but for very different reasons: there is no 10-15% advantage (as many would consider that reason enough to pay $50 more); there is NO advantage to getting a G92 GTS over a G92 GT.
    Firing Squad Review: http://www.firingsquad.com/hardware/nvidia_geforce...

    When looking over this review, pay special attention to:

    Leadtek GeForce 8800 GT Extreme (680MHz core/1.0GHz memory)

    vs.

    XFX GeForce 8800 GTS 512MB XXX (678MHz core/986MHz memory)

    Almost no difference at all in performance...
  • Acragas - Tuesday, December 11, 2007

    Did you read all the way to the end of the Firing Squad review? They seem to leave no doubt that the 8800 GTS 512 is the superior card. I <3 EVGA's step-up program.

    They conclude:

    Given the GeForce 8800 GTS 512MB’s outstanding performance though, this higher price tag is definitely justified. The 8800 GTS 512MB cards blazed through all of our benchmarks, with performance generally falling anywhere between the GeForce 8800 GT and the GeForce 8800 GTX, while a card that’s been overclocked can put up numbers that are higher than the GTX in some cases.

    If you’ve got $400 to spend on a graphics upgrade this Christmas, the GeForce 8800 GTS 512MB is without a doubt the card we recommend. In fact, we wouldn’t be surprised if the GeForce 8800 GTS 512MB ends up stealing sales away from the GeForce 8800 GTX.
  • chizow - Tuesday, December 11, 2007

    Why would I need to read their conclusion when their data lets me come to my own? I'm sure they were blinded by the stark contrast in their pretty graphs, without realizing those graphs showed virtually no difference in performance between the parts at the same clock speeds.

    Granted, the dual-slot cooler would allow you to run at higher clock speeds, but for a $50-100 difference in price, are a better cooler plus 16 extra SPs and 8 extra TMUs/TAUs that yield a 0-2% difference in performance really worth it?
  • zazzn - Tuesday, December 11, 2007

    I foolishly also bought an 8800 GTS like 4 months ago, and now the GTs are out stomping them, for cheaper. I feel like a fool, and XFX doesn't offer a step-up program; next time I buy, it's an EVGA for sure...

    I'm so sour about the situation right now, considering I needed a new PSU (450W to 600W) which also cost me $150, and I most likely wouldn't have needed it if I'd bought the GT now, since it requires less power.

    How crap is that?

    Can you post results for an old 8800 GTS vs. a new 8800 GTS?
  • Kelly - Tuesday, December 11, 2007

    Isn't the power consumption of the 3870 vs. the 8800 GT 512 a bit odd compared to previous findings?

    Here are the numbers I am wondering about:

    idle/load
    8800 GT: 146/269 (difference: 123)
    3870: 122/232 (difference: 110)

    Compare this to
    http://www.anandtech.com/video/showdoc.aspx?i=3151...

    8800 GT: 165/209 (difference: 44)
    3870: 125/214 (difference: 89)

    Or am I not doing the comparison correctly?

    Thanks for a nice review as always!
  • Spoelie - Tuesday, December 11, 2007

    The original review's results were a bit strange: the gap between the 3850/3870 was way too great for a simple clock bump, and GDDR4 should consume less power than GDDR3. So these values seem more right. The gap between idle and load is bigger here because this article used a quad-core CPU, while the previous one used a dual-core.
  • Khato - Tuesday, December 11, 2007

    Well, the load results from this article, compared to the previous one, bring to light a disturbing fact. If the definition of 'load' is a game and we're never CPU limited, then the performance of the graphics card will scale the CPU's power usage accordingly, giving the impression that faster cards draw far more power than they actually do. On the flip side, if we're CPU limited (which might have been the case in the previous review), then CPU power is roughly constant and the higher end cards are idling more often, giving the impression that they're more efficient than they really are.

    It'd be interesting to see the % CPU utilization for each card.
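
    A minimal sketch of this effect in Python; every number below is made up purely for illustration, not measured in any review:

        # Toy model: wall power = base system draw + CPU power + GPU power.
        # A faster GPU pushes more frames, so the CPU works harder too,
        # inflating the apparent GPU power difference measured at the wall.
        BASE_W = 80.0      # assumed: motherboard, RAM, drives, PSU losses
        CPU_MAX_W = 95.0   # assumed CPU package power at 100% utilization

        def wall_power(gpu_w: float, cpu_util: float) -> float:
            """Total system draw for a given GPU draw and CPU utilization."""
            return BASE_W + CPU_MAX_W * cpu_util + gpu_w

        # Two hypothetical cards; the faster one keeps the CPU busier.
        slow = wall_power(gpu_w=60.0, cpu_util=0.50)  # slower card
        fast = wall_power(gpu_w=75.0, cpu_util=0.95)  # faster card

        print("true GPU delta: 15.0 W")
        print(f"apparent wall delta: {fast - slow:.2f} W")

    In this toy model, a 15W real difference shows up as nearly 58W at the wall, which is exactly why per-card CPU utilization numbers would help interpret these charts.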
  • trajan - Tuesday, December 11, 2007

    I promise I'm not paid to say this, but I feel like the new GTS plus EVGA's step-up program just saved me a load of cash. I (foolishly?) bought a superclocked EVGA 8800 GTS 640MB card almost 3 months ago, right before the 8800 GT came out. Yeah, bad timing. But when I checked online, I still have 18 days left on my step-up.

    So, very ironically, I am upgrading from a $395 card to a $360 card, paying $10 in shipping both ways. I don't get a refund, so I'll essentially have paid $420 for a $360 part, but what a huge upside -- I got a great card 3 months ago and am now getting a great upgrade almost for free.

    I say "finally" in the subject because switching from the superclocked 8800 GTS 640 to an 8800 GT just didn't seem worth it, especially given how much money I'd be losing... I kept hoping something better would come around, even if it cost more, since I can upgrade to any sub-$400 card just by paying shipping.
  • Viditor - Tuesday, December 11, 2007

    My question is this...

    If an 8800 GT 512 is $300-$350, and 2 x HD 3850s total $358, how do they compare in performance? In other words, do the CrossFired 3850s outperform the 8800 GT 512, and if so, by how much?
  • chizow - Tuesday, December 11, 2007

    That's basically what it comes down to with the G92 vs. the G80. Another big difference between the G80s and G92s that the review failed to mention is the 24 vs. 16 ROP advantage the G80 maintains over the G92, a lead the G92's increased clock speeds can't make up for.

    Anyways, it's pretty clear the G92 does better in shader-intensive games (newer titles) thanks to its massive shader ops/sec advantage over the G80, but it falls short when you enable AA or run at high resolutions, where fillrate becomes more important.

    In the end I think the result is better for gamers, but it doesn't look like there's a definitive winner this time around. It's basically a trade of performance across various settings/games, but for the end user the benefit is that you can get great performance at a much better price by giving up the ultra-high-end settings (1920+ w/AA), which are borderline playable at this point anyway.
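
    For anyone who wants to check the ROP math, here's a back-of-the-envelope comparison using approximate reference clocks (treat the exact figures as assumptions; factory-overclocked cards varied):

        # Approximate reference specs: (ROPs, core MHz, SPs, shader MHz)
        cards = {
            "8800 GTX (G80)":     (24, 575, 128, 1350),
            "8800 GTS 512 (G92)": (16, 650, 128, 1625),
        }

        for name, (rops, core, sps, shader) in cards.items():
            fillrate = rops * core / 1000     # theoretical Gpixels/s
            shader_ghz = sps * shader / 1000  # billions of SP clocks/s
            print(f"{name}: {fillrate:.1f} Gpix/s fill, "
                  f"{shader_ghz:.0f}G shader clocks/s")

    Even with a ~13% core clock advantage, the G92's 16 ROPs leave it roughly 25% behind the G80 GTX in raw pixel fillrate, while its faster shader clock gives it about a 20% edge in shader throughput, which matches the AA/high-resolution behavior described above.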
