8800 GTS 512 vs. 8800 Ultra

The 8800 Ultra was expensive when it was released in May of this year, and honestly not much has changed. The 8800 GTS 512 outclasses the Ultra in just about every category, the exception being raw memory bandwidth. The question we're looking to answer first is whether there's still a need for the 8800 Ultra, or if this sub-$400 card makes 2007's most expensive single GPU obsolete.

Quake Wars shows the two cards performing virtually the same; the Ultra only starts to pull away at 2560 x 1600:

Turning on AA gives us a clear difference between the two: at 2560 x 1600 the Ultra has a 47% performance advantage over the 8800 GTS 512:

We see the same story with World in Conflict; there's no performance difference between the two cards until we turn on AA:

Lighter titles such as Half-Life 2 and Oblivion (yep, Oblivion is a lighter title now) show the two cards as being equal:

The added pixel-pushing power of the 8800 GTS 512 gives it the advantage in our Oblivion test, but much of that advantage gets erased when we turn on 4X AA.

Looking at newer titles like Crysis, Call of Duty 4, and Unreal Tournament 3, we see 8800 Ultra levels of performance from the $350 8800 GTS 512. Not bad.

Overall, the 8800 GTS 512 is definitely competitive with the 8800 Ultra; there are, however, cases where the raw memory bandwidth of the Ultra's 384-bit memory bus just can't be beat. If you've got an 8800 Ultra, feel threatened, but there's no need to worry about replacing your card. And if you're somehow choosing between the two, the GTS 512 comes close enough overall, and for cheap enough, that you can afford to skip the Ultra...or at least buy two GTS 512s.

Comments

  • chizow - Tuesday, December 11, 2007 - link

    This is probably the first time I've felt an AT review wasn't worth reading, and definitely the first time I've said a review done by Anand wasn't worth reading. The conclusion is correct, but for very different reasons. There is no 10-15% advantage (as many would consider that a significant enough reason to pay $50 more); there is NO advantage to getting a G92 GTS over a G92 GT.
    http://www.firingsquad.com/hardware/nvidia_geforce... (Firing Squad Review)

    When looking over this review, pay special attention to:

    Leadtek GeForce 8800 GT Extreme (680MHz core/1.0GHz memory)


    XFX GeForce 8800 GTS 512MB XXX (678MHz core/986MHz memory)

    Almost no difference at all in performance...
  • Acragas - Tuesday, December 11, 2007 - link

    Did you read all the way to the end of the Firing Squad review? Because at the end, they seem to leave no doubt that the 8800GTS 512 is certainly the superior card. I <3 EVGA's step up program.

    They conclude:

    Given the GeForce 8800 GTS 512MB’s outstanding performance though, this higher price tag is definitely justified. The 8800 GTS 512MB cards blazed through all of our benchmarks, with performance generally falling anywhere between the GeForce 8800 GT and the GeForce 8800 GTX, while a card that’s been overclocked can put up numbers that are higher than the GTX in some cases.

    If you’ve got $400 to spend on a graphics upgrade this Christmas, the GeForce 8800 GTS 512MB is without a doubt the card we recommend. In fact, we wouldn’t be surprised if the GeForce 8800 GTS 512MB ends up stealing sales away from the GeForce 8800 GTX.
  • chizow - Tuesday, December 11, 2007 - link

    Why would I need to read their conclusion when their data allows you to come to your own? I'm sure they were blinded by the stark contrast in their pretty graphs without realizing they showed there was virtually no difference in performance between the parts at the same clock speed.

    Granted, the dual-slot cooler would allow you to run at higher clock speeds, but is a better cooler plus 16 more SPs and 8 more TMU/TAUs, which yield a 0-2% difference in performance, worth a $50-100 difference in price?
  • zazzn - Tuesday, December 11, 2007 - link

    I foolishly also bought an 8800 GTS about 4 months ago, and now the GTs are out stomping them, for cheaper. I feel like a fool, and XFX doesn't offer a step-up program; next time I buy, it's an EVGA for sure...

    I'm so sour about the situation right now, considering I needed a new PSU (450W to 600W), which also cost me $150, and I most likely wouldn't have needed it if I had bought the GT now, since it requires less power.

    How crap is that?

    Can you post the results of an old 8800 GTS vs. a new 8800 GTS?
  • Kelly - Tuesday, December 11, 2007 - link

    Isn't the power consumption of the 3870 vs. the 8800 GT 512MB a bit odd compared to previous findings?

    Here are the numbers I am wondering about (idle/load, in watts):

    8800 GT: 146/269 (difference: 123)
    3870: 122/232 (difference: 110)

    Compare this to the earlier review:

    8800 GT: 165/209 (difference: 44)
    3870: 125/214 (difference: 89)

    Or am I not doing the comparison correctly?

    Thanks for a nice review as always!
  • Spoelie - Tuesday, December 11, 2007 - link

    The original review's results were a bit strange: the gap between the 3850 and 3870 was way too great for a simple clock bump, and GDDR4 should consume less power than GDDR3. So these values seem more right. The gap between idle and load is bigger because they used a quad-core CPU in this article and a dual-core in the previous one.
  • Khato - Tuesday, December 11, 2007 - link

    Well, the load results from this article, compared to the previous one, bring a disturbing fact to light. If the definition of 'load' is a game and we're never CPU-limited, then the performance of the graphics card will scale the CPU's power usage accordingly, giving the impression that faster cards draw far more power than they actually do. On the flip side, if we're CPU-limited (which might have been the case in the previous review), then CPU power is roughly constant and the high-end cards are idling more often, giving the impression that they're more efficient than they really are.

    It'd be interesting to see the % CPU utilization for each card.
  • trajan - Tuesday, December 11, 2007 - link

    I promise I'm not paid to say this, but I feel like the new GTS plus EVGA's step-up program just saved me a load of cash. I (foolishly?) bought a superclocked EVGA 8800 GTS 640MB card almost 3 months ago, right before the 8800 GT came out. Yeah, bad timing. But when I checked online, I still have 18 days left on my step-up.

    So, very ironically, I am upgrading from a $395 card to a $360 card, paying $10 in shipping each way. I don't get a refund, so I will essentially have paid $420 for a $360 part, but what a huge upside -- I got a great card 3 months ago and am now getting a great upgrade almost for free.

    I say "finally" in the subject because switching from the superclocked 8800 GTS 640 to an 8800 GT just didn't seem worth it, especially given how much money I'd be losing... I kept hoping something better would come around, even if it cost more, since I can upgrade to any sub-$400 card just by paying shipping.
  • Viditor - Tuesday, December 11, 2007 - link

    My question is this...

    If an 8800 GT 512 is $300-$350, and 2 x HD 3850s are $358 total, how do they compare in performance? In other words, do the CrossFired 3850s outperform the 8800 GT 512, and if so, by how much?
  • chizow - Tuesday, December 11, 2007 - link

    That's basically what it comes down to with the G92 vs. G80. Another big difference between the G80s and G92s that the review failed to mention is the 24 vs. 16 ROP advantage G80 maintains over G92; a lead which the increased clock speeds can't make up for.

    Anyways, it's pretty clear the G92 does better in shader-intensive games (newer ones) with its massive shader ops/sec advantage over the G80, but falls short when you enable AA or high resolutions, where fillrate becomes more important.

    In the end I think the result is better for gamers, but it doesn't look like there's a definitive winner this time around. It's basically trading performance at various settings/games, but for the end-user the benefit is that you can get great performance at a much better price by giving up the ultra-high-end settings (1920+ w/AA), which are borderline playable at this point anyways.
