Generally, you don't see many products released in December. It's getting a little too late to make dramatic impacts on Q4 earnings as many have already done their holiday shopping. If a company is going to release a new product this late in the year, there's generally a good reason for it, or it's simply a product we'll never see.

At the end of October NVIDIA introduced its GeForce 8800 GT, based on a brand new 65nm GPU codenamed G92. The 8800 GT quickly outclassed virtually everything else in NVIDIA's lineup on a price/performance basis, making most of the G80 line obsolete by offering better performance at lower prices. A higher end incarnation of the 8800 GT's G92 was inevitable; we just didn't expect to see it this soon.

The GPU is the same, we're still looking at a G92 derivative part, but the card is all new: the GeForce 8800 GTS 512.


A dual-slot G92: the larger heatsink keeps this card a bit cooler than the 8800 GT with no increase in noise

While NVIDIA is in a better position than AMD these days, NVIDIA's marketing could stand to learn from AMD's recent changes. The Radeon HD 3800 series carries no tacky suffixes, just four-digit model numbers to keep things nice and simple. Not only is the GeForce 8800 GTS 512 name absurdly long, it also further complicates the 8800 product line. If you'll remember back to our 8800 GT review, the 8800 GT is faster than the old G80 based 8800 GTS. The new 8800 GTS 512 is faster than the 8800 GT, and thus faster than both the 320MB and 640MB versions of the old GTS. So you end up with the following lineup today:

8800 Ultra > 8800 GTS 512 > 8800 GTX > 8800 GT > 8800 GTS 640 > 8800 GTS 320

Confusing to say the least, but if you can forget about all of the other products on the market you'll see that there are only two NVIDIA cards to be concerned with: the 8800 GTS 512 and the 8800 GT.


| Form Factor | 8800 Ultra | 8800 GTX | 8800 GTS | 8800 GTS 512 | 8800 GT 256MB | 8800 GT | 8600 GTS |
|---|---|---|---|---|---|---|---|
| Stream Processors | 128 | 128 | 96 | 128 | 112 | 112 | 32 |
| Texture Address / Filtering | 32 / 64 | 32 / 64 | 24 / 48 | 64 / 64 | 56 / 56 | 56 / 56 | 16 / 16 |
| ROPs | 24 | 24 | 20 | 16 | 16 | 16 | 8 |
| Core Clock | 612MHz | 575MHz | 500MHz | 650MHz | 600MHz+ | 600MHz+ | 675MHz |
| Shader Clock | 1.5GHz | 1.35GHz | 1.2GHz | 1.625GHz | 1.5GHz+ | 1.5GHz+ | 1.45GHz |
| Memory Clock | 1.8GHz | 1.8GHz | 1.6GHz | 1.94GHz | 1.4GHz - 1.6GHz | 1.8GHz | 2.0GHz |
| Memory Bus Width | 384-bit | 384-bit | 320-bit | 256-bit | 256-bit | 256-bit | 128-bit |
| Frame Buffer | 768MB | 768MB | 640MB / 320MB | 512MB | 256MB | 512MB | 256MB |
| Transistor Count | 681M | 681M | 681M | 754M | 754M | 754M | 289M |
| Manufacturing Process | TSMC 90nm | TSMC 90nm | TSMC 90nm | TSMC 65nm | TSMC 65nm | TSMC 65nm | TSMC 80nm |
| Price Point | $600 - $800+ | $500 - $600 | $270 - $450 | $349+ | $219 - $229 | $299 - $349 | $140 - $199 |


Architecturally, the 8800 GTS 512 adds another group of 16 shader processors over the 8800 GT, for a total of 128. We suspect the 8800 GT actually uses the same physical die, with one cluster of 16 SPs disabled to improve yields.

Since it's based on G92 we get a 1:1 ratio between texture address and texture filtering units, giving the GTS 512 its first leg up over the much more expensive 8800 Ultra. With a 650MHz core clock and 1.625GHz shader clock, the GTS 512 also holds an 8% shader processing advantage over the Ultra.
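The 8% figure is easy to verify: both GPUs carry 128 SPs, so the relative shader advantage reduces to the ratio of shader clocks. A quick back-of-the-envelope sketch using the numbers from the spec table (a relative comparison only, not a measure of real-world throughput):

```python
# Relative shader throughput: stream processors x shader clock.
# With equal SP counts (128 vs 128), the comparison collapses to the clock ratio.

def shader_throughput(sps, shader_clock_ghz):
    """Relative shader throughput in SP-GHz (for comparisons only)."""
    return sps * shader_clock_ghz

gts_512 = shader_throughput(128, 1.625)  # GeForce 8800 GTS 512
ultra   = shader_throughput(128, 1.5)    # GeForce 8800 Ultra

advantage = (gts_512 / ultra - 1) * 100
print(f"GTS 512 shader advantage over Ultra: {advantage:.1f}%")  # → 8.3%
```

The same arithmetic against the 8800 GT (112 SPs at 1.5GHz) shows why the GTS 512 pulls ahead of its cheaper sibling as well.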

The only area where the 8800 GTS 512 loses to the 8800 Ultra is total memory bandwidth. The 8800 Ultra, like the 8800 GTX, features a 384-bit wide memory bus, while the GTS 512 uses the same 256-bit memory interface as the 8800 GT. There are definite cost advantages to going with a 256-bit memory bus; NVIDIA can build a smaller chip with fewer pins, and make up for the loss in memory bandwidth by shipping the card with faster memory devices. Despite the 1.94GHz memory data rate on the 8800 GTS 512, the 8800 Ultra and GTX have around a 40% memory bandwidth advantage, resulting in better performance in memory-bandwidth-limited scenarios and high resolution AA tests.

Despite packing 754M transistors (more than G80's 681M), the move to 65nm makes G92 much smaller and thus cheaper to manufacture than G80, which is why we're seeing NVIDIA eagerly replacing its 8800 lineup with G92 variants.

Pricing and Availability

After the disappointing availability that followed the 8800 GT launch, we're better prepared to set expectations for the 8800 GTS 512. Keep in mind that the 512MB 8800 GT is supposed to be a $250 part, but in reality it's selling for around $300 in the US. The GTS 512 is expected to sell for $299 - $349, but we're already hearing from manufacturers that prices will be much higher.

The XFX GeForce 8800 GTS 512 at reference clocks will carry an MSRP of $349, and the overclocked XXX edition will sell for $379. The GTS 512 may well sell at $349, but we wouldn't be surprised to see street prices climb even higher given how close it sits to the 8800 GT in price and performance.

The 8800 GT 256MB: Here at Last

56 Comments


  • sliblue - Friday, December 14, 2007 - link

    I'm beginning to wonder --- I built a new machine based on the QX9650, ASUS P5E3 Deluxe, and one 8800 GTX (PCIe x16 on PCIe 2.0), 4GB of RAM and Vista 64. Loaded up Crysis and told it to auto-detect my settings and lo and behold it spit out the recommendation of Very High for everything. I launched the game and couldn't believe how smooth it was with one card on Very High. I am not overclocking anything and can see a huge difference between the QX9650 and the AMD blackbox 6400 X2
  • DLeRium - Friday, December 14, 2007 - link

    The 8800GT review was SOLID, but based on the comparisons you made with the 8800GT, don't you think you should include it here? You did 8800GT vs GTX in the last article, so don't you think you should do 8800GTS vs. GT vs. GTX? But instead you jump to the Ultra. I guess it's great that we can go BACK to the 8800GT article and then kind of interpolate how the GTS will do against the GTX, and that's what I hate about reviews that don't include more info for our benefit.

    I don't see why a lot of these graphs can't be combined.

    I think another issue for me is why the ATI cards now use so little power. In the ATI review, you showed the 3870 gobbling more power than the 8800GT under load, but now it's a clear winner in power consumption. What's the deal here?
  • DLeRium - Friday, December 14, 2007 - link

    I think my other gripe with this review is that this is a NEW revision of the GTS. Don't you think it would be wise to compare both of the old GTSes against this new revision? That's one thing I really wanted to see in the GT review too. How do the 320/640MB GTSes stack up against the GT? What about in this review?
  • afrost - Thursday, December 13, 2007 - link

    Except that to get a decent cooler on the 8800GT you have to spend at least another $40 for an aftermarket cooler.

    I personally prefer the GTS because I can just stick it straight into my box without ripping the stock cooler off, and it's a little bit faster on top of it. I also didn't have Crysis and mine comes with it in the box... so overall a good buy in my particular situation.
  • nubie - Thursday, December 13, 2007 - link

    I am sorry you feel that way; my EVGA 256-P3-N791-AR 8800GT 256 comes with the re-designed heatsink and fan, and all I had to do was pull a slider to get 710/1720/1000 clocks, and it didn't overheat.

    Most of the newer GTs (possibly all the 256MB ones) are coming with a better cooling solution. As for the GTS, yes, great, if you have two slots for cooling; not everyone does. Oh, yeah, and a spare $100-150 for only 16 more stream processors? My GT has the same memory bandwidth (64GB/s) when I pull the slider to 1GHz (2GHz DDR).

    In a perfect world, of course I'd choose the absolute best, but on a budget an 8800GT is just fine.
  • ashegam - Thursday, December 13, 2007 - link

    Did I miss this in the article or in the comments, or has no one mentioned that this new-gen card won't be supporting DirectX 10.1?
    And that doesn't bother anyone interested in purchasing this card?
    Should it not be a concern for a potential buyer?
  • Distant - Wednesday, December 12, 2007 - link

    I'll apologize if they acknowledge their mistake and include 8xAA in their tests from now on. I think cards are powerful enough at this point that it should be happening in the mid/high-range cards anyway.

    Why does this matter if most of those frame rates aren't playable? Well, not quite: as you saw, Oblivion and Prey were playable, and furthermore that site in particular only really tested the very newest games.

    Don't you play any older games? How about any of Valve's games? TF2 maybe? I take it you do, and I would think most people would want to know their 8800GT is going to get obliterated when they try higher than 4xAA.

    And what about the implications for SLI/Crossfire? Surely if you have a CrossFire/SLI setup you're going to want to run 8x and in some cases even 16xAA on games not named Crysis.
  • Distant - Wednesday, December 12, 2007 - link

    In case you guys are wondering what NVIDIA payed AnandTech not to show you, take a look at

    http://www.computerbase.de/artikel/hardware/grafik...">http://www.computerbase.de/artikel/hard...en/2007/...

    You can clearly see it in the 8 tests that they ran with 8xAA:

    Anno1701
    Clive Barker's Jericho
    FEAR
    Oblivion
    Prey
    Company of Heroes DX9 and DX10
    Lost Planet

    In every single one of those, with the exception of FEAR and Company of Heroes in DX9 mode, the 8800GT's framerates literally drop like a rock, in some cases performance getting cut by more than half, while the 3870 barely takes a hit at all; because of this the 3870 overtakes the 8800GT at this level of AA.

    Now call me crazy, but I think most of us don't have a 24+ inch monitor, and if you do you really need two cards anyway. My point is, looking at the frame rates, in most games you're going to want to game at 8xAA, and if you just read this article you wouldn't have known that the 8800GT appears to be garbage at high levels of AA.
  • Zak - Wednesday, December 12, 2007 - link

    What exactly are we supposed to be looking at? Besides the fact that it's in German, I see no graphs, no tables. And actually a lot of gamers play at resolutions higher than 1280x1024, so 8xAA is mostly unrealistic for any game on any card today. I'm happy when I can get playable framerates with 4x or even 2x on an 8800GTX OC in modern games. Accusing AnandTech of being paid by NVIDIA (not payed, by the way) is baseless and I think out of order. I'd apologize if I were you...

    Z.
  • nubie - Wednesday, December 12, 2007 - link

    I bought the EVGA 8800GT 256MB from Newegg on Sunday for $215, of course they are long gone by now.

    This card has 1800MHz RAM; where does that fall in the charts? It means it has identical memory bandwidth to the GT 512's.

    I hope that the prices fall and the EVGA Step-Up program will let me get something nice within the 90 days, after this madness is over.

    Either way my monitor is only 1280x1024, and I am not married to AA, so this seems the best choice, for now.

    I am still waiting for some good drivers; I barely beat my 3DMark01 score (+200) from my old 7900GS at 650MHz. That is pretty sad; I don't think these drivers are properly functional yet.

    Stereoscopic 3D gaming is out the window too; I had a spontaneous reboot already. Whether this is the fault of the GT, all 8-series cards, the dual-core processor, non-existent stereo drivers or some wacky combination, I don't know.

    When you guys get the EVGA 1800MHz card to bench, let us know ASAP; they don't claim it as a "clocked" card in the line-up, or in the RAM specs.

    I got 43/19/115 framerates in World in Conflict at Very High settings, 1280x1024, on an X2 4600/2GB DDR2-800/DFI Infinity II/M2; not bad for the money.

    I am still curious about the dual-core CPU effect. I can only hit 2.6GHz on this proc (65W Windsor), and I don't run it there. I am getting a single-core Windsor soon and hope to have it clocked to 3GHz like my 939 was; that seemed much faster in-game than the X2 4600 ever was (obviously with no background apps).
