Back when a new Intel chipset launch meant excitement and anticipation, we were always impressed by the widespread availability of motherboards based on the new chipset on the day of announcement. These launches with immediate availability were often taken for granted, and it wasn't until we encountered a barrage of paper launches that discussing availability was really ever an issue.

It wasn't too long ago that both ATI and NVIDIA were constantly paper launching new graphics products, but since that unfortunate period both companies have strived for "hard launches" with immediate retail availability. NVIDIA has done a better job of ensuring widespread availability than ATI, and last week's launch of the GeForce 8800 series is a perfect example of just that.

Weeks before our G80 review went live, we were receiving samples of 8800 GTX and GTS cards from NVIDIA's board manufacturers, all eager to get their new products out around the time of NVIDIA's launch. We rarely see that sort of vendor support surrounding an ATI GPU launch these days, and it's a fact that NVIDIA is quite proud of.

The G80 itself is reason enough for NVIDIA to be proud; widespread availability is merely icing on the cake. As we saw in our review of the 681 million transistor GPU, even a single GeForce 8800 GTX is able to outperform a pair of 7900 GTX or X1950 XTX cards running in SLI or CrossFire respectively. The chip is fast and on average an 8800 GTX seems to draw only 8% more power than ATI's Radeon X1950 XTX, so overall performance per watt is quite strong.

The architecture of G80 is built for the future, and as the first DirectX 10 GPU, these cards will be used to develop the next generation of games. Unlike brand-new architectures of DirectX generations past, you don't need newly rewritten games to take advantage of G80. Thanks to its unified shader architecture, the massively parallel powerhouse is able to fully utilize its execution resources regardless of what sort of shader code you're running on it.

NVIDIA's timing with the 8800 launch is impeccable, as it is the clear high end choice for PCs this holiday season. With no competition from ATI until next year, NVIDIA is able to enjoy the crown for the remaining weeks of 2006. If you are fortunate enough to be in the market for an 8800-class card this holiday season, we present to you a roundup of some of the currently available GeForce 8800 graphics cards.

We've got a total of seven G80 based cards in today's roundup: six GeForce 8800 GTX cards along with a single 8800 GTS. All seven G80s are clocked at NVIDIA's stock speeds, which are 1.35GHz/575MHz/900MHz (shader/core/memory) for the GTX and 1.2GHz/500MHz/800MHz for the GTS. Apparently NVIDIA isn't allowing vendor-overclocked 8800 GTX cards (according to one of our OEM contacts), so you can expect all 8800s to perform more or less the same at stock speeds. You will see differences, however, in the cooling solutions implemented by the various manufacturers, which will in turn influence the overclocking capability of these cards. It's worth mentioning that even at stock speeds, these 8800s are fast... very fast. Given the power of these cards and the good overclocking we've seen, we expect a return to vendor-sanctioned overclocking with NVIDIA GPUs at some point in the future, but exactly when this will happen is hard to say.

All of the cards in this roundup are fully HDCP compliant thanks to NVIDIA's NVIO chip in combination with the optional crypto-ROM key found on each of the boards. HDMI outputs are still not very common on PC graphics cards and thus HDCP is supported over DVI on each card. Coupled with an HDCP compliant monitor, any of these 8800s will be able to play full resolution HD-DVD or Blu-ray movies over a digital connection where HDCP is required.

Comments

  • JarredWalton - Monday, November 13, 2006 - link

    Derek already addressed the major problem with measuring GPU power draw on its own. However, given similar performance we can say that the cards are the primary difference in the power testing, so you get GTX cards using 1-6W more power at idle, and the Calibre uses up to 15W more. At load, the power differences cover a 10W spread, with the Calibre using up to 24W more.

    If we were to compare idle power with IGP and an 8800 card, we could reasonably compare how much power the card requires at idle. However, doing so at full load is impossible without some customized hardware, and such a measurement isn't really all that meaningful anyway if the card is going to make the rest of the system draw more power. To that end, we feel the system power draw numbers are about the most useful representation of power requirements. If all other components are kept constant on a testbed, the power differences we see should stay consistent as well. How much less power would an E6400 with one of these cards require? Probably somewhere in the range of 10-15W at most.
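    The delta-based approach described above can be sketched as a quick calculation. This is only an illustration of the idea, not AnandTech's actual methodology; the wattage figures and the PSU efficiency value are hypothetical assumptions.

    ```python
    # Sketch: estimating how much extra power a card accounts for from two
    # wall-socket (AC) readings on an otherwise identical testbed.
    # All numbers here are hypothetical, and the fixed PSU efficiency is a
    # simplifying assumption (real efficiency varies with load).

    def card_power_delta(system_with_card_w, baseline_w, psu_efficiency=0.80):
        """Approximate DC watts attributable to the card.

        system_with_card_w: wall power with the card under test installed.
        baseline_w: wall power of the same system with a reference card (or IGP).
        """
        ac_delta = system_with_card_w - baseline_w
        # Convert the AC delta to an approximate DC figure inside the system.
        return ac_delta * psu_efficiency

    # Example: a GTX testbed idles at 220W at the wall vs. a 205W baseline.
    extra = card_power_delta(220, 205)
    print(f"Card accounts for roughly {extra:.0f}W more at idle")
    ```

    The point of keeping every other component constant is visible in the function: only the difference between the two readings matters, so any fixed overhead from the CPU, drives, and motherboard cancels out.
    
    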
  • IKeelU - Monday, November 13, 2006 - link

    Nice roundup. One comment about the first page, last paragraph:

    "HDMI outputs are still not very common on PC graphics cards and thus HDCP is supported on each card."

    Maybe I'm misinterpreting, but it sounds like you are saying that HDCP is present *instead* of HDMI. The two are independent of each other. HDMI is the electrical/physical interface, whereas HDCP is the type of DRM with which the information will be encrypted.
  • Josh Venning - Monday, November 13, 2006 - link

    The sentence has been reworked. We meant to say HDCP is supported through DVI on each card. Thanks.
  • TigerFlash - Monday, November 13, 2006 - link

    Does anyone know if the EVGA with ACS3 is what is in retail right now? EVGA's website seems to be the only place that distinguishes the difference. Everyone else is just selling an "8800 GTX."

    Thanks.
  • Josh Venning - Monday, November 13, 2006 - link

    The ACS3 version of the EVGA 8800 GTX we had for this review is apparently not available yet anywhere, and we couldn't find any info on their website about it. Right now we are only seeing the reference design 8800 GTX for sale from EVGA, but the ACS3 should be out soon. The price for this part may be a bit higher, but our sample has the same clock speeds as the reference part.
  • SithSolo1 - Monday, November 13, 2006 - link

    They have two different heat sinks, so I assume one could tell by looking at the product picture. I know a lot of sites use the product picture from the manufacturer's site, but I think they would use the one of the ACS3 if that's the one they had. I also assume they would charge a little more for it.
  • imaheadcase - Monday, November 13, 2006 - link

    Would love to see how these cards perform in Vista, even RC2 would be great.

    I know the graphics drivers for NVIDIA are terrible, I mean terrible, in Vista atm, but when they at least get a final out for the 8800, an RC2 or Vista final roundup vs. WinXP SP2 would be great :D
  • DerekWilson - Monday, November 13, 2006 - link

    Vista won't be that interesting until we see a DX10 driver from NVIDIA -- which we haven't yet and don't expect for a while. We'll certainly test it and see what happens though.
  • imaheadcase - Monday, November 13, 2006 - link

    Oh, the current beta drivers for the 8800 don't support DX10? Is that what the new DetX drivers I read about NVIDIA working on are for?
  • peternelson - Saturday, November 25, 2006 - link

    I'd be interested to know if the 8800 drivers even support SLI yet? The initial ones I heard of did not.
