Back when a new Intel chipset launch meant excitement and anticipation, we were always impressed by the widespread availability of motherboards based on the new chipset on the day of announcement. Launches with immediate availability were often taken for granted, and it wasn't until we were hit with a barrage of paper launches that availability became a topic worth discussing at all.

It wasn't too long ago that both ATI and NVIDIA were constantly paper launching new graphics products, but since that unfortunate year both companies have committed to "hard launches" with immediate retail availability. NVIDIA has done a better job of ensuring widespread availability than ATI, and last week's launch of the GeForce 8800 series is a perfect example of that.

Weeks before our G80 review went live we were receiving samples of 8800 GTX and GTS cards from NVIDIA's board partners, all eager to get their new products out around the time of NVIDIA's launch. We rarely see that sort of vendor support surrounding an ATI GPU launch these days, and it's a fact NVIDIA is quite proud of.

The G80 itself is reason enough for NVIDIA to be proud; widespread availability is merely icing on the cake. As we saw in our review of the 681 million transistor GPU, even a single GeForce 8800 GTX is able to outperform a pair of 7900 GTX or X1950 XTX cards running in SLI or CrossFire, respectively. The chip is fast, and on average an 8800 GTX draws only about 8% more power than ATI's Radeon X1950 XTX, so overall performance per watt is quite strong.
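To put a rough number on that, here's a back-of-the-envelope performance-per-watt comparison. Only the ~8% power delta comes from our testing; the performance multiplier below is a made-up placeholder used purely to illustrate the math, not a measured result:

    # Back-of-the-envelope perf/watt relative to a single Radeon X1950 XTX.
    # power_ratio reflects the ~8% higher draw noted above; perf_ratio is a
    # hypothetical placeholder, not a measured figure.
    perf_ratio = 1.8     # assume the 8800 GTX delivers 1.8x a single X1950 XTX
    power_ratio = 1.08   # 8800 GTX draws ~8% more power than a single X1950 XTX

    print(f"perf/watt advantage: {perf_ratio / power_ratio:.2f}x")  # ~1.67x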

The architecture of G80 is built for the future, and as the first DirectX 10 GPU these cards will be used to develop the next generation of games. Unlike brand new architectures of DirectX past, G80 doesn't need games rewritten for it before it can shine. Thanks to its unified shader architecture, the massively parallel powerhouse is able to fully utilize its execution resources regardless of what sort of shader code you run on it.
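For some intuition on why that matters, the sketch below is a purely conceptual toy model; the unit counts and workload figures are invented and are not G80's real configuration or NVIDIA's actual scheduler. A fixed vertex/pixel split stalls on whichever dedicated pool runs out first, while a unified pool keeps every unit busy no matter what mix of shader code a game throws at it:

    # Toy model only: unified vs. fixed-split shader scheduling.
    # Unit counts and workloads are invented, not G80's real configuration.

    def cycles_fixed_split(vertex_work, pixel_work, vertex_units=8, pixel_units=24):
        # Dedicated pools: each can only process its own kind of work,
        # so the frame takes as long as the slower (bottlenecked) pool.
        return max(vertex_work / vertex_units, pixel_work / pixel_units)

    def cycles_unified(vertex_work, pixel_work, total_units=32):
        # Unified pool: any unit can run vertex or pixel shader code,
        # so the combined work is simply spread across every unit.
        return (vertex_work + pixel_work) / total_units

    # A geometry-heavy frame leaves the fixed split badly underutilized:
    vertex, pixel = 800, 400
    print(cycles_fixed_split(vertex, pixel))  # 100.0 "cycles"
    print(cycles_unified(vertex, pixel))      # 37.5 "cycles"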

NVIDIA's timing with the 8800 launch is impeccable, as it is the clear high-end choice for PCs this holiday season. With no competition from ATI until next year, NVIDIA is able to enjoy the crown for the remaining weeks of 2006. If you are fortunate enough to be in the market for an 8800-class card this holiday season, we present to you a roundup of some of the currently available GeForce 8800 graphics cards.

We've got a total of seven G80-based cards in today's roundup, six of which are GeForce 8800 GTX cards along with a single 8800 GTS. All seven G80s are clocked at NVIDIA's stock speeds, which are 1.35GHz/575MHz/900MHz (shader/core/memory) for the GTX and 1.2GHz/500MHz/800MHz for the GTS. Apparently NVIDIA isn't allowing vendor-overclocked 8800 GTX cards (according to one of our OEM contacts), so you can expect all 8800s to perform more or less the same at stock speeds. You will see differences, however, in the cooling solutions implemented by the various manufacturers, which in turn influence the overclocking capability of these cards. It's worth mentioning that even at stock speeds, these 8800s are fast... very fast. Given the power of these cards and the good overclocking we've seen, we expect a return to vendor-sanctioned overclocking with NVIDIA GPUs at some point in the future, but exactly when that will happen is hard to say.
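For reference, here is a quick way to express those stock clocks and gauge how far a given overclock strays from them; the 630MHz core figure in the example is hypothetical and is only there to show the arithmetic:

    # Stock clocks quoted above, in MHz (shader / core / memory).
    STOCK_CLOCKS = {
        "8800 GTX": {"shader": 1350, "core": 575, "memory": 900},
        "8800 GTS": {"shader": 1200, "core": 500, "memory": 800},
    }

    def overclock_gain(card, domain, new_mhz):
        """Return the MHz delta and percentage gain over the stock clock."""
        stock = STOCK_CLOCKS[card][domain]
        delta = new_mhz - stock
        return delta, 100.0 * delta / stock

    # Hypothetical example: a GTX core pushed to 630MHz.
    delta_mhz, pct = overclock_gain("8800 GTX", "core", 630)
    print(f"+{delta_mhz}MHz ({pct:.1f}% over stock)")  # +55MHz (9.6% over stock)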

All of the cards in this roundup are fully HDCP compliant thanks to NVIDIA's NVIO chip in combination with the optional crypto-ROM key found on each of the boards. HDMI outputs are still not very common on PC graphics cards and thus HDCP is supported over DVI on each card. Coupled with an HDCP compliant monitor, any of these 8800s will be able to play full resolution HD-DVD or Blu-ray movies over a digital connection where HDCP is required.

Comments

  • yacoub - Monday, November 13, 2006 - link

    I was surprised the eVGA card took the lead since the MSI had a much higher memory clock. I guess these cards are nowhere near being fillrate limited, so the core clock boost is more important? I'm not sure if that's the right conclusion to make.

    Also lol @ the 1950XTX's heinous noise and heat levels. ;)

    The power consumption reduction from its die-shrink (right?) over the 1900XTX is nice though.

    Very helpful article. Obvious conclusion: Stay away from the Caliber, lol.
  • kalrith - Monday, November 13, 2006 - link

    On page 5, it says that the Calibre "did get a core boost of 31MHz on the core clock". If the stock speed is 575 and the overclock is 631, shouldn't it be a boost of 56MHz?

    Also, on page 5 the Calibre's memory overclock is listed as 1021MHz, and on page 6 it's listed as 914MHz.
  • Josh Venning - Monday, November 13, 2006 - link

    Thanks for pointing out the errors on page 5, they've been fixed.

  • Kyteland - Monday, November 13, 2006 - link

    On page 2: "First we have the (insert exact card name here) from BFG."
