The 8800 GTX and GTS

Today we expect to see availability of two cards based on NVIDIA's G80 GPU: the GeForce 8800 GTX and 8800 GTS. Priced at $599 and $449 respectively, the two cards, as usual, differ in clock speeds and processing power.


8800 GTX (top) vs. 7900 GTX (bottom)

The 8800 GTX gets the full G80 implementation of 128 stream processors and 64 texture fetch units. The stream processors are clocked at 1.35GHz with the rest of the GPU running at 575MHz. The GTX has six 64-bit memory controllers operating in tandem, connected to 768MB of GDDR3 memory running at 900MHz. GDDR4 is supported but will be introduced on a later card.
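
As a quick sanity check on these specs, peak memory bandwidth follows directly from the aggregate bus width and the effective (double data rate) memory clock; a minimal sketch:

```python
# Peak memory bandwidth implied by the 8800 GTX specs above.
controllers = 6                      # 64-bit memory controllers
bus_width_bits = controllers * 64    # 384-bit aggregate bus
effective_mts = 900 * 2              # GDDR3 at 900MHz is double data rate

# bytes per transfer * millions of transfers per second -> GB/s
bandwidth_gbs = (bus_width_bits / 8) * effective_mts / 1000
print(bandwidth_gbs)  # 86.4
```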


NVIO: Driving a pair of TMDS transmitters near you

You get two dual-link DVI ports driven by NVIDIA's new NVIO chip, which handles TMDS output and other as-yet-undisclosed functions. Keeping TMDS transmitters on-die is very difficult when other logic on the GPU runs at such high clock speeds, so with G80 NVIDIA moved the transmitters off-die and onto this separate chip. The NVIO chip also supports HDCP, but the crypto ROM keys are required for full HDCP support on the card. That final decision is up to the individual card manufacturers, although at this price point we hope they all choose to include HDCP support.

The 8800 GTX has two PCIe power connectors and two SLI connectors:


Two SLI connectors on the 8800 GTX


Bridges in action

The dual power connectors are necessary to avoid drawing more power from a single connector than the current ATX specification allows. The dual SLI connectors are for future applications, such as daisy-chaining three G80-based GPUs, much like ATI's latest CrossFire offerings.


Dual power connectors

The GeForce 8800 GTS loses 32 SPs, bringing it down to 96 stream processors and 48 texture fetch units. The shader core runs at 1.2GHz, while the rest of the GTS runs at 500MHz. The GTS also has only five 64-bit memory controllers, with 640MB of GDDR3 memory running at 800MHz.
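
Applying the same bus-width arithmetic to the GTS shows how the narrower memory subsystem scales; a small sketch (the helper function is ours, purely for illustration):

```python
def mem_bandwidth_gbs(controllers: int, mem_clock_mhz: int) -> float:
    """Peak bandwidth from 64-bit controllers and a DDR memory clock (MHz)."""
    bus_bytes = controllers * 64 / 8             # bus width in bytes
    return bus_bytes * mem_clock_mhz * 2 / 1000  # x2 for double data rate

gtx = mem_bandwidth_gbs(6, 900)   # 86.4 GB/s
gts = mem_bandwidth_gbs(5, 800)   # 64.0 GB/s
print(gtx, gts, round(gts / gtx, 2))  # GTS keeps roughly 74% of the GTX's bandwidth
```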


7900 GTX (left), 8800 GTS (middle), 8800 GTX (right)

The 8800 GTS has the same NVIO chip as the 8800 GTX, but the board itself is a bit shorter and it only features one SLI connector and one PCIe power connector.


Only one power connector on an 8800 GTS


...and only one SLI connector

Both cards are extremely quiet during operation and are audibly indistinguishable from a 7900 GTX.

Comments

  • JarredWalton - Wednesday, November 8, 2006 - link

    Page 17:

    "The dual SLI connectors are for future applications, such as daisy chaining three G80 based GPUs, much like ATI's latest CrossFire offerings."

    Using a third GPU for physics processing is another possibility, once NVIDIA begins accelerating physics on their GPUs (something that has apparently been in the works for a year or so now).
  • Missing Ghost - Wednesday, November 8, 2006 - link

    So it seems that by subtracting the single-card 8800 GTX power usage result from the highest 8800 GTX SLI result, we can conclude that the card can use as much as 205W. Does anybody know if this number could increase when the card is used in DX10 mode?
  • JarredWalton - Wednesday, November 8, 2006 - link

    Without DX10 games and an OS, we can't test it yet. Sorry.
  • JarredWalton - Wednesday, November 8, 2006 - link

    Incidentally, I would expect the added power draw in SLI comes from more than just the GPU. The CPU, RAM, and other components are likely pushed to a higher demand with SLI/CF than when running a single card. Look at FEAR as an example; here are the power differences for the various cards. (Oblivion doesn't have X1950 CF numbers, unfortunately.)

    X1950 XTX: 91.3W
    7900 GTX: 102.7W
    7950 GX2: 121.0W
    8800 GTX: 164.8W

    Notice how in this case the X1950 XTX appears to use less power than the other cards, but that's clearly not the case in single-GPU configurations, where it draws more than everything besides the 8800 GTX. Here are the Prey results as well:

    X1950 XTX: 111.4W
    7900 GTX: 115.6W
    7950 GX2: 70.9W
    8800 GTX: 192.4W

    So there, the GX2 looks like it is more power efficient, mostly because Quad SLI isn't doing any good. Anyway, simple subtraction relative to dual-GPU results isn't enough to determine the actual power draw of any card. That's why we presented the power data without a lot of commentary - we need to do further research before we come to any final conclusions.
  • IntelUser2000 - Wednesday, November 8, 2006 - link

    It looks like adding SLI uses about 170W more power. You can see how significant the video card is in terms of power consumption. It blows the Pentium D away by a couple of times.
  • JoKeRr - Wednesday, November 8, 2006 - link

    Well, keep in mind the efficiency of the PSU, generally around 80%, so as overall power draw increases, the marginal loss of power increases a lot as well. If you actually multiply by 0.8, it gives about 136W. I suppose the power draw is measured at the wall.
  • DerekWilson - Thursday, November 9, 2006 - link

    Max TDP of G80 is at most 185W -- NVIDIA revised this to something in the 170W range, but we know it won't go over 185W in any case.

    But games generally don't enable a card to draw max power ... 3DMark, on the other hand ...
  • photoguy99 - Wednesday, November 8, 2006 - link

    Isn't 1920x1440 a resolution that almost no one uses in real life?

    Wouldn't 1920x1200 apply to many more people?

    It seems almost all 23", 24", and many high-end laptops have 1920x1200.

    Yes, we could interpolate benchmarks, but why, when almost no one uses 1440 vertical?

  • Frallan - Saturday, November 11, 2006 - link

    Well, I have one more suggestion for a resolution. Full HD is 1920x1080 - that is sure to be found in a lot of homes in the future (after X-mas, anyone? ;0) ) on large LCDs - I believe it would be a good idea to throw that in there as well. Especially right now, since loads of people will have to decide how to spend their money. The 37" Full HD is a given, but on what system will I be gaming: PS3/Xbox/PC... Please advise.
  • JarredWalton - Wednesday, November 8, 2006 - link

    This should be the last time we use that resolution. We're moving to LCD resolutions, but Derek still did a lot of testing (all the lower resolutions) on his trusty old CRT. LOL
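
The power discussion in the comments above involves two bits of arithmetic worth writing out: the naive SLI-subtraction estimate and the PSU-efficiency correction. A minimal sketch (the 280W/485W readings and the 25W overhead are hypothetical placeholders; only the 170W delta and 80% efficiency come from the comments):

```python
# --- Naive SLI subtraction ---
# Hypothetical wall-power readings in watts; illustrative only.
single_card_system = 280.0
sli_system = 485.0
naive_card_power = sli_system - single_card_system  # attributes whole delta to the card
print(naive_card_power)  # 205.0

# Part of that delta actually comes from the CPU, RAM, and other components
# working harder under SLI, so this overstates the second card's draw.
assumed_platform_overhead = 25.0  # hypothetical extra non-GPU draw
adjusted_card_power = naive_card_power - assumed_platform_overhead
print(adjusted_card_power)  # 180.0

# --- PSU efficiency correction ---
# Wall-outlet measurements include PSU conversion losses; assuming ~80%
# efficiency, the DC power actually delivered to the components is lower.
wall_delta_w = 170.0
psu_efficiency = 0.80
dc_delta_w = wall_delta_w * psu_efficiency
print(dc_delta_w)  # 136.0
```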
