Power Requirements


Current generation graphics cards are near the limit of how much current they are allowed to pull from a single connection. So, of course, the solution is to add a second power connection to the card. That's right: the GeForce 6800 Ultra requires two independent connections to the power supply. Each line could probably share a connector with a fan without trouble, but ideally each should be free of any other load.

This is a bit of an inconvenience for people who (like the writer of this article) have four or more drives connected to their PCs. Power connections are a limited resource in a system, and this certainly doesn't help. Then again, it might just be worth it; we'll only make you wait a little longer to find out.

The card doesn't necessarily max out both lines (we are looking into measuring the amperage the cards actually draw), but NVIDIA indicated in the reviewer's guide we were supplied with that a 480W power supply should be used in conjunction with the 6800 Ultra.

There are a couple of factors at work here. First, obviously, the card needs a good amount of power. Second, power supplies partition the power they deliver. Look on the side of a power supply and you'll see a list of voltage rails and the maximum amperage each can supply. The wattage rating on a power supply usually indicates (for marketing purposes) the maximum wattage it could deliver if the maximum allowed current were drawn on every rail at once. It is not possible to draw all 350 watts of a 350 watt power supply across one connection (or even one rail). NVIDIA indicated that their card needs a stable 12 volt rail, but power supplies generally dedicate a large portion of their 12 volt amperage to the motherboard (since the motherboard draws the most power in the system across all rails).
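To make the arithmetic concrete, here is a minimal sketch; the rail figures are made up for illustration and don't come from any particular unit:

```python
# Hypothetical rail ratings for an illustrative "350W" power supply.
# Real units print these limits on the side label; the figures below
# are invented for the example.
rails = {
    "+3.3V": (3.3, 20.0),   # (volts, max amps) -> 66W
    "+5V":   (5.0, 25.0),   # -> 125W
    "+12V":  (12.0, 13.0),  # -> 156W
}

for name, (volts, amps) in rails.items():
    print(f"{name}: up to {volts * amps:.0f}W available on this rail alone")

total = sum(volts * amps for volts, amps in rails.values())
print(f"Sum across all rails: {total:.0f}W (roughly the wattage on the box)")

# No single rail, let alone a single connector, can deliver that total.
# A card that needs a stable 12V feed therefore cares about the 12V
# amperage specifically, not the headline wattage.
```

The takeaway is that two power supplies with the same wattage on the box can differ substantially in how much 12 volt current they can actually deliver.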

Many people have been worried about the heat generated by a card that requires two power connections. Just to be clear, the card doesn't draw twice the power because it has twice the connections, nor does it generate twice as much heat. It's a larger chip and it draws more power, but it isn't clocked as high (the 6800 Ultra comes in at 400MHz, as opposed to the 5950's 475MHz).
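As a rough first-order rule of thumb (a textbook approximation, not a figure supplied by NVIDIA), the dynamic power of a chip scales with its switched capacitance, the square of its core voltage, and its clock frequency:

$$ P_{dynamic} \approx \alpha \, C \, V^{2} \, f $$

where $\alpha$ is the activity factor. A larger die raises the effective $C$ term, while the lower 400MHz clock lowers $f$, so the two effects partially offset; that's why twice the connectors doesn't translate into twice the heat.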

Customers who end up buying this card will most likely need to upgrade their power supplies as well. Obviously, this isn't an optimal solution, and it will turn some people off. But for those who like the performance numbers, it may be worth the investment. There are, of course, rumors circulating the net about ATI's next generation solution as well, but we will have to wait a few weeks to see how they tackle the power problem.

Comments

  • Regs - Wednesday, April 14, 2004 - link

    Wow, very impressive. Yet very costly. I'm very displeased with the power requirements, however. I'm also hoping newer drivers will boost performance even more in games like Far Cry. I was hoping to see at least 60 FPS @ 1280x1024 w/ 4x/8x. Even though it's not really needed for such a game and might be overkill, it would have knocked me off my feet enough that I could overlook the PSU requirement. But ripping my system apart yet again for just a video card seems unreasonable at the asking price of 400-500 dollars.
  • Verdant - Wednesday, April 14, 2004 - link

    I don't think the power issue is as big as some make it out to be; some review sites used a 350W PSU with two connectors on the same lead and had no problems under load.
  • dragonballgtz - Wednesday, April 14, 2004 - link

    I can't wait till December, when I'll build myself a new computer and use this card. But maybe by then I'll get the PCI-E version.
  • DerekWilson - Wednesday, April 14, 2004 - link

    #11 you are correct ... I seem to have lost an image somewhere ... I'll try to get that back up. Sorry about that.
  • RyanVM - Wednesday, April 14, 2004 - link

    Just so you guys know, Damage (Tech Report) actually used a watt meter to determine the power consumption of the 6800. Turns out it's not much higher than a 5950.

    Also, it makes me cry that my poor 9700Pro is getting more than doubled up in a lot of the benchmarks :(
  • CrystalBay - Wednesday, April 14, 2004 - link

    Hi Derek, What kind of voltage fluctuations were you seeing... just kinda curious about the PSU...
  • PrinceGaz - Wednesday, April 14, 2004 - link

    A couple of comments so far...

    page 6 "Again, the antialiasing done in this unit is rotated grid multisample" - nVidia used an ordered grid before, only ATI previously used the superior rotated grid.

    page 8 - both pictures are the same, I think the link for the 4xAA one needs changing :)

    Can't wait to get to the rest :)
  • ZobarStyl - Wednesday, April 14, 2004 - link

    Dang, I've got a 450W... sigh. That power consumption is really gonna kill the upgradability of this card (but then again, the X800 is slated for double molex as well). I know it's a bit strange, but I'd like to see which of these top-end cards can provide the best dual-screen capability... any GPU worth its salt comes with dual-screen capabilities, and my dually config needs a new vid card and I don't even know where to look for that...

    And as for cost... these cards blow away the 9800XT and 5950; it won't be 3-4 fps above the other that makes me pick between an X800 and a 6800, it will be the price. Jeez, what are they slated to hit the market at, 450?
  • Icewind - Wednesday, April 14, 2004 - link

    Upgrade my PSU? I think not, NVIDIA! Let's see what you've got, ATI.
  • LoneWolf15 - Wednesday, April 14, 2004 - link

    It looks like NVidia has listened to its customer base. I'm particularly interested in the hardware MPEG 1/2/4 encoder/decoder.

    Even so, I don't run anything that comes close to maxing my Sapphire Radeon 9700, so I don't think I'll buy a new card any time soon. I bought that card as a "future-proof" card like this one is, and guess what? The two games I wanted to play with it have not been released yet (HL2 and Doom3, of course), and who knows when they will be? At the time, Carmack and the programmers for Valve screamed that this would be the card to get for these games. Now they're saying different things. I don't game enough any more to justify top-end cards; frankly, an All-In-Wonder 9600XT would probably be the best current card for me, replacing the 9700 and my TV Wonder PCI.
