Power Requirements


Current-generation graphics cards are already near the limit of how much current they are allowed to draw from a single power connection. So, of course, the solution is to add a second power connection to the card. That's right: the GeForce 6800 Ultra requires two independent connections to the power supply. Each line could probably share a connector with a fan without issue, but each line should really be free of any other load.

Of course, this is a bit of an inconvenience for people who (like the writer of this article) have 4 or more drives connected to their PCs. Power connections are a limited resource in PCs, and this certainly doesn't help. Then again, it might just be worth it. We'll only make you wait a little longer to find out.

The card doesn't necessarily max out both lines (and we are looking into measuring the amperage the card actually draws), but NVIDIA indicated in the reviewer's guide we were supplied that a 480W power supply should be used in conjunction with the 6800 Ultra.

There are a couple of factors at work here. First, obviously, the card needs a good amount of power. Second, power supplies generally partition the power they deliver. If you look on the side of a power supply, you'll see a list of voltage rails and amperages. The wattage rating on a power supply usually indicates (for marketing purposes) the maximum wattage it could supply if the maximum allowed current were drawn on every rail simultaneously. It is not possible to draw all 350 watts of a 350 watt power supply across one connection (or even one rail). NVIDIA indicated that its card needs a stable 12 volt rail, but power supplies generally dedicate a large portion of their 12 volt amperage to the motherboard (since the motherboard is the largest consumer in the system across all rails).
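To see why the headline wattage can't come out of one rail, it helps to do the label math. The sketch below uses hypothetical rail ratings for an illustrative 350W unit (the amperages are made up for the example, not taken from any specific power supply):

```python
# Hypothetical label ratings for an example 350 W power supply:
# rail name -> (voltage, max amps). These numbers are illustrative only.
rails = {
    "+3.3V": (3.3, 28.0),
    "+5V":   (5.0, 30.0),
    "+12V":  (12.0, 18.0),
}

# Each rail's individual maximum wattage is just volts * amps.
for name, (volts, amps) in rails.items():
    print(f"{name}: up to {volts * amps:.1f} W")

# The sum of the per-rail maxima usually exceeds the headline rating,
# which is capped by a combined limit -- so no single rail (let alone
# one connector) can deliver the full 350 W by itself.
total = sum(v * a for v, a in rails.values())
print(f"Sum of rail maxima: {total:.1f} W (vs. 350 W headline rating)")
```

With these example numbers, the 12 volt rail tops out at 216W on its own, which is why a card that leans on the 12 volt line cares about that specific rating rather than the number on the box.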

Many people have been worried about the heat generated by a card that requires two power connections. Just to be clear, two connectors don't mean the card draws twice the power, nor that it generates twice as much heat. It's a larger chip and it draws more power, but it isn't clocked as high (the 6800 Ultra comes in at 400MHz, as opposed to the 5950's 475MHz).

Customers who end up buying this card will most likely need to upgrade their power supply as well. Obviously this isn't an optimal solution, and it will turn some people off. But to those who like the performance numbers, it may be worth the investment. Rumors about ATI's next-generation solution are, of course, circulating the net as well; we will have to wait and see how they tackle the power problem in a few weeks.
77 Comments

  • Marsumane - Wednesday, April 14, 2004 - link

    This card owns... Anyone know when it ships to retail stores? Guesses even?
  • SpaceRanger - Wednesday, April 14, 2004 - link

    I'd like to see what ATI comes up with before I make my decision. I rushed to judgement back when the GF4 TI4600 came out, and regretted making the quick call to buy. If I don't have to get a new PSU for the ATI solution, I'll consider it, even if performance is 5-10FPS slower. Adding 100 bucks to the already costly 500 for the card doesn't justify the expenditure.
  • gordon151 - Wednesday, April 14, 2004 - link

    AtaStrumf is so right. More than likely you'll be able to buy the X800s before you can buy this.
  • Shinei - Wednesday, April 14, 2004 - link

    Well, I'm sold. Yeah, that sounds fanboyish, but this thing is a solid performer and doesn't require me to completely replace my display drivers... Even if ATI wins by five FPS and has a lens flare in a forgotten corner of a screenshot that you have to stare at for ten minutes to spot, my money is going to NV40--assuming the prices come down a little. ;)
    Speaking of DX9/PS2.0, what about a Max Payne 2 benchmark? I'm curious what NV40 can do on that game with maxed out everything... :)
  • skiboysteve - Wednesday, April 14, 2004 - link

i love anandtech's deep technical reviews, but y'all did nowhere near enough testing; the xbit article does a hell of a lot more testing, 48 pages!

    http://www.xbitlabs.com/articles/video/display/nv4...

the card absolutely destroys everything.

the anand tests don't show nearly the domination the xbit ones do...
  • AtaStrumf - Wednesday, April 14, 2004 - link

    I find it really funny when people say that they will wait until ATi releases their X800 to make up their buying decisions.

It's not like you can run out and BUY this card right now or tomorrow. Of course you will wait. You don't really have a choice :)
  • ChronoReverse - Wednesday, April 14, 2004 - link

The Tech Report tested the total power draw of this thing, and it was only slightly higher than the 5950's (both of which draw more than the 9800XT's).


    So it seems the recommendation isn't actually necessary (and my Enermax enhanced 12V lines will take it easily).
  • Pete - Wednesday, April 14, 2004 - link

    mkruer #27, all the reviews I've read mention $500 for the 6800U, and $299 for a 12-pipe 128MB 6800.
  • DerekWilson - Wednesday, April 14, 2004 - link

    #27,

    The 6800 Ultra (which we tested) will be priced at $500

    The 6800 (with 12 pipes rather than 16) will be priced at $300
  • Pete - Wednesday, April 14, 2004 - link

    quikah #26: FarCry comparison screens are at HOCP.

    http://hardocp.com/article.html?art=NjA2LDU=

    Apparently PS3 wasn't enabled, but the 6800U looks better than the 5950U running PS2. It's still uglier than the 9800XT, sadly. Banding abounds, both here and in FiringSquad's Lock-On screens. Puzzling, really. If the 6800U really runs FP32 as fast as FP16 within memory limits, I wonder if all it will take to get IQ on a level with ATi is forcing the 6800U to run the ATi path or removing the NV3x path's _pp hints.
