Power Requirements


Current-generation graphics cards are near the limit of how much current they are allowed to pull from a single power connection. So, of course, the solution is to add a second one. That's right: the GeForce 6800 Ultra requires two independent connections to the power supply. The lines could probably share a connection with a fan without problems, but each line should really be free of any other load.

Of course, this is a bit of an inconvenience for people who (like the writer of this article) have 4 or more drives connected to their PCs. Power connections are a limited resource in PCs, and this certainly doesn't help. Of course, it might just be worth it. We'll only make you wait a little longer to find out.

The card doesn't necessarily max out both lines (and we are looking into measuring the amperage the cards actually draw), but NVIDIA indicated in the reviewer's guide supplied to us that a 480W power supply should be used with the 6800 Ultra.

There are a couple of factors at work here. First, obviously, the card needs a good amount of power. Second, power supplies partition the power they deliver. Look on the side of a power supply and you'll see a list of voltage rails, each with its own amperage limit. The wattage rating on the label usually indicates (for marketing purposes) the maximum the unit could supply if the maximum allowed current were drawn on every rail simultaneously. It is not possible to draw all 350 watts of a 350 watt power supply across one connection (or even one rail). NVIDIA indicated that their card needs a stable 12 volt rail, but power supplies generally reserve a large portion of their 12 volt amperage for the motherboard, since the motherboard draws the most power in the system across all rails.
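The gap between a power supply's label wattage and what any single rail can actually deliver is easy to see with a little arithmetic. Here is a minimal sketch (the rail amperages are hypothetical, chosen only to add up to roughly a 350W-class unit, and are not from any specific power supply):

```python
# Hypothetical per-rail ratings for an illustrative ~350W unit:
# voltage (V) -> maximum current (A) allowed on that rail.
rails = {3.3: 20.0, 5.0: 30.0, 12.0: 11.0}

# The label wattage assumes every rail is simultaneously maxed out.
label_watts = sum(volts * amps for volts, amps in rails.items())

# A single rail can only ever deliver its own volts * amps,
# which is far below the label figure.
max_12v_watts = 12.0 * rails[12.0]

print(f"label: {label_watts:.0f} W, 12V rail alone: {max_12v_watts:.0f} W")
# -> label: 348 W, 12V rail alone: 132 W
```

This is why a card that leans hard on the 12 volt rail can demand a 480W unit even though it draws nowhere near 480 watts: what matters is the headroom left on that one rail after the motherboard takes its share.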

Many people have been worried about the heat generated by a card that requires two power connections. Just to be clear, the card doesn't draw twice the power because it has twice the connections, nor does it generate twice as much heat. It's a larger chip and it draws more power, but it isn't clocked as high (the 6800 Ultra comes in at 400MHz, as opposed to the 5950's 475MHz).

Customers who end up buying this card will most likely need to upgrade their power supply as well. Obviously this isn't an optimal solution, and it will turn some people off. But, to those who like the performance numbers, it may be worth the investment. And there are obviously rumors circulating the net about ATI's next generation solution as well, but we will have to wait and see how they tackle the power problem in a few weeks.
POST A COMMENT

77 Comments


  • Pete - Monday, April 19, 2004 - link

    Shinei,

    I did not know that. </Johnny Carson>

    Derek,

    I think it'd be very helpful if you listed the game version (you know, what patches have been applied) and map tested, for easier reference. I don't even think you mentioned the driver version used on each card, quite important given the constant updates and fixes.

    Something to think about ahead of the X800 deadline. :)
    Reply
  • zakath - Friday, April 16, 2004 - link

    I've seen a lot of comments on the cost of these next-gen cards. This shouldn't surprise anyone...it has always been this way. The market for these new parts is small to begin with. The best thing the next gen does for the vast majority of us non-fanbois-who-have-to-have-the-bleeding-edge-part is that it brings *todays* cutting edge parts into the realm of affordability. Reply
  • Serp86 - Friday, April 16, 2004 - link

    Bah! My almost 2 year old 9700pro is good enough for me now. i think i'll wait for nv50/r500....

    Also, a better investment for me is to get a new monitor since the 17" one i have only supports 1280x1024 and i never turn it that high since the 60hz refresh rate makes me go crazy
    Reply
  • Wwhat - Friday, April 16, 2004 - link

    that was to brickster, neglected to mention that Reply
  • Wwhat - Friday, April 16, 2004 - link

    Yes you are alone Reply
  • ChronoReverse - Thursday, April 15, 2004 - link

    Ahem, this card has been tested by some people with a high-quality 350W power supply and it was just fine.


    Considering that anyone who could afford a 6800U would have a good powersupply (Thermaltake, Antec or Enermax), it really doesn't matter.


    The 6800NU uses only one molex.
    Reply
  • deathwalker - Thursday, April 15, 2004 - link

Oh my god...$400 and u cant even put it in 75% of the systems on peoples desks today without buying a new power supply at a cost of nearly another $100 for a quality PS...i think this just about has to push all the fanatics out there over the limit...no way in hell your going to notice the performance improvement in a multiplayer game over a network..when does this madness stop. Reply
  • Justsomeguy21 - Monday, November 29, 2021 - link

    LOL, this was too funny to read. Complaining about a bleeding edge graphics card costing $400 is utterly ridiculous in the year 2021 (almost 2022). You can barely get a midrange card for that price and that's assuming you're paying MSRP and not scalper prices. 2004 was a great year for PC gaming, granted today's smartphones can run circles around a Geforce 6800 Ultra but for the time PC hardware was being pushed to the limits and games like Doom 3, Far Cry and Half Life 2 felt so nextgen that console games wouldn't catch up for a few years. Reply
  • Shinei - Thursday, April 15, 2004 - link

    Pete, MP2 DOES use DX9 effects; mirrors are disabled unless you have a PS2.0-capable card. I'm not sure why, since AvP1 (a DX7 game) had mirrors, but it does nonetheless. I should know, since my Ti4200 (DX8.1 compatible) doesn't render mirrors as reflective even though I checked the box in the options menu to enable them...
    Besides, it does have some nice graphics that can bog a card down at higher resolutions/AA settings. I'd love to see what the game looks like at 2048x1536 with 4xAA and maxed AF with a triple buffer... Or even a more comfortable 1600x1200 with same graphical settings. :D
    Reply
