Power Requirements


Current generation graphics cards are near the limit of how much current they are allowed to draw from a single power connection. So, of course, the solution is to add a second power connection to the card. That's right: the GeForce 6800 Ultra requires two independent connections to the power supply. Each line could probably share its connector with a fan without a problem, but each should really be free of any other load.

Of course, this is a bit of an inconvenience for people who (like the writer of this article) have 4 or more drives connected to their PCs. Power connections are a limited resource in PCs, and this certainly doesn't help. Then again, it might just be worth it. We'll only make you wait a little longer to find out.

The card doesn't necessarily max out both lines (and we are looking into measuring the amperage the card actually draws), but NVIDIA indicated in the reviewer's guide we were supplied that a 480W power supply should be used with the 6800 Ultra.

There are a couple of factors at work here. First, obviously, the card needs a good amount of power. Second, power supplies partition the power they deliver. Look on the side of a power supply and you'll see a list of voltage rails and their amperage limits. The wattage rating on a power supply usually indicates (for marketing purposes) the maximum it could deliver if the maximum allowed current were drawn on every rail simultaneously. It is not possible to draw all 350 watts of a 350 watt power supply across one connection (or even one rail). NVIDIA indicated that their card needs a stable 12 volt rail, but that power supplies generally dedicate a large portion of their 12 volt amperage to the motherboard (since the motherboard draws the most power in the system across all rails).
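The rail math above can be sketched as a quick back-of-the-envelope calculation. The rail figures below are made up for illustration and don't correspond to any specific power supply's label:

```python
# Hypothetical rail limits (volts -> max amps) for a "350W-class" unit.
# These numbers are illustrative, not taken from a real PSU label.
rails = {
    3.3: 28.0,
    5.0: 30.0,
    12.0: 15.0,
}

# The advertised wattage is roughly the sum of volts * max amps across all
# rails, so no single rail (let alone one connector) can deliver it alone.
per_rail_watts = {v: v * a for v, a in rails.items()}
label_total = sum(per_rail_watts.values())

print(f"label-style total: {label_total:.0f} W")
for volts, watts in sorted(per_rail_watts.items()):
    print(f"  {volts:>4} V rail max: {watts:.0f} W")
```

The point of the sketch: even a generous label total is spread across rails, and a graphics card pulling from the 12 volt rail competes with the motherboard for that rail's limited amperage.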

Many people have been worried about heat generated by a card that requires two power connections. Just to be clear, the card isn't drawing twice the power because it has twice the connections, nor is it generating twice as much heat. It's a larger chip and it draws more power, but it isn't clocked as high (the 6800 Ultra comes in at 400MHz as opposed to the 5950's 475MHz).

Customers who end up buying this card will most likely need to upgrade their power supply as well. Obviously this isn't an optimal solution, and it will turn some people off; to those who like the performance numbers, though, it may be worth the investment. There are also rumors circulating the net about ATI's next generation solution, but we will have to wait a few weeks to see how they tackle the power problem.
77 Comments


  • Da3dalus - Thursday, April 15, 2004 - link

    I'd like to see benchmarks of Painkiller in the upcoming NV40 vs R420 tests...
  • Brickster - Thursday, April 15, 2004 - link

    Am I the only one who thinks Nvidia's Nalu is the MOST bone-able cartoon out there?

    Oy, get the KY!
  • Warder45 - Thursday, April 15, 2004 - link

    Did any reviews try and overclock the card? Is it not possible with the test card?
  • DonB - Thursday, April 15, 2004 - link

    Would have been better if it had a coax cable TV input + TV tuner. For $500, I would expect a graphic card to include EVERYTHING imaginable.
  • Pete - Thursday, April 15, 2004 - link

    Shinei #37,

    "Speaking of DX9/PS2.0, what about a Max Payne 2 benchmark?"

    MP2 doesn't use DX9 effects. The game requires DX9 compatibility, but only DX8 compliance for full effects.

    Xbit-Labs has a ton of benches of next-gen titles as well, and is worth checking out. NV40 certainly redeems itself in the HL2 leak. :)
  • Wwhat - Thursday, April 15, 2004 - link

    Anybody happen to know if it's possible to use a second (old) PSU to run it, you can pick up cheap 235 watt PSU's and would be helped with both extra connectors and power.
    I'm not sure it won't cause 'sync' problems, though, as a small difference between the rails of 2 PSUs could cause one to drain the other if the card's connectors aren't decoupled enough from the AGP port.



  • Pumpkinierre - Thursday, April 15, 2004 - link

    Agree with you Trog #59 on the venting. Also, with DX9.0c having fp32 as spec, does this mean that FX series cards redeem themselves? (The earlier DX9 spec was fp24, which wasn't present on the FX GPUs, causing a juggling act between fp16 and fp32 to match performance and IQ.) Still, full fp32 on the FX cards might be too slow.
  • mrprotagonist - Thursday, April 15, 2004 - link

    What's with all the cheesy comments before the benchmarks? Anyone?
  • Cygni - Thursday, April 15, 2004 - link

    "what mobo and mobo drivers were used? i hear that the nforce2 provides an unfair performance advantage for nvidia"

    The test was on an Athlon 64 3400+ system, so I doubt it was using an nForce2. But yeah, I agree, the system specs were short. More details are required.
  • Brickster - Wednesday, April 14, 2004 - link

    Derek, what was that Monitor you used?

    Thanks!
