Efficiency and PFC




When we first looked at the efficiency results, we couldn't believe them and ran all of our tests again. Unfortunately, the testing is correct and the efficiency is as you see above. Particularly at lower loads, the Zeus 1200W isn't that great, but that's not a huge concern since most buyers likely don't intend to use it with less than 300W of power. This power supply was made for maximum power setups, and we don't expect stellar results at lower loads. Anyone even considering this power supply should plan on a minimum of 300W power draw - perhaps on servers or workstations that are rarely idle. As the graph shows, the best efficiency is reached with a power draw of 500W to 600W. If the goal is to stay in the 80% or higher efficiency range, 230VAC users will get this with power draws ranging from around 300W up to 1100W. 120VAC efficiency is quite a bit lower and only reaches 80% or higher in the 400W to 800W range, and 90VAC doesn't even break into the 80%+ range.
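As a rough rule of thumb, the wall draw for a given DC load follows directly from the efficiency figure. The short sketch below illustrates the relationship; the efficiency values used here are illustrative assumptions, not the measured curve from the graph.

```python
# Minimal sketch: converting a DC load into AC wall draw at a given efficiency.
# The efficiency values below are illustrative assumptions, not measured data.

def wall_draw(dc_load_w: float, efficiency: float) -> float:
    """AC power pulled from the outlet for a given DC load and efficiency."""
    return dc_load_w / efficiency

for dc_load, eff in [(300, 0.80), (600, 0.85), (1200, 0.75)]:
    print(f"{dc_load:4d}W DC at {eff:.0%} efficiency -> "
          f"{wall_draw(dc_load, eff):.0f}W from the wall")
```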


Power factor correction does very well, especially with lower input voltages; we haven't seen such a good result for a long time. 230VAC users will have to settle for average results of up to 0.975.
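For readers unfamiliar with power factor, it is the ratio of real power (watts) to apparent power (volt-amperes) drawn from the line. The sketch below shows what the 0.975 figure means in practice; the 1600W value is an assumed full-load wall draw rather than a measured number.

```python
# Minimal sketch: real power vs. apparent power for a given power factor.
# The 1600W wall draw is an assumption for illustration; 0.975 is the PF quoted above.

def apparent_power_va(real_power_w: float, power_factor: float) -> float:
    """Volt-amperes the line must supply to deliver a given real power."""
    return real_power_w / power_factor

real_power = 1600.0  # assumed full-load draw at the wall, in watts
for pf in (0.975, 0.90):
    va = apparent_power_va(real_power, pf)
    print(f"PF {pf:.3f}: {real_power:.0f}W real -> {va:.0f}VA apparent")
```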

Comments

  • thebackwash - Tuesday, July 8, 2008 - link

    At what point does the house's wiring really begin to be a factor in sustaining the power draw? I know you're measuring watts vs. volts, but I'd appreciate it if someone could give a practical range and a better explanation of the practical problems with running appliances with a high current draw. Since the voltage in a house is generally constant, is it the amperage that cannot be increased past a certain point?

    What I'm trying to ask, to anyone who can elucidate, is at what point does the house become the limiting factor as compared with the computer PSU?

    I know that if I run the air conditioner in a room and someone sends a job to the (laser) printer plugged in in the same room, it trips the circuit breaker downstairs, and everybody gets an 'oh gosh, that was silly' out of it. When does one have to get new wiring run in the house to run their über gaming rig/cluster running department of defense simulations?
  • Carnildo - Tuesday, July 8, 2008 - link

    > At what point does the house's wiring really begin to be a factor in sustaining the power draw?

    Right about here. 1200 watts at 75% efficiency means that, at full load, this PSU is drawing 1600 watts from the outlet. Most 120v house wiring is limited to 1800 watts (15 amps) per circuit, so if this thing is sharing an outlet with almost anything (say, a laser printer), you'll be blowing fuses on a regular basis.

    If your power is only 110v (common enough), then 1200 watts at ~73% efficiency is 1650 watts at the outlet, exactly the limit of what a 15-amp circuit can provide.
  • thebackwash - Tuesday, July 8, 2008 - link

    "Is it amperage…"

    I should say, "Is it amperage and the correlated wattage that can't be increased beyond a certain rating?" What's generally the bottleneck, or are the two tightly linked phenomena when it comes to real-world engineering limitations?
  • JarredWalton - Tuesday, July 8, 2008 - link

    If you're running 115VAC (i.e. in the US), then you need to look at the circuit that's tripping. It's probably a 15A circuit, which means that you can only run around 1700W worth of equipment on that circuit before you have the problems you describe. (115V * 15A = 1725W) The question is then how much power the various devices use.

    I wouldn't be at all surprised if your AC unit can pull upwards of 750W... and if it's a powerful model it could easily reach the 1250W and higher range. (Yup, AC is expensive!) A laser printer might use anywhere from 100W to 300W I suppose. I'd suggest getting something like a cheap Kill-A-Watt device and plugging the various power users into it.

    Also, don't forget that lights use power as well. That 60W light bulb uses 60W, so if you have a light fixture with three bulbs, there's another 180W (or 225W if you use 75W bulbs). I highly recommend fluorescent bulbs as a power-efficient alternative.
  • gameman733 - Tuesday, July 8, 2008 - link

    I think there's an error in this graph: http://images.anandtech.com/reviews/psu/2008/silve... (DC output at 12V, look at the left side: 12.12, 12.00, 11.88, 11.94, 11.40, out of order)
  • JonnyDough - Monday, July 7, 2008 - link

    The real problem with PSUs like this is that sometimes people that are well off and on their first build who want "the best" run out and buy something like this and absolutely do not need it. It just ends up wasting electricity, which we all know is largely derived from strip-mining/coal burning which is horrible for the ozone and natural habitats.
  • serchaing - Monday, July 7, 2008 - link

    This is actually a myth, one that I also thought to be true until recently. For example, a PC that requires 340W to operate will use 340W whether the power supply is a 450W or a 600W unit. PC Power and Cooling's web site dispels this and several other PSU myths here:

    http://www.pcpower.com/technology/myths/#m1
  • C'DaleRider - Tuesday, July 8, 2008 - link

    Not the PCP&C myths again. While one or two are actually correct, the modular cable "myth" they dispel has been proven, time and again by independent testing, to be just marketing fluff by PCP&C. Add to that the "single rail is better" myth PCP&C pushes - only taken up, by the way, after PCP&C absolutely failed at their design of the multi-rail Turbo Cool 1000W unit (it was horribly underpowered on the rails supplied and caused problems, and their solution, instead of fixing the rails and supplying proper voltage/amperage per rail, was to dump it for an easier-to-design single rail).

    But, outside of efficiency, you are correct that a power supply will only draw what is needed from the wall to run whatever is connected to it - no more, no less. So a 1kW PSU will only draw the current it needs from the wall to supply what's required of it, be it 200W or 900W. It's no more expensive to run a 1200W unit, again leaving efficiency out of the equation, than a 500W unit.

    And if you really look at power supplies and their construction, you'd notice that the high power units tend to be built better with better quality internals than lower wattage units.
  • JarredWalton - Tuesday, July 8, 2008 - link

    While that is technically true, efficiency comes into play. If a PSU reaches maximum efficiency with a load of 30-70% of the rated output, then a system that requires 350W should have a 500W PSU minimum, and for optimal efficiency you almost certainly wouldn't want anything larger than 1150W (*cough*).

    Personally, I try to shoot for around 30 to 50% load, but even my most powerful system only draws a rather piddly 400W at peak. With a roughly 80% efficient power supply, that means the system is only using in the vicinity of 320W. Idle power draw drops to under 200W (160W or less power used by the system). This is with a quad-core Q6600 G0 stepping running at 3.40GHz, 2x2GB DDR2-800 RAM, two HDDs, and dual HD 3870 cards. It's been running quite happily with a 650W power supply for over six months.
  • mattclary - Monday, July 7, 2008 - link

    Can anyone explain to me, or point me to info on, how it is that a power supply plugged into a 20 amp circuit can provide more than 20 amps?
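To tie the arithmetic in this thread together, here is a rough sketch assuming a 115V, 15A branch circuit and the roughly 75% full-load efficiency mentioned above; the other device wattages are hypothetical examples, and the last two lines address how a unit plugged into a 15-20A circuit can deliver far more than 20 amps.

```python
# Rough sketch of the arithmetic discussed in the comments above.
# Assumptions (not from the review): 115V / 15A branch circuit, ~75% full-load
# efficiency, and hypothetical wattages for the other devices on the circuit.

CIRCUIT_VOLTS = 115.0
CIRCUIT_AMPS = 15.0
circuit_capacity_w = CIRCUIT_VOLTS * CIRCUIT_AMPS        # 1725W

psu_dc_output_w = 1200.0
full_load_efficiency = 0.75
psu_wall_draw_w = psu_dc_output_w / full_load_efficiency  # 1600W at the outlet

other_loads_w = {"laser printer (printing)": 300.0, "lights": 180.0}  # hypothetical
total_w = psu_wall_draw_w + sum(other_loads_w.values())

print(f"Circuit capacity:  {circuit_capacity_w:.0f}W")
print(f"PSU at full load:  {psu_wall_draw_w:.0f}W from the wall")
print(f"With other loads:  {total_w:.0f}W -> breaker trips: {total_w > circuit_capacity_w}")

# Why the output amps can exceed the input amps: power (watts) is what is
# conserved (minus losses), while the voltage drops from 115V to 12V, so the
# current rises by roughly the same factor.
wall_amps = psu_wall_draw_w / CIRCUIT_VOLTS
rail_amps = psu_dc_output_w / 12.0
print(f"~{wall_amps:.1f}A at 115V in, up to ~{rail_amps:.0f}A total at 12V out")
```

In short, the breaker limits the current drawn at 115V; once the PSU converts that power down to 12V, the same number of watts corresponds to many more amps on the output side.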
