Introduction

Have you ever thought about how much it costs to run your PC -- the one you're using to read this article? What does it cost to play games, surf the Internet, or download files? It all costs money -- money that you, your parents, or whoever is in charge of the monthly electricity bill will have to pay. Those of you in charge of paying this bill will surely be interested in keeping costs down, which is why you might want to pay a little more attention to what sort of hardware you are using in your computer.

Many users -- especially computer enthusiasts -- put together a new PC that can easily handle any task, without much thought for power efficiency. If you intend to use the computer primarily for gaming, buying a high-end processor and graphics card makes sense. Likewise, if you intend to do complex 3D animations or movie encoding, you'll probably want as much processor power as possible. If all you're going to do is watch movies, run Microsoft Office, and surf the Internet, you're not going to put a big load on any of the components. In that case, your PC will typically be idle and waiting for user input, while any high-power components will still go merrily along sucking down extra power.

We recently looked at the topic of power consumption for each component in the PC. Of course, the numbers were merely a rough estimate for our specific setup, programs, and tasks, but that article can serve as a baseline for the amount of power your system might require. We also discussed how power requirements affect the type of power supply that you will want to purchase. In this article, we want to focus more specifically on the costs of running a computer (not counting anything like broken components and upgrades). We look at electricity prices in the US and Europe to calculate how much various types of PCs actually cost to run. Perhaps you're one of those people with multiple systems -- one for gaming, one for office work, maybe one or two for the kids, and perhaps a few extras running distributed computing tasks 24/7. We will look at several different workloads to see how much various types of systems actually end up costing on an hourly, daily, and yearly basis.

kWh Prices in the US and EU

When we started researching electricity prices (measured in kilowatt-hours, or kWh) for different countries, we were surprised by the huge differences. In the US, prices range from $0.05 to $0.21 per kWh according to the Energy Information Administration, with an average of $0.089 per kWh. European prices differ for each country, so we will just take Germany as an example; prices there are high relative to the US but about average for Europe. In 2008, Germany averaged 17 to 22 Euro cents per kWh -- about $0.22 to $0.29 USD! That's anywhere from 1.5 to 6 times as expensive in the old world, depending on where you live. Obviously, areas where costs are higher will probably be more interested in PC power consumption, but that is a separate issue from what we are looking at today.
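The underlying arithmetic is simple: wall power (in kW) times hours of use times the price per kWh. Here is a minimal sketch of the calculation using the average prices above; the 150W draw and eight-hour day are hypothetical example figures, not measurements from our test systems.

```python
# Rough yearly running-cost estimate: wall power (W) x hours/day x $/kWh.
# The wattage and usage pattern below are illustrative assumptions.

def yearly_cost(watts, hours_per_day, price_per_kwh):
    """Return the yearly electricity cost for a device at a given rate."""
    kwh_per_year = watts / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

# A hypothetical 150W PC running 8 hours/day:
us = yearly_cost(150, 8, 0.089)  # US average rate
de = yearly_cost(150, 8, 0.25)   # mid-range German rate in USD
print(f"US: ${us:.2f}/year, Germany: ${de:.2f}/year")
```

The same function works for any appliance; only the wattage and duty cycle change.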

Calculating Power Requirements and Costs

  • JarredWalton - Friday, November 14, 2008 - link

    If the laptop isn't plugged in, the power brick should use 0W (or at least less than 1W).
  • MadMan007 - Friday, November 14, 2008 - link

    ...and this quote is the most important one that made me decide it's not economically meaningful to upgrade from a ~75% PSU to an 85% one. When you do these estimates for non-24/7 use, the savings plummet quickly.
  • MadMan007 - Friday, November 14, 2008 - link

    Grr...quote window didn't work right, why can't we just use tags?

    Anyway here's the quote:
    "If you only run the system eight hours per day, however, the difference in cost drops off quickly."
  • Nfarce - Friday, November 14, 2008 - link

    "Hopefully we've made it clear that upgrading an existing power supply to a higher efficiency model purely for the power savings doesn't make sense."

    I am not, nor have I ever been, concerned about how much power my PCs use (or my PS3). Compared to other "hobbies" such as street racing, cruising, spending $50/night bar hopping, and other things people get involved with and in trouble over, PC and console gaming at home is cheap and relatively environmentally friendly. Besides, the logic behind spending hundreds on a higher-efficiency PSU to lower utility bills is about as brilliant as spending $30,000 on a new hybrid Camry to save money on gas. But if it makes you feel better about yourself, hey, it's *your* money.

    However, as we shift to a new administration in the States next year which has already stated it wants to target the coal industry, I might have a change of tune. We will see utilities skyrocket with the green syndrome of progressing to wind farms and solar power that just won't make up for coal fired plants. We already know the environmentalists and other special interest hacks here will poo-poo on nuclear power.

    Talk to me in two years...
  • Griswold - Friday, November 14, 2008 - link

    About time you share our energy pain in Europe, then. :P
    You're still not where we are as far as gasoline goes...
  • 7Enigma - Monday, November 17, 2008 - link

    Then blame your government. Your high gas prices are a direct result of high taxes (likely to pay for the universal healthcare), not that we in the US get a better deal.
  • yyrkoon - Friday, November 14, 2008 - link

    Using less power will *always* benefit a household more than anything else when saving money on electricity is the concern. It is also not just a one-item deal when trying to figure out how to cut power costs. Refrigerators and deep freezers commonly use more power than anything else in most households. Microwaves, coffee makers, rice cookers, hair dryers, etc. can all use more power, but typically run for far less time. Another place to save on power costs would be changing the type of lighting one uses, say from incandescent lighting to LED lighting.

    However, as many people have said many times before: there is no such thing as a free lunch. Saving power by using a more efficient light, for example, is going to cost you money up front -- with LED lights in particular, you're going to pay a premium for that efficiency. So in the short term, the best way to save money is just to turn an item off when not in use. This goes for VCRs, computers, or whatever does not need to be plugged in *right_now* (and yes, most of us should know that most appliances draw at least some power when off but still plugged in). Even going completely off-grid (meaning you get your power 100% from solar, wind, or multiple other sources) is going to be just like paying your power bill up front, with recurring charges for batteries and maintenance for your equipment. In the latter case, expect to pay tens of thousands of US dollars just for the price of admission.

    Now, as far as power supplies specifically are concerned: yes, a more efficient power supply *will* save you money. How much really depends, and there are other factors to consider than how efficient it *is*. You need to determine how much power your system will actually consume, and procure a PSU that is most efficient at that load level. Just because a power supply is 99.9% efficient at one point does not mean it will work well for your given application. Other factors are longevity and reliability. Data centers often purchase PSUs where the systems using them only draw 25-40% of the PSU's capacity. This is why current technology is trending towards power supplies with a broader efficiency range (i.e. they are most efficient at the point on the power curve where they are expected to be loaded). That said, the power supplies used by data centers and the like are usually not of the off-the-shelf variety.
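    [Editor's note: the savings this comment and the article describe follow directly from the efficiency ratio. A minimal sketch, with an assumed 200W DC load, eight hours of use per day, and the US average rate -- all illustrative numbers, not measurements:]

    ```python
    # Sketch: annual savings from a more efficient PSU.
    # DC load, hours, and rate below are assumed example values.

    def wall_draw(dc_load_watts, efficiency):
        """AC power drawn at the wall for a given DC load and PSU efficiency."""
        return dc_load_watts / efficiency

    def annual_cost(watts, hours_per_day, price_per_kwh):
        return watts / 1000 * hours_per_day * 365 * price_per_kwh

    load, hours, price = 200, 8, 0.089  # assumed load (W), hours/day, $/kWh

    cost_75 = annual_cost(wall_draw(load, 0.75), hours, price)
    cost_85 = annual_cost(wall_draw(load, 0.85), hours, price)
    print(f"75% PSU: ${cost_75:.2f}/yr, 85% PSU: ${cost_85:.2f}/yr, "
          f"savings: ${cost_75 - cost_85:.2f}/yr")
    ```

    At those numbers the difference is under ten dollars a year -- which is why upgrading a PSU purely for efficiency rarely pays off unless the machine runs 24/7.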

  • Staples - Friday, November 14, 2008 - link

    There are a ton of people who leave their computers on 24/7 for no good reason. I am a tree hugger, and of course I put mine into S3 sleep if I even walk away for more than 10 minutes. Plus, my second computer is very low power because I bought really low-power parts for it, including one of the most important: integrated video.

    In my main computer, I have an ATI 4850, which sucks a lot of power even when idle, and I have a guilty conscience about even using it for non-gaming needs. Hybrid VGA power states hardly exist now, but I am glad they will be coming eventually, because powerful video cards sitting idle are one of the biggest wastes of power. Also, I am glad that Vista has Cool'n'Quiet built in, because most people do not even know you need software to make it work (unlike Intel's SpeedStep, which works without any software).
  • cyclo - Saturday, November 15, 2008 - link

    This is where nVidia currently has ATI beat. I'm not sure about nVidia's cards on the lower end of the scale but on the GTX 2xx class of cards, they implement a power saving "2D" mode when the GPU is mostly idling (basically when not playing games or videos).

    On my GTX 260, the GPU core downclocks to 301MHz (from 621), the shader to 602 (from 1295), and the memory to 200 (from 2052) when I am just surfing the web (which is basically "2D" mode). The clocks go back up to default as soon as I start playing a video, and of course when I start a game. The temps in "2D" mode go down to 47C, from 54C in idle "3D" mode (playing a video).

    There is one problem, though, and I hope nVidia can fix it with a future driver release: when you run two monitors, the video card never goes into "2D" mode... even when you are not gaming or playing a video. This is why I am forced to disable my second monitor whenever I don't have a need for it.
  • JarredWalton - Saturday, November 15, 2008 - link

    ATI has been doing the same thing for about as long as NVIDIA. There was an issue with the 4870 initially where the power saving modes didn't engage properly, but that has been fixed for a while now. I think NVIDIA is more aggressive about dropping clocks and reducing voltages, however.

    Speaking of multi-monitor support, wasn't there a problem with NVIDIA cards and dual monitors in certain 3D engines? I also seem to recall hearing that the second display gets shut off in all 3D games on NVIDIA hardware. Maybe that was fixed as well, though.
