Smaller Die + More Performance = More Power

Power isn't going to be straightforward here, as this is both a die shrink and an overclock. If all other things were equal, the die shrink would have enabled some power savings, but increasing the clock speeds (and likely voltages) means we have factors at work that push against each other. As for which will win, let's take a look at the data and find out.
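To make the tradeoff concrete, the standard first-order model for dynamic power is P ≈ C·V²·f: the shrink reduces switched capacitance, while the overclock raises frequency (and likely voltage). The sketch below works through that arithmetic in Python; the scaling factors are purely illustrative assumptions on our part, not measured values for GT200 or GT200b.

```python
# First-order dynamic power model: P ~ C * V^2 * f.
# All scaling factors below are illustrative assumptions, not measured values.

def dynamic_power_ratio(cap_scale, voltage_scale, freq_scale):
    """Relative dynamic power after scaling capacitance, voltage, and clock."""
    return cap_scale * voltage_scale**2 * freq_scale

# Hypothetical: the 55nm shrink trims switched capacitance by ~15%, while the
# factory overclock raises the core clock ~8% and the core voltage ~3%.
ratio = dynamic_power_ratio(cap_scale=0.85, voltage_scale=1.03, freq_scale=1.08)
print(f"Relative dynamic power: {ratio:.2f}x")  # ~0.97x: nearly a wash
```

Under these made-up numbers the two effects roughly cancel, which is exactly why the measurements have to settle the question.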

Since we didn't take a look at power in our GeForce GTX 295 article, we'll keep an eye on that card as well. Also, keep in mind that 55nm GTX 260s have been slowly phased in, but our GTX 260 parts are 65nm. The 55nm GTX 260s will show a power advantage over similarly clocked 65nm GTX 260s.

Idle power shows that NVIDIA is able to get some power savings when nothing is going on with the GPU. Power draw at idle decreased by about 10W with the move to 55nm, which shows that the die shrink helps in addition to NVIDIA's power saving features. This advantage carries over to SLI as well, with the GTX 285 SLI landing between the two single-card dual-GPU systems.

The GeForce GTX 295 slides in just above the single-GPU 4870 1GB, while AMD's 4870 X2 consumes about 10W more than NVIDIA's higher performing dual-GPU card.

We see a different story when we look at load power. In spite of the die shrink, the added overclock pushes the GeForce GTX 285 higher under load than any other single-GPU part. When SLI is enabled, this becomes the most power hungry dual-card setup we tested.

As for the GeForce GTX 295, we once again see good performance with lower power draw than the Radeon HD 4870 X2 and, in fact, lower power draw than any other dual-GPU setup we tested.

While a half-node die shrink isn't the holy grail of power savings, the major advantage for NVIDIA comes from the decrease in die size. We don't have measurements of the GPU after the shrink (we don't want to tear apart our hardware until we've tested things like 3-way SLI), but with the massive size of GT200 and the heavy price cuts NVIDIA was forced to make shortly after launch, the cost savings are a very important factor in this move.
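As a rough illustration of why the shrink matters for cost, the sketch below estimates gross die candidates per 300mm wafer using a common approximation. The die areas (about 576 mm² for the 65nm GT200 and around 470 mm² for the 55nm GT200b) are figures reported elsewhere rather than our own measurements, and the formula ignores yield, which also tends to improve as dies get smaller.

```python
import math

def gross_dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Common approximation: usable wafer area minus an edge-loss term."""
    radius = wafer_diameter_mm / 2
    return int(math.pi * radius**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

# Die areas are widely reported figures, not our own measurements.
for name, area in [("GT200 (65nm)", 576), ("GT200b (55nm)", 470)]:
    print(f"{name}: ~{gross_dies_per_wafer(area)} die candidates per wafer")
```

That works out to roughly 94 versus 119 candidates per wafer, about 25% more dies from the same silicon before yield is even considered.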

NVIDIA needs to keep its price competitive, and that means it needs to keep its costs down. Building an overclocked GTX 280 helps justify a higher price, while building the part at 55nm helps lower the cost. NVIDIA wants this card to be successful.

Comments

  • DerekWilson - Saturday, January 17, 2009 - link

    It comes with a DVI to HDMI converter. It also carries sound.
  • Daeros - Sunday, January 18, 2009 - link

    With Nvidia you only get HDMI sound if your mobo or sound card has the 2-pin SPDIF connector. With ATI, the card actually has an audio controller built in.
  • jay401 - Friday, January 16, 2009 - link

    Call me when it's <$300.
  • Average Joe - Friday, January 16, 2009 - link


    As the owner of a 22" LCD, I think 22" is the perfect size. 24" monitors didn't seem worth the extra 200 dollars they cost at the time, not to mention that my poor 8800 GT 512 would be stuck on medium for everything.
    I play games at 1680 x 1050.

    I don't like to run SLI rigs because I'm not willing to deal with the noise, power, and heat the other 85% of the time when I'm not playing Crysis.

    I buy single cards and usually midrange motherboards like the P45. The 4870 X2 seems like a waste to me personally. I'm as likely to be playing Civ, Guild Wars, or Total War as I am Fallout or Crysis, so I'm only going to be using that 2nd chip some of the time. I think I'm still more likely to buy one of those X2 boards someday than I am to buy a second graphics card that's the exact make and model as the one I have. The cards change too fast.

    Fortunately, having the 22" LCD means I can get by pretty easily with just a single card. I keep reading in forums that ATI driver support ain't where it should be for Vista 64. I don't know if that's deserved or not, but I'm avoiding ATI for now. I'm probably going to buy a GTX 280 or GTX 285 simply because one of those CAN play Crysis at 1680x1050 at max settings with a single card, and a GTX 260 can't, or barely can. 38.5 FPS is playable with some choppiness; 30 FPS is "time to think about medium quality" when things get busy. I'm not sure how much less power a 280 uses vs. two GTX 260s, but I bet it makes less noise. I don't want to sit in front of a leaf blower when I'm running TurboTax.


    I saw a 285 on Newegg for $370 today; that's less than some GTX 280s. I might wait for the pricing-war carnage to bring the 280s down and get one of those.
  • FAHgamer - Saturday, January 17, 2009 - link

    You have heard that ATI's Vista 64 drivers are sub-par?

    However, I've heard that nVidia's drivers are far behind ATI's when it comes to (any flavor of) Vista.

    While it might be true that the Vista 64 Catalyst drivers have issues (I don't know; I'm using Vista 32 Ultimate and the drivers are fine), I believe you would be far worse off with an nVidia card.
  • Stonedofmoo - Saturday, January 17, 2009 - link

    I don't know where you heard that Nvidia's x64 drivers are behind ATI's, as that's nonsense. If you look, you will see that all the review sites use Vista x64 SP1 to do their reviews. They wouldn't do that if they were being held back by poor drivers.

    In my experience the x64 drivers are every bit as good as the x86 drivers, and Nvidia's are better than ATI's despite ATI's more frequent driver releases. That's half of ATI's trouble: by sticking to their monthly schedule, quality control suffers.
  • FAHgamer - Sunday, January 25, 2009 - link

    Huh?

    Who are you replying to? You got me a little confused here...
