Smaller Die + More Performance = More Power

Power isn't going to be straightforward here, as this is both a die shrink and an overclock. If all other things were equal, the die shrink would have enabled some power savings, but increasing the clock speeds (and likely voltages) means we have factors pushing against each other. As for which will win, let's take a look at the data and find out.
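As a rough illustration of that tug-of-war, the standard CMOS dynamic power approximation, P ≈ C·V²·f, can be sketched in a few lines of Python. The 602MHz and 648MHz core clocks below are the published GTX 280 and GTX 285 specs, but the capacitance and voltage figures are purely illustrative assumptions, not measurements:

    # Sketch of the CMOS dynamic power relation P ~ C * V^2 * f.
    # The capacitance and voltage values below are assumed, not measured.

    def dynamic_power(capacitance, voltage, frequency):
        """Relative dynamic power from the C * V^2 * f approximation."""
        return capacitance * voltage ** 2 * frequency

    baseline = dynamic_power(1.00, 1.18, 602e6)       # 65nm GTX 280 at stock
    shrink_only = dynamic_power(0.85, 1.18, 602e6)    # hypothetical 55nm part at stock clocks
    shrink_and_oc = dynamic_power(0.85, 1.22, 648e6)  # 55nm GTX 285 with an assumed voltage bump

    print(f"shrink only:        {shrink_only / baseline:.0%} of baseline")   # ~85%
    print(f"shrink + overclock: {shrink_and_oc / baseline:.0%} of baseline") # ~98%

Under these assumed numbers, the overclock claws back almost all of the shrink's savings, which is exactly the tension the measurements below should reveal.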

Since we didn't look at power in our GeForce GTX 295 article, we'll keep an eye on that card as well. Also, keep in mind that 55nm GTX 260s are slowly being phased in, but our GTX 260 parts are 65nm. The 55nm GTX 260s will show a power advantage over similarly clocked 65nm GTX 260s.

Idle power shows that NVIDIA is able to get some power savings when the GPU has nothing to do. Power draw at idle decreased by about 10W with the move to 55nm, which shows that, in addition to NVIDIA's power saving features, the die shrink does help. This advantage carries over to SLI as well, with the GTX 285 SLI landing between the two single-card dual-GPU systems.

The GeForce GTX 295 slides in just above the single-GPU 4870 1GB, while AMD's 4870 X2 consumes about 10W more than NVIDIA's higher-performing dual-GPU card.

We see a different story when we look at load power. In spite of the die shrink, the added overclock pushes the GeForce GTX 285 higher under load than any other single-GPU part. When SLI is enabled, this becomes the most power-hungry dual-card setup we tested.

As for the GeForce GTX 295, we once again see good performance, with lower power draw than the Radeon HD 4870 X2 and, in fact, less than any other dual-GPU setup we tested.

While a half-node die shrink isn't the holy grail of power savings, the major advantage for NVIDIA comes from the decrease in die size. We don't have measurements of the GPU after the shrink (we don't want to tear apart our hardware until we've tested things like 3-way SLI), but given the massive size of GT200 and the heavy price cuts NVIDIA was forced to make shortly after launch, the cost savings are a very important factor in this move.
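To put the die size argument in rough perspective, here is a back-of-the-envelope sketch using commonly cited die sizes (about 576mm² for the 65nm GT200 and about 470mm² for the 55nm GT200b; both are reported figures, not our own measurements) and a standard gross-dies-per-wafer approximation. It ignores yield entirely, so treat the output as illustrative:

    import math

    def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
        """Crude gross-die count: wafer area over die area, less an edge-loss term."""
        wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
        edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
        return int(wafer_area / die_area_mm2 - edge_loss)

    gt200_65nm = 576    # commonly cited GT200 die size (mm^2)
    gt200b_55nm = 470   # commonly cited 55nm GT200b die size (mm^2)

    print(dies_per_wafer(300, gt200_65nm))   # ~94 gross dies per 300mm wafer
    print(dies_per_wafer(300, gt200b_55nm))  # ~119 gross dies per 300mm wafer

Even before any yield improvement from the smaller die, that works out to roughly 25% more candidate chips per wafer, which is where the real cost savings come from.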

NVIDIA needs to keep its prices competitive, and that means keeping its costs down. An overclocked GTX 280 justifies a higher price, while building the part at 55nm lowers its cost. NVIDIA wants this card to be successful.

Comments

  • MadMan007 - Thursday, January 15, 2009 - link

    The benchmark numbers are there below the graphs but I agree that charting 2560x1600 isn't very realistic. Maybe the benchmarkers are getting a little out of touch with what real people have for monitors.
  • Beno - Thursday, January 15, 2009 - link

    ffs it's been 2 years and we still can't get past the 100 fps barrier in Crysis at 1650x!!

    every new card ATI and NV make only gives around an extra 10 fps in that game :(
  • MadMan007 - Thursday, January 15, 2009 - link

    One detail that's not clear, and this is partly because of NV's confusingly named releases, is which GTX 260 is included in the charts. We know it's not the 55nm part, but is it 192 or 216 shaders? Lots of websites forget to put this detail in their testing; just writing GTX 260-192 or -216 would make it clear. Thanks.
  • jabber - Thursday, January 15, 2009 - link

    ....those bizarre S-Video outputs?

    Why not something more useful? Or just drop them completely.
  • Odeen - Thursday, January 15, 2009 - link

    The S-Video outputs are industry standard and are used to connect to SDTV sets. I don't see what's so bizarre or useless about them.
  • jabber - Friday, January 16, 2009 - link

    But who uses them?

    I've never seen anyone use them, and I haven't read about anyone trying for years. When they did, all those years ago, the VIVO thing was a mess and a pain to get working.

    Just seems pointless now, especially for SDTV.
  • MadMan007 - Friday, January 16, 2009 - link

    While it's an S-Video-looking output, it's not just for S-Video; I believe they're used for component output as well.
  • SpacemanSpiff46 - Thursday, January 15, 2009 - link

    Any reason the 4850 X2 is being neglected so much? I haven't seen any reviews of this card. Also, it would be nice to see how the 9800GX2 stacks up against these cards.
  • bob4432 - Thursday, January 15, 2009 - link

    I wonder the same thing myself - the 4850 is a good card on its own and the price is very nice. Add in the many people already running a 4850, and this could be a very attractive upgrade - let's see some 4850 CF setup numbers/comparisons too.
  • Sunagwa - Friday, January 16, 2009 - link

    I have to agree. I always go for the most value when I purchase my parts.

    Granted, "value" can easily be taken out of context given obviously wide-ranging incomes.

    For me, however, the 4850 (this time around; I am a PC gamer at heart) was the clear choice when I purchased it.

    Getting back on topic, I would love to see the CF setup as well as the dual-GPU setup included in your review, if only to compare the performance and possible upgrade potential of my current computer against your test bed.

    Just a side note for those who care, but my C2Duo Wolfdale OC'd to 4GHz that I paid $160 US for has me very happy, and I could care less about Core i7... wait... no, I could not. 8)

    Regards,
    Sunagwa
