Power and Power Management

Power is a major concern for tech companies going forward, and adding features "because we can" isn't the modus operandi anymore. Now it's cool (pardon the pun) to focus on power management, performance per watt, and similar metrics. To that end, NVIDIA has beaten its GT200 into such submission that its 2D power consumption can drop as low as 25W. As we will show below, this has a very positive impact on idle power for a very powerful bit of hardware.

These enhancements aren't breakthrough technologies: NVIDIA is simply using clock gating and dynamic voltage and clock speed adjustment to achieve these savings. Hardware on the GPU monitors utilization and automatically sets the clock speeds to different performance modes (off via HybridPower, 2D/idle, HD video, or 3D/performance). Mode changes happen on the millisecond level. This is very similar to what AMD has already implemented.
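To make the mechanism concrete, here is a minimal sketch of what utilization-driven mode switching like this could look like. The thresholds, voltages, and most clock values are our own illustrative assumptions (only the 602MHz 3D core clock echoes the GTX 280's published spec), not NVIDIA's actual control logic:

```c
/* Illustrative sketch of utilization-driven performance-state selection,
 * loosely modeled on the behavior described above. All thresholds and
 * voltage/clock values are assumptions, not NVIDIA's real numbers. */
#include <stdio.h>

typedef enum { MODE_HYBRID_OFF, MODE_IDLE_2D, MODE_HD_VIDEO, MODE_3D_PERF } perf_mode;

typedef struct { int core_mhz; float vcore; } mode_settings;

static const mode_settings settings[] = {
    [MODE_HYBRID_OFF] = {   0, 0.00f },  /* GPU powered down, display handled by IGP */
    [MODE_IDLE_2D]    = { 300, 1.00f },  /* desktop / 2D work */
    [MODE_HD_VIDEO]   = { 400, 1.05f },  /* video decode needs more than 2D clocks */
    [MODE_3D_PERF]    = { 602, 1.18f },  /* full 3D clocks */
};

/* Pick a mode from sampled utilization; imagine this runs every few ms. */
static perf_mode select_mode(int util_pct, int video_active, int igp_available)
{
    if (igp_available && util_pct == 0 && !video_active)
        return MODE_HYBRID_OFF;   /* HybridPower: hand the display off to the IGP */
    if (util_pct > 60)
        return MODE_3D_PERF;      /* heavy 3D load */
    if (video_active)
        return MODE_HD_VIDEO;
    return MODE_IDLE_2D;
}

int main(void)
{
    int samples[][2] = { {0, 0}, {5, 1}, {80, 0}, {10, 0} }; /* {util%, video?} */
    for (int i = 0; i < 4; i++) {
        perf_mode m = select_mode(samples[i][0], samples[i][1], 1);
        printf("util=%2d%% video=%d -> core %d MHz @ %.2f V\n",
               samples[i][0], samples[i][1],
               settings[m].core_mhz, settings[m].vcore);
    }
    return 0;
}
```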

With rising transistor counts and huge GPUs carrying lots of memory, power can't stay low all the time. Eventually the hardware actually has to do something, and then voltages rise, clock speeds increase, and power is converted into dissipated heat and frames per second. It's hard to say which is more impressive: the power saving features at idle, or the power draw at load.

There is an in-between stage for HD video playback that runs at about 32W, and it's good to see attention paid to this issue specifically. This bodes well for mobile chips based on the GT200 design, though on the desktop it isn't as mission critical. Yes, reducing power (and thus what I have to pay my power company) is a good thing, but plugging a card like this into your computer is like driving an exotic car: if you want the experience, you've got to pay for the gas.
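For perspective on what those watts actually cost, here's a quick back-of-the-envelope calculation. The electricity rate and hours of use are assumptions for illustration; only the 25W idle and 32W playback figures come from above:

```c
/* Back-of-the-envelope energy cost: watts * hours -> kWh -> dollars.
 * Rate and usage pattern are assumed for illustration. */
#include <stdio.h>

int main(void)
{
    const double rate_per_kwh = 0.10;  /* assumed electricity rate, $/kWh */
    const double idle_w  = 25.0;       /* idle draw cited above */
    const double video_w = 32.0;       /* HD playback draw cited above */
    const double idle_hours_per_day  = 6.0;  /* assumed usage */
    const double video_hours_per_day = 2.0;

    double kwh_per_year = (idle_w * idle_hours_per_day +
                           video_w * video_hours_per_day) * 365.0 / 1000.0;
    printf("~%.0f kWh/year, ~$%.2f/year at $%.2f/kWh\n",
           kwh_per_year, kwh_per_year * rate_per_kwh, rate_per_kwh);
    return 0;
}
```

At those assumed numbers the idle and playback draw amount to only a few dollars a year; it's the load power, covered next, that changes the math.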

Idle Power 

Idle power this low is definitely nice to see. Having high-end cards idle near midrange solutions from previous generations is a step in the right direction.

Load Power 

But as soon as we open up the throttle, the power miser goes out the door and the joules start flooding in by the bucket.

Cooling NVIDIA's hottest card isn't easy, and you can definitely hear the beast moving air. At idle, the GPU is as quiet as any other high-end NVIDIA GPU. Under load, as the GTX 280 heats up, the fan spins faster and moves much more air, which quickly becomes audible. It's not GeForce FX annoying, but it's not as quiet as other high-end NVIDIA GPUs; then again, there are 1.4 billion transistors switching in there. If you have a silent PC, the GTX 280 will definitely un-silence it and put out enough heat to make the rest of your fans work harder. If you're used to a GeForce 8800 GTX, GTS, or GT, the noise will bother you. The problem is that returning to idle after gaming for a couple of hours leaves the fan unwilling to spin down as low as when you first turned your machine on.
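That reluctant spin-down is consistent with a fan controller that applies hysteresis: duty cycle ramps up quickly with temperature but is held high until the GPU has cooled well below the point that triggered it. A minimal sketch of that kind of curve, with every temperature and duty-cycle number invented for illustration:

```c
/* Sketch of a fan curve with hysteresis: duty rises promptly with
 * temperature, but while the GPU is still warm (above 55C here) the
 * controller refuses to lower it, matching the slow spin-down described
 * above. All numbers are illustrative assumptions. */
#include <stdio.h>

static int fan_duty(int temp_c, int prev_duty)
{
    int target;
    if (temp_c >= 85)      target = 100;
    else if (temp_c >= 75) target = 70;
    else if (temp_c >= 60) target = 45;
    else                   target = 30;

    /* Hysteresis: hold the higher duty until the GPU cools below 55C. */
    if (target < prev_duty && temp_c > 55)
        return prev_duty;
    return target;
}

int main(void)
{
    int temps[] = { 45, 70, 88, 80, 65, 58, 50 };  /* heat up, then cool */
    int duty = 30;
    for (int i = 0; i < 7; i++) {
        duty = fan_duty(temps[i], duty);
        printf("%dC -> fan %d%%\n", temps[i], duty);
    }
    return 0;
}
```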

While it's impressive that NVIDIA built this chip on a 65nm process, it desperately needs to move to 55nm.

Comments

  • Chaser - Monday, June 16, 2008 - link

Maybe I'm behind the loop here. The only competition this article refers to is some upcoming new Intel product, in contrast to an announced hard release of the next AMD GPU series a week from now?

  • BPB - Monday, June 16, 2008 - link

Well nVidia is starting with the high-end, high-priced items. Now we wait to see what ATI has and decide. I'm very much looking forward to the ATI release this week.
  • FITCamaro - Monday, June 16, 2008 - link

Yeah but for the performance of these cards, the price isn't quite right. I mean you can get two 8800GTs for under $400, and they typically outperform both the 260 and the 280. Yes, if you want a single card, these aren't too bad a deal. But even the 9800GX2 normally outperforms the 280.

So really I have to question the pricing on them. High end for a single-GPU card, yes. Better price/performance than last generation's cards, no. I just bought two G92 8800GTSs and now I don't feel dumb about it, because the two cards I paid $170 each for will still outperform the latest and greatest, which costs more.
  • Rev1 - Monday, June 16, 2008 - link

    Maybe lack of any real competition from ATI?
  • hadifa - Monday, June 16, 2008 - link


No, the reason is the high cost to produce: over a billion transistors, low yields, a 512-bit bus ...

Unfortunately, the high cost and the advanced tech don't translate to equally impressive performance at this stage. For example, if the card had much lower power usage under load, it would still have been considered a good move forward for delivering performance comparable to a dual-GPU solution while running much cooler on less demanding hardware.

As the review mentions, this card begs for a die shrink. That would make it use less power, cost less, run cooler, and even reach higher clocks.
  • Warren21 - Monday, June 16, 2008 - link

That competition won't come for another two weeks, but when it does -- rumour has it NV plans to lower their prices. Most preliminary info puts the HD 4870 at $299-329 with pretty much GTX 260 performance, if not, then biting at its heels.
  • smn198 - Tuesday, June 17, 2008 - link

You haven't seen anything yet. Check out this picture of the GTX2 290!! http://tinypic.com/view.php?pic=350t4rt&s=3
  • Mr Roboto - Wednesday, June 18, 2008 - link

    Soon it will be that way if Nvidia has their way.
