Power and Power Management

Power is a major concern for many tech companies going forward, and just adding features "because we can" isn't the modus operandi anymore. Now it's cool (pardon the pun) to focus on power management, performance per watt, and similar metrics. To that end, NVIDIA has beaten its GT200 into such submission that its 2D power consumption can reach as low as 25W. As we will show below, this can have a very positive impact on idle power for a very powerful bit of hardware.

These enhancements aren't breakthrough technologies: NVIDIA is just using clock gating and dynamic voltage and clock speed adjustment to achieve these savings. There is hardware on the GPU that monitors utilization and automatically switches the clocks between performance modes (off for Hybrid Power, 2D/idle, HD video, or 3D/performance). Mode changes happen on the millisecond level. This is very similar to what AMD has already implemented.
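To make that behavior concrete, here is a minimal sketch of utilization-driven mode selection. The thresholds, clocks, and voltages are illustrative assumptions, not NVIDIA's actual tables; real hardware also weighs what kind of work is being submitted, not just a utilization percentage.

```c
/* Sketch of utilization-driven performance-mode selection.
 * All numbers below are illustrative assumptions. */
#include <stdio.h>

typedef enum { MODE_OFF, MODE_IDLE_2D, MODE_HD_VIDEO, MODE_3D_PERF } perf_mode_t;

typedef struct {
    const char *name;
    int core_mhz;    /* hypothetical core clock   */
    int voltage_mv;  /* hypothetical core voltage */
} mode_params_t;

static const mode_params_t mode_table[] = {
    [MODE_OFF]      = { "off (Hybrid Power)",   0,    0 },
    [MODE_IDLE_2D]  = { "2D/idle",            300,  950 },
    [MODE_HD_VIDEO] = { "HD video",           400, 1000 },
    [MODE_3D_PERF]  = { "3D/performance",     602, 1150 },
};

/* Pick a mode from a 0-100% utilization sample, roughly the way an
 * on-die monitor might. */
static perf_mode_t select_mode(int utilization_pct, int video_decode_active)
{
    if (video_decode_active)  return MODE_HD_VIDEO;
    if (utilization_pct < 5)  return MODE_IDLE_2D;
    if (utilization_pct < 40) return MODE_HD_VIDEO;
    return MODE_3D_PERF;
}

int main(void)
{
    int samples[] = { 2, 30, 95, 3 };  /* utilization readings over time */
    for (unsigned i = 0; i < sizeof samples / sizeof samples[0]; i++) {
        perf_mode_t m = select_mode(samples[i], 0);
        printf("util %3d%% -> %-18s %4d MHz @ %4d mV\n",
               samples[i], mode_table[m].name,
               mode_table[m].core_mhz, mode_table[m].voltage_mv);
    }
    return 0;
}
```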

With a huge transistor count, a large die, and lots of memory, power isn't something that can stay low all the time. Eventually the hardware will actually have to do something, and then voltages will rise, clock speeds will increase, and power will be converted into dissipated heat and frames per second. It is hard to say which is more impressive: the power saving features at idle, or the power draw at load.
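For a sense of why lowering voltage and clock together pays off so well, dynamic switching power scales roughly with C·V²·f. The toy calculation below uses made-up voltage and clock figures (the 602 MHz load clock is the GTX 280's published core clock; the rest are guesses) and ignores leakage and clock gating, which cut idle power further still.

```c
/* Rough illustration of dynamic power scaling: P_dyn ~ C * V^2 * f.
 * Voltages and the idle clock are hypothetical placeholders. */
#include <stdio.h>

static double relative_dynamic_power(double v, double f,
                                      double v_ref, double f_ref)
{
    return (v * v * f) / (v_ref * v_ref * f_ref);
}

int main(void)
{
    /* assumed 3D mode: 1.15 V @ 602 MHz; assumed 2D/idle mode: 0.95 V @ 300 MHz */
    double rel = relative_dynamic_power(0.95, 300.0, 1.15, 602.0);
    printf("idle dynamic power ~ %.0f%% of load\n", rel * 100.0);  /* ~34%% */
    return 0;
}
```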

There is an in-between stage for HD video playback that runs at about 32W, and it is good to see some attention paid to this issue specifically. This bodes well for mobile chips based on the GT200 design, but on the desktop it isn't as mission critical. Yes, reducing power (and thus what I have to pay my power company) is a good thing, but plugging a card like this into your computer is like driving an exotic car: if you want the experience, you've got to pay for the gas.

Idle Power 

Idle power this low is definitely nice to see. Having high-end cards idle near midrange solutions from previous generations is a step in the right direction.

Load Power 

But as soon as we open up the throttle, that power miser is out the door and joules start flooding in by the bucket.

Cooling NVIDIA's hottest card isn't easy, and you can definitely hear the beast moving air. At idle, the GPU is as quiet as any other high-end NVIDIA GPU. Under load, as the GTX 280 heats up, the fan spins faster and moves much more air, which quickly becomes audible. It's not GeForce FX annoying, but it's not as quiet as other high-end NVIDIA GPUs; then again, there are 1.4 billion transistors switching in there. If you have a silent PC, the GTX 280 will definitely un-silence it and put out enough heat to make the rest of your fans work harder. If you're used to a GeForce 8800 GTX, GTS, or GT, the noise will bother you. The problem is that returning to idle after gaming for a couple of hours leaves the fan unwilling to spin down as low as it was when you first turned your machine on.
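One plausible explanation for the fan refusing to settle back down is a control loop with hysteresis running against a heat-soaked board. The sketch below uses guessed temperature thresholds and duty cycles, not NVIDIA's actual fan tables, just to show how such a curve can hold a higher speed after a long hot session.

```c
/* Minimal sketch of a temperature-driven fan curve with hysteresis.
 * Thresholds and duty cycles are illustrative guesses. */
#include <stdio.h>

static int fan_duty_pct(int temp_c, int prev_duty_pct)
{
    int target;
    if      (temp_c >= 90) target = 100;
    else if (temp_c >= 80) target = 70;
    else if (temp_c >= 65) target = 45;
    else                   target = 30;  /* cold-boot baseline */

    /* Crude hysteresis: refuse to step back down until the GPU has cooled
     * well below the ramp-up points, so after a hot session the fan
     * lingers above its cold-boot speed. */
    if (target < prev_duty_pct && temp_c > 60)
        return prev_duty_pct;
    return target;
}

int main(void)
{
    int duty = 30;                                 /* speed right after power-on */
    int trace[] = { 45, 75, 88, 92, 70, 62, 55 };  /* degrees C over time */
    for (unsigned i = 0; i < sizeof trace / sizeof trace[0]; i++) {
        duty = fan_duty_pct(trace[i], duty);
        printf("%2d C -> fan %d%%\n", trace[i], duty);
    }
    return 0;
}
```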

While it's impressive that NVIDIA built this chip on a 65nm process, it desperately needs to move to 55nm.


108 Comments


  • elchanan - Monday, June 30, 2008 - link

    VERY eye-opening discussion on TMT. Thank you for it.
    I've been trying to understand how GPUs can be competitive for scientific applications which require lots of inter-process communication, and "local" memory, and this appears to be an elegant solution for both.

    I can identify the weak points of it being hard to program for, as well as requiring many parallel threads to make it practical.

    But are there other weak points?
    Is there some memory-usage profile, or inter-process data bandwidth, where the trick doesn't work?
    Perhaps some other algorithm characteristic which GPUs can't address well?



  • Think - Friday, June 20, 2008 - link

    This card is a junk bond when taking into consideration cost/performance/power consumption.

    Reminds me of a 1976 Cadillac with a 7.7-litre V8 making only 210 horsepower at 3,600 rpm.

    It's a PIG.
  • Margalus - Tuesday, June 24, 2008 - link

    this shows how many people don't run a dual monitor setup. I would snatch up one of these 260/280s over the GX2s any day, gladly!!

    The performance may not be quite as good as an SLI setup, but it will be much better than a single card, which is what a lot of us are stuck with, since you CANNOT run a dual monitor setup with SLI!!
  • iamgud - Wednesday, June 18, 2008 - link

    "I can has vertex data"


    LOL

    These look fine, but need to be moved to 55nm. By the time I save up for one, they will be.
  • calyth - Tuesday, June 17, 2008 - link

    Well, what the heck are they doing with 1.4B transistors? That's the largest die TSMC has produced so far.
    The larger the core, the more likely a blemish will take out the whole core. As far as I know, didn't Phenom (4 cores on a die) suffer low-yield problems?
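The die-size point can be made concrete with the simple Poisson yield model, yield ≈ e^(−A·D), where A is die area and D is defect density. The sketch below uses the commonly cited ~576 mm² GT200 die area and a made-up defect density purely for illustration; it is not TSMC data.

```c
/* Back-of-the-envelope yield vs. die size, Poisson model: yield ~ exp(-A*D).
 * Defect density is a hypothetical figure for illustration only. */
#include <math.h>
#include <stdio.h>

static double poisson_yield(double area_mm2, double defects_per_mm2)
{
    return exp(-area_mm2 * defects_per_mm2);
}

int main(void)
{
    const double d0 = 0.001;                   /* hypothetical defects per mm^2 */
    double areas[] = { 192.0, 324.0, 576.0 };  /* small, mid, GT200-class dies */
    for (unsigned i = 0; i < sizeof areas / sizeof areas[0]; i++)
        printf("%4.0f mm^2 die -> ~%.0f%% yield\n",
               areas[i], 100.0 * poisson_yield(areas[i], d0));
    return 0;
}
```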
  • gochichi - Tuesday, June 17, 2008 - link

    You know, when you consider the price and you look at the benchmarks, you start looking for features and NVIDIA just doesn't have the features going on at all.

    COD4 -- Ran perfectly at 1920x1200 with last-gen stuff (the HD3870 and 8800GT(S)), so now the benchmarks have to be run at outrageous resolutions that only a handful of monitors can handle (and those customers already bought SLI or XFIRE, or GTX2, etc.)

    Crysis is a pig of a game, but it's not that great (it is a good technical preview though, I admit), and I don't think even these new cards really satisfy this system hog... so maybe this is a win, but I doubt too many people care... if you had an 8800GT or whatever, you've already played this game "well enough" on medium settings and are plenty tired of it. Though we'll surely fire it up in the future once our video cards "happen to be able to run it on high," very few people are going to go out of their way to spend $500+ on this silly title.

    In any case, then you look at ATI, and they have the HDMI audio and the DX 10.1 support, and all they have to do at this point is A) get a good price out the door, B) make a good profit (make them cheap, unlike these NVIDIA cards, which are no doubt expensive to make), and C) handily beat the 8800GTS, and many of us are going to be sold.

    These cards are what I would call a next-gen preview: some overheated prototypes of things to come. I doubt AMD will be as fast, and in fact I hope they aren't, just as long as they keep the power consumption in check, get the price right, and deliver the value (HDMI, DX10.1, etc.).

    Today's release reminded me that NVIDIA is the underdog; they are the company that released the FX series (desperate technology, like these are). ATI has been around since well before 3dfx made 3D accelerators. They were down for a bit, and we all said it was over for ATI, but this desperate release from NVIDIA makes me think that ATI is going to be quite tough to beat.

  • Brazofuerte - Tuesday, June 17, 2008 - link

    Can I go somewhere to find the exact settings used for these benchmarks? I appreciate the tech side of the write up but when it comes to determining whether I want one of these for my gaming machine (I ordered mine at midnight), I find HardOCP's numbers much more useful.
  • woofermazing - Tuesday, June 17, 2008 - link

    AMD/ATI isn't going to abandon the high end like your article implies. Their plan is to make a really good midrange chip and duct-tape two cores together a la the X2s. NVIDIA goes from the high end down, ATI from the midrange up. From the look of it, ATI might have the right idea, at least this time around. I seriously doubt we'll see a two-core version of this monster anytime soon.
  • DerekWilson - Tuesday, June 17, 2008 - link

    they are abandoning the high end single GPU ...

    we did state that they are planning on competing in the high end space with multiGPU cards, but that there are drawbacks to that.

    we'll certainly have another article coming out sometime soon that looks a little more closely at AMD's strategy.
  • KeypoX - Tuesday, June 17, 2008 - link

    i don't like it, not impressed either :(. Hopefully my 8800GT lasts for a while, far past this crap at least
