Power, Temperature, & Noise

We only have one GT 430 card on hand, so unlike in past articles we cannot compile any data on the load voltages of this line of cards. Our lone Asus card has a load voltage of 1.08v and an idle voltage of 0.88v. Idle clocks are 50.6MHz for the core and 270MHz effective for the memory.

While GT 430 may not be competitive on a performance-per-dollar basis, it’s hard to argue with these power results. Even at these low wattages where our 1200W power supply isn’t very efficient, the GT 430 still delivers an idle power consumption level 7W under the Radeon 5570, and an even larger 11W advantage over the otherwise performance-superior Radeon 5670. We can’t measure the card’s actual power consumption and NVIDIA does not provide a reference level, but the GT 430 can’t be drawing more than a couple of watts here.
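Since we measure power at the wall rather than at the card, the deltas between cards understate the actual DC-side differences less than the absolute numbers suggest. As a rough illustration of that arithmetic, here is a minimal sketch; the efficiency figure is a hypothetical placeholder, not a measured value for our 1200W unit:

```python
def estimate_card_power_delta(wall_watts_a, wall_watts_b, psu_efficiency):
    """Estimate the DC-side power difference between two cards from
    at-the-wall (AC) readings, assuming both readings fall in a region
    where the PSU's efficiency is roughly constant."""
    return (wall_watts_a - wall_watts_b) * psu_efficiency

# Hypothetical numbers: an 11W at-the-wall idle gap, and ~75%
# efficiency at the very low loads where big PSUs do poorly.
delta = estimate_card_power_delta(161, 150, 0.75)
print(f"Estimated card-side idle difference: {delta:.2f}W")
```

The point of the sketch is simply that an 11W gap at the wall corresponds to a somewhat smaller gap at the card itself, with the exact figure depending on where the PSU's efficiency curve sits at these loads.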

Under load things are less rosy for the GT 430. Under Crysis it ends up drawing 10W over the 5570, which serves as yet another testament to the amazing level of performance-per-watt that AMD has been able to attain; remember that the 5570 had better performance at the same time. Under Furmark the situation is just as bad, but at this point we’re looking at a more pathological case. Overall 10W isn’t going to break the bank in an HTPC (especially since these cards would rarely get up to full clocks in the first place) but it’s something to consider if every watt is going to count.

When it comes to idle temperatures, the GT 430 is second to none. With the card only consuming a couple of watts at idle in the first place, its idle temperatures are barely above room temperature (and even closer to ambient case temperatures). It’s tied with the GTS 450, which impressed us last month with its highly capable cooler. Meanwhile our 5570, which is a low-profile card just like the GT 430, ends up being a good 10C higher. AMD’s higher idle power consumption directly translates to a higher idle temperature.

Looking at load temperatures, these results aren't all that surprising given the cooler in use. The Asus cooler is practically a passive cooler, as the small fan isn't capable of moving much air (an unusual choice, as Asus is normally aggressive about cooling). As a result the card reaches higher temperatures, but we're still only talking about 60C under Crysis and 72C under Furmark. This is somewhat worse than the 5570 and its larger fan, but not significantly so.

At idle the GT 430 is consistent with our other cards. With the exception of a couple of ridiculous cards like our GT 240 and, in this case, the 5570, everything is at roughly 42-44dB(A). The 5570 is more fan than heatsink, which is likely why it has such a poor showing here compared to the GT 430.

The payoff of accepting higher temperatures is less noise to contend with. The GT 430 never needs to ramp up its fan in our tests, delivering a load noise level even lower than the GTS 450, and only worse than a GT 220 that runs up against ambient noise levels. A good HTPC card needs to be silent, and Asus/NVIDIA have delivered on that here. It's not quite silent since it's not passive, but it's about as close as one can reasonably achieve. It's also noticeably better than the 5570, a card which is by no means noisy. If noise is a primary concern then the GT 430 is a very good candidate for an HTPC.
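It's worth remembering that dB(A) is a logarithmic scale, so even small differences between cards are meaningful. A short sketch of the conversion, using a hypothetical 3dB(A) gap as the example:

```python
def db_to_power_ratio(delta_db):
    """Convert a difference in dB into a sound-power ratio:
    every +10dB corresponds to a 10x increase in acoustic power."""
    return 10 ** (delta_db / 10)

# Hypothetical example: a card measuring 3dB(A) lower than another
# is emitting roughly half the acoustic power.
ratio = db_to_power_ratio(3)
print(f"3dB corresponds to a {ratio:.2f}x power ratio")
```

Perceived loudness doesn't track acoustic power exactly (a rough rule of thumb is that +10dB sounds about twice as loud), but the takeaway is the same: a few dB(A) between two cards is a bigger real-world difference than the raw numbers imply.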

120 Comments

  • heflys - Monday, October 11, 2010 - link

    Seriously?
  • Belard - Tuesday, October 12, 2010 - link

    Overall, this card isn't impressive at all... the pros are there, and AMD does need 3D and physics abilities.

    But at $80, it goes against the 5650 cards and easily loses.

    About HDMI 1.4b... it doesn't really matter. HDMI is dead... faster than it should be, but there is no future in it. CAT6-A/V will start replacing HDMI in 2011... all the big TV players are on board - they don't have to pay HDMI's licensing fees or use its special, expensive connectors and cabling.

    And HTPCs will not get very popular until the cable companies loosen up about people accessing channels like HBO, SHO, etc. Windows 7 Media Center is nice, but the interface is still rather weak for power users compared to some of the others out there. For example, the program grid is HORRIBLE... others allow 2~4 hour blocks and around 20 channels at a time... none of this 1.5hr / 6 channel junk. Oh, and the DRM of Media Center makes archiving your shows near impossible - like if you have to reinstall the OS or do a system upgrade.
  • heflys - Tuesday, October 12, 2010 - link

    According to most review sites, things like PhysX and 3d vision are nothing but gimmicks that contribute little to actual performance. Instead, most view them as pointless system hogs.
  • Belard - Tuesday, October 12, 2010 - link

    er... PhysX and 3D has never been about improving performance. It was about adding to the visual experience. Like Avatar looks great in 2D and 3D... but 3D sucks you in a bit more.

    Games like Mirror's Edge become more realistic with PhysX, even though it doesn't improve gameplay one bit.

    Those technologies are new, and until PhysX becomes shared/standard on all video cards it will be more gimmick than standard. But who knows...

    Hmmm... back around 1988, when computers were 8~16MHz, pretty much only Macs and Amigas had a native GUI OS; MS had horrible MS-DOS with 8.3 file names, no multi-tasking, horrible graphics, and forget about sound. Someone from the DOS camp said "Who needs graphics and sound, those are for toys. PCs are REAL computers."

    Uh huh. And now we have 1000MHz cell phones with 16GB of RAM.

    The 1986-vintage Amiga had graphics, sound and multi-tasking... was it a gimmick?
  • heflys - Tuesday, October 12, 2010 - link

    "Performance" was a typo on my part, since I clearly indicated that it was a system hog. PhysX, in most cases - as displayed in titles such as Mafia II - contributes little to nothing towards graphics in some games. Most players won't even notice such things as enhanced physics or improved decals. In fact, the most noticeable thing displayed in Mafia II was the presence of debris. Players will, however, notice the impressive amount of lag brought on by such features.

    3D Vision, as displayed in one review, reduced the GPU (a GTX 460 1GB) to unplayable frame rates. It essentially required the player to go to SLI. Which brings me to another point... Why are you bringing up PhysX or 3D Vision in regards to this product? You seriously think this cheap HTPC card could handle any of the above features, particularly when a 1GB 460 struggles to?

    And are we seriously comparing the Amiga to such an insignificant thing as cheesy video game effects? You can't be serious. Particularly when there are other physics engines (Havok being one of the most prominent) doing some of the same things.

    However, please tell me how PhysX made Mirror's Edge a more realistic experience. Particularly since that game, like Mafia II, only added physics to debris.
  • Belard - Tuesday, October 12, 2010 - link

    I agree with you on the first paragraph. We want better visual capabilities, but without the cost to general performance.

    This was one of the arguments of 3dfx's Voodoo3 vs. TNT cards - performance with 16-bit graphics vs. nVidia's 24-bit.

    When I played the JSF game around 1999-2000, the 16-bit limitation was noticeable BIG time on my Voodoo1, but the frame rate murdered the ATI card I had. It was a trade-off. This is a constant battle with our GPUs... remember when AA was added? Even today, AA affects the performance of every single video card - but unlike 8 years ago, it no longer renders most cards useless.

    Yeah, 3D Vision & PhysX are useless on the GF430... pretty much like ATI's Eyefinity tech doesn't belong on every ATI card (reduce the cost by $10, improve airflow) - especially at the low end, though it's very handy for business users.

    You said: "And are we seriously comparing the Amiga to such an insignificant thing as cheesy video game effects?"

    Yes, in that PhysX and 3D tech are still baby tech. In a few years, we'll start seeing 3D TVs that don't require glasses. PhysX or Havok or another becomes more standard - or perhaps MS adds it to DX12. It's going to be years before we see results from the latest technology. Just like the PC folks of the '80s who said the Amiga was a toy and computers didn't need graphics and sound. And yes, my Amigas still work.

    "Please tell me how PhysX made Mirror's Edge a more realistic experience." Look up the various side-by-side videos. It adds cloth effects, broken glass and yes, debris. A side-by-side example: http://www.youtube.com/watch?v=w0xRJt8rcmY - and check out Batman too.

    Of course, that didn't help to actually POPULATE the city of Mirror's Edge with people... funny, a huge modern city with only a few people and police, and with all that construction - where are the workers? Another example: a burger that is just meat and bread is bland... but add some tomatoes, lettuce and cheese and it becomes a better meal.
  • heflys - Tuesday, October 12, 2010 - link

    Thanks for the civil discussion....I half expected you to call me an idiot for some reason......Don't know why.....

    I think ATI's just going to bide its time with the 3d/Physics display, since at this point, they don't really need to invest in that platform. Maybe in the future.
  • Belard - Wednesday, October 13, 2010 - link

    Would it make you feel better if I did? :)

    I've been into computers for a long, long time - and I do my best NOT to be a fanboy. Give credit where credit is due... Apple, Intel, MS, AMD, Nvidia, Opera, Firefox, etc.

    What gives me/us the best deal at the time of purchase.

    In 2015, the graphics on our consoles (don't know about computers) will make today's GPUs look like a GeForce 5900/ATI 9700 in terms of performance and abilities.

    We'll see. Perhaps Archive this page?
  • drjonz - Tuesday, October 12, 2010 - link

    Why no comparison to integrated Intel Clarkdale? Many of us with HTPCs went with that since we're not gamers. I've been really happy with it. Maybe once per Blu-ray viewing I'll get a stutter. Not sure if it's because I'm underpowered or what. Would be cool to see what more I'd get for $100.
  • ganeshts - Tuesday, October 12, 2010 - link

    We mentioned the HQV score for Clarkdale (Intel HD Graphics) as 133, much lower than the 5570 and slightly lower than the 430.

    Please take a look at the Core 100 review we carried a few months back. It reviewed the Arrandale platform for HTPCs, and it is quite good for casual HTPC users.
