Conclusion

Compared to AMD’s previous generation of bottom-tier cards, the Radeon HD 5450 doesn’t offer too many surprises. Cards at this end of the spectrum have to give up a lot of performance to meet their cost, power, and form-factor targets, and the 5450 is no different. It certainly produces playable framerates in most games (and even at high settings in some of them), and it’s going to be a great way to convince IGP users to move up to their first discrete GPU. But at the bottom of the market, spending a little more money has always bought a great deal more video card, and that hasn’t changed with the 5450.

The concern we have right now is the same one we’ve had with most of AMD’s other launches: the price. The card we tested is a $60 card, smack-dab in the middle of the territory occupied by the Radeon HD 4550, the DDR2 Radeon HD 4650, and the DDR2 GT 220. We don’t have the DDR2 cards on hand, but the performance gap between bottom-tier cards like the 5450 and those cards is wide enough that the DDR2 penalty won’t come close to closing it. If performance is all you need and you can’t spend another dime, then a last-generation card from the next tier up is going to offer more performance for the money. The 5450 does have DX11, but it’s not fast enough to make practical use of it.

Things look better for the 5450, however, once we move away from gaming performance. As a passively cooled low-profile card, its competition is the slower GeForce 210 and a handful of Radeon HD 4550s. The 4550 is still the better card from a performance standpoint, but not by a huge margin, while the 5450 runs cooler and draws less power.

Currently it’s HTPC use that puts the 5450 in the most favorable light. As the Cheese Slices test proved, it’s not quite the perfect HTPC card, but it’s very close. It’s certainly the best passively cooled card we have tested from an image quality perspective, and it’s the only passive card with audio bitstreaming. If you specifically want or need Vector Adaptive Deinterlacing, the Radeon HD 5670 is still the cheapest/coolest/quietest card that will meet your needs. But for everyone else the 5450 is plenty capable, and it's as close to perfect as we’ve seen any bottom-tier card get.

To that end the Sapphire card looks particularly good, since based on our testing Sapphire is able to drop the reference 5450's clumsy double-wide heatsink for a single-wide heatsink without the card warming up much more. For Small Form Factor PCs in particular, it's going to be a better choice than any card that uses the reference heatsink, so long as there's enough clearance for the portion of the heatsink on the back side of the card.

Moving away from the 5450 for a moment: besides the Radeon HD 5770, this is the only card in the 5000 series that is directly comparable to a 4000-series card. In fact it’s the most similar of the two, being virtually identical to the 4550 in terms of functional units and memory speeds. With this card we can finally pin down something we couldn’t quite establish with the 5770: clock-for-clock, the 5000 series is slower than the 4000 series.

This is especially evident with the 5450, which has a 50MHz core clock advantage over the 4550 and yet, with everything else held equal, still loses to the 4550 by upwards of 10%. The deficit seems to be worst in shader-heavy games, which leads us to believe the actual cause is the move from DX10.1 shader hardware on the 4000 series to DX11 shader hardware on the 5000 series. In other words, the shaders in particular seem to be what’s slower.

AMD made several changes here, including adding features for DX11 and rearranging the caching system for GPGPU use. We aren’t sure whether the slowdown is a hardware issue, or if it’s the shader compiler being unable to fully take advantage of the new hardware. It’s something that’s going to bear keeping an eye on in future driver revisions.

This brings us back to where we are today, with the launch of the 5450. AMD has finally pushed the final Evergreen chip out the door, bringing an end to their 6-month launch plan and bringing DirectX 11 hardware to everything from the top of their lineup to the bottom – all before NVIDIA could launch a single DX11 card. AMD is still fighting to get more 40nm production capacity, but the situation is improving daily, and even TSMC’s problems didn’t stop AMD from completing the rollout in 6 months. With the first Cedar card launched, we’ll now have a chance to see how AMD chooses to fill in the obvious gaps in its pricing structure, and more importantly how NVIDIA will ultimately respond to a fully launched 5000 series.


  • Purri - Monday, March 8, 2010 - link

    OK, so I read a lot of comments saying that the cheap passive DP adapters won't work for an Eyefinity 3-monitor setup.

    But can I use this card for a 3-monitor Windows desktop setup without Eyefinity, or do I need an expensive adapter for that too?

    I'm looking for a cheapish, passively (silently) cooled card that supports 3 monitors for Windows applications, with enough performance to play a few old games now and then (like Quake 3) on 1 monitor.

    Will this card work?
  • waqarshigri - Wednesday, December 4, 2013 - link

    Yes, of course, it has AMD Eyefinity technology... I've played new games on it like NFS: The Run, Call of Duty MW3, and Battlefield 3.
  • plopke - Friday, February 5, 2010 - link

    :o What about the 5830? Wasn't it delayed until the 5th? All the tech sites have suddenly gone quiet about it. And it wasn't launched today.
  • yyrkoon - Thursday, February 4, 2010 - link

    Your charts are all buggered up. Just looking over the charts: in Crysis: Warhead, you test the NVIDIA 9600GT for performance. OK, fine. Then we move along to the power consumption charts, and you omit the 9600GT for the 9500GT? Better still, we move to both heat tests, and both of these cards are omitted.

    WTH ?! Come on guys, is there something wrong with a bit of consistency ?
  • Ryan Smith - Friday, February 5, 2010 - link

    Some of those cards are out of Anand's personal collection, and I don't have a matching card. We have near-identical hardware that produces the same performance numbers; however we can't replicate the power/noise/temperature data due to differences in cases and environment.

    So I can put his cards in our performance tests, but I can't use his cards for power/temp/noise testing. It's not perfect, but it allows us to bring you the most data we can.
  • yyrkoon - Friday, February 5, 2010 - link

    Well, the only real gripe I have here is that I actually own a 9600GT. Since we moved last year and are completely off-grid (solar/wind), I would have liked to compare power consumption between the two without having to actually buy something to find out.

    Oh well, nothing can be done about it now I suppose.

    I can say, however, that a 9600GT in a P35 system with a Core 2 E6550, 4GB of RAM, and 4 Seagate Barracudas uses ~167-168W idle. While gaming, the most CPU/GPU-intensive games for me were World in Conflict and Hellgate: London; those two games "sucked down" 220-227W at the wall. This system was also moderately overclocked to get the memory and "FSB" to 1:1. These numbers are pretty close, but not super accurate – as close as I can come eyeballing a Kill A Watt while trying to produce a few numbers. The power supply was an 80 Plus 500W variant, manufactured by Seasonic if anyone must know (Antec EarthWATTS 500).
  • yyrkoon - Friday, February 5, 2010 - link

    Ah, I forgot. The numbers I gave for the "complete" system at the wall included powering a 19" widescreen LCD that consistently uses 23W.
  • dagamer34 - Thursday, February 4, 2010 - link

    Where's the low-profile 5650?? I don't want to downgrade my 4650 to a 5450 just for HD bitstreaming. =/
  • Roy2001 - Thursday, February 4, 2010 - link

    Video gaming is on the Xbox 360 and Wii, so the i3-530 for $117 is a better solution for me. It supports bitstreaming through HDMI too. My 2 cents.
  • Taft12 - Thursday, February 4, 2010 - link

    I apologize if this has been confirmed already, but does this mean we won't see a chip from ATI that falls between 5450 and 5670?

    There were four GPUs in this range last gen (4350, 4550, 4650, 4670)
