Conclusion

Compared to AMD’s previous generation of bottom-tier cards, the Radeon HD 5450 doesn’t offer too many surprises. Cards at this end of the spectrum have to give up a lot of performance to meet their cost, power, and form-factor needs, and the 5450 is no different. It certainly produces playable framerates for most games (and even at high settings for some of them), and it’s going to be a great way to convince IGP users to move up to their first discrete GPU. But at the bottom tier, spending a little more money has always bought a significantly more powerful video card, and this hasn’t changed with the 5450.

The concern we have right now is the same concern we’ve had for most of AMD’s other launches, which is the price. The card we tested is a $60 card, smack-dab in the middle of the territory for the Radeon HD 4550, the DDR2 Radeon HD 4650, and the DDR2 GT 220. We don’t have the DDR2 cards on hand, but the performance gap between bottom-tier cards like the 5450 and those next-tier cards is large enough that the DDR2 penalty won’t come close to closing it. If performance is all you need and you can’t spend another dime, then a last-generation card from the next tier up is going to offer more performance for the money. The 5450 does have DX11, but it’s not fast enough to make practical use of it.

Things are more in the 5450’s favor, however, when we move away from gaming performance. For a passively cooled low-profile card, its competition is the slower GeForce 210 and a few Radeon HD 4550s. The 4550 is still the better card from a performance standpoint, but it’s not a huge gap. Meanwhile the 5450 runs cooler and is less power hungry.

Currently it’s HTPC use that puts the 5450 in the most favorable light. As the Cheese Slices test proved, it’s not quite the perfect HTPC card, but it’s very close. Certainly it’s the best passively cooled card we have tested from an image quality perspective, and it’s the only passive card with audio bitstreaming. If you specifically want or need Vector Adaptive Deinterlacing, the Radeon HD 5670 is still the cheapest/coolest/quietest card that’s going to meet your needs. But for everyone else the 5450 is plenty capable and is as close to being perfect as we’ve seen any bottom-tier card get.

To that end the Sapphire card looks particularly good, since based on our testing Sapphire is able to drop the reference 5450’s clumsy double-wide heatsink for a single-wide heatsink without the card warming up too much more. For Small Form Factor PCs in particular, it’s going to be a better choice than any card that uses the reference heatsink, so long as there’s enough clearance for the portion of the heatsink on the back side of the card.

Moving away from the 5450 for a moment: besides the Radeon HD 5770, this is the only other card in the 5000-series that is directly similar to a 4000-series card. In fact it’s the most similar, being virtually identical to the 4550 in terms of functional units and memory speeds. With this card we can finally pin down something we couldn’t quite do with the 5770: clock-for-clock, the 5000-series is slower than the 4000-series.

This is especially evident on the 5450, which has a 50MHz core clock advantage over the 4550 and yet, with everything else held equal, still loses to the 4550 by upwards of 10%. The deficit seems to be worst in shader-heavy games, which leads us to believe that the actual cause is the move from DX10.1 shader hardware on the 4000-series to DX11 shader hardware on the 5000-series. Or in other words, the shaders in particular seem to be what’s slower.
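
To put a rough number on the size of that per-clock deficit, here is a minimal back-of-the-envelope sketch. It assumes the reference core clocks of 650MHz for the 5450 and 600MHz for the 4550 and the ~10% deficit noted above; the result is illustrative, not a measured per-clock figure.

    # Rough estimate of the implied per-clock deficit. The clocks and the
    # 10% figure come from the discussion above; this is illustrative only.
    clock_5450 = 650.0    # MHz, reference HD 5450 core clock (assumed)
    clock_4550 = 600.0    # MHz, reference HD 4550 core clock (assumed)
    relative_perf = 0.90  # 5450 performance relative to the 4550 (losing by ~10%)

    clock_advantage = clock_5450 / clock_4550          # ~1.083, an ~8% clock advantage
    per_clock_ratio = relative_perf / clock_advantage  # ~0.83
    print(f"implied per-clock deficit: {1 - per_clock_ratio:.0%}")  # roughly 17%

In other words, losing by 10% while holding an 8% clock advantage suggests the new shader hardware is giving up somewhere in the neighborhood of 15-20% of its per-clock throughput in these cases.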

AMD made several changes here, including adding features for DX11 and rearranging the caching system for GPGPU use. We aren’t sure whether the slowdown is a hardware issue, or if it’s the shader compiler being unable to fully take advantage of the new hardware. It’s something that’s going to bear keeping an eye on in future driver revisions.

This brings us back to where we are today, with the launch of the 5450. AMD has finally pushed the final Evergreen chip out the door, bringing an end to their 6 month launch plan and bringing DirectX 11 hardware to everything from the top of their lineup to the bottom – and all before NVIDIA could launch a single DX11 card. AMD is still fighting to get more 40nm production capacity, but the situation is improving daily, and even TSMC’s problems didn’t stop AMD from pulling this off entirely within 6 months. With the first Cedar card launched, we’re now going to have a chance to see how AMD chooses to fill in the obvious gaps in their pricing structure, and more importantly how NVIDIA will ultimately end up responding to a fully launched 5000-series.

Comments

  • andy o - Thursday, February 4, 2010 - link

    [I got an error, so sorry if this is posted twice.]

    It's not overclocking at all. PowerPlay is, as the poster above said, for power efficiency only; it doesn't overclock the card, it underclocks it when the GPU is not being stressed.

    If you're referring to one of the posts that requires you to enable Overdrive, notice that it's only being enabled so you can stabilize the clock (thus effectively disabling PowerPlay); the GPU/memory are actually being underclocked by editing an XML file and lowering the clocks manually via Overdrive.
  • ATWindsor - Thursday, February 4, 2010 - link

    First of all, it's not really an "audiophile feature" to get audio without dropouts and other problems over HDMI; it's devastating for the audio whether you are an audiophile or not. Secondly, PowerPlay is also used for power efficiency. The result is that HDMI audio doesn't work with the default settings for many people, which is a pretty major issue.

    AtW
  • andy o - Thursday, February 4, 2010 - link

    OK, so hyperlinks aren't working.

    This is the first thread I linked:
    http://www.avsforum.com/avs-vb/showthread.php?p=17...

    This is the doom9 thread:
    http://forum.doom9.org/showthread.php?p=1359418#po...

    ATI is giving some users the runaround.
  • ereavis - Thursday, February 4, 2010 - link

    Try this ATI hotfix:
    http://support.amd.com/us/kbarticles/Pages/ATICata...
  • andy o - Thursday, February 4, 2010 - link

    Already did, and it's the same with 9.11, 9.12, the 9.12 hotfix, 10.1, and version 8.70RC2 (presumably 10.2 RC2).
  • toyota - Thursday, February 4, 2010 - link

    Am I missing something? Why are you saying the Far Cry 2 benchmark can't go lower than high settings? All you have to do is select DX9 and you can choose low or medium from there.
  • Ryan Smith - Thursday, February 4, 2010 - link

    We stick to DX10 mode for benchmarking DX10-enabled games. In fact I never even tried DX9; otherwise I would have noticed that it goes lower.

    Humm...
  • toyota - Thursday, February 4, 2010 - link

    Well, anybody trying to game on this thing will have to use whatever realistic playable settings are available. That means DX9 would need to be used for Crysis/Warhead and Far Cry 2.
  • andy o - Thursday, February 4, 2010 - link

    That option has been there for a while, but there's no info on what exactly it does.
  • Ryan Smith - Thursday, February 4, 2010 - link

    Frankly, AMD has never documented it well. It has something to do with working with Windows MMCSS and DXVA to do exactly what the name describes, and that's all I know off-hand.

    It's aptly named though; I've seen a number of examples where enabling it does what's on the label.
