4 chips in 6 months reaches its end today, with the launch of the final chip in AMD’s Evergreen stack: Cedar. Cedar, the baby of the family, will be powering AMD’s bottom-tier cards. Today we’re seeing the first of what we expect will be a couple of Cedar cards with the launch of the Radeon 5450.

|  | AMD Radeon HD 5670 | AMD Radeon HD 4670 | AMD Radeon HD 5450 | AMD Radeon HD 4550 |
| --- | --- | --- | --- | --- |
| Stream Processors | 400 | 320 | 80 | 80 |
| Texture Units | 20 | 32 | 8 | 8 |
| ROPs | 8 | 8 | 4 | 4 |
| Core Clock | 775MHz | 750MHz | 650MHz | 600MHz |
| Memory Clock | 1000MHz (4000MHz data rate) GDDR5 | 1000MHz (2000MHz data rate) GDDR3 | 800MHz (1600MHz data rate) DDR3 | 800MHz (1600MHz data rate) DDR3 |
| Memory Bus Width | 128-bit | 128-bit | 64-bit | 64-bit |
| Frame Buffer | 1GB / 512MB | 1GB / 512MB | 1GB / 512MB | 1GB / 512MB |
| Transistor Count | 627M | 514M | 292M | 242M |
| TDP | 61W | 59W | 19.1W | 25W |
| Manufacturing Process | TSMC 40nm | TSMC 55nm | TSMC 40nm | TSMC 55nm |
| Price Point | $99 / $119 | $60-$90 | $49-$59 | $35-$55 |


It should come as little-to-no surprise that Cedar and the Radeon 5450 finally deviate from the rule-of-2 that has marked the difference between the other Evergreen family members. Whereas each of the larger Evergreen chips has effectively been ½ of its bigger sibling, Cedar cuts right to the bone. It has half as many ROPs as the Redwood-powered Radeon 5670, but only 40% of the texturing capacity, and a mere 20% of the shader capacity. As has always been the case for video cards, once you drop below $100 you have to start sacrificing a lot of hardware to meet lower price targets, and Cedar is no different.
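As a quick sanity check on those ratios, here is a minimal Python sketch using only the figures from the spec table above:

```python
# Functional-unit counts from the spec table above.
specs_5670 = {"stream_processors": 400, "texture_units": 20, "rops": 8}
specs_5450 = {"stream_processors": 80, "texture_units": 8, "rops": 4}

# Cedar's share of Redwood's hardware: 20% of the shaders,
# 40% of the texture units, and 50% of the ROPs.
for unit, count_5670 in specs_5670.items():
    ratio = specs_5450[unit] / count_5670
    print(f"{unit}: {ratio:.0%} of the 5670")
```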

By throwing out all of those functional units along with GDDR5 support, Cedar comes in at a slender 292M transistors with a die size of 59mm². This is a little less than half the transistor count of Redwood while being a little more than half the die size. The limited reduction in transistor count in spite of the significant reduction in shader capabilities is an excellent example of what makes cutting a design down to budget levels such a tricky proposition. AMD won’t release a die shot of Cedar (or anything else of Evergreen for that matter), but it’s a safe assumption that most of Cedar is occupied by fixed and semi-fixed units such as the PCIe controller, UVD2.2, and AMD’s fixed-function rendering pipeline. AMD can’t scale down any of these units like they can shaders, hence shaders had to take the brunt of the cuts to get a sub-300M transistor chip.
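The Cedar-to-Redwood comparison works out as follows; a back-of-the-envelope sketch where the transistor counts and Cedar’s 59mm² die come from above, while Redwood’s ~104mm² die size is our own assumption, not a figure from this article:

```python
# Back-of-the-envelope scaling check for Cedar vs. Redwood.
# Redwood's ~104mm2 die size is an assumed figure, not from the text above.
cedar_transistors, redwood_transistors = 292e6, 627e6
cedar_die_mm2, redwood_die_mm2 = 59, 104

# ~47% of the transistors on ~57% of the die area:
# a little less than half the transistors, a little more than half the die.
print(f"transistors: {cedar_transistors / redwood_transistors:.1%} of Redwood")
print(f"die area:    {cedar_die_mm2 / redwood_die_mm2:.1%} of Redwood")
```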

Attached to Cedar is a 64-bit memory bus, which as we stated before rules out GDDR5 support. Instead Cedar will be paired with DDR2 and DDR3, with today’s launch card being a DDR3 variant clocked at 800MHz. This also makes the 5450 the first 5000-series card to launch with something other than GDDR5, which has otherwise been paired with everything from the 5970 to the 5670.

Compared to the RV710 chip at the core of the Radeon 4350 and 4550, Cedar and the 5450 are virtually identical. Cedar has the same number of functional units and the same memory interface running at the same speeds, making the 5450 the closest thing yet to a 4000-series card with DX11 and Eyefinity functionality. Cedar is even pin-compatible with RV710, so manufacturers can drop it into existing Radeon 4350/4550 designs. And just to put things in perspective, in spite of these similarities Cedar is 50M transistors larger than RV710, which means AMD spent most of their gains from the move to the 40nm process on adding Evergreen family features while ending up with only a slightly smaller chip. It’s a safe bet, then, that we’ll see AMD double up on functional units at the next die shrink.

One of the advantages of throwing out so much shader hardware and dropping GDDR5 is that the power usage of the card comes down significantly, playing well into the low-power nature of budget video cards. AMD specs the 5450 at a mere 19.1W TDP and an idle power usage of 6.4W; the former is more than two-thirds lower than the 5670’s 61W. Lower clockspeeds also play a part here, as the 5450 is the lowest-clocked 5000-series card yet, with a core clock of 650MHz.
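The two-thirds figure checks out; a quick sketch using the TDPs from the spec table above:

```python
# TDP figures (watts) from the spec table above.
tdp_5670 = 61.0
tdp_5450 = 19.1

# (61 - 19.1) / 61 is roughly 68.7%, i.e. more than two-thirds lower.
reduction = (tdp_5670 - tdp_5450) / tdp_5670
print(f"TDP reduction vs. 5670: {reduction:.1%}")
```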

It goes without saying that as a budget card, the 5450 is not targeted at hardcore gamers; instead the target market is a mix of buyers who need their first real GPU on a tight budget. This means pushing the Radeon 5450’s UVD/HTPC capabilities, Eyefinity, GPGPU acceleration, and its significantly improved gaming performance over IGP solutions. AMD is making sure to tag the 5450 as a DX11 card too, but as we established in our 5670 review, cards at this performance level are too slow to take advantage of DX11’s wonder features; the tessellator is probably the only DX11 feature that will see any action on cards this slow.

AMD is framing the 5450 as competition for NVIDIA’s bottom-tier GeForce, the GeForce 210. From a power and form factor standpoint this is a good matchup; however, the 210 uses an even smaller GPU than Cedar along with DDR2 memory, which means there’s certainly a performance difference but also a pricing difference, since NVIDIA should be able to build the 210 for less. Pricing-wise the 5450 also competes with the DDR2 GeForce GT 220, the Radeon 4550, and the Radeon 4650, all of which can be found for around the same price, if not lower in the case of the 4550.

Meet the 5450
76 Comments

  • Purri - Monday, March 08, 2010 - link

    Ok, so I read a lot of comments that the cheap passive DP adapters won't work for an Eyefinity 3-monitor setup.

    But can I use this card for a 3-monitor Windows desktop setup without Eyefinity, or do I need an expensive adapter for this too?

    I'm looking for a cheapish, passively (silently) cooled card that supports 3 monitors for Windows applications, with enough performance to play a few old games now and then (like Quake 3) on 1 monitor.

    Will this card work?
  • waqarshigri - Wednesday, December 04, 2013 - link

    Yes, of course, it has AMD Eyefinity technology. I played new games on it like NFS: The Run, Call of Duty MW3, and Battlefield 3.
  • plopke - Friday, February 05, 2010 - link

    :o What about the 5830? Wasn't it delayed until the 5th? All the tech sites have suddenly gone quiet about it, and it wasn't launched today.
  • yyrkoon - Thursday, February 04, 2010 - link

    Your charts are all buggered up. Just looking over the charts, in Crysis: Warhead you test the NVIDIA 9600GT for performance. Ok, fine. Then we move along to the power consumption charts, and you omit the 9600GT for the 9500GT? Better still, we move to both heat tests, and both of these cards are omitted.

    WTH ?! Come on guys, is there something wrong with a bit of consistency ?
  • Ryan Smith - Friday, February 05, 2010 - link

    Some of those cards are out of Anand's personal collection, and I don't have a matching card. We have near-identical hardware that produces the same performance numbers; however we can't replicate the power/noise/temperature data due to differences in cases and environment.

    So I can put his cards in our performance tests, but I can't use his cards for power/temp/noise testing. It's not perfect, but it allows us to bring you the most data we can.
  • yyrkoon - Friday, February 05, 2010 - link

    Well, the only real gripe that I have here is that I actually own a 9600GT. Since we moved last year and are completely off-grid (solar/wind), I would have liked to compare power consumption between the two without having to actually buy something to find out.

    Oh well, nothing can be done about it now I suppose.

    I can say, however, that a 9600GT in a P35 system with a Core 2 E6550, 4GB of RAM, and 4 Seagate Barracudas uses ~167-168W idle. While gaming, the most CPU/GPU-intensive games for me were World in Conflict and Hellgate: London. The two games "sucked down" 220-227W at the wall. This system was also moderately overclocked to get the memory and "FSB" at 1:1. These numbers are pretty close but not super accurate; as close as I can come eyeballing a Kill A Watt while trying to create a few numbers. The power supply was an 80 Plus 500W variant, manufactured by Seasonic if anyone must know (Antec EarthWatts 500).
  • yyrkoon - Friday, February 05, 2010 - link

    Ah, I forgot. The numbers I gave for the "complete" system at the wall included powering a 19" WS LCD that consistently uses 23W.
  • dagamer34 - Thursday, February 04, 2010 - link

    Where's the low-profile 5650? I don't want to downgrade my 4650 to a 5450 just for HD bitstreaming. =/
  • Roy2001 - Thursday, February 04, 2010 - link

    Video gaming is on the Xbox 360 and Wii, so an i3-530 for $117 is a better solution for me. It supports bitstreaming through HDMI too. My 2 cents.
  • Taft12 - Thursday, February 04, 2010 - link

    I apologize if this has been confirmed already, but does this mean we won't see a chip from ATI that falls between the 5450 and 5670?

    There were four GPUs in this range last gen (4350, 4550, 4650, 4670).
