Conclusion

With the performance and price of the 5670, AMD has put itself in an interesting position, one with both good and bad consequences.

From a product perspective, AMD has positioned the 5670 against NVIDIA’s GT 240, and it completely dominates that card on every performance metric. Although the 8800 GT had already done a good job of nullifying the GT 240, the 5670 finishes the job. In a head-to-head comparison it’s faster, cooler, and more future-proof since it supports DX11. NVIDIA can’t and in fact isn’t going to maintain the $99 price point with the GT 240; as of this writing the average GT 240 price is closer to $80, effectively relegating it to another price bracket altogether. Ultimately this can’t be good for NVIDIA, since the Redwood GPU is smaller (and hence cheaper to produce) than the GT215 GPU at the heart of the GT 240.

Meanwhile, compared to the 4670, AMD is pricing the 5670 appropriately ahead of a card that has slipped down to $70 and below. As the 4670’s successor the 5670 is much faster, runs cooler, and sports a much better feature set, including audio bitstreaming. You’re going to have to pay for it, however, so the 4670 still has a purpose in life, at least until the 5500 series gets here.

Then we have the well-established cards: NVIDIA’s 9800 GT and AMD’s Radeon HD 4850. The 9800 GT can commonly be found for $99 or less, while the 4850 comes in and out of stock around that price point. AMD is continuing to manufacture the 4850 in spite of earlier reports that it had been EOL’d, so while it can be hard to find, it hasn’t been discontinued like the 4770 was. Given its availability, I’m not going to write it off.

So where does that leave the 5670? The 5670 does surprisingly well against the 9800 GT. It wins in some cases, trails very slightly in a few more, and outright loses only in games where the 5670 is already playable up to 1920x1200. From a performance standpoint I think the 9800 GT is ahead, but not by enough to matter; meanwhile the “green” 9800 GT narrows the gap even further, yet its power draw is still over 10W higher than the 5670’s. The 5670 is a good enough replacement for the 9800 GT in that respect, plus it has support for DX11, Eyefinity, and 3D Blu-ray when that launches later this year.

Then we have the 4850. The 4850 won’t last forever (at some point AMD will EOL it), but we can currently find a pair of them on Newegg for $99 each. In our existing games, the 4850 wins and it wins by a lot. While the 5670 clearly beats a GT 240 and is a good enough alternative to a 9800 GT, I can’t make a performance case against the 4850. The 4850 has more of everything, and that means it’s a much more capable card with today’s games.

AMD’s argument here is that the 4850 is an older card and doesn’t support everything the 5670 does. This is true – forgoing the 5670 means you lose DX11, bitstreaming audio, and Eyefinity, among other things. But while this and the much lower power draw make the 5670 a better HTPC card, I’m not sure it is a convincing argument for the 5670 as a pure gaming card.

To prove a point, we benchmarked the 5670 on some DX11 games using what we’d consider to be reasonable “medium” settings. For Battleforge we used the default Medium settings with SSAO set to Very High (to take advantage of its use of Compute Shader 5.0), and for the STALKER benchmark we also used Medium settings with Tessellation and Contact Shadows enabled. These are settings we believe a $99 card should be good enough to play at, with DX11’s big features in use.

Radeon HD 5670 DirectX 11 Performance

                        Battleforge DX11    STALKER DX11
Frames Per Second       19.4                27.2

The fact of the matter is that neither game is playable at those settings; the 5670 is simply too slow. This is a test that would be better served with more DX11 benchmarks, but based on our limited sample we have to question whether the 5670 is fast enough for DX11 games. If it’s not (and these results agree with that perspective), then being future-proof can’t justify the lower performance. Until AMD retires the 4850 it’s going to be the better gaming card, so long as you can deal with its greater power draw and larger physical size.

There’s really no way to reconcile the fact that, in the short term, the performance of cards at the $99 price point is going to get slower, so we won’t try. In an ideal world we’d move from the 4850 to a card with similar performance plus all of the 5670’s other advantages, but that isn’t going to happen until 5750 prices drop by about $30. On the flip side, at least the 5670 is significantly better than the GT 240.

Ultimately, AMD has produced a solid card. It’s not the 5850 or the 5750 – cards which immediately turned their price brackets upside down – but it’s fast enough to avoid the fate of the GT 240 and has enough features to stand apart. It’s a good HTPC card, and by pushing a DX11 card out at $99, buyers can at least get a taste of what DX11 can do even if the card isn’t quite fast enough to run it full-time (not to mention it further propagates DX11, an incentive for developers). Pure gamers can do better for now, but in the end it’s a good enough card.

Stay tuned, as next month we’ll have a look at the 5500 series and the 5450, finishing off AMD’s Evergreen chip stack.

Comments

  • Spoelie - Friday, January 15, 2010 - link

    Because the GPU can never truly be isolated, the CPU/memory/buses need to perform some work too to keep the GPU fed with data and instructions to process.
  • Slaimus - Thursday, January 14, 2010 - link

    It was not too long ago that the GeForce 6200 debuted at $150. Low-end gaming cards are slowly picking up in price again.
  • dagamer34 - Thursday, January 14, 2010 - link

    When do the low profile 5650/5670 cards come out? I've been waiting for one for my HTPC to bitstream the Blu-ray HD audio codecs.
  • SmCaudata - Thursday, January 14, 2010 - link

    Unless you already have an HTPC, why would anyone get this card? If building a new HTPC you could get a Clarkdale to bitstream the audio codecs.

    Also... why do we care if it is bitstreamed? I have a receiver that can decode this, but it doesn't matter whether the digital information is converted to PCM before or after the HDMI cable. The only advantage is to see those lights on the front of my receiver...
  • papapapapapapapababy - Thursday, January 14, 2010 - link

    future what? dx11 at 5fps? no thanks ati, remember the 4770? that was a good sub $100 card, (thanks) this crap is overpriced, $45 or bust.

  • TheManY2K3 - Thursday, January 14, 2010 - link

    Ryan,

    I don't see any of the applications at 12x10 include data for the 8800GT, however, you are comparing the 8800GT to the HD5670 in most applications.

    Could you include the 8800GT in the 12x10 data, so that we can accurately gauge the performance of the HD5670?
  • Ryan Smith - Thursday, January 14, 2010 - link

    The 8800 GT data was originally collected for past articles, where we started at 16x10. The 8800 GT isn't part of my collection (it's Anand's) so I wasn't able to get 12x10 data in time for this article.
  • silverblue - Thursday, January 14, 2010 - link

    It's probably fair to point out that, in most tests, the 5670 is very close to the 8800, and as such listing it may not mean anything. However, the 1280x1024 tests are also without AA - it might be nice to see the effect of turning AA on with this oldie but goodie as compared to the more modern competition, so including it may make sense. You may think that the higher core clock of the 5670 would give it an advantage without AA but if it goes anything like Batman, this would probably be an incorrect assumption as well.
  • pjladyfox - Thursday, January 14, 2010 - link

    Last I looked ANY Radeon card with the x5xx, x6xx, or x7xx model number was denoted as a mainstream card which is clearly noted here:

    http://en.wikipedia.org/wiki/Radeon#Product_naming...

    By that definition that means that these cards were designed to run in systems that have power supplies from 350 to 400w, support HD quality video, and support games at a resolution of no higher than 1440x900 at medium quality settings with 2x AA and 8x anisotropic filtering. By putting them at settings that most will not run these cards at it makes these results for the most part worthless.

    I mean who cares how these cards run at 1920x1200 at high detail settings since we already know they're going to fail anyway? I'm more interested in how these run with all the details on at say 1440x900 or possibly 1680x1050 which are the more common widescreen monitors most people have.

    For that matter where are details about how these cards compare running HD quality video, if the fan speed can be controlled via speedfan, or even if they have fixed some of the video quality issues like black crush when outputting via HDMI?
  • Ryan Smith - Thursday, January 14, 2010 - link

    We traditionally run every game at 3 resolutions, particularly since some games are more GPU-intensive than others. Based on the 12x10 performance, I decided that 10x7 would be silly and instead went up a level to 19x12 - a few games were actually playable even at those high resolutions.

    16x10 is accounted for, and 12x10 is close enough to 14x9 that the results are practically the same.

    HD Video: All the 5000 series cards, except perhaps the Cedar are going to be exactly the same.

    Fan speed: Can be modified (I use it to cool down the cards before extraction after running FurMark)

    Black Crush: I honestly have no idea
