Final Words

We had no problems expressing our disappointment with NVIDIA over the lackluster performance of their 8600 series. After AMD's introduction of the 2900 XT, we held some hope that perhaps they would capitalize on the huge gap NVIDIA left between their sub $200 parts and the higher end hardware. Unfortunately, that has not happened.

In fact, AMD went the other way and released hardware that performs consistently worse than NVIDIA's competing offerings. The only game that shows AMD hardware leading NVIDIA is Rainbow Six: Vegas. Beyond that, our 4xAA tests show that the mainstream Radeon HD lineup, which already lags in performance, scales even worse than NVIDIA's. Not that we really expect most people with this level of hardware to enable 4xAA, but it's still a disappointment.

Usually it's easier to review hardware that is clearly better or worse than its competitor under the tests we ran, but this case is difficult. We want to paint an accurate picture here, but it has become nearly impossible to speak negatively enough about the AMD Radeon HD 2000 series without sounding comically absurd.

Even with day-before-launch price adjustments, there is just no question that, in the applications the majority of people will be running, AMD has created a series of products that are even more unimpressive than the already less-than-stellar 8600 lineup.

While we will certainly concede that video decode capability may be a saving grace in some applications, the majority of end users are not saving their money for a DX10 class video card in order to play movies on their PC. For those who really are interested in this, stay tuned for an article comparing UVD and PureVideo coming next week.

We also won't have data on the performance of these cards under DX10 until next week. Maybe DX10 could make a difference, but we still won't have the full picture. These first DX10 games are more like DX9 titles running on a different API. Of course, this is a valid way to use DX10, but we will probably see more intense and demanding uses of DX10 when developers start targeting the new features as a baseline.

All we can do at this point is lament the sad state of affordable next-generation graphics cards and wait until someone at NVIDIA and AMD gets the memo that their customers would actually like to see performance that at least consistently matches previous-generation hardware. For now, midrange DX10 remains MIA.

Comments

  • TA152H - Thursday, June 28, 2007 - link

    Because not everyone is going to run Rainbow Six, duh!!!!!!

    For some people these cards would be fine because they aren't running all the titles here, or are willing to run them at lower resolutions so they don't have to hear some damn egg beater in their computer. Resolution isn't important to everyone, not everyone is some jackass kid that thinks blowing up space aliens with the highest degree of resolution is what life is all about and would be willing to sacrifice some of that for something that is quieter and cooler. I would have thought that much was obvious.
  • DerekWilson - Thursday, June 28, 2007 - link

    We would have tested the 2400 Pro if we had been able to get a hold of one. AMD was not able to send us a 2400 Pro, so we'll have to wait until we can get one from one of their board partners.
  • DerekWilson - Thursday, June 28, 2007 - link

    I'm gonna disagree.

    DX9 is much more important in these tests. How many people used a 9500 or an FX 5600 to play any serious DX9 games (read hl2 or better)? And how long did they have to wait for it when it finally mattered?

    The reason we do real world tests is because we want to evaluate how the card will behave in normal use. To the customer, the hardware is only as good as the software that runs on it. And right now the software that runs on these parts is almost exclusively DX9.

    It'll be at least a year or so before we see any real meaningful DX10 titles. Remember TRAOD, Tron 2.0 and Halo? Not the best DX9 implementations even if they were among the first.

    DX10 tests are certainly interesting, and definitely relevant. But I think DX9 is much more important right now.
  • TA152H - Thursday, June 28, 2007 - link

    Yes, but you miss the point that these cards were made for DX10. There are already some titles out, and they will become more and more popular, although initially without all the features. It obviously wasn't the focus of the product at all, so why make it yours?

    Let me ask you a simple question. If you were buying a card, even today, would you buy it for the performance of DX9, or DX10? If you had the choice of two cards, one that had obscenely bad DX9 performance, but good DX10, and the other the reverse, which would you choose? I'd choose the one that performs well on DX10, because that's where things are going, and I'd put up with poor DX9 performance while new titles came out. However, these might suck on DX10 too, that's what we need to know.
  • swaaye - Thursday, June 28, 2007 - link

    Well, Radeon 9700 didn't have too much trouble rocking DirectX 8 games. Nor did GeForce FX (hell that's all it was really good for). G80 slaughters other cards at DirectX 9 games. I highly, highly doubt that these new cards are optimized for DirectX 10. How can they be? The first cards of each generation are usually disappointments for the new APIs.
  • TA152H - Thursday, June 28, 2007 - link

    You're missing the point, I'm not saying it will, I'm saying let's see.

    But, let's be realistic, at the price of these cards, they aren't going to be extremely powerful, but they have a great feature set for the price. For a lot of people, these are going to be good cards.

    Having said that, I'm inclined to agree they probably will not have great DX10 performance, but they didn't even test it. Strange, to say the least. Some of their decisions are baffling, and you wonder how much thought they actually put into them, if any.

    I also agree the first generation for a feature set isn't great. I'm not expecting much, but I'll withhold criticism until I see the results. Besides, in the case of the 2400, wouldn't you think that with this type of feature set, for $60 or so, it would be a very good product for a lot of people running Vista? It's not going to be for the alien blasters, of course, but don't you think it's got some market?
  • Tamale - Thursday, June 28, 2007 - link

    you make it sound like you'd never even play any of the games tested in this review. wouldn't you be mad your "midrange" card performed this awful on OLDER technology games?

    i don't understand why anyone WOULDN'T care about dx9 performance when there are so many good dx9 games out there...
  • swaaye - Thursday, June 28, 2007 - link

    And before you rip me apart for bringing up 9700 and telling me how awesome it was for DX9, remember the mid-range 6600 GT beat it handily. Both are designed for the same API.
  • erple2 - Thursday, June 28, 2007 - link

    You're comparing apples to oranges here. Remember, the 9700 was the FIRST DX9 part available from ATI. The 6600GT was the second gen DX9 part from NVidia. I WILL say that the 9700 was light-years ahead of the nVidia competing DX9 part, the 5800XT.

    Your statement is more or less the same as saying that the 9700 was crap, because the 7600GT handily beat it (ok, I'm slightly exaggerating here...)

    The point is that this is a reversal of the DX9 situation. The 9700 did handily beat the 5800 in DX8 generation games. In this case, the 8800GTX handily beats the 2900XT (the jury's still out on the 8800GTS).

    I view this more like the 2600 appears similar to the horribly performing (in DX9) GeForce 5600.. At least the 5600 did reasonably well in DX8 games...
  • swaaye - Friday, June 29, 2007 - link

    No, I agree that the HD2600 and 2400 are reminiscent of the FX 5600 and FX5200. They are pretty awful. And I'm not going to sit here and dreamily imagine 3x the performance when they are running more complex DX10 shader code. I think these cards are flops and that's all they will really ever be. For non-gamers and HD video people, the only people who should buy these, they will of course be fine.

    If you want to play games, don't jump to DX10 dreaming. How many years did it take for DX9 to become the only API in use? Years. DX9 arrived in 2002 and only a couple of years ago at best was it becoming the primary API. UT2004, for example, is basically a DX7 engine. Guild Wars arrived with a DX8 renderer.

    DX9 had multiple OSes backing it. DX10 is Vista only. Its adoption rate is likely to really be slowed down due to this and the fact that the only cards with remotely decent DX10 performance are $300+.

    I brought up 9700 and 6600GT just to say that the first generation of cards for a new API is never very good at that API.
