  • TA152H - Thursday, June 28, 2007 - link

    Because not everyone is going to run Rainbow Six, duh!!!!!!

    For some people these cards would be fine because they aren't running all the titles here, or they're willing to run them at lower resolutions so they don't have to hear some damn egg beater in their computer. Resolution isn't important to everyone; not everyone is some jackass kid who thinks blowing up space aliens at the highest resolution is what life is all about. Plenty of people would sacrifice some of that for something quieter and cooler. I would have thought that much was obvious.
  • DerekWilson - Thursday, June 28, 2007 - link

    We would have tested the 2400 Pro if we had been able to get a hold of one. AMD was not able to send us a 2400 Pro, so we'll have to wait until we can get one from one of their board partners.
  • DerekWilson - Thursday, June 28, 2007 - link

    I'm gonna disagree.

    DX9 is much more important in these tests. How many people used a 9500 or an FX 5600 to play any serious DX9 games (read: HL2 or better)? And how long did they have to wait before it finally mattered?

    The reason we do real world tests is because we want to evaluate how the card will behave in normal use. To the customer, the hardware is only as good as the software that runs on it. And right now the software that runs on these parts is almost exclusively DX9.

    It'll be at least a year or so before we see any really meaningful DX10 titles. Remember TRAOD, Tron 2.0, and Halo? Not the best DX9 implementations, even if they were among the first.

    DX10 tests are certainly interesting, and definitely relevant. But I think DX9 is much more important right now.
  • TA152H - Thursday, June 28, 2007 - link

    Yes, but you miss the point that these cards were made for DX10. There are already some titles out, and they will become more and more popular, although initially without all the features. DX9 obviously wasn't the focus of the product at all, so why make it yours?

    Let me ask you a simple question. If you were buying a card, even today, would you buy it for its DX9 performance or its DX10 performance? If you had the choice of two cards, one with obscenely bad DX9 performance but good DX10, and the other the reverse, which would you choose? I'd choose the one that performs well on DX10, because that's where things are going, and I'd put up with poor DX9 performance while new titles came out. However, these might suck on DX10 too; that's what we need to know.
  • swaaye - Thursday, June 28, 2007 - link

    Well, the Radeon 9700 didn't have too much trouble rocking DirectX 8 games. Nor did the GeForce FX (hell, that's all it was really good for). The G80 slaughters other cards in DirectX 9 games. I highly, highly doubt that these new cards are optimized for DirectX 10. How can they be? The first cards of each generation are usually disappointments in the new API.
  • TA152H - Thursday, June 28, 2007 - link

    You're missing the point. I'm not saying it will; I'm saying let's see.

    But let's be realistic: at this price, these cards aren't going to be extremely powerful, but they do have a great feature set for the money. For a lot of people, these are going to be good cards.

    Having said that, I'm inclined to agree they probably will not have great DX10 performance, but they didn't even test it. Strange, to say the least. Some of their decisions are baffling, and you wonder how much thought they actually put into them, if any.

    I also agree the first generation for a feature set isn't great. I'm not expecting much, but I'll withhold criticism until I see the results. Besides, in the case of the 2400, wouldn't you think that with this type of feature set, for $60 or so, it would be a very good product for a lot of people running Vista? It's not going to be for the alien blasters, of course, but don't you think it's got some market?
  • Tamale - Thursday, June 28, 2007 - link

    You make it sound like you'd never even play any of the games tested in this review. Wouldn't you be mad that your "midrange" card performed this badly on OLDER-technology games?

    I don't understand why anyone WOULDN'T care about DX9 performance when there are so many good DX9 games out there...
  • swaaye - Thursday, June 28, 2007 - link

    And before you rip me apart for bringing up the 9700 and telling me how awesome it was for DX9, remember that the mid-range 6600 GT beat it handily. Both were designed for the same API.
  • erple2 - Thursday, June 28, 2007 - link

    You're comparing apples to oranges here. Remember, the 9700 was the FIRST DX9 part available from ATI. The 6600GT was a second-generation DX9 part from NVIDIA. I WILL say that the 9700 was light-years ahead of the competing NVIDIA DX9 part, the FX 5800 Ultra.

    Your statement is more or less the same as saying the 9700 was crap because the 7600GT handily beat it (OK, I'm slightly exaggerating here...)

    The point is that this is a reversal of the DX9 situation. The 9700 did handily beat the 5800 in DX8 generation games. In this case, the 8800GTX handily beats the 2900XT (the jury's still out on the 8800GTS).

    I view it more like this: the 2600 appears similar to the horribly performing (in DX9) GeForce FX 5600. At least the 5600 did reasonably well in DX8 games...
  • swaaye - Friday, June 29, 2007 - link

    No, I agree that the HD 2600 and 2400 are reminiscent of the FX 5600 and FX 5200. They are pretty awful. And I'm not going to sit here and dreamily imagine 3x the performance when they're running more complex DX10 shader code. I think these cards are flops, and that's all they will ever really be. For non-gamers and HD video people, the only ones who should buy these, they will of course be fine.

    If you want to play games, don't jump to DX10 dreaming. How many years did it take for DX9 to become the only API in use? Years. DX9 arrived in 2002, and it only became the primary API a couple of years ago at best. UT2004, for example, is basically a DX7 engine. Guild Wars arrived with a DX8 renderer.

    DX9 had multiple OSes backing it; DX10 is Vista-only. Its adoption rate is likely to be slowed considerably by that, and by the fact that the only cards with remotely decent DX10 performance are $300+.

    I brought up the 9700 and 6600GT just to say that the first generation of cards for a new API is never very good at that API.
