Introduction

We've known about the basic architecture of AMD's lower end DX10 hardware ever since mid May, but retail product hasn't made its way out the door until now. Finally launching today, and available within the next two weeks (says AMD), the Radeon HD 2400 XT and Pro and the Radeon HD 2600 XT and Pro will serve to bring competition to the $50 - $150 DX10 graphics card market. These are the cards that most people will actually end up purchasing, so both AMD and NVIDIA would like to come out on top in this market.

But even before we begin, we have to go back to the 8800 GTS 320 and talk about what a terrific value it is for people who want great performance and don't need ultra-high resolutions with AA cranked up. If $300 is in the budget for graphics, this is the way to spend it. We would really love to offer more flexibility in our recommendation, but both NVIDIA and AMD have seen fit to leave a huge gap in performance between their cheapest high-end parts and their most expensive low-end parts. We saw this with the 8600 GTS falling way short of the 8800 series, and we will see it again with the HD 2600 XT not even getting close to the 2900 XT.

AMD's price gap will be even larger than NVIDIA's, leaving a hole between $150 and $400 with nothing to fill it. This seems rather excessive, with no other real product lines hinted at until we see a product refresh down the line. When the 8600 series launched, we were quite disappointed with the performance of the part and hoped that AMD would step up to the plate and offer a real challenger that could fill the needs of midrange graphics hardware buyers everywhere. Instead, we are left with a sense of desolation and a feeling that neither AMD nor NVIDIA knows how to properly target the $200 - $300 price range. We would go so far as to say that neither camp offers a top-to-bottom DX10 lineup, but rather something more along the lines of top- and bottom-end solutions.

But regardless of what is lacking in their lineup, the new Radeon HD cards are aimed at filling a specific need. We will talk about what they bring to the table and how they manage to do the job AMD designed them for. First up is a brief look back at what's actually inside these GPUs.

UPDATE: In going back to add power tests, we discovered that the GeForce 8600 GTS we used had a slight overclock over the stock version. We have gone back and rerun our tests with the GeForce 8600 GTS at stock clock speeds, and our current graphs reflect the new data. The changes, generally on the order of 5%, did not have a significant impact on the overall outcome of the article. There are a couple of cases where the performance gap narrows, but the fact remains that the 8600 GTS is underpowered and the 2600 XT is generally more so.

We do apologize for the initial testing error, and we will certainly do everything we can to avoid such problems in the future.


Comments

  • TA152H - Thursday, June 28, 2007 - link

    While I see your points, I don't agree entirely. Does the 7600GT beat the 2400 in DX10? Well, no, it's not for that. So, I'd agree you wouldn't upgrade to the 2400 for DX9 performance, you might for DX10. But they didn't publish any results for DX10, so we don't know if it's even worthwhile for that.

    You make a good point with timing on an upgrade, but I'll offer a few good reasons. Sometimes you NEED something right now, either because your old card died, or because you're building a new machine, or whatever. So it's not always about an upgrade. And in these instances, I think most people will be very interested in DX10 performance, since we're already seeing titles for it coming out, and obviously DX9 will cease to be relevant in the future as DX10 takes over. The advantages of DX10 are huge and meaningful, and DX9 can't die fast enough. But it will take a while, because XP and old computers represent a huge installed base, and software companies will have to keep making DX9 versions along with DX10. But who will want to run them degraded?

    Also, Vista itself will be pushing demand for DX10, and XP is pretty much dead now. I don't think you'll see too many new machines with this OS on them, but Vista of course will continue to grow. Would you want a DX9 card for Vista? I wouldn't, and most people wouldn't, so they make these DX10 cards that at least nominally support DX10. Maybe that's all this is, just products made to fill a need for the company to say they have inexpensive DX10 cards to run Vista on. I don't know, because they didn't put out DX10 results; instead they wasted the review on DX9, which isn't why these cards were put out. And naturally, most of the idiots who post here eat it up without understanding that DX9 wasn't the point of this product. I won't be surprised if DX10 sucks too, and these were just first attempts that fell short, but I'd like to see results before I condemn the cards for being poor at something that wasn't their focus.
  • leexgx - Thursday, June 28, 2007 - link

    quote:

    and XP is pretty much dead now

    ?? How can XP be dead? Most offices will be using it, probably for the next 5-8 years. For home users, it's going to take 2-3 years before they upgrade, if they even bother; if it works, it works.


    I'd love to use Vista, but the drivers suck for it. M$ should demand fully working drivers before granting WHQL certification, not just "it works" and slapping a WHQL cert on it (NVIDIA's crappy chipset drivers and Creative's poor driver support, though that's M$'s fault for changing the driver model...).

    Most normal home users do not care what video card is in their PC, so the low-end video parts wouldn't bother them.
  • defter - Thursday, June 28, 2007 - link

    quote:

    I'd agree you wouldn't upgrade to the 2400 for DX9 performance, you might for DX10.


    ROFL. If the 2400 XT gets 9 fps in a "DX9" game at 1280x1024, do you expect that any real "DX10" game will be playable on it at any reasonable resolution?

    It's quite clear by now that in order to really use those DX10 features (i.e. run games with settings where there is a noticeable image quality difference compared to the DX9 codepath), you need an 8800/2900 class card. The DX10 performance of the 2600 series, let alone the 2400 series, is quite irrelevant.
  • Frumious1 - Thursday, June 28, 2007 - link

    Have you even tried any DX10 hardware? Because I have to say that it sounds like you're talking out of your ass! The reason I say that is because the current DX10 titles (which are really just DX9 titles with hacked in DX10 support) perform atrociously! Company of Heroes, for example, has performance cut in half or more when you enable DirectX 10 mode -- and that's running on a mighty 8800 GTX! (Yes, I own one.) It looks fractionally better (improved shadows for the most part) but the performance hit absolutely isn't worthwhile.

    Company of Heroes isn't the only example of games that perform poorly under DirectX 10. Call of Juarez and Lost Planet also perform worse in DirectX 10 mode than they do in DirectX 9 on the same hardware (all this testing having been done on HD 2900 XT and 8800 series hardware by several reputable websites - see FiringSquad among others). You don't actually believe that crippled hardware like the 8500/8600 or 2400/2600 will somehow magically perform better in DirectX 10 mode than the more powerful high-end hardware, do you?

    Extrapolate those performance losses over to hardware that is already about one third as powerful as the 8800/2900 GPUs, and this low-end DirectX 10 hardware is simply a marketing attempt. If it's slower in DX9 mode and the high-end stuff suffers in performance, why would having fewer ROPs, shader units, etc. help matters?

    This is blatant marketing -- marketing that people like you apparently buy into. "The 2600 isn't intended to perform well in DirectX 9. That's why it has DirectX 10 support! Wait until you see how great it performs in DirectX 10 -- something the 7600/7900 and X1950 type cards can't run at all because they lack the feature. Blah de blah blah...." Don't make me laugh! (Too late.)

    Radeon 9700 was the first DirectX 9 hardware to hit the planet, followed by the (insert derogatory adjectives) GeForce FX cards several months later. I bring it up because we're seeing the exact same thing in reverse this time: NVIDIA's 8800 hardware appears to be pretty fast all told, and almost 9 months after that hardware became available ATI's response can't even keep up! Anyway, how long did it take for us to finally get software that required DirectX 9 support? I would say Half-Life 2 is about as early as you can go, and that came out in November 2004. That was more than two years after the first DirectX 9 hardware (as well as the DirectX 9 SDK) was released -- August 2002 for the Radeon 9700, I believe.

    Sure, there were a few titles that tried to hack in rudimentary DirectX 9 support, but it wasn't until 2005 that we really started to see a significant number of games requiring DirectX 9 hardware for the higher-quality rendering modes. Given that DirectX 10 requires Windows Vista, I expect the transition to be even slower! I have a shiny little copy of Windows Vista sitting around my house. I installed it, played around with it, tried out a few DirectX 10 patched titles... and I promptly uninstalled. (Okay, truth be told I installed it on a spare 80GB hard drive, so technically I can still dual-boot if I want to.) Vista is a questionable investment at best, high-end DirectX 10 hardware is expensive as well, and the nail in the coffin right now is that we have yet to see any compelling DirectX 10 stuff. Thanks, but I'll pass on any claims of DX10 being useful for now.
  • Jedi2155 - Friday, June 29, 2007 - link

    Far Cry was February '04, and that was definitely full DX9 support.
  • coldpower27 - Friday, June 29, 2007 - link

    That was largely DX8 with some DX9 shaders mixed in, but the majority of the code was still running in DX8 mode.

    We're talking about native DX9 games, which would be something on the order of Oblivion or Neverwinter Nights 2.
  • titan7 - Saturday, June 30, 2007 - link

    D3D9 is basically D3D8, but with SM2.0 instead of SM1. Just because Oblivion doesn't support SM1 doesn't mean it is any more or less of a D3D9 title than one that also supported SM1 or OpenGL.
  • TA152H - Thursday, June 28, 2007 - link

    With any technology, the initial software and drivers are going to be sub-optimal, and not everyone is going to be playing at the same resolutions or configurations. But these cards aren't made for the ultimate gaming experience; they are low-end cards meant to handle less demanding tasks well enough for the mainstream user.

    What you don't seem to understand is that not everyone is a dork who plays games all day. Some people actually have lives, and they don't need high-end hardware to shoot aliens. For you, sure, get the high-end stuff, but that's not what you compare these cards to. A $60 card versus $800? Duh. Why would you even make the comparison? You judge how well they perform for their target, which is a low-end, feature-rich card. Will it work well with Vista and mainstream types of apps? Does the feature set and performance justify the cost?

    Only an idiot would use this type of card for a high end gaming computer, and only the same type of person would even compare it to that.
  • LoneWolf15 - Thursday, June 28, 2007 - link

    quote:

    With any technology, the initial software and drivers are going to be sub-optimal, and not everyone is going to be playing at the same resolutions or configurations. But these cards aren't made for the ultimate gaming experience; they are low-end cards meant to handle less demanding tasks well enough for the mainstream user.

    Yes, and by the time the technology and drivers (in this case, DirectX 10) ARE optimal, these cards will be replaced by a completely new generation, making them a pretty poor purchase at this time. Remember the timetable for going from DirectX 8 to DirectX 9?

    Derek is completely right at this point. If you're buying a card, you should buy based on what kind of DirectX 9 performance you're going to get, because by the time REAL DX10 games come out (and I'm going to bet that will be a minimum of 12 months from now, and I mean good ones, not some DX9 game with additional hacks) there will be product refreshes or new designs from both ATI and NVIDIA. Buying a card solely for DirectX 10 performance, especially a low-to-midrange one, is completely silly. If it's outclassed now, it will be far worse in a year. It really makes sense to either a) buy a minimum of a 320MB GeForce 8800 GTS, or b) stick with upper-middle to high-end last-gen hardware (read: Radeon X19xx, GeForce 79xx) until there's a reason to upgrade.
  • defter - Thursday, June 28, 2007 - link

    quote:

    And then, the most interesting part of all, you don't test, the 2400 Pro


    Yes, it would be interesting to know the performance of the 2400 Pro. After all, the faster 2400 XT got a whopping 8.9 fps at 1280x1024 with no FSAA and no AF in Rainbow Six, and 9.3 fps in Stalker with the same settings. I wonder if the 2400 Pro gets more than 6 fps?


    I actually don't see the point of the 2400 XT. If somebody just needs the multimedia functionality and doesn't care about gaming, then a $50 2400 Pro/8400 GS is a cheaper choice. If somebody cares about gaming, then the 2400 XT is useless, since the 8600 GT/2600 Pro are much faster and only slightly more expensive.
