A Faster, Cheaper High-End

While the X1900 XTX made its debut at over $600 USD, this new product launch sees a card with a bigger, better HSF and faster memory debuting at a much lower "top end" price of $450. Quite a few factors play into this, not the least of which is the relatively small performance improvement over the X1900 XTX. We never recommended the X1900 XTX over the X1900 XT due to the small performance gain, but those small differences add up, and with ATI turning its back on the X1900 XTX for its replacement, we can finally say that there is a tangible difference between the top two cards ATI offers.

This refresh part doesn't change as much as other refresh parts have, but the price and performance are about right for what we are seeing. Until ATI brings out a new GPU, it will be hard for them to offer any volume of chips that run faster than the X1950 XTX. The R5xx series is a very large 384 million transistor slice of silicon that draws power like it's going out of style, but there's nothing wrong with using the brute force method every once in a while. The features ATI packed into the hardware are excellent, and now that the HSF is much less intrusive (and the price is right) we can really enjoy the card.

Speaking of the thermal solution, it is worth noting that ATI has put quite a bit of effort into improving the aural impact of its hardware. The X1900 XTX is not only the loudest card around, but it also possesses a shrill and quite annoying sound quality. In contrast, the X1950 XTX is not overly loud even during testing when the fan runs at full speed, and the sound is not as painful to hear. We are also delighted to find that ATI no longer spins the fan at full speed until the drivers load. After the card spins up, it remains quiet until it gets hot. ATI has also upgraded its onboard fan header to a 4-pin connection (following in the footsteps of NVIDIA and Intel), allowing finer-grained control over fan speed.

While the X1950 XTX is not as quiet as NVIDIA's 7900 GTX solution, it is absolutely a step in the right direction. That's not to say there aren't some caveats to this high end launch.

Even before the introduction of SLI, every NVIDIA GPU had the necessary components in silicon to support multiple GPU configurations. Adding an "over the top" SLI bridge connector to cards means that nearly every NVIDIA card sold is capable of operating in multi-GPU mode. While lower end ATI products don't require anything special to work in tandem, the higher end products have needed a special "CrossFire" branded card with an external connector and dongle capable of receiving data from a slave card.

While this isn't necessarily a bad solution to the problem, it is certainly less flexible than NVIDIA's implementation. In the past, running a high end multi-GPU ATI configuration required a master card that was both clocked lower than ATI's fastest cards and more expensive. With the introduction of X1950 CrossFire, we finally have an ATI multi-GPU solution available at the highest clock speed ATI offers and at the same price as the non-CrossFire card.

While this may not be a problem for us, it might not end up making sense for ATI in the long run. Presumably, they will see higher margins on the non-CrossFire X1950 card, but the consumer sees no benefit from staying away from CrossFire. (Note that the CrossFire cable still offers a second DVI port.) In fact, the benefits of having a CrossFire version are fairly significant in the long run. As we mentioned, two CrossFire cards can be run together with no problem, and each card can serve as a master in another system, offering greater flexibility and a higher potential resale value in the future.

If the average consumer realizes the situation for what it is, we could see some bumps in the road for ATI. It's very likely that we will see lower availability of CrossFire cards, as the past has shown a lower demand for such cards. Now that ATI has taken the last step in making their current incarnation of multi-GPU technology as attractive and efficient as possible, we wouldn't be surprised if demand for CrossFire cards comes to completely eclipse demand for the XTX. If demand does go up for the CrossFire cards, ATI will either have a supply problem or a pricing problem. It will be very interesting to watch the situation and see which it will be.

Before we move on to the individual game tests, let's take a look at how the X1950 XTX stacks up against its predecessor, the X1900 XTX. Mouse over the links below the image to see the performance difference between the X1950 XTX and the X1900 XTX at that resolution.


1280 x 1024 | 1920 x 1440 | 2048 x 1536

For our 29% increase in memory clock speed, we are able to gain at most an 8.5% performance increase in SC:CT. This actually isn't bad for just a memory clock speed boost. Battlefield 2 without AA saw the least improvement, with a maximum gain of 2.3% at our highest resolution.
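To put that 29% figure in context (assuming the commonly cited memory clocks of 775MHz GDDR3 on the X1900 XTX versus 1000MHz GDDR4 on the X1950 XTX, or 1.55GHz versus 2.0GHz effective), the increase works out to (1000 - 775) / 775 ≈ 29%. That the best case we measure is only an 8.5% gain suggests these cards are only partially memory bandwidth limited at the settings we tested.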

Our DirectX games seem to show a consistently higher performance improvement with AA enabled due to memory speed. This is in contrast to our OpenGL games (Quake 4 and F.E.A.R.), which show a fairly constant percentage improvement at each resolution with AA enabled, while scaling without AA improves as resolution increases. The Oblivion improvement seems to vary between 2% and 5%, but this is likely due to run-to-run variance in our benchmark.


74 Comments


  • Ecmaster76 - Wednesday, August 23, 2006 - link

    Is it a GDDR3 or a DDR2 product?

    If the former, any chance it will crossfire with the x1600 xt? Officially I mean (methinks a BIOS flash might work, though x1650 is maybe an 80nm part)
  • coldpower27 - Wednesday, August 23, 2006 - link

    No, I don't think that would work.

    An X1650 Pro has 600/1400 speeds, so it's 100% sure to be GDDR3; DDR2 doesn't exist at such high clock speeds.

  • Genx87 - Wednesday, August 23, 2006 - link

    Some of the other reviews had this x1950XT beating the GX2 almost every time, sometimes by a wide margin.

    I still can't get over the power/transistor/die size to performance advantage Nvidia has over ATI right now.

  • PrinceGaz - Wednesday, August 23, 2006 - link

    Interesting. The first review I read was at HardOCP, where the X1950XTX beat or equalled the 7950GX2 every time, then here the reverse is true. I think I'll have to read more reviews to decide what is going on (it certainly isn't CPU limitations). Maybe HardOCP's focus on optimum quality settings rather than raw framerate is the reason they favoured ATI, and another factor is the clear fact that when it came to minimum framerates instead of average framerates (they posted both for all tests) the X1950XTX was especially good.

    In other words the 7950GX2 posted great average numbers, but the X1950XTX was playable at higher quality settings because the minimum framerate didn't drop so low. Hopefully some other sites will also include minimum framerates along with graphs to clearly show how the cards perform.

    I remember a few years ago when AT's graphics card articles included image-quality comparisons and all sorts of other reports about how the cards compared in real-world situations. Now it seems all we get is a report on average framerate with a short comment that basically says "higher is better". Derek - I strongly suggest you look at how HardOCP tests cards and the informative and useful comments that accompany each graph. There may only have been three cards in their comparison but it gave a much better idea of how the cards compare to each other.

    Anyway I'll not be getting any of these cards. My 6800GT has plenty of performance for now so I'll wait until Vista SP1 and the second generation of DX10 cards which hopefully won't require a 1KW PSU :)
  • PrinceGaz - Wednesday, August 23, 2006 - link

    It seems the comments system here uses the brackets in HardOCP's abbreviation as some sort of marker. Apologies for making the rest of the text invisible, please amend my comment appropriately. I was talking about HardOCP by the way, when I said they use minimum framerates and optimum quality settings for each card.
  • JarredWalton - Wednesday, August 23, 2006 - link



    Don't use {H} in the comments, please. Just like {B} turns on bold, {H} turns on highlighting (white text).
  • JarredWalton - Wednesday, August 23, 2006 - link

    Ah, seems you figured that out already. ;) I need to see if we can disable that feature....
  • haris - Wednesday, August 23, 2006 - link

    Actually, if you look at all of the reviews from the various sites a bit more closely, the scores depend on which processor is being used for the test. It appears that nVidia cards tend to run better on Conroes (probably just meaning the games are slightly less CPU bottlenecked at the resolutions being tested) while ATi tends to run better on AMD systems (or when the CPU is slowing things down). Of course that is IIRC from the 5 reviews I skimmed through today.
  • coldpower27 - Wednesday, August 23, 2006 - link

    No, just no. The X1950 XTX alone is not more powerful than the 7950 GX2. Only in ATI-favourable scenarios, or where SLI flat out doesn't work, will this occur.



  • UNESC0 - Wednesday, August 23, 2006 - link

    quote:

    You get the same performance, same features and better flexibility with the CrossFire card so why not?


    you might want to run dual monitors...
