The Test

For the most part, this is a high end article focusing on the three faster cards ATI announced today. We will include benchmarks of the X1900 XT 256MB both in our high end tests and in a comparison with the numbers we ran for our recent summer midrange roundup. Our high end tests will consist of higher resolutions and will use the same high end platform we employed for our midrange article. This time, along with the benefit of using the fastest CPU we can get our hands on, this is also the type of system we might recommend high end gamers run these cards in. Thus, people interested in these cards can use our numbers to get a glimpse of what actual performance might look like on their own systems.

CPU: Intel Core 2 Extreme X6800 (2.93GHz/4MB)
Motherboards: Intel D975XBX (LGA-775), ASUS P5N32-SLI SE Deluxe
Chipsets: Intel 975X, NVIDIA nForce4 Intel x16 SLI
Chipset Drivers: Intel 7.2.2.1007, NVIDIA nForce 6.86
Hard Disk: Seagate 7200.7 160GB SATA
Memory: Corsair XMS2 DDR2-800 4-4-4-12 (1GB x 2)
Video Card: Various
Video Drivers: ATI Catalyst 6.8, NVIDIA ForceWare 91.33
Desktop Resolution: 1920 x 1440 - 32-bit @ 60Hz
OS: Windows XP Professional SP2

The games we have chosen to test represent a wide variety of engines and styles. We are testing 7 games today due to the time constraints of this article. As interest in HDR and advanced visual effects continues to rise, the tradeoff required for antialiasing is often overshadowed by the quality available from other options. This is especially true in games like Splinter Cell: Chaos Theory, Oblivion, and Black & White 2. In every game but Splinter Cell: Chaos Theory and Oblivion, we will be testing with and without 4x antialiasing. These two games really shine when HDR is enabled, so we won't bother disabling it. (ATI still offers the "Chuck Patch" to enable both HDR and antialiasing, which can be seen as an advantage for their hardware. However, this doesn't work with all HDR modes and is currently targeted mostly at Oblivion and Splinter Cell: Chaos Theory.)
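
To make the tradeoff concrete: these games typically render HDR to a 64-bit floating point (FP16) surface, and whether multisample antialiasing can be applied to such a surface is a per-hardware capability that a Direct3D 9 engine has to query at startup. The sketch below is our own illustration, not code from any of these games (SupportsHdrMsaa is a hypothetical helper name); on hardware that fails this check, a game has to choose between HDR and AA, which is exactly the gap the Chuck Patch works around on ATI hardware.

    #include <cstdio>
    #include <d3d9.h>   // link against d3d9.lib

    // Hypothetical helper: asks the D3D9 runtime whether 4x multisampling
    // works on the FP16 render target format commonly used for HDR. If
    // this check fails, an engine must pick either HDR or AA, not both.
    bool SupportsHdrMsaa(IDirect3D9* d3d)
    {
        HRESULT hr = d3d->CheckDeviceMultiSampleType(
            D3DADAPTER_DEFAULT,
            D3DDEVTYPE_HAL,
            D3DFMT_A16B16G16R16F,      // 64-bit FP16 surface used for HDR
            FALSE,                     // full-screen mode
            D3DMULTISAMPLE_4_SAMPLES,  // the 4x AA level we test with
            NULL);
        return SUCCEEDED(hr);
    }

    int main()
    {
        IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
        if (!d3d) return 1;
        std::printf("4x MSAA on an FP16 HDR target: %s\n",
                    SupportsHdrMsaa(d3d) ? "supported" : "unsupported");
        d3d->Release();
        return 0;
    }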

For all of our tests, the only default driver setting we change is vsync, which we set to off. All other settings are left alone, as the default settings from each camp yield generally comparable image quality. There are a few exceptions to the rule, but none of the tests we ran showed any shimmering or other problems noted in the past with NVIDIA's default quality.
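
As an illustration of what that vsync setting controls (a sketch under our own assumptions, not any game's actual code; MakePresentParams is a hypothetical helper): a Direct3D 9 title requests vsync behavior through the presentation interval it passes at device creation, and the driver control panel setting we change simply forces immediate presentation regardless of what the game asks for.

    #include <d3d9.h>

    // Sketch: the present-parameters block a D3D9 game fills in at device
    // creation. PresentationInterval is the in-game vsync switch; forcing
    // vsync off in the driver overrides whatever value the game sets here.
    D3DPRESENT_PARAMETERS MakePresentParams(HWND window)
    {
        D3DPRESENT_PARAMETERS pp = {};
        pp.Windowed                   = FALSE;
        pp.hDeviceWindow              = window;
        pp.BackBufferWidth            = 1920;             // one of our test resolutions
        pp.BackBufferHeight           = 1440;
        pp.BackBufferFormat           = D3DFMT_X8R8G8B8;  // 32-bit format
        pp.SwapEffect                 = D3DSWAPEFFECT_DISCARD;
        pp.FullScreen_RefreshRateInHz = 60;
        pp.PresentationInterval       = D3DPRESENT_INTERVAL_IMMEDIATE; // don't wait for vblank
        return pp;
    }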

In reporting our results, in hopes of increasing readability, we will include a snapshot of one resolution using our standard graphing engine alongside a resolution scaling line graph.

Comments

  • Ecmaster76 - Wednesday, August 23, 2006 - link

    Is it a GDDR3 or a DDR2 product?

    If the former, any chance it will crossfire with x1600 xt? Officially I mean (methinks a bios flash might work, though the x1650 is maybe an 80nm part)
  • coldpower27 - Wednesday, August 23, 2006 - link

    No, I don't think that would work.

    An X1650 Pro has 600/1400 speeds, so it's 100% sure GDDR3; DDR2 doesn't exist at such high clock speeds.

  • Genx87 - Wednesday, August 23, 2006 - link

    Some of the other reviews had this x1950XT beating the GX2 almost every time, sometimes by a wide margin.

    I still can't get over the power/transistor/die size to performance advantage Nvidia has over ATI right now.

  • PrinceGaz - Wednesday, August 23, 2006 - link

    Interesting. The first review I read was at HardOCP, where the X1950XTX beat or equalled the 7950GX2 every time, then here the reverse is true. I think I'll have to read more reviews to decide what is going on (it certainly isn't CPU limitations). Maybe HardOCP's focus on optimum quality settings rather than raw framerate is the reason they favoured ATI, and another is the clear fact that when it came to minimum framerates instead of average framerates (HardOCP posted both for all tests) the X1950XTX was especially good.

    In other words the 7950GX2 posted great average numbers, but the X1950XTX was playable at higher quality settings because the minimum framerate didn't drop so low. Hopefully some other sites will also include minimum framerates along with graphs to clearly show how the cards perform.

    I remember a few years ago when AT's graphics card articles included image-quality comparisons and all sorts of other reports about how the cards compared in real-world situations. Now it seems all we get is a report on average framerate with a short comment that basically says "higher is better". Derek - I strongly suggest you look at how HardOCP tests cards, and the informative and useful comments that accompany each graph. There may only have been three cards in their comparison but it gave a much better idea of how the cards compare to each other.

    Anyway I'll not be getting any of these cards. My 6800GT has plenty of performance for now, so I'll wait until Vista SP1 and the second generation of DX10 cards, which hopefully won't require a 1KW PSU :)
  • PrinceGaz - Wednesday, August 23, 2006 - link

    It seems the comments system here uses the brackets in HardOCP's abbreviation as some sort of marker. Apologies for making the rest of the text invisible, please amend my comment appropriately. I was talking about HardOCP by the way, when I said they use minimum framerates and optimum quality settings for each card.
  • JarredWalton - Wednesday, August 23, 2006 - link

    Don't use {H} in the comments, please. Just like {B} turns on bold, {H} turns on highlighting (white text).
  • JarredWalton - Wednesday, August 23, 2006 - link

    Ah, seems you figured that out already. ;) I need to see if we can disable that feature....
  • haris - Wednesday, August 23, 2006 - link

    Actually, if you look at all of the reviews a bit more closely, the scores depend on which processor is being used for the test. It appears that nVidia cards tend to run better on Conroes (probably just meaning the games are slightly less CPU bottlenecked at the resolutions being tested) while ATi tends to run better on AMD systems (or when the CPU is slowing things down). Of course, that is IIRC from the 5 reviews I skimmed through today.
  • coldpower27 - Wednesday, August 23, 2006 - link

    No, just no. The X1950 XTX alone is not more powerful than the 7950GX2. Only in ATI-favourable scenarios, or where SLI flat out doesn't work, will this occur.

  • UNESC0 - Wednesday, August 23, 2006 - link

    quote:

    You get the same performance, same features and better flexibility with the CrossFire card so why not?

    you might want to run dual monitors...
