The Test

For the most part, this is a high end article focusing on the three faster cards ATI announced today. We will include benchmarks of the X1900 XT 256MB both in our high end tests and in a comparison with the numbers we ran for our recent summer midrange roundup. Our high end tests consist of higher resolutions and use the same high end platform we employed for our midrange article. Along with the benefit of using the fastest CPU we can get our hands on, this is also the type of system we might recommend high end gamers run these cards in. Thus, people interested in these cards can use our numbers to get a glimpse of what actual performance might look like on their own systems.

CPU: Intel Core 2 Extreme X6800 (2.93GHz/4MB)
Motherboards: Intel D975XBX (LGA-775), ASUS P5N32-SLI SE Deluxe
Chipsets: Intel 975X, NVIDIA nForce4 Intel x16 SLI
Chipset Drivers: Intel 7.2.2.1007, NVIDIA nForce 6.86
Hard Disk: Seagate 7200.7 160GB SATA
Memory: Corsair XMS2 DDR2-800 4-4-4-12 (1GB x 2)
Video Card: Various
Video Drivers: ATI Catalyst 6.8, NVIDIA ForceWare 91.33
Desktop Resolution: 1920 x 1440 - 32-bit @ 60Hz
OS: Windows XP Professional SP2

The games we have chosen to test represent a wide variety of engines and styles. We are testing 7 games today due to the time constraints of this article. As interest in HDR and advanced visual effects continues to rise, the tradeoff required for antialiasing is often overshadowed by the quality available from other options. This is especially true in games like Splinter Cell: Chaos Theory, Oblivion, and Black & White 2. In every game but Splinter Cell: Chaos Theory and Oblivion, we will be testing with and without 4x antialiasing. Those two games really shine when HDR is enabled, so we won't bother disabling it. (ATI still offers the "Chuck Patch" to enable both HDR and antialiasing, which can be seen as an advantage for their hardware. However, this doesn't work with all HDR modes and is currently targeted mostly at Oblivion and Splinter Cell: Chaos Theory.)

For all of our tests, the only default driver setting we change is vsync, which we set to off. All other settings are left alone, as the default settings from each camp yield generally comparable image quality. There are a few exceptions to the rule, but none of the tests we ran show any of the shimmering or other problems noted in the past with NVIDIA's default quality.

In reporting our results, in hopes of increasing readability, we will include a snapshot of one resolution using our standard graphs alongside a resolution scaling line graph.


74 Comments

  • nextsmallthing - Wednesday, August 23, 2006 - link

    Did anyone else notice that the specs for some of the NVIDIA cards are wrong? For example, the core clock of the 7900GTX is supposed to be 650 MHz, not 700 MHz, and the core clock of the 7900GT should be 450 MHz, not 470 MHz. Also, the pipeline configuration for the 7300GT (according to Wikipedia anyway) should be 8 pixel & 4 vertex.

    This many mistakes really makes me question the accuracy of other specs I read on Anandtech.

    (And by the way, would somebody please inform the DailyTech writers that it's "Xbox 360", not "XBOX 360". And yes I'm aware of the conventions that punctuation goes inside quotes and you shouldn't start sentences with "and".)
  • Anand Lal Shimpi - Wednesday, August 23, 2006 - link

    The 7900GTX/GT clock speeds that were listed were actually vertex clock speeds, not general core clock speeds, so they were technically correct (parts of the GPU do run at those frequencies) just not comparable to the other numbers. They have been corrected.

    The 7300GT is indeed 8 pipes, that was a copy/paste error. Thanks for the heads up.

    Take care,
    Anand
  • nextsmallthing - Thursday, August 24, 2006 - link

    Wow--prompt correction and courteous reply. I'm impressed, and my faith in Anandtech is restored!
  • Josh7289 - Wednesday, August 23, 2006 - link

From the looks of the pricing structure for ATI's cards on the first page, and especially after they simplify their lineup, it looks like ATI is giving up on midrange cards from $100 - $200. The 7600GT and the upcoming 7900GS are both alone in that price range (about $150 and $200, respectively), with no competition from ATI, so it seems they really are giving that price range to Nvidia.

    Am I right with this or am I seriously missing something?
  • yyrkoon - Wednesday, August 23, 2006 - link

There is an X1800 GTO2; the price last I looked was around $230, and of course they released it rather quietly. Still, that's about $90 higher than the 7600 GT (or in my case the eVGA 7600GT KO).
  • OrSin - Wednesday, August 23, 2006 - link

I'm wondering the same thing. Are they going to stop making any 1800's? They should be dropping into this price range nicely. Not sure how competitive they are with the 7900's. And now that the 7900GS is coming out, the 1800 might be just too outclassed. (You guys just missed a great deal on Woot: a 7900GS for $145.)

I hope the 1800 is still being made, and I hope it drops to the $150-180 range to fill that gap.
  • JarredWalton - Wednesday, August 23, 2006 - link

    I think they've already stopped making all of the X1800 series, but there are still cards floating around.
  • Josh7289 - Wednesday, August 23, 2006 - link

The X1900GT is a card meant to compete with the stock 7900GT, and as such sits somewhere in the $200 - $250 price range.

As for the X1950 Pro and X1650 XT, what are these supposed to compete against, and at what prices? More importantly, when are these supposed to launch?
  • coldpower27 - Wednesday, August 23, 2006 - link

The X1650 XT is also in the works.
  • coldpower27 - Wednesday, August 23, 2006 - link

The X1950 Pro is upcoming, and they still have the X1900 GT as well.
