The Test

For the most part, this is a high end article focusing on the three faster cards ATI announced today. We will include benchmarks of the X1900 XT 256MB both in our high end tests and in a comparison with the numbers we ran for our recent summer midrange roundup. Our high end tests consist of higher resolutions and use the same high end platform we employed for our midrange article. Along with the benefit of using the fastest CPU we can get our hands on, this is also the type of system we might recommend high end gamers run these cards in, so people interested in these cards can use our numbers to get a glimpse of what performance might look like on their own systems.

CPU: Intel Core 2 Extreme X6800 (2.93GHz/4MB)
Motherboard: Intel D975XBX (LGA-775) / ASUS P5N32-SLI SE Deluxe
Chipset: Intel 975X / NVIDIA nForce4 Intel x16 SLI
Chipset Drivers: Intel 7.2.2.1007 / NVIDIA nForce 6.86
Hard Disk: Seagate 7200.7 160GB SATA
Memory: Corsair XMS2 DDR2-800 4-4-4-12 (1GB x 2)
Video Card: Various
Video Drivers: ATI Catalyst 6.8 / NVIDIA ForceWare 91.33
Desktop Resolution: 1920 x 1440 - 32-bit @ 60Hz
OS: Windows XP Professional SP2

The games we have chosen to test represent a wide variety of engines and styles. We are testing 7 games today due to the time constraints of this article. As interest in HDR and advanced visual effects continues to rise, the tradeoff required for antialiasing is often overshadowed by the quality available from other options. This is especially true in games like Splinter Cell: Chaos Theory, Oblivion, and Black & White 2. In every game but Splinter Cell: Chaos Theory and Oblivion, we will be testing with and without 4x antialiasing. Those two games really shine when HDR is enabled, so we won't bother disabling it. (ATI still offers the "Chuck Patch" to enable both HDR and antialiasing, which can be seen as an advantage for their hardware. However, this doesn't work with all HDR modes and is currently targeted mostly at Oblivion and Splinter Cell: Chaos Theory.)

For all of our tests, the only default driver setting we change is vsync, which we set to off. All other settings are left alone, as the default settings from each camp yield generally comparable image quality. There are a few exceptions to the rule, but none of the tests we ran show any shimmering or other problems noted in the past with NVIDIA's default quality.

In hopes of increasing readability, we will present our results as a snapshot of one resolution using our standard graphing engine alongside a resolution scaling line graph.

74 Comments

  • DerekWilson - Saturday, August 26, 2006 - link

    yeah ... i didn't test power with crossfire -- which is a whole lot higher. also, i have a minimal set of components to make it work -- one hdd, one cdrom drive, and no add-in cards other than graphics.

    we'll do multi-gpu power when we look at quadsli
  • ElFenix - Thursday, August 24, 2006 - link

    the review states that power consumption was measured at the wall with a Kill-A-Watt during a 3DMark run.

    in addition to the water cooling, it could be he's running a more efficient PSU. in a powerful system, drawing 220 watts from the power supply would mean drawing about 275 watts from the wall with an 80% efficient PSU (like a good Seasonic) and about 314 watts with a 70% efficient PSU. that's a pretty decent difference right there. (a quick sketch of this calculation appears after the comments.)

    ... still waiting for nvidia's HQ driver run...
  • poohbear - Thursday, August 24, 2006 - link

    thanks
  • Rock Hydra - Wednesday, August 23, 2006 - link

    With those competitively priced parts, hopefully nVIDIA will respond with lower prices.
  • CreepieDeCrapper - Wednesday, August 23, 2006 - link

    I'm not familiar with 1920x1440; did you mean 1920x1200? What resolution were these tests performed at? Thank you!

  • JarredWalton - Wednesday, August 23, 2006 - link

    1920x1440 is a standard 4:3 aspect ratio used on many CRTs. It is often included as performance is somewhat close to 1920x1200 performance.
  • CreepieDeCrapper - Wednesday, August 23, 2006 - link

    Thanks, I've been using my LCD for so long I forgot about the vintage CRT res's out there ;) Plus I never ran that particular res on my CRT when I had one, so I just wasn't familiar.
  • cgaspar - Wednesday, August 23, 2006 - link

    While average frame rates are interesting, I _really_ care about minimum frame rates. A 300fps average is useless if, at a critical moment in a twitch game, the frame rate drops to 10fps for 3 seconds; this is especially true in Oblivion. Of course it's possible that the minimums would be the same for all cards (if the game is CPU bound in some portion), but they might not be.
  • JarredWalton - Wednesday, August 23, 2006 - link

    A lot of games have instantaneous minimums that are very low due to HDD accesses and such; Oblivion is a good example. Benchmarking also tends to emphasize minimum frame rates, since in regular play they occur less frequently. Basically, in actual gaming you run around an area for a longer period of time, as opposed to a 30-90 second benchmark. If there's a couple of seconds at the start of a level where frame rates are low due to the engine caching textures, that doesn't mean as much as continuous low frame rates. (A small worked example of this appears after the comments.)

    More information is useful, of course, but it's important to keep things in perspective. :)
  • kmmatney - Wednesday, August 23, 2006 - link

    The charts show that the 7900 GT gets a huge boost from being factory overclocked. It would be nice to see whether the X1900 XT 256MB can also be overclocked, and how much headroom it has.
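
As a footnote to the PSU efficiency discussion in the comments above, here is a minimal sketch of that wall-draw arithmetic in Python. The wattage and efficiency figures are simply the hypothetical numbers from that comment, not measurements from this review.

    # Minimal sketch of the PSU efficiency math from the comments.
    # A power supply that is 80% efficient must pull more power from the
    # wall than it delivers to the components; the numbers below are the
    # hypothetical ones from the comment, not measured values.

    def wall_draw(dc_load_watts: float, efficiency: float) -> float:
        """Return the power drawn from the wall for a given DC load and PSU efficiency."""
        return dc_load_watts / efficiency

    dc_load = 220.0  # watts delivered to the components

    for eff in (0.80, 0.70):
        print(f"{eff:.0%} efficient PSU: {wall_draw(dc_load, eff):.0f} W at the wall")
    # 80% efficient PSU: 275 W at the wall
    # 70% efficient PSU: 314 W at the wall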
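
Similarly, the point about minimum frame rates and short benchmark runs can be made concrete with a small sketch. The frame rates and durations below are made up purely for illustration and are not data from this review.

    # Sketch of why a brief dip weighs much more heavily in a short
    # benchmark than in a long play session. Frame rates are made up.

    def average_fps(segments):
        """segments: list of (duration_seconds, fps) pairs; returns the overall average fps."""
        total_frames = sum(duration * fps for duration, fps in segments)
        total_time = sum(duration for duration, _ in segments)
        return total_frames / total_time

    # A 3-second dip to 10 fps inside an otherwise steady 100 fps run:
    benchmark = [(27, 100), (3, 10)]   # 30-second benchmark run
    long_play = [(597, 100), (3, 10)]  # 10 minutes of actual gaming

    print(f"30s benchmark average: {average_fps(benchmark):.1f} fps")  # 91.0 fps
    print(f"10 min play average:   {average_fps(long_play):.1f} fps")  # 99.6 fps
    # The minimum is 10 fps in both cases, but it drags down the short run far more.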
