A Matter of Memory: Revisiting the Mid-Range

When 512MB cards first came along (with the GeForce 7800 GTX 512), we ran some performance tests to try to ascertain the real-world performance difference between these cards and ones with 256MB of RAM. We came up empty-handed at the time. Today we are able to show that memory sizes above 256MB are actually starting to matter. With ATI's launch of the new X1900 XT 256MB, we have a direct comparison between two cards that are almost identical aside from the amount of memory on board. To be completely fair, X1900 XT 256MB cards built by ATI will also have full HDCP support, keys and all, but the major difference remains RAM.

ATI dropped down to eight 8Mx32 GDDR3 modules as opposed to the eight 16Mx32 modules used on the original X1900 XT. Memory speed, bandwidth, and even layout can remain the same between cards, with only a slight difference in timings due to the different capabilities of each chip type. The result is that the X1900 XT 256MB is a slower solution than the X1900 XT, but one that still offers exceptional performance at a terrific price.
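To see why swapping chip density halves capacity without touching bandwidth, here is a quick back-of-the-envelope sketch (our own illustration, with a hypothetical helper function, not anything from ATI's documentation): each chip contributes its 32-bit width to the bus, so eight chips give a 256-bit interface either way, while the depth (8M vs. 16M locations) determines total capacity.

```python
# Illustrative check of the two memory configurations.
# An "8Mx32" GDDR3 chip has 8M addressable locations, each 32 bits wide.

def card_memory(depth_m: int, width_bits: int, modules: int):
    """Return (capacity in MB, bus width in bits) for a set of identical DRAM chips."""
    capacity_mb = depth_m * width_bits * modules // 8  # depth is in 'M' units; /8 converts bits to bytes
    bus_width_bits = width_bits * modules              # each chip adds its width to the bus
    return capacity_mb, bus_width_bits

print(card_memory(8, 32, 8))   # (256, 256): X1900 XT 256MB on a 256-bit bus
print(card_memory(16, 32, 8))  # (512, 256): X1900 XT 512MB on the same 256-bit bus
```

Since the bus width and clocks are unchanged, peak bandwidth is identical; only the frame buffer size (and slightly different chip timings) separates the two cards.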

The graphs below compare the new $279 X1900 XT 256MB to the rest of the sub-$300 cards we included in last week's mid-range GPU roundup. Note that the X1900 XT's chief competitor is NVIDIA's GeForce 7900 GT, which itself can be found for around $270. However, for as little as $20 more you can get a factory overclocked 7900 GT, such as the eVGA GeForce 7900 GT KO SC clocked at 580/790, which suddenly becomes far more competitive. Because of the prevalence of factory overclocked (and warrantied) 7900 GTs, we've included the eVGA card as a reference for what you can get for the same price.

Battlefield 2 Performance

The 256MB drop in memory size doesn't impact BF2 enough to drop performance below the 7900 GT. The overclocked eVGA card does outperform even the 512MB X1900 XT in this test, but the 256MB version doesn't lose much value here, as 85 fps is still way more than playable.

Black and White 2 Performance

The 256MB X1900 XT falls to just below the level of the stock 7900 GT here. The competition is still tight, and performance for the money is about on par.

F.E.A.R. Performance

If F.E.A.R. performance is important to you, the X1900 XT 256MB is a better value than even the overclocked 7900 GT. The 512MB card still retains a small 5.5% lead over the 256MB card. This is one of the smaller performance drops we will see.

Half Life 2: Episode 1 Performance

Under HL2:Ep1, performance drops a very small amount, but both X1900 XT cards are in strong competition with the overclocked 7900 GT.

Quake 4 Performance

Quake 4 does give the 512MB card a bit of an advantage at 1600 x 1200, but the performance of the 256MB X1900 XT is still quite respectable given its target price of $279. The stock GeForce 7900 GT isn't in the same league as the X1900 XT.

Splinter Cell: Chaos Theory Performance

Splinter Cell rounds out our tests as one of those games where the larger frame buffer on the older X1900 XT does not do much. You lose less than 4% of your performance when going to the cheaper 256MB X1900 XT, which is a trade-off we can live with. The regular GeForce 7900 GT can't hope to keep up with the 256MB X1900 XT, but if you get one of the factory overclocked cards, such as the eVGA GeForce 7900 GT KO SC (more acronyms please), then you'll actually have performance competitive with the X1900 XT 256MB.

If the X1900 XT 256MB actually debuts at the ATI suggested price of $280, there won't be much of a reason to recommend anything but ATI parts from $220 up until we reach the highest end parts above $400, where the lines start to blur again. While performance sometimes falls well short of the X1900 XT 512MB, the X1900 XT 256MB remains competitive with our overclocked 7900 GT in every case but Black & White 2. The reduced memory version of the X1900 XT is just what ATI needed in order to fight back against the incredible overclockability of the 7900 GT.

Comments

  • JarredWalton - Wednesday, August 23, 2006 - link

    We used factory overclocked 7900 GT cards that are widely available. These are basically guaranteed overclocks for about $20 more. There are no factory overclocked ATI cards around, but realistically don't expect overclocking to get more than 5% more performance on ATI hardware.

    The X1900 XTX is clocked at 650 MHz, which isn't much higher than the 625 MHz of the XT cards. Given that ATI just released a lower power card but kept the clock speed at 650 MHz, it's pretty clear that their GPUs are close to topped out. The RAM might have a bit more headroom, but memory bandwidth already appears to be less of a concern, as the X1950 isn't tremendously faster than the X1900.
  • yyrkoon - Wednesday, August 23, 2006 - link

    I think it's obvious why ATI is selling their cards for less now, and that reason is that a lot of 'tech savvy' users are waiting for Direct3D 10 to be released and want to buy a capable card. This is probably to try and entice some people into buying technology that will be 'obsolete' when Direct3D 10 is released.

    Supposedly Vista will ship with DirectX 9L and DirectX 10 (Direct3D 10), but I've also read to the contrary, that Direct3D 10 won't be released until after Vista ships (sometime). Personally, I couldn't think of a better time to buy hardware, but a lot of people think that waiting, and just paying through the nose for a video card later, is going to save them money. *shrug*
  • Broken - Wednesday, August 23, 2006 - link

    In this review, the test bed was an Intel D975XBX (LGA-775). I thought this was an ATI Crossfire only board and could not run two Nvidia cards in SLI. Are there hacked drivers that allow this, and if so, is there any penalty? Also, I see that this board is dual 8x pci-e and not dual 16x... at high resolutions, could this be a limiting factor, or is that not for another year?

  • DerekWilson - Wednesday, August 23, 2006 - link

    Sorry about the confusion there. We actually used an nForce4 Intel x16 board for the NVIDIA SLI tests. Unfortunately, it is still not possible to run SLI on an Intel motherboard. Our test section has been updated with the appropriate information.

    Thanks for pointing this out.

    Derek Wilson
  • ElFenix - Wednesday, August 23, 2006 - link

    as we all should know by now, Nvidia's default driver quality setting is lower than ATi's, and raising the driver settings to match ATi's quality makes a significant difference in framerate. your "The Test" page does not indicate that you changed the driver quality settings to match.
  • DerekWilson - Wednesday, August 23, 2006 - link

    Drivers were run with default quality settings.

    Default driver settings between ATI and NVIDIA are generally comparable from an image quality stand point unless shimmering or banding is noticed due to trilinear/anisotropic optimizations. None of the games we tested displayed any such issues during our testing.

    At the same time, during our Quad SLI follow-up we would like to include a series of tests run at the highest possible quality settings for both ATI and NVIDIA -- which would put ATI ahead of NVIDIA in terms of anisotropic filtering and in Chuck patch cases, and NVIDIA ahead of ATI in terms of adaptive/transparency AA (which is actually degraded by their gamma correction).

    If you have any suggestions on different settings to compare, we are more than willing to run some tests and see what happens.

    Thanks,
    Derek Wilson
  • ElFenix - Wednesday, August 23, 2006 - link

    could you run each card with the quality slider turned all the way up, please? i believe that's the default setting for ATi, and the 'High Quality' setting for nvidia. someone correct me if i'm wrong.

    thanks!

    michael
  • yyrkoon - Wednesday, August 23, 2006 - link

    I think as long as all settings from both offerings are as close as possible per benchmark, there is no real gripe.

    Some people seem to think it necessary to run AA at high resolutions (1600x1200+), but I'm not one of them. It's very hard for me to notice jaggies even at 1440x900, especially when concentrating on the game instead of standing still and looking for jaggies with a magnifying glass...
  • mostlyprudent - Wednesday, August 23, 2006 - link

    When are we going to see a good number of Core 2 Duo motherboards that support Crossfire? The fact that AT is using an Intel-made board rather than a "true enthusiast" board says something about the current state of Core 2 Duo motherboards.
  • DerekWilson - Wednesday, August 23, 2006 - link

    Intel's boards are actually very good. The only reason we haven't been using them in our tests (aside from a lack of SLI support) is that we have not been recommending Intel processors for the past couple years. Core 2 Duo makes Intel CPUs worth having, and you definitely won't go wrong with a good Intel motherboard.
