Round 2: Performance

To demonstrate what these cards can do, we have run our game tests at their highest quality settings. This pushes the cards harder with more detail and, most importantly, stresses the entire card. It should give us a good idea of whether these low-memory cards can deliver on the promise of an experience comparable to the rest of the current generation.

In the end, it will be up to the end user whether to enable the highest quality settings and run at lower resolutions, or to turn off a few of the bells and whistles and pump up the number of pixels on the screen a bit. The tradeoff really comes down to preference; even among the AT staff, the resolution vs. settings question is settled on a game-by-game basis.

We will be taking a look at four flavors of cards today: three TurboCache models and one HyperMemory board. The HyperMemory board is the 32MB model, while the TurboCache cards come in 16MB, 32MB, and 64MB versions. With the exception of the 16MB TC board, these cards have a 64-bit memory interface; the 16MB card's memory bus is only 32 bits wide.

Aside from memory size and bus width, memory speed is an important factor with these boards. Vendors are aiming for cheap here, so we won't be seeing sub-2ns GDDR3. Instead, memory speeds top out at 700 MHz (the speed of the 16MB and 32MB TC cards). The HyperMemory card runs its RAM at about 665 MHz, while the 64MB TC card runs at a rather slow 550 MHz. This changes the performance landscape of the TurboCache lineup, as faster memory really helps the boards with less of it. The advantage of the 64MB card then becomes the ability to run applications that require 256MB of graphics memory, which will likely be less useful than having higher performance in a 128MB graphics memory configuration.
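To put these clocks in context, peak local memory bandwidth follows directly from bus width and data rate. The short Python sketch below is our own back-of-the-envelope calculation rather than a vendor figure, and it assumes the quoted memory speeds are effective (DDR) data rates:

    # Rough theoretical local memory bandwidth.
    # Assumes the quoted memory clocks are effective (DDR) data rates.
    def bandwidth_gb_s(bus_width_bits, effective_mhz):
        # bytes per transfer * transfers per second, expressed in GB/s
        return (bus_width_bits / 8) * (effective_mhz * 1e6) / 1e9

    cards = {
        "16MB TurboCache (32-bit, 700MHz)": (32, 700),
        "32MB TurboCache (64-bit, 700MHz)": (64, 700),
        "64MB TurboCache (64-bit, 550MHz)": (64, 550),
        "32MB HyperMemory (64-bit, 665MHz)": (64, 665),
    }

    for name, (width, mhz) in cards.items():
        print(f"{name}: ~{bandwidth_gb_s(width, mhz):.1f} GB/s")

By that rough measure, the 32-bit 16MB board has about half the local bandwidth of its 64-bit siblings, which is why bus width deserves as much attention as the amount of onboard RAM.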

It is also possible to build 32-bit wide 32MB cards, but these will take a definite performance hit. Consumers will need to pay attention to the total amount of graphics memory that a solution supports, the amount of RAM it has locally, and the width of the local memory interface. Needless to say, we will probably be less than satisfied with the way these cards are marketed. Of course, giving vendors a wide range of choices based on their needs will hopefully help keep competition up and prices down.

The test setup that we used was designed to put the most emphasis on the graphics card's capabilities. We are very much graphics card limited here, so we should see very similar performance across quite a range of CPUs (until games become CPU limited). These cards are most sensitive to the system RAM they're paired with, so if one of these cards is destined for a casual gamer's machine, memory choice should be given careful consideration. Here's what the cards ran in:

Microsoft Windows XP SP2
ASUS A8N-SLI Deluxe
AMD Athlon FX-53
1GB OCZ PC3200 @ 2:2:2:9
Seagate 7200.7 HD
OCZ Powerstream 600W PS

The 32MB 304/665 (core/mem) HyperMemory card that we have came directly from ATI, while the 32MB and 64MB 350 MHz-core TurboCache cards are from PNY (with memory clocks of 700 MHz and 550 MHz, respectively, as noted above). The 16MB TurboCache part came from NVIDIA and is clocked the same as the PNY 32MB card.

That is quite a lot to keep in mind when looking at our performance tests, and it's not a simple matter to understand what we are seeing at first glance. That said, what follows paints a good picture of the budget market as it stands.

Comments

  • DAPUNISHER - Thursday, May 12, 2005 - link

    I'm on my first cup of coffee, so help me out here Derek, what was the test setup config please?
  • erinlegault - Thursday, May 12, 2005 - link

    LOL. I beat the losers who care about having the 1st post.
  • erinlegault - Thursday, May 12, 2005 - link

    Unfortunately, I don't think this was a completely fair test of HyperMemory, since the Radeon X300 (synonymous with the 9200) is complete crap, while the GeForce 6200 is probably adequate for most users (I think they're equivalent to or better than the Radeon 9600s)
