16-bit vs 32-bit Performance

Since not everyone plays in 32-bit color, it is also important to see how video cards perform in 16-bit color mode. This matters for two reasons. First, some users may choose to trade color depth for resolution, making gameplay possible at resolutions that would otherwise be too slow. Secondly, 16-bit performance becomes more important as a card ages: with game complexity and demands constantly increasing, a current generation video card must be able to handle the next generation of games, and in many cases the only way to reach a desirable resolution is to decrease the color depth of the scene, relieving some of the stress on the video card.

To see how well each card can adapt to the future by speeding up at a lower color depth, we took the resolution of 1024x768 and tested each card at both 16-bit and 32-bit color. In an ideal world, a card's 16-bit and 32-bit performance would be identical; in practice they are not, due to memory bandwidth limitations. What we look for, therefore, is the size of the jump in performance when going from 32-bit to 16-bit color, which tells us how much can be gained by reducing the color depth.
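
As a rough illustration of the methodology, the sketch below (with placeholder FPS values rather than our measured results, and helper names of our own invention) computes the percentage gain we report, along with the raw color-buffer size per frame at each depth:

```python
def color_depth_gain(fps_32bit: float, fps_16bit: float) -> float:
    """Percentage speedup from dropping 32-bit color to 16-bit color."""
    return (fps_16bit - fps_32bit) / fps_32bit * 100.0

def color_buffer_mb(width: int, height: int, bytes_per_pixel: int) -> float:
    """Raw color-buffer size per frame in megabytes (ignores Z and overdraw)."""
    return width * height * bytes_per_pixel / (1024 ** 2)

# At 1024x768, the 32-bit color buffer is exactly twice the size of the
# 16-bit one, which is where the bandwidth relief comes from.
print(color_buffer_mb(1024, 768, 4))   # 3.0 MB per frame at 32-bit
print(color_buffer_mb(1024, 768, 2))   # 1.5 MB per frame at 16-bit
print(color_depth_gain(50.0, 75.0))    # placeholder pair -> 50.0 (% gain)
```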

In Quake III Arena, our OpenGL color depth comparison, we find that the GeForce2 MX responds best to a decrease in color depth. Gaining a full 26.3 FPS, the MX speeds up by almost 50% when dropping from 32-bit to 16-bit color. This speed increase just might be enough to make future games playable at desirable resolutions.
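
The quoted figures can be sanity-checked with a little arithmetic. Only the 26.3 FPS delta and the roughly 50% gain come from our results; the absolute frame rates below are inferred approximations, not benchmark numbers:

```python
# Back-of-the-envelope check of the GeForce2 MX figures: a 26.3 FPS gain
# amounting to "almost 50%" implies a 32-bit baseline in the low 50s.
delta_fps = 26.3
gain = 0.50                         # treating "almost 50%" as exactly 50%
fps_32bit = delta_fps / gain        # ~52.6 FPS at 1024x768x32 (inferred)
fps_16bit = fps_32bit + delta_fps   # ~78.9 FPS at 1024x768x16 (inferred)
print(f"{fps_32bit:.1f} FPS -> {fps_16bit:.1f} FPS")
```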

The Radeon SDR, thanks to its HyperZ technology, does not gain as much when moving from 32-bit to 16-bit color. Even at 32-bit, the card is running at close to full speed due to the effective memory bandwidth increase that HyperZ provides. Because the card does not hit as severe a memory bottleneck in 32-bit color, there is not much performance left to reclaim by decreasing the color depth.
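
The effect is easier to see with a simplified bandwidth model. The sketch below is our own approximation; the overdraw factor, Z-buffer sizes, and compression ratio are illustrative assumptions, not ATI's published specifications:

```python
def frame_traffic_mb(width: int, height: int, color_bytes: int,
                     z_bytes: int = 4, z_compression: float = 1.0,
                     overdraw: float = 2.0) -> float:
    """Approximate color + Z traffic per frame in MB at a given overdraw."""
    pixels = width * height * overdraw
    color_traffic = pixels * color_bytes                # color writes
    z_traffic = pixels * z_bytes * 2 / z_compression    # Z read + write
    return (color_traffic + z_traffic) / (1024 ** 2)

# With uncompressed Z, depth traffic dominates the frame, so a bandwidth-
# starved card gains a lot from halving its buffer sizes. Compress the Z
# traffic (as HyperZ does) and 32-bit is no longer so bandwidth-bound.
print(frame_traffic_mb(1024, 768, 4))                     # 18.0 MB: 32-bit color, 32-bit Z
print(frame_traffic_mb(1024, 768, 2, z_bytes=2))          #  9.0 MB: 16-bit color, 16-bit Z
print(frame_traffic_mb(1024, 768, 4, z_compression=4.0))  #  9.0 MB: 32-bit with compressed Z
```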

The Voodoo4 4500 numbers clearly show that the card is memory bandwidth limited at 32-bit color, so there is quite a bit to gain by decreasing the color depth. With a jump of 55%, the Voodoo4 4500 becomes playable at 1024x768x16.
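
To put "memory bandwidth limited" in perspective, a crude frame rate ceiling can be estimated by dividing a card's memory bandwidth by its per-frame traffic. The numbers below are rough assumptions for illustration only, reusing the traffic estimates from the model above:

```python
# Crude upper bound on frame rate from framebuffer traffic alone; texture
# fetches and geometry would lower it further. The ~2.6 GB/s figure assumes
# a 128-bit SDR memory bus at 166 MHz.
bandwidth_mb_s = 2.6 * 1024           # ~2.6 GB/s expressed in MB/s

def fps_ceiling(traffic_mb_per_frame: float) -> float:
    return bandwidth_mb_s / traffic_mb_per_frame

print(fps_ceiling(18.0))   # ~148 FPS ceiling at 32-bit (per the model above)
print(fps_ceiling(9.0))    # ~296 FPS ceiling at 16-bit -- twice the headroom
```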

In our Direct3D color depth comparison, we find a situation very similar to that of the OpenGL test, except that the Radeon SDR posts a slightly larger gain. Even so, the card still shows the smallest performance difference when going from 32-bit to 16-bit color, with a gain of 40%. The GeForce2 MX shows a 47% improvement, and the Voodoo4 4500 an 89% improvement.
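
Putting the Direct3D numbers side by side (the percentages come from our results above; the snippet merely sorts them):

```python
# Direct3D gains at 1024x768 when dropping from 32-bit to 16-bit color,
# as quoted above, sorted from smallest to largest.
d3d_gains = {"Radeon SDR": 40, "GeForce2 MX": 47, "Voodoo4 4500": 89}

for card, gain in sorted(d3d_gains.items(), key=lambda kv: kv[1]):
    print(f"{card:>13}: +{gain}%")
```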
