ATI Radeon VE 32MB

by Matthew Witheiler on February 19, 2001 3:09 AM EST

The Drivers

Aside from the changes detailed in the Multiple Monitor section of this review, ATI's Radeon drivers remain unchanged for the Radeon VE. This, unfortunately, leads to a number of problems.

First off, as we have noted many times in the past, ATI chooses to disable full 32-bit rendering in both Direct3D and OpenGL through a few driver tricks. Let's first take a look at what is going on in the default OpenGL settings.

The drivers default to "Performance" mode, where one very crucial item is checked: the "Convert 32 bit textures to 16 bit" box. With this box checked, all Radeon cards force games to render 32-bit textures in only 16-bit color, decreasing the image quality of the game. Why would ATI do this? Because converting the textures to a lower quality increases speed. No other card manufacturer that we know of enables this setting by default in OpenGL; both NVIDIA and 3dfx leave the feature disabled when their drivers are installed. For this reason, the first thing we do when testing cards that use the Radeon driver set is uncheck this box and put the Radeon cards in the same boat as the rest of the products out there.
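
To make concrete what the check box amounts to, the sketch below shows the idea at the OpenGL API level: a game hands the driver 32-bit texture data, but the texture gets stored in a 16-bit internal format instead. ATI does not document exactly which format its driver substitutes, so the use of GL_RGBA4 for the 16-bit case is purely our assumption for illustration.

// Illustrative sketch only: the effect of "Convert 32 bit textures to 16 bit"
// expressed as an OpenGL texture upload. GL_RGBA4 as the substituted 16-bit
// format is an assumption; the real driver behavior is not documented.
#include <windows.h>
#include <GL/gl.h>

void uploadTexture(const void* pixels, int width, int height, bool convertTo16Bit)
{
    // The game asks for a true 32-bit texture (8 bits per channel)...
    GLint internalFormat = GL_RGBA8;

    // ...but with the box checked, the texture is quietly stored with only
    // 4 bits per channel, saving memory and bandwidth at a visible quality cost.
    if (convertTo16Bit)
        internalFormat = GL_RGBA4;

    glTexImage2D(GL_TEXTURE_2D,     // target
                 0,                 // mipmap level
                 internalFormat,    // how the texture is actually stored
                 width, height,
                 0,                 // border
                 GL_RGBA,           // layout of the source data
                 GL_UNSIGNED_BYTE,  // 8 bits per channel coming in
                 pixels);
}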

The second form of trickery comes in D3D mode. Let's see what the defaults are here.

The highlighted "16;24" selection in the drop-down menu for Z-buffer bit depths is the default for Radeon based cards. Once again, image quality is compromised with this setting because the Z-buffer cannot store the amount of information necessary to render a game in full 32-bit color. The trickery once again results in a performance increase for the Radeon when compared to NVIDIA based cards, which allow 16, 24, and 32-bit Z-buffer depths. 3dfx and Matrox also play this trick on the consumer, shipping with the same 16 and 24-bit default settings that the Radeon has. The second thing we do when setting up a system with Radeon based drivers is select "16;24;32" from the drop-down menu, allowing proper 32-bit rendering.
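
For those curious how this setting is visible from the application side, below is a minimal sketch using the DirectX 8 interfaces: a program asks the driver which depth-buffer formats it exposes. Under the default "16;24" setting the 32-bit check would be expected to fail, while "16;24;32" should allow it to succeed. The adapter index and the 32-bit desktop format used here are assumptions for illustration.

// Minimal sketch (DirectX 8 era): enumerate which Z-buffer depths the driver
// exposes to applications. Adapter 0 and an X8R8G8B8 desktop are assumptions.
#include <windows.h>
#include <d3d8.h>
#include <stdio.h>

int main()
{
    IDirect3D8* d3d = Direct3DCreate8(D3D_SDK_VERSION);
    if (!d3d)
        return 1;

    struct DepthFormat { D3DFORMAT fmt; const char* name; };
    const DepthFormat depths[] = {
        { D3DFMT_D16,   "16-bit Z" },
        { D3DFMT_D24X8, "24-bit Z" },
        { D3DFMT_D32,   "32-bit Z" },
    };

    for (int i = 0; i < 3; ++i) {
        HRESULT hr = d3d->CheckDeviceFormat(
            D3DADAPTER_DEFAULT,      // primary display adapter
            D3DDEVTYPE_HAL,          // hardware rasterization
            D3DFMT_X8R8G8B8,         // 32-bit desktop/back buffer format
            D3DUSAGE_DEPTHSTENCIL,   // we want a depth buffer
            D3DRTYPE_SURFACE,
            depths[i].fmt);
        printf("%s: %s\n", depths[i].name, SUCCEEDED(hr) ? "exposed" : "not exposed");
    }

    d3d->Release();
    return 0;
}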

The reason we bring this up is that some out there may be testing without a level playing field. Without unchecking the "Convert 32 bit textures to 16 bit" box and selecting "16;24;32" as the allowed Z-buffer depths, you are giving the Radeon an unfair advantage (assuming you also select the "16;24;32" Z-buffer depths on 3dfx and Matrox products). It is a shame that ATI does this, as it only ends up hurting the consumer by making games run faster but at lower quality. Recently, however, it seems that all consumers are looking for is speed. If that is the case, then at least the speed comparison needs to be performed without one product having an advantage over another.
