ATI Radeon 32MB SDR

by Anand Lal Shimpi on October 13, 2000 4:46 AM EST

16-bit vs 32-bit Performance

We recently switched our testing methodology to a 32-bit-only test suite for card performance; however, it is still important to investigate 16-bit performance, so we have put together a section comparing the card's 16-bit performance to its 32-bit performance.

Here we see that although the GeForce2 MX has a higher level of 16-bit performance, its 32-bit performance is approximately 4% lower than the Radeon SDR's. ATI's argument here is that you don't buy a card that can run well under 32-bit color only to play all of your games in 16-bit color, and it is a very fair and valid argument to make.

At 1600 x 1200 the GeForce2 MX still holds a lead in 16-bit color, and once again the Radeon offers superior performance under 32-bit color. The argument that 32-bit color doesn't really matter would have worked just a year and a half ago, but now that games are actually taking advantage of the extra color information, it is becoming increasingly important to concentrate on 32-bit color performance. In this case, the Radeon comes away with a 24% advantage in 32-bit color.
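To put some rough numbers behind why 32-bit color is more demanding, here is a small sketch (our own back-of-the-envelope arithmetic, not figures from the benchmarks above) of how much framebuffer data a card pushes per color buffer at each depth; doubling the bits per pixel doubles the memory bandwidth consumed for every pixel written:

```python
def framebuffer_bytes(width: int, height: int, bits_per_pixel: int) -> int:
    """Bytes needed for a single color buffer at the given depth."""
    return width * height * bits_per_pixel // 8

# Compare 16-bit vs 32-bit at the two resolutions discussed above.
for width, height in [(1024, 768), (1600, 1200)]:
    for bpp in (16, 32):
        mb = framebuffer_bytes(width, height, bpp) / (1024 * 1024)
        print(f"{width}x{height} @ {bpp}-bit: {mb:.2f} MB per color buffer")
```

At 1600 x 1200, a 32-bit color buffer works out to roughly 7.3 MB versus about 3.7 MB at 16-bit, before counting the Z-buffer or double buffering, which is why memory bandwidth tends to dominate 32-bit performance.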
