NVIDIA GeForce2 Ultra

by Anand Lal Shimpi on August 14, 2000 9:01 AM EST
NVIDIA's Testing "Suggestion"

ATI gave NVIDIA an unpleasant surprise with the performance of the Radeon against NVIDIA's flagship GeForce2 GTS.  In an attempt to “level the playing field”, NVIDIA suggested that if we were to test using MDK2, we should be aware that the GeForce2 and the Radeon use different default texture color depths.  They further suggested that, to produce an “apples to apples” comparison, we should run the GeForce2 GTS in 16-bit color mode.  This would obviously create a problem for the ATI Radeon, whose 16-bit performance is not nearly as good as NVIDIA’s, but what it doesn’t do is make for a true “apples to apples” comparison.

The reason NVIDIA made this suggestion is that there is an option in ATI’s drivers (and there always has been) to convert 32-bit textures to 16-bit.  But by placing the GeForce2 GTS and the GeForce2 Ultra in 16-bit color mode, you immediately put the Radeon at a severe disadvantage because of its poor 16-bit color performance.  You buy a card like a Radeon or a GeForce2 GTS, and certainly a GeForce2 Ultra, in order to run in 32-bit color, so benchmarking solely in 16-bit color doesn’t make much sense at all.
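For readers curious about what a “convert 32-bit textures to 16-bit” option actually involves, the sketch below shows the kind of quantization such a setting implies, assuming an RGBA8888 source texel and an RGB565 destination (ATI’s actual driver code and internal texture formats aren’t public, so treat this purely as an illustration).  The converted texture takes half the memory and bandwidth, but a third of the color precision, plus the entire alpha channel, is simply thrown away, which is exactly why 16-bit and 32-bit results aren’t directly comparable.

```c
#include <stdint.h>
#include <stdio.h>

/* Quantize a 32-bit RGBA8888 texel down to a 16-bit RGB565 texel.
 * Red and blue keep 5 bits each, green keeps 6 bits, and the 8-bit
 * alpha channel is discarded entirely. Illustrative only. */
static uint16_t rgba8888_to_rgb565(uint32_t texel)
{
    uint8_t r = (texel >> 24) & 0xFF;
    uint8_t g = (texel >> 16) & 0xFF;
    uint8_t b = (texel >> 8)  & 0xFF;

    return (uint16_t)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
}

int main(void)
{
    /* A mid-tone purple texel: the low-order color bits that give
     * smooth gradients are exactly what the 16-bit format loses. */
    uint32_t texel32 = 0x8A4DC3FFu;  /* R=0x8A G=0x4D B=0xC3 A=0xFF */
    uint16_t texel16 = rgba8888_to_rgb565(texel32);

    printf("32-bit texel 0x%08X -> 16-bit texel 0x%04X\n", texel32, texel16);
    return 0;
}
```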


The proper way to level the playing field: disable ATI's texture conversion

While NVIDIA’s suggestion is one way of approaching benchmarking, we had a better idea: simply disable ATI’s “convert 32-bit textures to 16-bit” option, which is what we’ve always done when benchmarking ATI cards, and the playing field is leveled.  This is what we did for our comparison; although we didn’t use MDK2, the same approach applies to all benchmarks, and we are disappointed that NVIDIA would suggest such a thing in order to produce an “apples to apples” comparison.
