Taking advantage of it all

To summarize, realistic reflections, skin, hair, walls and other extremely detailed surfaces are finally made possible by the programmable technology behind NVIDIA's nfiniteFX engine.

As far as compatibility with competitors goes, NVIDIA licensed this technology to Microsoft for use in DirectX 8. It is an open standard accessible to all of NVIDIA's competitors, so you can expect to see similar functionality in ATI's next product as well. Under OpenGL, the same functions are exposed through NVIDIA's own extensions to the API, implemented in their drivers.
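
To make the OpenGL path a little more concrete, below is a minimal sketch (an illustration, not code from NVIDIA or from this review) of how a developer might hand a trivial vertex program to the driver through the GL_NV_vertex_program extension exposed by the GeForce3 drivers. The function names and tokens come from the published extension, but the setup shown is an assumption; a real application would first confirm that the extension and its entry points are actually present.

```cpp
// Minimal sketch: feeding a vertex program to the driver via the
// GL_NV_vertex_program extension. Assumes a current OpenGL rendering
// context on Windows and a <GL/glext.h> that provides the NV tokens and
// function pointer typedefs.
#include <windows.h>
#include <GL/gl.h>
#include <GL/glext.h>
#include <cstring>

// A trivial vertex program: transform the position by the tracked
// modelview-projection matrix in c[0]..c[3] and pass the colour through.
static const char kProgram[] =
    "!!VP1.0\n"
    "DP4 o[HPOS].x, c[0], v[OPOS];\n"
    "DP4 o[HPOS].y, c[1], v[OPOS];\n"
    "DP4 o[HPOS].z, c[2], v[OPOS];\n"
    "DP4 o[HPOS].w, c[3], v[OPOS];\n"
    "MOV o[COL0], v[COL0];\n"
    "END\n";

void EnableVertexProgram()
{
    // Resolve the extension entry points from the driver.
    PFNGLGENPROGRAMSNVPROC pglGenProgramsNV =
        (PFNGLGENPROGRAMSNVPROC)wglGetProcAddress("glGenProgramsNV");
    PFNGLBINDPROGRAMNVPROC pglBindProgramNV =
        (PFNGLBINDPROGRAMNVPROC)wglGetProcAddress("glBindProgramNV");
    PFNGLLOADPROGRAMNVPROC pglLoadProgramNV =
        (PFNGLLOADPROGRAMNVPROC)wglGetProcAddress("glLoadProgramNV");
    PFNGLTRACKMATRIXNVPROC pglTrackMatrixNV =
        (PFNGLTRACKMATRIXNVPROC)wglGetProcAddress("glTrackMatrixNV");

    // Create, bind and load the program object.
    GLuint id = 0;
    pglGenProgramsNV(1, &id);
    pglBindProgramNV(GL_VERTEX_PROGRAM_NV, id);
    pglLoadProgramNV(GL_VERTEX_PROGRAM_NV, id,
                     (GLsizei)std::strlen(kProgram),
                     (const GLubyte*)kProgram);

    // Let the driver keep c[0]..c[3] tracking the current
    // modelview-projection matrix, then switch over to the vertex program.
    pglTrackMatrixNV(GL_VERTEX_PROGRAM_NV, 0,
                     GL_MODELVIEW_PROJECTION_NV, GL_IDENTITY_NV);
    glEnable(GL_VERTEX_PROGRAM_NV);
}
```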

NVIDIA has put together a helpful set of custom-made vertex and pixel shader programs that are available on their developer site. Their effects browser can be used to see not only the results of these programs but also the code behind them, making it a useful introduction to the world of the programmable GPU.
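
To give an idea of what the code behind one of these effects does, here is an illustrative example (not one of NVIDIA's actual browser samples) of the sort of work a short vertex program performs, written as plain C++ running on the CPU. The DirectX 8 vs.1.1 instructions a real shader would use, along with the conventional register assignments, are noted in the comments and are assumptions for the sake of illustration.

```cpp
// CPU-side walk-through of a typical short vertex program: transform a
// vertex to clip space and compute simple per-vertex diffuse lighting.
// Assumed register conventions (DX8 style): v0 = position, v3 = normal,
// c0-c3 = transposed world-view-projection matrix, c4 = light direction
// in object space, c5 = material colour.
#include <algorithm>
#include <cstdio>

struct Vec4 { float x, y, z, w; };

static float Dot3(const Vec4& a, const Vec4& b)            // dp3
{
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

static float Dot4(const Vec4& a, const Vec4& b)            // dp4
{
    return a.x * b.x + a.y * b.y + a.z * b.z + a.w * b.w;
}

// One invocation of the "shader" for a single vertex.
void VertexProgram(const Vec4& v0 /* position */, const Vec4& v3 /* normal */,
                   const Vec4 c[6], Vec4& oPos, Vec4& oD0)
{
    // dp4 oPos.x, v0, c0  ...  dp4 oPos.w, v0, c3
    oPos.x = Dot4(v0, c[0]);
    oPos.y = Dot4(v0, c[1]);
    oPos.z = Dot4(v0, c[2]);
    oPos.w = Dot4(v0, c[3]);

    // dp3 r0, v3, c4 followed by max against a zero constant: N.L clamped to 0
    float nDotL = std::max(Dot3(v3, c[4]), 0.0f);

    // mul oD0, r0, c5 ; modulate the material colour by the lighting term
    oD0.x = c[5].x * nDotL;
    oD0.y = c[5].y * nDotL;
    oD0.z = c[5].z * nDotL;
    oD0.w = c[5].w * nDotL;
}

int main()
{
    // Identity "matrix" rows, a light pointing down +z, white material.
    const Vec4 c[6] = {
        {1, 0, 0, 0}, {0, 1, 0, 0}, {0, 0, 1, 0}, {0, 0, 0, 1}, // c0-c3
        {0, 0, 1, 0},                                           // c4: light dir
        {1, 1, 1, 1},                                           // c5: material
    };
    Vec4 pos{0.5f, 0.25f, 1.0f, 1.0f}, normal{0, 0, 1, 0}, oPos, oD0;
    VertexProgram(pos, normal, c, oPos, oD0);
    std::printf("clip pos = (%g %g %g %g), diffuse = (%g %g %g %g)\n",
                oPos.x, oPos.y, oPos.z, oPos.w, oD0.x, oD0.y, oD0.z, oD0.w);
    return 0;
}
```

On the GeForce3, that same handful of instructions is executed by the nfiniteFX vertex processor for every vertex sent to the card, rather than by the host CPU.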

Unfortunately, and this is a very big caveat for early adopters of the GeForce3, it is still going to be a little while before we see DirectX 8 games hitting store shelves, and it will be even longer before the general mass of games truly takes advantage of this power. id Software's Doom 3 is still looking at an estimated 2002 release date; between now and then there will definitely be titles that take advantage of these features, but whether they are worthy of the GeForce3's $500 price tag is another question. It may almost be worth waiting another six months for NVIDIA's next card release (by which time ATI will have a DX8-compatible part similar to the GeForce3 out as well) before upgrading.

In the current crop of games that don't utilize these advanced features, the GeForce3 still performs much like a GeForce2 Ultra. This is unfortunately a reality that NVIDIA had to face with the GeForce3: it is an excellent technology, and the transition to it will have to occur at some point, but the value of an upgrade right now is questionable because of the lack of gaming titles that use it.

Luckily, NVIDIA realized this and did attempt to sweeten the deal with at least a little extra performance.
