Losing the battle without any weapons

By the end of the first quarter of 2000, the GeForce DDR had become the semi-affordable card of choice for gamers. Card manufacturers stopped producing GeForce 256 based cards that used regular SDRAM, since DDR SDRAM didn't cost much more yet the performance gains were enormous.

This posed a major problem for 3dfx. NVIDIA's flagship product was getting faster, and 3dfx still had no new products other than the aging Voodoo3, which was almost a year old at that point. The Voodoo4 and Voodoo5 cards announced at Fall Comdex 1999 had yet to appear, and there was no word on when they would. At the same time, 3dfx had to build up public desire for these products; otherwise the launch would certainly be a flop.

The major feature 3dfx was touting for this next generation of Voodoo cards was the notorious T-Buffer, with FSAA being a major part of the feature set the T-Buffer offered. Unfortunately, 3dfx ran into two major problems in convincing the public that FSAA was worth the wait.

First of all, it is very difficult to show off a feature to which the old axiom "seeing is believing" truly applies. The still screenshots posted online of the dramatic improvement FSAA could provide just didn't do it for most users, and when combined with the next problem, they made FSAA more of a laughable feature than a desired one.

The second and most difficult problem 3dfx encountered was that enabling FSAA, using the methods of the time, resulted in a 50 - 75% drop in peak fill rate, which meant performance equal to or lower than that of the Voodoo3 3000 in many situations.
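The arithmetic behind that drop is straightforward: supersampled FSAA renders multiple sub-pixels per output pixel, so the fill rate left over for visible pixels falls in direct proportion to the sample count. A back-of-the-envelope sketch (the peak fill-rate figure used here is an assumption for illustration, not a number from this article):

```python
def effective_fill_rate(peak_mtexels: float, samples: int) -> float:
    """Supersampled FSAA draws `samples` sub-pixels for every output
    pixel, so effective fill rate drops proportionally."""
    return peak_mtexels / samples

# Assumed peak fill rate in megatexels/s, purely for illustration.
peak = 667.0

print(effective_fill_rate(peak, 2))  # 2-sample FSAA: a 50% drop
print(effective_fill_rate(peak, 4))  # 4-sample FSAA: a 75% drop
```

Two samples per pixel gives the 50% figure quoted above, and four samples per pixel gives the 75% figure.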

So while NVIDIA encountered similar resistance to its boasting that hardware T&L was the way of the future, users didn't complain, since having hardware T&L didn't degrade performance at all; it was simply not used enough by developers.

In the end, the market was waiting for 3dfx to produce a competitor to NVIDIA's GeForce 256 while NVIDIA was getting ready to release its successor, the GeForce2 GTS.

April Showers

Ever since the release of the TNT2, NVIDIA had been operating on a 6-month product cycle. This was the aggressive NVIDIA that had risen from a little-known producer of the NV1 to a force to be reckoned with.

NVIDIA's goal was to release a new technology every Fall and then "refresh" it the following Spring. The refresh usually meant a smaller manufacturing process, higher clock speeds, and as many additional features as could be included without dramatically changing the core. For example, the TNT2 was the Spring refresh of the TNT released the previous Fall. And the GeForce 256, released in the Fall of 1999, was due to be "refreshed" in the Spring of 2000, and it was.

At the end of April 2000, NVIDIA launched its GeForce2 GTS, the GTS standing for Giga Texel Shader. The GTS was actually ready to go much earlier, but it didn't make sense to release it any sooner, as the GeForce DDR was still selling very strongly since 3dfx had no competing offering.

Five months after its incredible announcement at Fall Comdex 1999, 3dfx also brought its first next-generation card into public view, the same week as NVIDIA's GeForce2 GTS launch. Unfortunately for 3dfx, while the GeForce2 GTS would be available just a couple of weeks later, the only card to make it out of 3dfx's doors since the Voodoo3 3500TV would be the Voodoo5 5500, and it would not be available until June. Even worse for 3dfx, the 5500 was noticeably slower than the GeForce2 GTS.

In a surprise move by ATI, the launch week of the GTS and the 5500 concluded with the presentation of ATI's next-generation core, the Radeon, at WinHEC. While we were very skeptical of ATI's ability to deliver the Radeon core on time, we couldn't help but think that, on paper at least, the Radeon core was quite impressive.

Three months later, ATI delivered the Radeon to the public. Arriving well ahead of when we expected to see it, and catching NVIDIA off guard, ATI was back in the game.

