NV4x's Video Processor - What Happened?

When NVIDIA launched NV40, they were quick to tout a huge hunk of transistors on the chip, which they called the NV40 Video Processor. This "Video Processor" was composed of more than 20 million transistors, and NVIDIA was proud to announce that they had put more transistors into NV40's Video Processor than into the entire original GeForce 256 chip itself. NVIDIA promised quite a bit with the Video Processor: full hardware accelerated MPEG1/2 and WMV9 encoding and decoding at 1080i resolutions. This meant that our CPU video encoding tests would be a thing of the past - a slow CPU paired with any graphics card featuring NVIDIA's Video Processor would be able to handle even the most taxing video encoding without a problem. NVIDIA originally told us that they would have a driver that could take advantage of the processor two weeks after the launch of the GeForce 6800 Ultra. We even pressured NVIDIA to work on getting support for the Video Processor into the DiVX codec, since it's quite popular with our readers. The launch came and went, as did the two weeks, with nothing from NVIDIA.

I personally emailed NVIDIA every other week from May until August asking for an update, with no official or unofficial response as to why nothing had happened with the illustrious Video Processor. Finally, when 23 out of the 35 slides of the NVIDIA press presentation about the GeForce 6200 featured the GPU's "Video Processor", I had had enough. It was only then that NVIDIA came clean about the current state of the Video Processor.

The Video Processor (soon to receive a true marketing name) on the NV40 was somewhat broken: while it featured MPEG2 decode acceleration, its support for WMV9 decode acceleration was apparently not up to par with what NVIDIA had hoped for. As of the publication of this article, NVIDIA still has not answered our question of whether or not there is any hardware encoding acceleration, as was originally promised with NV40. So, the feature set of the Video Processor on NV40 (the GeForce 6800) was incomplete only in its support for WMV9 acceleration - arguably its most important feature.

NVIDIA quietly fixed the problem in the 6600GT, and since the 6200 is based on the 6600, the 6200 also features the "fixed" Video Processor with WMV9 decode acceleration support. After much explaining to NVIDIA that their credibility when it comes to the Video Processor is pretty much shot, they decided to pull the talk about the Video Processor from their launch of the 6200. As a result, you won't see any benchmarks of it here. NVIDIA is currently aiming to deliver to us a functional driver and codec that will enable the Video Processor and take advantage of its capabilities in the next month or so; given that the feature has already been on cards (in one form or another) for six months now, we're just going to have to wait and see.

There are still some unresolved issues here - mainly, clarification of what the Video Processor really can and can't do. NVIDIA is touting excellent deinterlacing and video scaling quality, both of which are important to DVD playback as well as TV playback. They are also claiming hardware assisted WMV9 decode, although they have yet to provide us with information on how much of the decoding process is actually handled by the Video Processor and how much is still software (CPU) driven. Finally, we still don't know what this thing does when it comes to encoding, but we're inclined to believe that it's far less than full-fledged GPU based video encoding.

We'll keep you updated on this topic as we get more information - and we will get more information.


44 Comments


  • Saist - Monday, October 11, 2004 - link

xsilver: I think it's because ATi has generally cared more about optimizing for DirectX, and more recently just optimizing per API. OpenGL was never really big on ATi's list of supported APIs. However, add in Doom3 and the requirement of OGL on non-Windows-based systems, and OGL is at least as important to ATi now as DirectX. How long it will take to convert that priority into performance is unknown.

Also, keep this in mind: Nvidia specifically built the GeForce architecture from the ground up to power John Carmack's 3D dream. Nvidia has specifically stated that they create their cards based on what Carmack says. Whether that is right or wrong I will leave up to you to decide, but it does very well explain the disparity between id games and other games, even under OGL.
    Reply
  • xsilver - Monday, October 11, 2004 - link

Just a conspiracy theory -- do the NV cards only perform well on the most popular/publicized games, whereas the ATI cards excel due to a better written driver / better hardware?

    Or is the FRAPS testing biasing ATI for some reason?

    Reply
  • Cygni - Monday, October 11, 2004 - link

What do you mean "who has the right games"? If you want to play Doom3, look at the Doom3 graphs. If you want to play FarCry, look at the FarCry graphs. If you want to play CoH, Madden, or Thunder 04, look at HardOCP's graphs. Every game is going to handle the cards differently. I really don't see anything wrong with AnandTech's current group of testing programs.

And Newegg now has 5 6600 non-GTs in stock, ranging in price from $148 to $175. But remember that it takes time to test and review these cards. When Anand went to get a 6600, it's very likely that that was the only card he could find. I know I couldn't find one at all a week ago.
    Reply
  • T8000 - Monday, October 11, 2004 - link

Check this out, an XFX 6600 in stock for just $143:
    http://www.gameve.com/gve/Store/ProductDetails.asp...

Furthermore, the games you pick for a review make a large difference to the conclusion. Because of that, HardOCP has the 6200 outperforming the X600 by a small margin. So, I would like to know who has the right games.

    And #2:
The X700/X800 is similar enough to the 9800 to compare them on pipelines and clock speeds. Based on that, the X700 should perform about the same.
    Reply
  • Anand Lal Shimpi - Monday, October 11, 2004 - link

    Thanks for the responses, here are some answers in no specific order:

    1) The X300 was omitted from the Video Stress Test benchmark because CS: Source was released before we could finish testing the X300, no longer giving us access to the beta. We will run the cards on the final version of CS: Source in future reviews.

    2) I apologize for the confusing conclusion, that statement was meant to follow the line before it about the X300. I've made the appropriate changes.

    3) No prob in regards to the Video Processor, I've literally been asking every week since May about this thing. I will get the full story one way or another.

    4) I am working on answering some of your questions about comparing other cards to what we've seen here. Don't worry, the comparisons are coming...

    Take care,
    Anand
    Reply
  • friedrice - Monday, October 11, 2004 - link

Here's my question: what is better, a GeForce 6800 or a GeForce 6600 GT? I wish there was like a GeForce round-up somewhere. And I saw some benchmarks that showed SLI does indeed work, but these were just run on 3DMark - does anyone know if there are any actual tests out yet on SLI?

Also, to address another issue some of you have brought up: this new line of cards beats the 9800 Pro by a huge amount, but it's not worth the upgrade. Stick with what you have until it no longer works, and right now a 9800 Pro works just fine. Of course, if you do need a new graphics card, the 6600 GT seems the way to go - if you can find someone who sells them.

Oh, and to address the pricing: nVidia only offers suggested retail prices. Vendors can up the price on parts so that they can still sell the inventory they have of older cards. In the next couple of months, we should see these new graphics cards drop to MSRP.
    Reply
  • ViRGE - Monday, October 11, 2004 - link

    #10, because it's still an MP game at the core. The AI is as dumb as rocks, and is there for the console users. Most PC users will be playing this online, not alone in SP mode.
    Reply
  • rbV5 - Monday, October 11, 2004 - link

    Thanks for the tidbit on the 6800's PVP. I'd like to see AnandTech take on a video card round-up aimed at video processing and what these cards are actually capable of. It would fit in nicely with the media software/hardware Andrew's been looking at, and let users know what to actually expect from their hardware.
    Reply
  • thebluesgnr - Monday, October 11, 2004 - link

    Buyxtremegear has the GeForce 6600 from Leadtek for $135. Gameve has 3 different cards (Sparkle, XFX, Leadtek) all under $150 for the 128MB version.

    #1,
    they're probably talking about the power consumption under full load.
    Reply
  • Sunbird - Monday, October 11, 2004 - link

    All I hope is that there's some easy way of distinguishing between the 128-bit and 64-bit versions.
    Reply
