NV4x's Video Processor - What Happened?

When NVIDIA launched NV40, they were very quick to tout a huge hunk of transistors on the chip, which they called the NV40 Video Processor. This "Video Processor" was composed of more than 20 million transistors, and NVIDIA was proud to announce that they had put more transistors into NV40's Video Processor than into the entire original GeForce 256 chip. NVIDIA promised quite a bit with the Video Processor: full hardware-accelerated MPEG-1/2 and WMV9 encoding and decoding at 1080i resolutions. The implication was that our CPU video encoding tests would be a thing of the past - a slow CPU paired with any graphics card featuring NVIDIA's Video Processor would be able to handle even the most taxing video encoding without a problem. NVIDIA originally told us that they would have a driver which could take advantage of the processor two weeks after the launch of the GeForce 6800 Ultra. We even pressured NVIDIA to work on getting support for the Video Processor into the DiVX codec, since it's quite popular with our readers. The launch came and went, as did the two weeks, with nothing from NVIDIA.

I personally emailed NVIDIA every other week from May until August asking for an update, with no official or unofficial response as to why nothing had happened with the illustrious Video Processor. Finally, when 23 out of the 35 slides of the NVIDIA press presentation about the GeForce 6200 featured the GPU's "Video Processor", I had had enough. It was only then that NVIDIA came clean about the current state of the Video Processor.

The Video Processor (soon to receive a true marketing name) on the NV40 was partially broken: MPEG-2 decode acceleration worked, but support for WMV9 decode acceleration was not up to par with what NVIDIA had hoped for. As of the publication of this article, NVIDIA still has not answered our question of whether or not there is any hardware encoding acceleration, as was originally promised with NV40. So, the feature set of the Video Processor on NV40 (the GeForce 6800) was incomplete only in its support for WMV9 acceleration - arguably its most important feature.

NVIDIA quietly fixed the problem in the 6600GT, and since the 6200 is based on the 6600, the 6200 also features the "fixed" Video Processor with WMV9 decode acceleration support. After we explained at length to NVIDIA that their credibility when it comes to the Video Processor is pretty much shot, they decided to pull the talk about the Video Processor from their launch of the 6200. As a result, you won't see any benchmarks of it here. NVIDIA is currently aiming to deliver to us a functional driver and codec that will enable the Video Processor and take advantage of its capabilities in the next month or so; given that the feature has already been on cards (in one form or another) for six months now, we're just going to have to wait and see.

There are still some unresolved issues here - mainly clarification of what the Video Processor really can and can't do. NVIDIA is touting excellent deinterlacing and video scaling quality, which are both important to DVD playback as well as TV playback. They are also claiming hardware assisted WMV9 decode, although they have yet to provide us with information on how much of the decoding process is actually handled by the video processor and how much of it is still software (CPU) driven. Finally, we still don't know what this thing does when it comes to encoding, but we're inclined to believe that it's far less than full-fledged GPU based video encoding.
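To illustrate why deinterlacing quality is worth scrutinizing, here is a minimal sketch (our own illustration, not NVIDIA's implementation) of the two classic strategies a video processor chooses between: "weave", which interleaves the two fields, and "bob", which rebuilds a frame from a single field by interpolation.

```python
def weave(top_field, bottom_field):
    """Interleave two fields into one frame. Sharp on static scenes,
    but moving objects show 'combing' artifacts."""
    frame = []
    for top_row, bottom_row in zip(top_field, bottom_field):
        frame.append(top_row)
        frame.append(bottom_row)
    return frame

def bob(field):
    """Rebuild a full frame from one field by averaging adjacent field
    lines. No combing, but vertical detail is halved."""
    frame = []
    for i, row in enumerate(field):
        frame.append(row)
        if i + 1 < len(field):
            # interpolate the missing scan line between field lines
            frame.append([(a + b) // 2 for a, b in zip(row, field[i + 1])])
        else:
            frame.append(list(row))  # duplicate the last line
    return frame

# a 4-line "frame" split into two 2-line fields of 8-bit luma samples
top = [[10, 10], [30, 30]]
bottom = [[20, 20], [40, 40]]

print(weave(top, bottom))  # [[10, 10], [20, 20], [30, 30], [40, 40]]
print(bob(top))            # [[10, 10], [20, 20], [30, 30], [30, 30]]
```

Good hardware deinterlacers blend these approaches adaptively, per pixel, based on detected motion - which is exactly the sort of quality difference NVIDIA is claiming for the Video Processor.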

We'll keep you updated on this topic as we get more information - and we will get more information.


44 Comments


  • Sunbird - Monday, October 11, 2004 - link

  • bpt8056 - Monday, October 11, 2004 - link

    Anand, thanks so much for updating us on the PVP feature in the NV40. I think it's high time somebody held nVidia accountable for a "broken" feature. Do you know if the PVP is working in the PCI-Express version (NV45)? Any information you can get would be great. Thanks Anand!
  • mczak - Monday, October 11, 2004 - link

    That's an odd conclusion... "In most cases, the GeForce 6200 does significantly outperform the X300 and X600 Pro, its target competitors from ATI."
    But looking at the results, the X600 Pro is _faster_ in 5 of 8 benchmarks (sometimes significantly), 2 are a draw, and it's slower in only 1 (Doom III, by a significant margin). Not to disregard Doom III, but if you base your conclusion entirely on that game alone, why do you even bother with the other titles?
    I just can't see why that alone justifies "...overall, the 6200 takes the crown".

    There are some other odd comments as well, for instance at the Star Wars Battlefront performance: "The X300SE is basically too slow to play this game. There's nothing more to it. The X300 doesn't make it much better either." Compared to the 6200 which gets "An OK performer;..." but is actually (very slightly) slower than the X300?
  • gordon151 - Monday, October 11, 2004 - link

    "In most cases, the GeForce 6200 does significantly outperform the X300 and X600 Pro, its target competitors from ATI."

    Eh, am I missing something, or wasn't the X600 Pro the card that significantly outperformed the 6200 in almost all areas, with the exception of Doom 3?
  • dragonic - Monday, October 11, 2004 - link

    #6 Why would they drop it because the multiplayer framerate is locked? They benchmark using single player, not multiplayer.
  • DAPUNISHER - Monday, October 11, 2004 - link

    Thanks Anand! I've been on about the PVP problems with nV40 for months now, and have become increasingly frustrated with the lack of information and/or progress by nV. Now that a major site is pursuing this with vigor, I can at least take comfort in the knowledge that answers will be forthcoming one way or another!

    Again, thanks for making this issue a priority and emphatically stating you will get more information for us. It's nV vs Anand so "Rumble young man! Rumble!" :-)
  • AlphaFox - Monday, October 11, 2004 - link

    if you ask me, all these low-end cards are stupid if you have a PCIe motherboard.. who the heck would get one of these crappy cards if they spent all the money for a brand new PCIe computer??? these cards would be perfect for AGP, as they are now going to start to be lower end..
  • ROcHE - Monday, October 11, 2004 - link

    How would a 9800 Pro do against these cards?
  • ViRGE - Monday, October 11, 2004 - link

    Unless LucasArts changes something, Anand, you may want to drop the Battlefront test. With multiplayer, the framerate is locked to the tick rate (usually 20FPS), so its performance is nearly irrelevant.

    PS #1, he's talking about the full load graph, not the idle graph
  • teng029 - Monday, October 11, 2004 - link

    "For example, the GeForce 6600 is supposed to have a street price of $149, but currently, it's selling for closer to $170. So, as the pricing changes, so does our recommendation."

    i have yet to see the 6600 anywhere. Pricewatch only lists two AOpen cards (both well over $200.00), and Newegg doesn't carry it. i'm curious as to where he got the $170.00 street price.
