NV4x's Video Processor - What Happened?

When NVIDIA launched NV40, they were very quick to tout a huge hunk of transistors on the chip, which they called the NV40 Video Processor. This "Video Processor" was composed of more than 20 million transistors, and NVIDIA was proud to announce that they put more transistors into NV40's Video Processor than they did into the entire original GeForce 256 chip itself. NVIDIA promised quite a bit with the Video Processor: full hardware-accelerated MPEG-1/2 and WMV9 encoding and decoding at 1080i resolutions. What it meant was that our CPU video encoding tests would be a thing of the past - a slow CPU paired with any graphics card featuring NVIDIA's Video Processor would be able to handle even the most taxing video encoding without a problem.

NVIDIA originally told us that they would have a driver that could take advantage of the processor two weeks after the launch of the GeForce 6800 Ultra. We even pressured NVIDIA to work on getting support for the Video Processor into the DivX codec, since it's quite popular with our readers. The launch came and went, as did the two weeks, with nothing from NVIDIA.

I personally emailed NVIDIA every other week from May until August asking for an update, with no official or unofficial response as to why nothing had happened with the illustrious Video Processor. Finally, when 23 out of the 35 slides of the NVIDIA press presentation about the GeForce 6200 featured the GPU's "Video Processor", I had had enough. It was only then that NVIDIA came clean about the current state of the Video Processor.

The Video Processor (soon to receive a true marketing name) on the NV40 was somewhat broken; although it featured MPEG-2 decode acceleration, support for WMV9 decode acceleration was apparently not up to par with what NVIDIA had hoped for. As of the publication of this article, NVIDIA still has not answered our question of whether or not there is any hardware encoding acceleration, as was originally promised with NV40. So the feature set of the Video Processor on NV40 (the GeForce 6800) was incomplete only in its support for WMV9 acceleration - arguably its most important feature.

NVIDIA quietly fixed the problem in the 6600GT, and since the 6200 is based on the 6600, the 6200 also features the "fixed" Video Processor with WMV9 decode acceleration support. After we explained to NVIDIA that their credibility when it comes to the Video Processor is pretty much shot, they decided to pull all talk of the Video Processor from their launch of the 6200. As a result, you won't see any benchmarks of it here. NVIDIA is currently aiming to get us a functional driver and codec that will enable the Video Processor and take advantage of its capabilities in the next month or so; given that the feature has already been on cards (in one form or another) for six months now, we're just going to have to wait and see.

There are still some unresolved issues here - mainly clarification of what the Video Processor really can and can't do. NVIDIA is touting excellent deinterlacing and video scaling quality, both of which are important to DVD as well as TV playback. They are also claiming hardware-assisted WMV9 decode, although they have yet to provide us with information on how much of the decoding process is actually handled by the Video Processor and how much of it is still software (CPU) driven. Finally, we still don't know what this thing does when it comes to encoding, but we're inclined to believe that it's far less than full-fledged GPU-based video encoding.
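
One rough way to probe the "how much is offloaded" question yourself is to compare overall CPU utilization during 1080i WMV9 playback with hardware acceleration enabled and then disabled in your player. The snippet below is only a sketch of that idea, not part of our test methodology; it assumes a Python environment with the third-party psutil package installed, and the 60-second sampling window is an arbitrary choice.

    # Rough sketch (assumption: Python with the third-party psutil package installed).
    # Start WMV9 playback in your player, run this, note the average CPU load, then
    # repeat with hardware acceleration toggled off; the gap between the two runs is
    # a crude proxy for how much of the decode work the GPU actually takes on.
    import time
    import psutil

    def average_cpu_percent(duration_s: float = 60.0, interval_s: float = 1.0) -> float:
        """Sample system-wide CPU utilization for duration_s seconds and return the mean."""
        samples = []
        deadline = time.monotonic() + duration_s
        while time.monotonic() < deadline:
            # cpu_percent(interval=...) blocks for the interval and reports utilization over it
            samples.append(psutil.cpu_percent(interval=interval_s))
        return sum(samples) / len(samples)

    if __name__ == "__main__":
        print(f"Average CPU utilization during playback: {average_cpu_percent():.1f}%")

A large drop in average CPU load with acceleration enabled would suggest real hardware offload; little or no drop would point to a mostly software decode path.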

We'll keep you updated on this topic as we get more information - and we will get more information.

44 Comments

  • nvdm24 - Sunday, December 19, 2004 - link

    Many of the readers of these tech sites want to know the full capabilities of the cards, yet, sadly, reviewers at AnandTech and every other tech site ignore the video capabilities of video cards. Even in the reviews for the new 6600 AGP, the video aspect has not been tested by any reviewer despite the problems of the 6800. Never mind the fact that EVERY review of these cards is about the 3D aspect and is nearly the exact same - run Halo, Doom 3, HL2, etc. and list the performance - yet no tests of DVD movies or the video aspect are conducted, thus doing a HUGE disservice to readers.
  • nserra - Thursday, December 16, 2004 - link

    I don't understand why in your previous 6200 review the X300 wins, loses (Doom 3), and keeps up, but now a much worse 6200 wins over the X300. How the hell did that happen, new NVIDIA drivers?
  • IntelUser2000 - Thursday, October 14, 2004 - link

    Surprisingly, my 865G with Intel Extreme Graphics 2 can run the Doom 3 beta at default settings. It still crashes, but when it does run, I get barely playable frame rates - I'd say around 20 at the highest and less than 10 at the lowest. I think the GMA 900 should be much better, but maybe its DX9 support really sucks.
  • nserra - Wednesday, October 13, 2004 - link

    #39 Thanks for the answer, but...

    Don't two cards cost more than one?
    And what's the difference between two 6600GTs vs. one 6800GT in price and performance?

    I think this kind of "edge" could come in the future like the Voodoo2 did: the card was getting old, people were getting rid of it, and "some" got them cheap just to keep their PC going for as long as they could.
  • Confusednewbie1552 - Tuesday, October 12, 2004 - link

    #30

    Everyone wants the 6600GT because they are cheap and two of them can be put into SLI mode (once nForce4 comes out), which could mean better performance than the X700, and maybe even the X800.
  • PrinceGaz - Tuesday, October 12, 2004 - link

    I'm sure the core of the 6600 will overclock very well, but the memory all depends on the particular chips used and might not have any real headroom. That could be its main problem: it's an 8-pipe 300MHz core, so there's plenty of power there, but only 128-bit 500MHz (effective) memory, which is probably what's holding it back. If that's the case, then overclocking the core may not help very much.

    It's a pity no attempt to overclock was performed in the review, but then again the results from overclocking cards sent out by the manufacturer are always suspect, as they could have hand-picked the best.
  • thebluesgnr - Tuesday, October 12, 2004 - link

    " I can't see how the 6200 could have a street-price of $149 (128-bit) and $129 (64-bit). "

    It's actually $129 for the 128MB 128-bit version and $149 for the 256MB 128-bit version. The 64-bit version (only 128MB) should have an MSRP of $100, according to the Inquirer.

    So nVidia has:
    $100 6200 128MB 64-bit
    $130 6200 128MB 128-bit
    $150 6200 256MB 128-bit
    $150 6600 128MB 128-bit
    $200 6600GT 128MB 128-bit

    In my opinion, ATI beats all of nVidia's cards except at $200, where the 6600GT wins. But we can't forget that the 6600 has great overclocking potential, and street prices should be lower than the X700's because of the slower memory.
    As already mentioned, you can find the 6600 for $135 already.
  • mkruer - Tuesday, October 12, 2004 - link

    To X700 XT or to 9800 Pro, that is the question
  • neo229 - Tuesday, October 12, 2004 - link

    I also wish to thank you for keeping up the fight to unravel the mystery behind the mysterious Video Processor. The notion of that feature really got me excited when I first heard about it, yet site after site after site reviewed these cards without even touching on the subject.
