NV4x's Video Processor - What Happened?

When NVIDIA launched NV40, they were very quick to tout a huge hunk of transistors on the chip, which they called the NV40 Video Processor. This "Video Processor" was composed of more than 20 million transistors, and NVIDIA was proud to announce that they put more transistors into NV40's Video Processor than into the entire original GeForce 256 chip. NVIDIA promised quite a bit with the Video Processor: full hardware-accelerated MPEG-1/2 and WMV9 encoding and decoding at 1080i resolutions. What that meant was that our CPU video encoding tests would be a thing of the past - a slow CPU paired with any graphics card featuring NVIDIA's Video Processor would be able to handle even the most taxing video encoding without a problem. NVIDIA originally told us that they would have a driver that could take advantage of the processor two weeks after the launch of the GeForce 6800 Ultra. We even pressured NVIDIA to work on getting support for the Video Processor into the DivX codec, since it's quite popular with our readers. The launch came and went, as did the two weeks, with nothing from NVIDIA.

I personally emailed NVIDIA every other week from May until August asking for an update, with no official or unofficial response as to why nothing had happened with the illustrious Video Processor. Finally, when 23 out of the 35 slides of the NVIDIA press presentation about the GeForce 6200 featured the GPU's "Video Processor", I had had enough. It was only then that NVIDIA came clean about the current state of the Video Processor.

The Video Processor (soon to receive a true marketing name) on the NV40 was somewhat broken; it featured MPEG-2 decode acceleration, but support for WMV9 decode acceleration was not up to par with what NVIDIA had hoped for. As of the publication of this article, NVIDIA still has not answered our questions of whether or not there is any hardware encoding acceleration, as was originally promised with NV40. So the feature set of the Video Processor on NV40 (the GeForce 6800) was incomplete in only one area: its support for WMV9 acceleration, arguably the most important feature of it.

NVIDIA quietly fixed the problem in the 6600GT, and since the 6200 is based on the 6600, the 6200 also features the "fixed" Video Processor with WMV9 decode acceleration support. After much explaining to NVIDIA that their credibility when it comes to the Video Processor is pretty much shot, they decided to pull the talk about the Video Processor from their launch of the 6200. As a result, you won't see any benchmarks of it here. NVIDIA is currently aiming to deliver to us a functional driver and codec that will enable the Video Processor and take advantage of its capabilities in the next month or so; given that the feature has already been on cards (in one form or another) for six months now, we're just going to have to wait and see.

There are still some unresolved issues here - mainly clarification of what the Video Processor really can and can't do. NVIDIA is touting excellent deinterlacing and video scaling quality, both of which are important to DVD as well as TV playback. They are also claiming hardware-assisted WMV9 decode, although they have yet to provide us with information on how much of the decoding process is actually handled by the video processor and how much of it is still software (CPU) driven. Finally, we still don't know what this thing does when it comes to encoding, but we're inclined to believe that it's far less than full-fledged GPU-based video encoding.
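To frame that open question of how much decode work stays on the CPU, here is a back-of-the-envelope model. The pipeline stages are standard for MPEG-2/WMV9-class codecs, but the per-stage CPU-share fractions and the offload split are hypothetical placeholders for illustration, not NVIDIA-published figures:

```python
# Back-of-the-envelope model of partial hardware decode offload.
# Stage names are standard decode pipeline stages; the CPU-share
# fractions are hypothetical, not measured or vendor-published.
STAGES = {
    "bitstream_parse":     0.15,
    "entropy_decode":      0.25,
    "idct":                0.20,
    "motion_compensation": 0.30,
    "deblock_and_output":  0.10,
}

def cpu_load_remaining(offloaded):
    """Fraction of the original CPU decode cost still done in software
    when the listed stages run on the video processor."""
    return sum(share for stage, share in STAGES.items()
               if stage not in offloaded)

# If the hardware only handled iDCT and motion compensation (a common
# partial-acceleration split), the CPU would still do half the work:
remaining = cpu_load_remaining({"idct", "motion_compensation"})
print(f"CPU share remaining: {remaining:.0%}")  # 50%
```

The point of the sketch: "hardware assisted" can mean anything from offloading a couple of stages to a full-pipeline decode, which is exactly the detail NVIDIA has yet to provide.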

We'll keep you updated on this topic as we get more information - and we will get more information.


44 Comments


  • PrinceGaz - Tuesday, October 12, 2004 - link

    I'm assuming the 6200 you tested was a 128-bit version? You don't seem to mention it at all in the review, but I doubt nVidia would send you a 64-bit model unless they wanted to do badly in the benchmarks :)

    I don't think the X700 has appeared on an AT review before, only the X700 XT. Did you underclock your XT, or have you got hold of a standard X700? I trust those X700 results aren't from the X700 XT at full speed! :)

    As #11 and #12 mentioned, with the exception of Doom 3, the X600 Pro is faster than the 6200:

    Doom 3 - 39.3 60.1 (-35%)
    HL2 Stress Test - 91 76 (+20%)
    SW Battlefront - 45 33 (+36%)
    Sims 2 - 33.9 32.2 (+5%)
    UT2004 (1024x768) - 46.3 37 (+25%) [they were CPU limited at lower resolutions]
    BF Vietnam - 81 77 (+5%)
    Halo - 45.2 44 (+3%)
    Far Cry - 74.7 60.6 (+23%)

    So the X600 Pro is slower than the 6200 (128-bit) in Doom 3 by a significant amount, but it's marginally faster in three games, and it's significantly faster than the 6200 in the other three games and also the HL2 Stress Test. So that makes the X600 Pro the better card.

    The X700 absolutely thrashed even the 6600, let alone the 6200, in every game except of course Doom 3 where the 6600 was faster, and Halo where the X700 was a bit faster than the 6600 but not by such a large amount.

    Given the prices of the ATI cards - X300SE ($75), X300 ($100), X600 Pro ($130), X700 (MSRP $149) - the 6600 is going to have to be priced at under its MSRP of $149 because of the far superior X700 at the same price point. Let's say a maximum of $130 for the 6600.

    If that's the case, I can't see how the 6200 could have a street price of $149 (128-bit) and $129 (64-bit). How can the 6200 (128-bit) even have the same price as the faster 6600 anyway? It's also outperformed by the $130 X600 Pro, which makes a $149 price ridiculous. I think the 6200 will have to be priced more like the X300 and X300SE - $100 and $75 for the 128-bit and 64-bit versions respectively - if they are to be successful.

    Maybe most 6200s will end up being cheap 64-bit cards that are sold to people who aren't really bothered about gaming, or who mistakenly believe the amount of memory is the most important factor. You just have to look at how many 64-bit FX5200s are sold.
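For what it's worth, the percentage deltas quoted in the comment above check out. A quick sketch reproducing them from the listed frame rates (first number X600 Pro, second number 6200 128-bit):

```python
# Reproduce the X600 Pro vs GeForce 6200 (128-bit) percentage deltas
# from the frame rates quoted in the comment above.
scores = {
    "Doom 3":          (39.3, 60.1),
    "HL2 Stress Test": (91,   76),
    "SW Battlefront":  (45,   33),
    "Sims 2":          (33.9, 32.2),
    "UT2004":          (46.3, 37),
    "BF Vietnam":      (81,   77),
    "Halo":            (45.2, 44),
    "Far Cry":         (74.7, 60.6),
}

def delta_pct(x600, gf6200):
    """X600 Pro advantage relative to the 6200, in percent (negative = slower)."""
    return round((x600 - gf6200) / gf6200 * 100)

for game, (a, b) in scores.items():
    print(f"{game}: {delta_pct(a, b):+d}%")
```

Every figure matches the commenter's list, including the -35% Doom 3 deficit.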
  • Shinei - Tuesday, October 12, 2004 - link

    The P.T. Barnum theory, wilburpan. There's a sucker born every minute, and if they're willing to drop $60 for a 64-bit version of a card when they could have had a 128-bit version, so much the better for profits. The FX5200 continues to be one of the best-selling AGP cards on the market, despite the fact that it's worse than a Ti4200 at playing games, let alone DX9 games.
  • wilburpan - Tuesday, October 12, 2004 - link

    "The first thing to notice here is that the 6200 supports either a 64-bit or 128-bit memory bus, and as far as NVIDIA is concerned, they are not going to be distinguishing cards equipped with either a 64-bit or 128-bit memory configuration."

    This really bothers me a lot. If I knew there were two versions of this card, I definitely would want to know which version I was buying.

    What would be the rationale for such a policy?
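On the 64-bit vs 128-bit question above: halving the bus width halves peak memory bandwidth, which is why the distinction matters so much on a budget card. A quick sketch - the 275 MHz DDR memory clock here is an assumed figure for illustration, not a confirmed 6200 specification:

```python
def peak_bandwidth_gbs(bus_width_bits, mem_clock_mhz, ddr=True):
    """Peak memory bandwidth in GB/s: bytes per transfer x transfers per second."""
    transfers_per_sec = mem_clock_mhz * 1e6 * (2 if ddr else 1)  # DDR moves data twice per clock
    return (bus_width_bits / 8) * transfers_per_sec / 1e9

# Hypothetical 275 MHz DDR memory on both bus widths:
print(peak_bandwidth_gbs(128, 275))  # 8.8 GB/s
print(peak_bandwidth_gbs(64, 275))   # 4.4 GB/s
```

With identical GPU and memory clocks, the 64-bit board simply starves the chip of half its bandwidth, so unlabeled 64-bit and 128-bit cards will benchmark very differently.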
  • nserra - Tuesday, October 12, 2004 - link

    Why do you all keep talking about buying the GeForce 6600 cards when the X700 was the clear winner?
    Do you all want to buy the worse-performing card? I don't understand.

    Why doesn't AnandTech use 3DMark05?

    No doubt my 9700 was a magnificent buy almost two years ago. What a cheat the GeForce FX line of cards was....
    Why didn't they use one (a 5600/5700) just to see...

    Even 4-pipeline ATI cards can keep up with 8-pipe NVIDIA cards, gee what a mess... old tech, yeah right.
  • coldpower27 - Tuesday, October 12, 2004 - link

    I am very happy you included Sims 2 in your benchmark suite :)

    I think this game likes the number of vertex processors on the X700, plus its advantage in fillrate and memory bandwidth. Could you please test Sims 2 on the high-end cards from both vendors when you can? :P
  • jediknight - Tuesday, October 12, 2004 - link

    What I'm wondering is... how do previous-generation top-of-the-line cards stack up to current-gen mainstream cards?
  • AnonymouseUser - Tuesday, October 12, 2004 - link

    Saist, you are an idiot.

    "OpenGl was never really big on ATi's list of supported API's... However, adding in Doom3, and the requirement of OGL on non-Windows-based systems, and OGL is at least as important to ATi now as DirectX."

    Quake 3, RtCW, HL, CS, CoD, SW:KotOR, Serious Sam (1&2), Painkiller, etc., are all OpenGL games. Why would they ONLY NOW want to optimize for OpenGL?
  • Avalon - Monday, October 11, 2004 - link

    Nice review of the budget sector. It's good to see a review from you again, Anand :)
  • Bonesdad - Monday, October 11, 2004 - link

    Affordable gaming??? Not until the 6600GT AGPs come out... affordable is not replacing your mobo, CPU, and video card...
