A Better HTPC Card: MPEG-4 ASP Decoding & LPCM Audio

Along with the addition of DirectX 10.1 functionality, the latest members of NVIDIA’s GPU lineup have picked up a couple of new tricks specifically geared towards HTPC use.

The first of these is a newer video decoding engine. Officially NVIDIA is moving away from the VP* moniker, but for the time being we’re sticking to it as we don’t have a better way of easily differentiating the feature sets of various video decoding engines. NVIDIA’s vendors are calling this VP4, and so are we.

Successive VPs have focused on adding support for additional video formats. VP2 brought full H.264 decoding, and VP3 (which never made it into a GTX 200 series part) added VC-1 decoding. For VP4, NVIDIA has added support for full decoding of MPEG-4 Advanced Simple Profile, better known as DivX or XviD. With this addition, NVIDIA can now offload the decoding of most of the MPEG formats; the only thing not supported is MPEG-1, which as the oldest of these codecs is trivial to decode on a CPU anyhow.

To be frank, we’re a bit puzzled by this latest addition. By no means are we unhappy (we’ll always take more acceleration!), but MPEG-4 ASP isn’t particularly hard to decode. Even an underclocked Nehalem with only a single core (and no HT) enabled can handle HD-resolution MPEG-4 ASP with ease; never mind what even a low-end dual-core Pentium or Celeron can do. This would be a good match for the Atom, but those almost always use integrated graphics (and Ion isn’t slated to get VP4 any time soon). So while this addition is nice to have, it’s not the kind of game changer that adding H.264 and VC-1 were.

The unfortunate news here is that while the hardware is ready, the software is not, and this is something that caught us off-guard since these parts have been going to OEMs since July. NVIDIA has yet to enable MPEG-4 ASP acceleration in their drivers, and won’t be doing so until the release 195 drivers. So at this point we can’t even tell you how well this feature works. We’re not pleased with this, but we’re also not particularly broken up about it since as we just mentioned the cost of CPU decoding isn’t very much in the first place.

On a final note with video decoding, one of NVIDIA’s marketing pushes with this launch is touting the fact that they have been working with Adobe to bring video decode acceleration to Adobe Flash 10.1, and that the GT 220/G 210 series are well suited for this. This is going to be absolutely fantastic to have since Flash Video is a CPU-hog, but Flash 10.1 is still 6 months (or more) away from being released. More to the point, as far as we know this is being implemented via DXVA, which means everyone else will get acceleration too. And notably, this is only for H.264, as VP6 (the older Flash Video codec) is not supported in hardware on any card.

Moving on, the other new HTPC feature is that NVIDIA has finally stepped up their game with respect to HDMI audio on cards with discrete GPUs. Gone is the S/PDIF cable connecting the card to an audio codec, which means NVIDIA is no longer limited to 2-channel LPCM or 5.1-channel DD/DTS for audio. Now they are passing audio over the PCIe bus, which gives them the ability to support additional formats: 8-channel LPCM is in, as are the lossy formats DD+ and 6-channel AAC. However Dolby TrueHD and DTS-HD Master Audio bitstreaming are not supported, so it's not quite the perfect HTPC card. Lossless audio is possible through LPCM, but there won't be any lossless audio bitstreaming.
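Some back-of-the-envelope math shows why passing audio over PCIe matters here. The sketch below (our own figures, using standard Blu-ray audio parameters rather than anything from NVIDIA) computes raw LPCM bitrates; multi-channel high-resolution LPCM is far beyond what a 2-channel S/PDIF link can carry.

```python
# Rough bandwidth math showing why 8-channel LPCM needs more than S/PDIF offers.
# The audio parameters below are standard Blu-ray figures, assumed for illustration.

def lpcm_bitrate_mbps(channels, bit_depth, sample_rate_hz):
    """Raw (uncompressed) LPCM bitrate in megabits per second."""
    return channels * bit_depth * sample_rate_hz / 1_000_000

# 2-channel, 16-bit, 48kHz: comfortably within S/PDIF's capacity.
spdif_stereo = lpcm_bitrate_mbps(2, 16, 48_000)

# 8-channel, 24-bit, 96kHz Blu-ray LPCM: an order of magnitude more data,
# hence the move to carrying audio over the PCIe bus to the HDMI output.
bluray_lpcm = lpcm_bitrate_mbps(8, 24, 96_000)

print(f"2.0 LPCM: {spdif_stereo:.3f} Mbps")  # 1.536 Mbps
print(f"7.1 LPCM: {bluray_lpcm:.3f} Mbps")   # 18.432 Mbps
```

The same arithmetic also explains why lossless audio over LPCM works at all: once the card can move ~18Mbps of decoded PCM, the player can decode TrueHD/DTS-HD on the CPU and send the result as plain LPCM, even though bitstreaming the compressed formats isn't supported.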

Finally, we’re still waiting to see someone do a passively cooled design for the GT 220. The power usage is low enough that it should be possible with a dual-slot heatsink, but the only cards we’ve seen thus far are actively cooled single-slot solutions with the heatsink sticking out some.

DirectX 10.1 on an NVIDIA GPU? Palit’s GT 220 Sonic Edition

79 Comments


  • abs0lut3 - Tuesday, October 13, 2009 - link

    When is the GT 240 coming out, and when are you going to review it? I had expected the GT 220 to be as low as it comes (reaaallly low end); however, I saw some preliminary reviews on other forums of the GT 240, the supposedly new Nvidia 40nm mainstream card with GDDR5, and I'm quite fascinated with the results.
  • MegaSteve - Tuesday, October 20, 2009 - link

    No one is going to buy one of these cards by choice - they are going to be thrown out in HP, Dell and Acer PCs under a pretty sticker saying they have POWERFUL GRAPHICS or some other garbage. Much the same as them providing 6600 graphics cards instead of 6600GTs; then again, I would probably rather have a 6600GT, because if the DirectX 10 cards that were first released were any indication, this thing will suck. I am sure this thing will play Blu-ray...
  • Deanjo - Tuesday, October 13, 2009 - link

    "NVIDIA has yet to enable MPEG-4 ASP acceleration in their drivers"

    Not true, they have not enabled it in their Windows drivers.

    It has been enabled in the Linux drivers for a little while now.

    ftp://download.nvidia.com/XFree86/Linux-x86_64/190...

    VDP_DECODER_PROFILE_MPEG4_PART2_SP, VDP_DECODER_PROFILE_MPEG4_PART2_ASP, VDP_DECODER_PROFILE_DIVX4_QMOBILE, VDP_DECODER_PROFILE_DIVX4_MOBILE, VDP_DECODER_PROFILE_DIVX4_HOME_THEATER, VDP_DECODER_PROFILE_DIVX4_HD_1080P, VDP_DECODER_PROFILE_DIVX5_QMOBILE, VDP_DECODER_PROFILE_DIVX5_MOBILE, VDP_DECODER_PROFILE_DIVX5_HOME_THEATER, VDP_DECODER_PROFILE_DIVX5_HD_1080P

    * Complete acceleration.
    * Minimum width or height: 3 macroblocks (48 pixels).
    * Maximum width or height: 128 macroblocks (2048 pixels).
    * Maximum macroblocks: 8192
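    The macroblock limits quoted above pin down exactly which resolutions the decoder will accept. A quick sanity check (our own arithmetic, assuming the standard 16x16-pixel MPEG-4 Part 2 macroblock, not anything from NVIDIA's docs):

```python
# Check whether a frame fits the VDPAU decoder limits quoted above:
# 48-2048 pixels per dimension, and at most 8192 macroblocks total.
import math

MB = 16  # MPEG-4 Part 2 macroblocks are 16x16 pixels

def fits_vdpau_limits(width, height, max_mbs=8192, min_px=48, max_px=2048):
    """Return True if a frame fits the quoted decoder surface limits."""
    mbs = math.ceil(width / MB) * math.ceil(height / MB)
    dims_ok = all(min_px <= d <= max_px for d in (width, height))
    return dims_ok and mbs <= max_mbs

# 1080p: 120 x 68 = 8160 macroblocks, just under the 8192 cap
print(fits_vdpau_limits(1920, 1080))  # True
# 2048x1152 would need 128 x 72 = 9216 macroblocks, over the cap
print(fits_vdpau_limits(2048, 1152))  # False
```

    So 1080p MPEG-4 ASP squeaks in with only 32 macroblocks to spare, which matches the "HD_1080P" naming of the DivX profiles in the list.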
  • Deanjo - Tuesday, October 13, 2009 - link

    I should also mention that XBMC already supports this in Linux as well.
  • Transisto - Tuesday, October 13, 2009 - link

    zzzzzzzzzzzzzzzzz...............
  • Souleet - Monday, October 12, 2009 - link

    I guess the only place that's actually selling Palit right now is Newegg. http://www.newegg.com/Product/ProductList.aspx?Sub...
  • MODEL3 - Monday, October 12, 2009 - link

    Great prices, lol (either they have old 55nm stock or the 40nm yields are bad or they are crazy, possibly the first)

    Some minor corrections:

    G 210 ROPs should be 4, not 8 (8 should be the texture units; the GT 220 should have 8 ROPs and 16 texture units)

    http://www.tomshardware.co.uk/geforce-gt-220,revie...

    (Not because Tom's Hardware is saying so, but because otherwise it doesn't make sense for NV architects to have designed such a bandwidth-limited GPU, and based on past architecture design logic)

    G 210 standard config GPU core clock is 589MHz, shaders 1402MHz.

    (check Nvidia's partner sites)

    9600GSO (G94) Memory Bus Width is 256bit not 128bit.

    http://www.nvidia.com/object/product_geforce_9600_...

    58W should be the figure NV is giving for when the GT 220 is paired with GDDR3; with DDR3 the power consumption should be a lot less.

    Example for GDDR3 vs DDR3 power consumption:

    http://www.techpowerup.com/reviews/Palit/GeForce_G...
    http://www.techpowerup.com/reviews/Zotac/GeForce_G...
  • Souleet - Monday, October 12, 2009 - link

    I'm sure there is a cooling solution, but it's probably going to hurt your wallet. I love ATI, but they need to fire their marketing team and hire some more creative people. Nvidia needs to stop underestimating ATI and crush them; right now they are just giving ATI a chance to steal some market share back.
  • Zool - Monday, October 12, 2009 - link

    It's 40nm and has only 48 SPs and 8 ROPs/16 TMUs, and still only a 1360MHz shader clock. Is TSMC's 40nm this bad or what? The 55nm 128SP GTS 250 has 1800MHz shaders.
    Could you please try out some overclocking?
  • Ryan Smith - Tuesday, October 13, 2009 - link

    We've seen vendor overclocked cards as high as 720MHz core, 1566MHz shader, so the manufacturing process isn't the problem. There are specific power and thermal limits NVIDIA wanted to hit, which is why it's clocked where it is.
