After installing VLC 1.1.0, I was surprised to find that Blu-ray sample clips continued to stutter during playback. I then realized that GPU acceleration is disabled by default. The option is tucked away in the Preferences window, accessible through the Tools menu.

Experimental GPU acceleration option in VLC 1.1.0 (Tools > Preferences)
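For scripted testing, the same checkbox can also be flipped from the command line. The sketch below is a rough illustration only: the boolean switch name (--ffmpeg-hw), the install path, and the clip filename are assumptions, not values confirmed here.

```python
# Hypothetical launcher for VLC 1.1.x with the experimental GPU decode switch.
# Assumption: the 1.1.x ffmpeg/avcodec decoder module exposes the setting as
# the boolean "--ffmpeg-hw" option; the GUI checkbox toggles the same thing.
import subprocess

VLC = r"C:\Program Files\VideoLAN\VLC\vlc.exe"   # adjust to the local install

def play(clip, gpu=True):
    """Start VLC on a clip; 'vlc://quit' makes it exit when playback ends."""
    args = [VLC, "--ffmpeg-hw"] if gpu else [VLC]
    return subprocess.Popen(args + [clip, "vlc://quit"])

if __name__ == "__main__":
    play("sample_clip_L4.1.mkv", gpu=True).wait()   # placeholder filename
```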

The three graphs below show the maximum CPU usage recorded during playback, with and without GPU acceleration (X-axis), for each of the 8 files listed in the previous section (Y-axis). Completely unwatchable videos have no corresponding entry. Most of the videos that hit 100% utilization were still watchable, apart from a few stutters and dropped frames.
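For reference, the peak-utilization numbers behind graphs of this kind can be reproduced with a simple sampling loop. This is a minimal sketch, assuming the third-party psutil package and a player process launched as in the earlier snippet; usage is normalized by core count so that 100% means all cores busy.

```python
import subprocess
import psutil   # third-party: pip install psutil

def max_cpu_during_playback(player: subprocess.Popen, interval: float = 0.5) -> float:
    """Sample the player's CPU usage until it exits; return the peak percentage."""
    proc = psutil.Process(player.pid)
    ncores = psutil.cpu_count(logical=True)
    peak = 0.0
    proc.cpu_percent(None)                # prime the counter; first reading is meaningless
    while player.poll() is None:          # keep sampling while the player is running
        try:
            peak = max(peak, proc.cpu_percent(interval) / ncores)
        except psutil.NoSuchProcess:      # playback ended between checks
            break
    return peak
```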

A quick look at the graph for the Intel i5-430M below shows that VLC's GPU acceleration for H.264 is a complete failure on this platform: upon initializing any H.264 stream, the screen turned completely green. VC-1 decode acceleration, on the other hand, is not broken. CPU usage is lower with acceleration turned on, but not by much. When contacted with these details, VLC developer Jean-Baptiste Kempf indicated that the issue was quite simple, and was confident that the code would work as soon as the development team had access to an Intel box.

Moving on to Nvidia's PureVideo VP2 decoder in the Quadro FX 2700M, we find that both of the L4.1 H.264 streams were accelerated without issues. However, L5.1 videos with more than 4 reference frames were rendered unwatchable by extensive artefacting, even though CPU usage remained low. The same graph also shows that VC-1 videos aren't accelerated as well as H.264: the VP2 decoder doesn't provide VLD acceleration for VC-1, only IDCT. VLC manages to make some use of the IDCT acceleration, but the results are, unsurprisingly, not as good as what VLD would achieve.
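To see in advance whether a clip falls into this problem category, the stream's H.264 level and reference-frame count can be read out with ffprobe. A rough sketch follows; the clip name is a placeholder, and the thresholds simply mirror the L4.1 / 4-reference-frame limits described above.

```python
# Sketch: spot clips likely to trip up the VP2 decoder before playback.
# Assumes ffprobe (from FFmpeg) is on the PATH; "level" and "refs" come from
# its JSON stream output (level 41 = L4.1, 51 = L5.1).
import json
import subprocess

def h264_stream_info(clip: str):
    out = subprocess.run(
        ["ffprobe", "-v", "quiet", "-print_format", "json",
         "-show_streams", "-select_streams", "v:0", clip],
        capture_output=True, text=True, check=True)
    stream = json.loads(out.stdout)["streams"][0]
    return stream.get("level"), stream.get("refs")

level, refs = h264_stream_info("sample_clip.mkv")   # placeholder filename
if (level or 0) > 41 or (refs or 0) > 4:
    print(f"Beyond VP2's L4.1 / 4-ref-frame limit: level={level}, refs={refs}")
```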

The GeForce G210M has Nvidia's latest PureVideo VP4 decoder (which even supports acceleration for MPEG-4 / DivX, though we are not testing those here). We observe that both H.264 and VC-1 are accelerated as expected, but the L5.1 streams still have issues. Jean-Baptiste Kempf believes the L5.1 problem could stem from Nvidia's drivers as well as from VLC's code; a fix is expected once a bug report with a sample file is filed.

Comments

  • ganeshts - Friday, June 25, 2010 - link

    Software able to use multiple cores is better off, as you rightly observe.

    However, we just saw how VLC had to use 100% of the CPU to decode some of the HD videos.

    For the non-technology folks, VLC is probably the only media player installed by whoever set up the computer for them. If they try to watch HD videos, they are going to hit issues with CPU usage. VLC's GPU acceleration is meant for people like them :)
  • fabarati - Friday, June 25, 2010 - link

    Using the ffdshow tryouts that came with CCCP (from 2008, at that), I could decode almost all 720p files at 1.2 GHz (on a 2.4 GHz T7700). At 2.4 GHz I could take on any 1080p file, including a massive 24 GB rip of The Godfather.

    Now I have it set up to run DXVA first, then a fairly recent FFdshow tryout, with the h.264 decoder on MT.
  • fabarati - Friday, June 25, 2010 - link

    (no Edit)

    When I was using CoreAVC (1.9.0), I could take on nearly any 1080p file at 1.6 GHz, and that beastly Godfather rip at 2 GHz.
  • 0roo0roo - Friday, June 25, 2010 - link

    Well, no. Once again, if you install VLC on a n00b's computer, they are simply not likely to run a raw Blu-ray rip. They are neither going to download nor know how to get their hands on such a file of massive proportions. Any web rip, YouTube-type 1080p file, or Apple trailer is FAR easier to play back than Blu-ray. Playing those back on even an older dual core is easy as pie.

    That being said, something's wrong with the 100% usage figure. Even with software players, based on AnandTech's own 2006 article, a 2.6 GHz E6700 could play back Blu-ray in software. And that was back when chips were slower and software decoders far less efficient. Or perhaps VLC is just not that efficient at Blu-ray.
  • ganeshts - Friday, June 25, 2010 - link

    High-def is becoming more and more popular. A n00b will probably get MKVs from friends on a USB drive and expect to play them back a couple of years down the line (maybe even right now :) )

    The 100% usage is because VLC is pretty crappy and uses a single-threaded implementation to decode when GPU acceleration is not enabled. Also, the streams are pretty taxing (16 reference frames, 60 frames per second, and so on).

    Also, I should mention that having 'trouble' with CPU decoding might mean dropped frames, stutters, sudden spikes in CPU usage, the CPU fan kicking in, etc.
  • 0roo0roo - Friday, June 25, 2010 - link

    Yeah, but those MKVs are at a fraction of the original Blu-ray's bitrate, and thus easier to play. At most they are 12 GB or so, and those are rarer; the more common 1-DVD or 2-DVD sized rips play easily even on a 2 GHz Core 2. I know, I've tried this before. And even that is really not common usage for a true n00b, who will at best run Apple trailers, and those don't use VLC.
  • ganeshts - Friday, June 25, 2010 - link

    It is not just the bit rate, right? Actually, more than the bit rate, it is other encoding characteristics, such as the number of reference frames, that are the issue on a PC. (On a PMP with hardware acceleration, it is the other way round.) Most PCs have more than enough bandwidth to handle high bit rate scenes, but the CPU-intensive calculations are what cause the spikes in CPU usage and the stutters. (I am trying to find some information about which part of the decode process consumes the most time in CPU-based decoding; let me know if you find anything relevant!)

    VLC's development motto, I feel, is that they should be able to play back anything and everything perfectly. From that viewpoint, it makes sense for them to develop GPU accelerated playback, though their primary target audience might not make use of it :)
  • 0roo0roo - Friday, June 25, 2010 - link

    Stuff like global hotkeys / multimedia keyboard support has sat broken in the interface for years now while they play with other stuff. It's just kind of annoying.
  • legoman666 - Friday, June 25, 2010 - link

    Can you compare the HD decoding performance of VLC, WMP, MPC-HC and other software? I'd do it myself, but I have an ATI card ;)
  • mindbomb - Friday, June 25, 2010 - link

    I can tell you what the results would be.
    WMP 12 would have the lowest CPU usage, followed closely by MPC-HC, and VLC would be in last place by a large margin.
