Flash acceleration has traditionally worked without issues with AMD and NVIDIA drivers, unlike Intel's. Intel and Adobe got it right with Ivy Bridge. Fortunately, things look good with Trinity too. As the screenshot below indicates, we have full GPU acceleration for both decoding and rendering. AMD's System Monitor shows how the CPU and GPU resources are balanced when playing H.264 Flash videos.

Netflix streaming, on the other hand, uses Microsoft's Silverlight technology. Unlike Flash, hardware acceleration for the video decode process is not controlled by the user; it is up to the server-side code to attempt GPU acceleration. Thankfully, Netflix does try to take advantage of the GPU's capabilities.

This is evident from the A/V stats recorded while streaming a Netflix HD video at the maximum possible bitrate of 3.7 Mbps. The high GPU usage in the AMD System Monitor also points to hardware acceleration being utilized.

One point which deserves mention is that Flash and Silverlight acceleration works without hiccups here, unlike what we saw on the Brazos-based machines (where the CPU was too weak despite the availability of hardware acceleration through the GPU).


  • ganeshts - Thursday, September 27, 2012 - link

    Hmmm.. all vendors tag 23.976 Hz as 23 Hz in the monitor / GPU control panel settings. So, when I set the panel to 23 Hz, I am actually expecting 23.976 Hz. However, this platform gives me 23.977 Hz, which is a departure from the usually accurate AMD cards I have seen so far. Reply
  • ChronoReverse - Thursday, September 27, 2012 - link

    23.977 and 23.976 are so close that it's basically the same (the error in measuring tools would be as large as the difference). I'd only be concerned if it were 23.970.

    In any case, from looking at the screenshots in the gallery, the only frequency looking rather off is 60Hz (although my AMD card has always given similar lower than 60Hz results anyway).
    Reply
  • ganeshts - Thursday, September 27, 2012 - link

    Note that these are in Hz, not MHz, so the margin for error is quite large. In fact, madVR's statistics deliver refresh rates accurate to six decimal places (as the screenshots show).

    To read more on why the 0.001 Hz difference matters for SOME people, look this up: http://www.anandtech.com/show/4380/discrete-htpc-g...

    In short, with the 0.001 Hz difference, the renderer might need to repeat a frame every ~17 minutes. I am NOT saying that this is a serious issue for everyone, but there are some readers who do care about this (as evidenced by the range of opinions expressed in this thread: http://www.avsforum.com/t/1333324/lets-set-this-st...
    Reply
  • ChronoReverse - Thursday, September 27, 2012 - link

    That thread on avsforum is talking about 24 fps playback, where getting 23.97x instead means a stutter roughly every 42 seconds, which is terrible and clearly not acceptable (to my eye anyway).

    Still, I do admit that even a single stutter every 17 minutes is noticeable.

    Also, I had misread that part of the review a bit; for some reason I had the impression it was saying AMD's accuracy had diminished, when it's still about the same, +/- 0.002 Hz.
    Reply
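The frame-repeat intervals discussed in this thread (roughly 17 minutes for the 0.001 Hz mismatch, roughly 42 seconds for 24 fps content on a 23.976 Hz display) follow from one piece of arithmetic: the renderer must repeat or drop a frame each time the display and content clocks drift apart by one full frame. A minimal sketch of that calculation (the helper name is illustrative, not from the review):

```python
def frame_repeat_interval_s(content_fps: float, refresh_hz: float) -> float:
    """Seconds between repeated/dropped frames for a given rate mismatch.

    The two clocks drift apart at |refresh_hz - content_fps| frames per
    second, so one full frame of drift accumulates every 1 / |diff| seconds.
    """
    return 1.0 / abs(refresh_hz - content_fps)

# 23.976 fps content on the 23.977 Hz output measured in the review:
print(frame_repeat_interval_s(23.976, 23.977) / 60)  # ~16.7 minutes

# 24 fps content on a 23.976 Hz display, the avsforum scenario:
print(frame_repeat_interval_s(24.000, 23.976))       # ~41.7 seconds
```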
  • jeremyshaw - Thursday, September 27, 2012 - link

    Wasn't AMD's first APU Brazos, not Llano? Or was it too small to really count!? Reply
  • ganeshts - Thursday, September 27, 2012 - link

    Technically correct, but it didn't compete at the same level as the Clarkdale / Arrandale / Sandy Bridge lineup :) Reply
  • jamawass - Thursday, September 27, 2012 - link

    "The video industry is pushing 4K and it makes more sense to a lot of people compared to the 3D push. 4K will see a much faster rate of adoption compared to 3D, but Trinity seems to have missed the boat here. AMD's Southern Islands as well as NVIDIA's Kepler GPUs support 4K output over HDMI, but Trinity doesn't have 4K video decode acceleration or 4K display output over HDMI."
    Although this statement is technically correct, it has no real-world relevance. At this time, people who can afford 4K TVs (if there are any commercially available ones at this time) won't be messing around with cheap HTPCs. It's an inconsequential statement made just to detract from AMD's overall superiority with this product in the HTPC market.
    If I were in AMD's shoes, why would I dedicate resources to a nonexistent market? Has anyone actually tested NVIDIA's or Intel's 4K output over HDMI to see whether it actually works? In the early days of HDCP, all the video card manufacturers were claiming compliance, but real-world compatibility was a different matter.

    Reply
  • ganeshts - Thursday, September 27, 2012 - link

    I had the same caveat in the Ivy Bridge HTPC review. Surprised you didn't notice that, but you noticed this :) Ivy Bridge doesn't support 4K over HDMI yet.

    Anyway, yes, we have tested 4K output from both NVIDIA and AMD. When the AMD 7750 was released, we didn't have access to a 4K display, but things changed when the GT 640 was released:

    http://www.anandtech.com/show/5969/zotac-geforce-g...

    I don't have a sample image ready for the 7750, but I can assure you that it works as well as NVIDIA's; I have personally tested it. In fact, AMD was the first to support 4K output over HDMI.
    Reply
  • JNo - Saturday, September 29, 2012 - link

    More importantly, do you have any 4K films to watch? No. Will you in the immediate future? No. Even then, when will *most* new films coming out be available in 4K? Probably in 5 years' time, when you'd build a new HTPC anyway.

    The 4K thing is absolutely irrelevant at this point (unlike 3D, I'd argue, because you can go into plenty of shops and buy actual 3D media).

    After Hi-Def came out, hardware (TVs) was available quickly, but it took a *long* time before there was plenty of 1080p material (note the use of the word 'plenty'). Hell, most people I know are still watching stuff in SD. Laughably, 4K isn't even close to being out yet, let alone the content.

    The whole thing's a red herring right now and for a long while.
    Reply
  • Cotita - Thursday, September 27, 2012 - link

    I'm not sure I'd go for an A10.

    Even an A4-3420 would do pretty much the same.

    Heck, if I don't care about HD Flash or Silverlight, even an E-350 is enough.
    Reply
