8-channel LPCM over HDMI

You may have heard that I've recently become somewhat infatuated with HTPCs. I've been hammering on all of the AnandTech staffers to start looking at the needs of HTPC enthusiasts, and I've personally been on a bit of a quest to find the perfect HTPC components.

Both Blu-ray and HD-DVD support Dolby TrueHD and DTS-HD audio encoding, which offer discrete 8-channel audio output. The problem is that there's currently no way to send a TrueHD or DTS-HD encoded stream from a PC over HDMI to a receiver; the stream must be decoded on the PC. Cyberlink's PowerDVD will decode these high definition audio formats into 8-channel LPCM just as well as any receiver, but you then need support for sending 8-channel LPCM over HDMI.

Most graphics cards that implement HDMI simply pass SPDIF from the motherboard's audio codec over HDMI, which unfortunately only provides enough bandwidth for 2-channel LPCM or a 6-channel compressed Dolby Digital/DTS bitstream. Chipsets with integrated graphics such as NVIDIA's GeForce 8200 and Intel's G35 will output 8-channel LPCM over HDMI, but AMD's 780G will not.
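For the curious, the back-of-the-envelope math makes the bandwidth gap clear. Below is a quick sketch of the raw bit rates involved (the S/PDIF ceiling shown is approximate and implementation dependent):

```python
# Back-of-the-envelope LPCM bit rates: why S/PDIF runs out of room.
def lpcm_mbps(channels, bits, sample_rate_hz):
    """Raw (uncompressed) LPCM bit rate in megabits per second."""
    return channels * bits * sample_rate_hz / 1e6

# Consumer S/PDIF tops out around a 2-channel, 24-bit, 192 kHz stream.
spdif_ceiling = lpcm_mbps(2, 24, 192_000)   # ~9.2 Mbps

# 8-channel, 24-bit, 96 kHz LPCM, typical for decoded TrueHD/DTS-HD MA.
eight_channel = lpcm_mbps(8, 24, 96_000)    # ~18.4 Mbps

# Compressed Dolby Digital (<= 0.64 Mbps) and DTS (<= 1.5 Mbps) fit under
# the S/PDIF ceiling with ease; 8-channel LPCM simply does not.
print(f"S/PDIF ceiling: {spdif_ceiling:.1f} Mbps")
print(f"8-ch LPCM:      {eight_channel:.1f} Mbps")
```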

All of AMD's Radeon HD graphics cards have shipped with their own audio codec, but the Radeon HD 4800 series finally adds support for 8-channel LPCM output over HDMI. This is a huge deal for HTPC enthusiasts, because you can now output 8-channel audio over HDMI with a motherboard-agnostic solution. We still don't have support for bitstreaming TrueHD/DTS-HD MA, and most likely won't get it from a GPU alone anytime this year, but there are some other solutions in the works for 2008.

To use the 8-channel LPCM output, simply configure your media player to decode all audio streams and output them as 8-channel audio. HDMI output is possible courtesy of a DVI-to-HDMI adapter bundled with the card: AMD sends the audio data over the DVI interface, and the adapter carries it out over HDMI.
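If you want to sanity check that your OS actually exposes an 8-channel endpoint on the HDMI output, enumerating audio devices is enough. Here's a minimal sketch using the third-party PyAudio library, which is purely our illustrative assumption and not anything AMD or Cyberlink ships:

```python
# Minimal sketch: list audio output devices and their channel counts to
# confirm the HDMI endpoint exposes 8 channels. Assumes the third-party
# PyAudio package (pip install pyaudio); device names vary by system.
import pyaudio

p = pyaudio.PyAudio()
for i in range(p.get_device_count()):
    info = p.get_device_info_by_index(i)
    if info["maxOutputChannels"] > 0:
        print(f"{info['name']}: {info['maxOutputChannels']} output channels")
p.terminate()
```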

114 Comments

  • Sunrise089 - Friday, June 20, 2008 - link

Derek should really clarify the source of the problem then, Jarred. We all know that on the forums everyone says you need a 600 watt PSU just to run integrated graphics, but one reason I love AT's real power draw numbers is that they show how little power most sane systems really need. But casually mentioning that a 1kW unit isn't enough for even 4850 CF, without explaining further, is about as close to pure FUD as I've seen here.
  • DerekWilson - Friday, June 20, 2008 - link

    all these tests have been done at Anand's place and at-the-wall power should not be a problem for any of these recent articles.

    we did have problems with our 1kW thermaltake and our 1kW ocz PSUs with the GTX 280 in SLI. we couldn't get through a crysis run.

    in testing 4850 crossfire, the 1kW ocz power supply (elite xtreme) failed during call of duty.

    we had no problems with the 1200 W pcp&c turbo cool PSU we now have installed.

our peak power numbers were measured using one of 3dmark's GPU-only feature tests. this is in order to isolate GPU power as much as possible for comparison purposes between different graphics cards.

power draw at the wall will be MUCH larger when playing an actual game. this is because the CPU will be under load, system memory will likely be hit harder, and the hard disk will be active as well.

i do apologize for not explaining it further. knowing what app we used to test power would probably have done enough to explain why the PSU crashed under game tests but not under our power test with a 1kW PSU ...

4850 crossfire and up, and gt200 sli and up, need absolutely massive amounts of power to run. we would be the first to say that a 1kW PSU was enough if it were -- but it is not.
  • semo - Saturday, June 21, 2008 - link

so how much are you drawing at the wall? just saying "MUCH larger" doesn't mean anything.

this also doesn't make much difference, as power supply ratings refer to how much can be delivered to the system - not how much can be pulled from the socket (see the worked example after this thread).

in other words, there seems to be some confusion. could we get some clarification the next time you review GPUs (e.g. at the 4870's launch)?
  • flagpole - Saturday, June 21, 2008 - link

I have a 650W Silverstone Zeus ST650ZF powering my system right now, and it's handling a pair of 4850s in Crossfire fine.

Not to mention the 4 hard drives, 5x 120mm fans, a Swiftech water pump, an AMD Athlon 64 X2 4400+ @ 2.7 GHz, plus various other things like LEDs and cathode tubes sucking back power as well.
  • HOOfan 1 - Friday, June 20, 2008 - link

How about the fact that NVIDIA has 2 CWT-built 1000W units certified on SLIzone for dual GTX 280?

It really perplexes me that you guys think a 1kW PSU wouldn't be enough for GTX 280 SLI or for 4850 Crossfire. An 800+ watt PSU should be enough for either. NVIDIA even certified the Zalman 850 watt for dual 9800GX2s. Jonnyguru stated that there was a problem specific to the GTX 280 that was not the fault of the PSUs.

I think you guys really ought to have a talk with NVIDIA and ATI about this before you claim that a 1kW PSU isn't enough for dual GPUs with these two cards... because quite honestly that claim sounds rather preposterous to me.
  • strikeback03 - Friday, June 20, 2008 - link

    I was wondering the same - the review says they had power supply problems with 2 4850s in CF, even though the table directly above says that configuration drew 335.7W total system power.
  • Sunrise089 - Thursday, June 19, 2008 - link

Why the heck are you guys having power supply failures with this card? I know it draws a decent amount of power, but when your load numbers are less than HALF the rating of the power supply, something seems fishy.
  • BPB - Thursday, June 19, 2008 - link

I thought these cards were supposed to be better than current ATI cards for HD movies. Did you get a chance to play any movies? And if so, how was the audio?
  • jay401 - Thursday, June 19, 2008 - link

75C idle and 90C load is insane; I don't care how well the components can tolerate it. It's like an oven inside your case, and -something- will give eventually because those temps are nuts. Why does AMD/ATI have such trouble putting out reasonably cool-running cards, even after yet another die shrink? :(
  • Clauzii - Saturday, June 21, 2008 - link

They used the die shrink to ramp up performance, which they needed AND achieved :)

I hope some Arctic Cooling solution will show up, even though it might take up two slots.
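To untangle the rating-versus-wall-draw question running through the thread above: a power supply's wattage rating describes the DC output it can deliver, while our published figures are AC draw measured at the wall, so the DC load is roughly the wall figure multiplied by the PSU's efficiency. A quick worked example, with the 80% efficiency below being an assumed, illustrative number:

```python
# Quick sketch: relate at-the-wall draw to the DC load the PSU delivers.
# The efficiency figure is an illustrative assumption, not a measurement.
def dc_load_watts(wall_draw_w, efficiency):
    """PSU ratings describe DC output; the meter reads AC at the wall."""
    return wall_draw_w * efficiency

wall_draw = 335.7   # total system draw at the wall, from the review's table
efficiency = 0.80   # assumed PSU efficiency under this load

print(f"~{dc_load_watts(wall_draw, efficiency):.0f} W DC from a "
      f"{wall_draw} W wall draw at {efficiency:.0%} efficiency")
# => ~269 W DC, well under a 1 kW rating, which is why average draw
#    alone cannot explain the failures discussed above.
```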
