8-channel LPCM over HDMI

You may have heard that I've recently become somewhat infatuated with HTPCs. I've been hammering on all of the AnandTech staffers to start looking at the needs of HTPC enthusiasts, and I've personally been on a bit of a quest to find the perfect HTPC components.

Blu-ray and HD DVD both support Dolby TrueHD and DTS-HD Master Audio encoding, which offer up to eight discrete channels of audio. The problem is that there's currently no way to send a TrueHD or DTS-HD encoded bitstream from a PC over HDMI to a receiver; the stream must be decoded on the PC. CyberLink's PowerDVD will decode these high definition audio formats into 8-channel LPCM just as well as any receiver, but you need hardware support for sending 8-channel LPCM over HDMI.

Most graphics cards that implement HDMI simply pass S/PDIF audio from the motherboard's codec over HDMI, which unfortunately provides only enough bandwidth for 2-channel LPCM or 6-channel encoded Dolby Digital/DTS audio. Chipsets with integrated graphics such as NVIDIA's GeForce 8200 and Intel's G35 will output 8-channel LPCM over HDMI, but AMD's 780G will not.
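
The bandwidth math explains the limitation. Here's a quick back-of-the-envelope sketch (the figures are approximate; exact S/PDIF and HDMI audio ceilings vary by implementation):

    # Rough LPCM bandwidth math behind the S/PDIF limitation.
    # All figures are approximate, for illustration only.

    def lpcm_mbps(channels, sample_rate_hz, bit_depth):
        """Raw LPCM payload bitrate in megabits per second."""
        return channels * sample_rate_hz * bit_depth / 1e6

    # S/PDIF carries a 2-channel LPCM frame; even at 24-bit/192kHz that is
    # only ~9.2 Mbps, and the frame format is hard-wired for two subframes.
    # Compressed Dolby Digital (~0.64 Mbps) or DTS (~1.5 Mbps) fits inside
    # that 2-channel frame, which is how 6-channel encoded audio squeezes
    # through S/PDIF. Uncompressed 8-channel LPCM does not:
    print(f"S/PDIF ceiling, 2ch 24/192: {lpcm_mbps(2, 192_000, 24):5.1f} Mbps")
    print(f"8ch 24/48 LPCM:             {lpcm_mbps(8,  48_000, 24):5.1f} Mbps")
    print(f"8ch 24/96 LPCM:             {lpcm_mbps(8,  96_000, 24):5.1f} Mbps")
    print(f"8ch 24/192 LPCM (HDMI max): {lpcm_mbps(8, 192_000, 24):5.1f} Mbps")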

All of AMD's Radeon HD graphics cards have shipped with their own audio codec, but the Radeon HD 4800 series finally adds support for 8-channel LPCM output over HDMI. This is a huge deal for HTPC enthusiasts, because now you can output 8-channel audio over HDMI in a motherboard-agnostic solution. We still don't have support for bitstreaming TrueHD/DTS-HD MA, and most likely won't get it from a GPU alone anytime this year, but there are some other solutions in the works for 2008.

To use the 8-channel LPCM output, simply configure your media player to decode all audio streams and output them as 8-channel audio. HDMI output is possible courtesy of a DVI-to-HDMI adapter bundled with the card; AMD sends audio data over the DVI interface, which the adapter then carries over HDMI.
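
If you want to verify that all eight channels actually survive the trip to your receiver, a quick test file helps. Below is a minimal sketch (our own illustration, not an AMD or CyberLink tool) that uses Python's standard wave module to write an 8-channel WAV with a tone stepping through each speaker; the channel order follows the common 7.1 WAV convention, and your player may map channels differently:

    import math
    import struct
    import wave

    # Writes an 8-channel test WAV that plays a short tone on one speaker at
    # a time, so you can confirm all eight channels make it out over HDMI.
    CHANNELS = ["front-L", "front-R", "center", "LFE",
                "back-L", "back-R", "side-L", "side-R"]
    RATE, TONE_HZ, SECONDS = 48_000, 440, 1

    with wave.open("8ch_check.wav", "wb") as wav:
        wav.setnchannels(len(CHANNELS))
        wav.setsampwidth(2)        # 16-bit samples
        wav.setframerate(RATE)
        frames = bytearray()
        for active in range(len(CHANNELS)):       # one channel at a time
            for n in range(RATE * SECONDS):
                tone = int(12000 * math.sin(2 * math.pi * TONE_HZ * n / RATE))
                for ch in range(len(CHANNELS)):   # interleave tone/silence
                    frames += struct.pack("<h", tone if ch == active else 0)
        wav.writeframes(bytes(frames))

    print("Wrote 8ch_check.wav; the tone should move through:",
          ", ".join(CHANNELS))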

114 Comments

  • Final Destination II - Sunday, June 22, 2008

    Well done!
    Good to see not everyone here is asleep :)
  • Wwhat - Saturday, June 21, 2008

    I noticed BioShock wasn't tested with AA at all, so I'm curious: what's the status on that? From what I read, you could initially enable AA on ATI by renaming the executable to oblivion.exe, and AA was a DX9-only feature, but from what I gather DX10.1 should support AA in DX10 games; that's part of the .1 improvements, I thought.
    And I heard nvidia was working on a driver trick to enable AA in it back when BioShock came out.
    So I'm wondering: what's the real-life status of such issues in DX10 games?
  • Griswold - Saturday, June 21, 2008

    How come you need a 1200W PSU when this German guy at

    http://www.forumdeluxx.de/forum/showthread.php?t=5...

    runs them in CF with a 750W PSU, and even triple CF with 2x 4850 and a 3870?

    Piece of shit PSU on your test bench?
  • geogaddi - Saturday, June 21, 2008


    That, AND my 328xi gets 28 mpg. Those ever-efficient Germans...

    'ze goggles! zey do nothing!'
  • Glock24 - Saturday, June 21, 2008

    Radeon HD4850
    956 million transistors on 55nm fabrication process
    40 texture units
  • Straputsky - Friday, June 20, 2008

    I think there's a difference in strategy. nVidia produces one large chip, and that's their high-end product. AMD uses smaller chips but puts two of them on one board. There's no need for a CrossFire-capable motherboard, because it's only a single card. nVidia can't do that with their GT200 (due to its huge die size and power consumption). That means if nVidia wants to counter, they have to use their G92 chip. I think having CrossFire/SLI realized on one board instead of two is a different thing, given the motherboard restrictions.
    The next thing is cost: it seems that producing two chips on one board is cheaper than one big chip. And it should be easier to cool two smaller hotspots than one big one.
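
    To put rough numbers on that two-small-dies-versus-one-big-die point (a back-of-the-envelope sketch; the defect density, wafer cost, and die sizes below are assumptions for illustration, not actual foundry figures):

        import math

        # Back-of-the-envelope: one big die vs. two small ones.
        # All inputs are assumed values, not real foundry data.
        WAFER_DIAMETER_MM = 300
        WAFER_COST_USD = 5000        # assumed processed-wafer cost
        DEFECTS_PER_CM2 = 0.2        # assumed defect density

        def cost_per_good_die(die_area_mm2):
            """Per-die cost using a simple Poisson yield model."""
            wafer_area = math.pi * (WAFER_DIAMETER_MM / 2) ** 2
            gross_dies = int(0.9 * wafer_area / die_area_mm2)   # ~10% edge loss
            yield_rate = math.exp(-DEFECTS_PER_CM2 * die_area_mm2 / 100)
            return WAFER_COST_USD / (gross_dies * yield_rate)

        big = cost_per_good_die(576)    # roughly GT200-sized
        small = cost_per_good_die(256)  # roughly RV770-sized

        # Yield falls roughly exponentially with area, so even paying for two
        # small dies (plus a dual-GPU board) can undercut one monolithic chip.
        print(f"one 576mm^2 die:  ${big:.0f}")
        print(f"two 256mm^2 dies: ${2 * small:.0f}")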
  • TheJian - Saturday, June 21, 2008

    Nvidia can't do it today, correct. But the die shrink has already taped out and should be out in 2 months or so, just about the same time AMD does it with the 4870 X2, which should make a GTX 280 X2 doable.

    For the next two months you DO need CF for the 4870 (which isn't even out yet to begin with) or the 4850. Also, these two-in-one cards have driver issues, as shown by many sites. Much of the time you're better off getting 100% out of a single chip than trying to get the same out of two chips. No driver tricks etc. are needed with a single large chip. No timing issues or stuttering either with a single chip.

    They can just replace the current GX2's chips with these new + chips to speed up the current GX2. That should produce a nice boost, and no new work is even needed; just swap chips and the design is done. I'm sure they're already working on it, or maybe it's even done and just waiting for the need after the 4870. A die shrink on the GX2 should make a good match for the 4870. The same in two months on the GTX 280 (the shrink will cheapen it up, speed it up, and allow a dual GTX 280) should easily dominate any 4870 X2 (even a single one will be problematic; two just gives nvidia serious bragging rights).

    I worry AMD didn't quite do enough AGAIN (Phenom, mumble grumble). They can't take much more in losses before we end up with Intel boning us on CPUs and Nvidia boning us on video cards. I'd hate to see the low end go to crap as it used to be. If AMD dies, nvidia will spread the high and low end further apart again and force us to pay more. Right now $100-200 gets you a damn fine gaming experience. A little overclocking and these cards are superb for the money. I guess we can always hope Intel might actually put out a decent card if AMD dies. Cost would not have been a problem for Nvidia at 55nm, so that's going away in 2 months. It would have made nvidia late to the party, so from a stockholder's standpoint it's a great idea as long as you don't lose money on each card. Limp along for 2 months until the die shrink, then make your money from back-to-school/Xmas, all while holding AMD down. Barcelona is a prime example of why AMD should have just taken the simple route, glued 2 chips together, and put a quad in the market a year early. Heck, even Hector admits this now. With the 4850 at $170 I wonder how much AMD is making on these. But hey, get 'em while they're hot :) That's a good deal.
  • Final Destination II - Saturday, June 21, 2008

    Your crystal-ball future-foretelling skills are astonishing. Maybe we've got us an Nvidia insider here?

    I doubt it...

    Nvidia (living up to its name) will do its best to stop the HD 4850, but this time ATI has the Borg card: resistance is futile. It's an overall nicer, faster, cheaper package they've assembled.

    Plus, I won't just sit here and talk - I will buy one (or the HD 4870 - gotta wait a bit till that thing is benchmarked).
  • TheJian - Saturday, June 21, 2008

    No crystal ball, just common sense. No nvidia insider either. The tapeout of the die-shrunk GT200 has already been reported elsewhere; I'm not forecasting anything. It's widely known you can have a card in hand 12 weeks from tapeout. What's surprising about Nvidia wanting to shrink a freakishly huge, expensive chip? If you need a crystal ball to predict what I said, you should just give it up. What's "overall nicer"? Faster? The 4850 is probably faster than a 9800GTX (but I wonder about minimums - see below). Cheaper? Not by much after the price cut and the 9800GTX+ announcement.

    http://anandtech.com/video/showdoc.aspx?i=3338
    Even AMD sees the 4850 competing against the 8800, NOT the 9800GTX, which is reserved for the 4870 in their slide. Now the GTX is priced competitively with the 4850, raining on AMD's slide. The 4870 is expected to be $299 or so; it's now competing against a $229 9800GTX+ which got a pretty healthy performance boost. No crystal ball is needed to see tough times for the $299 4870 vs. the $229 9800GTX+. Just as tough for the 4850 vs. the 9800GTX now. Is AMD going to revamp that slide? AMD KNEW the performance of all these cards (except maybe the GTX 280/260) before making that slide. Expect another price cut once all the 9800GTXs are gone and every one is a die-shrunk GTX+ version, allowing much cheaper pricing. AMD isn't going to become profitable at these prices, especially not when nvidia is narrowing the price gap on everything in the span of 2 months. With the GX2 going for $415 after rebate at Newegg, and a die shrink undoubtedly coming shortly, how much will it cost then? Nvidia cut the GTX from $269 to $200. That's $70, and it only has one chip to shrink. What will the GX2 get with 2 chips shrinking? Even if you just take $70 off, that puts it REALLY close to a single $299 4870. Which would you take? The GX2+ will walk away with that one. I could see myself upgrading from an 8800GT OC to a GX2+ for $280-300 at Xmas this year :) I figure it will debut at $340-350 and should easily hit $300 by Xmas.

    Anand's 4850 vs 9800GTX results show:
    Crysis=GTX
    COD4=4850
    ETQW=4850
    Assassin's Creed=4850
    Oblivion=GTX
    Witcher=TIE
    Bioshock=4850
    Those are just averages, though. When you look at the minimum fps, things change a bit.
    http://www.pcper.com/article.php?aid=580&type=...
    Anand shows the 4850 at 43 fps vs. the GTX at 40.7 at 2560x1600. PCPer shows the GTX winning minimums at 2048x1536 and up; the GTX+ does it from above 1600x1200. I'm interested in seeing HardOCP/PCPer's extensive 4850 minimums now. PCPer has only done 2 games so far.

    Do you need a crystal ball to predict everything will shrink and get cheaper? Punch "GT200 die shrink taped out" into Google; it was reported in May. It's only 2 months out. The GX2+ just needs to swap the old 65nm chips for the already-shrunk GTX+ chips; nothing special there. Why is this confusing to you? You don't think the price will drop from $415 when the 4870 arrives? With 2 chips shrinking by 100mm² or so each? Both will be running faster also. OUCH. The 4870 is a tough sell vs. a FASTER/CHEAPER GX2+. As it stood when the 4850 reviews hit, AMD looked pretty good. But cheaper + models with more speed across ALL of nvidia's lineup in 2 months sucks for AMD. Even if Nvidia takes an extra month on all this, it still sucks for AMD. Nvidia will be in time for back-to-school/Xmas, which pretty much makes either side's fiscal year (heck, Intel's too for that matter... H2 is where the money is made). Nothing I said is news; I just repeated it. The GT200 will drop from 576mm² to 400mm². Quite a savings in cost/heat/watts, and it allows cranking it up more. AMD can't afford another price war. They just started nosing around AGAIN for more money from Dubai last week. They're already $5B in debt. They're expected to show a FY09 loss AGAIN. Can they survive another 1.5 years without profit?
  • Miggle - Tuesday, June 24, 2008

    The 4850 can be had for as low as $170. That's cheaper than the new price on the 9800GTX. You've confirmed yourself that the 4850 is generally faster than the 9800GTX (except for the min fps at uber-high resolutions that most wouldn't/couldn't play at). The new price on the 9800GTX hasn't been implemented yet, as far as I know. AMD wins here.

    Then we have the GTX+, which has been well received in available reviews and is a bit faster than the 4850, priced at $230. I've read somewhere that a GDDR5 version of the 4850 will be released for $230. It boasts a much faster memory speed, but the core would be clocked the same. Which would be the better buy, we've yet to find out.

    Finally, we have the 4870, which AMD has placed against the GTX. But knowing that even the GTX+ is about on par with the 4850, I wouldn't agree with AMD. The 4870 should be about 20-30% faster than the current 4850 (based on released numbers from unconfirmed sources) and will (hopefully) have dual-slot cooling, so it's in a league of its own once released. We've yet to see how NV will respond to this one.
