Significance on Desktop

The first question that should pop into your head right now is why we would need HDMI on the PC when DVI already does the same physical job – particularly considering how few people actually use DVI instead of analog connections! The answer is, again, copy protection. If we take a step back and look at the larger plans for PCs and media devices in general, the trend becomes obvious: the PC is taking on an integral role as an entire entertainment system, with considerable weight on Media Center, DVDs, etc. For large content providers like Viacom, Starz! and Discovery Channel to get on board with Microsoft’s dream of IP TV, media center “servers” and set top boxes running stripped-down PC hardware, security inevitably comes under scrutiny as well. No major content provider would consider the Media Center vision if they didn’t feel that their content would be secure from piracy on MCE PCs.

The weakest links come down to the user’s ability to transcode on-demand media on the PC into something more portable, and the user’s ability to digitally rip the signal off the DVI interface! With Intel’s HDCP tied so tightly into the HDMI specification, manufacturers and content providers would be insane not to push HDMI out the door to replace DVI. HDMI’s other perks are still there: it’s a smaller cable, it can run longer distances without issues, and, obviously, it can carry audio as well. However, when a tier 1 OEM decides to build its next HTPC, it will certainly come under considerable scrutiny to provide a secure platform if it expects backing from the content providers. The fact that HDMI protects both video and audio signaling is enough for content providers to lean on PC manufacturers to adopt the standard over DVI.

Audio poses a fairly large problem for PC manufacturers. While it’s easy for an IGP motherboard to include audio and video on the same interface, graphics cards are designed only for video. At first, graphics cards and motherboards that adopt HDMI will probably opt out of carrying audio over HDMI, as most HDMI-ready devices accept analog stereo input (just as DVI devices do). If we think more long term, however, fusing audio and video onto the same output puts ATI and NVIDIA at particular odds with discrete and integrated audio partners. After all, Intel just released its 8-channel digital audio solution, and companies like Creative and VIA have a significant portion of their business riding on the fact that separate connections are needed for audio and video. Will we see graphics and audio manufacturers work together to consolidate audio and video back down onto the graphics card? Unfortunately, the PC industry doesn’t have an answer to that question just yet.

Where does this leave DVI? In the PC industry, DVI is just getting off the ground as a replacement for the ancient 15-pin D-sub analog cables that we have all been using on CRT monitors. The everyday home user has no real need for an HDCP-compliant HDMI LCD panel connected to their computer, although with the backing of a player like Microsoft, it won’t be very long before HDMI starts showing up anyway. In the home theater industry, HDMI is already here and quickly gaining a lot of momentum. DVI won’t disappear overnight in the living room, but you can surely bet that the content providers would love to remove their weakest link in digital copy protection in the near future. Not surprisingly, the FCC just mandated that all digital-cable-ready TVs sold after July 2005 must have DVI-HDCP or HDMI-HDCP capability.

All in all, be aware of the new standard, and don’t be too surprised if HDMI starts showing up on next-generation IGP motherboards and then, finally, on video cards with audio capabilities. HDMI-to-DVI converters will continue to support older TVs and monitors without HDMI connectors, provided the display is HDCP compatible. The smaller form factor is a welcome addition for laptops and set-top HTPCs, and if audio integration takes off, it will be a welcome fix for the clutter behind the computer. If the PC market shows the same momentum for HDMI that the home theater market has, it certainly won’t be too long until we get these questions answered firsthand!


Comments

  • PrinceGaz - Tuesday, January 18, 2005 - link

    Combining audio and video onto one output is a non-issue really, even though it offers no benefit to most PC users and if anything is a slight inconvenience. If HDMI video-cards don't have their own audio-hardware, as someone said they'll just include suitable audio-in headers on the card so you can connect your soundcard directly to it.

    As for the output of the card, the obvious answer is a splitter-cable so you can connect it to both a monitor and amplifier/receiver. Here in Europe we've been using SCART cables for A/V connections for years and you just get the right cable to suit whatever inputs the device or devices on the other-end require. Problem solved.

    As for HDCP and other forms of DRM, yes it sucks. And yes it will arrive and be adopted despite the fact it probably reduces signal-quality (such as the unnecessary D/A then A/D conversion someone mentioned). Many protection methods impact the quality of the material so that is nothing new. However just as surely as HDCP will be adopted, when there is sufficient incentive to do so, it will be cracked. It doesn't matter how complex the protection is, nothing is uncrackable and HDCP is likely to be a prime target. It may take a while, it might require inside help from sympathetic people who work with HDCP (there are always some who don't agree with what they are working on who will anonymously spill the beans), but sooner or later HDCP will be cracked and you'll be able to do what you want with your movies on HD discs.
  • Alphafox78 - Tuesday, January 18, 2005 - link

    Who cares how many cables are behind your TV? It's not like everyone can see them. One cable or four, who cares? I'm sure these new devices will have some older ports on them for component or S-Video input, or many people won't be able to use them.
  • gonzo2k - Tuesday, January 18, 2005 - link

    I would want HDMI to go from the cable/sat box to CAPTURE HDTV video/sound on a HTPC. I would have no need for unified A/V transfer from HTPC to monitor/TV since the sound would be handled by a separate audio receiver. An A/V receiver to handle switching between multiple HDMI connections would be nice... but that seems a long way off.
  • crazyeddie - Tuesday, January 18, 2005 - link

    Is it really going to be necessary to combine audio and video onto a single card for the sake of HDMI? I would imagine that a simple coaxial digital feature connector from sound card to video card could solve the problem easily enough. The video adapter would simply have to grab the digital out from the sound card and patch it in with the video signal to the HDMI out.

    Of course getting the picture and sound to sync may or may not be a challenge, but that is another issue entirely...
  • KristopherKubicki - Tuesday, January 18, 2005 - link

    endrebjorsvik: thanks for the catch - they should both read gigaBITS.

    Kristopher
  • endrebjorsvik - Tuesday, January 18, 2005 - link

    In the 2nd paragraph on page 1 you have written that the cable can handle 5 gigabits per second (Gbps), while you say it is okay to transfer 1080p video and 8-channel audio, which requires 4 gigabytes per second (GBps).

    4 gigabytes would be approximately 32 gigabits, but the cable only carries 5. How do you figure that out?
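A quick back-of-the-envelope check shows why the figures only work in gigabits. The sketch below assumes uncompressed 1080p at 60 Hz with 24 bits per pixel, plus 8 channels of 24-bit/192 kHz PCM audio — assumptions for illustration, since the exact formats aren't pinned down here:

```python
# Rough sanity check of the HDMI bandwidth numbers (assumed formats:
# uncompressed 1080p60 at 24 bpp video, 8-channel 24-bit/192 kHz PCM audio).

def video_bps(width, height, fps, bits_per_pixel):
    # Active-pixel payload only; blanking intervals and encoding overhead ignored.
    return width * height * fps * bits_per_pixel

def audio_bps(channels, bit_depth, sample_rate):
    return channels * bit_depth * sample_rate

video = video_bps(1920, 1080, 60, 24)  # 2,985,984,000 bps (~2.99 Gbps)
audio = audio_bps(8, 24, 192_000)      # 36,864,000 bps (~36.9 Mbps)
total_gbps = (video + audio) / 1e9

print(f"video: {video / 1e9:.2f} Gbps")   # video: 2.99 Gbps
print(f"audio: {audio / 1e6:.1f} Mbps")   # audio: 36.9 Mbps
print(f"total: {total_gbps:.2f} Gbps")    # total: 3.02 Gbps
```

Read in gigabits, the total of roughly 3 Gbps fits comfortably inside a ~5 Gbps link; read in gigabytes, it would indeed overflow the cable by a wide margin, which is the contradiction raised above.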
  • arfan - Tuesday, January 18, 2005 - link

    Anand, where is your newest article about the Nvidia Ultra vs SLI??? Why did it suddenly go missing??? What happened????
  • Fluff - Tuesday, January 18, 2005 - link

    In a perfect world we would all be using HD-SDI and AES/EBU.

    Remember, HDMI can support up to 12-bit color.
    DVI is up to 8-bit.

  • bersl2 - Tuesday, January 18, 2005 - link

    #30: One of the reasons why I voice my displeasure about DRM in places like here is because there aren't enough people aware that such a thing is taking place. Letters and emails are great and all, but unless people start *talking about it*, it's not going to get through politicians' thick skulls that we disapprove of their listening to the lobbying of the entertainment industries.

    I stick behind my earlier assertion that to allow DRM is to relinquish control of one's viewpoint on the world. It will lead to the Internet becoming like TV is, controlled by an oligarchy of "content providers," backed by law.

    And just out of spite, I'm putting this post in the public domain. :P
  • quasarsky - Tuesday, January 18, 2005 - link

    http://www.anandtech.com/news/shownews.aspx?i=2290...

    This neat article I saw was about using GPU's to crunch audio. Worth a look as relates to HDMI :)

    9 - Posted on Sep 6, 2004 at 10:02 PM by Saist Reply
    when I first read the title here on Anand, I thought the article was going to be about how Creative Labs was spending more time creating and patenting IP instead of developing newer sound systems (e.g. Carmack's Reverse).

    Another example is this story over at Xbit : http://www.xbitlabs.com/news/mmedia/display/200409...

    some business has figured out how to get GPUs to crunch out high-quality audio. Now, I don't know about you, but right now a Radeon 9600 XT or a GeForce FX 5700 is cheaper than a top-of-the-line Audigy2. To think that I could use that in place of an Audigy2 and get superior sound quality (and yes... I am aware of the physical issue of how to pipe that sound out of a graphics card, but at least the capability is there)

    I agree... Tweaktown really dropped the ball. There's a whole other place they could have gone to but didn't
