WiCast in Practice

Getting the ASUS WiCast to work really is a breeze. The manual notes the transmitter and receiver may take as much as a minute to sync, but my experience was much better. With the two boxes about five feet apart, syncing was actually very quick once Windows loaded, and within Windows the solution was as transparent as it should be. The WiCast-connected monitor appeared the way any wired monitor would, and I was able to switch audio over to the WiCast easily.

My first test was to see if the WiCast could hit 1080p, and sure enough it could. Latency, at least on the Windows desktop, was invisible. At 1080p (60Hz), the solution was largely seamless. In fact the only artifacting I really saw was in high-contrast areas, where there would be slight flickering on the edges of shapes and letters. The whole image appeared slightly darker than it did on a wired connection.

The next step was to see how well it handled audio, so I fired up WinAMP and put my usual audio testing whipping boy, The Prodigy's "Spitfire", through its paces. Audio quality between wired HDMI and the WiCast was indistinguishable, though it did serve to highlight how poor the speakers in my television are. It's reasonable to assume the WiCast probably handles multichannel audio perfectly fine, but I have a hard time imagining a home theatre enthusiast who would opt to use the WiCast instead of a hard line for reasons that will become clear soon enough.

For me, the big test was latency, something Intel's WiDi has a real problem with. I fired up Quake Wars (yes, some of us still play this), set it to 1080p, and was up and running. Gameplay was nigh-indistinguishable from a wired connection. ASUS advertises latency at or below 1ms, and while I can't confirm that, I can tell you that from a gaming perspective the WiCast is remarkably fluid and responsive. It's worth noting that this is one area where WiDi simply can't compete: while I was able to use the ATI Mobility Radeon HD 4650 in my notebook to push polygons in Quake Wars, WiDi is restricted to Intel HD Graphics only. So this test has already exposed two things the WiCast can do that WiDi can't: game, and game at 1080p.
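To put the sub-1ms claim in perspective, here's a quick back-of-the-envelope comparison against the frame interval at a 60Hz refresh (the 1ms figure is ASUS's advertised number, not something we measured independently):

```python
# Rough sketch: compare ASUS's claimed <=1ms link latency to a 60Hz frame time.
refresh_hz = 60
frame_time_ms = 1000 / refresh_hz     # ~16.7 ms per displayed frame
claimed_link_latency_ms = 1.0         # ASUS's advertised figure (unverified)

# If the claim holds, the wireless link adds well under a tenth of a frame.
fraction_of_frame = claimed_link_latency_ms / frame_time_ms
print(f"Frame time at 60Hz: {frame_time_ms:.1f} ms")
print(f"Claimed link latency as a fraction of one frame: {fraction_of_frame:.1%}")
```

Even if the real figure were several times ASUS's claim, it would still be a small slice of a single frame, which squares with how responsive the WiCast felt in practice.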

Finally I wanted to test Blu-ray playback, and it was here that things started to get a little hairy. There weren't any HDCP issues, but when I tried to play Iron Man 2, the WiCast started to have trouble with interference. It wasn't anything game-breaking, but there were five horizontal lines of artifacting on the screen, evenly spaced. Adjusting the transmitter seemed to help a little, and often the lines would go away on their own, but nonetheless the WiCast seemed to have a hard time keeping a clean signal at just five feet away.

With the above in mind, we did some additional testing of signal quality at five-foot intervals. Keeping in mind that the WiCast is rated for "up to" 33 feet, we were unimpressed by the amount of blocking and other artifacts even at close range with Blu-ray, and it quickly got worse as we moved away from the receiver. Oddly enough, we had a better experience testing the WiCast with a Gateway ID49C than with a Dell Studio 17: the former worked at up to 20 feet without any noticeable problems, while the latter had periodic issues even when the receiver and transmitter were nearly on top of each other.

The signal ended up being more a case of all or nothing: it either worked or it didn't, though sometimes other factors seemed to come into play (a person moving between the transmitter and receiver, or interference from other electronic devices). Since the WiCast is also device agnostic, you can use it with a PlayStation 3, Xbox 360, or any other HDMI-equipped hardware. Again, the most likely use seems to be laptops, simply because anything else is already hard-wired for AC power. Also worth noting is that we measured a power draw of 5.7-5.8W on the transmitter, which means that if you're running off a notebook's battery, you'll take a pretty significant hit to battery life.
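As a rough illustration of that hit (the battery capacity and baseline system draw below are assumed, typical-notebook numbers; only the 5.8W transmitter figure is from our testing):

```python
# Back-of-the-envelope battery-life impact of the WiCast transmitter's draw.
# battery_wh and baseline_draw_w are illustrative assumptions, not measurements.
battery_wh = 56.0          # assumed notebook battery capacity
baseline_draw_w = 15.0     # assumed system draw at light load
wicast_draw_w = 5.8        # measured transmitter draw

runtime_wired_h = battery_wh / baseline_draw_w
runtime_wicast_h = battery_wh / (baseline_draw_w + wicast_draw_w)

print(f"Wired:  {runtime_wired_h:.1f} h")
print(f"WiCast: {runtime_wicast_h:.1f} h")
print(f"Runtime lost: {(1 - runtime_wicast_h / runtime_wired_h):.0%}")
```

Under those assumptions the transmitter eats roughly a quarter of your runtime, which is why "laptop use" and "battery use" don't pair as nicely as you'd hope.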

The ASUS WiCast Conclusion: Lots of Wires for "Wireless"

37 Comments


  • cactusdog - Monday, November 01, 2010 - link

    Haha, that's funny. Wireless that introduces more cables than a wired connection? Sounds like a big hassle.
  • Sihastru - Monday, November 01, 2010 - link

    I agree, 2 power bricks (with power cables), 2 new boxes, and 3-4 wires for them, just to replace 1 simple cable... It's just that wireless should reduce clutter, not add to it.
  • Solandri - Monday, November 01, 2010 - link

    I can't help but think that all this is trying to reinvent the wheel.

    Most HDTVs already have over the air digital receivers built in. All we need is for the FCC to open up one or two of the digital TV channels for public broadcast at low wattage like they did the 2.4 GHz spectrum. Then you can just make a doohickey which plugs into your laptop which converts the screen image to a DTV signal and broadcasts it in one of those OTA channels.
  • jonup - Tuesday, November 02, 2010 - link

    That's actually a good idea. I like it! We don't even need FCC approval. We can just order it from China.
  • RangerDave - Tuesday, November 02, 2010 - link

    In my line of work (custom home theaters, home automation, etc.) there are quite a few installs where we're laying down 50 ft HDMI cables that cost a fortune (especially if we're trying to stick to the HDMI 1.4 standard for 3D), and other installs where we use HDMI to Cat5e to HDMI. Anyway, that being said, I can see this being a viable solution for some setups I've done, but in a typical home theater setup it's completely useless.
  • Homerboy - Tuesday, November 02, 2010 - link

    Cost a fortune?
    Hell I just bought a 35ft HDMI cable for $23.
  • enderwiggin21 - Wednesday, November 03, 2010 - link

    For in-wall or in-attic runs you're not going to use the $10 cable from monoprice. That's better suited for runs within your rack. You're going to want CL2 or CM rated cable.

    When you have customers and the responsibility to pander to the lowest common denominator, using cable that's tested out to the required length leaves less room for error. The longer the run, the more difficulty at passing the speed tests, ergo the more expensive the cable. Even though the cheaper cable could get the job done just fine, this is an installer's livelihood. Better to use tested, durable cable for such runs than not.

    Bluejeanscable sells tested and rated CM cable for $135 for 50ft. To me that's not ultra-expensive; not for 50ft. But even then, it's only tested for Category 2 speeds to 25 feet, and Category 1 speeds to 45 feet. So imagine how expensive it would be to test out to Category 2 at 50 feet. And they're considered a great bang-for-your-buck vendor.
  • enderwiggin21 - Wednesday, November 03, 2010 - link

    For clarification, CL2 and CM ratings are "to code" for in-wall cable runs.

    Someone who makes their living installing has to be "up to code." If they weren't they'd be at risk if something went wrong (a fire, water leakage, etc) and the cable was a catalyst. Or if the owner of the house decided to sell it at some point and it was determined the wiring wasn't up to code, that could jeopardize the home owner's sale as well as open the installer to liability claims.

    If you're DIY'ing it, then you could do whatever you want. Caveat emptor.
  • mikeyD95125 - Wednesday, November 03, 2010 - link

    You actually can get CL2 rated cables at Monoprice. Here's 50ft for $56: http://www.monoprice.com/products/product.asp?c_id...
  • enderwiggin21 - Wednesday, November 03, 2010 - link

    That's a great bargain and I would be tempted to try that in my own installation so I could easily undo it if there were problems. However, it's only rated for "Standard Speed," which is Category 1.

    I know some of the price premiums are snake oil like Monster Cable, but if I run a business I'm using something tested for the product running through the cable and its distance, not leaving something up to chance, *if at all possible.*

    Forgive the length, but...

    Per HDMI.org,

    * Standard (or “category 1”) HDMI cables have been tested to perform at speeds of 75MHz or up to 2.25Gbps, which is the equivalent of a 720p/1080i signal.
    * High Speed (or “category 2”) HDMI cables have been tested to perform at speeds of 340MHz or up to 10.2Gbps, which is the highest bandwidth currently available over an HDMI cable and can successfully handle 1080p signals.

    Q. Will my Standard cable work in High Speed applications?

    Although a Standard HDMI cable may not have been tested to support the higher bandwidth requirements of cables rated to support high speeds, existing cables, especially ones of shorter lengths (i.e., less than 2 meters), will generally perform adequately in higher speed situations. The quality of the HDMI receiver chip (in the TV, for example) has a large effect on the ability to cleanly recover and display the HDMI signal. A significant majority, perhaps all, of the HDMI TVs and projectors that support 1080p on the HDMI inputs are designed with quality receiver chips that may cleanly recover the 1080p HDMI signal using a Standard-rated HDMI cable. These receiver chips use technology called “cable equalization” in order to counter the signal reduction (attenuation) caused by a cable. We have seen successful demonstrations of 1080p signal runs on a >50 ft. cable, and a 720p signal run on a >75 ft. cable. However, the only way to guarantee that your cable will perform at higher speeds is to purchase a cable that has been tested at the higher speeds and labeled as “High-Speed.”

    1. Standard cables (referred to as Category 1 cables in the HDMI specification) are those tested to perform at speeds of 75MHz, which is the equivalent of an uncompressed 1080i signal.
    2. High Speed cables (referred to as Category 2 cables in the HDMI specification) are those tested to perform at speeds of 340MHz, which is the highest bandwidth currently available over an HDMI cable and can successfully handle 1080p signals, including those at increased color depths (e.g. greater than eight bits per color) and/or increased refresh rates (e.g. 120Hz).
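The category ratings quoted above are easy to sanity-check: HDMI's aggregate TMDS throughput is just the pixel clock times 10 bits per symbol (8b/10b encoding) times 3 data channels. A minimal sketch (the 148.5 MHz figure is the standard CEA-861 pixel clock for 1080p60):

```python
# Sanity-check the HDMI.org category numbers: aggregate TMDS throughput is
# pixel clock (MHz) x 10 bits per 8b/10b symbol x 3 data channels.
def tmds_gbps(pixel_clock_mhz: float) -> float:
    return pixel_clock_mhz * 10 * 3 / 1000

print(tmds_gbps(75))      # Category 1 test speed  -> 2.25 Gbps
print(tmds_gbps(340))     # Category 2 test speed  -> 10.2 Gbps

# 1080p60 uses a 148.5 MHz pixel clock, so it needs ~4.46 Gbps:
# over the Category 1 rating, comfortably within Category 2.
print(tmds_gbps(148.5))
```

That arithmetic matches the numbers in the FAQ above, and it also shows why HDMI.org hedges: a Standard cable isn't guaranteed at 1080p rates, even though a good receiver chip's equalization often recovers the signal anyway.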
