
  • superandroid - Thursday, January 15, 2009 - link

    I think that the 60 GHz microwave frequency will be hazardous to the human brain because it can easily penetrate the human body.
  • 3DoubleD - Thursday, January 15, 2009 - link

    That is incorrect. 60 GHz cannot easily penetrate the human body, which would actually be more reason to fear it than if it could simply pass through. However, 60 GHz photons have so little energy that they can't possibly negatively influence the human body, just like all other radio communications and microwaves. Now if the transmitter is powerful enough that the absorbed energy turns into a significant amount of heat (e.g. a microwave oven), then you should worry about cooking yourself. Considering a microwave oven is ~1000 watts and this transmitter will likely be on the order of a few hundred or thousand milliwatts, you have nothing to fear.
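    A rough back-of-the-envelope comparison (illustrative Python; the 1 W transmit power is my assumption for the worst case, not a published spec):

```python
# Compare a microwave oven's power to a hypothetical 60 GHz transmitter.
# 1 W is an assumed upper bound ("a few hundred or thousand milliwatts").
oven_watts = 1000.0
transmitter_watts = 1.0

ratio = oven_watts / transmitter_watts
print(f"The oven radiates roughly {ratio:.0f}x more power")
```

    And that is before accounting for the oven concentrating its energy inside a sealed metal cavity rather than radiating it across a room.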
  • michael145 - Tuesday, January 13, 2009 - link

    Man, imagine running Crysis at 4K resolution with 4xAA/4xAF on...
    What machine would you need?
  • fly123 - Tuesday, January 13, 2009 - link

    Anybody know if this will be usable with a media server, or is the WirelessHD consortium locking it down to OEMs because of DRM fears?
    I mean, do any of you know if there are any plans for a plug-in device that one could connect to a PC and screen of one's own choosing?
    I've looked at the consortium's website, but haven't been able to find anything useful.
  • Milleman - Monday, January 12, 2009 - link

    "Your inputs are connected wirelessly to the TV via a 60GHz signal..."

    That's a lot of Hertz. I didn't think 60 GHz was possible to produce with today's semiconductor technology.
  • Stampede103 - Monday, January 12, 2009 - link

    Does anyone know of a netbook that is using Nvidia's Tegra 600 line of processors?
  • sprockkets - Monday, January 12, 2009 - link

    When a netbook can play YouTube on the HQ setting without stuttering, I'll get one.

    I still like the HP one with the nice-sized keyboard. If they used a Nano or dual-core Atom with Ion, this problem would be solved.
  • Denithor - Monday, January 12, 2009 - link


    The standard is WirelessHD. Your inputs are connected wirelessly to the TV via a 60GHz signal, capable of transmitting full bandwidth 1080p60 at a distance of up to 30 feet.

  • Jynx980 - Monday, January 12, 2009 - link

    I thought that was really high as well. The closest thing I know of would be cordless phones at 5.8 GHz. This is more than 10x that frequency, so it seemed out of whack.

    I assume that WirelessHD will support HDCP, but the wiki link only states that it supports Digital Transmission Content Protection (DTCP). Another protection scheme to worry about.

    I like reading about TV stuff. Was there a Vizio booth at all? I would like to see their lineup for '09. Although the line is blurred more each year between TV and PC monitors, I wish there were more focus on PC monitors. It doesn't seem like there has been much innovation in this area.
  • 3DoubleD - Monday, January 12, 2009 - link

    Both pages agree that not only is it possible, but the article is correct. This frequency range has (relatively) high attenuation in Earth's atmosphere and (usually) requires line-of-sight, so it is really limited in terms of application. It should make connecting devices much easier. Apparently the first devices are supposed to support up to 10 feet without line-of-sight. They chose the extremely high frequency (EHF) band because the strong absorption by oxygen molecules in the air will protect copyright owners by preventing your neighbors from using the signal. A competing technology uses bandwidth around 5 GHz to achieve 250 Mbps throughput, but I guess copyright holders were a bit worried about the transmission distance.
  • Jaybus - Tuesday, January 13, 2009 - link

    I doubt they chose the 60 GHz band just to avoid copyright problems. It is the bandwidth that they needed. 250 Mbps is not nearly enough. HDMI 1.0 specifies 4.9 Gbps for uncompressed transmission of 1080p60 plus 8-channel audio. HDMI 1.3 upped it to 10.3 Gbps to allow for higher resolutions. WirelessHD has a 7 GHz bandwidth on a 60 GHz carrier to achieve a max of around 25 Gbps. You obviously can't have a 7 GHz bandwidth with only a 5 GHz carrier frequency, so the carrier frequency had to be high to truly do HDMI wirelessly. Now, why they chose 60 GHz as opposed to, say, 40 GHz may be because of transmission distance.
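    You can get a feel for the numbers with a quick sketch (illustrative Python; a pixel-level estimate, ignoring audio):

```python
# Why 250 Mbps isn't enough for uncompressed 1080p60.
# Raw active-video estimate; real HDMI adds blanking intervals
# and 8b/10b encoding (10 bits on the wire per 8 bits of data),
# which is how the link rate approaches the 4.9 Gbps quoted for HDMI 1.0.
width, height, fps, bits_per_pixel = 1920, 1080, 60, 24
video_bps = width * height * fps * bits_per_pixel
print(f"raw 1080p60 video: {video_bps / 1e9:.2f} Gbps")  # ~2.99 Gbps
print(f"vs. a 250 Mbps radio: {video_bps / 250e6:.0f}x too slow")
```

    Even before overhead, the raw pixel stream alone is roughly a dozen times what a 250 Mbps link can carry.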
  • Galvin - Monday, January 12, 2009 - link

    4K LCDs scare me; that just increases your chance of dead pixels by 2x. The sooner FED/SED tech is out the better, because that tech doesn't need all the fancy backlighting of LCDs to handle blacks/whites.

  • Plifzig - Monday, January 12, 2009 - link

    Even worse, it increases your chances of getting dead pixels by 2X in the horizontal AND 2X in the vertical. Overall it's a 4X chance increase!

    2,073,600 pixels vs. 8,294,400 pixels
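    Quick check of the math (illustrative Python):

```python
# Verify the pixel counts: 1080p vs. "4K" (2x in each dimension).
pixels_1080p = 1920 * 1080
pixels_4k = 3840 * 2160
print(pixels_1080p, pixels_4k)  # 2073600 8294400

# 2x horizontal * 2x vertical = 4x the pixels, hence 4x the
# opportunities for a dead pixel at the same defect rate.
assert pixels_4k == 4 * pixels_1080p
```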
  • Denithor - Monday, January 12, 2009 - link

    And can you imagine the GPU required to push a game at that resolution?
  • SlyNine - Tuesday, January 13, 2009 - link

    GeForce 256 SDR?? ;)
  • 3DoubleD - Monday, January 12, 2009 - link

    "Another potential benefit of full matrix LED backlighting is what Toshiba and some other manufacturers are calling the 240Hz effect. Last year we saw the beginnings of a move to 120Hz LCDs, which you may remember isn’t LCD panels with 120Hz refresh rates. There are only 60 frames of data displayed, the data in between two frames is simply interpolated on the fly effectively giving you 120 frames per second (but from only 60 frames worth of data)."

    Anand, this is one implementation of 120 Hz technology, but I fear it is the wrong one to highlight. Many (if not all) TV manufacturers are producing TVs that use interpolation on 120 Hz panels. This feature primarily targets sports, as many complain about blurring while watching their favorite fast-paced games on their several-thousand-dollar LCD TVs. This is the only time interpolation should be used, as the effect is rather sickening for movie content. Interpolation of movie frames gives a "home video, handycam" sort of feel, completely ruining the experience.

    120 Hz was adopted to allow both movie and TV content to be viewed without performing an uneven pulldown. On most TVs in the past, 24 Hz content (from Blu-rays, HD DVDs, and properly encoded movies and TV shows on the internet) required a 2:3 pulldown to be shown at 30 Hz and then doubled to the 60 Hz refresh rate of your TV to eliminate flickering. Unfortunately, the 2:3 pulldown isn't perfect and you get a phenomenon called telecine judder. This is very obvious during slow panning scenes, where the panning motion does not seem smooth but jumps. The only way to properly handle 24 Hz material is to display it at a refresh rate that is an even multiple of 24. Thus there are two options: offer two possible refresh rates on your TV (60 Hz plus a multiple of 24 greater than 60, i.e. 72 Hz or 96 Hz), OR offer one refresh rate of 120 Hz, as it is divisible by both 24 and 30.

    TVs such as the Pioneer Kuro line offer the first implementation, where 60 and 72 Hz refresh rates are available. Most LCDs above ~$1500 CAD offer 120 Hz. With most of these 120 Hz LCDs, interpolation can be turned off for proper movie viewing. With all of this said, a TV with a 240 Hz refresh rate is completely unnecessary, as 120 Hz already solves the problem of displaying both 24 and 30 Hz content while offering interpolation for keen-eyed sports fans.

    However, the general public will never understand this... so let the MHz wars begin (again).
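    A quick way to see the divisibility argument (illustrative Python):

```python
# Which panel refresh rates show 24 fps film and 30 fps video content
# with each source frame repeated a whole number of times?
for panel_hz in (60, 72, 96, 120):
    fits_film = panel_hz % 24 == 0   # even pulldown for 24 Hz material
    fits_video = panel_hz % 30 == 0  # even pulldown for 30 Hz material
    print(f"{panel_hz} Hz -> film: {fits_film}, video: {fits_video}")

# 60 Hz handles only video (film needs uneven 2:3 pulldown -> judder);
# 72/96 Hz handle only film; 120 Hz handles both
# (5:5 pulldown for 24 Hz, 4:4 for 30 Hz).
```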
  • Holly - Sunday, January 18, 2009 - link

    You forget one thing... these 240 Hz screens might also be intended as a first step toward shutter-glass 3D. Giving each eye 120 Hz is a reasonable refresh rate that won't tire the eyes too much.
  • strikeback03 - Monday, January 12, 2009 - link

    If you turn off interpolation, wouldn't that leave 60 Hz as the only option? Which would still not provide a multiple of 24 for movies.
  • 3DoubleD - Monday, January 12, 2009 - link

    When you turn interpolation off it should due a 5:5 pulldown, which is simply playing the same frame 5 times in a row. However, some TVs don't do this and they should be avoided. This is something everyone should look for when shopping for a TV. Most TVs with 120 Hz capabilities have off-low-medium-high settings for their interpolation.
  • 3DoubleD - Monday, January 12, 2009 - link

    and by "due" I mean "do"... time for the afternoon coffee
  • araczynski - Monday, January 12, 2009 - link

    I guess it's a good thing then that I don't watch enough TV/movies to care about these new TV developments. My two 2-year-old LCDs will just have to 'do' until they die.

    The only netbook I've ever liked is the Acer Aspire One (I believe that's its name), $350 at Walmart. The only turn-off is the LCD size in comparison to the top bezel size; it seems like a waste of space. If you make the bezel a certain size, fill it.
  • JimmiG - Monday, January 12, 2009 - link

    Totally agree with Mooly about the Netbook pricing. A netbook is by definition affordable, and an Atom-powered Netbook performs as you would expect from a $300 ultra-portable.

    However, there is currently a gap in the market that computer manufacturers are trying to fill. There are the classic "ultra-portable" laptops that have been around forever and cost upwards of $2000, and then there are the new $300 netbooks. There isn't really anything in between. So they're taking their Atom designs and putting them into more stylish chassis to create, e.g., a $900 netbook. Those kinds of computers should really feature low-power Core 2 Duo processors instead, but I guess that would drive the price up. A dual-core mobile Atom with a few IPC tweaks, running at 2+ GHz, might work, or a scaled-down Core 2 or i7 CPU.

    The Atom has gained much brand recognition. It seems everyone wants an Atom. It's also a return to NetBurst and the "MHz myth" - 1.6 GHz doesn't sound that bad, but what many don't realize is that it's an in-order CPU a la 1993, with an IPC of less than half that of a Core 2 CPU.
  • Zoomer - Saturday, January 24, 2009 - link

    How many netbooks weigh 1.2 lbs?
  • James5mith - Monday, January 12, 2009 - link

    Ahh, IPC, it always comes back to IPC. :)
  • elerick - Monday, January 12, 2009 - link

    I have really been wanting to know some of the features of the plasmas. I have heard the world's largest plasma plant, capable of producing over 1 million units per month, is almost complete in China.

    Do the new plasma panels offer THX or ISF calibration controls?
  • Visual - Monday, January 12, 2009 - link

    Local dimming is a terrible idea - it also dims the bright pixels in the same general area. It will only be a good feature once it can be done with pixel precision, but that would just mean an LED TV instead of an LCD.
  • quiksilvr - Monday, January 12, 2009 - link

    In my opinion, the smallest a notebook should be is 11". An 11" notebook is smaller than a sheet of paper, and be it on your lap, on a plane, in a classroom, or in a coffee shop, you will ALWAYS be able to fit the footprint of a sheet of paper in front of you comfortably. As such, I feel that 10" and 9" notebooks should not exist.

    Another reason I feel 11" should be the smallest size is that you can also fit an optical drive in these notebooks, something I feel is still a necessity. Wouldn't it be nice to have your netbook double as a portable DVD player without having to rip the DVD onto a flash drive and plug it into your notebook?

    IMO, the netbook hasn't reached its full potential. Hopefully by summer the following specs will be available as the base model at a $500 price point:

    11" 1280x720 screen (LED, 1.3 MP camera)
    Highest CPU clock available for Atom
    nVidia 9400M GT
    2 GB RAM
    32GB SSD
    802.11 a/b/g/n with Bluetooth 2.1
    3G and GPS (let us activate it with a cell phone provider of OUR choosing, just have it there by default)
    Windows XP (maybe Windows 7)
    6 cell battery

    Until then, what's available now is simply too expensive.
  • strikeback03 - Monday, January 12, 2009 - link

    Unless they included both GSM and CDMA hardware, you wouldn't be able to activate it on every available carrier.
  • Penti - Monday, January 12, 2009 - link

    Forget about the Atom and it may be possible with something like an AMD Neo @ 1.5-1.6 GHz (single-core K8).

    And for twice the price!

    The problem with high-resolution 8-11" panels/screens is that none are made. OEMs/ODMs can't do much more than simply order the parts and put them together, really. They aren't cheap either. 1280x800 or 1280x768 is doable though. Samsung has a 10.6" panel with 1280x768.
  • tayhimself - Monday, January 12, 2009 - link

    Would you like a pie in the sky with that?
