In this afternoon’s CES keynote session, the speaker was Paul Otellini, Intel’s CEO. As Intel already made its major CES announcements yesterday, this session was lighter fare, geared as a general pep rally for CE device manufacturers rather than anything particularly meaty on device details, but Intel did show off some information that bears mentioning.
Intel’s message to CE device manufacturers is that the CE industry has already conquered many of the major markets it participates in (reinforcing this with a parody of Video Killed the Radio Star, where the Internet killed just about everything else) and that CE device manufacturers need to be looking into new markets to continue CE growth. Much of Otellini’s keynote was focused on augmented reality and more personalized experiences with consumer electronics, the direction Intel believes CE device manufacturers should be going. To that end Intel showed off two sets of products that it and its partners are working on, which it feels are prime examples of this technology.
The first was a smartphone device that took the augmented reality idea quite literally, featuring a built-in camera that augmented what it saw. Intel demonstrated the device in a mock-up of a Chinese location, where the device would translate things it saw on the fly, such as menus and street signs. Furthermore, it could pull up related information (information about Chinese dishes, restaurant reviews, etc.) and overlay it on the image. Finally, the device was capable of not-quite-real-time audio translation, as demonstrated in a (presumably scripted) conversation with an actress speaking Mandarin.
Like most presentations of this nature, what Intel is showing isn’t something that’s ready yet. The processing power for the device doesn’t exist yet in a mobile form, so a full-sized computer behind the scenes was doing all of the processing. However, the software was real and everything we saw was occurring in real time, so what Intel was showing off wasn’t too far-fetched, and it was both technologically impressive and potentially useful.
Paul also quickly rattled off the 4 things Intel believes to be necessary to make the demonstrated technology viable in the real world: more processing power, widespread wireless broadband (WiMax), a context-sensitive internet capable of piecing together all of the required information on its own, and better ways to interact with computers such as touch and gestures (in other words, buttons and styluses are on their way out). The first two goals are easy enough, but the latter two are still basically in the research phase and may never play out like Intel hopes.
The other major demonstration of the afternoon featured Steve Pederson of Smashmouth as a special guest, serving as a central theme for a few pieces of software Intel was showing off with some of its partners as a demonstration of personalization. One piece of software was a group music collaboration platform (eJamming) where band members could play together over the internet (think Rock Band on steroids), with Smashmouth playing a piece of one of their songs. This was followed by software from Big Stage that created an avatar of Steve based on a few digital camera pictures of his face.
Finally demonstrated was a 14-camera motion capture system intended to replace the need for motion capture suits and marker balls, which culminated with Steve being captured and rendered in real time into a virtual Smashmouth concert with the rest of the band. Of the two demonstrations this was by far the weaker one; technologically it was impressive, but it felt like Intel hadn’t managed to do much more than recreate Rock Band in a more complex form, one that would require Intel's chips.
We did get a couple of useful pieces of information out of Intel, however, that made the keynote worth seeing. First and foremost, Intel quickly announced its Canmore system-on-chip product. Canmore is targeted towards HDTV products and is capable of full 1080p decoding; the first products using it will likely be HDTV tuners and HD DVD/Blu-ray players. Menlow was also quickly mentioned here (having been announced yesterday); we’re assuming it’s the CPU at the center of Canmore.
Paul also spent a short amount of time talking about die shrinks and Moore’s Law. There’s been some concern in the past few years about how much longer Moore’s Law can continue as parts of transistors approach the size of a few atoms of silicon. Paul stated that Intel is confident it will get another 5 generations (around 10 years); beyond that, we’re assuming Intel simply isn’t sure, rather than knowing for certain whether it will be able to get past that point or not.
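As a back-of-the-envelope check on that "5 generations ≈ 10 years" figure, here is a minimal sketch. The cadence and shrink factor are our assumptions (the classic ~2-year node cadence and ~0.7x linear shrink per generation), not numbers Otellini gave, starting from Intel's then-current 45 nm process:

```python
# Rough sketch of the scaling arithmetic behind "5 more generations ~ 10 years".
# Assumptions (not from the keynote): a new process node every ~2 years, and
# the classic ~0.7x linear shrink per node (~0.5x area, i.e. Moore's Law doubling).

YEARS_PER_NODE = 2
SHRINK_PER_NODE = 0.7   # ~0.7x linear shrink per generation
START_NM = 45.0         # Intel's 45 nm process (Penryn, late 2007)

def feature_size_after(generations: int, start_nm: float = START_NM) -> float:
    """Approximate linear feature size after a number of node shrinks."""
    return start_nm * SHRINK_PER_NODE ** generations

for gen in range(1, 6):
    print(f"gen {gen}: ~{feature_size_after(gen):.1f} nm, "
          f"~{gen * YEARS_PER_NODE} years out")
```

Five such shrinks land in the high single-digit nanometer range after roughly a decade, which is where feature dimensions start being measured in tens of silicon atoms; hence the concern about what comes after.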