Original Link: http://www.anandtech.com/show/2150
CES 2007 Part I: Convergence Happened and the Most Impressive Demo of CES
by Anand Lal Shimpi on January 11, 2007 6:53 PM EST
For years companies like Intel and Microsoft have been talking about the impending convergence of PC and Consumer Electronics (CE) devices. In the past couple of years we have finally seen this convergence come to fruition, through a slew of devices that basically let you move or display content stored on your PC on CE appliances. While most manufacturers have tried, very few have “gotten it right” when it comes to convergence devices. The end goal is simple: access to everything, everywhere, on any device. Making it happen, however, is far more difficult, as creating the devices that will facilitate this goal is like one giant process of elimination.
Most of this year’s CES has been about poor attempts at convergence, with a handful of things that were worthwhile. Despite very high expected attendance, the show wasn’t nearly as crowded as last year. It still ends up taking 30 - 40 minutes to get a cab during the day, but we had no problems navigating the show floor and surrounding hotels. Whereas in previous years we’d waste a significant amount of time wading through hordes of people, there’s actually breathing room this year.
Whether the show is simply far more spread out this year, across two convention centers and many hotel suites, or attendance is down due to a lack of interest, we were here in full force in search of something interesting. This year’s CES marked the end of an era of talking about convergence, and the beginning of the introduction of many convergence products. While we’ve yet to see anyone with the vision to bring us the convergence world’s iPod (although Apple’s iPhone announced at Macworld looks like it may redefine another sector), that didn’t stop us from finding individual technologies that were worth a look.
As with most trade shows, the vast majority of what we saw on the floor was poorly designed and/or executed. What follows are some of our answers to the question we always get: “what was the most exciting thing you saw at the show?”
Corsair's Prepping for the Memory Boom
Our first meeting of the show was actually not with a CE company at all, rather a PC memory company: Corsair. Our discussion with Corsair would prove to be something of a trend for this year’s CES, as it centered around Windows Vista and its impact on the PC market in general.
Out of all of the manufacturers that will be affected by Vista and its hardware requirements, it is the memory manufacturers that stand to benefit the most. Vista needs more system memory, can benefit from ReadyBoost-enabled USB flash drives, and supports hybrid drives and Intel’s on-motherboard flash technology (Robson). While CPU and GPU requirements are greater with Vista, given that you can get premium certification with Intel’s integrated graphics, the requirements aren’t all that high. What Vista will do, more than anything, is help sell more memory - whether in the form of DDR2 or flash devices.
Corsair had its Dominator line of memory on display, including its recently announced PC10000 offering. But it wasn’t DDR2 that was the most interesting at Corsair’s suite, rather Corsair’s Flash memory lineup.
In the past year we have seen Flash drives get very big, very quickly. Corsair showed us its 16GB MLC based NAND flash, which we will be reviewing soon, as well as its faster 8GB SLC based USB drive. The difference in performance between SLC and MLC is forcing Corsair to brand its SLC drives differently, which you can see below:
Flash Voyager GT drives will use SLC flash
In the future, all SLC flash based devices will carry the “GT” suffix while MLC devices won’t. The reasoning is that MLC devices are significantly slower than their SLC counterparts; slower performance is the tradeoff you make for twice the storage capacity, since MLC can store multiple bits per cell.
Regular Flash Voyager drives will be MLC only
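The capacity side of that tradeoff is simple arithmetic. A quick sketch, with an assumed cell count purely for illustration (not a Corsair figure):

```python
# Why MLC doubles capacity over SLC at the same cell count.
# The cell count below is an illustrative assumption.

def capacity_gbit(cells_millions: float, bits_per_cell: int) -> float:
    """Usable capacity in gigabits for a given cell count."""
    return cells_millions * bits_per_cell / 1000  # millions of cells -> gigabits

cells = 8000  # assume 8 billion cells on the die
slc = capacity_gbit(cells, 1)  # SLC: one bit per cell
mlc = capacity_gbit(cells, 2)  # MLC: two bits per cell
print(f"SLC: {slc:.0f} Gb, MLC: {mlc:.0f} Gb")  # MLC stores exactly twice as much
```

The same doubling is why the 16GB drive Corsair showed us is MLC while the faster drive tops out at 8GB.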
While USB drives have become a clear commodity, Corsair is doing its best to differentiate itself in a manner unlike much of its competition. When Corsair introduced the Flash Voyager, what set it apart was that it was a bit more rugged than the competition. While everyone else is trying to make USB drives smaller, Corsair has been hard at work making its drives more rugged. Enter the G-Force:
This waterproof drive takes rugged to a new level and feels like it would be at home sitting in a Hummer.
The G-Force is by no means small, but it actually feels right in your hands and if you’re prone to dropping/breaking things it’s not a bad option. At the show Corsair only had a prototype, but the final version will be available in a matter of weeks. Corsair is convinced that NVIDIA won’t be upset by the naming of the product, as there’s no “e” in its G-Force USB drive.
Speaking of NVIDIA, Corsair was also proudly showing off its 620W power supply running a pair of GeForce 8800 GTXs, bringing to light the fact that you don’t need an 800W+ PSU to run NVIDIA’s highest end configuration, you just need an efficient PSU.
Sony: The Most Impressive Demo at CES
The fact of the matter is that if you're an exhibitor at the show, you rarely have time to actually walk around the show floor and see exhibits other than your own. It's even worse if you're stuck off in a hotel suite somewhere in meetings all day; you don't really get to see the show. So when we meet with most companies we often get asked what they should go see on the show floor if given a free hour or so. CES is a very visual show, and it's very fitting that one of the most interesting things at the show is a display.
Sony was showcasing a handful of prototype OLED displays; with no release date or product in sight, they were still the most impressive looking demo at CES. Sony's arrangement consisted of a number of 11" OLED displays and a single 27" model. The 11" displays had a native resolution of 1024 x 600 and the 27" was a full 1920 x 1080 display.
Thanks to the use of OLED technology, these displays are extremely thin; the 11" models were around 3mm thick while the 27" display was approximately 10mm thick. The displays were simply looping several high color/contrast video scenes, but with very little motion going on in them. What we could see was absolutely amazing and put every other display at CES to shame, bar none.
Other than a very thin panel, the use of OLEDs meant that you could get some very wide viewing angles when looking at these displays. We tried our best to show it in our pictures but you could almost stand at the very edge of the display and still get a very clear, bright picture.
Sony's Laser Light Engine LCoS Display
Another Sony prototype on display was a Laser backlit LCoS display, but unfortunately the prototype was far more crude than the OLED setup.
While a laser backlight in theory provides a larger color gamut, even larger than an LED backlight, the actual demo itself was unimpressive.
Colors appeared off and the overall image wasn’t very sharp; some of this was due to the fact that it’s not a direct view display, but mostly the issue was that it was a very early prototype.
Although non-functional, the TV’s controls on the side of the display were pretty cool. Like the OLED prototypes, there’s no indication of when we may actually see this technology come to fruition.
Sony's $33,000 LCD
If you find yourself wondering what you can buy for $33K this April, Sony has an answer for you. While LCD and plasma manufacturers have been creating bigger and bigger displays to showcase at CES, they rarely end up as an actual product you can buy. Sony is changing the trend this year by showcasing a 70” LCD that will be shipping to customers in April, at a price tag of $33,000.
The 70” Bravia display features a single 120Hz 1080p panel with 10-bit color support and is LED backlit.
The display is absolutely huge and looked quite good, although obviously not nearly as good as the OLED setups we talked about earlier. For the discerning buyer in dire need of a 70” LCD, Sony has exactly what you’re looking for.
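For a sense of what a 120Hz, 10-bit 1080p panel implies, here is a back-of-the-envelope raw pixel data rate calculation. This is our own estimate of panel-internal bandwidth, not a Sony figure:

```python
# Raw pixel data rate for a 1080p, 120Hz panel with 10 bits per color channel.
width, height = 1920, 1080
refresh_hz = 120
bits_per_pixel = 3 * 10  # 10-bit R, G and B

gbps = width * height * refresh_hz * bits_per_pixel / 1e9
print(f"~{gbps:.1f} Gb/s of raw pixel data")  # roughly 7.5 Gb/s
```

That is several times what a 60Hz, 8-bit panel has to move, which is part of why 120Hz 10-bit sets command such a premium.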
Bravia Internet Video Link, and PS3 UI for LCDs
One of many examples of convergence devices was the Bravia Internet Video Link on display at the Sony booth. The Internet Video Link will interface with all redesigned 2007 Bravia TVs (currently not announced/available) and will stream content from a handful of sources to your TV.
The Internet Video Link connects via HDMI to your TV and via Ethernet to the Internet. Once connected, it can stream content from a number of web portals - currently only limited to AOL, Yahoo Video, Grouper and Sony. The Internet Video Link also offers RSS support for things like traffic and weather data.
While it would be far better if you could stream content from any source, and something like YouTube integration would be much more useful than being able to stream from AOL and Yahoo Video, the real story behind the Bravia Internet Video Link is the TV it was demonstrated on.
An unreleased 2007 model Bravia TV was the demo platform, and the big feature to talk about is its use of a very PSP/PS3-like UI. Called the Crossbar menu, it’s a reasonably quick UI that is among the best we’ve seen on a TV. While it’s not as fast as on a PS3, it is still reasonably fast and not too cluttered. Expect to see this UI on all 2007 model Sony Bravia displays.
Sony Reader: Electronic Ink at its Best
Announced at last year’s CES, Sony’s e-book and PDF reader was on display at the show as it has been shipping for a couple of months now. The display uses electronic ink and thus consumes no power unless you are flipping pages.
The unit itself is extremely light and honestly is one of the first devices of this type that we could actually see being a reasonable replacement for carrying around tons of books. While the demonstration centered around reading novels, what we’d really like to see is this technology used to store textbooks for schools. Rather than having to carry around multiple books each composed of hundreds of pages, a single e-Ink based Reader like this would be a much better experience. Sony got the ergonomics on the ultra thin device just right and the display is superb, making it very similar to reading pages in a regular book.
The Sony Reader features 256MB of internal memory, enough to store approximately 80 books, but you can always add more through a Memory Stick slot on the side of the unit. While the content is DRM’d, once you purchase an e-book you can share it with up to 5 other Readers (including one computer). The Reader is also able to view PDF documents. The only downside to Sony’s Reader is its cost, which is around $350.
Sony UMPC Update
After the UMPC craze of last year’s CES, Sony announced the first UMPC that had some potential - the Vaio UX. Featuring an integrated Blackberry-style keyboard, it looked like the UX could actually be useful, but in reality it was far from it. The keyboard ended up being more stylish than functional, and the screen’s resolution was far too high to actually be useful in the majority of cases. Furthermore, without good support for high DPI displays, Windows XP was the wrong OS to launch the UMPC with.
At this year’s CES, Sony announced an update to the Vaio UX which fixes some of the problems with the unit. Sony shaved down some of the area between the keys on the keyboard so you can actually get a better feel for the keys themselves, a major problem with the original unit. The keyboard is much improved but still not perfect, and in our opinion still more style than function. While Blackberry and other QWERTY-enabled smartphone owners can usually type pretty fast on their devices, Sony’s UMPC still isn’t quite there.
The major change to the updated model is the inclusion of a 32GB flash drive instead of a normal mechanical hard drive. The end result is that the unit is quicker to turn on and off, applications launch faster and the device is much more rugged. While we still think current UMPCs are missing the mark, Sony at least made its unit a little more attractive for those who were going to buy it anyway.
Before the show even started Toshiba had already announced its Regza line of LCD TVs. Available in both 1080p and 720p models, the Regza LCDs we saw at the show all looked very good.
A subset of the 1080p displays is Toshiba’s new Regza Cinema series, which use 120Hz panels, while the rest of the displays use standard 60Hz displays. Toshiba did have a live comparison showing the benefits of 120Hz vs. 60Hz panels, but honestly it’s tough for the normal viewer to see a difference between the two.
All of the Regza LCDs use a normal backlight and not LED like the 70” Sony LCD we talked about earlier. Some of the models are available with an integrated slot-loading DVD player.
Updated HD-DVD Players
Toshiba also showcased its second generation HD-DVD players, two of the models offering 1080p output support.
The new players are smaller and fix the start-up time issues that were present in the first player. Unfortunately pricing is still quite high, with the cheapest player coming in at $599.
With the format wars still undecided, and pricing still quite high, your best bet is still to wait and see what happens with Blu-ray and HD-DVD. If you need an HD-DVD fix now and happen to have an Xbox 360, Microsoft’s external drive is pretty cheap and gets the job done, albeit not in the most elegant fashion (lacking HDMI and all).
Toshiba’s future technology stand had both PC and set-top HD-DVD writers on display, although honestly for data storage you’d ideally want Blu-ray thanks to its greater disc capacity (50GB vs. 30GB).
ATI’s OCUR is Introduced
The most exciting story of last year’s CES was ATI’s OCUR (Open Cable Unidirectional Receiver) device, although at the time all we knew was that it would officially launch when Vista was released, due to its dependency on a protected path through the PC. With Vista’s retail release less than a month away, it’s no surprise that AMD (formerly ATI) had a final production version of its OCUR device on display and working at CES.
The official product name of OCUR is the ATI TV Wonder Digital Cable Tuner by AMD, and as expected it will be available in two versions: internal and external. The TV Wonder DCT is currently an OEM-only product and it will remain so until there’s enough trust built between the cable industry and the PC users in order to start making it more accessible.
The TV Wonder has a set of AV inputs for external devices
For those that don’t remember, the point of the TV Wonder DCT is to bring CableCard support to Windows Vista Media Center. Currently, you can’t use Media Center Edition to record digital cable or non-OTA HD content. AMD’s product is the only way of getting CableCard support on your PC, and installation is fairly simple. You plug your cable into a coax port on the back of the unit and then plug the unit into your computer via USB 2.0. The internal version also operates over USB, despite being a standard PCI card.
Your CableCard goes in a slot on the front of the TV Wonder; a cover slides open to reveal the CableCard slot:
To enable support for the TV Wonder, the motherboard must include a flag in the BIOS telling Vista that it supports the tuner. There’s also a product ID that needs to be entered in Vista, but most OEMs will probably be entering this on their own so that it is completely transparent to the end user. It sounds like it won’t be easy or feasible to simply take the unit and install it on a different computer from the one it originally came with.
Using the TV Wonder in Vista works just like using any other tuner in Media Center: change channels and record shows like you normally would; the only difference is that you now have access to premium and non-OTA HD content. Availability will be in OEM systems on January 30th with Vista’s launch.
ASUS Demos Affordable Ultra-Portable Notebook
Of all of the motherboard companies at CES, ASUS had by far the most impressive product showing. All of the motherboard manufacturers are hard at work on developing new platforms based on Intel’s upcoming Bearlake chipset, so most of the motherboard news at the show is stuff we’ve already heard/seen before. With ASUS, the big news was about notebooks and honestly just how far ASUS has come as a notebook maker.
It wasn’t too long ago that ASUS made fairly boring but functional notebooks, but these days you get good performance and good looks.
ASUS’ U1 notebook is based on Intel’s Core Duo U2400, codenamed Yonah, running at 1.06GHz. The system ships with 512MB of DDR2 memory on-board, expandable by up to another 1GB.
One of the more notable features of the U1, other than its 2.2 lb weight, is its LED-backlit 11.1” WXGA screen (1360 x 768). We’re used to seeing notebooks like the U1 from Sony, but seeing one from ASUS will hopefully mean an ultra portable with a more reasonable price tag. The U1 should be available this quarter.
The LED backlight lets ASUS get away with a very thin panel
ASUS’ UMPC was also on display, but the one thing that kept coming to mind was: with notebooks as small and light as the U1, what’s the point of a UMPC like this?
What we’d really like to see is something in a similar form factor to a Blackberry or smartphone, but significantly more powerful. Unfortunately CPU technology isn’t quite there yet, so we’ll have to wait on Intel to truly deliver a revolutionary ultra-mobile CPU before the Ultra Mobile PC will really take off.
External GPU for notebooks
Motherboard manufacturers are always trying to think outside the box and deliver something innovative, and with the XG Station ASUS has done just that. The premise is simple: an external graphics card upgrade for notebooks, achieved through the use of PCI Express and its mobile derivative, ExpressCard.
The XG Station is an external housing with a PCIe x16 slot powered by an external AC adapter. The box connects to your notebook via ExpressCard 34, which is effectively a PCIe x1 interface. The XG Station would appear as another video card, and you’d have to use an external monitor with the card, but you get much better performance than integrated graphics.
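That x1 link is the XG Station’s main constraint. A rough comparison against a desktop x16 slot, using first-generation PCIe’s 250 MB/s per lane per direction:

```python
# Comparing the ExpressCard (PCIe x1) link the XG Station uses
# against a desktop PCIe x16 slot.
MB_PER_LANE = 250  # first-generation PCIe, per direction

x1 = 1 * MB_PER_LANE    # what the notebook link provides
x16 = 16 * MB_PER_LANE  # what a desktop card gets
print(f"x1: {x1} MB/s, x16: {x16} MB/s ({x16 // x1}x more)")
```

Even with a sixteenth of the bandwidth, a discrete 7900 GS over x1 should still comfortably outrun integrated graphics, which is the whole point of the product.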
The current XG Station features a GeForce 7900GS, but future models could support higher end offerings.
It’s currently the only real way to upgrade notebooks with cheap integrated graphics; unfortunately, the XG Station does carry a high price tag - $599 gets you the unit and a 7900 GS.
Water Cooled 8800 GTX
NVIDIA is slowly but surely lifting restrictions on its partners selling factory overclocked GeForce 8800 GTXs. ASUS was one of many companies at the show to demonstrate an overclocked 8800 GTX, running at a 630MHz core and 2.06GHz (effective) memory, up from the stock 575MHz core and 900MHz (1.8GHz effective) memory.
The EN8800 GTX Top AquaTank is water cooled in order to reach higher clock speeds at lower noise levels. Unfortunately, both the water reservoir and the card itself have fans despite the liquid cooling; they just spin at lower speeds.
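Assuming the 2.06GHz memory figure is the effective DDR rate (stock is 900MHz, or 1.8GHz effective), the factory overclock works out as follows:

```python
# Factory overclock margins on the EN8800 GTX Top AquaTank,
# assuming both memory figures are effective (DDR) rates.
def pct_gain(new: float, stock: float) -> float:
    """Percentage increase of new over stock."""
    return (new / stock - 1) * 100

core = pct_gain(630, 575)    # core: 630MHz vs stock 575MHz
mem = pct_gain(2060, 1800)   # memory: 2.06GHz vs stock 1.8GHz effective
print(f"core +{core:.1f}%, memory +{mem:.1f}%")
```

A roughly 10% core bump is a healthy factory overclock for a flagship part, which is presumably why ASUS needed the water cooling in the first place.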
SideShow in Action
ASUS and LG were two companies that had Windows SideShow enabled notebooks up and running at the show. SideShow encompasses an integrated or removable device in your notebook that can play music, access email and display other information while your notebook lid is closed.
The ASUS SideShow device features 1GB of integrated Flash memory to store music and pictures that you want accessible while your notebook lid is closed. Unfortunately you need to first sync your songs and pictures to the SideShow device before you can access them using SideShow. We would rather have something that could access the notebook's integrated drive, but without such functionality SideShow will have limited use.
SideShow supports Windows Vista Gadgets that can be used to display various bits of information while your system is powered down. You can even read downloaded emails on the display, which actually worked reasonably well thanks to a quick interface.
The SideShow device appears as a portable device under Vista, and you can configure which Gadgets are available for use on the device within Windows.
NVIDIA's GeForce 8600, 8500 and 680SE
While NVIDIA wasn't showing off anything impressive itself, there was a lot of very good NVIDIA news that we kept running into at CES. We saw a mobile GeForce 8600 and 8400; later in the show we learned a bit more about these two GPUs.
NVIDIA has two G80 derivatives designed to target the more mainstream segments: G84 and G86. G84 will be the basis of NVIDIA's GeForce 8600 while G86 will be used in what is currently known as the GeForce 8500. Detailed specifications aren't known other than that the chips are supposed to be 80nm, but the expected launch date is around April or May.
Also on the roster is a 320MB GeForce 8800 GTS card, which is expected to be priced at $299. Currently, clock speeds and shader configuration are expected to be no different from the $449 640MB version's; the card will simply have less memory. Honestly, we expect 320MB to be more than enough for most games/resolutions, which may make the 320MB card extremely attractive. Couple it with a Core 2 Duo E4300 and you've got one fast and affordable system.
Despite a decent amount of information about upcoming NVIDIA GPUs, we didn't hear anything about an 80nm G80. Much of what happens with the G80's successor will probably depend on ATI's R600 release schedule.
On the platform side, NVIDIA will be introducing an nForce 680SE chipset, which will be a less overclockable version of the 680i. The price point will be less than $180, but we're still not sure how it will fit into the big picture between the 650i and 680i.
Those looking for NVIDIA's Vista 8800 GTX driver needn't look any further than Microsoft's booth at CES. All of the gaming machines at Microsoft's booth were running nForce 680i motherboards with single GeForce 8800 GTXs, under Windows Vista. The machines were running Crysis and Halo 2, and actually ran reasonably well. Halo 2 was choppy at times and there were some visual bugs with Crysis, but the driver was working and is apparently stable.
We spoke to NVIDIA to understand why there isn't an 8800 Vista driver currently and why we won't see one until Vista's launch. NVIDIA's GPU drivers these days are made up of approximately 20 million lines of code, which as a reference point is about the size of Windows NT 4.0.
Because G70 and G80 are radically different architectures, they each require a separate driver. Combine that with the fact that Windows Vista has completely changed the driver interface, similar in magnitude to what happened between Windows 3.1 and 95, and you've got a "perfect storm" of conditions for driver development. The end result is that for Windows Vista, two 20M line drivers have to be completely re-written (one for G80 and one for all previous architectures). In other words, this isn't a simple port, it's a radical departure from the way things were written before.
There are other elements of Vista driver development that apparently require more work than before. DirectX 9, DX9 SLI, DX10 and DX10 SLI support is provided through four separate binaries, which increases the complexity of testing and the overall driver itself, whereas there was only a single driver in the past.
Interfaces for HD-DVD and Blu-ray video acceleration require a lot more code than before, thanks to the support for a protected path for HD video under Vista. Supporting this protected path for HD content decode means that you can't re-use the video portion of your driver when developing a Vista version.
The last major difference between Windows XP and Vista driver development is that the display engine connecting monitors to the GPUs has been completely redone.
Initial investment in driver development under Vista takes up quite a bit of time, and now we understand a little more of why. While it would be nice to have a driver today, there are always tradeoffs to be made, especially when driver work this intense is involved. Couple that with the recent launch of NVIDIA's G80 GPU, and the decision was made to focus on DX9 and XP drivers in order to make the G80's launch as solid as possible, and to commit to delivering an 8800 driver by Vista's launch.
When the driver is eventually available, NVIDIA expects performance to be on par with, slightly slower than, or slightly faster than the XP driver. What we've seen thus far from other Vista drivers is that performance is slower almost entirely across the board. As stability is currently the primary goal for both ATI and NVIDIA, many compiler optimizations and performance tweaks aren't being used, in order to get a good driver out in time for Vista's launch.
Ageia has big plans for PhysX in '07
We met with Ageia at the show to go over plans for its PhysX accelerator and to find out whether there’s a future for the technology given its rocky start.
Ageia had no issues admitting that the PhysX launch wasn’t very good, so it is committed to fixing problems and doing whatever it can to correct the negative image of the first physics accelerator in 2007.
According to Ageia, 2007 will be a year of really ramping up the quality and quantity of games that implement PhysX support. Ageia hopes to achieve better overall quality through three major steps that are currently being implemented:
1) All PhysX titles that are released must go through some sort of an approval process before they can ship. This gives Ageia some input into the game development process and will hopefully mean that Ageia can pull support if a game doesn’t meet its standards. On the flip side, it also means that if mediocre PhysX implementations make it into games, Ageia will no longer be able to simply blame the developer; in the future, Ageia will be just as responsible as the developer.
2) Performance of a game with PhysX enabled must not be lower than with it disabled - you should no longer have the problem of better physics but lower performance. This is a big step forward for Ageia, as it is difficult to justify spending money on better physics if you end up reducing overall game performance as a tradeoff.
3) PhysX enabled titles must offer some sort of significant improvement with hardware acceleration enabled. Once again this is a sort of certification or stamp of approval by Ageia that the use of PhysX hardware will actually do more for your gameplay than make a nice tech demo.
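Requirement 2 is straightforward to express as a benchmark check. A minimal sketch, using hypothetical frame times rather than any real Ageia test data:

```python
# Sketch of verifying Ageia's second requirement: average frame rate
# with PhysX enabled must not be lower than with it disabled.
# All frame times below are hypothetical examples.

def avg_fps(frame_times_ms):
    """Average frames per second from a list of per-frame times in ms."""
    return 1000 / (sum(frame_times_ms) / len(frame_times_ms))

physx_off = [16.0, 17.0, 15.5]  # hypothetical frame times, PhysX disabled
physx_on = [15.8, 16.2, 15.0]   # hypothetical frame times, PhysX enabled

assert avg_fps(physx_on) >= avg_fps(physx_off), "fails Ageia's requirement 2"
print("passes: enabling PhysX did not reduce frame rate")
```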
Ageia remained fairly vague about how strictly it plans to enforce these requirements, but now it must share the responsibility if PhysX continues to be a failure by the end of 2007. According to Ageia, there will be three AAA game titles released before the end of 2007 that will make substantial use of its PhysX card, above and beyond anything that has been done to date.
Keep in mind that the PhysX core is still built on a 130nm process, so there’s room to reduce cost considerably. Ageia views the current PhysX implementation as a high end offering and plans on introducing lower cost variants to target other markets.
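Ideal die-area scaling gives a rough sense of that cost headroom. The target nodes below are our own assumptions for illustration, and real shrinks rarely hit the ideal:

```python
# Illustrative die-shrink math for a 130nm part like the PhysX core.
# Target nodes are assumptions; actual cost savings would be smaller.

def area_scale(old_nm: float, new_nm: float) -> float:
    """Ideal die-area ratio after a linear shrink of feature size."""
    return (new_nm / old_nm) ** 2

for node in (90, 65):
    s = area_scale(130, node)
    print(f"130nm -> {node}nm: die area ~{s:.0%} of original")
```

Halving (or quartering) the die area means more chips per wafer, which is the main lever Ageia has for those planned lower cost variants.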
We left the Ageia meeting with a fairly strong statement from the company; by the end of 2007 Ageia expects the question of whether or not a PhysX card does anything to go away completely thanks to much better implementations in games and much better title availability.