Lenovo’s Yoga 13: Ultrabook, IPS, Windows 8, and Convertible

Last on my list of impressive showings at CES is the Lenovo Yoga 13. This is another ultrabook, and if you weren’t at the show, let me just say that Intel is pushing ultrabooks in a major way. We’ve reviewed several shipping ultrabooks, and I can guarantee there will be many more to come. Every laptop manufacturer had one (or more) on display, and Intel’s booth used probably half of their public floor space to show off ultrabooks and related technologies. So far, none of the ultrabooks we’ve reviewed have really nailed every area, but when the Yoga 13 starts shipping that might finally change.

The short summary is that the Yoga 13 sports a 1600x900 IPS touchscreen panel, and it’s beautiful to behold. How Lenovo manages to cram a touchscreen, an IPS panel, and a folding laptop/tablet hinge into a 17mm-thick chassis is something of a mystery. Okay, perhaps it’s not that mysterious—I expect the device will carry a pretty steep price tag, but hopefully it will be worth buying. The design felt solid in the hand, the soft-touch coating on the palm rest is great, and with an Ivy Bridge CPU and an SSD, performance should be there as well. The only major complaint I have is that the IdeaPad Yoga 13 won’t start shipping until the Windows 8 release, and I want to test one now (or at least when Ivy Bridge officially launches).

Best of Show Summary

I didn’t intentionally set out to find a top three of CES that all shared a common theme, but it’s there nonetheless. For anyone who uses a computer or tablet, or who watches TV and movies, the one thing you always have to see is the display. Put in a great display and you can rise above the crowd; cut corners and you enter the race to the bottom that has brought about the cheap construction and poor quality that run rampant at Best Buy, Office Depot, etc.

Long term, the higher quality displays in tablets and HDTVs are eventually going to force laptops to adopt better, higher resolution displays. What's sad is that I have a 1920x1200 laptop from five years ago, and that display probably cost the manufacturer $350 (possibly less). Today's $350 displays are almost universally worse, other than having brighter LED backlighting. Meanwhile, the $1000 2.8GHz Core 2 Extreme (dual-core) CPU in the laptop is now slower than even a basic $130 Core i3-2310M in most tasks, and this formerly $4000 laptop is also slower than today's laptops that cost just $750. The price-performance ratio has shifted an order of magnitude in five years, but laptop displays continue to stagnate.

I hope we’re nearing the inflection point where consumers will start asking for better laptop displays. When all the tablets at Best Buy are WUXGA, QXGA, or even QHD/QWXGA, advertising a laptop as having a 720p panel ought to be a hard sell, even for Joe Sixpack. I also hope that Windows 8 will revamp the handling of high-DPI displays; Windows 7 does a bit better than Vista, and both are a big step up from XP, but I still routinely encounter applications that don’t scale with DPI settings. When such applications are written with the assumption that everything runs at 96 DPI—and worse, when they have a fixed window size—the result is text that overruns the viewable area and buttons that are unclickable. I’d guess Metro apps will all scale nicely with DPI settings, but we’ll have to see how many apps (and users) eschew Metro on desktops and laptops and stick with the familiar desktop interface.
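
To see why the 96 DPI assumption breaks down, a quick back-of-the-envelope sketch helps (the panel densities below are hypothetical examples, not measurements of any specific laptop):

```python
# A window designed against the legacy 96 DPI assumption shrinks
# physically as panel density rises; text drawn at a fixed pixel
# size shrinks right along with it.
ASSUMED_DPI = 96  # the density many legacy Windows apps hard-code

def physical_inches(pixels, actual_dpi):
    """Physical extent of a fixed pixel dimension on a given panel."""
    return pixels / actual_dpi

# Hypothetical low-, mid-, and high-density panels
for dpi in (96, 132, 192):
    w = physical_inches(800, dpi)
    h = physical_inches(600, dpi)
    print(f"{dpi:3d} DPI: an 800x600 window spans {w:.1f} x {h:.1f} inches")
```

At 192 DPI the same 800x600 dialog occupies roughly a quarter of the area it was designed for, which is exactly when fixed-size windows and unscaled fonts become unusable.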

Wrap Up

That takes care of my top three, but as I noted in the introduction I didn’t even see a fraction of the show floor. (I could also do a bottom three of CES, but that’s too easy: the taxi lines and crowds take slots one and two for me, and the pay-$12-per-day-for-lousy-Internet gets the third. But I digress.) Even with ten editors from AnandTech running around, I’m sure we missed covering a lot of cool technology and gadgets, so I’m curious to know: what did you see or read about at CES 2012 that impressed you most? What would you like to see us cover sooner rather than later? Let us know in the comments!

78 Comments

  • cheinonen - Tuesday, January 17, 2012 - link

    I was covering home theater and video, and only got to spend two days on the show floor, but Sony's CrystalLED prototype was just amazing. Very bright, 180 degree viewing angles with no color or contrast shifts, near infinite contrast ratios, and perfect motion with no blurring or other motion artifacts. I can only hope that Sony decides to release it at an affordable cost, as it's just amazing to see.

    The OLED sets might have been almost as good, but the off-angles were not as good, and the demo content was not good for getting an idea of the quality compared to Sony. Of course they might ship this year and we have no idea when/if the Sony will be released. The 8K panel from Sharp was also just a proof-of-concept design, but amazingly detailed to the point that you can stick your head next to it and see no pixels. The contrast and angles were not nearly as good as the CrystalLED, though.

    Nothing in Blu-ray really amazed me, as the only different feature I really saw was Sony offering 4K upconversion on their new player for their 4K projector, but I'd need a 4K projector to be able to evaluate that anyway. Overall it was the new panel technologies that really stood out to me.
    Reply
  • AnnihilatorX - Wednesday, January 18, 2012 - link

    A ZDNet article said something different regarding the CrystalLED:

    "Reports from the show floor came away impressed, if not awed. Engadget said the sample set on view failed to show off the speedy refresh rates, and our sister site CNET found that OLED TVs provided a bit more “wow.” CNET also posted a short video examining Sony’s Crystal LED Display in more detail that you can watch here."
    Reply
  • cheinonen - Wednesday, January 18, 2012 - link

    Which is fine. The OLEDs might be better, but the way the demo was set up on the floor I just couldn't get a good idea of it, and the color shift on the LG model was a bit annoying since the Sony LED set had absolutely zero shift. I believe that Samsung had a demo unit set up in a private room that some journalists managed to see, though I did not, so that might have had better material or a better environment and led to a better response than I had. The other AV writers that I talked to during and after the show came away a bit split on the two, though we all want one of them in our living rooms.

    Unfortunately, no video that anyone took will do justice to the motion on the CrystalLED, since you'll be watching it on a conventional display. I imagine it might never come out, but we can all hope Sony finds a way to produce it since the results were amazing.
    Reply
  • B3an - Wednesday, January 18, 2012 - link

    What's the difference between OLED and Crystal LED? Is Crystal LED just Sony's marketing BS for OLED? They both seem extremely similar.

    The Samsung TV at the show had a "Super OLED" display though. Super OLED sets don't use a color filter, resulting in pictures with deeper contrast and finer detail. So it should have been better.
    Reply
  • therealnickdanger - Wednesday, January 18, 2012 - link

    Realistically, CLED will likely never see the light of day. Sony stated that it was a tech demo and that they have no current plans to produce it. Considering each pixel is composed of 3 LEDs (RGB) on a chip, the display would be cost-prohibitive to build and sell in any mass market. Sony can't "choose" to release it at an affordable cost unless they find a way to make cheaper LEDs and cheaper ways to connect them all.

    Even if you could buy a single LED for $0.01 (one cent USD - which you can't), you would need over 6 million of them. I'll do the math for you: roughly $60,000 for just one display. And that's only for 1080p; 4K will be mainstream before this tech is. LEDs have been in mass use for decades in all manner of electronics, and the prices aren't even close to making LEDs cheap enough for this tech to work.

    This is where OLED comes in as a realistic alternative, though as I understand it, they still need to work on image retention.
    Reply
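
As an aside, the per-LED arithmetic in the comment above is easy to check; note the one-cent price is the commenter's hypothetical, not a real component cost:

```python
# Back-of-the-envelope check of the per-LED cost argument:
# each pixel of a 1080p panel needs discrete R, G, and B emitters.
width, height = 1920, 1080   # 1080p panel
leds_per_pixel = 3           # separate red, green, and blue LEDs
price_per_led = 0.01         # USD, hypothetical best-case price

leds = width * height * leds_per_pixel
cost = leds * price_per_led
print(f"{leds:,} LEDs -> ${cost:,.0f} in emitters alone")
# 6,220,800 LEDs, or about $62,000 -- in line with the ~$60,000 figure above
```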
  • demonbug - Tuesday, January 17, 2012 - link

    I've seen a lot of discussion of 4k displays following this year's CES, and invariably brief mention is made of the limited source material available. So, what 4k sources ARE available today? What are the demos running off of? What kind of processing power would it take to play, say, a 4k video stream encoded the same way as a Blu-ray (I'm assuming the 40 Mbit max for 2k video would roughly translate to 160 Mbit for 4k)?

    Basically, beyond getting the displays into production, what needs to happen before 4k becomes a wider reality? Have we seen some significant improvement in compression technology in the last 5 years that would make 4k satellite broadcasts possible without sacrificing a huge number of channels?

    4k sounds great, and on the one hand it is just the next logical increment after 2k HD. However, it seems that we are still just barely managing the bitrates required by 2k HD in terms of storage, transmission, and playback; how close are we realistically to making the 4x jump in all of these to make 4k useful?
    Reply
  • JarredWalton - Tuesday, January 17, 2012 - link

    I think 4K will largely be for home theater buffs initially, with Blu-ray players that upconvert to 4K. Then we'll get something post-Blu-ray that will have new DRM and support higher bitrates. Of course, an average bitrate of 50Mbps could still fit a two-hour movie on a 50GB Blu-ray, so maybe it will use the same disc medium but with new standards? Don't know, but we'll see.
    Reply
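
A quick sanity check of the bitrate figures in this exchange (the numbers are the commenters' estimates, not official spec values):

```python
# Does a 2-hour movie at an average 50 Mbps fit on a 50GB dual-layer
# disc, and what does naive 4x pixel scaling do to the 40 Mbps peak?
seconds = 2 * 60 * 60                    # two-hour movie
avg_mbps = 50                            # average video bitrate
size_gb = seconds * avg_mbps / 8 / 1000  # Mbit -> GB (decimal GB, as discs are rated)
print(f"2h @ {avg_mbps} Mbps = {size_gb:.0f} GB")  # 45 GB: fits on a 50GB disc

peak_1080p = 40                          # Mbps, the cited 2k maximum
naive_4k = peak_1080p * 4                # 4x the pixels -> roughly 4x the bits
print(f"naive 4K peak: {naive_4k} Mbps")
```

So the arithmetic holds: 45GB of video squeaks under a 50GB dual-layer disc, while a naive 4x scaling of the peak rate lands at the 160 Mbit figure quoted above.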
  • hechacker1 - Tuesday, January 17, 2012 - link

    Doesn't Blu-ray scale with layers? AFAIK, they've demonstrated versions with 10 or more layers. So we'd just need updated drives to read them.
    Reply
  • Fanfoot - Tuesday, January 17, 2012 - link

    From doing a bit of Googling, it looks like 100GB is the likely requirement for 4K movies, which means 4 layers rather than 2. Apparently most Blu-ray disc players can only read 2 layers, so they would have to be upgraded. I suspect the bitrates would blow them up even if they did support the BDXL format...
    Reply
  • chizow - Wednesday, January 18, 2012 - link

    @Jarred,

    Why wait for home theater/movie buffs to catch up when PC gaming could take full advantage of this tech today?

    We just need 4K/2K to be supported over a single connector or for both IHVs to implement their professional single resolution over multiple display solutions on desktop parts, like the Quadro version described here:

    http://www.pcper.com/reviews/Graphics-Cards/NVIDIA...
    "professional level multi-display technology called "NVIDIA Scalable Visualization Solutions" that will allow multiple monitors to function as a single display to the OS and "just work" with any application."
    Reply
