Jarred’s Best of CES 2012

CES is all wrapped up and everyone is back home (presumably—there are probably a few who remained in Vegas to gamble a bit more, which is to say lose more money), and one of the questions I’ve been asked repeatedly by friends and family is, “What was the coolest thing you saw at CES this year?” Now, keep in mind that I am only one person and I only saw a fraction of the show floor, as plenty of meetings were set up around Vegas, so this is just my perspective on the coolest technology trends at the show. You’ll also notice a common thread in what really impressed me, but this is a highly subjective topic, so take it for what it’s worth: one man’s opinion. (And note that I am specifically not speaking for the other editors; I'm sure most of them would have a different top three.)

I Have Seen the Future, and the Future Is 4K

The most impressive thing I saw at the show was the 4K displays. Several companies had such displays on hand, but I didn’t spend a lot of time with the various display/HDTV vendors, so my first real close-up encounter with a 4K display was in AMD’s meeting rooms. They had a 4K panel hooked up to a Radeon HD 7970 running an in-house demo. The demo itself wasn’t anything special, but the display… wow! I didn’t have a tape measure handy and the AMD reps I asked weren’t sure, but the panel appeared to be a 46” model (possibly 42”). I did check the native resolution, and while I’m not sure if all 4K displays will use the same resolution, this particular panel was running at 4096x2160, making it even wider than the current 16:9 aspect ratio panels (and closer to cinema resolutions); thankfully, with 2160 vertical pixels, I doubt many will complain about the loss of height.

Other than the sheer size of the display, what really stood out was the amazing clarity. The dot pitch at 4096x2160—even on a 46” display!—is slightly smaller than that of a 30” 2560x1600 display. I don’t actually need a finer dot pitch, and I’ve had to increase the DPI setting in Windows to cope with my degrading vision (some text is just too small to comfortably read from a couple feet away), but for videos and images I’m of the opinion that “more is always better” (provided you have the hardware to drive the resolution, obviously). Where I really see 4K being useful, outside of people who love high-DPI computer displays, is for home theater enthusiasts with 60” and larger displays—particularly projectors—where 1080p just doesn’t cut it.
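To put some numbers on that dot pitch claim, here's a quick back-of-the-envelope calculation (a rough sketch, assuming the panel really was 46”):

```python
import math

def ppi(h_pixels, v_pixels, diagonal_inches):
    """Pixels per inch from resolution and diagonal size."""
    return math.hypot(h_pixels, v_pixels) / diagonal_inches

# Hypothetical 46" 4096x2160 panel vs. a 30" 2560x1600 desktop display
ppi_4k = ppi(4096, 2160, 46)   # ~100.7 PPI
ppi_30 = ppi(2560, 1600, 30)   # ~100.6 PPI

# Dot pitch in millimeters (25.4 mm per inch)
pitch_4k = 25.4 / ppi_4k       # ~0.252 mm
pitch_30 = 25.4 / ppi_30       # ~0.252 mm
```

Both work out to roughly 100 PPI (about a 0.25mm dot pitch), with the 4K panel coming out marginally finer.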

If you want another perspective, the consumer electronics industry is always looking for ways to get people to upgrade. When HDTV first came out, you had to choose between 720p and 1080i. A couple years later, 1080p launched and everyone “had to” upgrade. Then of course we had the 120Hz/240Hz/480Hz offerings, and 3D displays got thrown into the mix as well. Now that 1080p 120Hz displays are going for $500-$800 for 40-52” HDTVs, for a lot of people we’re at the point where our displays are good enough to last the next decade. So how do you convince people that they need to upgrade again? You come out with an even better standard. (I also suspect we’ll see a follow up to Blu-ray with native 4K support at some point in the not-too-distant future; that will also be when the content providers come up with a new “unbreakable” DRM standard that will cause a lot of grief and still get cracked within a year of launch.)

Now, I’m all for giant HDTVs, but even I would suggest that a 42” or 46” computer display sitting on your desk would be too much. Still, if I could get an IPS, PLS, or *VA panel and the weight was manageable for my desk, I’d be willing to give it a go. The only drawback I can really see is pricing; I don’t know what these displays will cost when they start showing up en masse at retail, but I wouldn’t be surprised to see five figures for a while. Then again, I remember when 60” plasma displays were going for >$20K about eight years ago, so given another decade we should see these panels in the <$1000 range (for 40-60”). However long it takes, when the price is right I know I’ll be eager to upgrade.

Looking Forward to WUXGA and QXGA Tablets
Comments (78)

  • imaheadcase - Wednesday, January 18, 2012 - link

    Because marketing a TV for thousands of dollars now is not going to appeal to the small market of gamers who care :D

    But I'm with you; if the price was right, I would be willing to be a first adopter if I could get a hands-on preview of it, meaning see how it does hooked up to a computer with a few games.

    Computer monitors have been at a standstill for quite a long time. It basically went CRT to LCD and that's pretty much it. In fact I would venture to say it's gone BACKWARDS for monitors: it used to be anything over 24 inches was 1920x1200. Now you see the market flooded with 22-23 inch 1920x1080 monitors. Or worse, 27 inch 1920x1200.
  • JarredWalton - Wednesday, January 18, 2012 - link

    4K definitely works on PCs -- I don't even care if it requires two connectors. However, I expect the cost to be prohibitively high for a while. I mean, movie theaters and digital film have already been using 4K for a while, but that's just not consumer grade equipment. But yeah, AMD showed you can definitely play games on a 4K display. All you need is the display and a GPU with the necessary ports, though I think the display part is only available special order for >$15K.
  • chizow - Thursday, January 19, 2012 - link

    I asked in the other thread about 4K/2K, but did AMD actually demo any actual PC games? (I just saw some in-house castle demo.)

    That was my point though: with PC gaming we don't need to wait for any content, since most any game that reads resolution caps from Windows will be able to render natively at 4K/2K and output it natively over a single output, or with some help from the driver if it takes two outputs.

    But that makes sense about the price/demand aspect, since no one is going to make $15K displays affordable just for the PC gaming community. I guess our best bet of seeing these displays commoditized in the near future would be the professional graphics space, which is what largely drove the 2560x1600 format and 30" IPS market as well.
  • JarredWalton - Saturday, January 21, 2012 - link

    I only saw their rolling demo, but that's not too surprising. I also poked around at the system from the Windows desktop and everything was as you would expect. I thought I saw a shortcut for a game, but I don't have any pictures of the Windows desktop so I can't confirm or deny. Basically, the game would have to support a single display running off dual DP outputs, I think, but if a game supports Eyefinity that shouldn't be a problem.
  • Fanfoot - Tuesday, January 17, 2012 - link

    Good questions of course.

    One obvious use of 4K displays would be to allow for passive 3D without sacrificing resolution. So while only half the pixels would go to one eye, you'd still have 1080p resolution (with say line doubling on top of it). Assuming anybody cares about 3D of course.

    Another possibility is that displays could do upscaling. So just as we saw EDTVs at the end of the SD life-cycle there could be 4K displays upscaling 1080p content.

    Then of course there's games. An updated Xbox or PS4 could conceivably drive a higher resolution display. It's not clear this will happen of course, but the potential increases with every year that passes before these consoles get a refresh.

    Then of course there's movies on disc. Studios want you to buy Blu-ray movies and not stream stuff over the internet or watch it via your MSO's VOD offering. So a future 4K Blu-ray standard could push higher resolution as one way of trying to stave off the eventual move to all-digital delivery. Sony for example claims to have more than 60 theatrical releases shot in 4K, and there have been a number of high profile pushes for 4K (James Cameron for example is shooting Avatar 2 in 4K). Sony has promised to work with the Blu-ray Disc Association to define a new 4K standard and has promised to release the next Spiderman movie in 4K.

    How are they going to do that? Well... they can already do 1080p 3D, so all they need to do is something less than double that. And the next codec being developed has a goal of another halving of needed bandwidth. So... Or there's always more layers...

    Bit rate for cable or satellite delivery? Well...
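    The bitrate hand-waving in this comment can be put into rough numbers. This is just an illustrative sketch; none of these figures come from a published 4K spec:

    ```python
    # Back-of-the-envelope bitrate sketch. All numbers are
    # illustrative assumptions, not from any published standard.

    BLURAY_1080P_MBPS = 40                         # max AVC video bitrate on Blu-ray
    PIXEL_RATIO = (4096 * 2160) / (1920 * 1080)    # ~4.27x the pixels of 1080p

    # Naive scaling: 4x+ the pixels at the same codec efficiency
    naive_4k_mbps = BLURAY_1080P_MBPS * PIXEL_RATIO   # ~171 Mbps -- impractical

    # HEVC's stated goal: same quality at roughly half the bitrate
    hevc_4k_mbps = naive_4k_mbps / 2                  # ~85 Mbps

    # In practice bitrate doesn't need to scale linearly with pixel count
    # (higher resolutions compress relatively better), so real 4K discs
    # could plausibly land well under the naive estimate.
    ```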
  • Fanfoot - Tuesday, January 17, 2012 - link

    Looks like the JCT-VC (Joint Collaborative Team on Video Coding) is targeting 2013 for the next video codec, currently tagged High Efficiency Video Coding or HEVC. It'll get deployed whether 4K is a reality or not of course, since it'll also allow lowering the bit rate for the same quality, whether for mobile video applications or simply 1080p content streaming over the internet....
  • Fanfoot - Tuesday, January 17, 2012 - link

    And it looks like the existing PS3 will be able to display 4K stills. So these TVs will work great for photo-realistic paintings or simply displaying your high-resolution camera images.
  • PubFiction - Wednesday, January 18, 2012 - link

    To me, 4K displays are going to be about what Eyefinity used to be. Everyone tries to get small bezels and good monitors; 4K offers you that, with no bezel at all. So maybe, if they're fast enough, gamers will want them in place of three monitors.
  • Assimilator87 - Wednesday, January 18, 2012 - link

    120Hz... 4K... OLED...

    >_<
  • Finraziel - Wednesday, January 18, 2012 - link

    I'm actually wondering why on earth we'd need 4K displays at all. I have a full HD 42" plasma (and love it), but I barely see the difference between 1080p and 720p content. Even when downloading for free, I don't bother going for the 1080p version. Same for the bitrate: why do you need the 40Mbit rate that Blu-ray offers when a 4Mbit file (720p 40-minute episodes are generally around 1.2 GB) looks fine?
    What I wish the industry would move towards a bit faster is a higher framerate! Sitting 3 meters away I don't really see more pixels, but I do see judder when the camera is panning around (even though I have a plasma; I'd probably go nuts if I had an LCD with a static backlight). It seems insane to me that with all the improvements to image quality over the last few decades we're still stuck at 24 to 30 frames per second...
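    For what it's worth, the file size figure in this comment checks out; a quick sanity check:

    ```python
    # Sanity check: a ~4 Mbit/s stream over a 40-minute episode.

    bitrate_bps = 4_000_000          # 4 Mbit/s
    seconds = 40 * 60                # 40-minute episode

    size_bytes = bitrate_bps * seconds / 8
    size_gb = size_bytes / 1e9       # decimal gigabytes

    # 4 Mbit/s * 2400 s = 9.6 Gbit = 1.2 GB, matching the ~1.2 GB figure
    ```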
