Sony announced at the National Association of Broadcasters (NAB) trade show that it has developed a screen with a 16K resolution based on its Crystal LED technology. The display is currently being installed at a new research center of Shiseido, a Japanese cosmetics group.

The screen is 19.2 meters (63 feet) long and 5.4 meters (17 feet) high, features a diagonal of 783 inches, and is generally larger than a bus. Sony has not disclosed the exact resolution of the display (other than saying that it has around 16,000 horizontal pixels), though judging by the screen's proportions, we are dealing with something that has a non-standard resolution and a non-standard aspect ratio.
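Assuming the stated 19.2 × 5.4 m dimensions and roughly 16,000 horizontal pixels, a quick sanity check recovers the implied pixel pitch, vertical resolution, and diagonal (the ~1.2 mm pitch and ~4,500-pixel height below are our estimates, not figures Sony has confirmed):

```python
import math

# Stated wall dimensions (meters) and approximate horizontal pixel count
width_m, height_m = 19.2, 5.4
h_pixels = 16_000

pitch_mm = width_m * 1000 / h_pixels           # implied pixel pitch: ~1.2 mm
v_pixels = round(height_m * 1000 / pitch_mm)   # implied vertical resolution: ~4,500 px

diagonal_in = math.hypot(width_m, height_m) / 0.0254
print(round(pitch_mm, 2), v_pixels, round(diagonal_in))
```

The computed diagonal of roughly 785 inches is within rounding of the quoted 783 inches, which suggests the stated dimensions and pixel count are at least mutually consistent.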

Sony’s 16K 783-inch screen uses the company’s Crystal LED technology, which combines multiple Micro LED-based modules into custom displays of virtually any size, any resolution, and any aspect ratio. Because the individually-controlled Micro LEDs leave the modules with no bezels, the modules can be attached to each other seamlessly. Sony and Samsung use Micro LED/direct-lit LED-based modules to build custom screens for cinemas, airports, showrooms, and other venues that need large displays.
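The modular approach means the total resolution is simply the module resolution multiplied by the grid layout. A minimal sketch (the module size below is hypothetical, not an actual Crystal LED spec):

```python
def wall_resolution(module_px, grid):
    """Total resolution of a seamless wall of bezel-free modules.

    module_px: (width, height) of one module in pixels
    grid: (columns, rows) of modules
    """
    (mw, mh), (cols, rows) = module_px, grid
    return mw * cols, mh * rows

# Hypothetical 480 x 270-pixel modules in an 8 x 4 grid yield a 4K-class wall
print(wall_resolution((480, 270), (8, 4)))  # (3840, 1080)
```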

As even 8K content is slow to emerge, it will take the industry quite some time to adopt a 16K resolution for professional and consumer applications. Therefore, such ultra-large ultra-high-definition screens will be limited to the corporate/digital signage world for years to come.

Source: BBC


  • LordSojar - Thursday, April 11, 2019 - link

    In 10 years, our walls will be smart home and media center displays. I welcome our robot overlords in 30-40 years; I'll be getting advanced in years by then, so if they kill me it won't be nearly as big of a deal.
  • Notmyusualid - Sunday, April 14, 2019 - link

    Actually Sojar - you'll be first dead!

    The AI will recognise your aura of 'dissent', based on where your Google Glasses pointed, your browsing history (now mandated by law to be undeletable), Kindle books read, songs downloaded, and needless to say YouTube history; Amazon also reported to the cops that you bought a signal-jammer and a money counter... and AI will naturally go after the intelligentsia who might 'awaken' the young.

    I pray for you pal. I suppose I'll be there right alongside you.

    Not many of us left.

    :)
  • shabby - Thursday, April 11, 2019 - link

    20 ppi... i'm blown away!
  • quiksilvr - Thursday, April 11, 2019 - link

    Considering that most movie theaters are about 5-10 ppi, this is actually pretty good.
  • Kevin G - Thursday, April 11, 2019 - link

    This isn’t really that impressive given that it is an LED tile wall. Such aggregate resolutions have been available for years. The way LED tiles work is you essentially build them up like LEGO bricks, and scaling up the backend would permit any arbitrary resolution.

    What would be impressive is if it were seen by devices as one logical display. Currently, given the seamless modular nature of LED, this display is likely a 4x3 configuration of 4K logical displays.
  • Duncan Macdonald - Thursday, April 11, 2019 - link

    Because of bandwidth limitations, a screen greater than 4K has problems with refresh rates above 60 Hz. DisplayPort has a maximum uncompressed HDR resolution of 4K at 60 Hz. If the screen is to have a reasonable refresh rate, then multiple cables will be needed from the driving PC (and the PC will probably need multiple graphics cards). (I assume that it will be driven from a PC, as I am not aware of any non-PC solution that could drive this large a screen.)
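The commenter's point can be made concrete with a back-of-the-envelope calculation, assuming a ~16,000 × 4,500 panel (our estimate; Sony has not published the resolution), 10-bit RGB at 60 Hz, and the ~25.92 Gbps payload of a four-lane DisplayPort HBR3 link:

```python
import math

h, v, hz, bpp = 16_000, 4_500, 60, 30   # estimated resolution, 60 Hz, 10-bit RGB
payload_bps = h * v * hz * bpp          # raw video payload in bits per second
hbr3_payload_bps = 25_920_000_000       # 4-lane DP HBR3 after 8b/10b coding overhead

links = math.ceil(payload_bps / hbr3_payload_bps)
print(payload_bps / 1e9, links)  # ~129.6 Gbps -> 5 uncompressed DP links
```

At roughly 129.6 Gbps of raw video, five full-rate DisplayPort cables would be needed even without blanking overhead, which supports the multiple-cable (and multiple-GPU) conclusion.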
  • Dragonstongue - Thursday, April 11, 2019 - link

    So you basically mean that, contrary to Nvidia's usual BS chest-pounding, multi-GPU still has a use; consumers (gamers) got used to calling it SLI or Crossfire.

    Almost like Nv knows full well top-end folks very likely still need dual GPUs (even if in edge cases), yet sets a harsh limit instead of just being happy to make massive $$$$ by "allowing" multi-GPU, and/or letting different brands pair up with each other, to properly drive these mega resolutions at reasonable levels and settings.

    I am surprised at least some of the top-end TVs do not have their own dedicated controller, so driving that 4K+ "TV" becomes no harder than running a low 3D task instead of mass wasted "work" (look at the newest Final Fantasy's built-in benchmark running crud where it does not need to, just to ensure everything is "loaded" properly, or some crud excuse). If a "dedicated" processor like G-Sync or w/e could actually handle some of the "heavy lifting", that would be great.

    Instead we have games etc. wasting massive potential performance or being hamstrung by BS crud-slinging companies (Intel, Nv, Apple to name some) saying you need something 4-5x more powerful than you actually need, just so they can turn around not long after and trash its performance via "drivers", cutting deals with game publishers etc. to "rinse and repeat".

    At least if the TV or w/e handled a chunk of the load, it could mean you do not in fact need the "branded" kool-aid at $$$$$$$$ to enjoy the newest stuff; just update the much, much less $ "booster" on the TV itself, almost like a Switch on its docking station that "opens it way up" performance-wise, or like the PS4 and its VR "helmet" that combines a few extra chips to "boost" what it normally was not capable of.

    I do not see why they do not do such things, ESPECIALLY for the workstation/photo/gaming crowd. I know they have had chips in TVs for a long time, and routers/modems as well, though it really has not been all that long since they started using a cruddy dedicated processor for memory, clock speeds and the like; this would be taking that the extra step and making it more than a fancy calculator CPU but an actual functional "booster", if that makes sense ^.^

    take care o7
  • Kevin G - Thursday, April 11, 2019 - link

    That is the trick: it appears to end users as one display. To a source, in this Sony example, it is likely eight UHD signals. Oddly, I would expect that there would be 32 LED drivers, as the common platforms are still based around a 1080p design (newer 4K units are out there but far rarer). A multiview processor or some video switchers can take those eight UHD source signals and splice them into the 32 lower-resolution 1080p signals.
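The splice described is a 2 × 2 quadrant split per UHD feed, so eight UHD inputs map to 32 driver-level 1080p signals. A minimal sketch of the mapping (the function and field names are hypothetical):

```python
def split_uhd_to_1080p(num_inputs):
    """Split each 3840 x 2160 input into four 1920 x 1080 quadrants."""
    outputs = []
    for n in range(num_inputs):
        for y in (0, 1080):          # top and bottom halves
            for x in (0, 1920):      # left and right halves
                outputs.append({"input": n, "x": x, "y": y, "w": 1920, "h": 1080})
    return outputs

print(len(split_uhd_to_1080p(8)))  # 32 signals for 32 LED drivers
```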

    As for the refresh rates, LEDs are actually kind of insane. 480 Hz and 960 Hz are rather common for the individual tiles, with some going far, far higher. And these rates can actually be reached, but again, you are generally limited by the normally-1080p-capable LED driver. You'd have to limit the resolution to 640 x 360 coming out of the LED driver (you'd get 540 Hz max), which is basically one cabinet of tiles, and use an insanely large multiview processor/switcher to handle the splicing.

    As for a means of driving it from a single cable, I can actually think of one non-standard means: 100 Gbit Ethernet using a video over IP protocol like SDVoE, a switch with a 100 Gbit uplink, and a video over IP capable LED driver (which do exist). The SDVoE side of things would in reality require the 100 Gbit Ethernet card to be an FPGA, as it would need to handle some of the SDVoE protocol. Source video cards would need to be capable of passing multiple frame buffers directly to the FPGA over the PCIe bus in the host system. From there, the switch takes the multiple frame buffer streams and sends them to the necessary number of IP based LED controllers. This is actually how the future of big LED walls is going to progress, as it simplifies so much on the cabling end, and a switch with a 100 Gbit uplink that can link to multiple IP based LED drivers is actually less expensive than the multiview processors/video switchers. Beyond this, the next logical step is to connect directly via IP to the cabinets for the tiles and remove the more traditional driver boxes entirely. Engineering is already happening to this end on the manufacturer side.
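Whether eight uncompressed UHD60 streams actually fit a 100 Gbit uplink is easy to check, assuming 8-bit 4:4:4 video and ignoring IP framing overhead (in practice SDVoE applies light compression and real links carry protocol overhead, so this is only a rough bound):

```python
streams = 8
stream_gbps = 3840 * 2160 * 60 * 24 / 1e9  # ~11.94 Gbps per uncompressed UHD60 feed
total_gbps = streams * stream_gbps
print(round(total_gbps, 1), total_gbps < 100)  # ~95.6 Gbps: fits, but only barely
```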
  • nathanddrews - Thursday, April 11, 2019 - link

    On the contrary - these MicroLED displays (Sony CLEDIS, Samsung Wall) are very impressive.
    Like any given display, it's up to the controller to set the parameters for display resolution/refresh. One controller can drive an entire group of panels as one display. Doesn't matter if it's made up of 20 panels or just one, the result is seamless if done properly. Last presentation I saw stated that production of smaller sub-panels (finer pitch) is on track for mass adoption by 2022, meaning we'll probably forget all about LCD and OLED by 2024. Higher resolutions, larger displays, way higher nits, perfect black, no burn-in (allegedly), lower power, better viewing angles. Exciting stuff!
  • Kevin G - Thursday, April 11, 2019 - link

    I've actually built a 4320 x 2160 wall (two 4K side by side). For end users, it is just two 4K displays being presented via a switcher system. The backend switcher system took each 4K input and split it into four 1080p signals that then fed into a proprietary interface box. From there, each of the eight interface boxes drove four cabinets of panels, each holding eight actual LED tiles. It really is like building with Lego. Pixel pitch was 1.6 mm on these panels, so this Sony demo actually wasn't that much bigger than the unit I installed, even though it is of lower resolution.

    The wall I built was based off of older NovaStar equipment which is why the LED interface boxes were limited to 1080p. Newer NovaStar and alternative LED drivers now accept 4K inputs. Barco and Christie also make LED driver interfaces that accept a 4K video signal directly from an IP connection (SDVoE).

    The Sony demo referenced here, which uses either a 1.2 mm or 1.25 mm pixel pitch, actually isn't that impressive either. I've personally seen 0.9 mm units, and 0.7 mm-based LED panels are in production now. In about the same area, that'd be more like a 22,400 x 6,300 resolution using the finer-pitched products available today. It just comes down to cost and how much effort is put into the backend to drive that many pixels.
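For a fixed wall size, resolution scales inversely with pixel pitch. A quick sketch, assuming the article's full 19.2 × 5.4 m footprint and the pitches mentioned in the thread:

```python
def resolution_at_pitch(width_m, height_m, pitch_mm):
    # Pixel count along each axis = physical dimension / pixel pitch
    return round(width_m * 1000 / pitch_mm), round(height_m * 1000 / pitch_mm)

for pitch in (1.2, 0.9, 0.7):
    print(pitch, resolution_at_pitch(19.2, 5.4, pitch))
# 1.2 -> (16000, 4500), 0.9 -> (21333, 6000), 0.7 -> (27429, 7714)
```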
