The First Generation Holodeck by 2016

When AMD first told me about the RV770, they told me another interesting story. For the past several years AMD (and ATI before it) has been obsessed with figuring out when it would be possible to render an image so convincing that it was mostly indistinguishable from reality.

Given the right art and a good technique to render the scene, this is totally possible not only within our lifetimes but within the next decade. Unfortunately, that's not enough.

Carrell estimates that the human eye can directly resolve around 7 million pixels, almost twice the resolution of a 30" display. But that's just what it's directly focusing on; peripheral vision brings the total up to around 100 million pixels. The Eyefinity demo I showed earlier was running at 24.5 million pixels on a single GPU. Extrapolating, this generation should manage about 50 million pixels with two GPUs, and one more generation from now we'll hit that 100 million pixel mark. That's two years for a single GPU. Then give it a few more years to be able to render that many pixels with enough complexity to actually look real.
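The arithmetic is easy to sanity check. Here's a quick back-of-the-envelope sketch in Python; the 7 million and 100 million figures are Carrell's estimates, and the doubling-per-generation assumption is this article's extrapolation, not a measured trend:

```python
# Back-of-the-envelope check of Carrell's pixel-count estimates.
# The constants come from the article; doubling per generation is
# an assumption, not a measured scaling law.

DISPLAY_30IN = 2560 * 1600   # a single 30" panel: ~4.1 MP
EYE_DIRECT = 7_000_000       # Carrell: directly resolved by the eye
EYE_TOTAL = 100_000_000      # Carrell: including peripheral vision
DEMO = 24.5e6                # the Eyefinity demo, one GPU

print(f'30" panel: {DISPLAY_30IN / 1e6:.1f} MP '
      f'({EYE_DIRECT / DISPLAY_30IN:.1f}x short of direct vision)')

# Assume single-GPU pixel throughput roughly doubles each generation.
pixels, generations = DEMO, 0
while pixels < EYE_TOTAL:
    pixels *= 2
    generations += 1
print(f"~{generations} doublings to reach {EYE_TOTAL / 1e6:.0f} MP")
```

Running it confirms the scale of the claim: a 30" panel is roughly 1.7x short of direct vision, and from the 24.5 MP demo it takes about three doublings for a single GPU to cross the 100 million pixel mark.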

Rendering at the maximum resolution the human eye can resolve isn't enough, however; you have to feel immersed in the graphics. That's where Eyefinity comes in, or at least where it starts to.

Carrell believes that in seven years we can have the first generation Holodeck up and running. For those of you who aren't familiar with the Trek reference, Carrell believes it'll take seven years to be able to deliver a 180 degree hemispherical display (you're not completely surrounded by displays, but at least your forward and peripheral vision is covered) with positionally and phase-accurate sound (both calculated by the GPU in real time). The GPU will also be used to recognize speech, track gestures and track eye movement/position.

It doesn't solve the issue of not being able to walk forward indefinitely, but again this is only the first generation Holodeck.

Eyefinity isn't anywhere close, but if you understand the direction, it's a start.


We're at six 2560 x 1600 displays today; is it too far-fetched to imagine a totally immersive display setup that renders at life-like resolutions?

First person shooters pretty much dictate that you'll need an odd number of displays to avoid your crosshairs spanning multiple monitors. With three displays you can begin to get the immersion effect, but buy five and you'll be completely surrounded by your game. And as I mentioned before, it doesn't require any special application or OS support; the drivers take care of everything, and it just appears as a single, large surface.
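To make that "single, large surface" concrete, here's a small illustrative sketch (assuming 2560 x 1600 panels and ignoring bezels; the layouts chosen are just examples) of the resolutions the driver would expose for a few grid arrangements:

```python
# Illustrative sketch: the single large surface the driver exposes
# for a few Eyefinity layouts of 2560x1600 panels (bezels ignored).

PANEL_W, PANEL_H = 2560, 1600

def surface(cols: int, rows: int) -> tuple[int, int, float]:
    """Return (width, height, megapixels) for a cols x rows grid."""
    w, h = cols * PANEL_W, rows * PANEL_H
    return w, h, w * h / 1e6

for cols, rows in [(3, 1), (5, 1), (3, 2)]:
    w, h, mp = surface(cols, rows)
    print(f"{cols}x{rows}: {w} x {h} ({mp:.1f} MP)")
```

The 3x2 grid works out to 7680 x 3200, i.e. the roughly 24.5 million pixels of the six-display demo mentioned earlier.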

It seems trivial, but honestly we haven't had the ability to easily support the ridiculous display setups we always see in sci-fi movies. Eyefinity at least makes it look like we can build the PCs from The Matrix.

Will it succeed? Who knows. Does it sound gimmicky? Sure. Is it cool? Yeah, I'd say so.

If panel prices drop far enough that putting together an Eyefinity display setup doesn't cost more than the graphics card, I think it'd be a much easier sell. Obviously AMD's next-generation GPU is more than just Eyefinity, but you'll hear about the rest later this month.

Comments

  • Chlorus - Friday, September 11, 2009 - link

    Can someone do my sanity a favor and ban this idiot? Do you think he actually has a job? Hell, does anyone even think he has a frakking GED? I love how he thinks every review site is engaged in a mass conspiracy against AMD.
  • Lonyo - Thursday, September 10, 2009 - link

    Yeah, that's why the Radeon HD4870 review was "HD4870 - The Card To Get".
    Almost every preview article (this is a preview article) doesn't have any kind of flashy subtitle/comment, just a quick summary of what it's about.
    When the reviews of the ATI DX11 cards come out, I'm sure they'll have some sort of zany subtitle/comment about how amazing they are (compared to the current gen they're sure to be amazing, and based on the rumours I doubt we'll have anything from nvidia by then except pre-release benchmarks, if they feel desperate).
  • skrewler2 - Thursday, September 10, 2009 - link

    I didn't see this in the article. How does a graphics card with two outputs drive 6 displays? What hardware is needed to do this? Is there some kind of splitter or DP box they all plug into? I have two dell 3007wfp-hc's already and this is making me want to buy a 3rd, but I don't know if I need anything else to drive it.
  • Dudler - Friday, September 11, 2009 - link

    I think they use the "Trillian" AIB, also known as the "Six". It features six mini-DisplayPort outputs. Other articles on the net show ATi demonstrating a 24 monitor setup using four (!!) Six cards.

    I don't know which GPU the "Six" cards use; rumours point to both Cypress (aka R870) and Hemlock (aka R800). From the coolers shown, I think it uses "just" Cypress (a single chip, not the dual-GPU part).

    I also believe that the long so-called 5870 card shown in photos around the net is Hemlock (5870 X2), not the 5870.

    And for those of you concerned about your power bill, rumours state that the 5870 uses 28W (!!!!!!) at idle and in 2D.

    This ATi generation rocks; I only hope nVidia will get their card out and survive. Any way you look at it, their future is bleak: their chipset business is coming to an end except for AMD CPUs, and they're late with the GT300.
  • snakeoil - Thursday, September 10, 2009 - link

    Come on, you have a brain: what about two cards in CrossFire?
  • skrewler2 - Thursday, September 10, 2009 - link

    Uh, in the article they said one card was driving 6 monitors.. what the fuck does your comment even mean?
  • wifiwolf - Thursday, September 10, 2009 - link

    from page 1:
    "The OEMs asked AMD for six possible outputs for DisplayPort from their notebook GPUs: up to two internally for notebook panels, up to two externally for conncetors on the side of the notebook and up to two for use via a docking station. In order to fulfill these needs AMD had to build in 6 lanes of DisplayPort outputs into its GPUs, driven by a single display engine. A single display engine could drive any two outputs, similar to how graphics cards work today."
  • mczak - Thursday, September 10, 2009 - link

    I can't make much sense of this, however. Current cards have two independent display controllers, and afaik those two controllers don't really share anything, so referring to them as one display engine doesn't make a lot of sense. So any RV870 really has 6 display controllers (though probably not that many internal TMDS transmitters, so if you wanted to drive more than 2 DVI/HDMI displays I'd guess you're out of luck, or you'd need a card with external TMDS transmitters)?
  • mczak - Thursday, September 10, 2009 - link

    Actually, no: all passive DP/DVI converters should still work. You just need enough space on your desk :-)
  • therealnickdanger - Thursday, September 10, 2009 - link

    Am I the only one here who thinks the real story is not the multi-monitor support, but rather the ridiculous GPU power driving them all? I realize they haven't fully disclosed the specs publicly, but 7000x3000 resolution at over 60fps? The article barely seems impressed by these numbers. Was this the performance expected from single-GPU setups this generation? I didn't see this coming at all and I'm completely floored!

    Also, I'd just like to add that I've always preferred being able to segregate displays so that I can easily maximize multiple applications within their own screens. Having everything as "one giant display" has been possible for years and is less than desirable for everything BUT gaming... IMO
