The First Generation Holodeck by 2016

When AMD first told me about the RV770, they also told me another interesting story. For the past several years AMD (and ATI before it) has been obsessed with figuring out when it would be possible to render an image so convincing that it was (at least mostly) indistinguishable from reality.

Given the right art and a good technique to render the scene, this is totally possible not only within our lifetimes but within the next decade. Unfortunately, that's not enough.

Carrell estimates that the human eye can directly resolve around 7 million pixels, almost twice the resolution of a 30" display. But that's just what it's directly focusing on; add in all of the peripheral vision and the total comes to around 100 million pixels. The Eyefinity demo I showed earlier was running at 24.5 million pixels on a single GPU; extrapolating, this generation should manage about 50 million pixels with two GPUs, and one more generation from now we'll hit that 100 million pixel mark. That's two years for a single GPU. Then give it a few more years to render that many pixels with enough complexity to actually look real.
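The back-of-envelope math here can be sketched in a few lines. This is a sketch only: the 7 million and 100 million figures are Carrell's estimates, and the one-doubling-per-generation cadence is inferred from the article rather than any AMD roadmap.

```python
# Pixel math from the article. The eye-resolution figures are
# Carrell's estimates; the doubling cadence is an assumption.

FOVEAL_PIXELS = 7_000_000        # what the eye directly resolves
FULL_FIELD_PIXELS = 100_000_000  # including peripheral vision

# The six-display Eyefinity demo: 6 panels at 2560 x 1600 each
demo_pixels = 6 * 2560 * 1600
print(f"Eyefinity demo: {demo_pixels / 1e6:.1f} million pixels")  # ~24.6

# Two GPUs this generation, then one more doubling next generation
dual_gpu_now = 2 * demo_pixels
next_gen = 2 * dual_gpu_now
print(f"Two GPUs today: ~{dual_gpu_now / 1e6:.0f} million pixels")
print(f"One generation later: ~{next_gen / 1e6:.0f} million pixels")
```

Running through it: the demo is about 24.6 million pixels, two GPUs gets you near 50 million, and one more doubling lands at roughly 98 million, i.e. just shy of the 100 million pixel full-field mark.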

Rendering at the maximum resolution the human eye can resolve isn't enough, however; you have to feel immersed in the graphics. That's where Eyefinity comes in, or at least where it starts to come in.

Carrell believes that in seven years we can have the first generation Holodeck up and running. For those of you who aren't familiar with the Trek reference: Carrell believes it'll take seven years to deliver a 180-degree hemispherical display (you're not completely surrounded by displays, but at least your forward and peripheral vision is) with positionally and phase accurate sound (both calculated by the GPU in real time). The GPU will also be used to recognize speech, track gestures and track eye movement/position.

It doesn't solve the issue of not being able to walk forward indefinitely, but again this is only the first generation Holodeck.

Eyefinity isn't anywhere close, but if you understand the direction, it's a start.


We're at six 2560 x 1600 displays today; is it too far-fetched to imagine a totally immersive display setup that renders at life-like resolutions?

First person shooters pretty much dictate that you'll need an odd number of displays to avoid your crosshairs spanning multiple monitors. With three displays you begin to get the immersion effect, but with five you'll be completely surrounded by your game. And as I mentioned before, it doesn't require any special application or OS support; the drivers take care of everything, and it just appears as a single, large surface.

It seems trivial, but honestly we haven't had the ability to easily support the ridiculous display setups we always see in sci-fi movies. Eyefinity at least makes it look like we can build the PCs from The Matrix.

Will it succeed? Who knows. Does it sound gimmicky? Sure. Is it cool? Yeah, I'd say so.

If panel prices dropped far enough that putting together an Eyefinity display setup didn't cost more than the graphics card, it'd be a much easier sell. Obviously AMD's next-generation GPU is more than just Eyefinity, but you'll hear about the rest late this month.


  • Azarien - Friday, September 11, 2009 - link

    I could do this with two monitors long ago. This has more monitors and maybe fewer bugs ("just works"), but it's still a reimplementation of an old thing. A real multi-monitor setup with independent displays, where a maximized window does NOT span across all displays, is much more usable.
  • Dudler - Friday, September 11, 2009 - link

    So tell me:

    What old card gives you the option of running 6 screens from one card?

    And that should be a consumer product, not a professional one. And if you actually read the article, you'll see that you CAN set up each monitor independently. Or in 5 groups. Or in 4. Or in 3. Or 2. Also as one big screen.
  • therealnickdanger - Thursday, September 10, 2009 - link

    Watch this video closely. There are 24 1080p monitors being rendered by 5800-class Quadfire. Notice how the screens lag when the camera pans? Chances are you wouldn't notice when up close, but it certainly is distracting...

    http://www.youtube.com/watch?v=N6Vf8R_gOec
  • captcanuk - Thursday, September 10, 2009 - link

    Watch this other video and the music will distract you from the lag: http://www.youtube.com/watch?v=tzvfzJq3VTU
  • iwodo - Thursday, September 10, 2009 - link

    Well, OLED can do without the bezel, and I believe LED backlighting can be done as well.
    The good thing about this is, if it really does take off, we can finally GET RID OF TN panels. Having a poor vertical viewing angle would make the experience yuck.
  • Cmiller303 - Thursday, September 10, 2009 - link

    that is fucking hideous
  • wagoo - Thursday, September 10, 2009 - link

    Does the driver support tilted monitors?

    It seems to me three or five monitors in a tilted single row configuration would do best at minimizing bezel annoyance while also giving a nice increase to vertical real estate.

    8000x2560 or 4800x2560? Great if it works...
  • Byrn - Friday, September 11, 2009 - link

    Just what I was thinking...

    Looks like it will from the article here: http://www.techradar.com/news/gaming/hands-on-ati-...
  • SnowleopardPC - Thursday, September 10, 2009 - link

    I had to create an account to comment on this. I am running 2 ATI 4870's with 3 Dell 2408WFP's and a 42 inch Sony XBR on HDMI.

    6 Dell 3008WFP's would be sweet and at 80FPS.

    My only question... WoW? An ATI 1x series card from 15 years ago can run WoW at 80FPS at full res...

    Why not give us some info using a game that can take advantage of a card like that.

    If you are going to pick WoW, at least look at Guild Wars, where the graphics can actually scale to the resolution and test the card out... need I say... does it play Crysis at that res at 80FPS? lol
  • ipay - Friday, September 11, 2009 - link

    Agreed - if they wanted to demo an MMORPG they should've used EVE Online, at least it has a recent graphics engine that doesn't look like ass. WoW's minimum system requirements are hardware T&L for crying out loud... that's the original GeForce!

    AnandTech benchmarked WoW way back in 2005 (see http://www.anandtech.com/video/showdoc.aspx?i=2381...) and the cards there could almost hit 80FPS at 800x600, so I don't think it's that much of an achievement to hit the same performance on 10x the screen real estate.
