The First Generation Holodeck by 2016

When AMD first told me about the RV770, they told me another interesting story. For the past several years, AMD (and ATI before it) has been obsessed with figuring out when it would be possible to render an image so convincing that it was mostly indistinguishable from reality.

Given the right art assets and a good rendering technique, this is entirely possible, not just within our lifetimes but within the next decade. Unfortunately, that alone isn't enough.

Carrell estimates that the human eye can directly resolve around 7 million pixels, almost twice the resolution of a 30" display. But that's just what it's directly focused on; peripheral vision brings the total up to around 100 million pixels. The Eyefinity demo I showed earlier was running at 24.5 million pixels on a single GPU; extrapolating, this generation should manage about 50 million pixels with two GPUs, and one more generation from now we'll reach that 100 million pixel mark. That's roughly two years for a single GPU to get there. Then give it a few more years to render that many pixels with enough scene complexity to actually look real.
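To make that arithmetic concrete, here's a quick back-of-the-envelope sketch in Python. It's purely illustrative: the double-every-generation cadence is this article's extrapolation, not an AMD roadmap.

```python
# Back-of-the-envelope pixel math (assumption: pixel throughput roughly
# doubles per GPU generation, with about two years between generations).
panels, w, h = 6, 2560, 1600
demo_mp = panels * w * h / 1e6               # six 30" panels on one GPU
print(f"Eyefinity demo: ~{demo_mp:.1f} MP")  # ~24.6 MP (the 24.5M figure)

mp = demo_mp
for gen in (1, 2):
    mp *= 2                                  # one doubling per generation
    print(f"+{gen} generation(s) (~{2 * gen} yr): ~{mp:.0f} MP")
# +1 generation (~2 yr): ~49 MP -> the ~50 million pixel two-GPU estimate
# +2 generations (~4 yr): ~98 MP -> approaching the 100 million pixel mark
```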

Rendering at the maximum resolution the human eye can resolve isn't enough, however; you have to feel immersed in the graphics. That's where Eyefinity comes in, or at least where it starts to.

Carrell believes that in seven years we can have the first-generation Holodeck up and running. For those of you who aren't familiar with the Star Trek reference: he believes it will take seven years to deliver a 180-degree hemispherical display (you're not completely surrounded by displays, but your forward and peripheral vision is covered) with positionally and phase-accurate sound, both computed by the GPU in real time. The GPU would also be used to recognize speech, track gestures, and track eye movement/position.

It doesn't solve the issue of not being able to walk forward indefinitely, but again this is only the first generation Holodeck.

Eyefinity isn't anywhere close, but if you understand the direction, it's a start.


We're at six 2560 x 1600 displays today; is it too far-fetched to imagine a totally immersive display setup that renders at life-like resolutions?

First-person shooters pretty much dictate an odd number of displays, so your crosshair doesn't span multiple monitors. With three displays you begin to get the immersion effect; with five you'll be completely surrounded by your game. And as I mentioned before, none of this requires special application or OS support; the drivers take care of everything, and the array just appears as a single, large surface.
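To illustrate the odd-versus-even point, here's a hypothetical sketch (the function name and the single-row, 2560-pixel-wide panel assumption are mine, not AMD's): with an even number of columns, the exact horizontal center of the combined surface falls on a panel boundary, i.e. on a bezel.

```python
# Illustrative only: where does a centered crosshair land on a row of panels?
# Bezels sit at every panel boundary (panel_w, 2*panel_w, ...).
def crosshair_on_bezel(cols: int, panel_w: int = 2560) -> bool:
    center_x = (cols * panel_w) // 2    # horizontal center of the surface
    return center_x % panel_w == 0      # True if it falls on a boundary

for cols in range(1, 7):
    print(f"{cols} panel(s) -> crosshair on bezel: {crosshair_on_bezel(cols)}")
# Odd counts (1, 3, 5) keep the crosshair in the middle of a panel;
# even counts (2, 4, 6) split it across a bezel.
```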

It seems trivial, but honestly, we haven't been able to easily support the ridiculous display setups we always see in sci-fi movies. Eyefinity at least makes it look like we can build the PCs from The Matrix.

Will it succeed? Who knows. Does it sound gimmicky? Sure. Is it cool? Yeah, I'd say so.

If panel prices dropped enough that putting together an Eyefinity setup didn't cost more than the graphics card, I think it'd be a much easier sell. Obviously AMD's next-generation GPU is more than just Eyefinity, but you'll hear about the rest later this month.

Comments

  • Zool - Friday, September 11, 2009 - link

    7680 x 3200 is nice, but the DPI is practically the same. And if you sit right in front of it like the guy in the WoW picture, it's even confusing. To get much higher effective DPI you would need to view the display from a much greater distance (and then you couldn't read the chat). Of course, that doesn't apply to the panoramic view in simulators or racing games, where the three-display setup can be useful (arcades, maybe), or in FPS games for extra space for, say, a map or a first-person view from other squad members, like in the old Hired Guns on Amiga and PC.
  • kekewons - Friday, September 11, 2009 - link

    The big attraction for me here is the possibility to run these outputs through projectors, rather than flatscreen monitors.

    I DO spend most of my PC gaming time driving racing simulators (primarily "rFactor"), and do use a projector to throw up a LARGE, CLOSE image. Pixelation is an issue, but, IMO, the rest of the tradeoffs make it worthwhile to go this route.

    What intrigues me about this new card/system are two things: (1) The possibility of running this card output thru two or three-projector rigs, in which one or two "widescreen" projections (covering most or all of a full-surround 180 degree "dome/planetarium" space) are overlaid in the center with a smaller and more highly detailed/higher resolution third projection. If such a rig could be melded with realtime headtracking/eyetracking inside a projection CAVE *or better yet, a dome*, it seems to me we might finally realize the holy-grail: A full-surround, simulation space, at fairly nominal cost.

    (2) The possibility of enabling at least that smaller, central region for 3D (stereo) imaging. Obviously, since this is an AMD card, any stereo output would necessarily depend on alternative solutions to Nvidia 3D...but there is at least one of those solutions that might work: TI's "DLP-link," which apparently can be used to enable some new projectors (ViewSonic) with the new Crystal Eyes-5 shutter glasses to allow 3D output (all without using Nvidia's cards and 3D-specific drivers)....

    ...so let's amend that ^ to read: "A surround, 3D simulation space, at fairly nominal cost."

    Could it be we are finally getting close?
  • marc1000 - Friday, September 11, 2009 - link

    Why is everyone whining about the monitors??? The interesting part is the new GPU. When will you guys get your hands on these new boards to dissect them for us?

    Win7 launches at retail in less than a month, so it's about time to see those new DX11 boards hit the market!
  • coffeehousejam - Friday, September 11, 2009 - link

    Sigh. And here I thought Anandtech readers were a brighter group of people. A 6 monitor setup pumped out of one video card is incredible, no doubt about it. But to the average consumer it's not even close to practical. Everyone is talking about the six display setup capabilities, issues with bezel and LEDs as though they are considering taking advantage of this. Guys, read between the lines: the real story is a GPU that can play DX11 titles so well that even 6 monitors at 4 times the typical person's resolution aren't even enough to bring it to its knees.
  • jimhsu - Friday, September 11, 2009 - link

    No, I'm pretty sure the optimization is resolution scaling (largely memory bound) and not necessarily raw throughput (GPU bound). Unless they have more surprises.

    That's why they'd show a demo of WoW at ultra-high resolutions. Adding FSAA or heavier pixel shaders would be much more stressful.
  • papapapapapapapababy - Friday, September 11, 2009 - link

    Copy-paste from NeoGAF:

    My $85 4770 laughs at this news. Hell my 9800GT and even the 8600GTS sit here unused and laugh at this fuking news. BASICALLY...

    FUCK YOU ATI. FUCK YOU NVIDIA. FUCK YOU AMD. AND FUCK YOUR ASS INTEL. OH AND MICROSOFT. FUCK YOU TOO, YOU STUPID FUCKS. LET ME ASK YOU THIS...

    Where the hell is my KZ2 caliber game, exclusive for the PC? MY GT5 CALIBER GAME? AH?

    Crysis? Fuck that. YES, It was funny game for like a few days, but basically that's just throwing shit at my PC, so you FUCKS can justify selling your ridiculous hardware. That doesn't strike me as a good, intelligent, and honest effort. That's not efficient. That doesn't wow me. KZ2 does. GT5 does ( For the record, im no ps3 fan) And those games are running on a super piece of shit notebook gpu from 2005!!

    So enough of this bullshit. ENOUGH! YOU WANT ME TO BUY YOUR STUPID HARDWARE? WOW ME. USE WHAT I HAVE FOR A FUKING CHANGE. PUT SOME FUCKING EFFORT ON IT. HIGHER ANTI ALIASING AND HIGHER RESOLUTION IS NOT GOING TO CUT IT ANYMORE. IM NOT ASKING FOR MUCH. 720P AND 30FPS IS GOOD ENOUGH FOR ME.
    JUST TAKE WHATS LEFT AND SQUEEZE REALLY HARD. YOU KNOW? LIKE YOU FUCKS DO WITH THE CONSOLES. UNTIL THEN, FUCK YOU.
  • papapapapapapapababy - Friday, September 11, 2009 - link

    http://www.neogaf.com/forum/showthread.php?t=37255...
  • papapapapapapapababy - Friday, September 11, 2009 - link

    Sometimes less is more. That's why I love my HDTV: less resolution (720p), more screen (42"). And that's why I hate desktop LCDs: too much resolution + tiny screens. ATI, this does not appeal to me at all. Give me a GPU that renders at low res and then scales my games (not movies) to 1080p so I can play Crysis on a cheap desktop LCD. This "Eyegimmicky" thing? Stupid.
  • Wererat - Friday, September 11, 2009 - link

    In reading through the first page of comments, bezel issues were mentioned.

    I personally wouldn't want a 2x3 panel setup because of those; any even number of panels puts the center of view in the middle of a bezel.

    1x3 is great though, as race and flight simmers will attest. With that setup, most games (including shooters and RPGs) will give a dramatic peripheral view.

    Unfortunately for Matrox, this more or less kills their $300 "TripleHead2Go" product.
  • justaviking - Friday, September 11, 2009 - link

    Yes, that is indeed a cool demonstration.

    My question is "Can you mix resolutions?"

    Something I cannot do today is clone my desktop to two displays running different resolutions.

    Why would I want to do that? Sometimes I would like to display a game on the television. It accepts VGA input (yes, yes, it's old tech), but I have to change the monitor to the same resolution as the TV in order to do that. You would think it would be so simple to display the same desktop on two monitors, but you can't do it if the resolutions aren't the same.

    Obviously this card (and a hundred others) has the power to do that simple setup. I wonder if it lets you.
