The First Generation Holodeck by 2016

When AMD first briefed me on the RV770, they told me another interesting story. For the past several years, AMD (and ATI before it) has been obsessed with figuring out when it would be possible to render an image so convincing that it was mostly indistinguishable from reality.

Given the right art and a good technique to render the scene, this is totally possible not only within our lifetimes but within the next decade. Unfortunately, that's not enough.

Carrell estimates that the human eye can directly resolve around 7 million pixels, almost twice the resolution of a 30" display. But that's just what the eye is directly focused on; add peripheral vision and the total climbs to around 100 million pixels. The Eyefinity demo I showed earlier was running at 24.5 million pixels on a single GPU. Extrapolating, this generation should be able to do about 50 million pixels with two GPUs, and one more generation from now we'll hit that 100 million pixel marker; call it two years for a single GPU to do the same. Then give it a few more years to render that many pixels with enough complexity to actually look real.
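The scaling math above can be sketched in a few lines of Python. This is a minimal back-of-the-envelope, assuming (as the estimates imply) that pixel throughput roughly doubles each GPU generation:

```python
# Back-of-the-envelope pixel math from Carrell's estimates above.
fovea_pixels = 7_000_000        # what the eye directly resolves
full_view_pixels = 100_000_000  # including peripheral vision

# Six 2560 x 1600 panels, as in the Eyefinity demo:
demo_pixels = 6 * 2560 * 1600   # 24,576,000 (~24.5 MP)

# Assume pixel throughput roughly doubles each GPU generation (~2 years):
for gen in range(3):
    print(f"+{gen} generations: {demo_pixels * 2**gen / 1e6:.1f} MP")
# +0 generations: 24.6 MP
# +1 generations: 49.2 MP  (two GPUs today get here)
# +2 generations: 98.3 MP  (roughly the 100 MP marker)
```

Two doublings from the demo's 24.5 MP lands at about 98 MP, which is why the article pegs the 100 MP marker at one more generation beyond a dual-GPU setup today.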

Rendering at the maximum resolution the human eye can resolve isn't enough, however; you have to feel immersed in the graphics. That's where Eyefinity comes in, or at least where it starts to.

Carrell believes that in seven years we can have the first generation Holodeck up and running. For those of you who aren't familiar with the Trek reference: Carrell believes it'll take seven years to deliver a 180-degree hemispherical display (you're not completely surrounded by displays, but your forward and peripheral vision is covered) with positionally and phase-accurate sound, both calculated by the GPU in real time. The GPU will also be used to recognize speech, track gestures and track eye movement/position.

It doesn't solve the issue of not being able to walk forward indefinitely, but again this is only the first generation Holodeck.

Eyefinity isn't anywhere close, but if you understand the direction, it's a start.

We're at six 2560 x 1600 displays today; is it really so far-fetched to imagine a totally immersive display setup that renders at life-like resolutions?

First person shooters pretty much dictate that you'll need an odd number of displays to keep your crosshairs from spanning multiple monitors. With three displays you begin to get the immersion effect; with five you're completely surrounded by your game. And as I mentioned before, it doesn't require any special application or OS support; the drivers take care of everything, and it just appears as a single, large surface.

It seems trivial, but honestly we haven't had the ability to easily support the ridiculous display setups we always see in sci-fi movies. Eyefinity at least makes it look like we can build the PCs from The Matrix.

Will it succeed? Who knows. Does it sound gimmicky? Sure. Is it cool? Yeah, I'd say so.

If panel prices drop enough that putting together an Eyefinity display setup doesn't cost more than the graphics card, it'll be a much easier sell. Obviously AMD's next-generation GPU is more than just Eyefinity, but you'll hear about the rest later this month.

Comments

  • formulav8 - Friday, September 11, 2009 - link

I'm not sure why you think AMD's CPUs stink. Every benchmark I've seen shows they can run any game out there and push more than 30 fps in all tested games not limited by the video card, often to the point where the video card ends up the bottleneck. No?

Even compared to the i5, the PII 9650 held its own quite well in a lot of areas.

For the past few years Intel has definitely had the trashiest iGPUs and probably will for the foreseeable future. And I wouldn't count on Larrabee to change that much by the time it comes out. You can have the strongest CPU in the world, but with GPU trash like Intel's you can't game at good resolutions and speeds anyway, so you can't make good use of the fastest CPU in the world.

    Just my humble opinion :)


  • TA152H - Friday, September 11, 2009 - link

    I think people sometimes instinctively balance things out, and forget they are even doing it.

Keep in mind that Phenom II processors are the same size as Nehalems, and you're forced to compare the low end of a brain-damaged line against the highest-end AMD CPU, which still generally comes out the loser. That's not a good CPU design, if you think about it.

I don't agree with your remark that great processors don't matter if you don't have the GPU to go with them. Stuff like that just makes you sound irrational and too pro-AMD. Not everyone needs powerful 3D graphics, and more to the point, you can put an ATI card in your Intel motherboard. So, sorry, Intel doesn't need to make a good video card for you to benefit from its CPUs.
  • Kary - Thursday, September 10, 2009 - link

    That might have looked really cool using projectors since that would get rid of your borders...REALLY EXPENSIVE, but cool :)

    Maybe on the side of a building
  • anandreader - Thursday, September 10, 2009 - link

    The Apple store on Michigan in Chicago is using something very similar to what's being shown here. If I recall correctly, they had 8 panels arranged 2 across by 4 down. They were running an animation showing all the iPhone apps that were available. I noticed the display from the other side of Michigan and was impressed enough to cross the street to see how they did it.

The device was probably purpose-built by the LCD manufacturer, as the seams were only about as wide as a single thin bezel on Samsung monitors.
  • Holly - Thursday, September 10, 2009 - link

Hmm, running that is pretty nice, especially at that framerate, but god save us from the electricity bills... screens this size with good colour reproduction draw at least 80 W each, which works out to half a kilowatt for the screens alone.

Anyway, I'm eager to see a detailed review soon... and hopefully the NVIDIA counterpart as well... $1000+ graphics cards are only nice to see in benchmarks, so fingers crossed for fast competition...
  • teldar - Thursday, September 10, 2009 - link

The initial 5870s are actually going to be $450 cards with 2GB of 1300MHz GDDR5 on board.
So they are probably reasonable for what they offer.
  • imaheadcase - Thursday, September 10, 2009 - link

I've yet to see someone use a monitor setup like that for gaming. It looks terrible with the lines between monitors. Video card manufacturers have been trying that for ages now; Matrox already tried and failed at it. Let it die, AMD.

Mission control stuff, or a large desktop for apps? Sure, that'd be nice. But to game on? Yuck.

Case in point: look at the WoW screenshot... the screen is cut right in half where the character is located. lol

Now if some monitor company decides to make monitors that let you remove the bezels around the edges for a seamless setup, then we can talk.
  • Havor - Saturday, September 12, 2009 - link

Apparently you've never played on a 3-monitor setup. (Yeah, a 6-monitor setup would suck.)

I can tell you from personal experience that it's awesome.
I have three 22" widescreens in a Matrox TripleHead2Go setup, so I have no problem with bezels in the middle of my screen.

Yeah, of course the bezels bug me, but only about as much as the roof pillars do when I'm driving my car.
And yeah, a 6-monitor setup just doesn't work in most games (except games like C&C); it would be like putting a band of duct tape at eye height around the car.

If you want to see for yourself, go over to Matrox's gaming site to see some screenshots from, for example, WoW,
or see how it looks on YouTube.

NFS is also awesome on three monitors.

And with the low prices of monitors, anyone can now have a 3x 22" setup.
  • Havor - Saturday, September 12, 2009 - link

    Darn shitty quote and link system on Anandtech

Matrox game site:
WoW site:
YouTube movies:
  • Zingam - Friday, September 11, 2009 - link

What actually counts is that the graphics card is powerful enough to support that high a resolution! And this is just the first-generation DX11 card. I doubt we'll see DX12 anytime soon, so I'd guess the third generation of DX11 cards will rock the earth and break down the walls!
