The First Generation Holodeck by 2016

When AMD first briefed me on the RV770, it told me another interesting story. For the past several years AMD (and ATI before it) has been obsessed with figuring out when it would be possible to render an image so convincing that it was mostly indistinguishable from reality.

Given the right art and a good technique to render the scene, this is totally possible not only within our lifetimes but within the next decade. Unfortunately, that's not enough.

Carrell estimates that the human eye can directly resolve around 7 million pixels, almost twice the resolution of a 30" display. But that's just what it's directly focusing on; add in peripheral vision and the total climbs to around 100 million pixels. The Eyefinity demo I showed earlier was running at 24.5 million pixels on a single GPU; you can estimate that this generation we'll be able to do about 50 million pixels with two GPUs, and one more generation from now we'll hit that 100 million pixel mark. That's two years for a single GPU. Then give it a few more years to render that many pixels with enough complexity to actually look real.
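The arithmetic behind those figures is easy to check; here's a quick sketch in Python (the per-panel resolution is the 30" display's 2560 x 1600, and the eye estimates are Carrell's):

```python
# Pixel budgets from Carrell's estimates and the Eyefinity demo
panel_w, panel_h = 2560, 1600            # one 30" display
single_panel = panel_w * panel_h         # ~4.1 million pixels
demo = 6 * single_panel                  # the six-panel Eyefinity demo
print(f"six-panel demo: {demo / 1e6:.1f} MP")   # ~24.6 MP (the "24.5 million" figure)

eye_focus = 7e6      # what the eye directly resolves
eye_total = 100e6    # including peripheral vision
print(f"demo covers {demo / eye_total:.0%} of the ~100 MP target")
```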

Rendering something at the maximum resolution the human eye can resolve isn't enough, however; you have to feel immersed in the graphics. That's where Eyefinity comes in, or at least where it starts to.

Carrell believes that in seven years we can have the first generation Holodeck up and running. For those of you who aren't familiar with the Trek reference, Carrell believes it'll take seven years to be able to deliver a 180 degree hemispherical display (you're not completely surrounded by displays but at least your forward and peripheral vision is) with positionally accurate and phase accurate sound (both calculated by the GPU in real time). The GPU will also be used to recognize speech, track gestures and track eye movement/position.

It doesn't solve the issue of not being able to walk forward indefinitely, but again this is only the first generation Holodeck.

Eyefinity isn't anywhere close, but if you understand the direction, it's a start.


We're at six 2560 x 1600 displays today; is it too far-fetched to imagine a totally immersive display setup that renders at life-like resolutions?

First person shooters pretty much dictate that you'll need an odd number of displays to avoid your crosshairs spanning multiple monitors. With three displays you can begin to get the immersion effect, but buy five and you'll be completely surrounded by your game. And as I mentioned before, it doesn't require any special application or OS support; the drivers take care of everything, and the setup just appears as a single, large surface.

It seems trivial, but we honestly haven't had the ability to easily drive the ridiculous display setups we always see in sci-fi movies. Eyefinity at least makes it look like we can build the PCs from The Matrix.

Will it succeed? Who knows. Does it sound gimmicky? Sure. Is it cool? Yeah, I'd say so.

If panel prices drop far enough that putting together an Eyefinity display setup doesn't cost more than the graphics card, I think it'd be a much easier sell. Obviously AMD's next-generation GPU is more than just Eyefinity, but you'll hear about the rest later this month.


137 Comments


  • 7Enigma - Friday, September 11, 2009 - link

    I said this after the last GPU launch, but I think AMD/NVIDIA are on a very slippery slope right now. With the vast majority of people (gamers included) using 19-22" monitors, there are really no games that will make last gen's cards sweat at those resolutions. Most people will start to transition to 24" displays, but I do not see a significant number of people going to 30" or above in the next couple of years. This means that for the majority of people (let's face it, CAD/3D modeling/etc. is a minority) there is NO GOOD REASON to actually upgrade.

    We're no longer "forced" to purchase the next great thing to play the newest game well. Think back to F.E.A.R., Oblivion, Crysis (crap coding, but still); all of those games when they debuted were not able to be played even close to the max settings on >19" monitors.

    I haven't seen anything yet coming out this year that will tax my 4870 at my gaming resolution (currently a 19" LCD, looking forward to a 24" upgrade in the next year). That is two generations back (depending on what you consider the 4890) from the 5870, and the MAINSTREAM card at that.

    We are definitely in the age where the GPU, while still the limiting factor for gaming and modeling, has surpassed what is required for the majority of people.

    Don't get me wrong, I love new tech and this card looks potentially incredible, but other than new computer sales and the bleeding edge crowd, who really needs these in the next 12-24 months?
  • Zingam - Friday, September 11, 2009 - link

    Everybody is gonna buy this now and nobody will look at GF380 :D


    Vision and Blindeye technologies by AMD are must-haves!!!

    Nvidia and Intel are doomed!

  • Holly - Friday, September 11, 2009 - link

    Please correct me if I am wrong, but it seems to me that the card simply lacks enough RAM to run at that resolution...

    Forgetting everything except the framebuffer:
    7680*3200*32bits per pixel = 98,304,000 bytes
    now add 4x FSAA... 98,304,000 * 4*4 = 1,572,864,000 bytes (almost 1.5 GB)

    we are quite close to the ceiling already if the 2GB on-card RAM information is correct... and we dropped the Z-buffer, stencil buffer, textures, everything...
  • Zool - Saturday, September 12, 2009 - link

    "7680*3200*32bits per pixel = 98,304,000 bytes" u are changing bits to bytes :P. That's only 98,304,000/8 bytes.
  • Holly - Sunday, September 13, 2009 - link

    No, I wrote it in bits so people are not puzzled about where the 4-byte multiplication came from... 7680*3200*4 = 98,304,000 bytes.
    If it were in bits: 7680*3200*32 = 786,432,000 bits.
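For what it's worth, Holly's original arithmetic checks out. A quick sketch (assuming 32-bit color and, as in her comment, modeling 4x FSAA as a 4x4 supersample, i.e. 16 samples per pixel):

```python
# Framebuffer estimate for a 3x2 wall of 2560x1600 panels (7680x3200)
width, height = 3 * 2560, 2 * 1600
bytes_per_pixel = 4                      # 32 bits of color per pixel
framebuffer = width * height * bytes_per_pixel
print(f"{framebuffer:,} bytes")          # 98,304,000 bytes (~94 MB)

# 4x FSAA as 4x supersampling along each axis: 16 samples per pixel
with_aa = framebuffer * 4 * 4
print(f"{with_aa:,} bytes")              # 1,572,864,000 bytes (~1.5 GB)
```

Note that real 4x MSAA stores 4 samples per pixel, not 16; the 4x4 figure above just reproduces the calculation in the thread.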
  • Dudler - Friday, September 11, 2009 - link

    They use the Trillian (Six?) card. AFAIK no specs have been leaked about this card. The 5870 will come in 1 and 2 gig flavours; maybe the "Six" will come with 4 as an option?

  • poohbear - Friday, September 11, 2009 - link

    Why is so much attention given to its support for 6 monitors? That's cool and all, but who on earth is gonna use that feature? Seriously, let's write stuff for your target middle class audience: techies that generally don't have $1600, nor the space, to spend on 6 displays.
  • Dudler - Friday, September 11, 2009 - link

    A 30" screen is more expensive than two 24" screens. So when Samsung comes out with thin-bezel screens (and the others WILL follow suit), high resolutions will become affordable.

    So seriously, this is written for the "middle" class audience, you just have to understand the ramifications of this technology.

    And as far as I know, OLED screens can be made without bezels entirely... I guess the screen manufacturers are going to push that tech faster now, since it actually can make a difference.
  • jimhsu - Friday, September 11, 2009 - link

    I remember when 17-inch LCDs looked horrible and cost almost $2000. This is a bargain by comparison, assuming you have the app to take advantage of it.
  • camylarde - Friday, September 11, 2009 - link

    8th - Lynnfield article and everybody drools to death about it.
    10th - 58xx blog post and everybody forgets Lynnfield and talks about AMD.

    15th - Wonder what Nvidia comes up with ;-)
