OnLive: Gaming Without the Hardware Requirements

I first read about OnLive before I ever set foot on the GDC show floor thanks to Dean Takahashi’s story on the service. If you haven’t read about it, here’s what it does.

Today when you go to play a 3D game like BioShock or Unreal Tournament III, the rendering is done on your PC or console. There's a reasonably powerful GPU and a fast CPU in your machine, and together they calculate the color of every pixel on your screen and the impact of every explosion. The more demanding the game, the more powerful your CPU and GPU have to be.

Instead of doing that rendering locally in your home on your PC or console, OnLive seeks to do it remotely on powerful, high-end servers and simply stream the rendered frames to you, compressed, over the Internet.

OnLive claims that its compression technology is good enough to deliver HD-resolution gaming over a standard Internet connection, without the need for a high-end PC or console. The bandwidth requirements are about 1Mbps for 640 x 480 and 5Mbps for 1280 x 720. The compression is adaptive, so if your network conditions deteriorate, the quality of the video sent to your screen drops to compensate.
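For perspective, a quick back-of-the-envelope sketch of what those numbers imply. Assuming 60 fps and 24-bit color (neither figure confirmed by OnLive), the required compression ratios work out roughly like this:

```python
# Back-of-the-envelope: how aggressive does OnLive's compression need to be?
# Assumes 60 fps and 24 bits per pixel -- figures OnLive has not confirmed.

def required_ratio(width, height, stream_mbps, fps=60, bits_per_pixel=24):
    """Raw (uncompressed) video bitrate divided by the advertised stream bitrate."""
    raw_mbps = width * height * fps * bits_per_pixel / 1_000_000
    return raw_mbps / stream_mbps

print(f"640 x 480 @ 1 Mbps:  {required_ratio(640, 480, 1):.0f}:1")
print(f"1280 x 720 @ 5 Mbps: {required_ratio(1280, 720, 5):.0f}:1")
```

In other words, even the low-resolution stream implies squeezing raw video by a factor of several hundred, which is why adaptive quality matters so much on a lossy connection.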

The benefit of OnLive is that you can play any title on any platform. OnLive will run on PCs and Macs, or connect directly to a TV through a very lightweight hardware client. There's no word on pricing yet.


OnLive running on a MacBook


...and on a Dell notebook


...or on the OnLive "console" hardware: basically a video decoder with Ethernet and a USB port.

OnLive does have major game developer support. The list of developers includes the big names: EA, Ubisoft and Epic.

I played BioShock at OnLive’s booth. The game was streamed to me over the Internet and for the most part, it looked and felt like BioShock. The game didn’t look as good as running on a PC with a high end graphics card, and it felt like there was some sort of frame rate smoothing going on, but the technology worked.
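That slightly off feel likely comes down to the input-to-photon latency budget. Here's an illustrative sketch of where the milliseconds could go; every number in it is my own assumption, not a figure published by OnLive:

```python
# Illustrative input-to-photon latency budget for a streamed game.
# Every value below is an assumption for the sake of the exercise,
# not a figure published by OnLive.

budget_ms = {
    "client input capture + send": 5,
    "network (one way, to server)": 20,
    "server game simulation + render": 17,  # roughly one frame at 60 fps
    "server video encode": 10,
    "network (one way, to client)": 20,
    "client decode + display": 10,
}

total = sum(budget_ms.values())
for stage, ms in budget_ms.items():
    print(f"{stage}: {ms} ms")
print(f"total: {total} ms")
```

Even with generous assumptions, the two network hops and the encode/decode steps add tens of milliseconds that a locally rendered game never pays.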


BioShock over the Internet. Cool.

The implications are huge. You wouldn't need a high-end PC or console to play the latest games. Games could be cheaper thanks to cutting out publisher and retail-store overhead. Cross-platform multiplayer gaming would become possible; if BioShock were a multiplayer game, you wouldn't have to worry about which version was being played: you could play on your Mac against a friend on his/her PC.

OnLive is supposed to launch by the end of 2009.


38 Comments


  • wilkinb - Thursday, March 26, 2009 - link

    In WoW latency can be a big deal...

    Stopping casts in PvP or boss fights is a lot harder when you have high ping times and the cast times are short....

    Also timing CD's right time not be on shot...

    Sure if you are just doing easy content or don't care about arenas then it doesn't matter... but the same can be said about FPSes...

    Also people keep saying WoW isn't graphically intensive.. this again is true when you stand by yourself... when you are on say a WG battleground with say 160 other players and all their spell effects... there are a lot of polygons... It runs slower than Crysis, for example.

    MMOs have to deal with more scaling issues than FPSes.

    MMO's have the issue f

  • wilkinb - Thursday, March 26, 2009 - link

    no idea what happened to some of that text :(

    wtb edit.
  • randomname - Thursday, March 26, 2009 - link

    "I'm sure the video latency will be at least a second."

    In Dean Takahashi's article:

    "A packet can make an entire round trip in 80 milliseconds, a very short amount of time compared to other Internet traffic that travels through hardware that either compresses or decompresses the data."

    I'm assuming that 80 ms has all the essential stages included. Which would be supported by Anand's claim that Bioshock felt like Bioshock.

    Nevertheless, lag, reliability and bandwidth are probably the only obstacles here. All of which can be further improved. Bandwidth-wise, once you get to Blu-ray quality (40/48 Mbps), or at least 1080p60 4:4:4 quality, additional bandwidth won't get you anything, really. On the server side, you would only have to buy a fraction of the number of consoles/computing hardware compared to every user buying one. And if you can do it with games, you can do it with all programs and media.
  • SSDMaster - Thursday, March 26, 2009 - link

    80ms? Okay, wonderful. I have to wait 0.08 seconds before I can move my crosshairs over someone's head, which is only going to be in that specific location for another 0.02 seconds.

    FPSes are too fast-paced for this... Especially UT 2004 types.
    Maybe you could play a horrendously slow-paced FPS like Halo or something.

    Also, the US internet backbone just cannot handle this kind of streaming. Comcast gives me a 30Mbps connection for about 7 minutes. Then they cap it back to whatever I "really" have, which is a 6Mbps connection. Which "should" be enough unless everyone starts using OnLive... Then I'm sure Comcast won't even be able to give me 6Mbps.

    My last point: Internet packets don't always arrive on time or in order; it's not like you have one pipe to your computer. Think of a stream full of rocks, and your packets (water) are taking tons of different paths, all around these rocks.

    That's really not a good enough illustration but I think you guys get the point. If you have a router between you and OnLive's servers which is bogged down but still working, then your connection's going to suck. Not all routers prioritize packets.
  • Modeverything - Thursday, March 26, 2009 - link

    I think one solution to this could be to use UDP instead of TCP.

    Today's networks are not the ones of a decade ago when packet verification was needed. Packets rarely get lost anymore, and if you lose a single packet, big deal, you probably won't notice anyway.

    If OnLive were set up to use UDP transmission, I think it would work.
  • overzealot - Friday, March 27, 2009 - link

    It's a latency-critical app, so they're definitely going to use UDP, possibly with a layer of RTP (or a similar protocol) on top.
  • andrihb - Thursday, March 26, 2009 - link

    Think of them as a series of...
  • Calin - Thursday, March 26, 2009 - link

    Latency might not be a big deal in World of Warcraft, but remember that people balk at the idea of TFT monitors showing an image from 3-5 frames back, and that's less than a tenth of a second. If you're talking about Internet-based game streaming, you should consider the ping response time you get from a server while you're streaming something (probably from that same server). Let's say, ping to a game server while streaming high-quality video.
    The bandwidth really needs to increase for this, and the latency needs to go down.


    As for adaptive quality based on usable bandwidth: you don't know when you're reaching your bandwidth limit, you only know when you're surpassing it, so the game stutters (too little bandwidth = lost packets), then runs for a while in low-quality mode, and then goes back to high-quality mode. For high-quality games, the server would need the equivalent of a high-performance, recent-generation video card and CPU for every gamer out there, plus the hardware or CPU time to compress each stream (video and audio).

    It will probably happen, but it's a long way into the future.
