Word comes from NVIDIA this afternoon that they are rolling out a beta update to their GRID game streaming service. Starting today, the service is adding 1080p60 streaming alongside its existing 720p60 option, with the upgrade initially going out to members of the SHIELD HUB beta group.

Today’s announcement from NVIDIA comes as the company is ramping up for the launch of the SHIELD Android TV and its accompanying commercial GRID service. The new SHIELD console is scheduled to ship this month, while the commercialization of the GRID service is expected to take place in June, with the current free GRID service for existing SHIELD portable/tablet users listed as running through June 30th. Given NVIDIA’s ambitions to begin charging for the service, it was only a matter of time until the company began offering a 1080p option, especially as the SHIELD Android TV will be hooked up to much larger screens, where the limits of 720p would be more easily noticed.

In any case, from a technical perspective NVIDIA has long had the tools necessary to support 1080p streaming – NVIDIA’s video cards already support 1080p60 streaming to SHIELD devices via GameStream – so the big news here is that NVIDIA has finally flipped the switch on their servers and clients. Though given that 1080p has 2.25x as many pixels as 720p, I’m curious whether part of this process has involved NVIDIA adding some faster GRID K520 cards (GK104) to their server clusters, as the lower-end GRID K340 cards (GK107) don’t offer quite the throughput or VRAM one traditionally needs for 1080p at 60fps.
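As a quick sanity check on that figure, the pixel math is easy to verify; a minimal sketch in Python (the resolutions are the standard 1280x720 and 1920x1080):

```python
# Quick sanity check on the pixel math behind the 2.25x figure
pixels_720p = 1280 * 720       # 921,600 pixels per frame
pixels_1080p = 1920 * 1080     # 2,073,600 pixels per frame

print(pixels_1080p / pixels_720p)  # 2.25 -> 1080p pushes 2.25x the pixels
print(pixels_1080p * 60)           # 124,416,000 pixels/second at 60fps
```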

But the truly difficult part of this rollout is on the bandwidth side. With SHIELD 720p streaming already requiring 5-10Mbps of bandwidth and NVIDIA opting for quality over efficiency on the 1080p service, the client bandwidth requirements for the 1080p service are enormous. 1080p GRID will require a 30Mbps connection, with NVIDIA recommending users have a 50Mbps connection to keep other network devices from compromising the game stream. To put this in perspective, no mainstream video streaming service comes close to 30Mbps, and in fact Blu-ray itself tops out at 48Mbps for audio + video. NVIDIA in turn needs to run at a fairly high bitrate to make up for the fact that they have to do all of this encoding in real-time with low latency (as opposed to highly optimized offline encoding), hence the significant bandwidth requirement. Meanwhile 50Mbps+ service in North America is still fairly rare – these requirements all but limit the service to cable and fiber customers – so at least for now only a limited number of people will have the means to take advantage of the higher resolution.
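To put the encoding burden in context, a rough back-of-the-envelope calculation shows how much compression a 30Mbps 1080p60 stream implies. This is a sketch only, assuming the common 8-bit 4:2:0 pixel format (12 bits per pixel); NVIDIA hasn't published GRID's exact codec settings:

```python
# Rough estimate of the compression ratio a 30Mbps 1080p60 stream implies.
# Assumes 8-bit 4:2:0 chroma subsampling (12 bits/pixel) -- an assumption,
# since NVIDIA has not disclosed GRID's exact pixel format or encoder settings.
raw_bps = 1920 * 1080 * 12 * 60   # ~1.49 Gbps of uncompressed 1080p60 video
stream_bps = 30_000_000           # GRID's stated minimum for 1080p60

print(f"Raw 1080p60: {raw_bps / 1e9:.2f} Gbps")                # 1.49 Gbps
print(f"Implied compression: {raw_bps / stream_bps:.0f}:1")    # ~50:1
```

Squeezing video by roughly 50:1 in real time, frame by frame, is exactly the regime where a low-latency encoder has to spend extra bits that a multi-pass offline encoder (like the ones used to master Blu-rays) could save.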

NVIDIA GRID System Requirements

                          720p60                               1080p60
Minimum Bandwidth         10Mbps                               30Mbps
Recommended Bandwidth     N/A                                  50Mbps
Device                    Any SHIELD, Native or Console Mode   Any SHIELD, Console Mode only (no 1080p60 on the tablet's own screen)

As for the games that support 1080p streaming, most, but not all, GRID games support it at this time. NVIDIA’s announcement says that 35 games support 1080p, out of a library of more than 50 games. Meanwhile I’m curious just what kind of graphics settings NVIDIA is using for some of these games. With NVIDIA’s top GRID card being the equivalent of an underclocked GTX 680, older games shouldn’t be an issue, but more cutting-edge games almost certainly require tradeoffs to maintain framerates near 60fps. So I don’t imagine NVIDIA is able to run every last game with all of its settings turned up to maximum.

Finally, NVIDIA’s press release also notes that the company has brought additional datacenters online, again presumably in anticipation of the commercial service launch. A Southwest US datacenter is now available, and a datacenter in Central Europe is said to be available later this month. This brings NVIDIA’s total datacenter count up to six: USA Northwest, USA Southwest, USA East Coast, Northern Europe, Central Europe, and Asia Pacific.

Source: NVIDIA

62 Comments

  • chizow - Wednesday, May 13, 2015 - link

    @xthetenth:

    Why do you have a problem with: "Nvidia is trying their hardest to replace pc gaming with nvidia gaming, and that's not a good thing."?

    Do you think a bunch of verbatim console ports with no PC-specific features or settings options is a good thing? It's just incredibly hypocritical and ironic when AMD fans, supporters, sympathizers and supposed PC enthusiasts constantly say things like:

    1) XYZ game is just another crappy console port. Yet when a company like Nvidia tries to work with devs to bring additional features, even some that work on competitor hardware, it's bad for PC gaming? lol.
    2) We need AMD for competition! Competition is always good! But when Intel and Nvidia compete and establish dominance, competition is bad and they are competing too hard.
    3) Closed and proprietary are evil, even when they drive innovation in the marketplace and produce results (G-Sync, CUDA). But when AMD does proprietary for their own gains and product differentiation, it's A-OK!

    Just some clear examples of the kind of double-standards and hypocrisy AMD and their fans exhibit, regularly.

    Bottom line is this: you say Nvidia is trying to make PC gaming into Nvidia gaming and that it's a bad thing, but what's to stop you from simply not using said features? And do you think a game without Nvidia features is better or worse as a result of them? Just curious.
  • yannigr2 - Thursday, May 14, 2015 - link

    @chizow
    Seeing you all over the internet defending/supporting Nvidia, it is really fun watching you talk about hypocrisy.

    1) Additional features are good. But games that are sponsored by Nvidia don't come with just additional features. They usually come with problems for the competitor's hardware. Sometimes that also means the removal of the competitor's features, like the DirectX 10.1 support that was removed from Assassin's Creed because Nvidia cards were not supporting it at the time.

    2) You can't have competition with Intel and Nvidia controlling the market through financial strength. Nvidia goes even further by trying to lock everybody into their proprietary ecosystem. They try to guarantee that in the future there will be NO competition.

    3) Closed and proprietary are great for driving innovation, but after some time in the market returning the investment to the company that created them, it is in everybody's best interest for them to be replaced by open standards. Because while proprietary stuff can drive innovation in the beginning, it can add obstacles later. AMD created proprietary tech like Mantle, drove innovation in the right direction and then stepped down for DX12 and Vulkan. So yes, when AMD does it, it is A-OK, because they do it the right way. They did the same with AMD64.

    The only hypocrite here is you, fortunately. And yes, a game with a good physics engine looks much better than one without one. And unfortunately games that use PhysX usually underperform in those kinds of visuals when PhysX is turned off, not taking advantage of the rest of the system's resources to create an equal experience. Just a coincidence.
  • chizow - Thursday, May 14, 2015 - link

    @yannigr2: I defend innovation, features, benefits of certain products, you defend garbage and half-assery, see the difference? :D

    1) Any GameWorks game that has additional features implemented by Nvidia is BETTER than the console version, period; if there's a problem, AMD should work on sorting it out. But that's not their MO. Their MO is to half-ass some tech out there, half-support it, and then when there's a problem, claim it's open and thus not their issue! We've seen this dozens of times: FreeSync, HD3D, Mantle, and even the DX10.1 bug you are going waaaay back on. As soon as there are any problems with these AMD solutions, AMD throws it back on the vendor to fix lolol.

    2) No, I don't think you and various other socialist hippies from non-capitalist countries even understand what competition means. You simply want tech sharing in some happy global co-op, except that's not how it works. Nvidia has every right to invest in tech and value-add features that benefit themselves, their users, and their shareholders. They have no obligation to help otherwise, but they still do when it makes sense. That's true competition, and the fact of the matter is, Nvidia's competition and innovation have pushed AMD to the brink. You bet on the loser. The sooner you and the rest of the AMD fanboy hippies get this, the better; but I know you understand this, because you were hypocritically espousing the benefits of the closed and proprietary Mantle for months, until it failed and died a few months ago.

    3) Except Nvidia and any other innovator has no incentive to do this. They are the market leader; they have no obligation to do the work and give it to everyone, especially when all they did was "compete", as you claimed was necessary. So again, stop being hypocritical and acknowledge the fact Nvidia was simply better at competing, because as we have seen with Mantle, AMD attempted to do the same, much to the delight of their fanboys like you; they just FAILED at making it stick. Of course, to any non-fanboy, this was the only possible outcome, because AMD simply did not have the market position, funds, or clout to drive a proprietary API in the marketplace. Lesson learned, with hundreds of millions in resources direly needed elsewhere wasted. And what do you get some 18-24 months later? A late product to combat Maxwell, a nearly full stack of rebrands, and a complete slaughter in the marketplace, nearing historical highs in the 75-80% range in favor of Nvidia.

    So yes, if you have a problem with Nvidia's features, simply turn them off! Enjoy the AMD Radeon experience of dumbed-down console ports; that is what YOU CHOSE when you stupidly voted with your wallet. And now you want to cry about it. lol. GLHF, it will all be over soon.
  • yannigr2 - Thursday, May 14, 2015 - link

    @chizow
    First paragraph: an Nvidia advertisement.

    1) Nvidia's marketing department makes a speech. What do we learn here? DX10.1 was a bug. LOL! Nice one.

    2) It continues. What do we learn here? We are "socialist hippies from non-capitalist countries". Damn. Busted! LOL!

    3) And... it continues. Nvidia market leaders. AMD failure. Got that. Thanks for the info.

    I hope they pay you for this.
  • chizow - Thursday, May 14, 2015 - link

    @yannigr2's usual backpedaling, deflecting, and stupidity when called on his BS:

    1) Yes, it was a bug, but given AMD fanboys' low standards, they would rather have a buggy, faster solution that skipped an entire lighting pass! LOL. BF4 Mantle was another great example of this; I guess it should be fast if it's not rendering everything it should. Remember BF4 Fanboy Fog TM? :D

    http://techreport.com/news/14707/ubisoft-comments-...
    "In addition to addressing reported glitches, the patch will remove support for DX10.1, since we need to rework its implementation. The performance gains seen by players who are currently playing Assassin’s Creed with a DX10.1 graphics card are in large part due to the fact that our implementation removes a render pass during post-effect which is costly. "

    2) Yes, you obviously are, because you have no idea what competition actually means, and when a company competes your fanboy favorite into the ground, suddenly competition is bad and they are competing too hard. Ouch! Stop it! Imma cry. Competition hurts! :'(

    3) Mantle was a failure, to any non-Fanboy. Complete disaster for AMD. And for what? You're Greek, you should know damn well what a Pyrrhic Victory is. So AMD fanboys can claim dead Mantle lives on in spiritual successor Vulkan (hey look, another Greek reference!), but who gives a crap when Vulkan will be irrelevant as well and AMD pumped hundreds of millions into a dead-end API. Funds that would have been better spent elsewhere!!!!!

    Pay for what? LOL. Like I've been getting paid with super awesome monopoly-approved Intel processors for the last 9 years since AMD got Conroe'd!!! Let go of the fanboy reins and enjoy what the tech world has to offer! Free yourself from the bondage of the dying tech bottom-feeders collectively known as AMD fanboys and enjoy!
  • yannigr2 - Thursday, May 14, 2015 - link

    Oh my. Chizow the Nvidia Fanboy has just gone full overclock. So much BS in your comment, so much admiration for Nvidia, so much hate for AMD, so many favorable conclusions, so much one-sided (para)logic. DX10.1 was a bug. Still hilarious. DX10.1 stopped being a bug after Nvidia supported it, of course.
  • chizow - Thursday, May 14, 2015 - link

    Oh my yannigr2, ignorantly commenting as usual, ignoring relevant links with the entire back story and comments from both the vendors and the game developers. But this is the usual MO for AMD and their fanboys. Launch a bunch of promises on a slide deck, let misinformation/FUD grow and bloom, then ignore relevant actual proof to the contrary.

    I am sure you will tell me how you are enjoying how free FreeSync is, flashing all your monitors' firmware and delivering 9-240Hz refresh rates on every cheap monitor on the market with no additional hardware or cost, huh?

    LMAO, idiots.

    PS. Nvidia never supported DX10.1; like Mantle, it was another irrelevant early-adoption effort from AMD. After its features rolled into DX11, however, Nvidia did as they always do: they did DX11 right and of course killed AMD in one of the main DX10.1 features AMD was trumpeting the whole time: tessellation. Then of course, suddenly tessellation isn't important to AMD, Nvidia is competing too hard, devs are using too much tessellation, etc. etc. lol

    QQ more AMD fanboy.
  • yannigr2 - Friday, May 15, 2015 - link

    Yeah right. Game developers. Ubisoft. LOL LOL LOL and more LOLs. Everyone knows Ubisoft and everyone knows their relationship with Nvidia.

    "PS. Nvidia never supported DX10.1"
    Low-end Nvidia 200 series cards (205, 210, 220 and 240) and the OEM 300 series are DX10.1, moron. What? Did the technical department fail again to inform you over in the marketing department?
  • chizow - Friday, May 15, 2015 - link

    Except this happened long before GameWorks, and it is a DIRECT quote from the developer, with independently verified links showing graphical anomalies, so yes, keep burying your fanboy head in the sand; I am sure you will once again stupidly claim AMD's (literally) buggy solutions are better lol.

    PS: Clearly I don't care about low-end tech, so yeah, great, low-end junk OEM parts supported DX10.1, but that doesn't change the fact Nvidia did not care to support it and it became irrelevant until it rolled into DX11, at which point DX10.1 features were suddenly bad for AMD fanboys because Nvidia did them better. :)
  • close - Wednesday, May 13, 2015 - link

    An analogy isn't meant to be perfect, just similar enough. But still, I don't think you understood the main issue and what people are complaining about. Nobody said anything about running Nvidia software on an AMD GPU. It's about being able to run Nvidia software on an Nvidia GPU while also having an AMD GPU in your system. Disabling Nvidia components that I paid for just because the driver detects that I also have an AMD card is plain wrong. It's a crappy way of fighting competition, forcing me to remove an AMD card from my system just so I can use something that I have already paid for. And no, you don't see this on the box.

    What if Intel decided to disable some feature on their CPUs, or in their software support, as soon as an AMD card was detected? How can that be considered fair practice?

    So I'm not hoping to run BMW software on a Merc. I am demanding, however, to be allowed to run the BMW as I bought it, regardless of the fact that I also want to own another brand. I bought a complete set of components + features and I want to use them all, as promised.
