NVIDIA’s GeForce 800M Lineup for Laptops and Battery Boost
by Jarred Walton on March 12, 2014 12:00 PM EST

Other Features: GameStream, ShadowPlay, Optimus, etc.
Along with Battery Boost, the GTX class of 800M GPUs will now also support NVIDIA’s GameStream and ShadowPlay technologies, again through NVIDIA’s GeForce Experience software. Unlike Battery Boost, these are almost purely software-driven solutions, so they are not strictly limited to 800M hardware. However, the performance requirements are high enough that NVIDIA is limiting their use to GTX GPUs: all GTX 700M and 800M parts will support the features, along with the GTX 680M, 675MX, and 670MX. Basically, all GTX Kepler and Maxwell parts will support GameStream and ShadowPlay; the Kepler/Maxwell requirement exists because GameStream and ShadowPlay both make use of NVIDIA’s hardware video encoder (NVENC).
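As a purely illustrative aside – this is not how GeForce Experience itself checks for support – the hardware encoder that GameStream and ShadowPlay rely on is also exposed to third-party software. The minimal sketch below assumes an ffmpeg build compiled with NVENC support (where the encoder appears as h264_nvenc) and simply checks whether it is present:

```python
import subprocess

def has_nvenc_encoder() -> bool:
    """Return True if this ffmpeg build exposes an NVENC-backed H.264 encoder.

    Illustrative only: requires an ffmpeg binary on PATH that was built with
    NVENC support; GeForce Experience does its own (different) detection.
    """
    try:
        result = subprocess.run(
            ["ffmpeg", "-hide_banner", "-encoders"],
            capture_output=True, text=True, check=True,
        )
    except (OSError, subprocess.CalledProcessError):
        return False  # ffmpeg not installed or failed to run
    return "h264_nvenc" in result.stdout

if __name__ == "__main__":
    print("NVENC H.264 encoder available:", has_nvenc_encoder())
```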
If you haven’t been following NVIDIA’s software updates, the quick summary is that GameStream allows you to stream games from your laptop/desktop to an NVIDIA SHIELD device. Not all games are fully supported/optimized, but there are over 50 officially supported titles, and most Steam games should work via Steam’s Big Picture mode. I haven’t really played with GameStream yet, so I’m not in a position to say much more on the subject right now, but if you don’t mind playing with a gamepad it’s another option for going mobile – within the confines of your home – and it can give you much longer unplugged time. GameStream does require a good WiFi connection (at least a 300Mbps 5GHz connection, though I believe you can try it with slower links), and the list of GameStream-Ready routers can be found online.
On a related note, something I'd really like to see is support for GameStream extended to more than just SHIELD devices. NVIDIA is already able to stream 1080p content in this fashion, and while it might not match the experience of a GTX 880M notebook running natively, it would certainly be a big step up from lower-end GPUs and iGPUs. Considering the majority of work is done on the source side (rendering and encoding a game) and the target device only has to decode a video stream and provide user I/O, it shouldn't be all that difficult. Take it a step further and we could have something akin to the GRID Gaming Beta coupled with a gaming service (Steam, anyone?) and you could potentially get five or six hours of "real" gaming on any supported laptop! Naturally, NVIDIA is in the business of selling GPUs and I don't see them releasing GameStream for non-NVIDIA GPUs (i.e. Intel iGPUs) any time soon, if ever. Still, it's a cool thought and perhaps someone else can accomplish this. (And yes, I know there are already services that are trying to do cloud gaming, but they have various drawbacks; being able to do my own "local cloud gaming" would definitely be cool.)
ShadowPlay targets a slightly different task, namely capturing your best gaming moments. When it’s enabled in GFE, you can press Alt+F10 at any time to save up to the last 20 minutes of gameplay (the duration is user configurable within GFE). Manual recording is also supported, with Alt+F9 used to start/stop recording and a duration limited only by the amount of available disk space. (Both hotkeys are customizable as well.) The performance impact of ShadowPlay is typically around 5% and at most around 10%, and recording maxes out at 1080p (higher resolutions are automatically scaled down to 1080p).
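To make the “save the last 20 minutes” behavior a bit more concrete, the core idea is just a rolling buffer of encoded video segments: the encoder keeps writing, old segments fall off the back, and the hotkey flushes whatever history is currently buffered. The sketch below is a conceptual illustration only – the segment length and buffer sizes are made-up parameters, not NVIDIA’s implementation:

```python
from collections import deque

class ShadowBuffer:
    """Conceptual sketch of a ShadowPlay-style 'shadow' recorder: keep only
    the most recent N seconds of encoded segments and dump them on demand."""

    def __init__(self, history_seconds: int = 20 * 60, segment_seconds: int = 2):
        # e.g. 20 minutes of history split into 2-second segments (assumed values)
        self.segments = deque(maxlen=history_seconds // segment_seconds)

    def push_segment(self, encoded_bytes: bytes) -> None:
        # Called continuously as the hardware encoder emits each segment;
        # once the deque is full, the oldest segment is discarded automatically.
        self.segments.append(encoded_bytes)

    def save(self, path: str) -> None:
        # Hotkey handler: flush whatever history is currently buffered.
        with open(path, "wb") as f:
            for segment in self.segments:
                f.write(segment)

# Usage sketch: push more "segments" than the buffer holds, then save.
buf = ShadowBuffer(history_seconds=10, segment_seconds=2)
for i in range(20):
    buf.push_segment(f"segment-{i}\n".encode())
buf.save("last_10_seconds.bin")  # contains only the newest 5 segments
```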
We’ve mentioned GeForce Experience quite a few times now, and NVIDIA is particularly proud of all the useful features they’ve managed to add to GFE since it first went into open beta at the start of 2013. Initially GFE’s main draw was the ability to apply “optimal” settings to all supported/detected games, but obviously that’s no longer the only reason to use the software. I’m not usually much of a fan of “automagic” game settings, but GFE does tend to provide appropriate defaults, and you can always adjust any settings you don’t agree with. AMD is trying to provide a similar feature via their Raptr gaming service, but NVIDIA, which uses a GPU farm to automatically test and generate settings for all of its GPUs, is definitely ahead for the time being.
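Conceptually, the “optimal settings” feature boils down to a lookup: the GPU farm pre-computes a settings profile for each GPU/game/display combination it tests, and the client detects the installed hardware and games and applies the matching profile. Here is a toy sketch of that idea – the table entries and fallback below are invented examples, not NVIDIA’s actual data:

```python
# Toy illustration of a pre-computed settings lookup; entries are invented
# examples, not NVIDIA's actual recommendations.
OPTIMAL_SETTINGS = {
    ("GTX 880M", "Tomb Raider", "1920x1080"): {"preset": "Ultra", "aa": "FXAA"},
    ("GTX 860M", "Tomb Raider", "1920x1080"): {"preset": "High", "aa": "FXAA"},
    ("840M", "Tomb Raider", "1366x768"): {"preset": "Medium", "aa": "Off"},
}

def recommended_settings(gpu: str, game: str, resolution: str) -> dict:
    # Fall back to a conservative default when no pre-tested profile exists.
    return OPTIMAL_SETTINGS.get(
        (gpu, game, resolution), {"preset": "Medium", "aa": "Off"}
    )

print(recommended_settings("GTX 880M", "Tomb Raider", "1920x1080"))
```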
NVIDIA being ahead of AMD applies to other areas as well, to varying degrees. Optimus has been supported on nearly every laptop equipped with an NVIDIA GPU for a couple of years now, and the number of edge cases where Optimus doesn’t work as expected is quite small – I can’t remember the last time I had any problems with the feature. Enduro tends to work okay on the latest platforms as well, but honestly I haven’t received a new Enduro-enabled laptop in about a year, and there have been plenty of times where Enduro – and AMD’s drivers – have been more than a little frustrating. PhysX and 3D Vision also tend to get used/supported more than the competing solutions, but I’d rate those as being less important in general.
91 Comments
MrSpadge - Wednesday, March 12, 2014 - link
Yeah, 4 letters and 3 numbers are just not enough to get this point across.

lordmocha - Wednesday, March 12, 2014 - link
"and while I can’t verify the numbers they claim to provide better performance with a 840M than Iris Pro 5100 while using less than half as much power."i think you mean 5200
---
the iris pro in the macbook retina 15" is actually quite amazing for the casual gamer:
dota2: 1920x1200 maximum FXAA 67fps
csgo: 1920x1200 high FXAA TRI 82fps (107fps inside office in corridor)
sc2: 1920x1200 texture = ultra | graphics = med = 78fps
gw2: 1920x1200 - Best Appearance 21fps | Autodetect 50fps (60fps on land) | Best Performance 91fps
diablo3: 1920x1200 high = 56fps
blzd - Wednesday, March 12, 2014 - link
While using 2x as much power as a low end dedicated GPU. Intel just threw power efficiency out the window with Iris Pro.

IntelUser2000 - Thursday, March 13, 2014 - link
With Cherry Trail, they will be able to put the HD 4400 level of performance in Atom chips.

Both Nvidia and Intel have secret sauce to tremendously improve performance/watt in the next few years or so to push HPC.
Broadwell should be the first result for Intel in that space, while Nvidia starts with Maxwell. The eventual goal for both companies is 10TFlop DP at about 200W in the 2018-19 timeframe. Obviously the efficiency gains get pushed down into graphics.
lordmocha - Sunday, March 16, 2014 - link
Yes, that is true, but with any gaming laptop you'd only get 2 or 3 hours of battery while gaming, so most laptop gamers play plugged in. Thus it's not a massive issue, but it will affect the few who game away from a power point.
HighTech4US - Wednesday, March 12, 2014 - link
Jarred: "Actually, scratch that; I’m almost certain a GT 740M GDDR5 solution will be faster than the 840M DDR3, though perhaps not as energy efficient."

Someone seems to have forgotten the 2 MB of on-chip cache.
JarredWalton - Wednesday, March 12, 2014 - link
No, I just don't think 2MB is going to effectively hide the fact that you're using 2GB of textures and trying to deal with most of those using a rather tiny amount of memory bandwidth. Does the Xbox One's eSRAM effectively make up for the lack of raw memory bandwidth compared to the PS4? In general, no, and that's with far more than a 2MB cache.

HighTech4US - Thursday, March 13, 2014 - link
So then please explain how the GTX 750 Ti with its 128 bit bus comes very close to the GTX 650 Ti with a 192 bit bus?

JarredWalton - Friday, March 14, 2014 - link
It can help, sure, but you're comparing a chip with a faster GPU and the same RAM: 640 Maxwell shaders at 1189MHz versus 768 Kepler shaders at 1032MHz (plus Boost in both cases). Just on paper, the GTX 750 Ti has 4% more shader processing power. If bandwidth isn't the bottleneck in a game -- and in many cases it won't be with 86.4GB/s of bandwidth -- then the two GPUs are basically equal, and if a game needs a bit more bandwidth, the 650 Ti will win out.

Contrast that with what I'm talking about: a chip with less than 20% of the bandwidth of the 750 Ti. It's one thing to be close when you're at 80+ GB/s, and quite another to be anywhere near acceptable performance at 16GB/s.
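For reference, those bandwidth figures fall straight out of bus width times effective memory data rate. A quick sketch, assuming the GTX 750 Ti's 128-bit bus with 5.4GT/s GDDR5 and a 64-bit DDR3-2000 configuration of the sort an 840M DDR3 would use:

```python
def memory_bandwidth_gb_s(bus_width_bits: int, data_rate_mt_s: float) -> float:
    """Peak memory bandwidth in GB/s: bytes per transfer times transfers per second."""
    return (bus_width_bits / 8) * data_rate_mt_s / 1000

print(memory_bandwidth_gb_s(128, 5400))  # GTX 750 Ti: 86.4 GB/s
print(memory_bandwidth_gb_s(64, 2000))   # 64-bit DDR3-2000: 16.0 GB/s
```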
Death666Angel - Wednesday, March 12, 2014 - link
"Speaking of which, I also want to note that anyone that thinks “gaming laptops” are a joke either needs to temper their requirements or else give some of the latest offerings a shot."You realize that you are speaking to the "PC gaming master race", right? :P