Gaming Notebooks Are Thriving

Wrapping things up, I’ll include a gallery of NVIDIA’s slides at the end, but let’s quickly go over a few interesting items. NVIDIA provided research showing that PC gaming is an extremely large industry – competing in yearly revenue with the likes of movie theater ticket sales, music, and DVD/Blu-ray video sales. Alongside that, the gaming notebook market has grown significantly over the past three years, and even greater growth is expected for 2014. A large part of that is no doubt thanks to Optimus, as it allows potentially any notebook to deliver good gaming performance when you need it without absolutely killing battery life when you don’t need the GPU. The other aspect is that we are simply seeing more GTX class notebooks shipping, thanks to GPUs like the GTX 760M/765M, and with the 850M now moving into the GTX class (which is where NVIDIA draws the line for “gaming notebooks”) we’ll see even more. But it’s not just about names; the following slide is a great illustration of what we’ve seen since 2011:

It’s not too hard to guess what the notebook on the left is (hello, Alienware M17x R3), while the one on the right looks like Gigabyte’s P34G. That’s not really important, but the difference in size is pretty incredible, and what’s more, the laptop on the right is actually 30% faster with its GTX 850M than the GTX 580M from 2011. It also happens to deliver better battery life – gaming or otherwise. Leaner, lighter, and faster are all good things for gaming notebooks. As you would expect, there are quite a few GTX 800M notebooks shipping now or in the very near future (while most other 800M parts will come a bit later). NVIDIA provided the following images along with some other information on upcoming laptops, so if you’re in the market keep an eye out for the following (in alphabetical order).

The Alienware 17 will be updated to support both the GTX 870M and GTX 880M. ASUS’ G750JZ will update the G750JH and move from GTX 780M to GTX 880M (and apparently Optimus will be enabled this round). Gigabyte will have new versions of the P34G, the P34R with 860M and the P34J with 850M, an updated P35R (P35K core design) with 860M, and apparently updates to the P25 and P27 as well (likely with mainstream 800M class GPUs, so specifics haven’t been given yet). Lenovo’s Y50 will be their new gaming notebook, with a single GTX 860M and an optional high-DPI display. MSI will also be updating their GT70, GT60, GS70, and GS60; the GT models will support GTX 880M and 870M while the GS models will support the GTX 870M and 860M. And finally (though I suspect we’ll see Clevo, Toshiba, and Samsung announce products with GTX 800M GPUs at some point, along with perhaps some other OEMs as well), Razer will have a new 14” Blade with GTX 870M and a 17” Blade Pro with GTX 860M – and no, that’s not a typo, though perhaps we’ll see more than one model of Blade Pro as it seems odd for the smaller laptop to support a faster GPU.

Finally, on the topic of the need for discrete GPUs in laptops, NVIDIA noted that over 85% of the top 30 games (according to NPD/Steam sales) remain unplayable with Intel’s HD 4400 (no surprise, as that’s basically the same performance as HD 4000), while 75% still remain unplayable with Iris 5100 – this is at 1366x768 with “default settings” (presumably medium, but it’s not specifically stated). What’s missing is information on what’s playable with Iris Pro; I can say that most games I’ve tested on Iris Pro are able to break 30FPS average frame rates, but ironically power use on the i7-4750HQ laptop I’ve tested is actually worse when gaming than on most laptops with GT class 700M GPUs. NVIDIA shows this in their results as well, and while I can’t verify the numbers, they claim the 840M provides better performance than Iris Pro 5200 while using less than half as much power.

Intel has certainly improved their iGPU performance with the last several processor generations, but unfortunately the higher performance has often come only when given more power – so for example a GT2 or GT3 Haswell part limited to 15W total TDP (i.e. in an Ultrabook) is typically no better than a GT2 Ivy Bridge part with a 17W TDP. Broadwell will likely bring us a “GT4” part (to go along with GT3e), but we’ll have to see if Intel is able to improve performance within the same power envelope when those parts start shipping later this year.


91 Comments


  • MrSpadge - Wednesday, March 12, 2014 - link

    Yeah, 4 letters and 3 numbers are just not enough to get this point across.
  • lordmocha - Wednesday, March 12, 2014 - link

    "and while I can’t verify the numbers they claim to provide better performance with a 840M than Iris Pro 5100 while using less than half as much power."

    i think you mean 5200

    ---

    the iris pro in the macbook retina 15" is actually quite amazing for the casual gamer:
    dota2: 1920x1200 maximum FXAA 67fps
    csgo: 1920x1200 high FXAA TRI 82fps (107fps inside office in corridor)
    sc2: 1920x1200 texture = ultra| graphics = med = 78fps
    gw2: 1920x1200 - Best Appearance 21fps | Autodetect 50fps (60fps on land) | Best Performance 91fps
    diablo3: 1920x1200 high = 56fps
  • blzd - Wednesday, March 12, 2014 - link

    While using 2x as much power as a low end dedicated GPU. Intel just threw power efficiency out the window with Iris Pro.
  • IntelUser2000 - Thursday, March 13, 2014 - link

    With Cherry Trail, they will be able to put the HD 4400 level of performance in Atom chips.

    Both Nvidia and Intel have secret sauce to tremendously improve performance/watt in the next few years or so to push HPC.

    Broadwell should be the first result for Intel in that space, while Nvidia starts with Maxwell. The eventual goal for both companies is 10TFlop DP at about 200W in the 2018-19 timeframe. Obviously the efficiency gains get pushed down into graphics as well.
  • lordmocha - Sunday, March 16, 2014 - link

    yes that is true, but with any gaming laptop you'd only get 2 or 3 hours battery while gaming,

    aka most laptop gamers play plugged in, thus it's not a massive issue, but will affect the few who game not near a power point.
  • HighTech4US - Wednesday, March 12, 2014 - link

    Jarred: Actually, scratch that; I’m almost certain a GT 740M GDDR5 solution will be faster than the 840M DDR3, though perhaps not as energy efficient.

    Someone seems to have forgotten the 2 MB of on-chip cache.
  • JarredWalton - Wednesday, March 12, 2014 - link

    No, I just don't think 2MB is going to effectively hide the fact that you're using 2GB of textures and trying to deal with most of those using a rather tiny amount of memory bandwidth. Does the Xbox One's eSRAM effectively make up for the lack of raw memory bandwidth compared to the PS4? In general, no, and that's with far more than a 2MB cache.
  • HighTech4US - Thursday, March 13, 2014 - link

    So then please explain how the GTX 750 Ti with its 128-bit bus comes very close to the GTX 650 Ti with a 192-bit bus?
  • JarredWalton - Friday, March 14, 2014 - link

    It can help, sure, but you're comparing a chip with 640 Maxwell shaders at 1189MHz to a chip with 768 Kepler shaders at 1032MHz (plus Boost in both cases). Just on paper, the GTX 650 Ti has 4% more shader processing power. If bandwidth isn't the bottleneck in a game -- and in many cases it won't be with 86.4GB/s of bandwidth -- then the two GPUs are basically equal, and if a game needs a bit more bandwidth, the 650 Ti will win out.

    Contrast that with what I'm talking about: a chip with less than 20% of the bandwidth of the 750 Ti. It's one thing to be close when you're at 80+ GB/s, and quite another to be anywhere near acceptable performance at 16GB/s.
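To put rough numbers on the comparison above, here is a quick back-of-the-envelope sketch using only the figures quoted in the thread (shader counts, base clocks, and memory bandwidth; boost clocks are ignored for simplicity):

```python
# Rough "paper" comparison of the GPUs discussed in the thread.
# All figures come from the comment above; boost clocks are ignored.

def shader_throughput(shaders, clock_mhz):
    """Relative paper shader throughput: shader count x clock."""
    return shaders * clock_mhz

gtx_750_ti = shader_throughput(640, 1189)   # Maxwell
gtx_650_ti = shader_throughput(768, 1032)   # Kepler

# On paper the 650 Ti has roughly 4% more raw shader throughput
# (before accounting for Maxwell's per-shader efficiency gains).
ratio = gtx_650_ti / gtx_750_ti
print(f"650 Ti / 750 Ti paper throughput: {ratio:.3f}")

# The bandwidth gap actually at issue: a DDR3 840M (~16 GB/s) has
# less than 20% of the 750 Ti's 86.4 GB/s of GDDR5 bandwidth.
print(f"840M DDR3 bandwidth fraction: {16 / 86.4:.2f}")
```

This is why a large on-chip cache can paper over a modest bandwidth deficit between two GDDR5 cards but is a much harder ask when the total bandwidth is a fifth as large to begin with.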
  • Death666Angel - Wednesday, March 12, 2014 - link

    "Speaking of which, I also want to note that anyone that thinks “gaming laptops” are a joke either needs to temper their requirements or else give some of the latest offerings a shot."
    You realize that you are speaking to the "PC gaming master race", right? :P
