NVIDIA GeForce GTX 980M and GTX 970M: Mobile to the Maxwell
by Jarred Walton on October 7, 2014 9:00 AM EST
GTX 980M and 970M Notebooks and Conclusion
Today's launch of 980M and 970M is about as much of a "hard launch" as we see with notebook GPUs. Quite a few notebooks should be available for order with the new chips, though it could take a couple weeks or more for orders to process. We were hoping to have the MSI GT72 prior to today's launch, but as noted earlier it should arrive in the next day (or in a few hours even). We'll post a follow-up Pipeline article as soon as we're able showing performance using some of our standard gaming and graphics benchmarks. In the meantime, here's the current list of notebooks that support the new GPUs.
**Upcoming GeForce GTX 980M/970M Notebooks**

| Manufacturer | Model | GPU Options | Size |
|--------------|-------|-------------|------|
| ASUS | G751 | GeForce GTX 980M / GTX 970M | — |
| MSI | GT72 | GeForce GTX 980M / GTX 970M | — |
| MSI | GS60 | GeForce GTX 970M | 15" |
| MSI | GS70 | GeForce GTX 970M | 17" |
| Gigabyte | P35 | GeForce GTX 970M | 15" |
| Gigabyte | Aorus X7 | 2x GeForce GTX 970M (SLI) | 17" |
| Clevo | P150/P157 | GeForce GTX 980M / GTX 970M | — |
| Clevo | P170/P177 | GeForce GTX 980M / GTX 970M | — |
| Clevo | P650 | GeForce GTX 980M / GTX 970M | — |
For their part, NVIDIA has provided performance numbers for both GPUs at different settings in a variety of games, but there's no comparison with other GPUs so the numbers are out of context. As a preview of what to expect, and considering several of the games use the built-in benchmark tools, here's what NVIDIA is reporting; all of the following games were tested at 1080p with the settings indicated:
**NVIDIA Performance Results (1080p)**

| Game | Game Settings | GTX 980M (FPS) | GTX 970M (FPS) |
|------|---------------|----------------|----------------|
| Batman: Arkham Origins | Max, FXAA High, PhysX High | 60 | 45 |
| BioShock Infinite | Ultra, DX11_DDOF | 91 | 69 |
| Crysis 3 | Very High, 4xMSAA | 36 | 26 |
| Far Cry 3 | Ultra, 4xMSAA | 51 | 38 |
| Metro: Last Light | Very High, SSAA | 36 | 27 |
| StarCraft II | Max, 4xMSAA | 68 | 62 |
Most of the games are apparently being run at near-maximum quality settings (though Batman is missing 4xMSAA, it does have PhysX enabled), which is good for putting as much of the bottleneck on the GPU as possible. StarCraft II and Hitman: Absolution (the latter not shown in the table above) appear to be CPU limited, which isn't too surprising for StarCraft II, as it has always been heavily influenced by CPU performance. On average, the GTX 980M outperforms the GTX 970M by 28%, even including the CPU-limited games; if we ignore StarCraft II and Hitman: Absolution, the 980M is 34% faster on average.
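As a quick sanity check on those averages, here's a sketch that computes the 980M's advantage from the six results in the table above (Hitman: Absolution isn't listed there, so the all-games figure comes out slightly different from NVIDIA's 28%):

```python
# Per-game FPS from NVIDIA's table: (GTX 980M, GTX 970M)
results = {
    "Batman: Arkham Origins": (60, 45),
    "BioShock Infinite": (91, 69),
    "Crysis 3": (36, 26),
    "Far Cry 3": (51, 38),
    "Metro: Last Light": (36, 27),
    "StarCraft II": (68, 62),
}

# Fractional advantage of the 980M in each game
advantage = {game: fps980 / fps970 - 1 for game, (fps980, fps970) in results.items()}

mean_all = sum(advantage.values()) / len(advantage)
# Drop the CPU-limited StarCraft II result
gpu_bound = [v for game, v in advantage.items() if game != "StarCraft II"]
mean_gpu = sum(gpu_bound) / len(gpu_bound)

print(f"All listed games: +{mean_all:.0%}")  # ~30%
print(f"GPU-bound games:  +{mean_gpu:.0%}")  # ~34%
```

The GPU-bound average matches the article's 34% figure, and the one CPU-limited game in the table drags the overall mean down by several points.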
Update: our own performance preview of GTX 980M is now available. The short summary is that GTX 980M is about the same performance level as the desktop GTX 770, though obviously with some newer features like DX12 support and VXGI. It's also twice as fast as the GTX 860M and 35% faster than GTX 880M on average.
One of the problems we're starting to run into with mobile GPUs getting so fast is that many laptops still top out at a 1920x1080 display, and even at maximum detail there are plenty of games that will easily break 60 FPS and may start running into CPU bottlenecks. For that reason, NVIDIA is billing the GTX 980M as a mobile GPU that targets playable frame rates at resolutions beyond 1080p, and we'll likely see more high-end notebooks ship with 2560x1440, 3K, or even 4K displays. It's probably a bit too much to assume that 3K gaming at 60 FPS will happen on most titles at maximum quality with the 980M, as games like Metro: Last Light and Crysis 3 can be very taxing, but we're definitely getting close to being able to max out settings on most games.
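To put "beyond 1080p" in perspective, here's a rough sketch of the pixel load at each step up. Assumption: "3K" is taken to mean the common 3200x1800 laptop panel, though exact 3K resolutions vary by vendor.

```python
# Pixel counts relative to 1920x1080; a purely fill-rate-bound game's
# frame rate falls roughly in proportion to pixel count.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "3K":    (3200, 1800),  # assumed; some panels use 2880x1620 instead
    "4K":    (3840, 2160),
}

base = 1920 * 1080
scales = {name: (w * h) / base for name, (w, h) in resolutions.items()}

for name, scale in scales.items():
    print(f"{name}: {scale:.2f}x the pixels of 1080p")
```

By this naive scaling, a title averaging 60 FPS at 1080p would land near 22 FPS at 3K if it were purely pixel bound, which is why maxing out the most demanding games at 3K remains optimistic even for the 980M.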
NVIDIA didn't provide specific numbers for their previous generation mobile GPUs, but they do note that GTX 980M should be around 40% faster than the GTX 880M, which is no mean feat. Compared to the first generation Kepler GPU, GTX 680M, the difference is even larger: 980M is roughly twice as fast as the GTX 680M that launched three years ago. GTX 970M is also supposed to be about 40% faster than the previous generation GTX 870M and on average twice as fast as the GTX 860M.
Wrapping up, we've provided a full gallery of slides from the NVIDIA presentation for those that are interested. We're very much looking forward to some hands-on time testing the GTX 980M, as it should prove to be quite a formidable GPU. That's not too surprising, as GM204 proved to be quite potent on the desktop, with a smaller and more efficient chip able to match and generally exceed the performance of the larger and more power-hungry GTX 780 Ti. The result is that this is as close as notebooks have come to matching desktop performance (for a single GPU) in as long as I've been reviewing notebooks.
Looking forward, performance is always improving and we'll certainly see even faster GPUs in the next year. We also know that NVIDIA is capable of making larger GPUs, so we're still missing the true "Big Maxwell" (i.e. GM200 or GM210). As with GF110 and GK110, I don't expect we'll ever see that chip in consumer notebooks, but we might see GM204 return with even more SMMs enabled. Until NVIDIA comes out with an even bigger and faster Maxwell variant, however, this is the top mobile GPU, and that means it will be priced as such.
We should see GTX 980M in gaming notebooks starting around the $2000 price point (give or take), with GTX 970M launching in notebooks starting at $1600. Based on MSI's pricing of their GT72, it also looks like the GTX 980M may have as much as a $350 price premium over the GTX 970M, or at least that's the difference in pricing for end users. (Ouch.) We're covering the notebooks that have been announced in separate Pipeline articles, and we should see some of them at the usual places like Newegg and Amazon. Stay tuned for our performance results from the MSI GT72, which will go up as soon as we get the laptop and can run some tests.
scottrichardson - Tuesday, October 7, 2014
One can only assume this is the GPU that Apple will slot into their upcoming 'Retina' iMacs, unless the AMD rumours hold true?
tviceman - Tuesday, October 7, 2014
It would be crazy of Apple to use Tonga (the rumor) instead of GM204. If Apple really did choose AMD, Nvidia must not have been willing to budge on price.
RussianSensation - Tuesday, October 7, 2014
Actually they wouldn't be crazy. There are at least 2 reasons why this rumour could be true:
1) Nvidia is filing a lot of patent infringement lawsuits against Samsung, Qualcomm and even Apple. Rumour has it that, as retaliation, Apple will move some of its products to AMD until NV either withdraws the lawsuit or decides to settle on more agreeable terms.
2) NV's GM204 is not cheap and it certainly won't be cheap for the mobile sector. When you are faced with trying to price the iMac 5K without a stratospheric price, you might want to go with a cheaper GPU because that new 5K display will cost an arm and a leg. In other words, since Macs are hardly used for gaming to begin with, you simply balance your priorities to hit appropriate price points your customers will pay based on historical data. It's possible that the inclusion of GM204 would force a new higher end SKU of the iMac when combined with a 5K display.
3) Manufacturing supply. As you can see from the availability of desktop GM204 chips, there are supply issues. Apple might have required NV to provide a certain amount of GM204 chips, and Nvidia couldn't guarantee that many in xx time frame.
Now, all of these are rumours and Apple could still use GM204, or GM204 + AMD GPU + Intel GPU in various SKUs of the iMac. However, there are 3 legitimate reasons why Apple would not use Nvidia's GM204 in the next gen iMac.
Doormat - Tuesday, October 7, 2014
4) As shown in the chart on page 2, the max resolution of the Maxwell GPU is 4K. If Apple is going to field a 5K display in the 27" iMac, they will need a GPU that supports this resolution, likely requiring DisplayPort 1.3 (or some draft version of DP 1.3 that was available when the chips and panel were being designed).
Morawka - Tuesday, October 7, 2014
The patent deal has nothing to do with Apple because Apple does not make GPUs.
NV's 204 has the best perf per watt by a huge margin. We all know that having the best perf per watt is going to cost more than perf per dollar.
With that said, Apple will probably go with AMD for this year's models because Apple always flip-flops GPU vendors between generations.
However, it will run hotter and use more power. And historically, AMD/ATI GPUs within Apple products have seen a much higher rate of recalls and defects in the past.
I'm not sure how a mobile amd chip will perform at 5K. The desktop gpu's seem up to the task but we haven't seen a really good mobile amd discrete gpu in a long time.
name99 - Tuesday, October 7, 2014
nVidia has filed lawsuits against Samsung and Qualcomm. NOT Apple.
They may have plans to sue Apple --- anyone may have plans to do anything --- but right now they have not sued Apple, have not threatened to sue Apple, have not even hinted or suggested or implied that they want to sue Apple.
Manufacturing supply (if this rumor is true) strikes me as far more likely. There is a long history of Apple doing things that seemed (especially to haters) as perverse limitations based on maximally trying to screw their customers over, only for us to learn later that they were limited by supply issues. When you're shipping Apple volumes, you CAN'T simply wish that millions of the last doodab were available when they can only be manufactured in the hundreds of thousands.
We saw this with fingerprint sensors (restricted to the iPhone 5S, not the 5C, iPod touch, or iPads), probably with sapphire on the iPhone 6, and probably with low-power RAM on the iPhone 6 (that being what's restricting them to 1GB, not some nefarious Apple plan).
Of course for this rumor to be true and the explanation to be relevant requires that AMD can manufacture faster than nV (or has acquired a large inventory). It's not clear to me that this has to be true...
chizow - Tuesday, October 7, 2014
It's not surprising actually, AMD is probably throwing these chips at Apple for next to nothing, and Apple probably feels at least some obligation to prop up OpenCL, the standard they championed way back when, over using the proprietary CUDA, by using AMD chips that clearly aren't as good as Nvidia would be for their needs (performance and low TDP).
Apple's hubris probably leads them to believe they can live without Nvidia and weather the growing pains. My personal experience here at work is that users who rely on Adobe CS have simply shifted to Windows-based workstations with Quadro parts rather than deal with the Mac Pro 2013 GPU/driver issues.
What will really be interesting to see is what Apple does with their MBP, where Kepler was the obvious choice due to efficiency. Maxwell would really shine there, but will Apple be willing to take a big step backwards in performance there just to stay consistent with their AMD trend?
Omoronovo - Tuesday, October 7, 2014
That's certainly possible, but bear in mind that Apple generally doesn't put current top-end mobile hardware in their iMacs - that either means a late arrival for iMacs or perhaps a 960M when it releases.
I doubt AMD will ever get back into Apple's good graces, as none of their GPUs have met the power/performance levels nVidia has - bear in mind that price is usually a secondary concern for Apple and their consumers, so those are the only two metrics that really matter here.
WinterCharm - Tuesday, October 7, 2014
Yeah, not to mention that many people were disappointed when Apple included AMD graphics in their Mac Pro. I *need* CUDA. I can't live without it, and many other professionals will tell you the same thing.
RussianSensation - Tuesday, October 7, 2014
This is exactly why open standards like OpenCL should be embraced. Saying things like "I can't live without [insert a proprietary/locked GPU feature]" is what segregates the industry. A lot of programs benefit from OpenCL, and AMD GPUs provide full GPU hardware acceleration for the Adobe suite (Creative, Mercury, Premiere, etc.) and so on.
Also, the iMac is not a workstation, which means the primary target market of iMacs doesn't perform professional work with CUDA. If you really need professional graphics, you are using a FirePro or Quadro in a desktop workstation or getting a Mac Pro.