As the GPU company that’s arguably the more transparent about their long-term product plans, NVIDIA still manages to surprise us time and time again. Case in point: we have known since 2012 that NVIDIA’s follow-up architecture to Kepler would be Maxwell, but it’s only more recently that we’ve begun to understand the complete significance of Maxwell to the company’s plans. Each and every generation of GPUs brings with it an important mix of improvements, new features, and enhanced performance, but fundamental shifts are few and far between. So when we found out that Maxwell would be one of those fundamental shifts, it changed our perspective and expectations significantly.

What is that fundamental shift? As we found out at NVIDIA’s CES 2014 press conference, Maxwell is the first NVIDIA GPU that started out as a “mobile first” design, marking a significant change in NVIDIA’s product design philosophy. The days of designing a flagship GPU and scaling down had already come to an end with Kepler, when NVIDIA designed GK104 before GK110, but NVIDIA was still designing a desktop GPU first, with mobile and SoC-class designs following. Beginning with Maxwell, that philosophy too has come to an end: NVIDIA has embraced power efficiency and mobile-friendly designs as the foundation of their GPU architectures, which has led them to go mobile first with Maxwell. NVIDIA has made the complete transition, and is now designing GPUs bottom-up instead of top-down.

Nevertheless, a mobile first design is not the same as a mobile first build strategy. NVIDIA has yet to ship a Kepler based SoC, let alone put a Maxwell based SoC on their roadmaps. At least for the foreseeable future, discrete GPUs are going to remain the first products built on any new architecture. So while the underlying architecture may be more mobile-friendly than what we’ve seen in the past, what hasn’t changed is that NVIDIA is still getting the ball rolling for a new architecture with relatively big and powerful GPUs.

This brings us to the present, and the world of desktop video cards. Just under two years after the launch of the first Kepler part, the GK104 based GeForce GTX 680, NVIDIA is back and ready to launch their next generation of GPUs, based on the Maxwell architecture.

No two GPU launches are alike – Maxwell’s launch won’t be any more like Kepler’s than Kepler’s was like Fermi’s – but the launch of Maxwell is going to be an even greater shift than usual. Maxwell’s mobile-first design aside, Maxwell also comes at a time of stagnation on the manufacturing side of the equation. Traditionally we’d see a new manufacturing node from TSMC ready to align with a new architecture, but just as AMD faced with the launch of their GCN 1.1 based Hawaii GPUs, NVIDIA will be making do on the 28nm node for Maxwell’s launch. The lack of a new node means NVIDIA either has to wait until the next node is ready or launch on the existing one, and in the case of Maxwell NVIDIA has opted for the latter.

As a consequence of staying on 28nm, the optimal strategy for releasing GPUs has changed for NVIDIA. From a performance perspective the biggest improvements still come from a node shrink and the resulting increase in transistor density and reduction in power consumption. But there is still room to maneuver within the 28nm node, improving the power and density of a design without changing the node itself. Maxwell is just such a design, further optimizing the efficiency of NVIDIA’s GPUs within the confines of the 28nm node.

With the Maxwell architecture in hand and its 28nm optimizations in place, the final piece of the puzzle is deciding where to launch first. Thanks to the embarrassingly parallel nature of graphics and 3D rendering, at every tier of GPU – from SoC to Tesla – GPUs are fundamentally power limited: performance scales with how many transistors can be flipped, so it’s the power needed to flip them that sets the ceiling, whether that means limiting clockspeed ramp-ups or forgoing a wider GPU. This is especially true in the world of SoCs and mobile discrete GPUs, where battery capacity and space limitations put a very hard cap on power consumption.
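To put that power-limited argument in concrete terms, here’s a back-of-the-envelope sketch in Python – our own simplification, not anything from NVIDIA – using the standard CMOS dynamic power relation (P ≈ αCV²f) and assuming voltage rises roughly linearly with frequency near the top of the voltage/frequency curve. It illustrates why, under a fixed power budget, building a wider GPU is cheaper than ramping clockspeeds:

    # Back-of-the-envelope CMOS dynamic power: P ~ alpha * C * V^2 * f.
    # Assumption (ours, not NVIDIA's): voltage rises roughly linearly with
    # frequency near the top of the DVFS curve, so power scales as ~f^3.

    def relative_power(width_scale: float, freq_scale: float) -> float:
        """Power relative to a baseline GPU that is width_scale times
        wider and clocked freq_scale times higher."""
        # Extra width adds units switching in parallel: power scales linearly.
        # Extra frequency raises f and (approximately) V: power scales ~f^3.
        return width_scale * freq_scale ** 3

    # Two ways to target roughly 2x performance on a parallel workload:
    print(relative_power(2.0, 1.0))  # 2x wider, same clock -> ~2x power
    print(relative_power(1.0, 2.0))  # same width, 2x clock -> ~8x power

In other words, on an embarrassingly parallel workload the wider design delivers roughly the same performance for about a quarter of the power of the higher-clocked one, which is exactly why power-limited designs favor going wide and slow.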

As a result, not unlike the mobile first strategy NVIDIA used in designing the architecture, when it comes to building their first Maxwell GPU NVIDIA is starting from the bottom. The bulk of NVIDIA’s GPU shipments have been smaller, cheaper, and less power-hungry chips like GK107, which for the last two years has formed the backbone of NVIDIA’s mobile offerings, NVIDIA’s cloud server offerings, and of course NVIDIA’s mainstream desktop offerings. So when it came time to roll out Maxwell and its highly optimized 28nm design, there was no better or more effective place for NVIDIA to start than with the successor to GK107: the Maxwell based GM107.

Over the coming months we’ll see GM107 in a number of different products. Its destiny in the mobile space is all but set in stone as the successor to the highly successful GK107, and NVIDIA’s GRID products practically beg for greater efficiency. But for today we’ll be starting on the desktop with the launch of NVIDIA’s latest desktop video cards: GeForce GTX 750 Ti and GeForce GTX 750.

Comments

  • jukkie - Friday, February 21, 2014 - link

    I see the GTX 750 Ti as a direct competitor to the HD 7770, so why was AMD's card left out of the list?
    Hmmm...
  • Novaguy - Saturday, February 22, 2014 - link

    I thought AMD's plan is to put the 7850/R7 265 up against the 750 Ti, not the 7770. The HD 7770 really isn't a direct competitor to the 750 Ti; it usually sells for around $110. I would guess that if there's anything the HD 7770 competes against, it's the upcoming 750.
  • th3parasit3 - Friday, February 21, 2014 - link

    I'm still running a GTX 460 768MB with an E8500 at stock (built in 2010); mind you, my display is only 1680x1050. To me, Maxwell is a huge advancement -- not because of its ability to deliver great FPS at 1080p, but because of its power requirements, or lack thereof.

    AMD burned me on a faulty 5770, so I have much love for NVIDIA's driver support and performance boosts. Looks like after a four year holding pattern, 2014-15 is the year I upgrade my GPU and rebuild. Sign me up for a 750 Ti and an 860/870.
  • Grandal - Saturday, February 22, 2014 - link

    These seem to be ready-made Steam Box drop-ins to me. They'll hit the thermal requirements at the perfect time to win the "reference" Steam Box GPU battle.
  • Novaguy - Saturday, February 22, 2014 - link

    Hmm, beyond using this to upgrade my OEM boxes from Radeon 7750s, I'd love to see this turned into a mid-range mobile card. A 750 Ti downclocked for mobile (maybe this is the 850M/860M) would be a nice upgrade over the 750M/755M, and possibly even the 760M/765M. It's already below the 75W TDP those 760M/765M MXM cards call for....
  • Novaguy - Saturday, March 1, 2014 - link

    Just broke down and bought a 750 Ti to upgrade from a 7750. Really nice, runs really cool. Definitely worth it for those of you who want to upgrade OEM boxes without dealing with the PSU, especially if you flip the 7750 at the usual places.
  • dr_sn0w - Wednesday, February 26, 2014 - link

    So, gurus, please tell me whether the GTX 750 Ti OC will support 4K resolution or not. Thanks.
  • av30 - Friday, March 7, 2014 - link

    I really would have liked to see how the vanilla 750 performed in the HTPC environment in relation to the GT 640. Any chance of updating that section of the review?
  • kamlesh - Wednesday, March 12, 2014 - link

    I'm really curious about Tegra K1 and its successor... Leaving K1 aside for a moment: the GTX 750 has 512 CUDA cores and draws 55W, while the GTX 750 Ti has 640 and draws 60W, so the marginal power per Maxwell CUDA core works out to roughly 0.039W (when clocked at 1GHz or above; the arithmetic is sketched just after the comments). So if the next Tegra uses 2 Maxwell SMMs (256 cores), it might use only ~4W for the GPU (assuming 20nm and a ~600MHz clock), and at most 5W for the entire SoC.
  • Gadgety - Saturday, March 22, 2014 - link

    Yep, me too. Especially the K1 successor, even though the K1 itself is barely out. That kind of GPU performance per watt is likely to yield amazing mobile graphics.
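As an aside on the per-core power estimate in kamlesh's comment above: the 0.039W figure falls out of differencing the two cards' board-level TDPs, which makes it a rough upper bound on marginal per-core power rather than a measured value. A minimal sketch of that arithmetic (all figures are the approximate TDPs quoted in the comment; the ~4W Tegra estimate additionally assumes 20nm and ~600MHz savings that aren't modeled here):

    # Marginal power per Maxwell CUDA core, estimated by differencing the
    # board-level TDPs quoted in the comment above (a rough upper bound,
    # not a measured per-core figure).
    gtx_750 = {"cores": 512, "tdp_w": 55.0}
    gtx_750_ti = {"cores": 640, "tdp_w": 60.0}

    delta_w = gtx_750_ti["tdp_w"] - gtx_750["tdp_w"]      # 5 W
    delta_cores = gtx_750_ti["cores"] - gtx_750["cores"]  # 128 cores
    per_core_w = delta_w / delta_cores
    print(f"~{per_core_w:.3f} W per marginal core")       # ~0.039 W

    # The comment's hypothetical 256-core (2 SMM) Tegra GPU, before any
    # savings from a 20nm process or lower (~600MHz) clocks:
    print(f"~{256 * per_core_w:.1f} W at 28nm/1GHz")      # ~10.0 W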
