As the GPU company that is arguably the more transparent about its long-term product plans, NVIDIA still manages to surprise us time and time again. Case in point: we have known since 2012 that NVIDIA’s follow-up architecture to Kepler would be Maxwell, but it’s only more recently that we’ve begun to understand the full significance of Maxwell to the company’s plans. Each and every generation of GPUs brings with it an important mix of improvements, new features, and enhanced performance; but fundamental shifts are few and far between. So when we found out Maxwell would be one of those fundamental shifts, it changed our perspective and expectations significantly.

What is that fundamental shift? As we found out back at NVIDIA’s CES 2014 press conference, Maxwell is the first NVIDIA GPU that started out as a “mobile first” design, marking a significant change in NVIDIA’s product design philosophy. The days of designing a flagship GPU and scaling down already came to an end with Kepler, when NVIDIA designed GK104 before GK110; but even then NVIDIA still designed a desktop GPU first, with mobile and SoC-class designs following. Beginning with Maxwell that philosophy too has come to an end. NVIDIA has chosen to embrace power efficiency and mobile-friendly designs as the foundation of their GPU architectures, and as a result they have made the complete transition from designing GPUs top-down to designing them bottom-up.

Nevertheless, a mobile first design is not the same as a mobile first product strategy. NVIDIA has yet to ship a Kepler based SoC, let alone put a Maxwell based SoC on their roadmaps. At least for the foreseeable future, discrete GPUs are going to remain the first products built on any new architecture. So while the underlying architecture may be more mobile-friendly than what we’ve seen in the past, what hasn’t changed is that NVIDIA is still getting the ball rolling for a new architecture with relatively big and powerful GPUs.

This brings us to the present, and the world of desktop video cards. Just under two years after the launch of the first Kepler part, the GK104 based GeForce GTX 680, NVIDIA is back and ready to launch their next generation of GPUs, based on the Maxwell architecture.

No two GPU launches are alike – Maxwell’s launch won’t be any more like Kepler’s than Kepler’s was like Fermi’s – but the launch of Maxwell is going to be an even greater shift than usual. Maxwell’s mobile-first design aside, Maxwell also comes at a time of stagnation on the manufacturing side of the equation. Traditionally we’d see a new manufacturing node ready from TSMC to align with a new architecture, but just as with the situation AMD faced in launching their GCN 1.1 based Hawaii GPUs, NVIDIA will be making do on the 28nm node for Maxwell’s launch. The lack of a new node meant NVIDIA either had to wait until the next node was ready or launch on the existing node, and in the case of Maxwell NVIDIA has opted for the latter.

As a consequence of staying on 28nm, the optimal strategy for releasing GPUs has changed for NVIDIA. From a performance perspective the biggest improvements still come from a node shrink and the resulting increase in transistor density and reduced power consumption. But there is still room to maneuver within the 28nm node, improving power and density within a design without changing the process itself. Maxwell in turn is just such a design, further optimizing the efficiency of NVIDIA’s GPUs within the confines of the 28nm node.

With the Maxwell architecture in hand and its 28nm optimizations in place, the final piece of the puzzle is deciding where to launch first. Thanks to the embarrassingly parallel nature of graphics and 3D rendering, GPUs at every tier – from SoC to Tesla – are fundamentally power limited. Their performance is constrained by how much power they can spend, whether that power goes toward ramping up clockspeeds or toward building out a wider GPU with more transistors to flip. This is especially true in the world of SoCs and mobile discrete GPUs, where battery capacity and space limitations put a very hard cap on power consumption.
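To illustrate what “embarrassingly parallel” means in practice, below is a minimal CUDA sketch – not NVIDIA’s code, with purely illustrative names such as shadePixels and framebuffer – in which every pixel of a frame is computed by its own thread, with no dependence on any other pixel. Because the work divides up this cleanly, the only real limit on how many threads run at once is how much silicon, and therefore how much power, the GPU can spend.

```cuda
// Minimal sketch of why graphics is "embarrassingly parallel": every pixel
// can be shaded independently, so throughput scales with how many threads
// (and therefore how much silicon and power) the GPU can bring to bear.
// Kernel and buffer names here are illustrative, not NVIDIA's code.
#include <cuda_runtime.h>
#include <cstdio>

__global__ void shadePixels(uchar4* framebuffer, int width, int height)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= width || y >= height) return;

    // Each thread writes one pixel with no dependence on its neighbors,
    // so a wider GPU simply runs more of these threads at once.
    unsigned char r = static_cast<unsigned char>(255.0f * x / width);
    unsigned char g = static_cast<unsigned char>(255.0f * y / height);
    framebuffer[y * width + x] = make_uchar4(r, g, 128, 255);
}

int main()
{
    const int width = 1920, height = 1080;
    uchar4* framebuffer = nullptr;
    cudaMalloc(&framebuffer, width * height * sizeof(uchar4));

    // One thread per pixel, grouped into 16x16 blocks.
    dim3 block(16, 16);
    dim3 grid((width + block.x - 1) / block.x, (height + block.y - 1) / block.y);
    shadePixels<<<grid, block>>>(framebuffer, width, height);
    cudaDeviceSynchronize();

    cudaFree(framebuffer);
    printf("Shaded %d pixels, one thread each.\n", width * height);
    return 0;
}
```

A wider GPU simply runs more of these independent threads concurrently, which is why the performance ceiling is set by the power budget rather than by any serial bottleneck in the workload.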

As a result, not unlike the mobile first strategy NVIDIA used in designing the architecture, when it comes to building their first Maxwell GPU NVIDIA is starting from the bottom. The bulk of NVIDIA’s GPU shipments has been smaller, cheaper, and less power-hungry chips like GK107, which for the last two years has formed the backbone of NVIDIA’s mobile offerings, NVIDIA’s cloud server offerings, and of course NVIDIA’s mainstream desktop offerings. So when it came time to roll out Maxwell and its highly optimized 28nm design, there was no better and more effective place for NVIDIA to start than with the successor to GK107: the Maxwell based GM107.

Over the coming months we’ll see GM107 in a number of different products. Its destiny in the mobile space is all but set in stone as the successor to the highly successful GK107, and NVIDIA’s GRID products practically beg for greater efficiency. But for today we’ll be starting on the desktop with the launch of NVIDIA’s latest desktop video cards: GeForce GTX 750 Ti and GeForce GTX 750.

Comments

  • EdgeOfDetroit - Tuesday, February 18, 2014

    This card (EVGA 750 Ti OC) is replacing a 560 Ti for me. It's slower, but it's not my primary game machine anymore anyway. I'll admit I was kinda bummed when the 700 series stopped at the 760, and now that the 750 is here, it's like they skipped the true successor to the 560 and 660. I can probably still get something for my 560 Ti, at least.
  • rhx123 - Tuesday, February 18, 2014

    I wonder if we'll get the 750Ti or even the 750 in a half height config.

    It would be nice for HTPCs given the power draw, but I'm not optimistic.
    There's still nothing really decent in the half height Nvidia camp.
  • Frenetic Pony - Tuesday, February 18, 2014

    "it is unfortunate, as NVIDIA carries enough market share that their support (or lack thereof) for a feature is often the deciding factor whether it’s used"

    Not this time. Both the Xbone and PS4 are fully feature compliant, as are GCN 1.1 cards; heck, even GCN 1.0 has a lot of the features required. With the new consoles, especially the PS4, selling incredibly well these are going to be the baseline, and if you buy an NVIDIA card without it, you'll be SOL for the highest end stuff.

    Just another disappointment with Maxwell, when AMD is already beating Nvidia very solidly on price for performance. Which is a shame, because I love their steady and predictable driver support and well-designed cooling setups. But if they're not going to compete, especially with the rumors of how much Broadwell supposedly improves on Intel's mobile stuff, well then I just don't know what to say.
  • Rebel1080 - Tuesday, February 18, 2014

    Can we all come to a consensus and declare the 8th console generation an epic bust!!! When the seventh-generation consoles (PS3/XB360) made their debut, it took Nvidia and AMD 12-18 months to ship a mainstream GPU that could match or exceed their performance. This generation it only took 3 months, at two-thirds the price those cards (3870/8800GT) sold for.

    It's pretty condemning that both Sony and MSFT's toy boxes are getting spanked by $119-149 cards. Worst of all the cards are now coming from both gpu companies for which I'm sure gives Nvidia all smiles.
  • FearfulSPARTAN - Tuesday, February 18, 2014

    Really, an epic bust? Come on now, we all knew from the start they were not going to be bleeding edge based on the specs. They were not going for strong single-threaded performance; they were aiming for well-threaded, good-enough CPU performance, and the GPUs they had were average for their time. However, considering the PS4 and X1 are selling very well, calling the entire gen a bust already is just stupid. You don't need high-end performance in consoles when you have developers coding to scrape every bit of performance they can out of your hardware; that's something we don't have in the PC space, and it's why most gamers are no longer using the cards that merely matched console performance when the last generation launched seven years ago.
  • Rebel1080 - Tuesday, February 18, 2014

    They're selling well for the same reasons iTards keep purchasing Apple products even though they only offer incremental updates on both hardware and less on software. It's something I like to call "The Lemming Effect".

    Developers code to the metal, but that only does so much, and then you end up having to compromise the final product via lower res, lower fps, and lower texture detail. Ironically, I was watching several YouTube videos of current-gen games (BF3&4, Crysis 3, Grid 2, AC4) running at playable fps between 720p and 900p on a Radeon 3870.
  • oleguy682 - Tuesday, February 18, 2014

    Except that unlike Apple, Sony and Microsoft are selling each unit at a loss once the BOM, assembly, shipping, and R&D are taken into consideration. The PS3 was a $3 billion loss in the first two years it was available. The hope is that licensing fees, add-ons, content delivery, etc. will result in enough revenue to offset the investment, subsidize further R&D, and leave a bit left over for profit. Apple, on the other hand, is making money on both the hardware and the services.

    And believe it or not, there are a lot more console gamers than PC gamers. Gartner estimates that in 2012, PC gaming made up only $14 billion of the $79 billion gaming market. This does include hardware, in which the consoles and handheld devices (likely) get an advantage, but 2012 was before the PS4 and Xbone were released.

    So while it might be off-the-shelf for this generation, it was never advertised as anything more than a substantial upgrade over the previous consoles, both of which were developed in the early 2000s. In fact, they were designed for 1080p gaming, and that's what they can accomplish (well, maybe not the Xbone if recent reports are correct). Given that 2160p TVs (because calling it 4K is dumb and misleading) are but a pipe dream for all but the most well-heeled of the world and that PCs can't even come close to the performance needed to drive such dense displays (short of spending $1,000+ on GPUs alone), there is no need to over-engineer the consoles to do something that won't be asked of them until they are near EOL.
  • Rebel1080 - Tuesday, February 18, 2014

    PC gaming is growing faster globally than the console market because purchasing consoles in many nations is extremely cost-prohibitive due to crushing tariffs. Figure that in 3 years' time both Intel and AMD will have IGPs that trounce the PS4 and will probably sell for under $99 USD. PC hardware is generally much more accessible than consoles to people living in places like Brazil, China, and India. It would actually cost less to build a gaming PC if you live there.

    The console market is the USA, Japan, and Western Europe, and as the economies of these nations continue to decline (all 3 are still in recession), people who want to game without spending a ton will seek lower-cost alternatives. With low-wattage cards like the 750 Ti, suddenly every Joe with a 5-year-old Dell/HP desktop can have console-level gaming for a fraction of the cost without touching any of his other hardware.
  • Rebel1080 - Tuesday, February 18, 2014

    http://www.gamespot.com/articles/sony-says-brazil-...
  • oleguy682 - Wednesday, February 19, 2014

    Brazil is only Brazil. It does not have any bearing on China or India or any other developing nation as they all choose their own path on how they tax and tariff imports. Second, throwing a 750Ti into a commodity desktop (the $800-1,200 variety) from 3 years ago, let alone 5, is unlikely to result in performance gains that would turn it into a full-bore 1080p machine that can run with the same level of eye-candy as a PS4 or XBone. The CPU and memory systems are going to be huge limiting factors.

    As far as the PC being a faster growing segment, the Gartner report from this fall thinks that PC gaming hardware and software will rise from the 2012 baseline of 18.3% of spending to 19.4% of spending in 2015. So yes, it will grow, but it's such a small share already that it barely does anything to move the needle in terms of where gaming goes. In contrast, consoles are expected to grow from 47.4% to 49.6% of spending. The losing sectors are going to be handheld gaming, eaten mostly by tablets and smartphones. PCs aren't dying, but they aren't thriving, regardless of what Brazil does with PS4 imports in 2014.
