Final Words

When NVIDIA launched the first Maxwell cards earlier this year, we knew that we would be in for a treat with their latest architecture. Though offering just a fraction of the performance of their eventual high-end parts, NVIDIA’s first Maxwell cards gave us an interesting look at an architecture capable of doubling NVIDIA’s performance per watt on the same 28nm TSMC manufacturing process they started with over 2 years ago. To that end I don’t think there has ever been any doubt that NVIDIA’s second generation Maxwell cards would be equally impressive when it comes to power efficiency, but I feel NVIDIA has still managed to impress us when it comes to performance, features, and pricing as well.

In many ways it feels like this latest launch has returned us to the PC video card industry of 2012. NVIDIA’s flagship consumer card is once again powered by a smaller, more potent consumer-class x04 GPU, and once again NVIDIA is swinging the one-two punch of performance and power efficiency. When the GTX 680 launched it set a new high mark for the video card industry, and now the GTX 980 does more of the same. The GTX 980 is faster, less power hungry, and quieter than the Radeon R9 290X, so once again NVIDIA has landed the technical trifecta. Even if we’re just looking at performance and pricing, the GTX 980 is the undisputed holder of the single-GPU performance crown, besting everything else AMD and NVIDIA have to offer, and it does so at a price that, while by no means a steal, is more than reasonable given NVIDIA’s technical and performance advantage. As such the GTX 980 comes very, very close to doing to the Radeon R9 290X what the GTX 680 did to the Radeon HD 7970 over 2 years ago.

Meanwhile from a feature perspective the GTX 900 series is going to prove to be a very captivating product. Dynamic Super Resolution is a brutish-yet-clever answer to the question of how to handle anti-aliasing in today's deferred-rendering games that cannot support traditional MSAA/SSAA, and while I’m withholding judgment on Multi-Frame Sampled Anti-Aliasing until it’s made available to users in NVIDIA’s drivers, the idea at least has merit. Otherwise I am very happy to see that NVIDIA has now fully caught up to the competition in terms of baseline API features, offering everything needed to support Direct3D 11.2 and beyond.
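
To make the DSR idea concrete, here is a minimal conceptual sketch of supersample-and-downscale in Python with NumPy. It illustrates the general technique only, not NVIDIA's implementation; the 2x-per-axis factor and the simple box filter are assumptions chosen for clarity, and the real feature uses a higher quality downsampling filter.

```python
# A conceptual sketch of the idea behind Dynamic Super Resolution: render at a
# higher internal resolution, then filter the frame down to the display
# resolution. This smooths aliasing even in deferred renderers that cannot use
# MSAA. The 2x-per-axis factor and the box filter here are illustrative
# assumptions, not NVIDIA's actual filter.
import numpy as np

def downsample_2x(frame):
    """Average each 2x2 block of an (H, W, 3) frame rendered at twice the
    display resolution in each dimension."""
    h, w, c = frame.shape
    return frame.reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3))

# Stand-in for a frame rendered at 2160p for a 1080p display (a 4x pixel count).
hi_res_frame = np.random.rand(2160, 3840, 3).astype(np.float32)
display_frame = downsample_2x(hi_res_frame)
print(display_frame.shape)  # (1080, 1920, 3)
```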

Along those lines, NVIDIA’s focus on voxel technology for Maxwell 2 is a very interesting route to take, and I am eagerly anticipating seeing whether it gets widely adopted and what developers do with it. VXGI is a very neat concept for generating voxel-based global illumination, and building in the features necessary to do significant portions of it in hardware is a wise move by NVIDIA. The catch at this point is the same catch that faces all vendor-specific technologies: just because the hardware is there doesn’t mean developers will put it to good use, especially in this age of console ports. NVIDIA for their part has made the right move by making sure VXGI will run on other hardware, but I am concerned that the performance delta means it’s only going to be viable on Maxwell 2 GPUs for now, which could discourage developers. Nonetheless we do need better lighting in games, and I hope this encourages developers to finally adopt these kinds of high quality global illumination systems.

As for the hardware itself, is there anything left to say other than that the GTX 980 is a well-built, well-engineered card? The build quality is impeccable – raising the bar over even GTX Titan – and the power efficiency gains are truly remarkable. With a TDP lower than even the GTX 680’s, this is the lowest power consumption has been for a chart-topping card since the 9800 GTX over half a decade ago. This is admittedly something of a honeymoon period, since if and when NVIDIA does Big Maxwell one has to expect power consumption will go back up, but for the time being it’s very pleasing to be able to get chart-topping performance inside of 165W. And the fact that this comes from the same company responsible for the GTX 480 just 2 generations ago makes this the ultimate technical turnaround.

In conclusion, the GeForce GTX 980 represents another stellar performance from NVIDIA. Their reign at the top is not going to go unchallenged – AMD can’t match NVIDIA on performance, but they can sure drive down prices – but as was the case in 2012 the crown continues to securely reside in NVIDIA’s hands, and once again they have done the technical hard work to earn it.

Finally, as a reminder we will be following up this article next week with our look at GTX 980 SLI performance and a look at the GTX 970. Of the two cards launched today, the GTX 970 is without a doubt the more interesting, thanks to its relatively low price compared to the performance NVIDIA is offering, but due to our aforementioned board issues we will not be able to take a look at it until next week. So until then, stay tuned for the rest of our GM204 coverage.

274 Comments

  • squngy - Wednesday, November 19, 2014 - link

    It is explained in the article.

    Because the GTX 980 renders so many more frames, the CPU is worked a lot harder. The wattage in those charts is for the whole system, so when the CPU draws more power it becomes harder to directly compare the GPUs themselves.
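
A rough back-of-the-envelope sketch of that point, in Python: with wall-socket measurements, the CPU/platform share has to be estimated and subtracted before two GPUs can be compared directly. Every number below (platform draw, per-frame CPU scaling, the wattages themselves) is made up for illustration and is not a measurement from the review.

```python
# Rough illustration of why whole-system ("at the wall") power numbers make
# GPU-to-GPU comparisons tricky: a faster GPU pushes the CPU harder, so part
# of its higher system draw is not the GPU at all. All figures are invented.

def estimated_gpu_power(system_watts, platform_watts, cpu_watts_per_fps, fps, baseline_fps):
    """Subtract an assumed platform draw that grows with framerate."""
    platform = platform_watts + cpu_watts_per_fps * max(fps - baseline_fps, 0)
    return system_watts - platform

# Two hypothetical cards measured at the wall in the same game.
card_a = estimated_gpu_power(system_watts=300, platform_watts=90,
                             cpu_watts_per_fps=0.3, fps=60, baseline_fps=60)
card_b = estimated_gpu_power(system_watts=330, platform_watts=90,
                             cpu_watts_per_fps=0.3, fps=90, baseline_fps=60)
print(card_a, card_b)  # 210.0 vs. 231.0: the 30 W wall gap shrinks to ~21 W
```
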
  • galta - Friday, September 19, 2014 - link

    The simple fact is that a GPU more powerful than a GTX 980 does not make sense right now, no matter how much we would love to see it.
    See, most folks are still gaming @ 1080p, and some of us are moving up to 1440p. Under these scenarios, a GTX 980 is more than enough, even if quality settings are maxed out. Early reviews show that it can even handle 4K with moderate settings, and we should expect further performance gains as drivers improve.
    Maybe in a year or two, when 4K monitors become more relevant, a more powerful GPU would make sense. Right now they simply don't.
    For the moment, nVidia's move is smart and commendable: power efficiency!
    I mean, such a powerful card at only 165W! If you are crazy/wealthy enough to have two of them in SLI, you can cut your power demand by 170W (see the arithmetic sketched below), with corresponding gains in temps and/or noise, and a less expensive PSU if you're building from scratch.
    In the end, are these new cards great? Of course they are!
    Does it make sense to upgrade right now? Only if you're running a 5xx or 6xx series card, or if your demands have increased dramatically (multi-monitor setup, higher res, etc.).
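
For what it's worth, the ~170W figure in the comment above works out if the comparison is against a pair of roughly 250W cards; the 165W number is the GTX 980's official TDP from the review, while the 250W per-card figure is an assumption about the competing card, not a spec quoted here.

```python
# Back-of-the-envelope check of the ~170 W two-card saving claimed above.
# 165 W is the GTX 980's official TDP; 250 W per card is an assumed figure
# for a competing high-end card, not a number taken from this review.
gtx_980_tdp = 165
assumed_competitor_tdp = 250

savings_two_cards = 2 * (assumed_competitor_tdp - gtx_980_tdp)
print(savings_two_cards)  # 170 -> matches the ~170 W cited in the comment
```
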
  • Margalus - Friday, September 19, 2014 - link

    A more powerful gpu does make sense. Some people like to play their games with triple monitors, or more. A single gpu that could play at 7680x1440 with all settings maxed out would be nice.
  • galta - Saturday, September 20, 2014 - link

    How many of us demand such power? The ones who really do can go SLI and OC the cards.
    nVidia would be spending billions on a card that would sell thousands. As I said: we would love the card, but it still makes no sense.
    Again, I would love to see it, but in the foreseeable future I won't need it. I'm happier with the gains in noise, power, and heat efficiency.
  • Da W - Monday, September 22, 2014 - link

    Here's one that demands such power. I play 3600*1920 using 3 screens, almost 4k, 1/3 the budget, and still useful for, you know, working.
    Don't want sli/crossfire. Don't want a space heater either.
  • bebimbap - Saturday, September 20, 2014 - link

    Gaming at 1080p@144Hz, or at 1080p with a minimum fps of 120 for ULMB, is no joke when it comes to GPU requirements. Most modern games max out at 80-90fps on an OC'd GTX 670; you need at least an OC'd GTX 770/780, and I'd recommend a 780 Ti. And though a 24" 1080p screen might seem "small", you only have so much focus: you can't focus on your peripheral vision, you'd have to move your eyes to look at another part of the screen. The 24"-27" size seems perfect so you don't have to move your eyes/head much, or at all.

    The next step is 1440p@144Hz, or a minimum fps of 120, which requires more GPU than 4K@60Hz; as 1440p is about 2x 1080p, you'd need a GPU about 2x as powerful (see the pixel-rate arithmetic below). So you can see why nvidia must put out a powerful card at a moderate price point. They need it for their 144Hz G-Sync tech and 3D Vision.

    imo the ppi race isn't as beneficial as higher refresh rates. For TVs, manufacturers are playing this game of misinformation so consumers get the short end of the stick, but having a monitor running at 144Hz is a world of difference compared to 60Hz for me. You can tell just from the mouse cursor moving across the screen. As I age I realize every day that my eyes will never be as good as yesterday, and knowing that I'd take a 27" 1440p @ 144Hz any day over a 28" 5K @ 60Hz.
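
As a rough check on the pixel-rate reasoning above: if GPU load is treated as roughly proportional to pixels drawn per second (a simplification that ignores per-frame CPU cost and how individual games scale), 1440p at 144Hz does land in the same ballpark as 4K at 60Hz, and at roughly 1.8x the load of 1080p at 144Hz.

```python
# Pixel-throughput comparison behind the comment above. Treating GPU load as
# proportional to pixels drawn per second is a simplification, but it shows
# why 1440p@144Hz is in the same ballpark as 4K@60Hz.
def pixels_per_second(width, height, refresh_hz):
    return width * height * refresh_hz

rate_1080p_144 = pixels_per_second(1920, 1080, 144)  # ~299 M pixels/s
rate_1440p_144 = pixels_per_second(2560, 1440, 144)  # ~531 M pixels/s
rate_4k_60     = pixels_per_second(3840, 2160, 60)   # ~498 M pixels/s

print(rate_1440p_144 / rate_1080p_144)  # ~1.78x -- "about 2x 1080p"
print(rate_1440p_144 / rate_4k_60)      # ~1.07x -- slightly more than 4K60
```
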
  • Laststop311 - Sunday, September 21, 2014 - link

    Well, it all depends on viewing distance. I currently game on a 30" 2560x1600 Dell U3014; since it's larger I can sit further away and still have just as good an experience as with a 24" or 27" that's closer. So you can't just say larger monitors mean you can't focus on it all, because you can, just at a further distance.
  • theuglyman0war - Monday, September 22, 2014 - link

    The power of the newest technology is, and has always been, an illusion, because the creation of games will always be an exercise in "compromise". Even a game like WoW that isn't crippled by console considerations is built for the lowest-common-denominator demographic in the PC hardware population. In other words... ("if you buy it they will make it" vs. "if they make it I will upgrade"). And that's before considering the unlimited reach of an open world's "possible" textures and vertex counts.
    "Some" artists are of the opinion that more hardware power would result in a less aggressive graphics budget! (When the time spent wrangling a synced normal-mapped representation of a high-resolution sculpt, or tracking seam problems in lightmapped approximations of complex illumination with long bake times, can take longer than simply using the original complexity.) The compromise can take more time than if we had hardware that could keep up with an artist's imagination.
    In which case I've gotta wonder about the imagination of the end user who really believes his hardware is the end of any graphics progress?
  • ppi - Friday, September 19, 2014 - link

    On the desktop, all AMD needs to do is lower prices and perhaps release an OC'd 290X to match 980 performance. It will reduce their margins, but they won't be irrelevant in the market the way they are in CPUs vs. Intel (where AMD's most powerful beasts barely touch Intel's low end, apart from some specific multi-threaded cases).

    Why so simple? On desktop:
    - Performance is still #1 factor - if you offer more per your $, you win
    - Noise can be easily resolved via open air coolers
    - Power consumption is not such a big deal

    So... if an AMD card at a given price is merely as fast as Maxwell, then it is clearly the worse choice. But what if it is faster?

    In mobile, however, they are screwed in a big way, unless they have something REAL good up their sleeve (looking at Tonga, I do not think they do; I am convinced AMD intended to pull off another HD5870, i.e. be on the new process node first, but it apparently did not work out this time around).
  • Friendly0Fire - Friday, September 19, 2014 - link

    The 290X already is effectively an overclocked 290 though. I'm not sure they'd be able to crank up power consumption reliably without running into heat dissipation or power draw limits.

    Also, they'd have to invest in making a good reference cooler.
