Meet The GeForce GTX 980 Ti

Like the rest of NVIDIA’s high-end cards in this generation, the reference GeForce GTX 980 Ti is launching with NVIDIA’s standard metal cooler. This design has served NVIDIA well since the launch of the GTX Titan in 2013 and continues to be the blower design to beat at the high end, easily handling the 250W TDP of NVIDIA’s high-end cards without generating a ton of noise in the process.

As with so many other aspects of the GTX 980 Ti, the GTX 980 Ti’s cooler and build are a near-copy of the GTX Titan X’s. The only difference in the cooler is the paint job; the GTX Titan X got a unique black paint job, while the GTX 980 Ti gets the more standard bare aluminum finish with black lettering and a black-tinted polycarbonate window.

Otherwise there’s very little to be said about the GTX 980 Ti’s design that hasn’t been said before, so we’ll just recap what we said about the cooler design from our review of the GTX Titan X.

For GTX 980 Ti, NVIDIA has opted to leave well enough alone, having made virtually no changes to the shroud or cooling apparatus. And truth be told it’s hard to fault NVIDIA right now, as this design remains the gold (well, aluminum) standard for a blower. Looks aside, after years of blowers that rattled, or were too loud, or didn’t cool discrete components very well, NVIDIA is sitting on a very solid design, and I’m not really sure how anyone would top it (but I’d love to see them try).

In any case, our favorite metal shroud is back once again. Composed of a cast aluminum housing and held together using a combination of rivets and screws, it’s as physically solid a shroud as we’ve ever seen. The card measures 10.5” long overall, which at this point is NVIDIA’s standard size for high-end GTX cards.

Drilling down we have the card’s primary cooling apparatus, composed of a nickel-tipped wedge-shaped heatsink and ringed radial fan. The heatsink itself is attached to the GPU via a copper vapor chamber, something that has been exclusive to NVIDIA’s 250W cards and provides the best possible heat transfer between the GPU and heatsink. Meanwhile the rest of the card is covered with a black aluminum baseplate, providing basic heatsink functionality for the VRMs and other components while also protecting them.

Finally at the bottom of the stack we have the card itself, complete with the GM200 GPU, VRAM chips, and various discrete components. The GM200 PCB places the GPU and VRAM chips towards the front of the card, while the VRMs and other discrete components occupy the back. As with the GTX Titan X, GTX 980 Ti features NVIDIA’s reworked component placement to improve airflow to the discrete components and reduce temperatures, along with employing molded inductors.

NVIDIA once again employs a 6+2 phase VRM design, with 6 phases for the GPU and another 2 for the VRAM. This means that GTX 980 Ti has a bit of power delivery headroom – NVIDIA allows the power limit to be increased by 10% to 275W – but hardcore overclockers will find that there isn’t an extreme amount of additional headroom to play with. Based on our sample the actual shipping voltage at the max boost clock is a bit higher than GTX Titan X, coming in at 1.187v, so in non-TDP constrained scenarios there is some additional headroom through overvolting, up to 1.23v in the case of our sample.
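To put those headroom figures in rough perspective, here is a minimal Python sketch of the power limit and overvolting arithmetic quoted above; the constant names are ours and purely illustrative.

```python
# Power limit and overvolting headroom for our GTX 980 Ti sample
# (figures taken from the text above; names are illustrative only)
BASE_TDP_W = 250              # stock power limit
POWER_LIMIT_INCREASE = 0.10   # NVIDIA allows the limit to be raised by 10%

max_power_w = BASE_TDP_W * (1 + POWER_LIMIT_INCREASE)
print(f"Maximum power limit: {max_power_w:.0f} W")       # 275 W

stock_voltage_v = 1.187       # observed voltage at the max boost clock on our sample
max_overvolt_v = 1.23         # maximum overvoltage on our sample
headroom_mv = (max_overvolt_v - stock_voltage_v) * 1000
print(f"Overvolting headroom: ~{headroom_mv:.0f} mV")    # ~43 mV
```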

In terms of overall design, unlike the GTX Titan X and its 24 VRAM chips, for the GTX 980 Ti NVIDIA only needs to use 12 VRAM chips to reach the card’s 6GB of VRAM, so all of the VRAM is located at the front of the card. Halving the VRAM capacity simplifies the card a bit – there are now no critical components on the back – and it brings down total VRAM power consumption slightly. Despite this, however, NVIDIA has not brought back the backplate from the GTX 980, having removed it on the GTX Titan X due to the VRAM chips placed on the rear.
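For reference, the chip counts and capacities here follow directly from GM200’s 384-bit memory bus; below is a quick sketch of that arithmetic, assuming the standard 32-bit wide, 4Gb GDDR5 packages these cards use.

```python
# How the 6GB and 12GB configurations fall out of GM200's 384-bit bus
# (assumes standard 32-bit wide, 4Gb = 0.5GB GDDR5 packages)
BUS_WIDTH_BITS = 384
CHIP_WIDTH_BITS = 32          # each GDDR5 package provides a 32-bit channel
CHIP_CAPACITY_GB = 0.5        # 4Gb per package

chips_front = BUS_WIDTH_BITS // CHIP_WIDTH_BITS          # 12 chips, one per channel
gtx_980_ti_vram_gb = chips_front * CHIP_CAPACITY_GB      # 6GB, front side only
titan_x_vram_gb = 2 * chips_front * CHIP_CAPACITY_GB     # 12GB, chips on both sides

print(chips_front, gtx_980_ti_vram_gb, titan_x_vram_gb)  # 12 6.0 12.0
```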

Moving on, in accordance with GTX 980 Ti’s 250W TDP and the reuse of the metal cooler, power delivery for the GTX 980 Ti is identical to its predecessors. This means a 6-pin and an 8-pin power connector at the top of the card, to provide up to 225W, with the final 75W coming from the PCIe slot.
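As a quick sanity check of that power budget, the standard PCIe delivery limits leave the card a comfortable margin even at its raised power limit; a minimal sketch:

```python
# Power available to the card versus its power limits
# (standard PCIe power delivery figures; sketch only)
PCIE_SLOT_W = 75              # PCIe x16 slot
SIX_PIN_W = 75                # 6-pin connector
EIGHT_PIN_W = 150             # 8-pin connector

available_w = PCIE_SLOT_W + SIX_PIN_W + EIGHT_PIN_W      # 300 W total
stock_tdp_w = 250
raised_limit_w = 275                                     # with the +10% power limit

print(f"{available_w} W available, {available_w - raised_limit_w} W of margin "
      f"at the raised {raised_limit_w} W limit")
```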

Meanwhile display I/O follows the same configuration we’ve seen on the rest of the high-end GTX 900 series. This is 1x DL-DVI-I, 3x DisplayPort 1.2, and 1x HDMI 2.0, with a total limit of 4 displays. In the case of GTX 980 Ti the DVI port is somewhat antiquated at this point – the card is generally overpowered for the relatively low maximum resolutions of DL-DVI – but on the other hand the HDMI 2.0 port is actually going to be of some value here since it means GTX 980 Ti can drive a 4K TV. Meanwhile if you have money to spare and need to drive more than a single 4K display, GTX 980 Ti also features a pair of SLI connectors for even more power.

Finally, taking a look at the long term, I wanted to quickly hit upon the subject of the VRAM capacity difference between the GTX 980 Ti and the GTX Titan X. The 12GB of VRAM is essentially NVIDIA’s only remaining selling point for the GTX Titan X, and the Titan will remain their only 12GB card for some time to come. For NVIDIA this means that they can pitch the GTX Titan X as a more future-proof card than the GTX 980 Ti, as it would be hard-pressed to run out of VRAM.

The question for the moment then is whether 12GB is worth a premium at all, let alone the GTX Titan X’s $350 premium. The original GTX Titan by comparison was fortunate enough to come out with 6GB right before the current-generation consoles launched, and with them their 8GB memory configurations. This led to a rather sudden jump in VRAM requirements in games that the GTX Titan was well positioned to handle, whereas the GTX 780 Ti and its 3GB of VRAM can struggle in the very latest games at 4K resolutions. Much like 6GB in 2013, 12GB is overkill in 2015, while 6GB is the more practical amount for a 384-bit card at this time.

But to answer the question at hand, unlike the original GTX Titan, I suspect 12GB will remain overkill for a much longer period of time, especially without a significant technology bump like the consoles to drive up VRAM requirements. And consequently I don’t expect GTX 980 Ti to have any real issues with VRAM capacity in games over the next couple of years, making it better off than the GTX 780 Ti, relatively speaking.

Comments

  • Yojimbo - Monday, June 1, 2015 - link

    After some research, I posted a long and detailed reply to such a statement before; I believe it was in these forums. Basically, the offending NVIDIA rebrands fell into three categories: One category was that NVIDIA introduced a new architecture and DIDN'T change the name from the previous one, then later – 6 months later, if I remember – when issuing more cards on the new architecture, decided to change to a new brand (a higher-numbered series). That happened once, as far as I found. The second category is where NVIDIA let a previously released GPU cascade down to a lower segment of a newly updated lineup. So the high end of one generation becomes the middle of the next generation, and in the process gets a new name to be uniform with the entire lineup. The third category is where NVIDIA is targeting low-end OEM segments where they are probably fulfilling specific requests from the OEMs. This is probably the GF108 which you say has "plagued the low end for too long now", as if you are the arbiter of OEMs' product offerings and what sort of GPU their customers need or want. I'm sorry, but I don't want to go looking for specific citations of all the various rebrands, because I did it before in a previous message in another thread.

    The rumored upcoming retail 300 series rebrand (and the already released OEM 300 series rebrand) is a completely different beast. It is an across-the-board rebrand where the newly-named cards seem to take up the exact same segment as the "old" cards they replace. Of course in the competitive landscape, that place has naturally shifted downward over the last two years, as NVIDIA has introduced a new lineup of cards. But all AMD seems to be doing is introducing 1 or 2 new cards in the ultra-enthusiast segment, still based on their ~2 year old architecture, and renaming the entire lineup. If they had done that 6 months after the lineup was originally released, it would look like indecision. But being that it's being done almost 2 years after the original cards came out, it looks like a desperate attempt at staying relevant.
  • Oxford Guy - Monday, June 1, 2015 - link

    Nice spin. The bottom line is that both companies are guilty of deceptive naming practices, and that includes OEM nonsense.
  • Yojimbo - Monday, June 1, 2015 - link

    In for a penny, in for a pound, eh? I too could say "nice spin" in turn. But I prefer to weigh facts.
  • Oxford Guy - Monday, June 1, 2015 - link

    "I too could say 'nice spin' in turn. But I prefer to weigh facts."

    Like the fact that both companies are guilty of deceptive naming practices or the fact that your post was a lot of spin?
  • FlushedBubblyJock - Wednesday, June 10, 2015 - link

    AMD is guilty of going on a massive PR offensive, bending the weak minds of its fanboys and swearing they would never rebrand, as it is an unethical business practice.

    Then they launched their now completely laughable Gamer's Manifesto, which is one big fat lie.

    They broke every rule they ever laid out for their corpo pig PR halo, and as we can see, their fanboys to this very day cannot face reality.

    AMD is dirtier than blackbody radiation.
  • chizow - Monday, June 1, 2015 - link

    Nice spin; no one is saying either company has clean hands here, but the level to which AMD has rebranded GCN is certainly unprecedented.
  • Oxford Guy - Monday, June 1, 2015 - link

    Hear that sound? It's Orwell applauding.
  • Klimax - Tuesday, June 2, 2015 - link

    I see only rhetoric. But facts and counterpoints are missing. Fail...
  • Yojimbo - Tuesday, June 2, 2015 - link

    Because I already posted them in another thread and I believe they were in reply to the same guy.
  • Yojimbo - Tuesday, June 2, 2015 - link

    Orwell said that severity doesn't matter, everything is binary?
