Meet the GeForce GTX 1080 Ti Founders Edition

When it comes to the design of the GeForce GTX 1080 Ti Founders Edition, if you’ve seen a high-end NVIDIA card in the last 4 years then you know roughly what to expect. NVIDIA has found their sweet spot in card design, and while the GTX 1080 Ti does have a few wrinkles of its own, it’s not a radical departure from the likes of the GTX 1080, GTX 980 Ti, or GTX 780 Ti. The GeForce GTX 1080 Ti is a traditional NVIDIA reference card, with all the competence, performance, and functionality that entails.

At a high level, the GTX 1080 Ti Founders Edition is a 10.5-inch long blower-type card built around a cast aluminum housing held together with a combination of rivets and screws. Designed as much for aesthetics as functionality, NVIDIA’s use of well-secured metal has done a good job of tempering noise generation, and for those who like to show off their rigs, the basic design and LED-backlit logo are unmistakable.

Cracking open the card and removing the shroud exposes the card’s fan and heatsink assembly. Once again NVIDIA is lining the entire card with an aluminum baseplate, which provides heatsinking for the VRMs and other discrete components below it, along with providing additional protection for the board. Like past NVIDIA 250W cards, the GTX 1080 Ti FE uses NVIDIA’s vapor chamber cooler in order to maximize the heat transfer between the GPU/VRMs/DRAM and the aluminum heatsink above. As far as blower-type cards go, it’s still the card to beat.


Airflow: GTX 1080 Ti vs. GTX 1080

For the GTX 1080 Ti, NVIDIA has refined this basic design just a bit in order to further increase airflow. The key change here is that NVIDIA has removed the DVI port that in past designs took up part of the second slot used for ventilation. Consequently the entire slot is now open for airflow. As we’ll see in our benchmarks the practical difference in noise is small, but it still means the GTX 1080 Ti runs quieter than the GTX 980 Ti and GTX 780 Ti that came before it.

Otherwise in a blink-and-you’ll-miss-it kind of change, NVIDIA has also tweaked the tapering of the shroud itself to maximize the airflow. It’s difficult to pick up in pictures, but the shroud is just a bit thinner and the bottom of the shroud is just a bit higher, allowing just a bit more unobstructed airflow through the card. This subtle change is a good example of where NVIDIA is in their card design cycle: with the major performance elements of the cooler design essentially being a solved problem, NVIDIA is now toying with small changes to eke out just a bit more performance.

Popping off the cooler, we see NVIDIA’s reference PCB. This appears to be the same PCB used on the similarly configured GP102-based Titan X Pascal, and is consistent with NVIDIA’s past reuse of PCBs.

Of particular note, we can see that NVIDIA has equipped the card with a surprising number of MOSFETs, and it turns out there’s a good reason for this. For the GTX 1080 Ti, NVIDIA has opted to go with two dualFETs for each of the GPU’s 7 power phases, as opposed to the traditional 1-per-phase design used in most NVIDIA cards. While this drives up the total cost of the card a bit, the payoff is that it improves the card’s power delivery efficiency, especially in the 200W+ range the GTX 1080 Ti operates in.
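To put some rough numbers on that claim, below is a minimal back-of-the-envelope sketch of why paralleling two FETs per phase cuts conduction losses. The 250W board power and 7-phase count come from the card itself; the core voltage and on-resistance are assumed, purely illustrative values rather than NVIDIA’s actual figures.

```python
# Illustrative only: rough conduction-loss comparison for 1 vs. 2 FETs per phase.
# Board power and phase count are the card's; voltage and R_DS(on) are assumptions.

BOARD_POWER_W = 250.0    # GTX 1080 Ti rated board power
CORE_VOLTAGE_V = 1.0     # assumed GPU core voltage under load
PHASES = 7               # GPU power phases on the reference PCB
RDS_ON_OHM = 0.0015      # assumed on-resistance per dualFET (hypothetical)

total_current_a = BOARD_POWER_W / CORE_VOLTAGE_V      # ~250 A delivered to the GPU
per_phase_a = total_current_a / PHASES                # ~36 A per phase

def conduction_loss_w(phase_current_a: float, fets_per_phase: int) -> float:
    """I^2 * R loss for one phase; paralleling FETs splits the current between them."""
    i = phase_current_a / fets_per_phase
    return fets_per_phase * (i ** 2) * RDS_ON_OHM

one_fet = PHASES * conduction_loss_w(per_phase_a, 1)
two_fets = PHASES * conduction_loss_w(per_phase_a, 2)
print(f"1 FET/phase: ~{one_fet:.1f} W lost; 2 FETs/phase: ~{two_fets:.1f} W lost")
# Doubling the FETs halves the I^2*R conduction losses, all else being equal.
```

Even with made-up component values the takeaway holds: splitting the same current across twice the silicon halves the resistive losses, which is where the efficiency gain at high power comes from.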

This isn’t the first GeForce card we’ve seen with a large number of MOSFETs – NVIDIA’s board partners at times treat it as a contest in and of itself – but this is the first time we’ve seen NVIDIA use such a large number and promote it. Generally speaking, additional phases and MOSFETs can improve a power delivery subsystem by spreading out the load – and in a game of inches, every bit counts – but the additional MOSFETs are subject to diminishing returns at both idle and load.

Otherwise this is a fairly typical NVIDIA PCB. The reuse of the Titan PCB means that the board should work with water blocks and other add-ons designed for the Titan, though I suspect hardcore overclockers will continue to look to more specialized designs from the likes of MSI, EVGA, ASUS, and others for the best overclocking results.

Flipping the card over to the back, we find NVIDIA’s now-standard segmented backplate. The presence of the backplate helps to protect the card, but one or both segments can be removed to add precious millimeters of room for airflow in tightly packed SLI designs.

Moving on, towards the top of the card we find the requisite SLI and power connectors. Like NVIDIA’s 250W cards before it, the GTX 1080 Ti features a 6-pin + 8-pin power setup. Combined with the PCIe slot, that configuration can officially deliver 300W, leaving the card’s 250W TDP with a comfortable amount of headroom. Otherwise we find a pair of SLI connectors, which like the rest of the GTX 10-series cards are designed for use with NVIDIA’s latest-generation High Bandwidth (HB) bridges.
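As a quick reference for where that headroom comes from, here is a short sketch of the power budget; the per-source figures are the official PCI Express limits, and the 250W figure is the card’s rated TDP.

```python
# Power budget for a 6-pin + 8-pin card, using official PCI Express limits.

PCIE_SLOT_W = 75     # PCIe x16 slot
SIX_PIN_W = 75       # 6-pin PCIe power connector
EIGHT_PIN_W = 150    # 8-pin PCIe power connector
CARD_TDP_W = 250     # GTX 1080 Ti rated board power

available_w = PCIE_SLOT_W + SIX_PIN_W + EIGHT_PIN_W
print(f"Official power available: {available_w} W")                   # 300 W
print(f"Headroom over the card's TDP: {available_w - CARD_TDP_W} W")  # 50 W
```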

Finally, getting back to NVIDIA’s display I/O configuration, as we mentioned earlier NVIDIA has removed the DVI port from the card in favor of giving the card unobstructed airflow for better cooling. As a result, the card features only modern ports: 3x DisplayPort 1.4 and 1x HDMI 2.0b. With that said, as a consolation item of sorts for the remaining DVI users, NVIDIA is including a DisplayPort-to-SL-DVI adapter with the Founders Edition card. This doesn’t perfectly replace the missing DVI port – in particular, it can’t drive 2560x1440 or 2560x1600 displays – but for lower resolution displays it will do the trick. If it’s not already clear from this change and the number of motherboards that have dropped DVI over the years, DVI’s days are numbered, and we’re only going to continue to see DVI ports go away at this point.
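For anyone wondering why the adapter tops out below 2560x1440, the limitation is single-link DVI’s 165 MHz pixel clock cap. The sketch below estimates the pixel clock each mode needs; the ~12% blanking overhead is an assumed approximation (actual timings vary by display), while the 165 MHz cap comes from the DVI specification.

```python
# Rough pixel clock estimate vs. the single-link DVI 165 MHz limit.
# The 12% blanking overhead is an assumption (roughly CVT-RB-like timings).

SL_DVI_LIMIT_MHZ = 165.0

def approx_pixel_clock_mhz(h: int, v: int, refresh_hz: float, blanking: float = 1.12) -> float:
    """Active pixels per second scaled by an assumed blanking overhead, in MHz."""
    return h * v * refresh_hz * blanking / 1e6

for h, v, hz in [(1920, 1200, 60), (2560, 1440, 60), (2560, 1600, 60)]:
    clk = approx_pixel_clock_mhz(h, v, hz)
    verdict = "fits within" if clk <= SL_DVI_LIMIT_MHZ else "exceeds"
    print(f"{h}x{v}@{hz}Hz needs ~{clk:.0f} MHz, which {verdict} the 165 MHz SL-DVI limit")
```

1920x1200@60Hz comes in at roughly 155 MHz and squeaks under the limit, while 2560x1440 and 2560x1600 both land well above it, which is why those displays need dual-link DVI or a native DisplayPort/HDMI connection.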

Comments

  • ddriver - Thursday, March 9, 2017 - link

    It is kinda both, although I wouldn't really call it a job, because that's when you're employed by someone else to do what they say. More like it's my work and hobby. Building a supercomputer on a budget out of consumer-grade hardware turned out to be very rewarding in every possible aspect.
  • Zingam - Friday, March 10, 2017 - link

    This is something I'd like to do. Not necessarily with GPUs, but I have no idea how to make any money to pay the bills yet. I've only started thinking about it recently.
  • eddman - Thursday, March 9, 2017 - link

    "nvidia will pay off most game developers to sandbag"

    AMD, nvidia, etc. might work with developers to optimize a game for their hardware.

    Suggesting that they would pay developers to deliberately not optimize a game for the competition, or even make it perform worse, is a conspiracy theory made up on the internet.

    Not to mention it is illegal. No one would dare do it in this day and age when everything leaks eventually.
  • DanNeely - Thursday, March 9, 2017 - link

    Something that blatant would be illegal. What nVidia does do is offer a bunch of blobs that do various effects simulations/etc. that can save developers a huge amount of time vs. coding their own versions, but which run much faster on their own hardware than on nominally equivalent AMD cards. I'm not even going to accuse them of deliberately gimping AMD (or Intel) performance; having only a single code path that is optimized for the best results on their hardware will be sub-optimal on anything else. And because Gameworks is offered up as blobs (or source with can't-show-it-to-AMD NDA restrictions), AMD can't look at the code to suggest improvements to the developers or to fix things after the fact with driver optimizations.
  • eddman - Thursday, March 9, 2017 - link

    True, but most of these effects are CPU-only, and fortunately the ones that run on the GPU can be turned off in the options.

    Still, I agree that vendor specific, source-locked GPU effects are not helping the industry as a whole.
  • ddriver - Thursday, March 9, 2017 - link

    Have you noticed anyone touching nvidia lately? They are in bed with the world's most evil bstards. Nobody can touch them. Their practice is to offer assistance on exclusive terms; all of this aims to lock developers into their infrastructure, or at the very least comes with the implied condition that they don't break a sweat optimizing for radeons.

    I have very close friends working at AAA game studios and I know first hand. It all goes without saying. And nobody talks about it, not if they'd like to keep their job, or be able to get a good job in the industry in general.

    nvidia pretty much does the same thing intel was found guilty of on every continent. But it is kinda less illegal, because it doesn't involve discounts, so they cannot really pin bribery on them, in case anyone would dare challenge them.

    amd is actually very competitive hardware-wise, but failing at their business model, they don't have the money to resist nvidia's hold on the market. I run custom software at a level as professional as it gets, and amd gpus totally destroy nvidia's at the same or even higher price point. Well, I haven't been able to do a comparison lately, as I have migrated my software stack to OpenCL 2, which nvidia deliberately does not implement in order to prop up their cuda, but a couple of years back I was able to do direct comparisons, and as mentioned above, nvidia offered 2 to 3 times worse value than amd. And nothing has really changed in that aspect; architecturally amd continues to offer superior compute performance, even if their DP rates have been significantly slashed in order to stay competitive with nvidia silicon.

    A quick example:
    ~$2500 buys you either:
    a FirePro with 32 GB of memory, 2.6 TFLOPS of FP64 performance, and top-notch CL support, or
    a Quadro with 8 GB of memory, 0.13 TFLOPS of FP64 performance, and CL support years behind.

    Better compute features, 4 times more memory, and 20 times better compute performance at the same price. And yet the quadro outsells the firepro. Amazing, ain't it?

    It is true that 3rd party cad software still runs a tad better on a quadro, for the reasons and nvidia practices outlined above, but even then, the firepro is still fast enough to do the job, while completely annihilating quadros in compute. Which is why at this year's end I will be buying amd gpus by the dozens rather than nvidia ones.
  • eddman - Friday, March 10, 2017 - link

    So you're saying nvidia constantly engages in illegal activities with developers?

    I don't see what pro cards and software have to do with geforce and games. There is no API lock-in for games.
  • thehemi - Friday, March 10, 2017 - link

    > "And nobody talks about it, not if they'd like to keep their job"

    Haha, we're not scared of NVIDIA, they are just awesome. I've been in AAA for over a decade; they almost bought my first company and worked closely with my next three, so I know them very well. Nobody is "scared" of NVIDIA. NVIDIA have their devrel down. They are much more helpful with optimizations, free hardware, support, etc. Try asking AMD for the same and they treat you like you're a peasant. When NVIDIA give us next-generation graphics cards for all our developers for free, we tend to use them. When NVIDIA sends their best graphics engineers onsite to HELP us optimize for free, we tend to take them up on their offers. Don't think I haven't tried getting the same out of AMD; they just don't run the company that way, and that's their choice.

    And if you're really high up, their dev-rel includes $30,000 nights out that end up at the strip club. NVIDIA have given me some of the best memories of my life, they've handed me a next generation graphics card at GDC because I joked that I wanted one, they've funded our studio when it hit a rough patch and tried to justify it with a vendor promotion on stage at CES with our title. I don't think that was profitable for them, but the good-will they instilled definitely has been.

    I should probably write a "Secret diaries of..." blog about my experiences, but the bottom line is they never did anything but offer help that was much appreciated.

    Actually, heh, the worst thing they did was turn on PhysX support by default for a game we made with them for benchmarks, back when they bought Ageia. My game engine was used for their launch demo, and the review sites (including here, I think) found out that if you switched the setting to software mode, Intel chips doing software physics were faster than NVIDIA's hardware-accelerated PhysX mode. Still not illegal, and still not worried about keeping my job, since I've made it pretty obvious who I am to the right people.
  • ddriver - Friday, March 10, 2017 - link

    Well, for you it might be the carrot, but for others it's the stick. Not all devs are as willing to leave their products unoptimized in exchange for a carrot as you are. Nor do they need nvidia to hold them by the hand and walk them through everything that is remotely complex in order to be productive.

    In reality both companies treat you like a peasant; the difference is that nvidia has the resources to make you into a peasant they can use, while to poor old amd you are just a peasant they don't have the resources to pamper. Try this if you dare - instead of being a lazy, grateful slob, take the time and effort to optimize your engine to get the most out of amd hardware and brag about that marvelous achievement, and see if nvidia's pampering will continue.

    It is still technically a bribe - helping someone do something for free that ends up putting them at an unfair advantage. It is practically the same thing as giving you the money to hire someone who is actually competent to do what you evidently cannot be bothered with or are unable to do. They still pay the people who do that for you, which would be the same thing as if you paid them with money nvidia gave you for it. And you are so grateful for that assistance that you won't even be bothered to optimize your software for that vile amd, who don't rush to offer to do your job for you like noble, caring nvidia does.
  • ddriver - Friday, March 10, 2017 - link

    It is actually a little sad to see developers so cheap. nvidia took you to see strippers once and now you can't get your tongue out of their ass :)

    but it is understandable, as a developer there is a very high chance it was the first pussy you've seen in real life :D
