Meet the GeForce GTX 1080 Ti Founder’s Edition

When it comes to the design of the GeForce GTX 1080 Ti Founder's Edition, if you've seen a high-end NVIDIA card in the last 4 years then you know roughly what to expect. NVIDIA has found their sweet spot in card design, and while the GTX 1080 Ti does have a few wrinkles of its own, it's not a radical departure from the likes of the GTX 1080, GTX 980 Ti, or GTX 780 Ti. The GeForce GTX 1080 Ti is a traditional NVIDIA reference card, with all the competence, performance, and functionality that entails.

At a high level, the GTX 1080 Ti Founder's Edition is a 10.5-inch long blower-type card, built around a cast aluminum housing and held together with a combination of rivets and screws. Designed as much for aesthetics as functionality, NVIDIA's use of well-secured metal has done a good job of tempering noise generation, and for those who like to show off their rigs, the basic design and LED-backlit logo are unmistakable.

Cracking open the card and removing the shroud exposes the card's fan and heatsink assembly. Once again NVIDIA is lining the entire card with an aluminum baseplate, which provides heatsinking for the VRMs and other discrete components below it, along with providing additional protection for the board. Like past NVIDIA 250W cards, the GTX 1080 Ti FE uses NVIDIA's vapor chamber cooler in order to maximize the heat transfer between the GPU/VRMs/DRAM and the aluminum heatsink above. As far as blower-type cards go, it's still the card to beat.


Airflow: GTX 1080 Ti vs. GTX 1080

For the GTX 1080 Ti, NVIDIA has refined this basic design just a bit in order to further increase airflow. The key change here is that NVIDIA has removed the DVI port that in past designs took up part of the second slot used for ventilation. Consequently, the entire slot is now open for airflow. As we'll see in our benchmarks, the practical difference in noise is small, but it does mean the GTX 1080 Ti runs a bit quieter than the GTX 980 Ti and GTX 780 Ti that came before it.

Otherwise, in a blink-and-you'll-miss-it kind of change, NVIDIA has also tweaked the tapering of the shroud itself to maximize airflow. It's difficult to pick up in pictures, but the shroud is just a bit thinner and its bottom edge sits just a bit higher, allowing a bit more unobstructed airflow through the card. This subtle change is a good example of where NVIDIA is in their card design cycle: with the major performance elements of the cooler design essentially a solved problem, NVIDIA is now toying with small changes to eke out just a bit more performance.

Popping off the cooler, we see NVIDIA’s reference PCB. This appears to be the same PCB used on the similarly configured GP102-based Titan X Pascal, and is consistent with NVIDIA’s past reuse of PCBs.

Of particular note, we can see that NVIDIA has equipped the card with a surprising number of MOSFETs, and it turns out there's a good reason for this. For the GTX 1080 Ti, NVIDIA has opted to go with two dual-FETs for each of the GPU's 7 power phases, as opposed to the traditional one-per-phase design used in most NVIDIA cards. While this drives up the total cost of the card a bit, the payoff is that it improves the card's power delivery efficiency a bit, especially at the 200W+ range the GTX 1080 Ti operates in.
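To put some rough numbers on that efficiency claim, here's a back-of-the-envelope sketch. The core voltage (~1.0V) and per-package on-resistance (2 mΩ) are illustrative assumptions rather than NVIDIA figures, and the model ignores switching losses, duty cycle, and the high-side/low-side split; it only illustrates why splitting a phase's current across two parallel FET packages cuts conduction losses.

```python
# Back-of-the-envelope VRM conduction-loss sketch (assumed values, not NVIDIA specs).
BOARD_POWER_W = 250.0    # GTX 1080 Ti board power limit
CORE_VOLTAGE_V = 1.0     # assumed GPU core voltage
PHASES = 7               # GPU power phases on the reference PCB
RDS_ON_OHM = 0.002       # assumed effective on-resistance per FET package

def conduction_loss_w(fet_packages_per_phase: int) -> float:
    """Total I^2*R conduction loss across all GPU phases, in watts."""
    total_current_a = BOARD_POWER_W / CORE_VOLTAGE_V           # ~250 A into the GPU
    phase_current_a = total_current_a / PHASES                  # ~36 A per phase
    fet_current_a = phase_current_a / fet_packages_per_phase    # parallel packages share current
    per_fet_loss_w = fet_current_a ** 2 * RDS_ON_OHM
    return per_fet_loss_w * fet_packages_per_phase * PHASES

print(f"1 package/phase:  {conduction_loss_w(1):.1f} W")   # ~17.9 W
print(f"2 packages/phase: {conduction_loss_w(2):.1f} W")   # ~8.9 W
```

Because conduction loss scales with the square of the current, halving the current through each package halves the total I²R loss for a given on-resistance; a handful of watts saved at the 200W+ operating range is exactly the kind of "game of inches" gain the extra MOSFETs buy.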

This isn't the first GeForce card we've seen with a large number of MOSFETs – NVIDIA's board partners at times treat it as a contest in and of itself – but this is the first time we've seen NVIDIA use such a large number and promote it. Generally speaking, additional phases and MOSFETs can improve a power delivery subsystem by spreading out the load – and in a game of inches, every bit counts – but the additional MOSFETs are subject to diminishing returns at both idle and load.

Otherwise this is a fairly typical NVIDIA PCB. The reuse of the Titan PCB means that the board should work with water blocks and other add-ons designed for the Titan, though I suspect that hardcore overclockers will continue to look to more specialized designs from the likes of MSI, EVGA, Asus, and others for the best overclocking results.

Flipping the card over to the back, we find NVIDIA’s now-standard segmented backplate. The presence of the backplate helps to protect the card, but one or both segments can be removed to add precious millimeters of room for airflow in tightly packed SLI designs.

Moving on, towards the top of the card we find the requisite SLI and power connectors. Like NVIDIA's 250W cards before it, the GTX 1080 Ti features a 6-pin + 8-pin power setup. NVIDIA's 250W limit means that, on-board circuitry aside, the power delivery system isn't anywhere close to the 300W that the slot and connectors are rated to supply (75W from the PCIe slot, 75W from the 6-pin, and 150W from the 8-pin). Otherwise we find a pair of SLI connectors, which like the rest of the GTX 10-series cards are designed for use with NVIDIA's latest-generation High Bandwidth (HB) bridges.
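For those keeping score, the headroom works out as in the quick tally below; the per-source figures are the standard PCIe power delivery ratings.

```python
# Standard PCIe power delivery ratings vs. the GTX 1080 Ti's board power limit.
POWER_SOURCES_W = {
    "PCIe x16 slot": 75,
    "6-pin connector": 75,
    "8-pin connector": 150,
}
TDP_W = 250  # GTX 1080 Ti power limit

available_w = sum(POWER_SOURCES_W.values())   # 300 W rated delivery
headroom_w = available_w - TDP_W              # 50 W of margin
print(f"Rated delivery: {available_w} W, TDP: {TDP_W} W, headroom: {headroom_w} W")
```

That 50W of margin is also what gives overclockers some room to raise the card's power target without exceeding what the slot and connectors are rated to deliver.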

Finally, getting back to NVIDIA's display I/O configuration, as we mentioned earlier NVIDIA has removed the DVI port from the card in favor of giving the card unobstructed airflow for better cooling. As a result, the card features only modern ports: 3x DisplayPort 1.4 and 1x HDMI 2.0b. With that said, as a consolation item of sorts for the remaining DVI users, NVIDIA is including a DisplayPort-to-SL-DVI adapter with the Founder's Edition card. This doesn't perfectly replace the missing DVI port – in particular, it can't drive 2560x1440 or 2560x1600 displays – but for lower resolution displays it will do the trick. But if it's not already clear from this change and the number of motherboards that have dropped DVI over the years, DVI's days are numbered, and we're only going to continue to see DVI ports go away at this point.
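The reason the adapter tops out below 2560x1440 is single-link DVI's 165 MHz TMDS pixel clock ceiling. The quick sketch below estimates the pixel clock each 60Hz mode needs using a crude reduced-blanking approximation (the ~160 extra pixels per line and ~41 extra lines per frame are rough assumptions on our part, not exact CVT-RB timings):

```python
# Rough pixel-clock estimates for 60 Hz modes vs. the single-link DVI limit.
SINGLE_LINK_DVI_MHZ = 165.0   # single-link TMDS pixel clock ceiling
H_BLANK_PX = 160              # assumed horizontal blanking (reduced-blanking style)
V_BLANK_LINES = 41            # assumed vertical blanking
REFRESH_HZ = 60

for width, height in [(1920, 1080), (1920, 1200), (2560, 1440), (2560, 1600)]:
    mhz = (width + H_BLANK_PX) * (height + V_BLANK_LINES) * REFRESH_HZ / 1e6
    verdict = "fits" if mhz <= SINGLE_LINK_DVI_MHZ else "needs dual-link"
    print(f"{width}x{height}@60Hz: ~{mhz:.0f} MHz -> {verdict}")
```

1920x1200@60Hz only squeaks under the single-link limit with reduced blanking, which is why the adapter is fine for lower resolution displays while 2560x1440 and 2560x1600 panels have to fall back to the DisplayPort and HDMI outputs.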

Comments

  • eddman - Friday, March 10, 2017 - link

    So we moved from "nvidia pays devs to deliberately not optimize for AMD" to "nvidia works with devs to optimize the games for their own hardware, which might spoil them and result in them not optimizing for AMD properly".

    How is that bribery, or illegal? If they did not prevent the devs from optimizing for AMD, then nothing illegal happened. It was the devs' own doing.
  • ddriver - Friday, March 10, 2017 - link

    Nope, there is an implicit, unspoken condition to receiving support from nvidia. To lazy slobs, that's welcome, and most devs are lazy slobs. Their line of reasoning is quite simple:

    "Working to optimize for amd is hard, I am a lazy and possibly lousy developer, so if they don't do that for me like nvidia does, I won't do that either, besides that would angry nvidia, since they only assist me in order to make their hardware look better, if I do my job and optimize for amd and their hardware ends up beating nvidia's, I risk losing nvidia's support, since why would they put money into helping me if they don't get the upper hand in performance. Besides, most people use nvidia anyway, so why even bother. I'd rather be taken to watch strippers again than optimize my software."

    Manipulation, bribery and extortion. nvidia uses its position to create a situation in which game developers have a lot to profit from NOT optimizing for amd, and a lot to lose if they do. Much like intel did with its exclusive discounts. OEMs weren't exactly forced to take those discounts in exchange for not selling amd; they did what they knew would please intel to get rewarded for it. Literally the same thing nvidia does. Game developers know nvidia will be pleased to see their hardware getting an unfair performance advantage, and they know amd doesn't have the money to pamper them, so they do what is necessary to please nvidia and ensure they keep getting support.
  • akdj - Monday, March 13, 2017 - link

    Where to start?
    Best not to start, as you are completely, 100% insane and I've spent two and a half 'reads' of your replies... trying to grasp WTH you're talking about and I'm lost
    Totally, completely lost in your conspiracy theories about two major GPU silicon builders while being apparently and completely clueless about ANY of it!
    Lol - Wow, I'm truly astounded that you were able to make up that much BS ...
  • cocochanel - Friday, March 10, 2017 - link

    You forgot to mention one thing. Nvidia tweaking the drivers to force users into hardware updates. Say, there is a bunch of games coming up this Christmas. If you have a card that's 3-4 years old, they release a new driver which performs poorly on your card ( on those games ) and another driver which performs way better on the newest cards. Then, if you start crying, they say: It's an old card, pal, why don't you buy a new one !
    With DX11 they could do that a lot. With DX12 and Vulkan it's a lot harder. Most if not all optimizations have to be done by the game programmers. Very little is left to the driver.
  • eddman - Friday, March 10, 2017 - link

    That's how the ENTIRE industry is. Do you really expect developers to optimize for old architectures? Everyone does it, nvidia, AMD, intel, etc.

    It is not deliberate. Companies are not going to spend time and money on old hardware with little market share. That's how it's been forever.

    Before you say that's not the case with radeons, it's because their GCN architecture hasn't changed dramatically since its first iteration. As a result, any optimization done for the latest GCN, affects the older ones to some extent too.
  • cocochanel - Friday, March 10, 2017 - link

    There is good news for the future. As DX12 and Vulkan become mainstream APIs, game developers will have to roll up their sleeves and sweat it hard. Architecturally, these APIs are totally different from the ground up and both trace their origin from Mantle. And Mantle was the biggest advance in graphics APIs in a generation. The good days for lazy game developers are coming to an end, since these new APIs put just about everything back into their hands whether they like it or not. Tweaking the driver won't make much of a difference. Read the APIs' documentation.
  • cmdrdredd - Monday, March 13, 2017 - link

    Yes, hopefully this will be the future, where games are the responsibility of the developer. Just like on consoles. I know people hate consoles sometimes, but the closed platform shows which developers have their stuff together and which are lazy bums, because Sony and Microsoft don't optimize anything for the games.
  • Nfarce - Friday, March 10, 2017 - link

    Always amusing watching the tin foil hat Nvidia conspiracy nuts talk. Here's my example: working on Project Cars as an "early investor." Slightly Mad Studios gave both Nvidia and AMD each 12 copies of the beta release to work on, the same copy I bought. Nvidia was in constant communication with SMS developers and AMD was all but never heard from. After about six months, Nvidia had a demo of the racing game ready for a promotion of their hardware. Since AMD didn't take Project Cars seriously with SMS, Nvidia was able to get the game tweaked better for Nvidia. And SMS hat-tipped Nvidia by having billboards in the game showing Nvidia logos.

    Of course, all the AMD fanboys claimed unfair competition and did the usual whining when their GPUs did not perform as well in some games as Nvidia's (they amazingly stayed silent when DiRT Rally, another development I was involved with, ran better on AMD GPUs and had AMD billboards).
  • ddriver - Friday, March 10, 2017 - link

    So was there anything preventing the actual developers from optimizing the game? They didn't have nvidia and amd hardware, so they sent betas to the companies to profile things and see how it runs?

    How silly must one be to expect that nvidia - a company that rakes in billions every year - and amd - a company that is in the red most of the time and has lost billions - will have the same capacity to do game developers' jobs for them?

    It is the game developer's job to optimize. Alas, as it seems, nvidia has bred a new breed of developers - those who do their job half-assedly and then wait on nvidia to optimize, conveniently creating an unfair advantage for their hardware.
  • ddriver - Friday, March 10, 2017 - link

    Also, talking about fanboys - I am not that. Yes, I am running dozens of amd gpus, and I don't see myself buying any nvidia product any time soon, but that's only because they offer superior value for what I need them for.

    I don't give amd extra credit for offering a better value. I know this is not what they want. It is what they are being forced into.

    I am in a way grateful to nvidia for sandbagging amd, because this way I can get much better value products. If things were square between the two, and all games were equally optimized, then both companies would offer products with approximately identical value.

    Which I would hate, because I'd lose the current 2-3x better value for the money I get with amd. I benefit and profit from nvidia being crooks, and I am happy that I can do that.

    So nvidia, keep doing what you are doing. I am not really objecting, I am simply stating the facts. Of course, nvidia fanboys would have a problem understanding that, and a problem with anyone tarnishing the good name of that helpful, awesome, stripper-paying company.
