Second Generation GDDR5X: More Memory Bandwidth

One of the more unusual aspects of the Pascal architecture is the number of different memory technologies NVIDIA can support. At the datacenter level, NVIDIA has a full HBM 2 memory controller, which they use for the GP100 GPU. Meanwhile for consumer and workstation cards, NVIDIA equips GP102/104 with a more traditional memory controller that supports both GDDR5 and its more exotic cousin: GDDR5X.

A half-generation upgrade of sorts, GDDR5X was developed by Micron to further improve memory bandwidth over GDDR5, combining a faster memory bus with wider memory operations that read and write more data from DRAM per clock. And though it’s not without its own costs, such as designing new memory controllers and boards that can accommodate the tighter requirements of the GDDR5X memory bus, GDDR5X offers a step in performance between the relatively cheap and slow GDDR5 and the relatively fast and expensive HBM2.
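
To put the “wider memory operations” part in rough numbers: GDDR5X doubles the per-channel prefetch relative to GDDR5, so each access moves twice as much data for a given command rate. The snippet below is a minimal illustration using the commonly cited prefetch depths (8n for GDDR5, 16n for GDDR5X) over a 32-bit channel; the function and figures are illustrative assumptions rather than vendor specifications.

```python
# Minimal sketch: data moved per read access on one 32-bit memory channel,
# assuming the commonly cited prefetch depths (8n for GDDR5, 16n for GDDR5X).

def per_access_bytes(prefetch_n, channel_width_bits=32):
    """Bytes transferred per burst on a single memory channel."""
    return prefetch_n * channel_width_bits // 8

gddr5_access  = per_access_bytes(8)    # 32 bytes per access
gddr5x_access = per_access_bytes(16)   # 64 bytes per access

print(gddr5_access, gddr5x_access)     # 32 64
```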

With rival AMD opting to focus on HBM2 and GDDR5 for Vega and Polaris respectively, NVIDIA has ended up being the only PC GPU vendor to adopt GDDR5X. The payoff for NVIDIA, besides the immediate benefits of GDDR5X, is that they can ship with memory configurations that AMD cannot. Meanwhile for Micron, NVIDIA is a very reliable and consistent customer for their GDDR5X chips.

When Micron initially announced GDDR5X, they laid out a plan to start at 10Gbps and ramp to 12Gbps (and beyond). Now just under a year after the launch of the GTX 1080 and the first generation of GDDR5X memory, Micron is back with their second generation of memory, which of course is being used to feed the GTX 1080 Ti. And NVIDIA, for their part, is very eager to talk about what this means for them.

With Micron’s second generation GDDR5X, NVIDIA is now able to equip their cards with 11Gbps memory. This is a 10% year-over-year improvement, and a not-insignificant change given that memory speeds have historically improved at only a fraction of the rate of GPU throughput. Coupled with GP102’s wider memory bus (11 of its 12 32-bit memory channels are enabled on the GTX 1080 Ti, for a 352-bit bus), this allows NVIDIA to offer 484GB/sec of memory bandwidth with this card, a roughly 50% improvement over the GTX 1080.
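
As a quick sanity check on those figures, peak bandwidth is simply the bus width in bytes multiplied by the per-pin data rate. The sketch below reproduces the GTX 1080 Ti and GTX 1080 numbers; it’s a back-of-the-envelope calculation, not anything NVIDIA publishes.

```python
# Back-of-the-envelope check on the quoted memory bandwidth figures.

def peak_bandwidth_gbps(bus_width_bits, data_rate_gbps_per_pin):
    """Peak memory bandwidth in GB/sec: (bus width / 8) bytes * per-pin rate."""
    return bus_width_bits / 8 * data_rate_gbps_per_pin

gtx_1080_ti = peak_bandwidth_gbps(352, 11)   # 484.0 GB/sec
gtx_1080    = peak_bandwidth_gbps(256, 10)   # 320.0 GB/sec

print(gtx_1080_ti, gtx_1080, gtx_1080_ti / gtx_1080)   # 484.0 320.0 ~1.51
```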

For NVIDIA, this is something they’ve been eagerly awaiting. Pascal’s memory controller was designed for higher GDDR5X memory speeds from the start, but the memory itself needed to catch up. As one NVIDIA engineer put it to me “We [NVIDIA] have it easy, we only have to design the memory controller. It’s Micron that has it hard, they have to actually make memory that can run at those speeds!”

Micron for their part has continued to work on GDDR5X after its launch, and even after what I’ve been hearing was a more challenging launch than anticipated last year, both Micron and NVIDIA seem to be very happy with what Micron has been able to accomplish with their second generation GDDR5X memory.

As demonstrated in eye diagrams provided by NVIDIA, Micron’s second generation memory coupled with NVIDIA’s memory controller produces a very clean eye at 11Gbps, whereas the first generation memory (which was admittedly never spec’d for 11Gbps) would produce a very noisy eye. Consequently, NVIDIA and their partners can finally push past 10Gbps for the GTX 1080 Ti and the forthcoming factory overclocked GTX 1080 and GTX 1060 cards.

Under the hood, the big developments here were largely on Micron’s side. The company continued to optimize their metal layers for GDDR5X, and combined with improved test coverage they were able to make a lot of progress over the first generation of memory. This in turn is coupled with improvements in equalization and noise reduction, resulting in the clean eye described above.

Longer-term here, GDDR6 is on the horizon. But before then, Micron is still working on further improvements to GDDR5X. Micron’s original goal was to hit 12Gbps with this memory technology, and while they’re not there quite yet, I wouldn’t be too surprised to be having this conversation once again for 12Gbps memory within the next year.

Finally, speaking of memory, it’s worth noting that NVIDIA also dedicated a portion of their GTX 1080 Ti presentation to discussing memory capacity. To be honest, I get the impression that NVIDIA feels like they need to rationalize equipping the GTX 1080 Ti with 11GB of memory, beyond the obvious conclusions that it is cheaper than equipping the card with 12GB and that it better differentiates the GTX 1080 Ti from the Titan X Pascal.

In any case, NVIDIA believes that based on historical trends, 11GB will be sufficient for 5K gaming in 2018 and possibly beyond. Traditionally NVIDIA has not been especially generous with memory – cards like the 3GB GTX 780 Ti and 2GB GTX 770 felt the pinch a bit early – so going with a less-than-full memory bus doesn’t improve things there. On the other hand, with the prevalence of multiplatform games these days, one of the biggest drivers of memory consumption has been the consoles, each of which has 8GB of RAM; and with 11GB, the GTX 1080 Ti is well ahead of the consoles in this regard.

Comments

  • ddriver - Friday, March 10, 2017 - link

    Also, granted, there are some amd optimized games, albeit few and far between. But that doesn't excuse what nvidia does, nor does it justify it.

    Besides, it was nvidia who started this practice; amd simply tries its best to balance things out, but they don't have anywhere near the resources.

    amd optimized games are so rare that, out of my many contacts in the industry, I don't know a single one. So I cannot speak to the kinds of terms on which amd offers its assistance; I can only do that for nvidia's terms.

    If amd's terms for support are just as exclusive as nvidia's, then amd is guilty too. But even then, that doesn't make nvidia innocent. It makes amd guilty, and it makes nvidia like 100 times guiltier.
  • eddman - Friday, March 10, 2017 - link

    What terms? Are we back to "If we help, you cannot optimize your game for AMD"? How do you know there are such terms?

    Also, you said the entire helping out thing is illegal, terms or no terms. Now it's illegal only if there are certain terms?
  • ddriver - Friday, March 10, 2017 - link

    Read the reply above. nvidia doesn't state the terms, because that would be illegal; the terms are implied, and they refuse further support if you break them... and worse... so it is a form of legal bribery.

    And since their drivers are closed source, any hindrances they might implement to hamper your software remain a secret. But hey, there is a good reason why those drivers keep getting more and more bloated.
  • eddman - Friday, March 10, 2017 - link

    What if nothing is implied? Why are you so sure something must be implied if they're helping a dev? What if they simply want that game to work best with their hardware because it's an important game in their mind and might help sell some cards?

    We are heavily into guessing and assuming territory.

    I'm not saying shady stuff doesn't happen at all, but to think that it happens all the time without exception would be extreme exaggeration.
  • ddriver - Friday, March 10, 2017 - link

    There is no "nothing implied". It doesn't take a genius to figure out what nvidia's motivation for helping is. Of course, if it is an important, prominent title, nvidia might help out even if the studio optimizes for amd, just to save face.

    But then again, nvidia support can vary a lot; it can be just patching up something that would make them look bad, or it can be free graphics cards and strippers, as in the case of our lad above. I am sure he didn't optimize for amd. I mean that developers don't really care all that much how well their software runs; if it runs badly, just get a faster gpu. They care about how much pampering they get. So even in the case of a studio which is too big for nvidia to blackmail, there is still ample motivation to please nvidia for the perks which they won't be getting from amd.

    There is no assuming in what I say. I know this first hand. nvidia is very kind and generous to those willing to play ball, and very thuggish with those who don't. So it doesn't come as a surprise that most developers choose to be on its good side. The more you please nvidia, the more you get from it. If tomorrow you apply for a job, and there is a sexy chick competing for the position, she can get the job by blowing the manager, and even if you would too, he is not into guys. You are not in a position to compete, and it is an unethical thing that wins her the job. You happy about it?
  • eddman - Friday, March 10, 2017 - link

    You know this first hand how? You are claiming a lot and providing nothing concrete.
  • eddman - Friday, March 10, 2017 - link

    I already listed the motivation. Game runs well on their cards. People buy their cards.
  • cocochanel - Saturday, March 11, 2017 - link

    I don't understand your stubbornness. What ddriver is alluding to is questionable practices. In a free market system, fierce competition and all that, it becomes the norm. But it doesn't make it right. Free markets, you know, are like democracy. And you probably know what good old Winston had to say about that.
  • eddman - Saturday, March 11, 2017 - link

    No, he's outright calling such partnerships illegal. He claims to have "first hand" info but reveals nothing.

    Hardware companies working with software studios has been going on for decades. It wasn't illegal then and it isn't now.
  • DMCalloway - Saturday, March 11, 2017 - link

    Agreed. Had Intel been fined the true value of what it gained with global dominance over the next 10+ years, things would be vastly different in Sunnyvale right now. So much so that I would even speculate that Green team's only hope of viability would have come in the form of an acquisition on Blue team's part.
