Polaris: Made For FinFET

The final aspect of RTG’s Polaris hardware presentation (and the bulk of their slide deck) focuses on current-generation FinFET manufacturing processes and what they mean for Polaris.

As RTG’s slide concisely and correctly notes, the regular march of progress in semiconductor fabrication has tapered off over the last decade. What was once a yearly cadence of new manufacturing processes – a major new node every two years with a half-node step in the intermediate years – slowed to a new node only every two years. And even then, after the 20nm planar process proved unsuitable for GPUs due to leakage, we are now in our fifth year of 28nm planar as the leading manufacturing node for GPUs. The failure of 20nm has essentially stalled GPU manufacturing improvements, and in RTG’s case resulted in GPUs being canceled and features being delayed to accommodate the unexpected stall at 28nm.

For their most recent generation of products both RTG and NVIDIA took steps to improve their architectural efficiency due to the lack of a new manufacturing process – with NVIDIA having more success at this than RTG – but ultimately both parties were held back from what they had originally planned back around 2010. So to say that the forthcoming move to FinFET for new GPUs is a welcome change would be an understatement; after nearly half a decade of 28nm GPUs, we will finally see the kind of true generational improvements that can only come from a new manufacturing node.

To no surprise then, RTG is aggressively targeting FinFET with Polaris and promoting its benefits. With power consumption now the primary limit on GPU performance, the greatest gains can only come from improving overall power efficiency, and for RTG, FinFETs will be a big part of getting there. Polaris will be the first RTG architecture designed for FinFETs, and coupled with the architectural improvements discussed earlier, it should deliver the largest overall increase in performance per watt of any Radeon GPU family.

We’ve already covered the technical aspects of FinFET a number of times before, so I’m not going to go into too much depth here. But at the most basic level, FinFETs are the solution to the leakage problems that have made planar transistors impractical below 28nm (and ultimately killed 20nm for GPUs). By raising the transistor channel into a 3D fin and wrapping the gate around it on three sides, it becomes possible to control leakage in a manner not possible with planar transistors, and that in turn significantly improves energy efficiency by reducing the amount of energy a GPU wastes just by being turned on.

With the introduction of FinFET manufacturing processes, GPU manufacturing can essentially get back on track after the issues at 20nm. FinFETs will be used for generations to come, and while the initial efficiency gain from adding FinFETs will likely be the single greatest gain, solving the leakage problem gives foundries a route to 10nm and beyond. At the same time, however, as far as the Polaris GPUs are concerned, it should be noted that the current generation of 16nm/14nm FinFET processes is not too far removed from 20nm with FinFETs added. Which is to say that the move to FinFETs gets GPU manufacturing back on track, but it won’t make up for lost time: by historical standards, 14nm/16nm FinFET is essentially only one generation beyond 28nm, and the gains we're expecting from the move should be framed accordingly.

As for RTG’s FinFET manufacturing plans, the fact that RTG only mentions “FinFET” rather than a specific FinFET process (e.g. TSMC 16nm) is intentional. The group has confirmed that it will be utilizing both traditional partner TSMC’s 16nm process and AMD fab spin-off (and Samsung licensee) GlobalFoundries’ 14nm process, making this the first time that AMD’s graphics group has used more than a single fab for a GPU generation. To be clear, there’s no expectation that RTG will be dual-sourcing – having both fabs produce the same GPU – but rather the implication is that designs will be split between the two fabs. To that end, we know that the small Polaris GPU that RTG previewed will be produced by GlobalFoundries on its 14nm process; meanwhile it remains to be seen how the rest of RTG’s Polaris GPUs will be split between the fabs.

Unfortunately, what’s not clear at this time is why RTG is splitting designs like this. Even without dual-sourcing any specific GPU, RTG will still incur extra costs to develop common logic blocks for both fabs. Meanwhile, it's also not clear whether either process/fab is better or worse for GPUs, or what die sizes are viable, so until RTG discloses more information about the split, the technical reasons remain open to speculation. On the financial side of matters, however, as AMD continues to operate under its wafer supply agreement with GlobalFoundries, it’s likely that this split helps AMD fulfill its wafer obligations by giving GlobalFoundries more of AMD's chip orders.

Closing Thoughts

And with that, we wrap up our initial look at RTG's Polaris architecture and the final article in this series on RTG's 2016 GPU plans. As a high-level overview, what we've seen so far really only scratches the surface of RTG's plans - and this is very much by design. But as the first occasion of RTG opening up its roadmap and giving us a bit of a look into the future, it's a welcome change not only for developers, but for the press and public alike.

Backed by the first major node shrink for GPUs in over four years, RTG has laid out an aggressive plan for Polaris in 2016. At this point RTG needs to catch up and close the market share gap with NVIDIA - of this RTG is quite aware - and Polaris will be the means to do that. What needs to happen now is for RTG to fully execute on the plans it has laid out; if it can do so, then 2016 should turn out to be an interesting (and competitive) year in the GPU industry.

153 Comments


  • Cinnabuns - Monday, January 4, 2016 - link

    Power:performance translates into $:performance if you're the one footing the utility bill.
  • nikaldro - Tuesday, January 5, 2016 - link

    This is a common myth. Unless your PC drinks A LOT of Watts, your power bill won't change much.
  • HollyDOL - Tuesday, January 5, 2016 - link

    Depends where you live (cost per MWh), how long do you run the PC and how loaded it is... It ain't that hard to calculate... where I live I pay ~ 62Eur per MWh (75 with tax)... so running 600W power hog vs 300W 8 hours a day (wife@home, so prolly it's even more) puts you on 108 vs 54 Eur a year (plus tax) on computer alone. It's not tragic, but also not that little to just plain neglect it...
  • Peter2k - Monday, January 4, 2016 - link

    Because AMD runs too hot to be cooled easily compared to NVidia
    Well high end cards anyway
    Less heat = less noise/or more frequency (=more performance)
  • Ramon Zarat - Monday, January 4, 2016 - link

    Thanks to the destruction of the middle class, the erosion of purchasing power and price rise of all sources of energy in the last 40 years, the Watt per frame equation is actually more important than ever. Unless you are part of the privileged 1% and wipe your ass with a Benjamin...
  • Spoelie - Tuesday, January 5, 2016 - link

    Actually, no.

    You may want to watch this if you want to improve your world-view beyond that of a chimp.
    https://t.co/kpnCsLDidb
  • anubis44 - Thursday, January 14, 2016 - link

    I watched your TED Talk, and I don't see how what Ramon Zarat said is refuted in this TED Talk.

    Obviously, when he refers to the 'destruction of the middle class', he means the one in the United States. And he's right. There has been a net destruction of good-paying, solid, middle class jobs in the United States (and Canada, where I live, for that matter). He's also right that the optimistic goal espoused after WWII to use nuclear power to produce cheap energy for all has been completely co-opted by a corrupt cabal of wealthy manipulators, and for some mysterious reason, we're STILL burning fossil fuels to produce most of our energy (including transportation). Of course, the expensive energy economy these greedy fools have tried to impose on us is unsustainable, and is unravelling at the seams. Now that so many countries are dependent on oil revenues, and have allowed their manufacturing to flow to China, there is no solidarity among oil producers, and they're ALL pumping the stuff out of the ground as fast as they can because they're all dependent on the oil revenues to survive. Either most of these countries get out of the oil business and realize they need to have a manufacturing-based economy once again while they can, or we'll descend into a desperate, anarchic world, with countries simply invading each other for oil in order to try to maintain their falling oil incomes by controlling more and more of the production. Disgusting.
  • ASEdouardD - Tuesday, January 5, 2016 - link

    Like Spoelie said, kinda, the world as a whole is living much more materially comfortably than 40 years ago, thanks in large part to the rise of Asian middle-classes and a significant improvement in South America. Purchasing power and living standards are also higher in Europe and the US than they were 40 years ago, even though the gains from economic growth have been disproportionately concentrated in the hands of the richest, especially in the US.

    Now, there is clearly a stagnation of income for the American middle-class, but things are not really going worse than they were 40 years ago.

    We can debate the economic fortunes of the US and Europe in the last 10 years, but the world average is much, much better than it was in 1975.
  • boozed - Monday, January 4, 2016 - link

    If your parents are paying for your electricity and air conditioning, sure, go ahead and ignore power consumption.

    Meanwhile, in the real world...
  • looncraz - Tuesday, January 5, 2016 - link

    Meanwhile, in the real world... people who pay their own bills know that a 40W GPU difference is effectively zero.

    I replaced all of my lighting with LEDs and saved an estimated average of 500W 24/7. The extra 40W, or even 140W, under load for the few hours of gaming a day any normal person has time for will not make any noticeable impact on the electric bill beyond the noise of a two-degree temperature swing outside.
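The electricity-cost arithmetic that HollyDOL and looncraz are trading above is easy to check for yourself. A minimal sketch in Python; the 62 EUR/MWh pre-tax rate and 8 hours/day are HollyDOL's figures, while the 3 hours/day used for the 40W GPU delta is my own stand-in for looncraz's "few hours of gaming":

```python
def annual_cost_eur(draw_watts, hours_per_day=8.0, eur_per_mwh=62.0):
    """Yearly electricity cost (pre-tax EUR) of a device at a constant draw."""
    mwh_per_year = draw_watts * hours_per_day * 365 / 1_000_000
    return mwh_per_year * eur_per_mwh

# HollyDOL's 600W vs. 300W machines, 8 hours a day:
print(f"{annual_cost_eur(600):.0f} EUR")  # ~109 EUR/year
print(f"{annual_cost_eur(300):.0f} EUR")  # ~54 EUR/year

# A 40W GPU power delta over 3 hours of gaming a day:
print(f"{annual_cost_eur(40, hours_per_day=3):.2f} EUR")  # ~2.72 EUR/year
```

Which bears out both comments: a 300W gap run all day is a real (if modest) line item, while a 40W delta for a few hours of gaming is lost in the noise.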
