For much of the last month we have been discussing bits and pieces of AMD’s GPU plans for 2016. As part of the Radeon Technologies Group’s formation last year, the group’s leader and chief architect, Raja Koduri, has set about making his mark on AMD’s graphics technology. Along with consolidating all graphics matters under the RTG, Raja and the rest of the group have also begun changing how they interact with the public, with developers, and with their customers.

One of those changes – and the impetus for these recent articles – is that the RTG wants to be more forthcoming about future product developments. Traditionally AMD held their cards close to their chest on architectures, keeping them secret until the first products based on a new architecture launched (and even then sometimes not discussing matters in detail). With the RTG this is changing: similar to competitors Intel and NVIDIA, the RTG wants to prepare developers and partners for new architectures sooner. As a result, the RTG has given us a limited, high-level overview of their GPU plans for 2016.

Back in December we started things off with RTG’s plans for display technologies – DisplayPort, HDMI, FreeSync, and HDR – and how the company would be laying the necessary groundwork in future architectures to support their goals of higher resolution displays, more ubiquitous FreeSync-over-HDMI, and the wider color spaces and higher contrast of HDR. The second of RTG’s presentations that we covered focused on their software development plans, including Linux driver improvements and the consolidation of all of RTG’s various GPU libraries and SDKs under the GPUOpen banner, which will see these resources released on GitHub as open source projects.

Last but not least among RTG’s presentations is without a doubt the most eagerly anticipated subject: the hardware. As RTG (and AMD before them) has commented in the past couple of years, a new architecture is being developed for future RTG GPUs. Dubbed Polaris (the North Star), RTG’s new architecture will be at the heart of their 2016 GPUs, and is designed for what can now be called the current generation of FinFET processes. Polaris incorporates a number of new technologies, including a 4th generation Graphics Core Next design for the core of the GPU, and of course the new display technologies that RTG revealed last month. Finally, the first Polaris GPUs should be available in mid-2016, roughly six months from now.

First Polaris GPU Is Up and Running

But before we dive into Polaris and RTG’s goals for the new architecture, let’s talk about the first Polaris GPUs. With the first products expected to launch in the middle of this year, it comes as no surprise that RTG has their first GPUs back from the fab and up & running. To that end – and as I am sure many of you are eager to hear – as part of their presentation RTG showed off the first Polaris GPU in action, however briefly.

As a quick preface, while RTG demonstrated a Polaris based card in action, we the press were not allowed to see the physical card or take pictures of the demonstration. Similarly, while Raja Koduri held up an unsoldered version of the GPU used in the demonstration, again we were not allowed to take any pictures. So while we can talk about what we saw, for now that is all we can do. I don’t think it’s unfair to say that RTG has had issues with leaks in the past, and while they wanted to confirm to the press that the GPU and the demonstration were real, they don’t want the public (or the competition) seeing the GPU before they’re ready to show it off. That said, I do know that RTG is planning to recap Polaris at CES 2016 as part of AMD’s overall presence, so we may yet see the GPU at CES after the embargo on this information has expired.

In any case, the GPU RTG showed off was a small one. And while Raja’s hand is hardly a scientifically accurate basis for size comparisons, if I had to guess I would wager it’s a bit smaller than RTG’s 28nm Cape Verde GPU or NVIDIA’s GK107, which is to say likely smaller than 120mm². This is clearly meant to be RTG’s low-end GPU, and given the evolving state of FinFET yields, I wouldn’t be surprised if this was the very first GPU design they got back from GlobalFoundries, as its size makes it comparable to current high-end FinFET-based SoCs. In that case it may well also be the first Polaris GPU we see in mid-2016, though that’s just supposition on my part.

For their brief demonstration, RTG set up a pair of otherwise identical Core i7 systems running Star Wars Battlefront. The first system contained an early engineering sample Polaris card, while the other system had a GeForce GTX 950 installed (specific model unknown). Both systems were running at 1080p Medium settings – about right for a GTX 950 on the X-Wing map RTG used – and generally hitting the 60fps V-sync limit.

The purpose of this demonstration was threefold for RTG: to showcase that a Polaris GPU was up and running, that the small Polaris GPU in question could offer performance comparable to the GTX 950, and finally to show off the energy efficiency advantage of the small Polaris GPU over current 28nm GPUs. To that end RTG also plugged each system into a power meter to measure total system power at the wall. In the live press demonstration we saw the Polaris system average 88.1W while the GTX 950 system averaged 150W. Meanwhile RTG’s own official lab tests (the numbers cited in their slide) measured 86W and 140W respectively.

Keeping in mind that this is wall power – PSU efficiency and the power consumption of other components are in play as well – the message RTG is trying to send is clear: Polaris should be a very power efficient GPU family thanks to the combination of architecture and FinFET manufacturing. That RTG is measuring a 54W difference at the wall is definitely a bit surprising, as the GTX 950 averages under 100W to begin with; even after accounting for PSU efficiency, this implies that the power consumption of the Polaris video card is about half that of the GTX 950. But as this is clearly a carefully arranged demo with a framerate cap and a chip still in early development, I wouldn’t read too much into it at this time.
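For those who want to follow the arithmetic, below is a minimal sketch (in Python) of how that "about half" estimate falls out of the wall readings. Only the 86W and 140W figures come from RTG; the PSU efficiency and the GTX 950's card-level power draw are illustrative assumptions.

# A rough sketch of the card-level math implied by RTG's wall numbers.
# Only the wall readings come from RTG; the PSU efficiency and GTX 950
# card power figures below are illustrative assumptions.

polaris_wall_w = 86.0   # RTG's official lab figure (total system, at the wall)
gtx950_wall_w = 140.0   # same system with a GTX 950 installed

psu_efficiency = 0.87   # assumption: a typical 80 Plus PSU at this load
gtx950_card_w = 90.0    # assumption: typical GTX 950 gaming power draw

# Since the two systems are otherwise identical, the wall-power delta,
# scaled by PSU efficiency, approximates the DC-side difference between
# the two video cards.
dc_delta_w = (gtx950_wall_w - polaris_wall_w) * psu_efficiency
implied_polaris_card_w = gtx950_card_w - dc_delta_w

print(f"DC-side delta:        ~{dc_delta_w:.0f} W")              # ~47 W
print(f"Implied Polaris card: ~{implied_polaris_card_w:.0f} W")  # ~43 W

Under these assumptions the Polaris card lands around 43W against the GTX 950's assumed 90W, which is where the "about half" figure comes from.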

Comments

  • baii9 - Monday, January 4, 2016 - link

    All this efficiency talk, you guys must love Intel.
  • Sherlock - Tuesday, January 5, 2016 - link

    Off topic - Is it technically feasible to have a GPU with only USB 3 ports & other interfaces tunneled through it?
  • Strom- - Tuesday, January 5, 2016 - link

    Not really. The available bandwidth with USB 3.1 Gen 2 is 10 Gbit/s. DisplayPort 1.3 has 25.92 Gbit/s. [See the bandwidth sketch after the comments.]
  • daddacool - Tuesday, January 5, 2016 - link

    The big question is whether we'll see Radeons with the same/slightly lower power consumption as current high end cards but with significant performance hikes. Getting the same/marginally better performance at a much lower power consumption isn't really that appealing to me :)
  • BrokenCrayons - Tuesday, January 5, 2016 - link

    Identical performance at reduced heat output would be appealing to me. I'd be a lot more interested in discrete graphics cards if AMD or NV can produce a GPU for laptops that can be passively cooled alongside a passively cooled CPU. If that doesn't happen soon, I'd rather continue using Intel's processor graphics and make do with whatever they're capable of handling. Computers with cooling fans aren't something I'm interested in dealing with.
  • TheinsanegamerN - Tuesday, January 5, 2016 - link

    Almost all laptops have fans. Only the Core M series can go fanless, and gaming on them is painful at best.
  • TheinsanegamerN - Tuesday, January 5, 2016 - link

    And AMD or not, a 2-3 watt GPU would perform no better. Physically, it's impossible to make a decent GPU in a fanless laptop.
  • BrokenCrayons - Wednesday, January 6, 2016 - link

    The Celeron N3050 in the refreshed HP Stream 11 & 13 doesn't require a cooling fan. Under heavy gaming demands, the one I own only gets slightly warm to the touch. Based on my experiences with the Core M, I do think it's a decent processor but highly overpriced and simply not deserving of a purchase in light of Cherry Trail processors' 16 EU IGP being a very good performer. I do lament the idea of manufacturers capping system memory at 2GB on a single channel and storage at 32GB in the $200 price bracket since, at this point, it's unreasonable to penalize performance that way and force the system to burn up flash storage life by swapping. At least the storage problem is easy to fix with SD or a tiny plug-and-forget USB stick that doesn't protrude much from the case.

    I think that a ~2 watt AMD GPU could easily be added to such a laptop to relieve the system's RAM of the responsibility of supporting the IGP and be placed somewhere in the chassis where it could work fine with just a copper heat plate without thermally overloading the design. Something like that would make a fantastic gaming laptop and I'd happily part with $300 to purchase one.

    If that doesn't come to fruition soon, I think I'd rather just shift my gaming over to an Android. A good no-contract phone can be had for about $30 and really only needs a Bluetooth keyboard and controller pad to become a gaming platform. Plus you can make the occasional call with it if you want. I admit that it makes buying a laptop for gaming a hard sell with the only advantage being an 11 versus 4 inch screen. But I've traveled a lot with just a BlackBerry for entertainment and later a 3.2 inch Android and it worked pretty well for a week or two in a hotel to keep me busy after finishing on-site. I have kicked around the idea of a 6 inch Kindle Fire too, though... either way, AMD is staring down a lot of good options for gaming so it needs to get to work bringing wattage way down in order to compete with gaming platforms that can fit in a pocket.
  • Nagorak - Wednesday, January 6, 2016 - link

    Well, you're clearly not much of a gamer, since what the Intel IGPs are capable of is practically nothing.

    I will agree that the space heaters we currently have in laptops are kind of annoying, but I don't see that improving any time soon.
  • doggface - Wednesday, January 6, 2016 - link

    I don't know about fanless. But if NVIDIA or AMD could get a meaningful CPU + GPU package in under ~30 watts TDP, that would be very interesting. I can see an ultrabook with a small fan @ 1080p/medium being a great buy for Rocket League/LoL fans.
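As a footnote to Strom-'s bandwidth point above, here is a minimal back-of-the-envelope sketch of why an uncompressed display stream doesn't fit through a 10 Gbit/s USB link. The resolutions and the 24 bits-per-pixel figure are illustrative assumptions, and blanking/protocol overhead is ignored, so real requirements run somewhat higher.

# Raw pixel bandwidth for an uncompressed display stream, before
# blanking and protocol overhead (real requirements run higher).
def raw_display_gbps(width, height, refresh_hz, bits_per_pixel=24):
    return width * height * refresh_hz * bits_per_pixel / 1e9

USB31_GEN2_GBPS = 10.0       # USB 3.1 Gen 2 signaling rate
DP13_EFFECTIVE_GBPS = 25.92  # DisplayPort 1.3 effective bandwidth

for name, mode in [("1080p60", (1920, 1080, 60)), ("4K60", (3840, 2160, 60))]:
    needed = raw_display_gbps(*mode)
    print(f"{name}: {needed:.2f} Gbit/s needed | "
          f"fits USB 3.1 Gen 2: {needed < USB31_GEN2_GBPS} | "
          f"fits DP 1.3: {needed < DP13_EFFECTIVE_GBPS}")

1080p60 needs roughly 2.99 Gbit/s and would fit; 4K60 needs roughly 11.94 Gbit/s, already past a 10 Gbit/s USB link even before overhead, which is why tunneling display output the other way (DisplayPort carrying USB) is the more practical arrangement.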
