Along with this week’s teaser of the forthcoming Radeon RX 470 and RX 460 at E3, AMD also held a short press briefing about Polaris. The bulk of AMD’s presentation is going to be familiar to our readers who keep close tabs on AMD’s market strategy (in a word, VR), but this latest presentation also brought to light a few more details on the company’s two Polaris GPUs that I want to quickly touch upon.

First and foremost, AMD’s presentation included a slide with pictures of the two chips, along with confirmation of their full configurations. The larger Polaris 10 is a 36 CU (2304 SP) chip, meaning that the forthcoming Radeon RX 480 video card is using a fully enabled chip. Meanwhile the smaller Polaris 11 (note that these pictures aren’t necessarily to scale) packs 16 CUs (1024 SPs). This puts it a bit below Pitcairn (20 CUs) before factoring in GCN 4’s higher efficiency. And as is common for these lower-power GPUs, AMD’s slide also confirms that it features a 128-bit memory bus.

AMD is expecting Polaris 11 to offer over 2 TFLOPs of performance. Assuming a liberal range of 2.0 to 2.5 TFLOPs for possible shipping products, this would put the clockspeed of a high-end Polaris 11 part between 975MHz and 1220MHz, which is similar to our projections for the RX 480/Polaris 10. Note that AMD has not yet announced any specific product using Polaris 11; however, since we now know that the RX 470 is a Polaris 10 based card, it’s safe to assume that the RX 460 is Polaris 11, and that the over-2 TFLOPs projection is for that card.
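As a quick sanity check on those figures, the implied clockspeed for a given FP32 throughput follows from GCN’s 2 FLOPs (one FMA) per stream processor per clock. A minimal sketch, with AMD’s TFLOPs targets plugged in:

```python
# Back-of-the-envelope clockspeed estimate for a 1024 SP Polaris 11 part.
SP_COUNT = 1024        # 16 CUs x 64 SPs per CU
FLOPS_PER_CLOCK = 2    # one FMA per SP per clock (GCN FP32)

def clock_for_tflops(tflops, sp_count=SP_COUNT):
    """Core clock in MHz needed to hit a given FP32 TFLOPs figure."""
    return tflops * 1e12 / (FLOPS_PER_CLOCK * sp_count) / 1e6

for target in (2.0, 2.5):
    print(f"{target} TFLOPs -> ~{clock_for_tflops(target):.0f} MHz")
```

This lands at roughly 977MHz and 1221MHz respectively, in line with the 975MHz–1220MHz range above.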

Second, AMD’s press release on Monday briefly mentioned the low z-height of at least Polaris 11, and it pops up in this slide deck again. There was some confusion over whether the z-height figure referred to the laptop or the chip, but the slide makes it clear that this is about the chip itself. So it will be interesting to see just how thin Polaris 11 is, how that compares to other chips, and what manufacturers can in turn do with it.




  • name99 - Wednesday, June 15, 2016 - link

    If you have content encoded at 10b AND a decent decoder, you can get most of the value of the 10b by dithering the final output down to 8b. (Cheapest is probably just to add blue noise to the signal and clamp.) This would look better (and definitely result in vastly diminished banding) than encoding at 8b.

    Now DO the existing decoders do a decent dither at the final stage of creating the output frame, as opposed to just clipping the lowest two bits? I have no idea.
  • Alexvrb - Thursday, June 16, 2016 - link

    With any halfway decent player, yes. 10b looks great on an 8b display. Heck VLC does an admirable job and anyone can use that. It's even available on mobile devices, Apple TV, etc - and they're beta testing a UWP version.

    Some of the older/default players for some devices do clip/ignore though which results in funny colors at times. But like I said there's options for novices and advanced users alike.
  • Fujikoma - Thursday, June 16, 2016 - link

    I have 4K 60fps files that are usually short films dealing with science. Monitors do support 10-bit content... as I don't have cable, this works fine in my home. I stream TV and movies, from a couple of NAS setups, over a wireless AC MU-MIMO setup through some WD boxes and computers.
  • plonk420 - Friday, June 17, 2016 - link

    the point of 10-bit is that even if your material to encode is 8-bit, things that would challenge an 8-bit (re)encoder are a piece of cake with a 10-bit colorspace, avoiding worse gradient artifacting without having to use obscenely higher bitrates to compensate
  • hojnikb - Wednesday, June 15, 2016 - link

    >AnandTech's prior testing showed that Intel GPUs choke on 4K 60p material, and failed entirely at playing back 10-bit material

    Actually, it works just fine, but you need dual channel ram for it to work.

    Also, 10 bit support is offloaded to the GPU, not cpu.
  • npz - Wednesday, June 15, 2016 - link

    No, the fixed function quicksync video decoders do not support 10bit.
  • npz - Wednesday, June 15, 2016 - link

    And last I checked they did not implement a shader-based decoder either for 10-bit HEVC
  • vladx - Wednesday, June 15, 2016 - link

    You're in luck then because Kaby Lake will bring full hardware 4K encode/decode. No need to pay more for an AMD card.
  • mdriftmeyer - Thursday, June 16, 2016 - link

    No, just pay up the ass for the Intel CPU/iGPU when I can buy the Zen/dGPU and get more bang for my money.
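On the dithering point raised in the comments above: the add-noise-and-clamp approach can be sketched roughly as follows. This is a minimal illustration (hypothetical helper name, uniform white noise standing in for blue noise):

```python
import random

def dither_10b_to_8b(samples_10b, seed=0):
    """Quantize 10-bit values (0-1023) down to 8-bit (0-255) with dither.

    Adding roughly one output-LSB of noise before truncation trades visible
    banding for fine grain. Blue noise (as suggested above) looks better;
    uniform white noise is used here purely for simplicity.
    """
    rng = random.Random(seed)
    out = []
    for s in samples_10b:
        # One 8-bit step spans 4 ten-bit steps, so dither in [0, 4).
        noisy = s + rng.uniform(0.0, 4.0)
        # Truncate the low 2 bits and clamp to the 8-bit range.
        out.append(max(0, min(255, int(noisy) >> 2)))
    return out

# A mid-gray 10-bit value of 510 truncates to 127 every time without
# dither; with dither its 8-bit output averages ~127.5 (= 510/4), which
# is what suppresses the banding.
```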
