A Bit More On Graphics Core Next 1.1

With the launch of Hawaii, AMD is finally opening up a bit more on what Graphics Core Next 1.1 entails. No, they still aren’t giving us an official name – most references to GCN 1.1 are noting that 290X (Hawaii) and 260X (Bonaire) are part of the same IP pool – but now that AMD is in a position where they have their new flagship out they’re at least willing to discuss the official feature set.

So what does it mean to be Graphics Core Next 1.1? As it turns out, the leaked “AMD Sea Islands Instruction Set Architecture” from February appears to be spot on. Naming issues with Sea Islands aside, everything AMD has discussed as being new architecture features in Hawaii (and therefore also in Bonaire) previously showed up in that document.

As such, the bulk of the changes that come with GCN 1.1 are compute oriented, and are clearly intended to play into AMD’s plans for HSA by adding features that are especially useful for the style of heterogeneous computing AMD is shooting for.

The biggest change here is support for flat (generic) addressing, which will be critical to enabling effective use of pointers within a heterogeneous compute context. Coupled with that is a subtle change to how the ACEs (compute queues) work, allowing GPUs to have more ACEs and more queues in each ACE, versus the hard limit of 2 ACEs we’ve seen in Southern Islands. The number of ACEs is not fixed – Hawaii has 8 while Bonaire only has 2 – which means the count can be scaled up for higher-end GPUs, console APUs, etc. Finally, GCN 1.1 also introduces some new instructions, including a Masked Quad Sum of Absolute Differences (MQSAD) and some FP64 floor/ceiling/truncation vector functions.
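To put the MQSAD addition in more concrete terms, below is a toy C model of a masked quad SAD. To be clear, this is our own illustration rather than AMD’s documented ISA semantics: we assume the usual masked-SAD convention in which a zero byte in the reference operand drops that byte from the sum, with the “quad” part evaluating four SADs across a byte-sliding 64-bit window, the access pattern of a video encoder’s motion search.

```c
#include <stdint.h>
#include <stdio.h>

/* Toy model of a masked sum of absolute differences over four bytes.
 * Assumption for illustration: a zero byte in the reference value masks
 * that byte out of the sum (the usual masked-SAD convention). */
static uint32_t masked_sad4(uint32_t src, uint32_t ref)
{
    uint32_t sum = 0;
    for (int i = 0; i < 4; i++) {
        uint8_t s = (uint8_t)(src >> (8 * i));
        uint8_t r = (uint8_t)(ref >> (8 * i));
        if (r != 0)                      /* masked lanes contribute nothing */
            sum += (s > r) ? (s - r) : (r - s);
    }
    return sum;
}

/* The "quad" part: four masked SADs over a byte-sliding 64-bit window,
 * which is what a motion-estimation inner loop evaluates at every
 * candidate position. On GCN 1.1 this collapses into roughly one
 * instruction instead of the loops above. */
static void mqsad(uint64_t window, uint32_t ref, uint32_t out[4])
{
    for (int i = 0; i < 4; i++)
        out[i] = masked_sad4((uint32_t)(window >> (8 * i)), ref);
}

int main(void)
{
    uint32_t out[4];
    mqsad(0x0807060504030201ull, 0x04000302u, out); /* ref byte 2 is masked */
    for (int i = 0; i < 4; i++)
        printf("SAD[%d] = %u\n", i, (unsigned)out[i]);
    return 0;
}
```

Video encoding and computer-vision kernels run this exact pattern millions of times per frame, which is why it merits a dedicated instruction.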

Along with these architectural changes, there are a couple of other hardware features that at this time we feel are best lumped under the GCN 1.1 banner when talking about PC GPUs, as GCN 1.1 parts were the first to introduce these features and every GCN 1.1 part (at least thus far) has them. AMD’s TrueAudio would be a prime example of this, as both Hawaii and Bonaire have integrated TrueAudio hardware, with AMD setting clear expectations that we should also see TrueAudio on future GPUs and APUs.

AMD’s Crossfire XDMA engine is another feature that is best lumped under the GCN 1.1 banner. We’ll get to the full details of its operation in a bit, but the important part is that it’s a hardware level change (specifically an addition to their display controller functionality) that’s once again present in Hawaii and Bonaire, although only Hawaii is making full use of it at this time.

Finally, we’d also roll AMD’s power management changes into the general GCN 1.1 family, again for the basic reasons listed above. AMD’s new Serial VID Interface (SVI2), necessary for the large number of power states Hawaii and Bonaire support and the fast switching between them, is something that only shows up starting with GCN 1.1. AMD has implemented power management a bit differently in each product from an end user perspective – Bonaire parts have the states but lack the fine-grained throttling controls that Hawaii introduces – but the underlying hardware is identical.

With that in mind, that’s a short but essential summary of what’s new with GCN 1.1. As we noted way back when Bonaire launched as the 7790, the underlying architecture isn’t going through any massive changes, and as such the differences are primarily of interest to programmers rather than end users. But they are distinct differences that will play an important role as AMD gears up to launch HSA next year. Consequently, what limited fracturing there is between GCN 1.0 and GCN 1.1 is primarily due to the ancillary features, which unlike the core architectural changes are going to be of importance to end users. The addition of XDMA, TrueAudio, and improved power management (SVI2) are all small features on their own, but they make GCN 1.1 a more capable, more reliable, and more feature-filled design than GCN 1.0.

Comments

  • Sandcat - Friday, October 25, 2013

    That depends on what you define as 'acceptable frame rates'. Yeah, you do need a $500 card if you have a high refresh rate monitor and use it for 3D games, or just improved smoothness in non-3D games. A single 780 with my brother's 144Hz Asus monitor is required to get ~90 fps (i7-930 @ 4.0) in BF3 on Ultra with MSAA.

    The 290X almost requires liquid...the noise is offensive. Kudos to those with the equipment, but really, AMD cheaped out on the cooler in order to hit the price point. Good move, imho, but too loud for me.
  • hoboville - Thursday, October 24, 2013

    Yup, and it's hot. It will be worth buying once the manufacturers can add their own coolers and heat pipes.

    AMD has always been slower at lower res, but better in the 3x1080p to 6x1080p arena. They have always aimed for high-bandwidth memory, which always performs better at high res. This is good for you as a buyer because it means you'll get better scaling at high res. It's essentially forward-looking tech, which is good for those who will be upgrading monitors in the next few years when 1440p IPS starts to be more affordable. At low res the bottleneck isn't RAM, but compute power. Regardless, buying a Titan / 780 / 290X for anything less than 1440p is silly, you'll be way past the 60-70 fps human eye limit anyway.
  • eddieveenstra - Sunday, October 27, 2013

    Maybe 60-70 fps is the limit, but at 120Hz, 60 FPS will give noticeable lag. 75 is about the minimum. That or I'm having eagle eyes. The 780 GTX still dips into low framerates at 120Hz (1920x1080). So the whole debate about the Titan or 780 being overkill @1080p is just nonsense. (780 GTX 120Hz gamer here)
  • hoboville - Sunday, October 27, 2013

    That really depends a lot on your monitor. When they talked about G-Sync and frame lag and smoothness, they mentioned that when FPS doesn't exactly match the refresh rate you get latency and bad frame timing. That you have this problem with a 120 Hz monitor is no surprise, as at anything less than 120 FPS you'll see some form of stuttering. When we talk about FPS > refresh rate then you won't notice this. At home I use a 2048x1152 @ 60 Hz and beyond 60 FPS all the extra frames are dropped, whereas in your case some frames will "hang" when you are getting less than 120 FPS, because those frames have to "sit" on the screen for an extra interval until the next one is displayed. This appears as stuttering, and you need to get a higher FPS from the game in order for the frame delivery to appear smoother. This is because apparent delay decreases as a ratio of [delivered frames (FPS) / monitor refresh speed]. Once the ratio is small enough, you can no longer detect apparent delay. In essence 120 Hz was a bad idea, unless you get G-Sync (which means a new monitor).

    Get a good 1440p IPS at 60 Hz and you won't have that problem, and the image fidelity will make you wonder why you ever bought a monitor with 56% of 1440p pixels in the first place...
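hoboville's ratio argument above is easy to put rough numbers on. The following back-of-the-envelope C model is our own sketch (not from the review or the comments) and assumes the simplest display behavior: each frame is flipped onto the screen at the first refresh tick after it finishes rendering. Frame rates that don't divide evenly into the refresh rate then alternate between one and two refresh intervals on screen, which is the judder being described; even divisors such as 60 FPS on 120 Hz pace uniformly, just with longer holds.

```c
#include <stdio.h>

int main(void)
{
    const int refresh_hz = 120;
    const int fps_list[] = { 60, 80, 100, 120 };

    for (int i = 0; i < 4; i++) {
        int fps = fps_list[i];
        int min_hold = 1 << 30, max_hold = 0;

        /* Frame f finishes rendering at f/fps seconds and is flipped onto
         * the screen at the first refresh tick at or after that moment,
         * i.e. tick index ceil(refresh_hz * f / fps). Its on-screen time
         * is the gap until the tick where frame f+1 is flipped in. */
        for (int f = 1; f <= 240; f++) {
            int t0 = (refresh_hz * f + fps - 1) / fps;        /* ceil */
            int t1 = (refresh_hz * (f + 1) + fps - 1) / fps;
            int hold = t1 - t0;             /* in refresh intervals */
            if (hold < min_hold) min_hold = hold;
            if (hold > max_hold) max_hold = hold;
        }
        printf("%3d FPS @ %d Hz: frames stay on screen %.1f-%.1f ms\n",
               fps, refresh_hz,
               min_hold * 1000.0 / refresh_hz,
               max_hold * 1000.0 / refresh_hz);
    }
    return 0;
}
```

At 120 Hz this prints a steady 16.7 ms hold at 60 FPS, an alternating 8.3-16.7 ms at 80 and 100 FPS, and a uniform 8.3 ms only at a full 120 FPS, which is the sense in which a high refresh monitor demands a high and steady framerate.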
  • eddieveenstra - Sunday, October 27, 2013

    To be honest, I would never think about going back to 60Hz. I love 120Hz but don't know a thing about IPS monitors. Thanks for the response....

    Just checked it out and that sounds good. When they become more affordable I will start thinking about that. Seems like IPS monitors are better with colors and have less blur @60Hz than TN. link: http://en.wikipedia.org/wiki/IPS_panel
  • Spunjji - Friday, October 25, 2013

    Step 1) Take data irrespective of different collection methods.

    Step 2) Perform average of data.

    Step 3) Completely useless results!

    Congratulations, sir; you have broken Science.
  • nutingut - Saturday, October 26, 2013

    But who cares if you can play at 90 vs 100 fps?
  • MousE007 - Thursday, October 24, 2013

    Very true, but remember, the only reason Nvidia prices their cards where they are is because they could. (E.g. Intel CPUs vs AMD.) Having said that, I truly welcome the competition as it makes it better for all of us, regardless of which side of the fence you sit on.
  • valkyrie743 - Thursday, October 24, 2013

    The card runs at 95C and sucks power like no tomorrow. It only beats the 780 by a very little, and does not overclock well.

    http://www.youtube.com/watch?v=-lZ3Z6Niir4
    and
    http://www.youtube.com/watch?v=3OHKWMgBhvA

    http://www.overclock3d.net/reviews/gpu_displays/am...

    I like his review. It's purely honest and shows the facts. I'm not an Nvidia fanboy nor am I an AMD fanboy, but I'll take Nvidia right now over AMD.

    I do like how this card is priced and the performance for the price. It makes the Titan not worth 1000 bucks (or the 850 bucks it goes for used on forums). But as for the 780: if you get a non-reference 780, it will be faster than the 290X and put out LESS heat and LESS noise, as well as use less power.

    Plus the GTX 780 Ti is coming out in mid November, which will probably cut the cost of the current 780 to 550, and the Ti would probably be around 600 and beat this card even more.
  • jljaynes - Friday, October 25, 2013

    You say the review sticks with the facts – he starts off talking about how ugly the card is, so it needs to beat a Titan. And then in the next sentence he says the R9 290X will cost $699.

    He sure seems to stick with the facts.
