Having just left the stage at AMD's financial analyst day is CEO Dr. Lisa Su, who presented an update on AMD's computing and graphics business. As AMD discussed its technology roadmaps for the next two years earlier in the presentation, we'll jump right into the new material.

Not mentioned in AMD's GPU roadmap but now confirmed by Dr. Su is that AMD will be launching new desktop GPUs this quarter. AMD is not saying much about these products yet, though based on their description it sounds like we're looking at high-performance parts (and for anyone asking, the picture of the card is a placeholder; AMD doesn't want to show the real product quite yet). These new products will support DirectX 12, though I will caution against confusing that with Feature Level 12_x support until we know more.

Meanwhile the big news here is that these forthcoming GPUs will be the first AMD GPUs to support High Bandwidth Memory (HBM). AMD's GPU roadmap coyly labels this as a 2016 technology, but in fact it is coming to GPUs in 2015. The advantage of adopting HBM now is that it allows AMD to greatly increase their memory bandwidth while bringing down power consumption. Couple that with the fact that any new GPU from AMD should also include the company's latest color compression technology, and the effective increase in memory bandwidth should be quite large. AMD sees this as one of the keys to delivering better 4K performance along with better VR performance.
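As a rough sanity check on the bandwidth claim, here is a quick back-of-the-envelope comparison using public first-generation HBM figures and a typical 512-bit GDDR5 configuration; the numbers are illustrative and are not specs for AMD's unannounced product.

```python
# Rough, illustrative comparison of theoretical peak memory bandwidth.
# Figures are public specs for first-gen HBM and a typical 512-bit GDDR5
# setup (R9 290X class), not AMD's unannounced product.

def bandwidth_gbps(bus_width_bits: int, data_rate_gbps_per_pin: float) -> float:
    """Peak bandwidth in GB/s = (bus width in bits * per-pin rate in Gb/s) / 8."""
    return bus_width_bits * data_rate_gbps_per_pin / 8

# 512-bit GDDR5 at 5 Gb/s per pin
gddr5 = bandwidth_gbps(512, 5.0)      # 320.0 GB/s

# Four first-gen HBM stacks, each 1024 bits wide at 1 Gb/s per pin
hbm = bandwidth_gbps(4 * 1024, 1.0)   # 512.0 GB/s

print(f"GDDR5: {gddr5:.0f} GB/s, HBM: {hbm:.0f} GB/s (+{hbm / gddr5 - 1:.0%})")
```

Even before factoring in color compression, the raw jump works out to roughly +60% in this configuration, at a much lower per-bit power cost thanks to the slow-and-wide bus.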

In the process AMD has also confirmed that these HBM-equipped GPUs will allow them to experiment with new form factors. By placing the memory on the same package as the GPU, AMD will be able to save space and produce smaller cards, enabling designs beyond the traditional 10”+ boards typical of high-end video cards. AMD competitor NVIDIA has been working on HBM as well and has already shown off a test vehicle for one such card design, so we have reason to expect that AMD will be capable of something similar.


With apologies to AMD: NVIDIA’s Pascal Test Vehicle, An Example Of A Smaller, Non-Traditional Video Card Design

Finally, while talking about HBM on GPUs, AMD is also strongly hinting that they intend to bring HBM to other products as well. Given their product portfolio, we consider this to be a pretty transparent hint that the company wants to build HBM-equipped APUs. AMD’s APUs have traditionally struggled to reach peak performance due to their lack of memory bandwidth – 128-bit DDR3 only goes so far – so HBM would be a natural extension to APUs.
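For a sense of the gap, here is a quick calculation using standard DDR3-2133 figures and the first-generation HBM per-stack rate; these are not announced APU specs, just an illustration of why the 128-bit interface is the bottleneck.

```python
# Back-of-the-envelope: why a 128-bit DDR3 interface starves an APU's GPU.
# DDR3-2133 numbers are JEDEC-standard; the 128 GB/s per-stack figure is
# from first-gen HBM specs, not from any announced AMD APU.

def ddr_bandwidth_gbs(bus_width_bits: int, transfer_rate_mts: int) -> float:
    """Peak bandwidth in GB/s = bus width in bytes * transfers per second."""
    return (bus_width_bits / 8) * transfer_rate_mts * 1e6 / 1e9

dual_channel_ddr3 = ddr_bandwidth_gbs(128, 2133)   # ~34.1 GB/s
print(f"128-bit DDR3-2133: {dual_channel_ddr3:.1f} GB/s")
print("Single first-gen HBM stack: 128 GB/s")
```

A single HBM stack would deliver nearly 4x the bandwidth of the dual-channel DDR3 interface current APUs rely on, which is why an HBM-equipped APU is such an obvious fit.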

Comments

  • Hicks12 - Wednesday, May 6, 2015 - link

    I don't understand the issue with AMD drivers; I go for whatever team makes the best card for the money, be it the red or green team.

    I used to rock a GTX 480 (yay for free upgrades!) and moved to a 7950 later on. If anything, drivers have been more stable on the AMD side of late, as I have always had issues running multiple monitors on Nvidia GPUs (very finicky). I would assume it's been corrected in the latest batch, but I don't know, as I don't currently own an Nvidia card. I can't find a single broken feature for myself on the AMD front.

    Crossfire does work; not sure where you got that information from. Does it work 100% of the time with zero faults? Hell no, but neither does Nvidia's SLI solution.

    FreeSync works as advertised and was integrated into the VESA standard, which shows how solid the idea is (yes, AMD simply revised an old system to work for this century). G-Sync has issues and FreeSync is still gaining momentum. You pay about £50 more for a FreeSync monitor vs. a standard one, so the small outlay is great if you're looking to replace one. G-Sync requires some consideration, as it's significantly more expensive and you're locked in with Nvidia. That's fine if you know you will always be rocking a green card, but honestly, no one knows what will happen in 5 years' time!

    G-Sync is good, but FreeSync does the job (seems a lot of reviewers came to the same conclusion) and will only improve over the next year. It would be nice if Nvidia simply integrated support for FreeSync; then no one would have to decide on a GPU manufacturer just because of this silly feature that should have been an industry standard years ago.

    Think I've lost my point... all I see is AMD bashing, but it seems unjust, as drivers are okay on average for both sides now. If you went back 5 years, then yes, AMD were flaky, but not anymore...
  • chizow - Thursday, May 7, 2015 - link

    Hi, you may want to do some fact-checking before you post in reply.

    CrossFire was in reference to FreeSync, which is still broken; there was just an update on it saying the driver was delayed.

    FreeSync doesn't work as advertised; there are gaps in its VRR windows, at which point it drops and reverts to VSync Off and exhibits all that unpleasant behavior. Also, on some panels it disables Overdrive completely, causing ghosting. And the new Asus panel? Its VRR window is limited to 90Hz using FreeSync but goes up to 144Hz without. Certainly not the 9-240Hz AMD claimed and their fanboys parroted, inaccurately.

    You're locked in on the monitor either way, so once again, why not spend more on the better solution, especially given Nvidia offers the better GPU solution also? What is $1050 ($750 G-Sync + $300 970) in hardware compared to $900 ($600 FreeSync + $300 290X) in hardware to someone spending almost $1000 anyways?
  • testbug00 - Thursday, May 7, 2015 - link

    Er, FreeSync works exactly as intended:
    1. In the refresh range that the MONITOR MANUFACTURER CHOOSES, it has a variable refresh rate.
    2. What do you mean gaps in VRR? Any proof? All the "major issues" I've seen with FreeSync have NOTHING to do with FreeSync; they instead have to do with the monitor manufacturer, or are FUD.
    3. The 9-240Hz range is the specification of the refresh range that Adaptive-Sync supports. For a manufacturer to make a 240Hz monitor and get everything working so it runs 9-240Hz smoothly would be really expensive.

    And, the fact you cannot tell the difference between $900 and $1050 clearly shows that you are in the top sliver of the population. And, for the record, the difference between the 1440p monitors (what I would pair with a 970/290 for most people) is $480 vs. $749. If you're aiming for a 1080p display, G-Sync seems to be the better buy.

    However, well... Given you're going for a "value 1440p" build... Well, the AMD Freesync version would currently be ~$1600 (http://pcpartpicker.com/p/MFvb6h) while the Nvidia version would be ~1900 (http://pcpartpicker.com/parts/partlist/). You could equalize the price by going 1080p with Nvidia, or lower it a bit by going 2160p (with max 60Hz, however) also.

    G-Sync has some fringe advantages, but for $300? I could not recommend that to the people I deal with. And I spend a lot of time configuring computers for friends and people I barely know. For a 1080p build, though, Nvidia is currently the winner hands down! :)
  • chizow - Thursday, May 7, 2015 - link

    1. Except that's not what AMD said during the run-up. Nowhere did they state such VRR window limitations in all the pressers and interviews they did. They also published a whitepaper that fanboys like Creig dishonestly quoted numerous times stating 9-240Hz supported refresh rates, in a clearly dishonest attempt to make FreeSync look better.

    2. There's plenty of proof: below the stated VRR window, VSync is forced off, so you get a lurch followed by ghosting, tearing and stuttering. Any review, even Anandtech's superficial one, covers this. At high refresh rates you get the same behavior, although you can choose to force VSync on at the top end. It is less noticeable at high refresh rates, though, since any tearing is going to be less pronounced.
    3. Again, this was quoted by MANY AMD fanboys as one of the reasons FreeSync was claimed to be superior to G-Sync months before anyone even had a chance to prove it! See, this is exactly the kind of BS/misinformation that AMD put out there that simply dies hard. Same for being free, firmware flashable, standard every DP monitor will have etc. It sets up FALSE and UNREALISTIC expectations that ultimately turn out to be bogus. That's what AMD loves to do though, because they didn't actually have a working solution, they used this kind of FUD and misinformation to try and hold off adoption. Doesn't matter now though, world gets to see what they have been working on for the last 21 months, and its not real good lol.
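[Ed: to make the back-and-forth above concrete, here is a simplified sketch of how a variable-refresh panel behaves inside and outside its VRR window, as the commenters describe it. This is a toy model, not either vendor's actual implementation, and the window figures are illustrative.]

```python
# Toy model of variable-refresh behavior relative to a panel's VRR window.
# Not AMD's or Nvidia's implementation; it just encodes the behavior the
# comments describe: smooth inside the window, degraded outside it.

def vrr_behavior(fps: float, vrr_min: float, vrr_max: float,
                 vsync_above: bool = False) -> str:
    """Classify display behavior for a given frame rate and VRR window."""
    if vrr_min <= fps <= vrr_max:
        return "variable refresh (smooth, no tearing)"
    if fps > vrr_max:
        # At the top end the user can force VSync on; otherwise frames tear.
        return "vsync (capped)" if vsync_above else "tearing"
    # Below the window: panel falls back to fixed refresh with VSync off.
    return "tearing/stuttering (below VRR floor)"

# Example: a panel with a 40-144Hz window, as shipping FreeSync monitors had
for fps in (25, 60, 160):
    print(fps, "->", vrr_behavior(fps, vrr_min=40, vrr_max=144))
```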

    I didn't say I couldn't tell the difference. I said $150 for anyone who is spending that much is not going to be too much to spend for the better solution, especially when there's a good chance they won't have to spend that much. With Nvidia, there's a good chance the cards you own already support G-Sync. With AMD, there's a good chance you're going to have to upgrade if you don't already own a 290/X.

    And what are you comparing at $480? The sucky Acer TN to the awesome Swift? Nah, that Acer is junk, but if you want a better comparison you can use the Asus IPS 1440p vs. the Acer IPS 1440p, and you can see the difference is only $150-$200. Which is about right given the Asus only supports 90Hz refresh in VRR FreeSync mode. Once again, looks like the Nvidia premium is justified.

    I guess that's good you recommend AMD to people you barely know, it'd probably be pretty hard to get repeat business from them making the kinds of recommendations you're giving without even knowing the kinds of issues you're setting them up for. Certainly better than explaining to them why there's this ghosting they haven't seen since 2007 PMVA days or why their FreeSync panel keeps popping in and out of tearing/stuttering modes I guess.
  • testbug00 - Thursday, May 7, 2015 - link

    1. The FreeSync slide clearly says "published refresh rate range == 9-240Hz" because it's based off Adaptive-Sync, which... has a published range of 9-240Hz!!!! The issue is not what AMD did. The issue is that people who support or don't support it don't seem to understand that is the SPEC, which does not reflect what shipping products will have. Shipping products are doing 40-144Hz. Hence, currently, G-Sync has a slight advantage at the low end of the range. Depending on which FreeSync monitor you get, you may have a range that is even worse.

    A side note before we continue: Anandtech's FreeSync review(s) have all been pitiful compared to what some other sites have done. The best one I've managed to read so far in terms of exploring the differences between G-Sync and FreeSync is TechReport's (http://techreport.com/review/28073/benq-xl2730z-fr...

    2. Ghosting is caused/controlled by the monitor firmware/control panel; FreeSync is not involved.
    Tearing and stuttering happen on every monitor to varying degrees. When the frame rate falls below (or above) the VRR window, the monitor runs at its max refresh rate, causing tearing and stuttering.

    3. Once more, AMD was very clear in their slides that that was the published range. The issue is people on both sides blowing things up. A monitor implementing the full Adaptive-Sync range would be better than current G-Sync monitors, given the firmware/control panel are adjusted for ghosting/overdrive/etc. That product, plus QA and such, would probably end up costing around the same as G-Sync does. So FreeSync can deliver a slightly worse product for a noticeable price reduction, or a better monitor for the same price. Now, the same price is my guesstimate. It could be wrong.

    As for the monitors... I've honestly stopped pushing for nicer IPS panels, given the display can be calibrated properly. Personally, I will pay the extra. Most people I've dealt with aren't willing to. Hence the larger difference in price due to the monitor.

    If it were me buying the setup, I would likely end up going for a FreeSync display, as once Adaptive-Sync is enabled by Nvidia/Intel it will essentially be hardware agnostic. And afterwards, buy a cheap GPU (750/750 Ti level of performance) to drive the games I play at low settings until 16FF comes out. In general, my recommendations end up being whatever is best in the price range: ~3 years ago, an i5 + 7870 (3-monitor setup). Since the 970 came out... well, that one's obvious. Same for when Hawaii came out.

    I do a bit of support for any issues with that stuff, and I've not noticed any major issues for either vendor. Of course, my sample size is likely not representative of the population as a whole, and not large enough even if it were. If you had ~80 identical systems, 40 Nvidia and 40 AMD, in a population that represents the average video card owner... perhaps you could draw some useful information. However, as you've said, you run pretty much exclusively Nvidia and have experience with a whole 1 AMD card in the sample.
  • chizow - Friday, May 8, 2015 - link

    1. Again, now you're in the awkward position of making excuses for AMD's published deceptive specs. Don't pull a Creig here and keep insisting some day far off in the future FreeSync may support 9-240Hz, but today it is better because an AMD spec sheet said so. It's dishonest, plain and simple. AMD should have published specs according to what they knew and what was available on their test samples, but again, we know this wasn't possible because they published those BS specs when they did not have product!

    2. Did you read the TFT Central review I linked you? It is clearly linked to FreeSync because the FreeSync command overrides and directly conflicts with the OverDrive command, thus disabling anti-ghosting measures only when FreeSync is enabled. Tearing and stuttering don't happen with G-Sync, ever actually, because Nvidia has provisions on both ends of the spectrum to explicitly prevent it. While AMD only deals with these fundamental issues within a much more limited VRR window. So surely, you can agree the fact there are so many asterisks and special cases with FreeSync, AMD was dishonest when they said FreeSync would actually be free and that all that hardware Nvidia was charging for might actually be worth the premium?

    3. No, it is not people on both sides; it is one side putting out nonsense and the other calling them on it. AMD put out bullshit misinformation because they didn't have product and they were trying to slow adoption. This is their MO. Their fanboys take this misinformation and run with it, scream it from the rooftops and perpetuate it, and it simply never dies. Even today you still have dim-witted AMD fanboys asking when the firmware flashes will be available for their half-a-decade-old monitor. Where did they get this idea? No, it couldn't be AMD at fault, could it? The ones who coined the misnomer name FreeSync to begin with and told everyone who would listen that existing monitors on the market could support their spec with a firmware update, for essentially free? But I guess you will exonerate them of this as well when people start seeing some awful ghosting and tearing even at that 30-40Hz range and wonder why FreeSync isn't working at 9Hz like AMD claimed?

    Again, I don't think people who are going to pay $600+ for a monitor are going to balk at another $100-$150 if it means getting a better product. Indeed, we've seen the top G-Sync panels sold out consistently at launch, like the Asus ROG Swift and Acer 27" IPS. Meanwhile, the FreeSync panels are available in abundance everywhere they are sold.

    You could buy a FreeSync panel hoping Nvidia adopts Adaptive Sync someday, but you'd be making a pretty big mistake with that expectation. Again, what guarantee do you have that even if Nvidia were to adopt an Adaptive Sync solution, that panel would be compatible? That AMD would even allow it? You know these panels are certified under a logo program, one that is trademarked by AMD. In any case I would love to see it honestly, it would just be one less concern when buying an Nvidia GPU as you would get your choice of either Adaptive Sync or G-Sync. Win-win for Nvidia users.

    As for the last bit, sorry to burst your bubble, but the real world isn't a 40/40 split. It's more like 60/20 nowadays. And you still generally see fewer issues with Nvidia users and systems. Also, my experience with AMD isn't just limited to the 290X; my wife actually used an AMD card before we met, and her 5850, despite being a media darling, was a disaster when it actually came to game compatibility. Sims 3, one of the most popular games in the world; you would think AMD had their drivers straight there. Nope, no SM2.0 for pet hair/fur lol. Also, broken in-game MSAA support on AMD cards; you had to enable FSAA via the driver, which was MUCH more performance intensive. Ultimately I "downgraded" her to an old GTX 280 I had lying around, and it was actually a much better experience despite being 15-20% slower on paper.
  • Hicks12 - Friday, May 8, 2015 - link

    Thanks, you're correct about FreeSync with CrossFire; I didn't know you were specifically talking about that, so I will give you that :).

    The price though... sorry, I don't see it as only a $150 difference; a quick look shows the Acer is $300 more for the G-Sync model... That's insane! You could spend $300 on a decent GPU upgrade.

    Nope... G-Sync is more mature, but FreeSync is better for the market in the end, and that's exactly why VESA put it in their standard.
  • chizow - Friday, May 8, 2015 - link

    @Hicks12: Do you already own an R9 290/X? Because if not, that's your $300 right there. And the premium on the Asus is well deserved (it's actually $200 now) because it's a premium-build panel, unlike the Acer, which has pretty poor build quality before you even get to the inferior VRR results from FreeSync.

    But hey, feel free to go that route, I think it is important for everyone to go with their gut and see what works best for them!
  • Crunchy005 - Monday, May 11, 2015 - link

    Hmm, a nice monitor with FreeSync and a graphics card UPGRADE for the same price as the other monitor. I don't see the issue there.
  • chizow - Wednesday, May 13, 2015 - link

    Because you are paying the same amount for an inferior solution at that point, but no surprise you don't see an issue there, you are an AMD fan after all.
