
36 Comments


  • skiboysteve - Monday, May 26, 2014 - link

    Why doesn't every SKU have TrueAudio and TrustZone?
  • MikeMurphy - Monday, May 26, 2014 - link

    Because AMD hasn't migrated all of their chips to the GCN 1.1 architecture found in Bonaire and Hawaii, which brings TrueAudio and TrustZone support. It will happen soon enough.
  • extide - Tuesday, May 27, 2014 - link

    No, it has nothing to do with that. All of these chips (mobile Kaveri) are GCN 1.1; AMD is just disabling some features on some chips, like Intel does.

    Also, TrustZone has nothing to do with GCN 1.1; TrustZone is the ARM Cortex-A5 core...

    However, only having TrustZone enabled on the lower-end SKUs is interesting.
  • extide - Tuesday, May 27, 2014 - link

    Thinking about it some more, I bet there are two dies: the single-module one (with the Cortex-A5) and the dual-module one, which probably doesn't have the Cortex-A5 on it; that would explain why the higher-end SKUs don't support TrustZone. Both are going to be GCN 1.1 for sure, though the smaller die may not include the actual DSPs for TrueAudio. The rest of the GPU would still be GCN 1.1.
  • BMNify - Monday, May 26, 2014 - link

    the bag ladies that AMD pulls off the street can only sort through so many failed cores to re-classify them as these lower-tier, PR re-badged SoCs. They will get to the next GCN 1.1 crap pile soon enough, so you can buy these cheaper failed cores later...
  • Khenglish - Monday, May 26, 2014 - link

    Interesting that CPU clocks are higher than Richland's, unlike on the desktop. This should mean we get a significant and badly needed CPU performance increase.
  • TheinsanegamerN - Monday, May 26, 2014 - link

    On top of that, Kaveri on the desktop keeps its base CPU clock when the GPU is under heavy load. If AMD can carry that behavior over to mobile (where the Trinity A10's CPU runs at about 1.2 GHz when the GPU is under load, hampering performance), we could see double the CPU performance under full load, which would be huge for AMD.
  • Kevin G - Tuesday, May 27, 2014 - link

    That would be ideal, but it's also a very big 'if' considering the power budgets allocated here.
  • TheinsanegamerN - Tuesday, May 27, 2014 - link

    The A8-7600 did relatively well at 45 watts, so I'm hoping that AMD can pull it off at 35 watts with lower clocks.
  • jamescox - Thursday, May 29, 2014 - link

    Seems to me that anything faster than a Core 2 chip is sufficient for most applications. Most games seem to be GPU-limited at actually useful resolutions. Having the fastest CPU is obviously great marketing, but for mobile use, performance per watt is more important. Where is Kaveri expected to fall?
  • rahulgarg - Monday, May 26, 2014 - link

    That 17W Kaveri looks quite useless, IMO. A 15W Beema would outperform it at much lower cost and power.
  • lkb - Monday, May 26, 2014 - link

    I agree. I also wonder why there are two SKUs, one at 17 watts and one at 19 watts, but nothing between them and 35 watts. It seems to me that an OEM could design their product to accommodate 2 extra watts, but they are left with nothing if their product targets a TDP somewhere between 19 and 35 watts.
  • DanNeely - Monday, May 26, 2014 - link

    The 17W TDP gives a direct match to Intel; with Intel spanking AMD so badly in mobile, offering a cheaper option that can reuse existing cooling designs will probably make some OEMs happy. The lack of an intermediate TDP is almost certainly lack of demand. Intel dropped the 25W point a few years ago, because once performance was no longer so neutered at 17W, demand for 25W chips dried up. OEMs either went for the lower-power parts to be thin and long-lasting, or stayed at 35W for either really cheap stuff or desktop replacements.
  • monstercameron - Monday, May 26, 2014 - link

    I disagree. With more GCN CUs, a dual-channel memory controller, and higher single-threaded performance, it should be faster in games and legacy (read: single-threaded) x86 apps.
  • DanNeely - Monday, May 26, 2014 - link

    The 17W one has less GPU as well as less CPU. The only reason I can see for it making sense is if it gets much lower idle power.
  • gruffi - Monday, May 26, 2014 - link

    No, the 15W Beema wouldn't outperform it in general. In scenarios with up to two threads, Kaveri is faster; Beema is only faster if it can use 3 or 4 cores. Graphics performance should be quite similar: 192 stream processors @ 553 MHz vs 128 stream processors @ 800 MHz. But keep in mind, Beema is still single-channel bottlenecked.
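The "quite similar" claim above checks out with quick arithmetic. A minimal sketch, assuming GCN's usual 2 FLOPs per stream processor per clock (peak theoretical numbers only):

```python
# Peak single-precision throughput: SPs x clock x FLOPs-per-SP-per-clock.
# The 2 FLOPs/SP/clock figure is the usual GCN fused multiply-add rate.
def gflops(stream_processors, clock_mhz, flops_per_clock=2):
    return stream_processors * clock_mhz * flops_per_clock / 1000

kaveri_gpu = gflops(192, 553)  # 17W Kaveri iGPU
beema_gpu = gflops(128, 800)   # 15W Beema iGPU

print(f"Kaveri 192 SP @ 553 MHz: {kaveri_gpu:.1f} GFLOPS")  # ~212.4
print(f"Beema  128 SP @ 800 MHz: {beema_gpu:.1f} GFLOPS")   # ~204.8
```

The raw numbers land within about 4% of each other, which is why the memory subsystem ends up being the deciding factor.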
  • Alexvrb - Monday, May 26, 2014 - link

    Agreed. In 1-2 thread scenarios Kaveri most likely wins. In graphics, Kaveri wins: the clock difference vs. shader count very nearly balances out in raw performance, but dual-channel 1600 beats single-channel 1866 by miles. That said, I still think a 6310 Beema system is more balanced and less likely to get bottlenecked when multitasking.

    Ultimately, which one makes more sense comes down to pricing, and both are absolutely destroyed by the new 19W models. So the only reason to buy something less than the 19W units is if form factor, battery life, or pricing prohibits it.
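The memory bandwidth gap described above can be checked with the standard DDR3 formula (a sketch; theoretical peaks only, 8 bytes per channel per transfer):

```python
# Theoretical DDR3 bandwidth: channels x MT/s x 8 bytes per transfer.
def ddr3_bandwidth_gbs(channels, transfers_mts):
    return channels * transfers_mts * 8 / 1000

kaveri_mem = ddr3_bandwidth_gbs(2, 1600)  # dual-channel DDR3-1600
beema_mem = ddr3_bandwidth_gbs(1, 1866)   # single-channel DDR3-1866

print(f"Dual-channel DDR3-1600:   {kaveri_mem:.1f} GB/s")  # 25.6 GB/s
print(f"Single-channel DDR3-1866: {beema_mem:.1f} GB/s")   # 14.9 GB/s
```

Even with the higher transfer rate, the single-channel setup has well under two-thirds of the bandwidth.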
  • bleh0 - Monday, May 26, 2014 - link

    Hmm... I wonder how much an FX-7600P laptop is going to cost compared to an Iris Pro equipped model, assuming AMD manages to get it into more than just a few designs here and there.
  • MonkeyPaw - Monday, May 26, 2014 - link

    Imagine if AMD could drop an eDRAM die on the FX-7600P like Iris Pro. You would see something quite incredible.
  • TheinsanegamerN - Monday, May 26, 2014 - link

    That... would be epic.
    I wonder, though: why doesn't AMD allow two memory channels on their mobile chips, so we could use 4 memory modules? A little more expensive, sure, but doubling memory bandwidth would go a LONG way toward improving performance vs. chasing higher memory speeds.
  • monstercameron - Monday, May 26, 2014 - link

    That seems like a possible trade-off: between board complexity and space requirements for more memory channels, or higher power consumption for higher clocks.
  • MikeMurphy - Monday, May 26, 2014 - link

    They are already dual channel: one channel per DIMM slot. Only larger laptops allow for 4 memory modules, though, due to space constraints (and probably a power consumption penalty). These days, with 8GB DIMMs commonplace, the need for 4 memory slots is long past. I expect 16GB ought to be enough for 99.9% of the users out there.
  • Drumsticks - Monday, May 26, 2014 - link

    Hell, 8GB is good enough for 98% of users, and 4GB is good enough for 90% of them :P
  • TheinsanegamerN - Monday, May 26, 2014 - link

    It's not a matter of capacity, but rather memory bandwidth. Four 4GB 1600 memory modules would provide 51.2 GB/s theoretical throughput, whereas two 8GB 2133 memory modules only supply a theoretical 34.2 GB/s. This bandwidth problem is why Iris Pro has dedicated cache memory: to take some load off the primary memory bus.
  • Alexvrb - Monday, May 26, 2014 - link

    You don't understand. As Mike tried to point out, AMD already HAS dual memory channels on these chips (and has for years!). Only their low-end/low-power chips have a single channel, such as Beema/Mullins and their predecessors. The number of memory slots doesn't dictate bandwidth without additional channels in the processor. Desktop AMD systems with 4 memory slots... guess what? Still two channels! Same bandwidth potential whether you populate 2 or 4 slots with memory sticks... actually maybe a bit less with 4, since it's harder to hit the same speeds when clocking aggressively.

    What you're REALLY asking for is quad channel, which is silly for many reasons (I won't get into all of them, like chip redesigns). Suffice it to say it would only generally fit in very large laptops (17"+), and it would be far better to just slap a separate graphics chip in there. Whether you go dual graphics or not, the discrete graphics will wipe the floor with a quad-channel integrated GPU solution, and cost about the same.

    What I'd really like to see tested is an FX-7600P rig with a 512+ shader Cape Verde GDDR5 chip paired in dual graphics.
  • Kevin G - Tuesday, May 27, 2014 - link

    With DDR4 on the horizon, this will change shortly. Consumer chips will likely only support one DDR4 DIMM per channel; only server parts supporting registered memory will be able to run two or three DIMMs per channel. Oh, and with DDR4's point-to-point nature, adding more DIMMs could reduce performance, since to reach the second DIMM the signal has to propagate through the first (similar to how FB-DIMMs worked).
  • TheinsanegamerN - Tuesday, May 27, 2014 - link

    "What you're REALLY asking for is quad channel, which is silly for many reasons (I won't get into all of them like chip redesigns). Suffice it to say it would only generally fit in very large laptops (17"+) and it would be far better to just slap a seperate graphics chip in there - whether you go dual graphics or not, the discrete graphics will wipe the floor with a quad channel integrated GPU solution - and cost about the same."
    Considering AMD's driver issues getting dual graphics to work, no, I'd rather have just the integrated.
    Also, I don't see what is so ridiculous about quad channel. Heck, the APU itself already references HAVING two more channels for GDDR5, but AMD won't use them:
    http://www.anandtech.com/show/7702/amd-kaveri-docs...
    Why AMD would cut them out is beyond me, but is it so much to ask for them to use the entire package? Their previous mobile offerings were... poor, seeing as how their "superior" GPUs were only a hair faster than Intel's. I want a competitive chip, not a neutered, held-back mobile chip.
  • Alexvrb - Wednesday, May 28, 2014 - link

    The latest iterations of dual graphics aren't horrible; they're improving with every generation. However, if you'll read what I said, even if you disabled dual graphics the discrete GPU is better.

    Anyway, adding quad channel means a new memory controller, new socket, new chip, higher thermals/power, etc. Not to mention it would only see use in a few systems; most OEMs are going to use a single/dual-channel setup for cost and space reasons, so the quad-channel chips would hardly see any use. Why should AMD bother with a third design when OEMs can just drop in discrete graphics?

    Let me rephrase: a discrete GPU with GDDR5 on a 128-bit+ interface beats an equivalent integrated GPU with quad-channel DDR3. There's no getting around that, even if you ignore all the other factors.
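For rough scale, the inequality above can be put in numbers. The GDDR5 data rate here is an assumption (5 GT/s was typical for mid-range mobile parts of the era), not something stated in the thread:

```python
# Peak bandwidth = bus width in bytes x transfer rate.
def bandwidth_gbs(bus_width_bits, transfers_mts):
    return bus_width_bits / 8 * transfers_mts / 1000

discrete = bandwidth_gbs(128, 5000)      # 128-bit GDDR5 @ 5 GT/s
quad_ddr3 = bandwidth_gbs(4 * 64, 2133)  # quad-channel DDR3-2133

print(f"128-bit GDDR5 @ 5 GT/s: {discrete:.1f} GB/s")    # 80.0 GB/s
print(f"Quad-channel DDR3-2133: {quad_ddr3:.1f} GB/s")   # ~68.3 GB/s
```

The discrete card wins on bandwidth alone, before even counting the dedicated memory pool and the thermal headroom of a separate package.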
  • TheinsanegamerN - Tuesday, May 27, 2014 - link

    Also, in what universe do four SODIMMs require a 17-inch laptop, while 2 SODIMMs plus a dedicated GPU with its own VRAM fit into a 14-inch laptop?
  • Alexvrb - Wednesday, May 28, 2014 - link

    Regardless of channels, how many laptops do you see with 4 memory slots? Are there technical reasons for this beyond space? Yeah, probably. But it doesn't matter: if you want that many RAM slots, it probably isn't going to happen in a smaller system.
  • extide - Tuesday, May 27, 2014 - link

    Four 4GB 1600 modules would only be 25.6 GB/s... it's still only dual channel, even with 4 DIMMs.
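The correction above in formula form: theoretical DDR3 bandwidth is set by the memory controller's channel count, not by how many DIMMs populate those channels (peak numbers only):

```python
# Theoretical DDR3 bandwidth: channels x MT/s x 8 bytes per transfer.
# DIMM count doesn't appear in the formula at all.
def ddr3_bandwidth_gbs(channels, transfers_mts):
    return channels * transfers_mts * 8 / 1000

# Four DDR3-1600 DIMMs on a dual-channel controller still peak at:
print(ddr3_bandwidth_gbs(2, 1600))  # 25.6 (GB/s)
# Hitting 51.2 GB/s would require a (hypothetical) quad-channel controller:
print(ddr3_bandwidth_gbs(4, 1600))  # 51.2 (GB/s)
```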
  • jamescox - Thursday, May 29, 2014 - link

    Is there any reason they can't release an APU meant to be soldered on a board? Combined with some fast memory, also soldered on the board, this could reach discrete-GPU levels of performance. It seems like it would be cheaper than discrete graphics, and only slightly more expensive than integrated graphics with separate memory modules.
  • Flunk - Monday, May 26, 2014 - link

    It will be interesting to see the graphics performance of the FX-7600P. In theory it should be about on par with a mid-range mobile GPU. If it's even slightly close, it could make for a good budget gaming laptop that could be fairly thin, not having to cool a separate GPU. It would take a forward-thinking manufacturer to make one, however.
  • charliem76 - Monday, May 26, 2014 - link

    Given how frequently that happens, I suppose I'll have to settle for a value-line A6 and hope I can find the FX chip separately. That's what I had to do with Trinity to get the A10.
  • Penti - Tuesday, May 27, 2014 - link

    You had better release the Carrizo APUs for mobile platforms right away when you get them out next year if you want a stake in notebooks, AMD!

    They just fail in the notebook space, and it lacks so much thought and planning. Plus they absolutely need business features such as DASH with remote IP-KVM, their vPro/AMT competitor. They can't keep cutting features and graphics performance on notebooks either; it just makes them too weak. Kaveri on notebooks seems pointless without 512 SPs, without the A5 TrustZone core, without aggressive enough clock speeds, without any OEM competition against Intel models. If they could get ultrabook-class performance out of their 19W variants and undersell Intel, maybe it would be interesting, but as it is now one couldn't care less if Intel's mobile processors are 300 dollars. An Intel low-end Haswell-based ultrabook is still just 700 USD. Nobody cares about 50 USD differences in that range anyway; it would need to be a much larger difference.
  • t.s. - Tuesday, May 27, 2014 - link

    Agreed. AMD's pricing is ridiculous right now, whether in the mobile space or on the desktop.
