Kabini: Mainstream APU for Notebooks

AMD will be building two APUs based on Jaguar: Kabini and Temash. Kabini is AMD’s mainstream APU, which you can expect to see in ultra-thin, affordable notebooks. Note that both of these are full-blown SoCs by conventional definitions - the IO hub is integrated into the monolithic die. By that definition, Kabini ends up being the first quad-core x86 SoC.

Kabini will carry A and E series branding, and will be available in a full quad-core version (A series) as well as dual-core (E series). The list of Kabini parts launching is below:

On the GPU side we have a 2 Compute Unit implementation of AMD’s Graphics Core Next architecture. The geometry engine has been culled a bit (1/4 primitive per clock) in order to make the transition into these smaller/low cost APUs. Double precision is supported at 1/16 rate, although adds and some muls will run at 1/8 the single precision rate.
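To put the 2-CU configuration in perspective, peak throughput works out roughly as follows. This is a back-of-the-envelope sketch: the 64-ALU CU width is standard for GCN, but the 500MHz clock is an assumption for illustration, not a figure from the article.

```python
# Back-of-the-envelope peak throughput for a 2-CU GCN part.
# Assumed figure (not from the article): a 500MHz GPU clock.
CUS = 2
ALUS_PER_CU = 64       # GCN Compute Unit width
FLOPS_PER_CLOCK = 2    # one FMA counts as two floating-point ops
CLOCK_GHZ = 0.5        # hypothetical clock; varies by SKU

sp_gflops = CUS * ALUS_PER_CU * FLOPS_PER_CLOCK * CLOCK_GHZ
dp_gflops = sp_gflops / 16  # double precision runs at 1/16 the SP rate

print(f"SP: {sp_gflops:.0f} GFLOPS, DP: {dp_gflops:.0f} GFLOPS")
# SP: 128 GFLOPS, DP: 8 GFLOPS
```

Even at a modest clock, single precision throughput sits comfortably above what the CPU cores can manage; the 1/16 DP rate makes clear these parts aren't aimed at compute workloads.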

Kabini features a single 64-bit DDR3 memory controller and ranges in TDPs from 9W to 25W. Although Jaguar supports dynamic frequency boosting (aka Turbo mode), the feature isn’t present/enabled on Kabini - all of the CPU clocks noted in the table above are the highest you’ll see regardless of core activity.
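A single 64-bit channel caps peak memory bandwidth at a fairly modest figure. The arithmetic below assumes DDR3-1600, a speed grade chosen for illustration; actual supported speeds vary by SKU.

```python
# Peak theoretical bandwidth of one 64-bit DDR3 channel.
# DDR3-1600 (1600 mega-transfers/second) is assumed here.
BUS_WIDTH_BYTES = 64 // 8     # 64-bit bus = 8 bytes per transfer
TRANSFERS_PER_SEC = 1600e6

bandwidth_gbs = BUS_WIDTH_BYTES * TRANSFERS_PER_SEC / 1e9
print(f"{bandwidth_gbs:.1f} GB/s")  # 12.8 GB/s peak
```

That bandwidth is shared between the CPU cores and the GPU, which is one reason the GPU side is kept narrow.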

We have a separate review focusing on the performance of AMD’s A4-5000 Kabini APU live today as well.

Temash: Entry Level APU for Tablets

While Kabini will go into more traditional notebook designs, Temash will head down into the tablet space. The Temash TDPs range from 3.9W all the way up to 9W. Of the three Temash parts launching today, two are dual-core designs with the highest end A6-1450 boasting 4 cores as well as support for turbo core. The A6-1450’s turbo core implementation also enables TDP sharing between the CPU and GPU cores (idle CPUs can be power gated and their thermal budget given to the GPU, and vice versa).

The A4-1200 is quite interesting as it carries a sub-4W TDP, low enough to make it into an iPad-like form factor. It’s also important to note that AMD doesn’t actually reduce the number of GPU cores in any of the Temash designs, it just scales down clock speed.

Xbox One & PlayStation 4

In both our Xbox One and PS4 articles I referred to the SoCs as using two Jaguar compute units - now you can understand why. Both designs incorporate two quad-core Jaguar modules, each with their own shared 2MB L2 cache. Communication between the modules isn’t ideal, so we’ll likely see both consoles prefer that related tasks run on the same module.
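On the PC side, the same concern maps directly to CPU affinity: cooperating threads want to stay within one module so they share its 2MB L2 instead of communicating across modules. A minimal Linux sketch, assuming cores 0-3 make up the first Jaguar module (the actual core numbering is platform-specific, and `pin_to_module` is a hypothetical helper):

```python
import os

MODULE0_CORES = {0, 1, 2, 3}  # assumed: core IDs of the first module

def pin_to_module(core_ids):
    """Restrict the calling thread to one module's cores (Linux only)."""
    if not hasattr(os, "sched_setaffinity"):
        return False                       # affinity API not available
    allowed = core_ids & os.sched_getaffinity(0)
    if not allowed:                        # none of those cores exist here
        return False
    os.sched_setaffinity(0, allowed)       # pin to the surviving cores
    return True
```

Windows exposes the same idea through `SetThreadAffinityMask`; console SDKs presumably give developers equivalent (or finer) control over thread placement.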

Looking at Kabini, we have a good idea of the dynamic range for Jaguar on TSMC’s 28nm process: 1GHz - 2GHz. Right around 1.6GHz seems to be the sweet spot, as going to 2GHz requires a 66% increase in TDP.
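The trade-off is easy to put in numbers using two of the quad-core SKUs. The 1.5GHz/15W and 2.0GHz/25W pairings below are my reading of the launch parts, so treat them as illustrative:

```python
# Frequency vs. TDP at the top of Kabini's range (assumed SKU figures).
clock_lo, tdp_lo = 1.5, 15.0   # e.g. a 1.5GHz / 15W quad-core part
clock_hi, tdp_hi = 2.0, 25.0   # e.g. a 2.0GHz / 25W quad-core part

clock_gain = clock_hi / clock_lo - 1   # ~33% more frequency...
tdp_cost = tdp_hi / tdp_lo - 1         # ...for ~67% more TDP

print(f"+{clock_gain:.0%} clock costs +{tdp_cost:.0%} TDP")
```

In other words, the last few hundred MHz are disproportionately expensive, which is exactly why the consoles settle well below 2GHz.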

The major difference between AMD’s Temash/Kabini Jaguar implementations and what’s done in the consoles really comes down to the unified memory addressing work and whatever coherency is supported on those platforms. Memory buses are obviously very different as well, but the CPU cores themselves are pretty much identical to what we’ve outlined here.

View All Comments

  • tipoo - Thursday, May 23, 2013 - link

    Multiple reasons. AMD has historically been better with console contracts than Nvidia or Intel; those two want too much control over their own chips, while AMD licenses them out and lets Sony or MS do whatever they want with them. They're probably also cheaper, and no one else has an all-in-one APU solution with this much GPU grunt yet.
  • araczynski - Thursday, May 23, 2013 - link

    thanks.
  • WaltC - Thursday, May 23, 2013 - link

    Good article!--as usual, it's mainly Anand's conclusions that I find wanting...;) Nobody's "handing" AMD anything, as far as I can see. AMD is far, far ahead of Intel on the gpu front and has been for years--no accident there. AMD earned whatever position it now enjoys--and it's the only company in existence to go head-to-head with Intel and beat them, and not just "once," as Anand points out. Indeed, we can thank AMD for Core 2 and x86-64; had it been Intel's decision to make we'd all have been puttering happily away on dog-slow, ultra-expensive Itanium derivatives of one kind or another. (What a nightmare!) Intel invested billions in a world-wide retool for Rdram while AMD pushed the superior market alternative, DDR Sdram. AMD won out there, too. There are many examples of AMD's hard work, ingenuity, common sense and lack of greed besting Intel--far more than just two. It's no accident that AMD is far ahead of Intel here: as usual, AMD's been headed in one direction and Intel in another, and AMD gets there first.

    But I think I know what Anand means, and that's that AMD cannot afford to rest on its laurels. There's nothing here to milk--AMD needs to keep the R&D pedal to the metal if the company wants to stay ahead--absolutely. Had the company done that pre-Core 2, while Intel was telling us all that we didn't "need" 64-bits on the desktop, AMD might have remained out front. The company is under completely different management now, so we can hope for the best, as always. Competition is the wheel that keeps everything turning, etc.
  • Sabresiberian - Thursday, May 23, 2013 - link

    The point Anandtech was trying to make is that no one is stepping up to compete with AMD's Jaguar, and so they are handing that part of the business to AMD - just as AMD handed their desktop CPU business to Intel by deciding not to step up on that front. If you don't do what it takes to compete, you are "handing" the business to those who do. This is a compliment to AMD and something of a slam on the other guys, not a suggestion that AMD needed some kind of charity to stay in business here.

    I want to suggest that you are letting a bit of fanboyism color your reaction to what others say. :)

    Perhaps if AMD had been a bit more "greedy" like Intel is in your eyes, they wouldn't have come so close to crashing permanently. Whatever, it has been very good to see them get some key people back, and that inspires hope in me for the company and the competition it brings to bear. We absolutely need someone to kick Intel in the pants!

    Good to see them capture the console market (the two biggest, anyway). Unfortunately, as a PC gamer that hates the fact that many games are made at console levels, I don't see the new generation catching up like they did back when the PS3 and Xbox 360 were released. It looks to me like we will still have weaker consoles to deal with - better than the previous gen, but still not up to mainstream PC standards, never mind high-end. The fact that many developers have been making full-blown PC versions from the start instead of tacking on rather weak ports a year later is more hopeful than the new console hardware, in my opinion.
  • blacks329 - Thursday, May 23, 2013 - link

    I honestly expect the fact that both PS4 and X1 are x86 will benefit PC games quite significantly as well. Last gen devs initially developed for 360 and ported over to the PS3 and PC and later in the gen shifted to PS3 as the lead architect with some using PCs. I expect now, since porting to PS4 and X1 will be significantly easier, PC will eventually become the lead platform and will scale down accordingly for the PS4 and X1.

    As someone who games more on consoles than PCs, I'm really excited for both platforms as devs can spend less time tweaking on a per platform basis and spend more time elsewhere.
  • FearfulSPARTAN - Thursday, May 23, 2013 - link

    Actually I'm pretty sure 90% still made their games on Xbox first and then ported to other platforms, but with all of them (excluding the Wii U) being x86, the idea of porting down from PC is quite possible, and I hadn't thought about that. It would probably start to happen mid to late in the gen though.
  • blacks329 - Thursday, May 23, 2013 - link

    I know it's definitely not that high for any individual platform, but I do remember a lot of major publishers - Ubi, EA and a bunch of other smaller studios - saying (early-mid gen) that because porting to PS3 was such a nightmare and so resource intensive, it was more efficient to spend extra resources initially, use the PS3 as the lead, and then have it ported over to 360, which was significantly easier.

    While I'm sure quite a large chunk still use the 360 as their lead platform, I would say 90% was probably only true very early in this gen, and the split has since become much closer between 360 and PS3.

    Although at this point both architectures are well enough understood, and accounted for by most engines, that it should be easy to develop for both regardless of which platform you start with.
  • mr_tawan - Sunday, May 26, 2013 - link

    I don't think using x86 will benefit devs as much as many expect. Sure, using the same hardware-level architecture may simplify low-level code like asm, but seriously, I don't think many devs use asm intensively anymore. (I worked on current-gen console titles for a little while and never wrote a single line of asm.) Current-gen games are complex and need the best software architecture, otherwise you end up with a delayed-to-death shipping schedule. Using asm would be premature optimisation that gains little to nothing.

    What would really affect devs heavily is the SDK. XB1 uses a custom OS, but the SDK should be close to Windows' DirectX (just like the XB360's). PS4, if it's done in the same fashion as the PS3, would use a custom-made SDK with an OpenGL/OpenGL ES API (the PS3 uses OpenGL ES, if I'm not mistaken). It needs another layer of abstraction to make it fully cross-platform, just like the current generation.

    The one thing that might be shared across the two platforms is the shader code, if AMD can convince both MS and Sony to use the same language.

    These are only guesses; I might be wrong.
  • mganai - Thursday, May 23, 2013 - link

    That, and Intel's been making a bigger push for the smartphone market; it even says so in the article!

    Silvermont should change things up quite favorably.
  • mschira - Thursday, May 23, 2013 - link

    Well, all this is pointless if nobody makes good hardware using it.
    It's the old story. Last generation, Trinity would have allowed very decent mid-range notebooks with very long battery run time and more than sufficient power at reasonably low cost.

    Have we seen anything?

    So where is a nice 11" Trinity Laptop?
    Or a 10" Brazos?
    All we get is either horrible cheap Atom or expensive ULV Core offerings.

    Are the hardware makers afraid that AMD can't deliver enough chips?
    Are they worried about stepping on Intel's toes?
    Or are they simply uncreative, all running in the same direction some stupid mainstream guide tells them to?

    I suspect it is largely the latter - most current notebooks are simply uncreative. The loss of sales comes as no surprise, I think. And it's not all M$'s fault.
