Kabini: Mainstream APU for Notebooks

AMD will be building two APUs based on Jaguar: Kabini and Temash. Kabini is AMD’s mainstream APU, which you can expect to see in ultra-thin, affordable notebooks. Note that both of these are full-blown SoCs by conventional definitions - the IO hub is integrated into the monolithic die. By that definition, Kabini ends up being the first quad-core x86 SoC.

Kabini will carry A and E series branding, and will be available in a full quad-core version (A series) as well as dual-core (E series). The list of Kabini parts launching is below:

On the GPU side we have a 2 Compute Unit implementation of AMD’s Graphics Core Next architecture. The geometry engine has been cut down a bit (1/4 primitive per clock) in order to make the transition into these smaller/lower-cost APUs. Double precision is supported at 1/16 the single-precision rate, although adds and some muls will run at 1/8 rate.
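
To put the 2 CU configuration in perspective, here’s a quick back-of-the-envelope peak throughput calculation. The CU count and the 1/16 double-precision rate come from above; the 64 ALUs per CU figure is standard for GCN, and the 500MHz GPU clock is purely an illustrative assumption (actual clocks vary by SKU, per the table above).

```python
# Back-of-the-envelope peak FLOPS for a 2-CU GCN implementation.
# Assumptions not taken from the article: 64 ALUs per CU (standard GCN),
# FMA counted as 2 FLOPs/clock, and an illustrative 500MHz GPU clock.
CUS = 2
ALUS_PER_CU = 64
FLOPS_PER_ALU_PER_CLOCK = 2       # one fused multiply-add per clock
GPU_CLOCK_HZ = 500e6              # illustrative only; varies by SKU

sp_peak = CUS * ALUS_PER_CU * FLOPS_PER_ALU_PER_CLOCK * GPU_CLOCK_HZ
dp_peak = sp_peak / 16            # 1/16 the single-precision rate

print(f"Peak SP: {sp_peak / 1e9:.1f} GFLOPS")  # ~128 GFLOPS at 500MHz
print(f"Peak DP: {dp_peak / 1e9:.1f} GFLOPS")  # ~8 GFLOPS at 500MHz
```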

Kabini features a single 64-bit DDR3 memory controller and ranges in TDP from 9W to 25W. Although Jaguar supports dynamic frequency boosting (aka Turbo mode), the feature isn’t present/enabled on Kabini - all of the CPU clocks noted in the table above are the highest you’ll see regardless of core activity.
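
For context, the peak bandwidth of a single 64-bit channel is easy to work out. The DDR3-1600 transfer rate below is an assumption for illustration; the actual supported memory speed depends on the SKU.

```python
# Peak theoretical bandwidth of a single 64-bit DDR3 channel.
# DDR3-1600 is assumed here as an example; supported speeds vary by SKU.
BUS_WIDTH_BITS = 64
TRANSFERS_PER_SEC = 1600e6        # DDR3-1600 = 1600 MT/s

bytes_per_sec = (BUS_WIDTH_BITS / 8) * TRANSFERS_PER_SEC
print(f"Peak bandwidth: {bytes_per_sec / 1e9:.1f} GB/s")  # 12.8 GB/s
```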

We have a separate review focusing on the performance of AMD’s A4-5000 Kabini APU live today as well.

Temash: Entry Level APU for Tablets

While Kabini will go into more traditional notebook designs, Temash will head down into the tablet space. The Temash TDPs range from 3.9W all the way up to 9W. Of the three Temash parts launching today, two are dual-core designs with the highest end A6-1450 boasting 4 cores as well as support for turbo core. The A6-1450’s turbo core implementation also enables TDP sharing between the CPU and GPU cores (idle CPUs can be power gated and their thermal budget given to the GPU, and vice versa).
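
To make the TDP-sharing idea concrete, here’s a minimal conceptual sketch of a fixed package budget being reallocated between the CPU and GPU. This is not AMD’s actual turbo core algorithm - the budget figures, function name, and allocation policy are all made up for illustration.

```python
# Conceptual sketch of CPU/GPU TDP sharing under a fixed package budget.
# NOT AMD's turbo core implementation - the numbers and policy are invented.
PACKAGE_TDP_W = 8.0    # hypothetical Temash-class package budget
CPU_BASE_W = 4.0       # nominal CPU allocation
GPU_BASE_W = 4.0       # nominal GPU allocation

def allocate_budget(cpu_load: float, gpu_load: float) -> tuple[float, float]:
    """Return (cpu_watts, gpu_watts) for load fractions in [0, 1]."""
    cpu_demand = CPU_BASE_W * cpu_load
    gpu_demand = GPU_BASE_W * gpu_load
    headroom = max(PACKAGE_TDP_W - cpu_demand - gpu_demand, 0.0)
    # Idle cores are power gated, so their unused budget goes to the busier side.
    if cpu_load >= gpu_load:
        return cpu_demand + headroom, gpu_demand
    return cpu_demand, gpu_demand + headroom

# Mostly-idle CPU: the GPU picks up the spare thermal budget.
print(allocate_budget(cpu_load=0.2, gpu_load=1.0))  # (0.8, 7.2)
```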

The A4-1200 is quite interesting as it carries a sub-4W TDP, low enough to make it into an iPad-like form factor. It’s also important to note that AMD doesn’t actually reduce the number of GPU cores in any of the Temash designs, it just scales down clock speed.

Xbox One & PlayStation 4

In both our Xbox One and PS4 articles I referred to the SoCs as using two Jaguar compute units - now you can understand why. Both designs incorporate two quad-core Jaguar modules, each with their own shared 2MB L2 cache. Communication between the modules isn’t ideal, so we’ll likely see both consoles prefer that related tasks run on the same module.
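
To illustrate why module-aware scheduling matters, here’s a minimal Linux-only sketch using Python’s os.sched_setaffinity to keep a process on one quad-core module, so its threads share that module’s L2 instead of communicating across modules. The console SDKs expose their own affinity APIs; the core-to-module numbering below is an assumption, not something either console documents publicly.

```python
# Minimal Linux sketch: pin a process to one quad-core Jaguar module so its
# threads share that module's 2MB L2 rather than bouncing data across modules.
# Core IDs 0-3 / 4-7 mapping to module 0 / 1 is an assumption (OS-dependent).
import os

MODULE_0 = {0, 1, 2, 3}
MODULE_1 = {4, 5, 6, 7}

def pin_to_module(cores: set[int]) -> None:
    """Restrict the calling process (pid 0 = self) to the given core set."""
    os.sched_setaffinity(0, cores)

pin_to_module(MODULE_0)
print("Now running on cores:", os.sched_getaffinity(0))
```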

Looking at Kabini, we have a good idea of the dynamic range for Jaguar on TSMC’s 28nm process: 1GHz - 2GHz. Right around 1.6GHz seems to be the sweet spot, as going to 2GHz requires a 66% increase in TDP.

The major change between AMD’s Temash/Kabini Jaguar implementations and what’s done in the consoles is really all of the unified memory addressing work and any coherency that’s supported on those platforms. Memory buses are obviously very different as well, but the CPU cores themselves are pretty much identical to what we’ve outlined here.

Comments

  • skatendo - Friday, May 24, 2013 - link

    Not entirely true. The Wii U CPU is highly customized and has enhancements not found in typical PowerPC processors. It's been completely tailored for gaming. I'm not saying it has the power of the newer Jaguar chips, but the beauty of custom silicon is that you can do much more with less (Tegra 3's quad-core CPU and 12-core GPU vs. Apple's A5 dual-core CPU/GPU anyone? yeah, the A5 kicked its arse for games). That's why Nintendo didn't release tech specs - they tailored a system for games, and performance will manifest with upcoming games (not these sloppy ports we've seen so far).
  • tipoo - Friday, May 24, 2013 - link

    I'm aware it would be highly customized, but a plethora of developers have also come out and said the CPU sucks.
  • skatendo - Saturday, May 25, 2013 - link

    Also, the "plethora" of developers that said it sucked (namely the Metro: Last Light dev) were working with an early build of the Wii U SDK when they called it "slow". Having worked for a developer, I can say they base their opinions on how fast/efficiently they can port over their game. The Wii U is a totally different architecture that lazy devs don't want to take the time to learn, especially with a newer GPGPU.
  • Kevin G - Sunday, May 26, 2013 - link

    If a developer wants to do GPGPU, the PS4 and Xbox One will be highly preferable due to their unified virtual memory space. If GPGPU was Nintendo's strategy, they shouldn't have picked a GPU from the Radeon 6000 generation. Sure, it can do GPGPU, but there are far more compromises to hand off the workload.
  • Simen1 - Thursday, May 23, 2013 - link

    What is the TDP and die size of the APUs in X-Box One and Playstation 4?
  • haukionkannel - Thursday, May 23, 2013 - link

    Double the 1.6 GHz 4-core version and you are near. The wider memory controller eats some extra energy too, so maybe you have to add 0.2 to 0.3 to the calculation...
  • fellix - Thursday, May 23, 2013 - link

    "The L2 cache is also inclusive, a first in AMD’s history."

    Not exactly correct. The very first Athlon (K7) on Slot A with off-die L2 used an inclusive cache hierarchy. All models after that moved to an exclusive design.
  • Exophase - Thursday, May 23, 2013 - link

    Bulldozer is also mostly inclusive. Not strictly inclusive, but certainly not exclusive (you really wouldn't get such a thing from a write-through L1 cache).
  • whyso - Thursday, May 23, 2013 - link

    Ahh AMD, I love your marketing slides. Let's compare battery life and EXCLUDE the screen. Never mind that the screen consumes a large amount of power, and that when you add it to the total, the battery life savings go down tremendously. (That's why Sandy -> Ivy Bridge didn't improve battery life that much on mobile.) Let's also leave out the rest-of-system power and SoC power for Brazos. It also looks like the system is using an SSD to generate these numbers, which, looking at the target market, almost no OEM will do.
  • extide - Thursday, May 23, 2013 - link

    It's a perfectly valid comparison to make. All laptops will include a screen and the screen has nothing to do with AMD (or Intel).
