Final Words

Bobcat was a turning point for AMD. The easily synthesized, low-cost CPU design found its way into the nearly 50 million Brazos systems AMD has sold since its introduction. Jaguar improves upon Bobcat in a major way. The move to 28nm helps drive power even lower, which will finally get AMD into tablet designs with Temash. Despite being lower power, Jaguar also manages to increase performance appreciably: AMD claims up to a 22% gain in IPC over Bobcat. Combine the IPC gains with a more multi-core friendly design and Jaguar-based APUs should be noticeably faster than their predecessors.

One of the few real weaknesses of Jaguar is the lack of aggressive turbo modes in any of the shipping implementations of the design. It appears that the first implementations of Jaguar were under time constraints, leaving many features (including improved thermal monitoring/management and turbo boost) on the cutting room floor. Kabini and Temash seem ripe for a mid-cycle update enabling turbo across more parts, which could do wonders for single-threaded performance.

The Jaguar power story actually looks very good; it's just hampered by traditional PC legacy. None of the launch APUs here support the low-power IOs necessary to drive platform power down even further. AMD is getting very close though. Jaguar's core power is easily sub-2W for lightweight tablet tasks, but the rest of the platform (excluding the display) drives total power up to 4 - 7W. AMD definitely has the right building blocks to go after truly low-power tablets in a major way, should it have the resources and bandwidth to do so.

In its cost and power band, Jaguar is presently without competition. Intel’s current 32nm Saltwell Atom core is outdated, and nothing from ARM is quick enough. It’s no wonder that both Microsoft and Sony elected to use Jaguar as the base for their next-generation console SoCs; there simply isn’t a better option today. As Intel transitions to its 22nm Silvermont architecture, however, Jaguar will finally get some competition. For the next few months though, AMD will enjoy a position it hasn’t had in years: a CPU performance advantage.

I can’t stress enough how important it is that AMD continues to focus on driving the single-threaded performance of its cat line of cores. Second chances are rare in this business, but that’s exactly what AMD has been offered with the rise of good-enough computing. Jaguar vs. Atom is the best CPU story AMD has had in years. Regular updates to the architecture, coupled with solid execution, are necessary to ensure that history doesn’t repeat itself in a new segment of AMD’s business.

Long term, I can’t help but wonder what Bobcat’s success will do to shape AMD’s future microarchitecture decisions. I’m not sure what Jim Keller’s SoC project is, but I’m wondering if the days of really big cores might be over. I don’t know that really small cores are the answer either, but perhaps something in between...

Comments

  • skatendo - Friday, May 24, 2013

    Not entirely true. The Wii U CPU is highly customized and has enhancements not found in typical PowerPC processors. It's been completely tailored for gaming. I'm not saying it has the power of the newer Jaguar chips, but the beauty of custom silicon is that you can do much more with less (Tegra 3's quad-core CPU and 12-core GPU vs. Apple's dual-core A5 CPU/GPU, anyone? Yeah, the A5 kicked its arse for games). That's why Nintendo didn't release tech specs: they tailored a system for games, and performance will manifest in upcoming games (not these sloppy ports we've seen so far).
  • tipoo - Friday, May 24, 2013

    I'm aware it would be highly customized, but a plethora of developers have also come out and said the CPU sucks.
  • skatendo - Saturday, May 25, 2013

    Also the "plethora" of developers that said it sucked (namely the Metro: Last Light dev) said they had an early build of the Wii U SDK and said it was "slow". Having worked for a developer, they base their opinions on how fast/efficient they can port over their game. The Wii U is a totally different infrastructure that lazy devs don't want to take the time to learn, especially with a newer GPGPU.
  • Kevin G - Sunday, May 26, 2013

    If a developer wants to do GPGPU, the PS4 and Xbox One will be highly preferable due to their unified virtual memory space. If GPGPU was Nintendo's strategy, they shouldn't have picked a GPU from the Radeon 6000 generation. Sure, it can do GPGPU, but there are far more compromises to hand off the workload.
  • Simen1 - Thursday, May 23, 2013

    What are the TDP and die size of the APUs in the Xbox One and PlayStation 4?
  • haukionkannel - Thursday, May 23, 2013

    Double the 1.6 GHz 4-core version and you are near. The wider memory controller eats some extra energy too, so maybe you have to add 0.2 to 0.3 to the calculation...
  • fellix - Thursday, May 23, 2013

    "The L2 cache is also inclusive, a first in AMD’s history."

    Not exactly correct. The very first Athlon (K7) on Slot A with off-die L2 used an inclusive cache hierarchy. All models after that moved to an exclusive design.
  • Exophase - Thursday, May 23, 2013

    Bulldozer is also mostly inclusive. Not strictly inclusive, but certainly not exclusive (you really wouldn't get such a thing from a write-through L1 cache).
  • whyso - Thursday, May 23, 2013

    Ahh AMD, I love your marketing slides. Let's compare battery life and EXCLUDE the screen. Never mind that the screen consumes a large amount of power, and that when you add it in, the total battery life savings go down tremendously. (That's why Sandy Bridge -> Ivy Bridge didn't improve battery life that much on mobile.) Let's also leave out the rest-of-system power and SoC power for Brazos. It also looks like the system is using an SSD to generate these numbers, which, looking at the target market, almost no OEM will do.
  • extide - Thursday, May 23, 2013

    It's a perfectly valid comparison to make. All laptops will include a screen and the screen has nothing to do with AMD (or Intel).
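
A side note on the inclusive vs. exclusive cache question raised by fellix and Exophase above: the distinction is easy to make concrete with a toy model. The sketch below (Python, purely illustrative; the fully associative LRU caches, line granularity, and the TwoLevelCache class are simplifying assumptions, not a model of Jaguar or any real AMD design) shows the defining behaviors: an inclusive L2 back-invalidates L1 when it evicts a line, while an exclusive L2 acts as a victim cache that receives L1's evictions.

```python
# Toy model of inclusive vs. exclusive two-level caching.
# Illustrative only: fully associative LRU caches at line granularity,
# no write policies modeled.
from collections import OrderedDict

class TwoLevelCache:
    def __init__(self, l1_lines, l2_lines, inclusive):
        self.l1 = OrderedDict()  # line address -> True, kept in LRU order
        self.l2 = OrderedDict()
        self.l1_lines = l1_lines
        self.l2_lines = l2_lines
        self.inclusive = inclusive

    def access(self, addr):
        if addr in self.l1:                # L1 hit
            self.l1.move_to_end(addr)
            return "L1 hit"
        if addr in self.l2:                # L2 hit: promote into L1
            if self.inclusive:
                self.l2.move_to_end(addr)  # copy stays in L2 (inclusion)
            else:
                del self.l2[addr]          # exclusive: line leaves L2
            self._fill_l1(addr)
            return "L2 hit"
        if self.inclusive:                 # miss in both: fetch from memory
            self._fill_l2(addr)            # inclusive: allocate in L2 as well
        self._fill_l1(addr)                # exclusive: allocate in L1 only
        return "miss"

    def _fill_l1(self, addr):
        if len(self.l1) >= self.l1_lines:
            victim, _ = self.l1.popitem(last=False)  # evict LRU line
            if not self.inclusive:
                self._fill_l2(victim)      # exclusive: demote victim to L2
        self.l1[addr] = True

    def _fill_l2(self, addr):
        if len(self.l2) >= self.l2_lines:
            victim, _ = self.l2.popitem(last=False)
            if self.inclusive:
                # Inclusion property: a line evicted from L2 may not
                # remain in L1, so back-invalidate it there.
                self.l1.pop(victim, None)
        self.l2[addr] = True

inc = TwoLevelCache(l1_lines=2, l2_lines=4, inclusive=True)
exc = TwoLevelCache(l1_lines=2, l2_lines=4, inclusive=False)
for a in [0, 1, 2, 3, 4, 5, 0, 1]:
    inc.access(a)
    exc.access(a)
print(sorted(inc.l1), sorted(inc.l2))  # every L1 line is also in L2
print(sorted(exc.l1), sorted(exc.l2))  # L1 and L2 hold disjoint lines
```

Running it shows why each policy gets used: the inclusive hierarchy caps unique capacity at the L2 size but makes coherence snoops cheap (the L2 always knows what L1 holds), while the exclusive hierarchy holds up to L1 + L2 unique lines at the cost of shuffling victims between levels.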
