Compute Unit

Bobcat was pretty simple from a multi-core standpoint. Each Bobcat core had its own private 512KB L2 cache, and all core-to-core communication happened via a bus interface on each of the cores. The cache hierarchy was exclusive, as it had been in all of AMD’s previous architectures.

Jaguar changes everything. AMD defines a Jaguar compute unit as up to four cores sharing a single, large L2 cache. The L2 cache can be up to 2MB in size and is 16-way set associative. It is also inclusive, a first for AMD. In the past AMD always implemented exclusive caches, since an inclusive design duplicates L1 data in the L2 and thus shrinks the effective L2 capacity. The larger shared L2 cache is responsible for up to another 5 - 7% increase in IPC over Bobcat (bringing the total to roughly 22%).
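The capacity and associativity figures above pin down the L2's geometry, and the inclusive-vs-exclusive tradeoff can be put in numbers. A minimal sketch, assuming a 64-byte cache line (the line size is not stated in the text):

```python
# Sketch of the L2 geometry implied by the figures above: 2MB, 16-way
# set associative. The 64-byte line size is an assumption for
# illustration, not something the article states.
CACHE_SIZE = 2 * 1024 * 1024        # shared L2: 2MB
WAYS = 16                           # 16-way set associative
LINE_SIZE = 64                      # assumed line size in bytes

sets = CACHE_SIZE // (WAYS * LINE_SIZE)
print(f"{sets} sets of {WAYS} ways")            # 2048 sets of 16 ways

# Inclusive vs. exclusive effective capacity for one 4-core compute unit:
# an inclusive L2 duplicates the L1 contents, so the combined capacity is
# just the L2 size; an exclusive hierarchy would add the four cores'
# 32KB+32KB L1 pairs on top.
L1_PER_CORE = (32 + 32) * 1024                  # 32KB I + 32KB D
inclusive_capacity = CACHE_SIZE
exclusive_capacity = CACHE_SIZE + 4 * L1_PER_CORE
print(exclusive_capacity - inclusive_capacity)  # 262144 bytes (256KB) given up
```

That 256KB of duplicated L1 data is the cost AMD historically avoided with exclusive hierarchies; against a 2MB L2 it is a much smaller fraction than it would have been against Bobcat's 512KB private L2.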

AMD’s new cache architecture and lower latency core-to-core communication within a Jaguar compute unit means an even greater performance advantage over Bobcat in multithreaded workloads:

Multithreaded Performance Comparison

                                  # of Cores   Cinebench 11.5 (Single Threaded)   Cinebench 11.5 (Multithreaded)
AMD A4-5000 (1.5GHz Jaguar x 4)   4            0.39                               1.5
AMD E-350 (1.6GHz Bobcat x 2)     2            0.32                               0.61
Jaguar Advantage                  100%         21.9%                              145.9%
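The advantage row follows directly from the scores; a quick sketch reproducing it:

```python
# Reproducing the "Advantage" row from the Cinebench scores above.
jaguar = {"cores": 4, "single": 0.39, "multi": 1.5}    # AMD A4-5000
bobcat = {"cores": 2, "single": 0.32, "multi": 0.61}   # AMD E-350

def advantage_pct(new, old):
    """Percentage advantage of `new` over `old`."""
    return (new / old - 1) * 100

for key in ("cores", "single", "multi"):
    pct = advantage_pct(jaguar[key], bobcat[key])
    # prints roughly 100%, 21.9%, and 145.9%, matching the table
    print(f"{key}: {pct:.1f}% advantage")
```

Note that the 145.9% multithreaded gain comes from twice the cores compounded with the ~22% single-threaded IPC/clock advantage, not from the cores alone.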

The L1 caches remain unchanged at 32KB/32KB (I/D cache) per core.

Physical Layout and Synthesis

Bobcat was AMD’s first easily synthesized CPU core, a direct result of the ATI acquisition years before. With Jaguar, AMD made a conscious effort to further reduce the number of unique macros required by the design. The result was a great simplification, which helped AMD port Jaguar between foundries. There is of course an area tradeoff in moving from custom macros to more general designs, but it was deemed worthwhile. Looking at the results, you really can’t argue: a single Jaguar core measures only 3.1mm^2 at 28nm, compared to 4.9mm^2 for a 40nm Bobcat.
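The area tradeoff can be put in perspective with a back-of-the-envelope shrink calculation. A sketch assuming ideal linear-dimension-squared scaling, which real processes never quite achieve:

```python
# Back-of-the-envelope: what would a perfect 40nm -> 28nm optical shrink
# of Bobcat's 4.9mm^2 look like, versus Jaguar's actual 3.1mm^2?
BOBCAT_AREA_40NM = 4.9          # mm^2, from the article
JAGUAR_AREA_28NM = 3.1          # mm^2, from the article

ideal_shrink = BOBCAT_AREA_40NM * (28 / 40) ** 2   # ~2.40mm^2
overhead_pct = (JAGUAR_AREA_28NM / ideal_shrink - 1) * 100

print(f"ideal shrink: {ideal_shrink:.2f}mm^2")
# Jaguar lands ~29% above the ideal shrink -- though some of that gap is
# new functionality versus Bobcat, not just the synthesis-friendly macros.
print(f"Jaguar is {overhead_pct:.0f}% larger than an ideal shrink")
```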

78 Comments

  • Spunjji - Friday, May 24, 2013 - link

    Every CPU manufacturer does that... why would they include numbers they have no control over?
  • araczynski - Thursday, May 23, 2013 - link

    Can anyone clue me in as to how AMD got the rights to make 'PU's for both of the consoles? Was it just bang/$ vs Intel/IBM/etc? Not a fanboy of either camp (amd/intel), just curious.
  • Despoiler - Thursday, May 23, 2013 - link

    It was purely the fact that they have an APU with a high end GPU on it. Intel is nowhere near AMD in terms of top tier graphics power. Nvidia doesn't have x86. The total package price for an APU vs CPU/GPU made it impossible for an Intel/Nvidia solution to compete. The complexity is also much less on an APU system than CPU/GPU. The GPU needs a slot on the mobo. You have to cool it as well as the CPU. Less complexity = less cost.
  • araczynski - Thursday, May 23, 2013 - link

    thanks.
  • tipoo - Thursday, May 23, 2013 - link

    Multiple reasons: AMD has historically been better with console contracts than Nvidia or Intel. Those two want too much control over their own chips, while AMD licenses them out and lets Sony or MS do whatever with them. They're probably also cheaper, and no one else has an all-in-one APU solution with this much GPU grunt yet.
  • araczynski - Thursday, May 23, 2013 - link

    thanks.
  • WaltC - Thursday, May 23, 2013 - link

    Good article!--as usual, it's mainly Anand's conclusions that I find wanting...;) Nobody's "handing" AMD anything, as far as I can see. AMD is far, far ahead of Intel on the gpu front and has been for years--no accident there. AMD earned whatever position it now enjoys--and it's the only company in existence to go head-to-head with Intel and beat them, and not just "once," as Anand points out. Indeed, we can thank AMD for Core 2 and x86-64; had it been Intel's decision to make we'd all have been puttering happily away on dog-slow, ultra-expensive Itanium derivatives of one kind or another. (What a nightmare!) Intel invested billions in world-wide retool for Rdram while AMD pushed the superior market alternative, DDR Sdram. AMD won out there, too. There are many examples of AMD's hard work, ingenuity, common sense and lack of greed besting Intel--far more than just two. It's no accident that AMD is far ahead of Intel here: as usual, AMD's been headed in one direction and Intel in another, and AMD gets there first.

    But I think I know what Anand means, and that's that AMD cannot afford to sit on its laurels. There's nothing here to milk--AMD needs to keep the R&D pedal to the metal if the company wants to stay ahead--absolutely. Had the company done that pre-Core 2, while Intel was telling us all that we didn't "need" 64-bits on the desktop, AMD might have remained out front. The company is under completely different management now, so we can hope for the best, as always. Competition is the wheel that keeps everything turning, etc.
  • Sabresiberian - Thursday, May 23, 2013 - link

    The point Anandtech was trying to make is that no one is stepping up to compete with AMD's Jaguar, and so they are handing that part of the business to AMD - just as AMD handed their desktop CPU business to Intel by deciding not to step up on that front. If you don't do what it takes to compete, you are "handing" the business to those who do. This is a compliment to AMD and something of a slam on the other guys, not a suggestion that AMD needed some kind of charity to stay in business here.

    I want to suggest that you are letting a bit of fanboyism color your reaction to what others say. :)

    Perhaps if AMD had been a bit more "greedy" like Intel is in your eyes, they wouldn't have come so close to crashing permanently. Whatever, it has been very good to see them get some key people back, and that inspires hope in me for the company and the competition it brings to bear. We absolutely need someone to kick Intel in the pants!

    Good to see them capture the console market (the two biggest, anyway). Unfortunately, as a PC gamer that hates the fact that many games are made at console levels, I don't see the new generation catching up like they did back when the PS3 and Xbox 360 were released. It looks to me like we will still have weaker consoles to deal with - better than the previous gen, but still not up to mainstream PC standards, never mind high-end. The fact that many developers have been making full-blown PC versions from the start instead of tacking on rather weak ports a year later is more hopeful than the new console hardware, in my opinion.
  • blacks329 - Thursday, May 23, 2013 - link

    I honestly expect the fact that both PS4 and X1 are x86 will benefit PC games quite significantly as well. Last gen, devs initially developed for the 360 and ported over to the PS3 and PC, and later in the gen shifted to the PS3 as the lead platform, with some using PCs. I expect now, since porting to PS4 and X1 will be significantly easier, PC will eventually become the lead platform and will scale down accordingly for the PS4 and X1.

    As someone who games more on consoles than PCs, I'm really excited for both platforms as devs can spend less time tweaking on a per platform basis and spend more time elsewhere.
  • FearfulSPARTAN - Thursday, May 23, 2013 - link

    Actually I'm pretty sure 90% still made the games on Xbox first and then ported to other platforms. However, with all of them (excluding the Wii U) being x86, the idea of them porting down from PC is quite possible, and I didn't think about that. It would probably start to happen mid to late in the gen though.
