Compute Unit

Bobcat was pretty simple from a multi-core standpoint. Each Bobcat core had its own private 512KB L2 cache, and all core-to-core communication happened via a bus interface on each of the cores. The cache hierarchy was exclusive, as has been the case with all of AMD’s previous architectures.

Jaguar changes everything. AMD defines a Jaguar compute unit as up to four cores with a single, large, shared L2 cache. The L2 can be up to 2MB in size and is 16-way set associative. It is also inclusive, a first in AMD’s history. In the past AMD always implemented exclusive caches, since an inclusive design duplicates L1 data in the L2 and thus reduces effective cache capacity. The larger shared L2 cache is responsible for up to another 5-7% increase in IPC over Bobcat (bringing the total gain to ~22%).
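To make the organization concrete, here is a minimal sketch of the shared L2's geometry. The 2MB size and 16-way associativity come from AMD; the 64-byte line size is an assumption on my part (a typical value that isn't stated here):

```python
# Back-of-the-envelope geometry for the shared L2 in one Jaguar compute unit.
# The 2MB size and 16-way associativity come from AMD; the 64-byte line size
# is an assumed, typical value.
L2_SIZE   = 2 * 1024 * 1024   # bytes
WAYS      = 16
LINE_SIZE = 64                # assumed

sets = L2_SIZE // (WAYS * LINE_SIZE)       # 2048 sets
index_bits = sets.bit_length() - 1         # 11 address bits select a set
offset_bits = LINE_SIZE.bit_length() - 1   # 6 address bits select a byte in a line

print(sets, index_bits, offset_bits)       # 2048 11 6
```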

AMD’s new cache architecture and lower latency core-to-core communication within a Jaguar compute unit mean an even greater performance advantage over Bobcat in multithreaded workloads:

Multithreaded Performance Comparison
                                 | # of Cores | Cinebench 11.5 (Single Threaded) | Cinebench 11.5 (Multithreaded)
AMD A4-5000 (1.5GHz Jaguar x 4)  | 4          | 0.39                             | 1.5
AMD E-350 (1.6GHz Bobcat x 2)    | 2          | 0.32                             | 0.61
Advantage                        | 100%       | 21.9%                            | 145.9%
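The Advantage row is simply the A4-5000 figure over the E-350 figure expressed as a percentage gain, which the quick check below reproduces:

```python
# The "Advantage" row is just the A4-5000 figure divided by the E-350 figure,
# expressed as a percentage gain. Scores are taken from the table above.
def gain(new, old):
    return (new / old - 1) * 100

print(f"{gain(4, 2):.1f}%")        # 100.0%  (core count)
print(f"{gain(0.39, 0.32):.1f}%")  # 21.9%   (single threaded)
print(f"{gain(1.5, 0.61):.1f}%")   # 145.9%  (multithreaded)
```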

The L1 caches remain unchanged at 32KB/32KB (I/D cache) per core.
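A quick way to see the trade-off described above: with an exclusive hierarchy the L1s and L2 hold distinct data, while with Jaguar's inclusive L2 the L1 contents are duplicated, so the effective capacity per compute unit is just the L2 size. A minimal sketch using the sizes given above:

```python
# Effective cache capacity per 4-core compute unit under the two policies
# the article mentions. Cache sizes are from the text above.
KB = 1024
l1_per_core = 32 * KB + 32 * KB   # 32KB instruction + 32KB data
l2_shared   = 2048 * KB           # 2MB shared L2
cores       = 4

exclusive = cores * l1_per_core + l2_shared  # L1 and L2 hold distinct lines
inclusive = l2_shared                        # L1 contents are duplicated in L2

print(exclusive // KB, "KB vs", inclusive // KB, "KB")  # 2304 KB vs 2048 KB
```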

Physical Layout and Synthesis

Bobcat was AMD’s first easily synthesized CPU core, a direct result of the ATI acquisition years before. With Jaguar, AMD made a conscious effort to further reduce the number of unique macros required by the design. The result was a great simplification, which helped AMD port Jaguar between foundries. There’s of course an area tradeoff when moving away from custom macros to more general designs, but it was deemed worthwhile. Looking at the results, you really can’t argue. A single Jaguar core measures only 3.1mm^2 at 28nm compared to 4.9mm^2 for a 40nm Bobcat.
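As a rough, back-of-the-envelope illustration of what the process move alone might buy (my own estimate, not an AMD figure): an ideal optical shrink of Bobcat from 40nm to 28nm would land around 2.4mm^2, so Jaguar's 3.1mm^2 sits reasonably close to that even with the extra logic and the area cost of the more synthesizable design.

```python
# Naive ideal-shrink estimate: assume area scales with the square of the
# feature size. Purely illustrative; real designs never shrink ideally.
bobcat_area_40nm = 4.9               # mm^2, from the article
ideal_shrink = (28 / 40) ** 2        # ~0.49
print(f"{bobcat_area_40nm * ideal_shrink:.1f} mm^2")  # ~2.4 mm^2 vs Jaguar's actual 3.1 mm^2
```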

Comments

  • blacks329 - Thursday, May 23, 2013 - link

    I know it's definitely not that high for any individual platform, but I do remember a lot of major publishers (Ubi, EA) and a bunch of smaller studios saying (early-to-mid gen) that because porting to PS3 was such a nightmare and so resource intensive, it was more efficient to spend extra resources initially, use the PS3 as the lead, and then have it ported over to 360, which was significantly easier.

    While I'm sure quite a large chunk still use the 360 as their lead platform, I would say that 90% figure was probably from very early in this gen and has since dropped to something much closer between 360 and PS3.

    Although at this point both architectures are well enough understood and accounted for that most engines make it easy to develop for both, regardless of which platform is started with.
  • mr_tawan - Sunday, May 26, 2013 - link

    I don't think using x86 will benefit devs as much as many expect. Sure, using the same hardware-level arch may simplify low-level code like asm, but seriously, I don't think many devs use asm intensively anymore. (I worked on current-gen console titles for a little while and never wrote a single line of asm.) Current-gen games are complex and need good software architecture, otherwise you end up with a delayed-to-death shipping schedule. Using asm would lead to premature optimisation that gains little to nothing.

    What will really affect devs heavily is the SDK. The XB1 uses a custom OS, but the SDK should be close to Windows' DirectX (just like the XB360). The PS4, if it follows the same fashion as the PS3, would use a custom-made SDK with an OpenGL/OpenGL ES API (the PS3 uses OpenGL ES, if I'm not mistaken). It still needs another layer of abstraction to make it fully cross-platform, just like the current generation.

    The thing that might be shared across the two platforms is the shader code, if AMD can convince both MS and Sony to use the same language.

    These are only guesses; I might be wrong.
  • mganai - Thursday, May 23, 2013 - link

    That, and Intel's been making a bigger push for the smartphone market; it even says so in the article!

    Silvermont should change things up quite favorably.
  • mschira - Thursday, May 23, 2013 - link

    Well all this is pointless if nobody makes good hardware using it.
    It's the old story. The last generation Trinity would have allowed very decent mid range notebooks with very long battery run time and more than sufficient power at reasonably low costs.

    Have we seen anything?
    Nope.

    So where is a nice 11" Trinity Laptop?
    Or a 10" Brazos?
    Everything is either a horrible cheap Atom or an expensive ULV Core part.

    Are the hardware makers afraid that AMD can't deliver enough chips?
    Are they worried about stepping on Intel's toes?
    Are they simply uncreative, all running in the same direction some stupid mainstream guide tells them to?

    I suspect it is largely the latter - most current notebooks are simply uncreative. The loss of sales comes as no surprise, I think. And it's not all M$'s fault.
    M.
  • Mathos - Thursday, May 23, 2013 - link

    It could be another instance of Intel paying OEMs not to use certain AMD parts. They've done it before; I wouldn't be surprised if it happens again in areas where AMD might have a better component.

    But it's also not totally true. Having worked at Wal-Mart and other big chain stores, I can tell you that many do carry laptops and ultrathins that use Trinity A-series chips and Brazos E-series chips. But right now everyone still wants that iPad or Galaxy Tab. In general the only people I saw buying laptops and ultrathins were the back-to-school and back-to-college crowds, and of course the Black Friday hordes.

    And with AMD having both next-gen consoles under its belt, it and many OEMs may be able to leverage that to drive sales of Jaguar-based systems.
  • Gest - Saturday, May 25, 2013 - link

    So does Jaguar have any new hardware instructions that Intel processors don't? (Will Intel add them in Haswell?) I think game makers will use them actively during the consoles' lifetime.
  • scaramoosh - Monday, May 27, 2013 - link

    Doesn't this just mean the console CPU power is lacking compared to what PCs currently have?
  • Silma - Wednesday, May 29, 2013 - link

    Absolutely.
