Graphics

The big upgrade in graphics for Carrizo is that the maximum number of compute units for a 15W mobile APU moves up from six (384 SPs) to eight (512 SPs), affording a potential 33% improvement. This means that the high-end A10 Carrizo mobile APUs will align with the A10 Kaveri desktop APUs, although the desktop APUs draw roughly six times the power. Carrizo also moves to AMD’s third generation of Graphics Core Next, GCN 1.2, the same architecture as the Tonga-based retail graphics cards (the R9 285).
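As a quick sanity check on that arithmetic, assuming GCN’s usual allocation of 64 streaming processors per compute unit:

```python
# GCN allocates 64 streaming processors (SPs) per compute unit (CU).
SPS_PER_CU = 64

kaveri_cus, carrizo_cus = 6, 8

kaveri_sps = kaveri_cus * SPS_PER_CU      # 384
carrizo_sps = carrizo_cus * SPS_PER_CU    # 512

print(f"{kaveri_sps} -> {carrizo_sps} SPs: "
      f"+{carrizo_sps / kaveri_sps - 1:.0%} potential throughput")
```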

This gives DirectX 12 support, but one of AMD’s aims with Carrizo is full HSA 1.0 support. Earlier this year, when AMD first released proper Carrizo details, we were told that Carrizo supports the HSA 1.0 specification as it currently stands in draft form; the specification has not yet been ratified, and AMD will not push back Carrizo’s launch to wait for that to happen. So there is a chance that Carrizo will not be certified as a fully HSA 1.0 compliant APU, but very few people are predicting that the specification will change before ratification in a way that would require hardware adjustments.

The difference between Kaveri’s ‘HSA Ready’ and Carrizo’s ‘HSA Final’ nomenclature comes down to one main feature: context switching. Kaveri can do everything Carrizo can do apart from this. Context switching allows the HSA device to switch to other queued work asynchronously while it waits on a dependent task to finish. I would imagine that if Kaveri came across work that required this, it would sit idle waiting for the outstanding work to finish before continuing, which means that Carrizo would be faster in this regard.
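As a rough illustration of why this matters, consider a toy scheduler model (this is not AMD’s actual hardware scheduler; the kernels and timings are made up) in which one kernel stalls on a dependency while another is ready to run:

```python
def runtime_blocking(kernels):
    """In-order execution: the GPU stalls until each kernel's
    dependency is ready (no context switching, Kaveri-style)."""
    t = 0.0
    for dep_ready, compute in kernels:
        t = max(t, dep_ready) + compute
    return t

def runtime_switching(kernels):
    """Kernels whose dependencies are not yet ready get parked, and
    the GPU runs any ready kernel instead (Carrizo-style)."""
    t = 0.0
    waiting = sorted(kernels)                 # ordered by dep_ready
    while waiting:
        ready = [k for k in waiting if k[0] <= t]
        if not ready:
            t = waiting[0][0]                 # nothing runnable: idle until next arrival
            continue
        dep_ready, compute = ready[0]
        waiting.remove(ready[0])
        t += compute
    return t

# Kernel A's input arrives at t=5; kernel B is ready immediately.
kernels = [(5.0, 2.0), (0.0, 4.0)]            # (dep_ready, compute_time)
print(runtime_blocking(kernels))              # 11.0 -- idles 5 units before A
print(runtime_switching(kernels))             # 7.0  -- runs B during A's wait
```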

One of the key parts of HSA is pointer translation, allowing both the CPU and GPU to access the same memory despite their different interpretations of how the memory in the system is configured. One of the features of Carrizo will be the use of address translation caches (ATCs) inside the GPU, essentially keeping a record of which virtual address maps to which physical location; when a translation is held in a nearby cache, the data can be accessed more quickly. These ATC L1 caches will sit inside the compute units themselves as well as in the GPU memory controller, with an overriding ATC L2 beyond the regular per-compute-unit L2.
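Conceptually, an ATC behaves like a small TLB hierarchy: a translation that hits in a nearby cache avoids a costly walk of the page tables. A minimal software analogue follows; the two-level layout, LRU policy, and sizes here are illustrative assumptions, not Carrizo’s actual design:

```python
from collections import OrderedDict

class TranslationCache:
    """Tiny LRU cache mapping virtual page numbers to physical frames,
    a loose software analogue of one ATC level."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.entries = OrderedDict()

    def lookup(self, vpage):
        if vpage in self.entries:
            self.entries.move_to_end(vpage)    # refresh LRU position
            return self.entries[vpage]
        return None

    def insert(self, vpage, frame):
        self.entries[vpage] = frame
        self.entries.move_to_end(vpage)
        if len(self.entries) > self.capacity:
            self.entries.popitem(last=False)   # evict least recently used

def translate(vaddr, l1, l2, page_table, page_size=4096):
    """Walk L1 -> L2 -> page table, filling the caches on the way back."""
    vpage, offset = divmod(vaddr, page_size)
    frame = l1.lookup(vpage)
    if frame is None:
        frame = l2.lookup(vpage)
        if frame is None:
            frame = page_table[vpage]          # slow path: full table walk
            l2.insert(vpage, frame)
        l1.insert(vpage, frame)
    return frame * page_size + offset

l1, l2 = TranslationCache(4), TranslationCache(32)
page_table = {0: 7, 1: 3}                      # vpage -> physical frame
print(translate(4100, l1, l2, page_table))     # page 1, offset 4 -> 3*4096 + 4
```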

Use of GCN 1.2 means that AMD can use their latest color compression algorithms with little effort – it takes a little more die area to implement (of which Excavator has more to play with than Kaveri), but affords performance improvements, particularly in gaming. The color data is compressed losslessly to maintain visual fidelity, and it moves between the graphics cores in this compressed state.
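The idea behind lossless delta color compression is that neighbouring pixels usually differ by small amounts, so a tile can be stored as one anchor value plus per-pixel deltas that pack into fewer bits. A simplified round-trip sketch (the tile size and delta scheme are illustrative, not AMD’s actual format):

```python
def compress_tile(pixels):
    """Encode a tile as (anchor, deltas). Lossless: decompression
    reproduces the input exactly. Real hardware falls back to
    uncompressed storage if the deltas need too many bits."""
    anchor = pixels[0]
    deltas = [p - anchor for p in pixels[1:]]
    return anchor, deltas

def decompress_tile(anchor, deltas):
    return [anchor] + [anchor + d for d in deltas]

tile = [200, 201, 199, 202, 200, 198, 201, 200]   # one 8-pixel channel
anchor, deltas = compress_tile(tile)
assert decompress_tile(anchor, deltas) == tile     # bit-exact round trip
print(anchor, deltas)  # 200 [1, -1, 2, 0, -2, 1, 0] -- small values pack tightly
```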

In yet another effort to squeeze power out of the system, the GPU will have its own dedicated voltage plane as part of the system, rather than a separate voltage island requiring its own power delivery mechanism as before. AMD’s latest numbers on the improvement here date back to June 2013 and come from internal simulations rather than a direct measured comparison.

With all the performance metrics rolled in, AMD is quoting a 65% performance improvement at 15W compared to Kaveri. The adjusted design allows higher frequencies at the same power, which combines with the additional compute units and other enhancements for the overall score. At 35W the gain is less pronounced and closer to what we would normally expect from a generational improvement, paling in comparison to the 15W numbers.

Comments

  • Refuge - Wednesday, June 3, 2015 - link

    Built my mother a new system from her old scraps with a new A8, she loves that desktop, and when she finally put an SSD in it she loved it ten times more. The upgrade only cost her $300 for CPU, mobo, and RAM. Threw it together in 45 minutes, and she hasn't had a problem with it in 2 years so far.
  • nathanddrews - Wednesday, June 3, 2015 - link

    I prefer the following setup:
    1. Beast-mode, high-performance desktop for gaming, video editing, etc.
    2. Low-power, cheap notebook/tablet for In-Home Steam Streaming and light gaming (720p) on the go.

    In my use case, as long as I can load and play the game (20-30fps for RTS, 30fps+ for everything else) on a plane ride or some other scenario without AC access, I'm not really concerned with the AA or texture quality. I still want to get the best experience possible, but balanced against the cheapest possible price. The sub-$300 range is ideal for me.
  • AS118 - Wednesday, June 3, 2015 - link

    Yeah, that's my thing as well. High resolutions at work, and at home, 768p or 900p's just fine, especially for gaming.

    I also recommend AMD to friends and relatives who want laptops and stuff that can do casual gaming for cheap.
  • FlushedBubblyJock - Tuesday, June 9, 2015 - link

    Why go AMD when the HD3000 does just fine at gaming and the added power of the Intel CPU is an awesome boost overall ...
  • Valantar - Wednesday, June 3, 2015 - link

    1366*768 on anything larger than 13" looks a mess, but in a cheap laptop I'd rather have a 13*7 IPS for the viewing angles and better visuals than a cheap FHD TN panel - bad viewing angles especially KILL the experience of using a laptop. Still, 13*7 is pretty lousy for anything other than multimedia - it's simply too short to fit a useful amount of text vertically. A decent compromise would be moving up to 1600*900 as the standard resolution on >11" displays. Or, of course, moving to 3:2 or 4:3 displays, which would make the resolution 1366*911 or 1366*1024 and provide ample vertical space. Still, 13*7 TN panels need to go away. Now.
  • yankeeDDL - Wednesday, June 3, 2015 - link

    Like I said, to each his own. I have a Lenovo Z50 for which I paid less than $470 with the A10 7300.
    Quite frankly, I could not be happier and I think it provides massive value for that money.
    Sure, a larger battery and a better screen would not hurt, but for hustling it around the house, bringing it to a friend's or family's house, watching movies, and playing games at native resolution, it is fantastic.
    It's no road warrior, for sure (heavy, and the battery life doesn't go much beyond 3hrs of "serious" use), but playing at 1366*768 on something that weighs 5 pounds and costs noticeably less than $500 is quite amazing. Impossible on an Intel system with discrete graphics, as far as I know.
  • FlushedBubblyJock - Tuesday, June 9, 2015 - link

    Nope, HD3000 plays just fine
  • Margalus - Wednesday, June 3, 2015 - link

    I'd rather have a cheaper 15.6" 1366x768 TN panel over a more expensive smaller ips panel.
  • UtilityMax - Wednesday, June 3, 2015 - link

    1366x768 is fine for movies and games. But it's a bad resolution for reading text or viewing images on the web, since you see pixels the size of moon craters.
  • BrokenCrayons - Thursday, June 4, 2015 - link

    I understand there's going to be a variety of differing opinions on the idea of seeing individual pixels. As far as I'm concerned, seeing individual pixels isn't a dreadful or horrific thing. In fact, to me it simply doesn't matter. I'm busy living my meat world life and enjoying whatever moments I have with family and friends so I don't give the ability to discern an individual pixel so much as a second thought. It is an insignificant part of my life, but what isn't is the associated decline in battery life (on relative terms) required to drive additional, utterly unnecessary pixels and to push out sufficient light as a result of the larger multitude of them. That sort of thing is marginally annoying -- then again, I still just don't care that much one way or another aside from noticing that a lot of people are very much infatuated with an insignificant, nonsense problem.
