The run-up to Computex has been insane. Kabini, Haswell and Iris hit us back to back to back, not to mention all of the travel before receiving those products to get briefed on everything. Needless to say, we're in major catch-up mode. There's a lot more I wanted to do with Haswell desktop that got cut due to Iris, and much more I wanted to do with Iris that I had to scrap in order to fly out to Computex. I'll be picking up where I left off later this month, but with WWDC, Samsung and a couple of NDA'd events on the calendar, it's not going to be as quick as I'd like.

One part that arrived while I was in the middle of launch central was AMD's Richland for desktop. Richland is effectively a refresh of Trinity with slightly higher clocks, a software bundle and more sophisticated/aggressive turbo. It maintains socket compatibility with Trinity (FM2), so all you should need is a BIOS update to enable support for the chip. AMD sent over two Richland parts just before I left for Computex: the 100W flagship A10-6800K and the 65W A10-6700. I didn't have time to do Richland justice before I left; however, I did make sure to test the 6800K in tandem with Haswell's GPU just so I had an idea of how things would stack up going forward as I was writing my Iris Pro conclusion.

For all intents and purposes, Iris Pro doesn't exist in the desktop space, making Haswell GT2 (HD 4600) the fastest processor graphics Intel ships in a socketed part today. In our Haswell desktop review I didn't get a chance to really analyze HD 4600 performance, so I thought I'd take this opportunity to refresh the current state of desktop integrated processor graphics. Unlike the staggered CPU/GPU launch of Trinity on the desktop, the piecemeal coverage of Richland is purely a time limitation on my end. This was all I could put together before I left for Computex.

Although Richland comes with a generational increase in model numbers, the underlying architecture is the same as Trinity's: we're still talking about Piledriver modules and a Cayman-derived GPU. It won't be until Kaveri that we see GCN-based processor graphics from AMD in this price segment (Kabini is already there).

As Jarred outlined in his launch post on Richland, the 6800K features 4-8% higher CPU clocks and a 5% increase in GPU clock compared to its predecessor. With improved Turbo Core management, AMD expects longer residency at max turbo frequencies, but you shouldn't expect substantial differences in performance on the GPU side. The A10-6800K also includes official support for DDR3-2133. AMD is proud of its validation of the A10-6800K: any parts that won't pass at DDR3-2133 are demoted to lower-end SKUs. I never spent a ton of time testing memory overclocking with Trinity, but my A10-5800K sample had no issues running at DDR3-2133 either. I couldn't get DDR3-2400 working reliably, however.

AMD Elite A-Series Desktop APUs, aka Richland
Model                A10-6800K    A10-6700     A8-6600K     A8-6500      A6-6400K   A4-4000
Modules/Cores        2/4          2/4          2/4          2/4          1/2        1/2
CPU Base Freq (GHz)  4.1          3.7          3.9          3.5          3.9        3.0
Max Turbo (GHz)      4.4          4.3          4.2          4.1          4.1        3.2
TDP                  100W         65W          100W         65W          65W        65W
Graphics             HD 8670D     HD 8670D     HD 8570D     HD 8570D     HD 8470D   ?
GPU Cores            384          384          256          256          192        128
GPU Clock (MHz)      844          844          844          800          800        724
L2 Cache             2x2MB        2x2MB        2x2MB        2x2MB        1MB        1MB
Max DDR3 (MHz)       2133         1866         1866         1866         -          -
Price (MSRP)         $150 ($142)  $149 ($142)  $120 ($112)  $119 ($112)  $80        $46

Just to put things in perspective, here are the previous generation Trinity desktop APUs:

AMD Trinity Desktop APUs
Model                A10-5800K  A10-5700   A8-5600K   A8-5500    A6-5400K   A4-5300
Modules/Cores        2/4        2/4        2/4        2/4        1/2        1/2
CPU Base Freq (GHz)  3.8        3.4        3.6        3.2        3.6        3.4
Max Turbo (GHz)      4.2        4.0        3.9        3.7        3.8        3.6
TDP                  100W       65W        100W       65W        65W        65W
Graphics             HD 7660D   HD 7660D   HD 7560D   HD 7560D   HD 7540D   HD 7480D
GPU Cores            384        384        256        256        192        128
GPU Clock (MHz)      800        760        760        760        760        723
L2 Cache             2x2MB      2x2MB      2x2MB      2x2MB      1MB        1MB
Max DDR3 (MHz)       2133       1866       1866       1866       -          -
Current Price        $130       $129       $110       $105       $70        $55
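
As a quick sanity check on those clock-speed claims, here's a minimal Python sketch that computes the generational deltas for the flagship parts straight from the two tables above (nothing here beyond the table values themselves):

```python
# Generational clock deltas: flagship Richland (A10-6800K) vs. flagship
# Trinity (A10-5800K). Tuples are (base GHz, max turbo GHz, GPU MHz),
# taken directly from the spec tables above.
a10_6800k = (4.1, 4.4, 844)
a10_5800k = (3.8, 4.2, 800)

labels = ("CPU base", "CPU max turbo", "GPU clock")
for label, new, old in zip(labels, a10_6800k, a10_5800k):
    print(f"{label}: +{(new / old - 1) * 100:.1f}%")
```

That prints +7.9% base, +4.8% max turbo and +5.5% GPU clock, which lines up with the 4-8% CPU and ~5% GPU figures quoted earlier.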

For my Richland test platform I used the same Gigabyte UD4 Socket-FM2 motherboard from our desktop Trinity review, simply updated to the latest firmware release. I ran both AMD platforms with the same Catalyst 13.6 driver and the same DDR3-2133 memory frequency. AMD was quick to point out that only the A10-6800K ships with official DDR3-2133 support, so the gap in performance between it and Trinity may be even larger if the latter is held to DDR3-1866. The HD 4000/4600 numbers are borrowed from my Iris Pro review and were run at DDR3-2400; however, I didn't notice any scaling on Haswell GT2 beyond DDR3-1866.
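
Memory speed matters here because these APUs feed their GPUs out of system memory. As a rough back-of-envelope sketch (assuming the typical dual-channel, 64-bit-per-channel DDR3 configuration these platforms use), the theoretical peak bandwidth at the speeds mentioned above works out as follows:

```python
# Theoretical peak bandwidth for dual-channel (2 x 64-bit) DDR3:
# data rate (MT/s) x 8 bytes per transfer per channel x 2 channels.
for mts in (1866, 2133, 2400):
    gb_s = mts * 8 * 2 / 1000  # decimal GB/s
    print(f"DDR3-{mts}: {gb_s:.1f} GB/s")
```

DDR3-2133 carries roughly 14% more theoretical bandwidth than DDR3-1866 (34.1GB/s vs. 29.9GB/s), which is why official DDR3-2133 support matters more for an APU than it would for a CPU paired with discrete graphics.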

I'll be following up with a more thorough look at Richland once I'm back from my current bout of traveling.

Comments

  • Will Robinson - Sunday, June 9, 2013

    LOL...NeelyCam must be crying his eyes out over those results.
    Good work AMD!
  • Wurmer - Sunday, June 9, 2013

    I still find this article interesting, even if IGPs are certainly not the main focus of gamers. I don't consider myself a hardcore gamer, but I don't game on an IGP. I am currently using a GTX 560, which gives me decent performance in pretty much any situation. On the other hand, this gives an idea of the progress made by IGPs. I certainly would enjoy more performance from the one I am using at work, which is a GMA 4500 paired with an E8400. There are markets for good IGPs, but gaming is not one of them. As I see it, IGPs are more suited to being paired with low to mid-range CPUs, which would make for very decent all-around machines.
  • lordmetroid - Monday, June 10, 2013

    Using high-end games that will never be played on the integrated graphics processor is totally pointless; why not use something like ETQW?
  • skgiven - Monday, June 10, 2013

    Looks like you used a 65W GT640, released just over a year ago.
    You could have used the slightly newer and faster 49W or 50W models, or a 65W GTX640 (37% faster than the 65W GT640).
    Better still, a GeForce GT 630 Rev. 2 (25W) with the same performance as a 65W GT640!
    (I'm sure you don't have every GPU that's ever been released lying around, so just saying what's out there.)

    An i7-4770K, or one of its many siblings, costs ~$350.
    For most light gaming and GPU apps, the Celeron G1610T (35W) along with a 49W GT640 would outperform the i7-4770K.
    The combined wattage is exactly the same (84W) but the relative price is $140!
    Obviously the 25W GK208 GeForce GT 630 Rev. 2 would save you another $20 and give you a combined TDP of 60W, which is 40% better than the i7-4770K.
    It's likely that there will be a few more GT600 Rev. 2 models, and the GK700 range still has to fill out. Existing mid-range GPUs offer >5 times the performance of the i7-4770K.
    The reasons for buying an i7 still have little or nothing to do with its GPU!
  • skgiven - Monday, June 10, 2013

    - meant GTX645 (not GTX640)
  • NoKidding - Monday, June 24, 2013

    I shudder to think what an A10 Kaveri can bring to the table, considering it'll be equipped with AMD's GCN architecture and additional IPC improvements. Low price + 4 cores + (possibly) hybrid CrossFire with a 7xxx-series Radeon? A great starting point for a decent gaming rig. Not to mention that the minimum baseline for PC gaming will rise from decent to respectable.
  • Silma - Friday, June 28, 2013

    Sometimes I really don't understand your comparisons, and even less the conclusions.
    Why compare a Richland to a Haswell when obviously they will get used for totally different purposes? Who will purchase a desktop Haswell without a graphics card for gaming? Why use super expensive 2133 memory with a super bad processor?

    There are really 3 conclusions to be had:
    - CPU-wise, Richland sucks aplenty.
    - GPU-wise, there is next to no progress compared to Trinity, the difference being fully explained by a small frequency increase.
    - If you want cheap desktop gaming, you will be much better served by a Pentium G2020 + Radeon HD 6670 or HD 7750 for the same price as a crappy A10-6800K or A10-6700.
  • XmenMR - Monday, September 2, 2013

    You make me laugh. I normally don't post comments on these things, since I read them just to get a laugh, but I do have to point out how wrong you are. I have a G1620, G2020, i3-3240, A8, A10 and more, and have run benchmarks with a 6450, 6570, 6670, 7730, 7750 and 7770 for budget gaming builds for customers.
    Your build of a G2020 with a 6670 was, in my testing, beaten hands down by the A10-6800K in hybrid CrossFire with a 7750 (yes, I said it, hybrid CrossFire with a 7750; it can be done, although it's not officially supported by AMD). A G2020 with a 6670 will run you about $130, and an A10 with a 7750 is about $230. To match the A10 + hybrid CrossFire 7750 ($230 value) performance with Intel I had to use a 7750/7770 or higher with the Pentiums, and the i3 + 7750 ($210 value) did quite well but still was beaten in quite a few graphics-related tests.
    Point being, a discrete GPU changes the whole aspect of the comparison. The i3 + 7750 is very close to the A10 + hybrid CrossFire 7750 in more ways than just performance, but that's not the point of this topic. It was AMD HD 8670D vs Intel HD 4600. I know lots of people who buy an Intel i5 or i7 and live off the iGPU thinking one day they'll have the money to get a nice GPU and call it good; 60% of the time this doesn't happen, new tech comes out that's better and they just try to get a whole new system. The APU would have been cheaper and performed better for what they needed, had they just gone that road, and I am not the only one who came to that conclusion. AMD has done a great job with the APU, and after testing many myself I have become a believer. A stock i5 computer for $700 got smashed by a stock $400 A10 in CS6 sitting side by side; I could not believe it. I don't have to argue how good the APU is because Microsoft and Sony have already done it. So I leave with a question: if the APU weren't a fantastic alternative that delivers a higher standard of graphics performance, why would it be used in the Xbox One and PS4?
  • ezjohny - Tuesday, September 10, 2013

    When are we going to get an APU where you could go in-game and adjust the graphics settings to very high without a bottleneck?
  • nanomech - Sunday, December 8, 2013

    This is a slanted review. The i7 with the separate Nvidia card skews the results, perhaps erroneously, toward Intel. How about the A10 with the same separate Nvidia card and/or a comparable separate AMD video card? The performance difference can be quite drastic.

    IMHO, one should compare apples to apples as much as possible. Doing so yields a much more complete comparison. I realize that these APUs tout their built-in graphics abilities, but Intel is trying to do so as well. It's the only way to give the CPU part of the APU a fair shake. That, or leave the i7/Nvidia results out completely.
