AMD A8-7670K Conclusion

We have tested all of the new AMD APUs as they have trickled into the market, but there are a few obvious points that come up from comments and forums when we discuss them. To start, the base architecture in these APUs, though part of the Kaveri or Kaveri Refresh family, debuted in January 2014, making it nearly two years old. The underlying design that has been iterated upon three times for Kaveri — Bulldozer — is now four years old, released in October 2011. For all intents and purposes, because these processors are AMD’s latest desktop offerings, they are compared to Intel’s latest counterparts.

Despite AMD's push to use their heterogeneous system architecture platform to accelerate day-to-day tasks that involve vector calculations (JPEG decode, video playback), most benchmark workloads show the AMD APUs coming out slower on the pure CPU side, and they are power-hungry due to the 28-nm lithography node on which they are produced (particularly compared to Intel's latest 14-nm node). I was at an event earlier this year where a technology journalist told AMD up front that they considered the 28-nm processors hot and slow, and that users were not likely to be interested in them.

To combat what AMD sees as a pervasive dislike of the platform, AMD has been focusing on three talking points in their marketing to communicate the areas where they believe they have an advantage. This is, by its nature, a bit of a spiel on AMD's part, but at the same time, there are some nuggets of truth in these claims, as illustrated by our benchmark results.

AMD's first talking point is, of course, price. AMD considers their processors very price-competitive, especially for low-cost systems when you consider performance as a function of total system cost. AMD's second talking point is on the power-consumption issue. For some time now, AMD's line has been that they don't believe that most users think about power consumption when gaming, suggesting that for the markets they are targeting, it might not be an issue to begin with. AMD's third talking point is on graphics performance, where AMD believes that their integrated graphics (or dual graphics with an R7 discrete card) will easily win on price and performance, especially for e-sports titles currently favored by budget gamers.

As for the validity of AMD's talking points, our benchmarks verify Nos. 1 and 3, dollar for dollar. Especially when a dual graphics profile for a game exists, the gaming performance will be better for the same price. However, one might argue that relatively few users use a PC just for games, and aspects such as JavaScript/HTML5 performance for social media interaction are also important; this is the main area where low-frequency APUs struggle against equally priced Intel counterparts. As for talking point No. 2, it's debatable whether users really care about the power consumption of their system during gaming. A personal thought: if the system fans were to spin up, then maybe it would play on the mind, especially if the system is also being used to watch a film or play music. But typically, users concerned with this sort of power consumption tend to be over 25 years old and can afford to be more selective with their purchases, as opposed to e-sports gamers on tight budgets. Nevertheless, some users will wholeheartedly disagree.

Ultimately, these points lie at the heart of AMD's dilemma. On one hand, many users will avoid an APU due to specifications or experience, no matter the budget. On the other hand, AMD has a tight space to work in, but there are areas where their APUs hold an edge over Intel's CPUs. The trick for AMD right now is convincing skeptical buyers of this.

If we look beyond today's review, everyone who cares about CPU performance is hoping that AMD's new microarchitecture in 2016, Zen, allows AMD to catch up to Intel in raw CPU performance. At present, AMD has released slides claiming a 40% increase in IPC for the new design. If AMD can deliver on their performance goals, this should significantly improve their standing in x86 CPU performance, though Zen will initially be aimed at the high-performance market. For budget users and the e-sports crowd, we will have to wait and see how the Zen microarchitecture is eventually implemented in APUs.

Until then, AMD's APUs still win for that Rocket League style of player, beating any equivalent Intel implementation at the same price. The A8-7670K, with a minor recent discount to $100, sits essentially at the center of that APU stack and is built on AMD's latest round of process design tweaks. We overclocked our sample to 4.6 GHz, but your mileage may vary.

On a personal note, as you might expect, I build systems for my family. My father, who wanted an audio workstation, had a big enough budget to consider something with many cores and hyperthreading, with a focus on low audio latency and software that could take advantage of those cores. I've mentioned in these reviews that I outfitted my 15-year-old cousin-in-law with an APU and a discrete card in a small, cheap dual graphics system that cost around $400. With it, he does school work, talks to his friends and plays a range of MOBA and MMO games without issues. He's rather happy with it.

For future reference, all of our regular benchmark results can also be found in our benchmark engine, Bench.

Rocket League on an APU

154 Comments

  • BrokenCrayons - Thursday, November 19, 2015 - link

    I'd love to see better IGPs from Intel as well, but it really only serves to move the bottom rung of the graphics ladder up a notch. They don't seem like they'll ever really catch up with system requirements on contemporary games. It's more of a matter of waiting for the current Intel graphics processor to be good enough to run what's already been on the market for a while in terms of entertainment. Beyond that, if Intel dedicates more space to graphics, there'll invariably be someone else who complains that it's a complete waste for there to even be integrated graphics in the first place since they have a discrete GPU.
  • raikoh05 - Thursday, November 19, 2015 - link

    you can probably make it run better https://www.youtube.com/watch?v=uiCCKurW9TU
  • plonk420 - Thursday, November 19, 2015 - link

    how the hell is this doing better than an A10 with 128 more streaming processors?
  • JoeMonco - Friday, November 20, 2015 - link

    It's summed up as "LOL AMD".
  • Rexolaboy - Thursday, December 10, 2015 - link

    The a10 benchmarks are from the launch tests. Not current, there is no reason to include them. Anandtech fail.
  • Tunrip - Friday, November 20, 2015 - link

    "I outfitted my 15-year-old cousin-in-law with an APU" THE FUTURE IS HERE! :D
  • hmmmmmmmmmm - Friday, November 20, 2015 - link

    People spending so much time on what-ifs. Why don't they just wait for the benchmarks for Zen and Kaby Lake instead of giving each other lessons in history and mathematics.
  • BMNify - Saturday, November 21, 2015 - link

    There is an incredible bias, but to no fault of your own, in the web benchmarks part. Considering that Intel is tied for second (with Opera. Samsung is the largest, of all companies) as the most active contributor for Chromium since about 2012, major effort is being invested by Intel to optimize Chrome for their chips. It just doesn't make logical sense every Intel chip performs that much better than AMD on web benchmarks other than they have invested a lot of time in helping advance chromium development. I mean come on, even a low end Pentium at stock speeds destroys even the highest AMD chip on those javascript/Web benchmarks. That has to be obvious bias.

    http://mo.github.io/assets/git-source-metrics-chro...

    I am not knocking Intel as their efforts are commendable in chromium development and any Chrome users who also use heavy JS browser apps would be amiss to choose AMD, but just wanted to point out that in benchmarking (which should be as level field as possible), you guys should switch to another browser like Firefox or even IE 11/Edge.
  • hojnikb - Saturday, November 21, 2015 - link

    Edge uses the same engine as Chrome... IE11 is old.

    If AMD actually invested something in software optimization wouldn't hurt.
  • Gigaplex - Monday, November 23, 2015 - link

    Edge does not use the same engine as Chrome. Edge uses EdgeHTML and Chakra, Chrome uses Blink/WebKit and V8.
