Overclocking

With Sandy Bridge, Intel killed budget overclocking by completely clock locking all of its CPUs that shipped without turbo boost enabled. While you used to be able to buy an entry level CPU and overclock it quite nicely, Intel moved all overclocking to its higher priced parts. As a gift to the overclocking community, Intel ramped up the presence of its fully unlocked K-series parts. Anything with a K at the end shipped with a fully unlocked clock multiplier, at a small price premium. Given that Intel hadn't shipped unlocked CPUs since the days of the original Pentium, this was a welcome move on its part. What would really be nice is the addition of some lower priced K SKUs; unfortunately, we won't get that unless there's significant competitive pressure from AMD.

Trinity doesn't have what it takes to really force Intel into doing such a thing, but that doesn't mean AMD won't try. The Trinity lineup includes AMD's own K-series SKUs that, like their Intel counterparts, ship fully unlocked. From $67 all the way up to $122, AMD is offering unlocked Trinity APUs. The value of these parts really depends on just how overclockable Trinity is to begin with. The Bulldozer/Piledriver architecture is designed to push frequency, but AMD is already shipping these things very close to 4GHz to begin with. Take AMD's turbo frequencies into account and you're already at 4.2GHz with the A10-5800K. How much additional headroom is there?

With a stock cooler and not a ton of additional voltage, it looks like there's another 5 - 15% of headroom, depending on whether you're comparing base clocks or max turbo clocks. With an extra 0.125V (above the 1.45V standard core voltage setting) I was able to hit 4.4GHz on the A10-5800K. I could boot into Windows at 4.5GHz, but the system wasn't stable, and although the board would POST at 4.6GHz, Windows was highly unstable at that frequency. With more exotic cooling I believe I could probably make 4.5GHz work on the A10-5800K.
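The 5 - 15% headroom figure works out as follows; the 3.8GHz base clock is the A10-5800K's stock specification, and the 4.4GHz achieved clock is the stable overclock described above:

```python
# Headroom arithmetic for the A10-5800K overclock described above.
base_clock = 3.8   # GHz, A10-5800K stock base clock
turbo_clock = 4.2  # GHz, A10-5800K max turbo
achieved = 4.4     # GHz, stable with +0.125V on the stock cooler

headroom_vs_base = (achieved - base_clock) / base_clock * 100
headroom_vs_turbo = (achieved - turbo_clock) / turbo_clock * 100

print(f"vs. base clock: {headroom_vs_base:.1f}%")   # ~15.8%
print(f"vs. max turbo:  {headroom_vs_turbo:.1f}%")  # ~4.8%
```

Comparing against the base clock gives the high end of the range; comparing against the max turbo clock gives the low end.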

Cinebench 11.5 - Multi-Threaded

The extra frequency isn't enough to erase the single threaded performance gap between the A10 and Intel's Core i3 3220, however:

Cinebench 11.5 - Single Threaded

The only way AMD is going to close this gap is through a serious focus on improving single threaded performance in future architectures.

Comments

  • Roland00Address - Tuesday, October 2, 2012 - link

    1 watt *1000 hours equals 1 Kilowatt hour which is abbreviated 1 kWh

    24 hours a day * 365 days per year equals 8760 hours, so at 1 watt that works out to 8.76 kWh per year

    I should have said 24 hours a day, 7 days a week, 52 weeks a year. I apologize for leaving off the 52 weeks a year part it was a slip of the tongue.
  • KAlmquist - Tuesday, October 2, 2012 - link

    Good point. By my calculations, the break even point between the AMD A10 and the Intel i3 occurs when the amount of computation you are doing has the A10 busy 14.5% of the time and the i3 busy 16.5% of the time. That assumes that the computation you are doing is similar to the second pass of x264 HD; the numbers might be different with a different work load. I do know that my computer is busy much less than 15% of the time. Right now, for example, I am using the computer to enter this comment, and the CPU is basically idle.

    Of course Visual is right that power use under max load does matter even if your system is idle most of the time. But after seeing how Bulldozer fared against Sandy Bridge, I expected Ivy Bridge to crush Piledriver (the Trinity CPU) in power consumption. You can argue that Ivy Bridge still wins, but it is a big surprise (at least to me) that it is a close call.
  • bgt - Tuesday, October 2, 2012 - link

    I believe the experience is important too. Since I use both Intel and AMD PCs, I often found the bench numbers a bit meaningless. Intel CPUs may be fast at small single threaded jobs but AMD is often better at big workloads. When the A10-5800 is out here I will compare it to my 3225, used in a HTPC. I already noticed a difference in graphics display between my 3850 and 3225 when watching a HD movie on my TV. The screen looks a bit hazy, foggy when the 3225 is used, compared to the 3850 I mean. Contrast is not as nice. Deep black is not really black.
  • nevertell - Tuesday, October 2, 2012 - link

    It would've been really great if you had included Phenom II and Athlon II CPUs, to see whether or not it is reasonable for someone owning those systems to upgrade to the new APUs. I still think these CPUs are relevant to the general consumer, as long as they are in stock.
  • Medallish - Tuesday, October 2, 2012 - link

    I still run on my Phenom II X4 965@ 3.8GHz, it's a wonderful CPU, although I suspect it will be replaced by Vishera soon.
  • Mickatroid - Tuesday, October 2, 2012 - link

    Me too. A C3 stepping at 3.9 with a modest over volting. I am planning a micro-ITX trinity for my new workshop computer in a custom box with a car air filter (my Athlon XP is well past it).

    The good news is that for basic stuff (and more besides) Trinity offers all the computing power I could ask for.
  • Roland00Address - Tuesday, October 2, 2012 - link

    Click bench in the upper corner of the website and enter the two processors you want to compare.

    The A10 5800k gets roughly the same performance as the phenom ii x4 965
  • mikato - Wednesday, October 3, 2012 - link

    Wow! Glad you mentioned this. I have a Phenom II X4 965 and plan to keep it for quite a while yet and it's as fast as anything (that I would notice). With that in mind, an A10 5800k would be a beast for my parents build or an HTPC. Better yet a 5700 if I can get one.
  • owlxp - Tuesday, October 2, 2012 - link

    Why even do the comparison with high end discrete cards? Everyone knew what those results were going to look like. We've known for months. Despite the "i5 performance at i3 prices" marketing, AMD can't compete with the better Intel CPUs... but that's not where the value is. AMD dominates the low end market and the bang for buck. Why is there no review of the max discrete card that can be used in the hybrid xfire setups? What about some tests to see if Trinity can do multi-discrete + on die GPU triple and quad xfire? That's what I'd like to see. There were some reviews comparing the Radeon 7750 in crossfire to Nvidia's 640, 650, and 660, and the dual 7750 setup was winning the 1080p matchup with 4x and 8x AA. If the A10 + 7750 can put out similar results... that's going to be an easy way to capture the budget and midrange gamers. Especially if triple and quad hybrid xfire setups are possible.

    Where is the value test and the chart for the hybrid xfire set ups? That's what I want to see.
  • meloz - Tuesday, October 2, 2012 - link

    >AMD can't compete with the better Intel CPUs... but that's not where the value is.

    If there's no value there, please alert Intel. They didn't get the memo, nor did the consumers. In spite of making CPUs with "poor value", Intel outsells AMD 9:1 _and_ makes record profits quarter after quarter.

    >AMD dominates the low end market and the bang for buck.

    And in the process AMD makes losses, quarter after quarter.

    No one wants to "dominate" the low end, broseph. Low end is where you naturally end up if you are not good enough to compete with the top and mid end.
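The watt-year and break-even arithmetic discussed in the comments above can be sketched as follows. The 8760-hour figure is from Roland00Address's comment; the idle/load power numbers below are assumed, illustrative placeholders, not measurements from the review:

```python
HOURS_PER_YEAR = 24 * 365  # 8760 hours, as in the comment above

def annual_kwh(idle_w: float, load_w: float, busy_frac: float) -> float:
    """Annual energy for a system under load busy_frac of the time."""
    avg_w = idle_w * (1 - busy_frac) + load_w * busy_frac
    return avg_w * HOURS_PER_YEAR / 1000

# One extra watt, left on 24/7, is 8.76 kWh per year.
print(1 * HOURS_PER_YEAR / 1000)  # 8.76

# KAlmquist's break-even scenario with hypothetical power figures:
# for the same amount of work, the A10 is busy 14.5% of the time
# and the i3 16.5% of the time.
a10 = annual_kwh(idle_w=40, load_w=110, busy_frac=0.145)
i3 = annual_kwh(idle_w=38, load_w=80, busy_frac=0.165)
print(f"A10: {a10:.0f} kWh/yr, i3: {i3:.0f} kWh/yr")
```

With real idle and load power measurements plugged in, the same formula gives the duty-cycle break-even point the comment describes.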
