Discrete GPU Gaming Performance

Although a high-end discrete GPU is likely not what the typical Trinity APU buyer has in mind, we looked at how AMD's latest APU performs when paired with one. The end result is a clear loss for Trinity. If you're going to use processor graphics, Trinity is the clear winner, but if you plan on pairing the APU with a high-end discrete GPU you're much better off with the Core i3-3220.

Benchmarks tested (charts not reproduced here):

  • Metro 2033 Frontline Benchmark - 1024 x 768 - DX11 High Quality
  • DiRT 3 - Aspen Benchmark - 1024 x 768 Low Quality
  • Crysis Warhead Assault Benchmark - 1680 x 1050 Mainstream DX10 64-bit
  • Civilization V - 1680 x 1050 - DX11 High Quality

178 Comments

  • Roland00Address - Tuesday, October 2, 2012 - link

    1 watt * 1000 hours equals 1 kilowatt-hour, which is abbreviated 1 kWh.

    24 hours a day * 365 days per year equals 8,760 hours, so a constant 1 W draw uses 8.76 kWh per year.

    I should have said 24 hours a day, 7 days a week, 52 weeks a year. I apologize for leaving off the 52 weeks a year part; it was a slip of the tongue.
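The arithmetic in the comment above can be sketched as a quick check; the 10 W figure below is a hypothetical example of a power delta between two chips, not a measurement from the review:

```python
def annual_kwh(watts: float, hours_per_day: float = 24.0, days: float = 365.0) -> float:
    """Convert a constant draw in watts to kilowatt-hours per year.

    1 kWh = 1000 W sustained for 1 hour, hence the division by 1000.
    """
    return watts * hours_per_day * days / 1000.0

print(annual_kwh(1))   # 1 W running all year -> 8.76 kWh
print(annual_kwh(10))  # hypothetical 10 W delta -> 87.6 kWh per year
```

Multiplying by a local electricity rate then turns the kWh figure into an annual cost.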
  • KAlmquist - Tuesday, October 2, 2012 - link

    Good point. By my calculations, the break-even point between the AMD A10 and the Intel i3 occurs when the amount of computation you are doing has the A10 busy 14.5% of the time and the i3 busy 16.5% of the time. That assumes the computation you are doing is similar to the second pass of x264 HD; the numbers might be different with a different workload. I do know that my computer is busy much less than 15% of the time. Right now, for example, I am using the computer to enter this comment, and the CPU is basically idle.

    Of course Visual is right that power use under max load does matter even if your system is idle most of the time. But after seeing how Bulldozer fared against Sandy Bridge, I expected Ivy Bridge to crush Piledriver (the Trinity CPU) in power consumption. You can argue that Ivy Bridge still wins, but it is a big surprise (at least to me) that it is a close call.
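The break-even model KAlmquist describes can be sketched as follows. All wattages here are hypothetical placeholders, not the article's measurements; only the 14.5%/16.5% busy-time ratio comes from the comment itself:

```python
def avg_power(idle_w: float, load_w: float, duty: float) -> float:
    """Average draw of a CPU that is at full load `duty` fraction of the time."""
    return idle_w * (1 - duty) + load_w * duty

def break_even_duty(idle_a: float, load_a: float,
                    idle_b: float, load_b: float,
                    speed_ratio: float) -> float:
    """Duty cycle for CPU B at which both average draws are equal.

    `speed_ratio` = (time CPU A needs per job) / (time CPU B needs),
    so for the same workload duty_a = speed_ratio * duty_b.
    Derived by setting avg_power(A) == avg_power(B) and solving for duty_b.
    """
    return (idle_a - idle_b) / ((load_b - idle_b) - speed_ratio * (load_a - idle_a))

# Hypothetical idle/load wattages for illustration only.
s = 14.5 / 16.5                       # busy-time ratio from the comment
d = break_even_duty(38, 110, 42, 72, s)
print(f"break-even: B busy {d:.1%}, A busy {s * d:.1%}")
```

Below the break-even duty cycle the chip with the lower idle draw wins on total energy; above it, the chip with the lower loaded draw wins.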
  • bgt - Tuesday, October 2, 2012 - link

    I believe the experience is important too. Since I use both Intel and AMD PCs, I often find the bench numbers a bit meaningless. Intel CPUs may be fast at small single-threaded jobs, but AMD is often better at big workloads. When the A10-5800K is out here I will compare it to my 3225, used in an HTPC. I already noticed a difference in graphics output between my 3850 and 3225 when watching an HD movie on my TV: the screen looks a bit hazy and foggy when the 3225 is used, compared to the 3850. Contrast is not as nice, and deep black is not really black.
  • nevertell - Tuesday, October 2, 2012 - link

    It would've been really great if you had included Phenom II and Athlon II CPUs, to see whether or not it is reasonable for someone owning those systems to upgrade to the new APUs. I still think these CPUs are relevant to the general consumer, as long as they are in stock.
  • Medallish - Tuesday, October 2, 2012 - link

    I still run my Phenom II X4 965 @ 3.8GHz; it's a wonderful CPU, although I suspect it will be replaced by Vishera soon.
  • Mickatroid - Tuesday, October 2, 2012 - link

    Me too. A C3 stepping at 3.9GHz with a modest over-volting. I am planning a mini-ITX Trinity build for my new workshop computer, in a custom box with a car air filter (my Athlon XP is well past it).

    The good news is that for basic stuff (and more besides) Trinity offers all the computing power I could ask for.
  • Roland00Address - Tuesday, October 2, 2012 - link

    Click Bench in the upper corner of the website and enter the two processors you want to compare.

    The A10-5800K gets roughly the same performance as the Phenom II X4 965.
  • mikato - Wednesday, October 3, 2012 - link

    Wow! Glad you mentioned this. I have a Phenom II X4 965 and plan to keep it for quite a while yet; it's as fast as anything (that I would notice). With that in mind, an A10-5800K would be a beast for my parents' build or an HTPC. Better yet a 5700 if I can get one.
  • owlxp - Tuesday, October 2, 2012 - link

    Why even do the comparison with high-end discrete cards? Everyone knew what those results were going to look like; we've known for months. Despite the "i5 performance at i3 prices" marketing, AMD can't compete with the better Intel CPUs... but that's not where the value is. AMD dominates the low-end market and the bang for the buck. Why is there no review of the maximum discrete card that can be used in the hybrid CrossFire setups? What about some tests to see if Trinity can do multi-discrete + on-die GPU triple and quad CrossFire? That's what I'd like to see. There were some reviews comparing the Radeon 7750 in CrossFire to NVIDIA's 640, 650, and 660, and the dual 7750 setup was winning the 1080p matchup with 4x and 8x AA. If the A10 + 7750 can put out similar results... that's going to be an easy way to capture the budget and midrange gamers, especially if triple and quad hybrid CrossFire setups are possible.

    Where is the value test and the chart for the hybrid CrossFire setups? That's what I want to see.
  • meloz - Tuesday, October 2, 2012 - link

    >AMD can't compete with the better Intel CPUs... but that's not where the value is.

    If there's no value there, please alert Intel. They didn't get the memo, and neither did the consumers. Despite making CPUs with "poor value", Intel outsells AMD 9:1 _and_ makes record profits quarter after quarter.

    >AMD dominates the low-end market and the bang for the buck.

    And in the process AMD makes losses, quarter after quarter.

    No one wants to "dominate" the low end, broseph. The low end is where you naturally end up if you are not good enough to compete at the top and mid range.
