
  • Crono - Tuesday, October 02, 2012 - link

    It's good to finally see pricing on Trinity.
    Looks like AMD is still competitive at lower price points for video transcoding performance, which is good news for me and others whose heaviest CPU workloads are mostly video editing.

    But even for gaming the A10-5800K doesn't look too bad for a $122 chip.
    Glad to see it isn't quite "Bulldozer" all over again. I need to build a new system for Windows 8 and Trinity looks promising for a budget to mid range build.
  • ac2 - Tuesday, October 02, 2012 - link

    Big deal...

    Just look at pg 2: the single-threaded performance of the A10 is lower than a friggin Pentium G850.

    Pg 3 SYSmark: only slightly ahead of the Pentium, even with 2 additional integer cores.

    I wish Anand had included the Pentium G850 power comparison as well, though at 65W vs 100W TDP we can be sure which way that will swing...

    And the G850 costs a little more than HALF the A10 suggested price... That puts a $70 saving towards a discrete graphics card, which, as per pg 6, should result in much better gaming performance as well...
  • StevoLincolnite - Tuesday, October 02, 2012 - link

    It's also slower than an old Core 2 Quad Q6600 (released in 2007) overclocked to 3.6GHz+, in both single-threaded and multi-threaded tasks.

    However, what I would like to see is some undervolting tests, to see how low they can go in terms of TDP. AMD has always been incredibly conservative with voltages (even on the GPU side); it would be excellent in a Mini-ITX rig without the need for a dedicated GPU.
  • MrMilli - Tuesday, October 02, 2012 - link

    http://www.computerbase.de/artikel/prozessoren/201...
  • Belard - Tuesday, October 02, 2012 - link

    Really cool how the performance charts are NOT static.

    Ah, German technology. :)
  • B3an - Tuesday, October 02, 2012 - link

    Trinity truly is embarrassingly slow. I think it should be even cheaper for this performance level. And it's also disappointing that Intel has no performance competition AT ALL from AMD. We need this for some real progress to be made in desktop CPUs, and also to drive prices down at the high end.
  • IKeelU - Tuesday, October 02, 2012 - link

    What kind of progress are you talking about? Power efficiency? Raw general computing performance? Progress comes in many forms.

    I think there is very little need for raw single-threaded performance increases in consumer-level "general computation" processors (e.g. few-core x86 processors). So who cares if we don't have a second intel? What we do need are better ways of extracting performance from multiple cores, and from the massively parallel architectures found in GPUs. Part of the equation is ensuring that these two paradigms are pervasive, and therefore, from a big-picture point-of-view, AMD really *is* pushing things forward with Trinity.
  • Belard - Tuesday, October 02, 2012 - link

    AMD's progress is about 2 years behind. This is partly Intel's fault: its illegal competitive practices kept AMD from profiting as much as it could have during the Athlon64~X2 era.

    AMD has done a lot of things wrong that would take at least 2 years to sort out... if ever. FM2 is no better than FM1 and not cross-compatible. Neither is compatible with Socket AM3, and nothing AMD has on the market even supports PCIe 3.0.

    Compare that to buying an Intel i3-something system, in which you can upgrade to an i7-something easily. AMD has a mess on their hands, and I suspect part of their performance problems could also be rooted in their chipsets (across the board, the performance hits a wall). Yeah, for a notebook the powerful GPU part comes into play. And both the Intel i3 and Trinity are more than powerful enough for MOST people. Hell, I'm still running on a Q6600 at 2.4GHz and it does quite well (going i5-3570 this weekend)... But for those who want their money's worth when performance counts, AMD is not in the game.

    The performance from AMD has to be consistent, and it's not. The heat is not good, along with the cheap fans they include that are noisy and should be replaced with a good $20~50 cooler. Meanwhile, Intel CPUs run cooler and their fans are very quiet.

    So as of today, in general - it costs more to get an AMD. I really want AMD to do well, I've sold lots of AMD systems in the past.
  • Origin64 - Wednesday, October 03, 2012 - link

    About PCIe 3.0: nobody needs that. 2.0 x8 is enough to drive every card on the market with maybe a 2% performance loss, which is within the error margin of every benchmark. To keep prices low, of course they're not going to add extra features that nobody needs. I think it's a smart move.

    Other than that, I agree. AMD hasn't been doing well price/performance-wise the last two years. I have a Phenom II, and when they were released they were still competitive for the price and the time, but the last two generations have seen no single-threaded performance increase and cost about the same.
  • CeriseCogburn - Friday, October 12, 2012 - link

    AMD's engineers can't seem to figure out PCIe 3.0.

    They released their 7000 series with PCIe 3.0 uncertified (while nVidia waited and did a proper and official implementation), and then a bunch of AMD card owners had trouble in boards. But since the "press" kept their yappers shut except for a few, it wasn't widespread knowledge, so AMD fanboys (and others) suffered with their crap non-certified cards, unable to chase down the plaguing glitches properly.
  • rarson - Wednesday, October 03, 2012 - link

    "So as of today, in general - it costs more to get an AMD."

    ...unless you actually want decent gaming performance.

    The problem for Intel is that the things that Trinity doesn't do well aren't much of an issue to the average consumer. The average consumer doesn't have a clue how threaded their workloads are, and won't notice any significant differences between the i3 and the A10 other than how much faster the graphics are.

    If I were going to build an HTPC without discrete graphics, I'd be buying a Trinity. Intel doesn't make sense in that application (of course, I'd probably still go discrete, in which case Intel makes more sense).
  • CeriseCogburn - Friday, October 12, 2012 - link

    That's great rarson, when amd is crap, it's ok because people need crap, and won't notice the crap they have.
    Someone may be full of crap.
  • g101 - Monday, October 22, 2012 - link

    Yeah, you. You lifeless little shit. Find something better to do than comment on every AnandTech article with senseless garbage that never contains a shred of evidence or fact.
  • rarson - Wednesday, October 03, 2012 - link

    "I think there is very little need for raw single-threaded performance increases in consumer-level "general computation" processors (e.g. few-core x86 processors)."

    Exactly. While AMD could stand to improve their single-threaded performance, the fact that Intel is so far ahead in that specific metric doesn't automatically mean that Trinity is "slow." It does other things quite well, including multi-threaded performance, which is obviously more of a concern at this point considering software is moving towards multi-threading anyway.
  • CeriseCogburn - Tuesday, October 09, 2012 - link

    LOL - you are such a sad fanboy
    I hope amd showers you as fanboy of the month with their slow crap so you have to use it.
  • amd4evernever - Sunday, October 14, 2012 - link


    choke on that.... no matter how you troll, the AMD solution in the end beats the Core i3 both in price and in gaming. Read the full article at Tom's Hardware, you lunatic.
  • g101 - Monday, October 22, 2012 - link

    You stupid little bitch, you comment on EVERY article with pro nvidia/intel comments and every single one is absolute rubbish.

    What I really want to know is: how the fuck can you have nothing better to do, yet still be so ignorant?
  • rarson - Wednesday, October 03, 2012 - link

    "We need this for some real progression to be made in desktop CPU's and also drive prices down at the high end."

    Why? Is an i7 not fast enough for you? Are you being bogged down by your uber-user workload?

    The vast majority of people buy processors that are cheaper than the A10 here. Sure, it'd be nice to see AMD compete at the high end again, but the high end doesn't drive the market; the high end is barely a blip on the radar.
  • Flunk - Tuesday, October 02, 2012 - link

    Exactly, why buy the kids a new A10 when I can just give them my old Q6600? AMD really needs to make a better case for themselves than this.
  • silverblue - Tuesday, October 02, 2012 - link

    They are going to.

  • Origin64 - Wednesday, October 03, 2012 - link

    That's what they said about Trinity last year. I'm not buying it, not since Bulldozer.
  • CeriseCogburn - Tuesday, October 09, 2012 - link

    Yep, it's always the future, the next one, for the AMD fanboy.
  • sean.crees - Wednesday, October 10, 2012 - link

    I agree, the best case scenario for AMD is the A10-5700 with a 65 watt TDP in an SFF ITX enclosure, utilizing the integrated GPU.
  • ac2 - Tuesday, October 02, 2012 - link

    Oops that last line should read "That puts a $50 saving towards..." not $70

    - Pentium G850: $70 (HD2000 less Quicksync)
    - Gigabyte GA-B75M-D3V: $70 (with PCIe 3.0 and USB 3.0 onboard)

    Can easily upgrade to a more powerful GPU/ CPU down the line..
  • cyrusfox - Tuesday, October 02, 2012 - link

    Correction to your comment
    The G850 will only support PCIe 2.0; you would need to wait for the Ivy Bridge refresh of the Celeron/Pentium.

    If it were me and I was going cheap/Intel, I would just buy a Celeron G530 for $40. Why bother with the G850? It's just as handicapped, with a little more CPU frequency.

    But Trinity would be a whole lot more fun to play with than those chips, even though it may be slower doing a lot of the same tasks. Some of us just enjoy overclocking while undervolting, while rooting for the perpetual underdog.
  • CaptainDoug - Tuesday, October 02, 2012 - link

    The G850 has Intel HD 2000 graphics. Big step up. Also, as far as motherboard compatibility, if you go with a Celeron, your choices for boards that are also compatible with i3/i5/i7 chips are slim pickin's. Going with a Pentium gives you more advanced motherboard options. I agree that the G530 is a great CPU for cheap, but it's just a little harder to upgrade to anything amazing. Trinity is quite interesting though. Would be great in a small desktop for the wife.
  • wwwcd - Wednesday, October 03, 2012 - link

    HD 2000 is scrap ;)
  • rarson - Wednesday, October 03, 2012 - link

    Having Intel graphics is not, in any way, a plus. Especially a turd like HD2000.
  • delirious7 - Tuesday, October 02, 2012 - link

    How are the Ivy Bridge Pentiums or Celerons going to have PCI Express 3.0 when even the Core i3 processors don't? Intel decided to omit PCI Express 3.0 from everything under a Core i5.

    The only thing besides slightly better clock-for-clock performance that Ivy Bridge gives you with the Pentiums is memory support up to 1600 instead of 1333 like Sandy Bridge.
  • delirious7 - Tuesday, October 02, 2012 - link

    Also forgot to mention that one Ivy Bridge Pentium has been out for a while now: the G2120. It is retailing for $100 on Newegg.
  • bill4 - Tuesday, October 02, 2012 - link

    "Can easily upgrade down the line" pretty much misses the point imo.

    Theoretically, this thing might be for casuals who are never going to upgrade, or people who need to pinch every last penny, or people who want a general purpose machine that isn't a total dog when they wanna game now and again, etc.
  • LancerVI - Tuesday, October 02, 2012 - link


    I agree 100%. Trinity is perfect for a great kids/wife general use computer. Web surfing and Minecraft or Wizard101.

    As long as they stay off my machine, we're good!
  • Spunjji - Tuesday, October 02, 2012 - link

    Yes, but you shouldn't expect people who didn't even read the conclusion to the article to pay any attention to what you have to say here. :(
  • parkerm35 - Tuesday, October 02, 2012 - link

    $70 will get you a nice GT 630, which is slower than the 7660D! And then when it comes to threaded tasks, you will be left waiting an eternity. Nice plan....
  • ac2 - Tuesday, October 02, 2012 - link

    Actually I said "That puts a $50 saving towards...", I certainly didn't say that would be the total price!

    Highly threaded integer tasks are a rarity for me so...
  • CaptainDoug - Tuesday, October 02, 2012 - link

    Why'd you pick a GT 630? There's a 6670 for $65 that's roughly twice as powerful as the GT 630. It would end up being like 35% more powerful than a 7660D.
  • Jamahl - Tuesday, October 02, 2012 - link

    Did you see how far the Pentium was behind in multithreading? It will not beat the A10 in gaming in most cases, but feel free to waste money on a dual-core if you want.
  • mattlach - Tuesday, October 02, 2012 - link

    Multithreading is still mostly irrelevant in most games.

    Yes, there are a couple of games that support it pretty well (Battlefield 3, Civilization 5), but the vast majority of games out there are still of the "load one core 100% and load a second (and possibly third) core 15-20%" variety.

    In these circumstances two fast cores are going to beat 4, 6 or 8 slow cores almost every time.

    It's not about IPC, and it's not about clock speed. It's about the combination of the two. Per-core performance is still king, and I don't think this is going to change any time soon, due to the difficulties involved in writing good multithreaded code, and Amdahl's law.
    ( http://en.wikipedia.org/wiki/Amdahls_law )
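The ceiling mattlach describes falls straight out of Amdahl's law. A minimal sketch of it; the ~20% parallel fraction is an illustrative assumption matching his "load a second core 15-20%" description, not a measured figure:

```python
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Upper bound on speedup when only part of a workload parallelizes."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# A game whose work is only ~20% parallel barely benefits from extra cores:
for cores in (2, 4, 8):
    print(f"{cores} cores -> {amdahl_speedup(0.20, cores):.2f}x")
# 2 cores -> 1.11x, 4 cores -> 1.18x, 8 cores -> 1.21x
```

Even with eight cores, a mostly serial game sees barely a 21% speedup, which is why two fast cores beat four, six, or eight slow ones in this scenario.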
  • bji - Tuesday, October 02, 2012 - link

    Yeah, but aren't *PCs* irrelevant in most games?
  • Jamahl - Tuesday, October 02, 2012 - link

    According to KitGuru, the 5800K + 7970 GHz Edition beats the i3 2105 in every single game at 1080p.
  • mikato - Wednesday, October 03, 2012 - link

    lolwut? Is that using the i3's integrated graphics? Please extend your comment a little more, to the point where it makes sense.
  • CeriseCogburn - Tuesday, October 09, 2012 - link

    AMSheep fanboys tell themselves and everyone else lies, then they go buy the crap and live the life of a slow and still poor dullard.
    Next chance they get, they rinse and repeat.
  • kshong - Thursday, October 11, 2012 - link

    Your comments add nothing to the conversation and take up space. I wish people could ban trolls like you.
  • bill4 - Tuesday, October 02, 2012 - link

    Just looked on Newegg and the G850 is $70. The 5800 is supposed to MSRP for $122. I realize Obama is destroying our schools, but how is 70 half of 122? Also, that leaves $50; what dedicated GPU worth a crap costs 50 dollars? Even a $100+ 7750 would not be much improvement, I'm assuming.

    Somehow I knew before I checked the prices your comment would be an exaggeration, as I've seen the same type of wrong pricing on other comments like this already. Fanboys, get your facts straight.

    Plus, I don't know the exact motherboard situation, but typically AMD motherboards are cheaper as well.

    Oh, and you pick out one single benchmark; what's the point of that? Looking at all the benches, what do you know: the 5800 is faster than the G850 easily the vast majority of the time. Many times it's not even close, or the 5800 comes close to doubling the G850's speed (3dsmax, page 4, lol).

    Usually the way it works in computers is, better performing parts cost more.

    Usually the Trinity parts compare well with the similarly priced Intel part, which is the i3-3220. Case closed.

    I don't even necessarily disagree with you that it might be better to spend a few more bucks on a G850 + discrete card combo if you're a gamer (and I'd need to see actual benchmarks with that combo to be sure); just pointing out a few things. Then again, if you're a gamer you really shouldn't spend less than $700 on a rig anyway; anything less and you're buying outside the sweet spot, and you just bought a piece of crap. The way PC gaming works, there's a very well defined "sweet spot", and buying above or below that spot is usually stupid for gaming.
  • ac2 - Tuesday, October 02, 2012 - link

    Well, I guess the reading comprehension is no better...
    I said "the G850 costs a little more than HALF the A10 suggested price".
    69.90 is a little more than half of 122, i.e. 61.
    Unless we start quibbling over how much a 'little more' is...

    Let's look at the 3dsmax scores on pg 4 you talk about:
    A10 - 11.5
    A8 - 11.1
    Pentium - 8.6 (double of which is 17.2, but anyway)

    My elementary maths tells me it's a 33% gain, which is generally the A10's gain over a CPU that costs 42% less... And that too only on the embarrassingly parallel tasks...
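ac2's arithmetic is easy to double-check; a quick sketch using only the prices and 3dsmax scores quoted in this thread:

```python
a10_price, g850_price = 122.00, 69.90   # suggested prices quoted above
a10_score, pentium_score = 11.5, 8.6    # 3dsmax results from pg 4

gain = (a10_score / pentium_score - 1) * 100   # A10's performance gain
discount = (1 - g850_price / a10_price) * 100  # how much less the G850 costs
print(f"A10 is {gain:.1f}% faster; G850 costs {discount:.1f}% less")
# A10 is 33.7% faster; G850 costs 42.7% less
```

Which matches the figures above: roughly a 33% performance gain over a chip that costs roughly 42% less.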
  • ac2 - Tuesday, October 02, 2012 - link

    Oh and Anand has benched G850 + discrete cards, see here:


    Looks 'good enough' for me...

    And I guess I'm just frustrated with AMD for completely dropping the value for money ball here, except for some highly threaded integer tasks...
  • eanazag - Tuesday, October 02, 2012 - link

    Remember that once you add a discrete part for GPU the power benefits disappear.
  • Roland00Address - Tuesday, October 02, 2012 - link

    If you are adding a discrete card, do not get the A10; get the Athlon X4 750K (remember, it is unlocked) for $81. It is a Trinity FM2 processor, but it does not have the iGPU (which you don't care about, since you are adding a dedicated card).

    It runs 600MHz lower at stock, and 400MHz lower on turbo, but guess what: you have an unlocked multiplier, so you can easily set it to the same frequency as the A10.

    It does cost 15 dollars more (the G850 goes for $66, the Athlon X4 750K goes for $81), but you will get much better multithreaded performance than the Pentium (in exchange for higher load power consumption).

    Some games are going to 4 cores (such as Battlefield 3), so the Athlon X4 750K will be better for gaming.
  • rarson - Wednesday, October 03, 2012 - link

    Yeah, because single-threaded performance is so forward-thinking. Intel excels at what everyone is trying to move away from, great.
  • amd4evernever - Sunday, October 14, 2012 - link


    choke on this..... read first before trolling, you Intel lunatic.
  • Spunjji - Tuesday, October 02, 2012 - link

    Agreed. Ignoring all troll comments that followed this post...
  • C1377 - Friday, July 05, 2013 - link

    Just purchased the A10-5800K with a motherboard for a net cost of $140. Can't beat that price... The over/underclock ability is the ace in the hole, as is power consumption when gaming (compared to a discrete graphics solution). For those of us who don't need to game at resolutions higher than 1080p, this setup is great.

    Beyond that, all I need in performance is the ability to encode DVDs faster than my DVD burner, and to handle file compression... Most of us do work on PCs purchased by our company, and at home we do things like Photoshop and DVD authoring. Sometimes I think you guys get so engrossed in the technicalities that you forget the real world scenarios. How much time am I going to save by the "huge phenomenal difference" of 77 vs 70 fps on a DVD encode? I am still going to spend 15-20 minutes burning the DVD, so is a processor which costs an extra $50 worth shaving my DVD creation time from 45 min down to 42.06783 min? Is this really a "huge phenomenal difference"? ...only on benchmarks, not in the real world.
  • gamoniac - Tuesday, October 02, 2012 - link

    Anyone who still has not switched to Intel and has one of AMD's previous high-end chips (e.g. Phenom II X6) most likely has a high-end graphics card as well, so it makes little sense to switch to a Trinity APU unless one is building a new system or HTPC. I think this fact is hurting AMD, too.

    Besides that, with AMD having to pitch their top Trinity APU against Intel's i3 chips, I think it is time for AMD to stop any artificial market segmentation on their side. Instead, just make as many high-end APUs as they can for each TDP bracket and sell them at lower prices. The economies of scale might help bring the A10-5800K down to $100; then they would stand a chance against Intel until Excavator cores are out in a couple of years. I don't know... Just a thought. It's painful to see how far behind the single-threaded performance is at this point, and Haswell is not even out yet.
  • fteoath64 - Tuesday, October 02, 2012 - link

    Agreed! AMD should be selling many variants of this APU at different prices. In fact, since Trinity is so highly clocked, it makes Llano look better. I am gunning for an A8-3870 Llano, which is just a tad slower than the A10-5800, but if Llano can be clocked to 3.3GHz or 3.5GHz then it would be a real cracker for the price, yet it idles at just as low a power consumption. To me, AMD has not made progress, since Trinity is so highly clocked compared to Llano. If a Llano hit 3.8GHz, it would blow away the top-end A10 chip, which is an embarrassment. We need to compare frequency as well to measure clock-for-clock improvements (if any; I can see very few). So this is why Trinity was artificially delayed: to get the Llano parts moving. I can see these Llano parts moving quicker now that the benches are out for all to see.
  • Medallish - Tuesday, October 02, 2012 - link

    I don't get your logic; Llano will sell better now that Trinity is out and it's better? No one really cares about clock-for-clock; there are three major factors here: price, power, and performance.
    The A10-5800K comes in at around the same price and performs better than the Llano A8-3870 in pretty much every way.
    The A10-5800K uses on average less power than the A8-3870, while delivering better performance.

    If I buy an APU, it's usually based on two factors, CPU and GPU performance; on both of these it's better than Llano, while staying within the price and power of the previous Llano flagship.

    Let me put it another way: if Haswell turned out to need 4GHz to achieve the same speed as a 3GHz Ivy Bridge, but it cost the same and used 20% less power, would you care at all?

    My desktop is still running a Phenom II; I plan on upgrading to Vishera once it's released. No one expected Trinity to be an upgrade for existing enthusiasts on AMD systems, unless they're running something even older and just want something average. Where I plan on using Trinity is in an HTPC. If I wanted a workstation, I think Trinity would make a lot of sense; it's a solid all-rounder for workspace tasks. For my gaming PC I would obviously stick with a higher-end desktop configuration, but that's not to say Trinity can't work; we did see gaming reviews last week using a dedicated 7870 that still kept up once the graphics details were pushed.
  • mikato - Wednesday, October 03, 2012 - link

    "...unless one is building a new system..."

    Uh, isn't this 90%+ or something of CPU purchases? Actually building a new system (for my parents) and maybe a new HTPC (for me) is exactly why I'm interested.
  • Paulman - Tuesday, October 02, 2012 - link

    I just realized that an emerging use case for gamers is live-streaming (or even Fraps'ing) themselves while they play. I'd imagine that's a use case that makes more use of multiple cores, since most games these days only really stress two or so cores?

    What do you think about eventually introducing a live streaming / video encoding benchmark to represent that use case? (I'm most familiar with Xsplit, as that's what most of the gamers and broadcasters I watch [on Twitch.tv] use)
  • elhoboloco - Tuesday, October 02, 2012 - link

    You should try FFSplit. It's so much easier!
  • Ryan Smith - Tuesday, October 02, 2012 - link

    It's worth noting that the moment someone introduces a live streamer with support for Intel's QuickSync and AMD's VCE video encoders, such a benchmark would be made redundant. Those fixed function encoders are designed in large part around real time encoding and completely offload the process, so bogging down the CPU to do real time encoding has effectively been rendered obsolete.
  • Spunjji - Tuesday, October 02, 2012 - link

    That seems to be a big "if" rather than a matter of "when", though, given the patchy support that's been forthcoming for QuickSync so far! So possibly a valid avenue of investigation anyway. :)
  • eBombzor - Tuesday, October 02, 2012 - link

    Am I missing something in the benchmarks? Tom's did a CPU comparison with the 2100 and the 8120 (which isn't a whole lot different from the 8150), and the 8120 is near the Phenom CPUs gaming-wise.
    Something is not right here; the 2100 dominated the 8120 in Tom's benchies, so the 3220 should be better.
  • Ryan Smith - Tuesday, October 02, 2012 - link

    Just thumbing through Tom's article, it looks like they're using 1920x1080 with high quality settings (GPU-limited settings) while we're mostly using 1024 and 1680 in order to ensure we're CPU-limited.
  • Rezurecta - Wednesday, October 03, 2012 - link

    Who cares about CPU limiting? You're not going to play a game at 1024. 1680 might be valid, but why not show benchmarks at 1920? It just doesn't make sense to leave out a benchmark at the resolution so many readers actually use.

    It could be a very misleading benchmark for a substantial amount of readers.
  • CeriseCogburn - Tuesday, October 09, 2012 - link

    But it makes AMD look better, so it's awesome, and irresistible.
  • Rand - Tuesday, October 02, 2012 - link

    Why was the overclocking test done on Windows 8 (image shows Win8), while the performance testing was done on Windows 7 (test setup lists Win7)?
  • nofumble62 - Tuesday, October 02, 2012 - link

    This Trinity's performance didn't beat the i3, let alone the i5.

    AMD's statement "i5 performance at i3 price" is a total lie.
  • ac2 - Tuesday, October 02, 2012 - link

    It's only true for heavily threaded integer work and AES...

    But yeah, disappointing...
  • Taft12 - Tuesday, October 02, 2012 - link

    ... and gaming on the integrated GPU
  • MySchizoBuddy - Tuesday, October 02, 2012 - link

    Legit Reviews states that AMD advised them to disable turbo mode, or else it will throttle the overclock. They were able to overclock it to 4.6GHz with full stability using a larger cooler.

  • MySchizoBuddy - Tuesday, October 02, 2012 - link

    With a Corsair H100 water cooler they were able to overclock it to 5GHz stable.
  • mikato - Wednesday, October 03, 2012 - link

    Sounds legit.
  • MySchizoBuddy - Tuesday, October 02, 2012 - link

    Can the GPU be overclocked, or is it just the CPU?
  • Medallish - Tuesday, October 02, 2012 - link

    AMD did a preview where they showed off an overclocked APU with the GPU pushed up to 1GHz, so I'm pretty sure it should be possible, but it might be a feature exclusive to some motherboards or chipsets.
  • Doby - Tuesday, October 02, 2012 - link

    I don't understand why load power is treated as more important than idle. I don't know about most people, but I don't turn off any of the 3 desktop/media center PCs I have in the house, and I do know the majority of the time they sit idle. I haven't done the math, but I would bet quite a bit of money that overall I will spend less on my power bill by having lower idle draw than by having lower at-load power draw.

    I could see the issue on portable machines running off battery, but even then running full out is unlikely, and it becomes a "hurry up and wait" scenario that probably requires better analysis.

    I feel like benchmark performance is a bit overrated; we need to see value the consumer can leverage. I want to know which one "feels" faster, and the likelihood of running into application issues such as graphics drivers, large displays, or even stability problems. I want to know what I can recommend to my parents for a basic internet PC. Even if I were doing video transcoding, I'm not sitting around waiting for it; if it takes more than 30 seconds I'm up and not waiting for it, so it might as well be 5 min.

    I know it's easier to just post a bunch of benchmarks, and I do still like to see them. But let's progress computing to the next level: user experience. I'm fine with a "doesn't make a difference" answer too; even that's better than "12% faster for a process that you won't wait for anyway".

    Don't get me wrong, I enjoy the article, and appreciate the write up. Just looking for a bit different education.
  • Visual - Tuesday, October 02, 2012 - link

    I don't worry about power bills from running 24/7 idle. I have enough other devices at home that a PC wouldn't make a dent either way. But I do prefer that the computer's cooling not be heard throughout the apartment.

    Power use under max load is important in selecting the case and cooling system, etc. You may load it that much only for a few minutes, but your setup still needs to handle that "worst case" situation. And if you are going for a quiet setup, it would be no comfort for you if it is quiet when not used, but racks up the noise and begins to bother you when you start using it.
  • halbhh2 - Tuesday, October 02, 2012 - link

    Great questions. We can see that the idle power draw of the A10 is good, so it would make a good choice for typical use (which is 80-85% idle) in a typical-use laptop, and it would obviously do well playing many games, potentially at a reasonable cost (depending on the OEM, like HP). So, just like you, I wonder how long the typical battery run time is. That's how I bought my year-old HP laptop: just typical battery run time, a good display, and a good price. I knew enough to know those were the parameters that would matter for our laptop, and that it would be okay at occasional demanding use, just as you describe; when something takes more than 20-30 seconds, you are off doing something else anyway.
  • Hubb1e - Tuesday, October 02, 2012 - link

    Doby, I agree with you. I think it really comes down to a yes-or-no question on each use case. Benchmarks are nice, but i3s and A10s are really not enthusiast-level CPUs, so a 12% single-threaded advantage doesn't matter in the long run. The question to ask is "Can it run my applications?"

    And in the case of Trinity vs i3, the main difference is that it can play games while the i3 with HD2500 graphics can't. If you look at Diablo 3, that game was played by a lot of people who are not traditionally PC gamers. An i3 with HD2500 is barely playable, while Trinity is a pleasure to use in Diablo 3. For most PC buyers I think Trinity makes a lot of sense: idle power matters most, CPU performance is competitive, and gaming is possible. People who buy off-the-shelf PCs are not that comfortable putting a GPU in their rigs, so Trinity is a good option for a general use PC. Enthusiasts who aren't AMD fanatics will stay away from this chip, and that's fine. It will be a sales success for AMD if they can make enough profit on it. It is a pretty big chip...
  • Roland00Address - Tuesday, October 02, 2012 - link

    The reason why idle is unimportant for desktop is that both companies processors idle at such an insignificant amount.

    The Amd a10 idles 7 watts lower than the i3 and i5.

    1 watt used 24 hours a day 7 days a week is 8.76 kWh thus we are talking about 61.32 kWh (7*8.76) a year. The cost per kWh is different in differnt places in the US but it is about 10 cents a kWh so we are talking about 6 dollars more a less in energy savings a year.

    In other words, idle power is a virtually insignificant number for desktops. For laptops, on the other hand, it is a big deal, since idle power use affects battery life.
  • phoenix_rizzen - Tuesday, October 02, 2012 - link

    If it's 8.76 kWh per week, should you be multiplying by 52 weeks in a year? Thus 455.52 kWh per year? Reply
  • Roland00Address - Tuesday, October 02, 2012 - link

    1 watt *1000 hours equals 1 Kilowatt hour which is abbreviated 1 kWh

    24 hours a day * 365 days per year equals 8760 hours, so 1 watt for a full year is 8.76 kWh

    I should have said 24 hours a day, 7 days a week, 52 weeks a year. I apologize for leaving off the 52 weeks a year part it was a slip of the tongue.
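The arithmetic above is easy to sanity-check. A minimal sketch, using the ~7 W idle delta and ~10 cents/kWh figures from the thread:

```python
# Annual cost of a constant 7 W idle-power difference at roughly
# $0.10/kWh, following the arithmetic in the comments above.
watts_delta = 7                  # A10 idles ~7 W lower than the i3/i5
hours_per_year = 24 * 365        # 8760 hours
kwh_per_year = watts_delta * hours_per_year / 1000   # 61.32 kWh
annual_cost = kwh_per_year * 0.10                    # ~10 cents per kWh

print(f"{kwh_per_year:.2f} kWh/year, about ${annual_cost:.2f}/year")
```

This lands at about $6 a year, which is the "virtually insignificant" number for desktops mentioned above.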
  • KAlmquist - Tuesday, October 02, 2012 - link

    Good point. By my calculations, the break-even point between the AMD A10 and the Intel i3 occurs when your workload has the A10 busy 14.5% of the time and the i3 busy 16.5% of the time. That assumes the computation you are doing is similar to the second pass of x264 HD; the numbers might be different with a different workload. I do know that my computer is busy much less than 15% of the time. Right now, for example, I am using the computer to enter this comment, and the CPU is basically idle.

    Of course Visual is right that power use under max load does matter even if your system is idle most of the time. But after seeing how Bulldozer fared against Sandy Bridge, I expected Ivy Bridge to crush Piledriver (the Trinity CPU) in power consumption. You can argue that Ivy Bridge still wins, but it is a big surprise (at least to me) that it is a close call.
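For anyone wanting to redo that break-even estimate with their own numbers, here is a rough sketch. The idle/load wattages below are illustrative assumptions, not the review's measurements; the 16.5/14.5 ratio comes from the comment above:

```python
# Break-even duty cycle between two chips: one idles lower (the A10),
# one draws less under load (the i3). All wattages are made up.
P_IDLE_A10, P_LOAD_A10 = 40.0, 140.0   # watts (illustrative)
P_IDLE_I3,  P_LOAD_I3  = 47.0, 100.0   # watts (illustrative)
SPEED_RATIO = 16.5 / 14.5  # i3 stays busy ~14% longer for the same x264 job

# Average draw if busy a fraction d of the time: idle*(1-d) + load*d.
# Setting avg_a10(d) == avg_i3(d * SPEED_RATIO) and solving for d:
d_break = (P_IDLE_I3 - P_IDLE_A10) / (
    (P_LOAD_A10 - P_IDLE_A10) - (P_LOAD_I3 - P_IDLE_I3) * SPEED_RATIO
)
print(f"A10 busy {d_break:.1%} of the time, i3 busy {d_break * SPEED_RATIO:.1%}")
```

Below the break-even duty cycle the A10's lower idle draw wins on total energy; above it, the i3's lower load draw wins.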
  • bgt - Tuesday, October 02, 2012 - link

    I also believe the experience is important. Since I use both Intel and AMD PCs, I often find the bench numbers a bit meaningless. Intel CPUs may be fast at small single-threaded jobs, but AMD is often better at big workloads. When the A10-5800 is out here I will compare it to my 3225, used in an HTPC. I already noticed a difference in graphics output between my 3850 and 3225 when watching an HD movie on my TV: the screen looks a bit hazy and foggy when the 3225 is used, compared to the 3850. Contrast is not as nice, and deep black is not really black. Reply
  • nevertell - Tuesday, October 02, 2012 - link

    It would've been really great if you had included Phenom II and Athlon II CPUs, to see whether or not it is reasonable for someone owning those systems to upgrade to the new APUs. I still think these CPUs are relevant to the general consumer, as long as they are in stock. Reply
  • Medallish - Tuesday, October 02, 2012 - link

    I still run on my Phenom II X4 965@ 3.8GHz, it's a wonderful CPU, although I suspect it will be replaced by Vishera soon. Reply
  • Mickatroid - Tuesday, October 02, 2012 - link

    Me too. A C3 stepping at 3.9 with a modest overvolt. I am planning a mini-ITX Trinity for my new workshop computer, in a custom box with a car air filter (my Athlon XP is well past it).

    The good news is that for basic stuff (and more besides) Trinity offers all the computing power I could ask for.
  • Roland00Address - Tuesday, October 02, 2012 - link

    Click bench in the upper corner of the website and enter the two processors you want to compare.

    The A10-5800K gets roughly the same performance as the Phenom II X4 965
  • mikato - Wednesday, October 03, 2012 - link

    Wow! Glad you mentioned this. I have a Phenom II X4 965 and plan to keep it for quite a while yet; it's as fast as anything (that I would notice). With that in mind, an A10-5800K would be a beast for my parents' build or an HTPC. Better yet a 5700 if I can get one. Reply
  • owlxp - Tuesday, October 02, 2012 - link

    Why even do the comparison with high-end discrete cards? Everyone knew what those results were going to look like. We've known for months. Despite the "i5 performance at i3 prices" marketing, AMD can't compete with the better Intel CPUs... but that's not where the value is. AMD dominates the low-end market and the bang for buck. Why is there no review of the max discrete card that can be used in hybrid Crossfire setups? What about some tests to see if Trinity can do multi-discrete + on-die GPU triple and quad Crossfire? That's what I'd like to see. There were some reviews comparing the Radeon 7750 in Crossfire to NVIDIA's 640, 650, and 660, and the dual 7750 setup was winning the 1080p matchup with 4x and 8x AA enabled. If the A10 + 7750 can put out similar results, that's going to be an easy way to capture budget and midrange gamers. Especially if triple and quad hybrid Crossfire setups are possible.

    Where is the value test and the chart for the hybrid xfire set ups? That's what I want to see.
  • meloz - Tuesday, October 02, 2012 - link

    >AMD can't compete with the better intel CPU's............but that's not where the value is.

    If there's no value there, please alert Intel. They didn't get the memo, nor did the consumers. In spite of making CPUs with "poor value", Intel outsell AMD 9:1 _and_ make record profits quarter after quarter.

    >AMD dominates the low end market and the bang for buck.

    And in the process AMD make losses, quarter after quarter.

    No one wants to "dominate" the low end, broseph. Low end is where you naturally end up if you are not good enough to compete with the top and mid end.
  • SymphonyX7 - Tuesday, October 02, 2012 - link

    Idiotic assessment. On the GPU side, AMD completely dominates Nvidia from the low end all the way up to just below the high end. They may not have the fastest of the fastest, but volume-wise they do sell more, simply because they beat the competition in performance for the same dollar. Intel's business practices, particularly with OEMs, have something to do with why Intel sells more at every price point. Reply
  • meloz - Tuesday, October 02, 2012 - link

    I can only assume that the bold "Idiotic assessment" declaration was to describe the diarrhea that followed in the rest of your reply, because nothing about that made any sense or correlates with reality.

    Today we learnt: AMD "dominate" and outsell nvidia, although no one is quite sure what this fantastic fiction has to do with the discussion about CPUs.

    Today we learnt: Intel are somehow being shady with their OEM customers to sell more CPUs. And yet, these OEM customers prefer to deal with 'shady Intel' -and Intel continue to make record profits- in spite of the "fact" that AMD offers more performance / $.

    Those OEMs be crazy. The consumers be crazy. Erryone crazy, but pure and noble AMD.

    In your desperation to find anything positive about AMD -and all things negative about Intel and nvidia- you come across sounding like Comical Ali.
  • owlxp - Tuesday, October 02, 2012 - link

    I'm not trying to bash either company here. Facts are facts. Intel has more compute power and AMD is better on the GPU side of things. I'd just rather see something that pushes the limits of Trinity's benefits and see how it stacks up (dollar for dollar) to what Intel can offer. To not post the max allowable GPU for a hybrid Crossfire setup seems like a huge oversight to me. Strapping high-end discrete cards to each processor is a pretty useless test, IMO. It's no secret that Intel has been stomping AMD in sales; I never argued that. That doesn't mean the market can't shift. The average computer user might be OK with "good enough computing / entry to mid-level gaming." It seems like that is what AMD is gambling on. If the number of power users declines and the low-end market grows, AMD stands to do very well. All speculation of course, but it appears this is AMD's strategy. The fact that it wasn't the right approach for the past few years is irrelevant. Trinity now gives AMD something Intel cannot match (dollar for dollar). It's now in the hands of the consumers. Reply
  • Byte - Tuesday, October 02, 2012 - link

    Me and a lot of buddies still have first-gen i7s and such. How does Trinity compare to them? Any sites do a lot of benchmarks with older processors for comparison? Reply
  • Penti - Tuesday, October 02, 2012 - link

    There is an i7 860 in the benchmarks, and yes, a 2009 CPU still has about the same power as a modern mainstream desktop processor. In that way things have stagnated, but at least AMD tries to make architectural improvements every year now. Let's see how that goes, but they need to bring in some talent. K8 was around for more than four years without any real architectural changes. K10 was around for about 4 years too.

    It's also a good thing that you don't need to buy a new rig just for the performance every few years now. A three-year-old CPU still has about the same power as mainstream today, so as far as gaming goes it's mostly about the GPU. All that has been introduced since 2009 is basically SATA 3 / 6Gbps, PCIe 3.0, and USB 3.0, two of which you could add to a system with an add-in board. We are still on DDR3. It will take some time before we see any major increases. Most progress has come in the mobile/notebook form factor.

    AMD gets GCN integrated graphics next. That will have to wait for Kabini/Temash though, and HSA has to wait for Steamroller/Excavator. I think they set the bar a little low with 10-15% performance increases each year; that might be fine, but they also need to leap to something truly new in performance before that is a satisfactory gradual increase. Reply
  • abianand - Tuesday, October 02, 2012 - link

    how about some gaming tests at 1920x and 1440x?

    after all, there are a lot of ppl playing at these resolutions - including me at 1920x.
  • Kevin G - Tuesday, October 02, 2012 - link

    With a discrete GPU, at 1920 x 1080 you'll generally be limited by the discrete GPU. There are exceptions depending on what game/video card is used but there wouldn't be much to gain by testing this.

    Testing at 1920 x 1080 using the integrated graphics with modern games would be painful. It doesn't matter if the integrated graphics were from Intel or AMD, you'd get a slide show due to the lack of memory bandwidth on these parts. The only hope for integrated graphics to pick up performance would be using eDRAM to side step the bandwidth issue a bit.
  • abianand - Tuesday, October 02, 2012 - link

    you're right...playing at 1920x with any IGP leads to non-smooth gameplay in a few games.

    However, if someone needs the 4 integer cores and is also looking at the A8 and A10 chips as possible cheap/value gaming chips - they are just $130 - perhaps they need to know how it performs at that resolution.

    About the 1440x and the 16xx resolutions, I am from India and here many still use the 'lower' resolutions.

    I am just saying that that extra information would have been useful, is all.
  • Roland00Address - Tuesday, October 02, 2012 - link

    If you are a cheap gamer adding a discrete card, get the Athlon X4 750K, which is the Trinity CPU sans the integrated graphics and has an expected price of $81. It also has an unlocked multiplier.

    Use the $40 savings to get a bigger graphics card or a larger SSD.
  • abianand - Tuesday, October 02, 2012 - link

    yes, that's a good suggestion ! Reply
  • ditroia - Tuesday, October 02, 2012 - link

    Hi, does anyone know the highest-end Radeon GPU I can use to Crossfire with the 7-series GPU on the APU?

    Thanks in Advance

  • Roland00Address - Tuesday, October 02, 2012 - link

    Cards you can Crossfire with the integrated graphics are as follows.

    If you buy it yourself:
    6450, 6570, 6670

    If you get it prebuilt (these are the same GPUs as above, but with 1000 added to the model number, because that is what HP, Dell, and Acer convinced AMD to do; bigger numbers sell):
    7450, 7570, 7670

    That said, you might want to do your research on asymmetric Crossfire beforehand. Based on other websites' reviews that tested it, you are only going to get 30 to 60 percent scaling, and the 7700 series cards are usually faster than the asymmetric Crossfire.
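As a quick feel for what 30-60 percent scaling means in practice, here's a sketch; the 30 fps baseline is purely an assumption for illustration, not a benchmark result:

```python
# What "30 to 60 percent scaling" buys you on top of the IGP alone.
igp_fps = 30.0                     # assumed A10 integrated-graphics result
for scaling in (0.30, 0.60):
    hybrid_fps = igp_fps * (1 + scaling)
    print(f"{scaling:.0%} scaling -> {hybrid_fps:.0f} fps")
# A single discrete 7750-class card often beats both hybrid results outright.
```

So even the best-case hybrid result may trail a single midrange discrete card, which is the point made above.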
  • ditroia - Tuesday, October 02, 2012 - link

    Disappointing that you can't Crossfire with a discrete 7 series GPU, as I think that would have made a cheap but powerful gaming system.

  • mikato - Wednesday, October 03, 2012 - link

    Yeah I was really hoping to see a Crossfire with a lower end card as part of this review. Reply
  • Jamahl - Tuesday, October 02, 2012 - link

    Llano vs the 2105 last year -


    "Look at our single-threaded Cinebench scores below and you'll see a 50% performance advantage."

    "CPU bound gaming performance is also an area where the A8 falls behind the i3. Here you're looking at a 25 - 50% advantage for the i3-2100/2105"

    This is partly due to Ivy Bridge being so underwhelming.

    The only problem I see is with load power, which has gone backwards.
  • JKnows - Tuesday, October 02, 2012 - link

    Video Drivers: AMD Catalyst 12.3 ???

    Are you kidding? That driver doesn't even support the Trinity architecture...
  • aislanluiz - Tuesday, October 02, 2012 - link

    We’re power users, after all. We know how to cope with heat and noise; we can deal with a 100 W chip, even in an HTPC. But there’s no way to make the Core i3 look better unless you spring for an add-in card. AMD’s emphasis on balance makes the A10-5800K a better platform for more people than Intel’s closest competition. Reply
  • frozentundra123456 - Tuesday, October 02, 2012 - link

    Trinity may be attractive for HTPC, but I cannot imagine a "power user" who does not have a discrete card in a desktop, at least if you are at all into gaming. The IGPs are pathetic for gaming, and you get better performance with an Intel chip plus a discrete card. Reply
  • ericore - Tuesday, October 02, 2012 - link

    I don't think Anand has a single benchmark that uses an app from the AMD AppStore. Even Tom's Hardware was considerate enough to use Adobe CS6 products. I'm sorry, but this benchmark suite makes Intel look better than it should. The benchmarks are fully Intel-optimized, SYSmark is Intel-optimized, and not a single benchmark is AMD-optimized. Disappointed, Anand.

    The following review is much more fair:

  • Beenthere - Tuesday, October 02, 2012 - link

    The haters won't be happy, but it's great that AMD has delivered as promised with Trinity on the desktop in addition to laptops. Trinity desktop delivers performance and value for those looking for an all-in-one solution at an extremely affordable price, which is a growing segment of the market as people discover the huge cost savings compared to a discrete CPU/GPU setup. BTW, no one would ever consider Intel HD 2500 graphics usable, so the Trinity desktop APUs are literally in a class of their own for now.

    It's also worth noting that single-thread performance is not the holy grail. Most people running modern software will be perfectly happy with Trinity desktop performance, even if you crunch numbers now and then. Price- and performance-wise, AMD has delivered a winning solution when you look at performance vs. cost. This will force Intel to discount its i3/i5 products, which is good for all.

    Obviously Trinity desktop is NOT intended as a replacement for a high-end discrete CPU/GPU system. That will come in a few more years, but for now AMD continues to offer excellent performance and value, which is what mainstream consumers desire.
  • jamyryals - Tuesday, October 02, 2012 - link

    On the whole, mainstream consumers do not use desktop PCs. They use laptops. Reply
  • Beenthere - Tuesday, October 02, 2012 - link

    Sorry, but your beliefs on laptop vs. desktop usage are incorrect. While a lot of young people, sales people, etc. use laptops, the split is almost 50/50 according to a recent survey, meaning that ~50% of PC users still use desktops.

    That being said, Trinity laptop is by far the best choice for mainstream consumers who prefer a laptop over a desktop.

    Either way it's a win for AMD and consumers, on the desktop or in a laptop. Servers will be next, as the AMD APUs have shown excellent results for servers even though they were never intended for that.
  • silverblue - Tuesday, October 02, 2012 - link

    No, but Trinity sports a CPU architecture slanted towards servers. It's not such a strange idea in the end. Reply
  • BSMonitor - Tuesday, October 02, 2012 - link

    Could it simply be the priorities of the target customers that so greatly affect how Intel approaches the markets and SKUs?

    It seems like it would be very easy to grow the die and throw 24-32 EUs on the iGPU side of things. However, what portion of their vast corporate PC market really wants or could use that? Application developers, end users, small business PCs, etc., do not benefit from extra gaming performance or video/encryption performance. Seems to me the market for fast video encoding and high-FPS gaming is not this market. Those people buy horsepower, and then power usage and cost are less important. This group buys small footprint, low power consumption, and mass quantity.

    I think the times for the consumer retail PC space driving revenues are over. And Intel knows this. Retail markets are cheap, cheap laptops, tablets and smart phones..

    They cannot completely ignore it, but no one in that space buys "specs". Open Best Buy ad, go buy cheapest PC/laptop/tablet, have a nice day.

    AMD is not going to win over corporate markets with high-power-consumption, better-3D-performance APUs, just like they aren't going to win retail with more features/better specs.
  • mattlach - Tuesday, October 02, 2012 - link

    With IGPs like this, it won't be long before NVIDIA is in real trouble.

    Their high-end parts will still sell, but the volume shipments that keep the lights on are in the budget parts, and with better and better IGPs no one is going to need those anymore.

    I don't even know what kind of solution Nvidia may come up with for this. They can't design their own desktop/laptop CPU component (or at least this is highly unlikely). It also seems unlikely that they'd get bought up by Intel at this point.

    So what is left for them?

    Maybe a deal with Cyrix/Via/Centaur (or whatever they are called this week) or ARM or some other minor player?

    Exit the desktop GPU market all together and focus on their Tegra/ARM designs?

    I can think of a few ways this will go, and none of them are particularly good for the desktop video card market.
  • jwcalla - Tuesday, October 02, 2012 - link

    Hence why they got into the mobile market. NVIDIA had the foresight to see the changes in the market and got in on the ground floor. And while Tegra isn't exactly the best SoC out there, it's won some tablet orders and NVIDIA can always improve it as they go along.

    At least they're in the game.

    Their discrete GPUs will continue to have a presence in the HPC market, and with NVIDIA slapping an ARM co-processor on their GPUs (~2013-'14), I think they'll be even more competitive in that market in the future. And it also allows them to offer an attractive option for any potential console system down the line, or even HTPC devices.

    So I think their discrete video card business has to continue to be nurtured even if they lose a bit on the desktop side.
  • mattlach - Tuesday, October 02, 2012 - link

    I agree.

    I don't think Nvidia is in danger of going out of business, but they may be forced out of the desktop GPU market.

    Currently, they are able to spread the cost of R&D on a new architecture over low, medium and high end parts. (the chips are not always the same, but the base architecture, where most of the development work goes, is)

    If the low-end volume parts go away, it becomes tough to see how they continue to maintain profitability on the high-end GPUs.

    They will always have their mobile market, and it is a good market to be in, but I'm concerned we might be left with only one player in the high end GPU market.
  • mattlach - Tuesday, October 02, 2012 - link

    I'd be interested in building an HTPC around this platform, but I'd want a low power part.

    It's a little disappointing that there is no sub 65W part.

    A desktop version of a mobile trinity part would be perfect for this.
  • cjs150 - Tuesday, October 02, 2012 - link

    That was my thought as well. Intel has the i7-3770T: overpriced, way overpowered for an HTPC (and yes, I bought one!), but with a TDP of 45W.

    AMD needs an A10-class chip with a TDP of 35W or less. The move down to a 22nm-class fab cannot come fast enough
  • jwcalla - Tuesday, October 02, 2012 - link

    It really depends on the intended usage of the HTPC. I'm not sure what the deal is with Hulu and Netflix (i.e., if they're CPU bound), but if your usage pattern mostly centers around hardware video decoding, then most of these chips are going to be well overdone.

    A simple Zotac Zbox w/ Celeron + NVIDIA ION is going to be more than enough (as long as you don't have Hi10p content), and for video decode VDPAU is far more mature than anything I've seen from Intel / AMD at this point.

    And within 6 months I'd be looking into ARM-based solutions which should be even cheaper, smaller, quieter and cheaper to upgrade (e.g., when H.265 comes out). (Unless one insists on using non-ARM ported software, like maybe WMC.)
  • Hubb1e - Tuesday, October 02, 2012 - link

    Atom falls down with Netflix and idles around the same point as these chips. Reply
  • mattlach - Thursday, October 04, 2012 - link

    My current AMD E-350 can not handle Netflix HD, which is disappointing.

    The way Netflix has implemented DRM in Silverlight, hardware decode acceleration doesn't work on any system, so it all hits the CPU, and many lower end systems (Atom, E-350) just can't handle it above SD resolutions.

    Everything else I've tried (YouTube, custom-encoded video files, etc.; have not tried Hulu though) works just fine, as the on-chip GPU offloads the CPU and they play fine, but Netflix HD chokes, and chokes badly.

    Thus the need for a low end, power efficient Trinity setup.

    I would love one of those mobile Trinity chips in a desktop FM2 package. A 35W TDP Trinity or lower would be perfect for my HTPC needs.
  • stimudent - Tuesday, October 02, 2012 - link

    We need to appreciate and thank anandtech.com for being professional about a staged release. This is in stark contrast to the cry-baby journalism TechReport.com engaged in. Reply
  • Pythias - Tuesday, October 02, 2012 - link

    "Professional" is a polite euphemism for prostitute in some circles. Reply
  • Pythias - Tuesday, October 02, 2012 - link

    Why did you remove the Pentium from the discrete gaming chart? Reply
  • Hubb1e - Tuesday, October 02, 2012 - link

    Because he wanted to keep the forums ripe for the trolls who think a Celeron plus a 6670 are faster than Trinity. Reply
  • Pythias - Wednesday, October 03, 2012 - link

    Seems rather dodgy. Reply
  • HisDivineOrder - Tuesday, October 02, 2012 - link

    Hopefully, one of these days, Intel will be bothered enough by these APUs that they release an i3 with Turbo Boost and an unlocked multiplier, with the higher-end version of the integrated GPU.

    Price it right at the high end of AMD's APU lineup, bam, the whole thing is dead in the water. I suspect Intel wants AMD to seem like they're competition to keep the regulators away, so they're holding back on the obvious killshot.
  • mikato - Wednesday, October 03, 2012 - link

    It would somewhat kill their higher priced CPUs also though. Reply
  • eanazag - Tuesday, October 02, 2012 - link

    I wasn't expecting AMD to close the performance or power-usage gap with Intel. I was concerned that it might not even beat Llano CPU performance consistently. I feel comfortable with the CPU performance. I think at even lower pricing AMD could dominate the low-end market. I have a Core i5 860 and a Core i3 Arrandale (same time frame). I certainly noticed the 860 appearing in the benchmark numbers. The A10 Trinity is not too far off the performance of my 860, especially when price is factored in. And against the FX-8150 it has pretty good performance, which is a decent sign for the higher-end AMD desktop parts coming soon (can't really call them high-end parts).

    Idle power usage is excellent. Power usage can make sense for AMD if you are considering a cheap Intel proc and a discrete card. I think in that situation AMD makes more sense.

    Drivers and support go to AMD on both sides of software compatibility (AMD drivers and game support).

    SATA 6Gbps on Intel with just two built-in ports still upsets me. This should be an advantage for AMD at the platform level.

    I want to see the desktop and server chip data next. I am glad that I don't have to rule out buying the AMD parts from the get go. The buying decision will still be that.

    I still believe AMD should make a 200W part combining their FX proc with a 7700/7800-range GPU. I think they could dominate the midrange with that, and who could compete then? From a price and gaming performance standpoint at least. Power and heat still make sense considering the removal of that level of GPU from the case. Hell, I could deal with 250W and be happy also.
  • creed3020 - Tuesday, October 02, 2012 - link

    I would really appreciate a test of Trinity similar to what was done for Llano regarding GPU performance vs. memory speed (http://www.anandtech.com/show/4476/amd-a83850-revi... I am curious if the trend has remained the same, improved, or decreased.

    I am in the process of building a new office PC for family whose needs are basic, and Trinity fits the bill quite well, especially the A8-5500 or A6-5400K. I want to purchase memory that complements the GPU well.

    On another GPU note, I find it strange that there was no test of AMD Radeon Dual Graphics (http://www.amd.com/us/products/technologies/dual-g... as that is the native GPU scaling option for this APU, not a high-end discrete GPU. The latter usage scenario just doesn't seem that common considering the target market for the APU.
  • Hubb1e - Tuesday, October 02, 2012 - link

    Go with 1866 or 1600. 1866 gives about 5% faster GPU performance, if that matters much in your use case. Reply
  • creed3020 - Wednesday, October 03, 2012 - link

    My current HTPC uses an A8-3850 with 1866 memory, so I am aware of the benefits; my question is more about understanding this phenomenon with Trinity. I am curious if it has become less important or perhaps even more so.

    I'm not gaming on my APU so there is no concern to squeeze every drop of FPS out of the GPU. I am more curious from a research and review standpoint.
  • Moizy - Tuesday, October 02, 2012 - link

    Anand, you mentioned several times that Trinity holds the integrated graphics and overclocking advantage, while Intel holds the single-threaded and power consumption advantage. To me, though, the A10-5700 attempts to address the power consumption disadvantage by offering a lower TDP without cutting the clocks down too much (while sacrificing overclockability).

    Throwing in the A10-5700 at some point in the future, assuming you can get your hands on one, would provide an interesting comparison for those interested in Trinity's gpu and competitive power consumption.
  • ewilliams28 - Tuesday, October 02, 2012 - link

    Apologies if it's been covered, but I would like to know exactly which cards work in this mode. I have heard that if you go too high they don't work together. It's my understanding that the 7670 is OEM-only, and I can't believe that the 6670 I can buy is still the best I can do. I plan to use 1080p, since 1920x1200 has basically gone the way of the dodo bird, but I do like to crank up the settings. Luckily the most complicated game I play is World of Warcraft. I will probably fold with this box though.

  • creed3020 - Tuesday, October 02, 2012 - link

    http://www.amd.com/us/products/technologies/dual-g... Reply
  • halbhh2 - Tuesday, October 02, 2012 - link

    Overall, a Trinity laptop would do fine during idle, which *is about 85%* of what 90% of laptops do when they are on.

    That matters.

    So, an interesting test for real-world use for *most* consumers (wife, kids, most of the people most of us know) would be a run-time battery life test for leaving the computer on, and surfing to 25 web pages, and playing a couple of modest games for 45 minutes, and then watching a streamed movie.

    That would be real world use for 90% of laptops.

    In view of that, for people that aren't using their laptop in a demanding way, a good question is how much does it cost, and how long does it run until you need to plug in. That's all.
  • jfelano - Tuesday, October 02, 2012 - link

    Why does AnandTech still use 1280x1024 and 1680x1050 as their benchmark resolutions? Is this still 2008? Reply
  • Beenthere - Tuesday, October 02, 2012 - link

    Answer: Because most people still use these screen resolutions. This review is for a desktop APU, thus the appropriate screen resolutions. Reply
  • silverblue - Tuesday, October 02, 2012 - link

    Hehe... you try playing anything remotely recent at a higher resolution on an IGP... Reply
  • mattgmann - Tuesday, October 02, 2012 - link

    I still don't understand where these CPUs fit in the market. Sure, this line of CPUs has made some advances, but the features it relies on to succeed aren't good enough in real world applications.

    1. Gaming. It's still not fast enough to run modern games. Not to argue, but lowest possible settings and super low frame rates aren't good enough.

    2. Content creation. At certain points in your day you may run a short program that's optimized to work well. But the rest of the day, you'll be wishing you had a quad-core Intel.

    3. Casual home/office use. A pure waste of electricity. The Intel chip decodes HD video fine and provides a quicker user experience at a much lower energy cost (and is dead silent).

    The "upgrade path" argument doesn't make a ton of sense either. In reality, not many people actually follow upgrade paths on platforms, because, in reality, you end up spending today's prices for yesterdays technology. Low end systems just aren't meant to be upgraded; they're meant to be replaced.

    I REALLY want AMD to give Intel a kick in the butt. I miss the days of low-end, unlocked Intel processors. Think of what those little i3s could probably do with unlocked multipliers, 1.4V vcore, and some fast memory!

    At least this is a step in the right direction for amd....sort of
  • Hubb1e - Tuesday, October 02, 2012 - link

    I'm sorry but your answers to your own question are inaccurate.

    1. Gaming. For many people who are not on this forum, low/medium settings are fine, and older games are cheaper to buy and still fun. WoW and Diablo play very nicely on this APU at medium/high settings, and that is where the vast majority of casual gamers are buying.

    2. Content creation. If you are wishing you had a quad-core Intel, then you're in need of a real workstation, not an i3 competitor. I work fine on a mobile i5-540 at 2.5GHz. 90% of the time it is idle.

    3. Casual home/office use is all idle, and if you go back and look, Trinity idles lower than Ivy, so I don't see your point about it being a waste of electricity. A quicker user experience is about the SSD, not the CPU. Users will not notice a 12% difference in CPU performance.

    4. Agree with the upgrade path, though as a builder for my family, FM1 being a dead end made it a socket I didn't want to touch.
  • mikato - Wednesday, October 03, 2012 - link

    Well said. I would also like to point out that Angry Birds and Words With Friends are also "modern games". Reply
  • vegemeister - Wednesday, October 03, 2012 - link

    1. Gaming: Look at those benchmarks. Low settings, punk-ass resolution, no AA, and STILL DROPPING FRAMES.

    HTPC: there are two kinds of CPUs for HTPC: those that can decode 10-bit H.264 at 1920x1080 in real time, and those that cannot. Unfortunately, this review doesn't have benchmarks for that.
  • iTzSnypah - Tuesday, October 02, 2012 - link

    The A4-5300 looks promising for its price and intended use. I keep telling my brother that he really needs to upgrade his computer (an 8-year-old HP with a single-core AMD Athlon 64), and the A4-5300 looks like it would fit his needs perfectly. I get tired of going to his house and waiting 5 minutes to open the internet. Also, only being able to watch 360/480p (depending on the 'mood' of the computer) is beyond annoying. It's his birthday this month, so I might surprise him.
  • Hubb1e - Tuesday, October 02, 2012 - link

    I have a single-core Athlon 64 at 2.4GHz that works just fine. The problem is a lack of RAM, a slow hard drive, OS bloat, and a lack of GPU acceleration for YouTube. I have 1.5GB of memory and a good video card that offloads YouTube, and the single-core computer runs pretty well. I am constantly amazed at how well it works for casual use.

    But yeah, an upgrade could be in order, though I'd argue the Celeron G530 would be a better choice. Anand tests the Pentium, and it actually beats the A10 in some benches. The G530 is still a full dual-core CPU and is only a few hundred MHz slower. The A4 drops a whole module and looks pretty slow in benches on Tom's.
  • Jamahl - Tuesday, October 02, 2012 - link

    Where was the A4 benched on Tom's? From what I can tell, the A6-5400K is drawing very close to the 3870K in gaming. The A4 will be further behind, but it'll still be up with, say, triple-core A6-3500 performance, imo.
  • Ananke - Tuesday, October 02, 2012 - link

    These are great for OEMs, the 95% of the PC market :). You know, what people are buying from HP, Dell, Lenovo, etc.

    Enthusiasts will probably not be won over by Trinity, but enthusiasts are a very small market.
  • wenbo - Tuesday, October 02, 2012 - link

    Enthusiasts are a very small market, but they are very VERBAL :)
  • vegemeister - Tuesday, October 02, 2012 - link

    I see that you are planning to move to a newer version of x264 for benchmarking. Since direct comparisons are going to be invalidated anyway, why not go ahead and move to a CRF encode like everyone else not stuck in the last decade?

    2-pass does not compress any more effectively than 1-pass. The only reason to use it is to get very close to a particular file size. x264 is much better than you at deciding how many bits it needs for acceptable quality on a particular file. These days, most people store their video on media far larger than a single file. It no longer makes sense to benchmark the use case of squeezing as much quality as possible out of 700MB.
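    For anyone unfamiliar with the distinction, a minimal sketch of the two x264 CLI invocations being contrasted here (filenames are hypothetical, and the flags assume a reasonably recent x264 build):

    ```shell
    # 2-pass: targets an average bitrate (i.e. a file size) and needs two encodes.
    x264 --pass 1 --bitrate 1500 --preset medium -o /dev/null input.y4m
    x264 --pass 2 --bitrate 1500 --preset medium -o out_2pass.mkv input.y4m

    # CRF: a single pass at constant quality; the encoder decides how many bits it needs.
    x264 --crf 20 --preset medium -o out_crf.mkv input.y4m
    ```

    With CRF the output size is unpredictable but the quality is not, which is exactly the trade-off the comment above is arguing for.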
  • wenbo - Tuesday, October 02, 2012 - link

    I think for real enthusiasts, building a gaming PC is going to cost $750 to $850 plus whatever display you buy. AMD's FX-8150 seems really good; it currently sells for $189 less a $20 promotion. The combo is $502, + about $200 graphics + $100 SSD + a $30 fan gives an acceptable gaming PC for less than $850 (with some savings on power supply and case you get about $70 off, for a total of $780). An equivalent Intel i5, where you pay for non-usable HD graphics, is going to be about $50 to $70 more expensive (on the processor and motherboard).

    The difference on the processor is really not that much without much of a promotion on the A10-5800K: $169 vs. $122. The savings would be on graphics; the best card you can get is the HD 6670 (I think you can only Hybrid CrossFire with this one), which is $70 after rebate. So the difference is about $180, and the motherboard is cheaper at $80. That means you have an entry-level gaming PC for a little more than $600 ($607 according to the above numbers).

    With a similar configuration, an Intel i3-2100 is $119.99 at Newegg + $129.99 motherboard + $70 graphics + $150 SSD + $50 case + $50 power supply + $55 memory, which gives you $624.98 (you don't need a fan, because you CANNOT overclock i3s).
  • wenbo - Tuesday, October 02, 2012 - link

    It would be good to know the price of each of the PC builds that were benchmarked.
  • owlxp - Tuesday, October 02, 2012 - link

    Your math seems a bit off here:

    i3-2100 set up:
    cpu - 120
    mobo - 130
    gpu - 70
    ssd - 150
    case - 50
    psu - 50
    memory - 55
    total = 625

    trinity 5800k set up:
    apu - 130
    mobo - 60
    ssd - 150
    case - 50
    psu - 50
    memory - 55
    total = 495

    If we're talking entry-level gaming... I'll take Trinity and the extra $130.
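    Just to double-check the arithmetic above, a quick sketch (prices in USD, exactly as quoted in the two lists):

    ```shell
    # Sum the component prices quoted for each build.
    i3_total=$((120 + 130 + 70 + 150 + 50 + 50 + 55))   # cpu+mobo+gpu+ssd+case+psu+ram
    trinity_total=$((130 + 60 + 150 + 50 + 50 + 55))    # apu+mobo+ssd+case+psu+ram
    diff=$((i3_total - trinity_total))
    echo "i3: \$$i3_total  Trinity: \$$trinity_total  difference: \$$diff"
    ```

    which prints `i3: $625  Trinity: $495  difference: $130`, matching the totals above.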
  • wenbo - Wednesday, October 03, 2012 - link

    I meant, if you want Trinity, don't you want CrossFire? So that's $70 at most. And the A10 is unlocked, which means overclocking, so you need a bigger fan, which is $30. The motherboard is a bit more than $60; at least I couldn't find one that cheap.

    But I bet you are right: as soon as the holiday season hits, motherboard prices will drop to $80-90. If you don't care about overclocking, that's another $30 saved, and yes, you can play most games at medium-to-low settings with respectable framerates without CrossFire.

  • Tech-Curious - Friday, November 02, 2012 - link

    Why are you paying $130 for a motherboard to be used in a budget i3 rig? Do you really intend to pair a locked, budget CPU with a Z77 chipset? Grab a B75 for $60-70.

    I realize that AMD-chipset motherboards have a long-standing reputation for cost-efficiency relative to their Intel counterparts, but the facts just don't conform with that reputation right now.

    I'm also not sure why you're paying $55 for memory; I grabbed a pair of 4GB DDR3-1600 Kingston modules a week or so ago for $30 on Newegg. Granted, that was a sale price, but the same modules are only $40 normally. In fact, if we're to compare apples to apples, the memory subsystem will tend to cost you more on current-gen AMD platforms, because the AMD platform benefits from higher-speed memory.

    (I am, of course, assuming that you're using USA-market pricing. If not, I apologize, but I still don't see why you think it's fair to compare a random AMD bundle deal against your seemingly hand-picked price scheme for a comparable Intel box. There are bundle deals for Intel products too.)

    For what it's worth, the build I just ordered about a week ago (mostly from newegg, unfortunately the shipping's been delayed by Hurricane Sandy) consists of the following:

    Intel Core i3 3220 - $100 (in-store pickup deal at Microcenter)
    ASUS P8B75-M (B75 chipset) - $70
    HIS Radeon HD 7850 (1 GB version) - $165
    2*4GB Kingston HyperX XMP DDR3 1600 (cas 9) - $30 with discount promo code
    Seasonic M12II 520 Bronze power supply - $69
    Intel 330 SSD 240GB - $180
    Windows 7 x64 Home Premium - $80 with discount promo code
    Random cheapo DVD drive - $15

    Total - $709

    Granted, I didn't buy a case because I already have one. Granted, I found a number of limited-time deals. I also splurged (a lot) more than your average budget builder on the SSD and power supply. (For me, finally, the extra convenience of the larger SSD was worth going over budget by a fair amount. I'm also a shameless fanboy of Seasonic power supplies, and I like to have a little extra power headroom.)

    I'm sure more savvy system builders could do more with the money I spent, too. The only reason I'm laying out what I bought is to give people an idea of what can be done if you're willing to hunt around a bit and be patient. You can wring a great deal of performance out of either an AMD or Intel system on a pretty tight budget.

    Sorry for the mini novel.
  • Tech-Curious - Friday, November 02, 2012 - link

    Check this out, for a budget Intel bundle, btw:


    Intel Core i3-3220 Dual-Core, Gigabyte GA-B75M-D3V Motherboard, Mushkin DDR3 8GB Memory, Mushkin 120GB SSD, Gigabyte ATX Mid Tower Case, Rosewill 400W PSU SuperCombo
  • Bob Todd - Tuesday, October 02, 2012 - link

    Looking through the narrow lens of A10 vs. i3, I would absolutely choose Trinity, since it fits my needs better. However, there is no upward scalability, and there never will be. With LGA 1155 you could start with a < $50 Sandy Bridge-based Celeron G530 and go from there to any price/performance point, all the way up to an i7-3770K for $329. Then there's the enormous LGA 1155 motherboard footprint: Newegg has 254 LGA 1155 motherboards listed right now! That kind of competition has put a lot of downward pressure on pricing.

    Even for a cheap HTPC build, I'm not sure I could justify FM2 vs. LGA 1155; the market is flooded too heavily in Intel's favor, giving you so many options to repurpose whatever you build. Trinity just doesn't make sense for me on a desktop. Mobile Trinity, where I won't ever upgrade the CPU/GPU... now that's something to get excited about. In smaller form factors without room for a discrete GPU (somewhere between 10" and 13"), they could have a very competitive product. Give me an X230 successor with Trinity and a 1080p IPS panel and I'll throw money at you.
  • owlxp - Tuesday, October 02, 2012 - link

    Well, AMD did promise to keep the FM2 socket for their gen-3 CPUs. I think Trinity is good enough to get you to 2014. If the gains in the next-gen APUs are anything like what we saw from Llano to Trinity, the extra $130 I saved by going with the Trinity build now just paid for my upgrade a couple of years from now.

    However, I completely get where you're coming from. I'm struggling with the same problem as I prepare for my next upgrade.
  • Aone - Tuesday, October 02, 2012 - link

    We have heard many times from AMD that the iGPU inside the APU is so powerful that it will kill off cheap dGPUs. That's what I expected to see in the review: Pentium + cheap dGPU vs. APU (without dGPU).

    As to cheap prices for APUs, we have to keep in mind that 1155 MBs are ~$20 cheaper than FM2 MBs, and APU systems need faster, and therefore more expensive, RAM than Pentium + dGPU systems to get better GPU performance.
  • mikato - Wednesday, October 03, 2012 - link

    It will get there in time. I suggest you check how fast iGPU performance has grown with AMD's APUs. And they're just getting started. You know they make high-end GPUs, and now they've proven they can put their GPUs on-die with the CPU, so they will keep putting this tech into their APUs.
  • Aone - Wednesday, October 03, 2012 - link

    Who in his right mind buys a high-end discrete GPU for a cheap CPU or APU?

    Plus, those who buy cheap CPUs usually don't have money for a high-end discrete GPU.
  • Gaugamela - Wednesday, October 03, 2012 - link

    Here are benchmarks that test the importance of faster RAM in these APUs. The difference in performance is astonishing. http://hexus.net/tech/reviews/cpu/46073-amd-a10-58...

    By overclocking and using 2133MHz RAM, the A10-5800 can get approximately a 30% increase in 3DMark and some games.
    These Trinity APUs seem really interesting to tinker with.
  • creed3020 - Wednesday, October 03, 2012 - link

    Thanks so much for posting that. I've been looking for this exact testing of Trinity. AT did this previously with Llano but skipped this crucial test with Trinity.

    It really helps system builders set expectations for performance if a client doesn't want to pay for faster memory; or, if they do want more performance, we can quantify how much of an improvement faster memory makes.
  • mikato - Wednesday, October 03, 2012 - link

    Holy moly
  • vozmem - Wednesday, October 03, 2012 - link

    Keep encouraging AMD, guys.
  • rarson - Wednesday, October 03, 2012 - link

    Why in the world did you not mention which video card you were using on this page? I see that it's mentioned in the test bed, but why the heck do I have to go back and check that when you could have easily mentioned it on the discrete test page?

    Also, why are you using a 5870 with this? Who the hell is going to pair a new A8 or A10 Trinity with a 5870? That's completely illogical. Couldn't you have tried something newer, perhaps something within the same architecture? Extremely puzzling.
  • etamin - Wednesday, October 03, 2012 - link

    And why was an FX-8150 thrown into the DISCRETE PROCESSOR GRAPHICS benchmark?
  • Hardcore69 - Wednesday, October 03, 2012 - link

    HA! Glad I went with an i3-3220 for this office box. Look at the power consumption at load, look at the single-threaded benchmarks, even look at the multi-threaded benchmarks. AMD is crap. It still hasn't caught up. And there are very few upgrade options compared to Intel. If you want to play games, a dedicated GPU is still vastly better. For other basic tasks, FAIL.
  • rarson - Wednesday, October 03, 2012 - link

    You paid more money.

    They're called trade-offs. That's reality.
  • Nil Einne - Friday, October 05, 2012 - link

    Has anyone come across real-world power consumption figures for the A8-5500 vs. A8-5600K, or the A10-5700 vs. A10-5800K? These have different TDPs (65W vs. 100W) and slightly different clocks. I'm wondering whether the K parts are really that bad in general, or whether AMD partly just wanted more headroom, since the K parts are to some extent designed to be overclocked. Of course, the different ratings mean you may get unlucky and receive a fairly high-consumption K processor because of binning, but it's still relevant. I'm somewhat out of date and not familiar with how turbo works, but I'm guessing the higher binning means it will stay in turbo longer, so a proper test should also try limiting the K to the same TDP as the non-K, just to see if that's the primary reason for any differences. (Ideally, also limit the frequencies.)

    Most reviews, including this one, seem to be of the Ks; I presume that's what AMD sent out for testing.

  • Kaggy - Saturday, October 06, 2012 - link

    It would be nice to see some benchmarks of HTML5 and video playback load in browsers, since nowadays people spend lots of time in their browsers.
  • Silverbuckle - Saturday, October 06, 2012 - link

    It looks to me like we are seeing a race to the bottom, and Intel isn't playing. After seeing the results, if I had to start with an entry-level computer (whatever that means today), the best course would be Intel: an i3 with a Z77 chipset and a modest card.

    I have to update one of two computers. The older one will be replaced by this one running the Athlon II X4 630 at stock (it didn't OC worth a damn). It runs CS6 very well. So for my seat, I'll build a new box. I've been waiting to see what to follow. I have two cards available, an ATI 5670 and an ATI 7750, so I can, at no additional cost, run the 7750. That makes Intel the only game in town, because even if I have to lowball the price, the i3 will equal or surpass Trinity using the card. I can upgrade any time, all the way to an i7 K series.

    Given the price of the card, if you must include it, you are still positioned well to upgrade the computer with a simple exchange of the processor. Not so for several years with AMD (best guess here).

    Waiting was OK, even though my gut said to go Intel. What I learned is not to be afraid to include the lowly i3.
  • Chipman1969 - Monday, October 08, 2012 - link

    Please redo some of the Trinity benchmarks with 1866 or 2133 DDR3 RAM.

    See http://www.phoronix.com/scan.php?page=article&...

    There you can see that Trinity does much better with faster RAM.

    On Amazon, G.Skill Ripjaws X DDR3-1600 is $42 and DDR3-2133 is $52.
  • medi01 - Tuesday, October 09, 2012 - link


    "Although likely not the target market for someone buying a Trinity APU, we looked at performance of AMD's latest APU when paired with a high-end discrete GPU. The end result is a total loss for Trinity."

    brand new way to piss on AMD's cookies, by Anand The Shameless...
  • sdoraisw - Friday, January 04, 2013 - link

    If you want to compare the idle numbers, you should disable the graphics in the IVB processor, because Trinity doesn't have integrated graphics.

    Not just disable it, either; you need a BIOS switch to power-gate the graphics engine completely. Then you can do comparisons.
  • x7y9 - Saturday, January 12, 2013 - link

    I just got a Toshiba S875D-S7350 laptop with the AMD A10-4600M processor and tried some
    simple benchmarks; the results were pretty disappointing. Running 4 identical processes
    (Ubuntu 12.04) causes the completion time of each process to degrade substantially.

    I do some work generating graphs and analyzing them, so basic integer performance
    is the most important thing for me.

    I ran this simple benchmark, and the comments show the results I got:

    #!/usr/local/bin/ruby -w

    # M1 -- AMD Phenom II X4 945 quad-core desktop
    # M2 -- Intel Core i3-2310M HP laptop
    # M3 -- Toshiba S875D-S7350 laptop with AMD A10-4600M
    # M4 -- HP AMD A6-3400M quad-core CPU, 1.4-2.3 GHz

    # Machine       M1    M2    M3    M4
    # ----------------------------------
    # Single run:  158   178   234   292
    # 2 parallel:  169   195   283   313
    # 4 parallel:  205   241   642   354
    # (All times in seconds; averaged over the parallel runs)

    # The original post omitted the definition of the workload array;
    # an 11-element array gives 11! = 39,916,800 permutations to count.
    a = (1..11).to_a

    start = Process.times
    cnt = a.permutation.inject(0) { |m, _| m + 1 }
    finish = Process.times
    user = finish.utime - start.utime; sys = finish.stime - start.stime
    puts "cnt = %d, user = %ds, sys = %ds" % [cnt, user, sys]

    Needless to say, I returned the Toshiba.

  • bomer08 - Tuesday, October 15, 2013 - link

    Hello, I would like to know which configuration was used for the i3-3220: motherboard, HD (SSD?), RAM, graphics... and whether you changed anything in the BIOS. Thanks.
  • robytzel - Thursday, December 26, 2013 - link

    Hello guys, I have a big question. I own an A8-5600K along with 8GB of DDR3-1600. My question is: why does my computer lag, really really lag, in Facebook games like Farm Heroes Saga? I can't explain it to myself. Why can I run even Crysis 2 on medium-high, but can't normally run a stupid Flash player game?! Please help. Happy holidays!
