
139 Comments


  • deontologist - Thursday, September 27, 2012 - link

    Anand - always 3 months late to the party. Reply
  • Devo2007 - Thursday, September 27, 2012 - link

    What are you talking about? AMD is just now lifting the NDA on the Trinity A10-5800K & A8-5600K desktop CPUs (and even then, sites can only talk about GPU performance).

    If any site had reviewed a Trinity APU several months ago, it was the mobile version (A10-4600M). Anandtech even reviewed it here:

    http://www.anandtech.com/show/5831/amd-trinity-rev...
    Reply
  • karasaj - Thursday, September 27, 2012 - link

    I believe he was referring to this:

    http://www.tomshardware.com/reviews/a10-5800k-a8-5...
    Reply
  • Samus - Thursday, September 27, 2012 - link

    None of those numbers compare Trinity to the competition. They're mostly worthless. Reply
  • Samus - Thursday, September 27, 2012 - link

    Engadget has word the A10 is aiming at i3 prices and i5 performance on the CPU side. We've already seen the A8 and A10 cream the i3 and i5 in GPU performance. I'm excited. I haven't built an AMD system in years, and the 65W A8 might be a perfect HTPC CPU. Reply
  • jwcalla - Thursday, September 27, 2012 - link

    Tom's has benchmarks against a Core i3-2100 if you'd like to see how it stacks up. Reply
  • Samus - Thursday, September 27, 2012 - link

    I can't find any of Tom's benchmarks showing a comparison of THESE chips against any Intel chips. They all compare the A10 and A8 to each other. Reply
  • GazP172 - Thursday, September 27, 2012 - link

    If it's anything like Llano, the top-end 65W parts will basically only be released to OEMs, which to me are the only ones worth having. Reply
  • Taft12 - Thursday, September 27, 2012 - link

    That was because of AMD's lousy yields and contracts that prioritized supply to the likes of HP and Acer over the retail channel.

    OEMs still have first dibs, but yield issues are apparently better now. I have high hopes for the 65W parts (which includes actually being able to buy them on Newegg!). The A10-5700 could be the best of all worlds.
    Reply
  • mikato - Monday, October 01, 2012 - link

    Agreed! I want the A10-5700, probably. No-brainer. Reply
  • mczak - Thursday, September 27, 2012 - link

    If early listings at merchants are any indication, they should be available. I think the problem, though, is that the top 65W parts seem to cost as much as the top 100W parts (so an A10-5800K costs the same as an A10-5700, same story for the A8), which probably makes them a hard sell at retail (quite similar to Intel, and I don't think the low-power parts exactly fly off the retail shelves there either).
    But I agree the 65W parts are nice. On the CPU side you lose around 10%, but the GPU actually has the same clocks. Of course, if you tinker with it manually it should be easy to undervolt/underclock the 100W parts to the same level as the 65W parts.
    Reply
  • medi01 - Thursday, September 27, 2012 - link

    Oh yeah. Comparing it to an Nvidia GT 640 makes so much more sense...

    Of course nobody would think about Anand finding yet another way to piss on AMD's cookies...
    Reply
  • dawp - Friday, September 28, 2012 - link

    I believe that was included for comparison to a low-end discrete card. It could just as easily have been an HD 7750 or HD 7770. Reply
  • Aone - Friday, September 28, 2012 - link

    Good point. I've reached the same conclusion. Reply
  • shin0bi272 - Tuesday, October 02, 2012 - link

    It's a $100 GPU beating the pants off of AMD's latest and greatest APU.

    You could buy an Intel Celeron G530 for $48 (with free shipping) and an Asus GT 640 (or a Galaxy 650 with MIR for the same price) and beat the living snot out of AMD's amazing new APU that everyone just has to love because it's brand new from AMD and all the fanboys have to fall all over themselves to get it... sounds like Apple. Hell, if you hate Nvidia so much you can get an AMD 7750 for 99 bucks on Newegg.

    Either way you go, the price is between 30 and 50 dollars more than the APU and it will get about twice the FPS... who's going to buy an APU with stats like that? Oh yeah, fanboys...
    Reply
  • CeriseCogburn - Thursday, October 11, 2012 - link

    They never respond when the truth hurts too much.

    Just think, without them, amd is double epic fail and already gone. I bet that statement made their hearts so warm, and feel so heady, they "saved competition".

    But you know what they really saved ? Saved the world from real innovation and forward movement, as all those resources and programmers and engineers were wasted on amd crap. Saved us all from the truth and reality. Saved us from sanity and believing fellow human beings had a clue.

    I'm going to go find a Trinity w/ discrete bench so I can LMAO as soon as the overlord control freak in fear of its own life AMD releases the death grip on the NDA bench rules.

    You just know all the little pliable-as-rubber AMD fanboys are gonna get their new squeeze Trinity - they're looking in the mirror now saying "My name is not Mr. Anderson!"

    Watch we'll get an HTPC article now, or maybe it's already posted. Hope I don't have to laugh and shake my head about how cracked and crap and functionless and problematic it turns out to be. Flash will probably rip it a new one. LMAO
    Reply
  • Devo2007 - Thursday, September 27, 2012 - link

    That article was dated yesterday....just like the Anandtech one. Reply
  • Wolfpup - Friday, September 28, 2012 - link

    Yeah, I have no idea what that's supposed to mean. They're giving performance data for a product that hasn't launched yet and they're three months late? Reply
  • chowmanga - Thursday, September 27, 2012 - link

    Same could have been said about Tom's when Anandtech had the Sandy Bridge preview before anyone else. Reply
  • r3loaded - Thursday, September 27, 2012 - link

    Really? No other site does such in-depth analysis of new chip architectures and such rigorous testing and benchmarking (though Ars comes close). This is the wrong site if you want tech tabloid journalism. Reply
  • ananduser - Thursday, September 27, 2012 - link

    Ars? Please... it's insulting to Anand to compare it to Ars. You're also insulting Tom's. TechReport is better than them all, as they've "innovated" in the benchmark area via microstuttering tests. You should read more and stop being so high-strung about your fav sites. Reply
  • coder543 - Thursday, September 27, 2012 - link

    I don't know what side of the Internet you're from, but Ars Technica has some of the highest-quality reviews anywhere on the net. Anandtech is good, but they're heavily biased against both Linux and OpenGL, so that bothers me about them.

    Please, show me some of this low quality Ars material you speak of. I would also have you note that Ars and Techreport cross-post on occasion... so, praising one and not the other is a strange concept.

    Tom's does *okay* reviews... but compared to Anandtech, their stuff is usually lacking.
    Reply
  • ananduser - Thursday, September 27, 2012 - link

    Oh please... Ars only knows mainstream. They cite more than they review. The only exceptions are Siracusa's yearly 100-page OS X review, which every Apple fan reads religiously, and extensive Apple-related coverage. Anand's, Tom's and TechReport are in a league of their own. TechReport recently impressed me with their "Inside the second" approach to benchmarking. Reply
  • Wolfpup - Friday, September 28, 2012 - link

    Biased against Linux and OpenGL? How so? There's not much stuff USING OpenGL anymore, but that doesn't mean they're biased, and most people, even enthusiasts like most people here, aren't running Linux, so again, that doesn't mean they're biased against it. Reply
  • rarson - Wednesday, October 03, 2012 - link

    Ars Technica is a freaking joke in everything that they do. Reply
  • medi01 - Thursday, September 27, 2012 - link

    No other site uses 1000 Watt power supplies when testing HTPC CPUs either.... Reply
  • damianrobertjones - Thursday, September 27, 2012 - link

    ...Unless it's Apple hardware Reply
  • mattlach - Thursday, September 27, 2012 - link

    I don't trust Tom's Hardware as far as I can throw them.

    After they were caught taking kick backs from hardware vendors for better reviews, and caught stealing content for their articles from other review sites, anyone who still reads that website is either a moron or ignorant.

    Tomshardware wasn't bad back when Tom Pabst still ran it in the late 90s, but these days its a shell of its former self and completely and totally unreliable.

    For me it's all HardOCP and Anandtech.
    Reply
  • Homeles - Thursday, September 27, 2012 - link

    You're going to have a very biased view of hardware if you only check two sources. I personally don't care for [H]ardOCP (I don't like their site design and the way they present their data), but AnandTech does try to keep things objective.

    Still, you can't pretend that AT is infallible and 100% trustworthy. If you do, then you deserve to be misled.

    Like all media, it's best to check as many sources as possible before developing an opinion of something.
    Reply
  • mikato - Monday, October 01, 2012 - link

    Please link for the kickbacks! After reading this, I just searched Google for "tom's hardware kickbacks" and your comment was 3rd and the most relevant, lol. Need the info. Reply
  • leexgx - Saturday, October 06, 2012 - link

    T** Hardware - I have made sure I never go to their website again (I avoid even saying their website's name, as Google counts that toward their stats).

    Most stuff on there can't be trusted
    Reply
  • blackmagnum - Thursday, September 27, 2012 - link

    I hope AMD aims its products at first place in the price/performance race with Intel. That seems the only reason they will be bought, other than for the health of competition or a nostalgic sense of pity. Reply
  • duploxxx - Thursday, September 27, 2012 - link

    Pity for what? Do you really think you need more CPU performance than a Piledriver core delivers? Do you really think Trinity isn't good enough power-consumption-wise?

    It's fools who believe they need an i7 for daily desktop usage. Spend the money wasted on an i7 on a fast disk (SSD) and your overall platform experience and performance will be much higher than with your so-called fixed brand name.
    Reply
  • daos - Thursday, September 27, 2012 - link

    Are you serious? People use computers for more than "daily desktop usage": video editing, graphics design, multi-threaded apps, benchmarking, even gaming...

    Every bit counts at larger scale. CPUs can make a huge difference in all of the above, except maybe gaming, unless you're at an enormous resolution like I am.

    And you have to remember that everything is relative. You are concerned with wasting your money, whereas the next guy couldn't care less about an extra two or three hundred dollars for the best. That's a drop in the bucket for him. Hell, thousands more can be a drop in the bucket if the best is what you're after... it's simple. Go Intel.
    Reply
  • dagamer34 - Thursday, September 27, 2012 - link

    When the average selling price of a computer is $450 in the US, I don't see how that really includes any of the things you listed above. Reply
  • Alexvrb - Thursday, September 27, 2012 - link

    Heck, for most people, a Tegra 3 or similar in a tablet is enough computing power. A lot of people just stroll into a store and buy the advertised on-sale laptop for $300-400. For these people an APU might not be a bad choice, given that the lower-end Intel chips all have horribly crippled GPUs. Reply
  • lwatcdr - Thursday, September 27, 2012 - link

    "Video editing, graphics design, multi-threaded apps, benchmarking, even gaming..."
    Some do, but more and more of those tasks really benefit from a GPU boost. Most Adobe products now use OpenCL, so they can take advantage of the GPU. That covers video editing, graphics design and gaming. Multi-threaded apps benefit more from core count than raw CPU speed, and most multi-threaded apps will do just fine on the A10.
    Benchmarking? Really, that is called a hobby unless you are doing it to test systems for a living. You do not buy hardware just to get a higher benchmark score; you benchmark hardware to find the cheapest way to get a task done in a reasonable amount of time. Anything else is a hobby, and while that is all fine and good, it is a tiny fringe element of a fraction of a percentage of the PC market.
    Reply
  • Denithor - Thursday, September 27, 2012 - link

    But this is the desktop market. It's simply too easy to install a discrete GPU that is tons faster than any iGPU, even this new Trinity. Integrated GPU just doesn't cut it for most of those applications.

    Maybe for an HTPC. But that's honestly the only place I'd even consider pointing anyone toward an APU over a CPU+GPU.
    Reply
  • chrnochime - Thursday, September 27, 2012 - link

    There are always people who make stupid blanket statements like yours. "People" would mean everyone. By not writing "some" in there, you basically imply that everyone thinks the way you do. If that were not the case, your sentence would not have been written that way.

    Don't like my nitpicking? Don't write stupid blanket statements then.
    Reply
  • mikato - Monday, October 01, 2012 - link

    what do you mean "you people"? Reply
  • OCedHrt - Thursday, September 27, 2012 - link

    The Adobe suite is quite well GPU-accelerated now. I'll admit Intel still wins on video encoding by far, though. And we just saw how Trinity won at gaming, so what are you saying? Reply
  • StevoLincolnite - Thursday, September 27, 2012 - link

    Actually... The higher the resolution the smaller the increase in performance a processor provides as you quickly become GPU limited.

    I game at 5760x1080 and I noticed zero, I mean zero difference in games between a Phenom 2 x6 1090T and my current Core i7 3930K.
    Granted I use that CPU grunt for other things, but in gaming and at super high resolutions, the difference is absolutely negligible.
    I would have been better off using the $800 that I spent upgrading for another 2 graphics cards for Quad-Crossfire if the sole purpose was gaming.
    Reply
  • CeriseCogburn - Thursday, October 11, 2012 - link

    Don't you love how, with the AMD HD 7970 and all their video cards, all we heard about was future-proofing and having 2GB+/3GB+ of RAM on board, so that WHEN the time came and games were released, the dear fanboy AMD card could finally show its unlocked potential and futureproofiness!

    LMAO - now a crap trinity slug is all anyone will ever need.

    It's amazing what size brain farts the amd fanboy mind can flatulate.
    Reply
  • daos - Thursday, September 27, 2012 - link

    Not trying to knock AMD at all here, either. I want to make that clear. I am simply saying that Intel is the performance king. One can argue that power consumption is not a concern, but performance is. Reply
  • tecknurd - Thursday, September 27, 2012 - link

    What blackmagnum means is that back in the old days AMD had the best price-vs-performance ratio. These days AMD is not doing this. It is nice to get a processor as powerful or fast as Intel's high-end processor, the i7, for the price of an i5 or even an i3. That provides a good selling point. And if a user went with a lower-end AMD processor, they would pay less than for a Pentium but get the same performance as an i5 or an i3. It was like this in the past, but it is not now. AMD's processors have a poor price-to-performance ratio that makes them effectively more expensive than Intel's processors. Intel now has the best price-to-performance ratio.

    If you do not believe me: the AMD K7 compared well against the Intel Xeon, and the AMD K8 also compared well against the Intel Xeon. The K10 and newer only compare well against Intel's low-end processors like the Pentium, but at a rip-off price that is a few times more.

    It is already a given that an SSD increases the performance of a computer, because its latency is less than a millisecond and its throughput is more than 100 megabytes per second. An HDD's latency is about 10 milliseconds and its throughput averages around 40 to 60 megabytes per second.
    Reply
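    (To make the SSD-vs-HDD numbers above concrete, here is a rough back-of-the-envelope sketch. The file count, file size, and exact drive figures are illustrative assumptions, not measurements from any review:)

```python
# Rough back-of-the-envelope model: time to read many small files.
# All figures are illustrative assumptions based on the comment above.
def load_time_s(n_files, file_kb, latency_ms, throughput_mb_s):
    seek = n_files * latency_ms / 1000.0               # one seek per file
    transfer = n_files * file_kb / 1024.0 / throughput_mb_s
    return seek + transfer

ssd = load_time_s(1000, 64, 0.1, 250)   # sub-millisecond latency, >100 MB/s
hdd = load_time_s(1000, 64, 10.0, 50)   # ~10 ms latency, 40-60 MB/s average
print(f"SSD: {ssd:.2f}s  HDD: {hdd:.2f}s")  # seek latency dominates the HDD
```

    With small, scattered reads the HDD spends most of its time seeking, which is why an SSD transforms the feel of a system even when the sequential-throughput gap is far smaller.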
  • vgray35@hotmail.com - Thursday, September 27, 2012 - link

    The bottom line here is graphics fusion onto the CPU chip is moving extremely slowly. AMD is now as slow or slower than Intel in moving the technology forward. The incremental improvements from one generation to the next are certainly not spectacular. Intel is slow as they traditionally are too conservative and GPU design has never been their forte. Now AMD is slow as big improvements will eat into their graphics cards business.

    AMD is backing itself into a corner that it may never recover from if it does not make big moves soon. My advice: skip 22nm lithography and go straight to 14nm-18nm with 5 billion transistors, and increase graphics performance by at least 150% to 200%. Bite the bullet and move low-to-medium-end discrete graphics onto the APU as originally promised. Intel will not be able to match this for at least 2 to 3 years (unless they buy Nvidia). Forget power consumption beyond 65W.

    They do not need to beat Intel in CPU performance, but it is ridiculous to produce a chip with only 1.1 billion transistors that offers both CPU and GPU cores at this late stage of the game. When Intel moves to 14nm the state of AMD's development at that time will determine AMD's fate once and for all. They had better get a clue and skip one lithographic generation.

    I have always favored AMD for its agility in the past, but for years I have been sadly watching the death of this company. It's not too late, but 2 years is all they have left to provide a large incremental jump. It's now or never. The next generation on the new FM2 socket is just not going to cut it either, as the next chip should integrate the south bridge, as well as audio to create a true SoC offering.

    Can you believe a new platform that does not even offer PCI Express 3.0? Ridiculous really. Forget 3Gbps SATA. Forget USB 2.0 ports. Produce what we have all been waiting for - a SoC for the desktop, which is what fusion was all about in the first place. If it needs a new APU socket then just bite the bullet and introduce another one, and give it some spare pins for future growth.

    Time to get serious, before the money runs out.
    Reply
  • mikato - Monday, October 01, 2012 - link

    Well just wait until some of the bigger name applications out there start using the GPU a bit. Could be interesting. Hopefully it's sooner rather than later. They do have a way to go with sorting out the best ways to do this. Reply
  • jjj - Thursday, September 27, 2012 - link

    I had one question about desktop Trinity: how does it do at 1080p? I guess I'll have to look for an answer elsewhere.
    It's a desktop part; you just can't not even try testing at 1080p.
    Bah!
    Reply
  • Mathieu Bourgie - Thursday, September 27, 2012 - link

    I'm also disappointed in the lack of testing at 1080p. Many folks, myself included, are considering Trinity for an HTPC solution, and needless to say, the vast majority of those are hooked to HDTVs, most of them 1080p. Not to mention that 1080p screens for desktop PCs are quite inexpensive and pretty common nowadays.

    Also, shame on AMD for those shady marketing tactics. To me, it sounds like CPU performance and overclocking are poor and pricing will most likely depend on reaction to
    Reply
  • Mathieu Bourgie - Thursday, September 27, 2012 - link

    Sorry about that, I meant to say in my last sentence:
    "To me, this marketing strategy from AMD is telling me that CPU performance and overclocking potential are most likely poor and that pricing will most likely depend on the reaction of the public and the PC industry after reading these so called "reviews"."
    Reply
  • coder543 - Thursday, September 27, 2012 - link

    It's just a marketing strategy. The CPU performance is fine -- and look at the Starcraft 2 benchmarks in this very article if you want further confirmation of that. Anandtech was subtly hinting that the CPU performance is a step in the right direction. Reply
  • ganeshts - Thursday, September 27, 2012 - link

    The HTPC oriented review is coming up at 11:00 AM EST Reply
  • jwcalla - Thursday, September 27, 2012 - link

    Hopefully there will be some commentary on Linux driver support for those of us who take an interest in XBMC as an HTPC platform. :)

    In particular, hardware-accelerated video decoding.
    Reply
  • Taft12 - Thursday, September 27, 2012 - link

    Linux driver support will be the same as it always has been for brand-new platforms: non-existent. Give it a few months. Reply
  • coder543 - Thursday, September 27, 2012 - link

    That's simply not true. The proprietary graphics drivers for Linux use the same code that their Windows brethren do. Open-source drivers? Yeah, those are months away... but Linux does have support. Reply
  • coder543 - Thursday, September 27, 2012 - link

    Check Phoronix -- they'll be posting some stuff soon. Reply
  • coder543 - Thursday, September 27, 2012 - link

    I'm also with jwcalla -- can we see some Linux stuff? Reply
  • MrSpadge - Thursday, September 27, 2012 - link

    Did you see the frame rates at low resolution and detail? Game performance will absolutely tank at 1080p, no need to test this. And other HTPC duties haven't been tested here anyway. Reply
  • JNo - Thursday, September 27, 2012 - link

    +1

    1080p benchmarks are essential for a desktop part.

    Also, whilst you're at it, you may as well make quality a minimum of 'medium' for 1366x768 and possibly also include medium for 1680x1050.
    Reply
  • juampavalverde - Thursday, September 27, 2012 - link

    Well, maybe I'm biased by Scott's view, but read for yourself:
    http://techreport.com/blog/23638/amd-attempts-to-s...

    AMD is dictating what can be shown and what can't... F off, AMD. This isn't a review; it's a preview tailored by AMD marketing, far from a full product review, and the tailoring is designed exactly to offer a biased view of the product. Please make that clear, Anand; the quality of your site is better than this AMD marketing BS.
    Reply
  • Voldenuit - Thursday, September 27, 2012 - link

    Agreed. If anything, AMD's PR moves are a faux pas, because it leads the reader to wonder just how bad their x86 performance might be if they have to go behind readers' backs to manipulate public perception by forcing reviewers to cherry pick benchmarks.

    In reality, I expect Trinity's x86 performance to be acceptable, but AMD's tiered benchmark and preferential treatment of reviewers sets a bad precedent for the industry that Anand should not have agreed to.
    Reply
  • Hubb1e - Thursday, September 27, 2012 - link

    Think of this like the launch of the AMD 7660D graphics card, then have a cup of tea, and maybe you will feel better about it.

    I'm happy to have something to read about. They didn't have to release anything until Oct 2nd.

    It sounds like a pretty decent GPU. It's something I can toss in the wife's computer, and guests could have a pretty decent experience when they come over, and I don't even have to worry about the extra idle power consumption of a graphics card that goes unused by my wife.

    But look at those die sizes. Anand compares them to the i5 since it shows them in a more favorable light. Drop the Ivy i3 into that chart at 116mm² and Trinity looks like the fat girl at a wet T-shirt contest. She's got the biggest jugs, but the sacrifice to get there may not be worth it.
    Reply
  • CeriseCogburn - Thursday, October 11, 2012 - link

    AMD is an evil, vile corporate pig filled with lies and spinmeisters toying with the pliable minds of the immensely ignorant fanbase of haters it has carefully developed over many years.

    Now it has turned fully to the dark side and launched an attack from its crumbling Death Star, an attack never seen before. It's the head pig destroyer of review sites.
    Be proud, AMD fans; all those years you fed on your hatred of Nvidia and Intel. Yesss... feel the hatred... let it flow through you...
    You're no Luke Skywalkers; you have given in to the dark side....
    Reply
  • Anand Lal Shimpi - Thursday, September 27, 2012 - link

    It's a bolder move than manufacturers have tried in the past, but it's not unheard of. Most company-sanctioned pre-release reviews have some sort of restriction on them. A ban on pricing discussion is not unusual; the GPU focus here is unusual but not totally out of the ordinary. The original Intel-sanctioned Conroe performance previews were done similarly (Intel set and controlled the only applications that could be tested).

    The original press embargo for Trinity was in October. If you wanted to show something earlier, these were the rules. I'm not a huge fan of staggered embargoes, but they exist and this wasn't the first one. Had AMD tried at all to influence the benchmarks we ran, tests we used or conclusions we came to I guarantee you that things would've unfolded very differently.

    Take care,
    Anand
    Reply
  • juampavalverde - Thursday, September 27, 2012 - link

    It's a known move from manufacturers, and knowing what happens with NDAs, it's hard. But this is also a product whose performance is hard to measure, at least in the individual metrics, because it has everything on board. So I have no chance of reaching any conclusion knowing only half of the data: I don't know how much of the power draw comes from the IGP or the CPU, how powerful the GPU is compared with others in an apples-to-apples test (I don't know if every GPU was tested on the APU or on another test rig), or whether the Piledriver quad is holding back the whole APU or doing fine. From the performance measured in the IGP, it's just meh compared to Llano, and not far enough ahead of the HD 4000.

    AMD is MAD if it wants to play Intel's games. Intel can do it because they control the market, but being the underdog, performing unspectacularly, and behaving mad and bad... oh dear... they will be bought by Qualcomm when their stock goes further down.
    Reply
  • Voldenuit - Thursday, September 27, 2012 - link

    Anand, thank you for taking the time to respond with your thoughts.

    I do agree with you that this behaviour is not unprecedented in the industry, and all camps (red, blue, green) have done it at some point or another.

    However, when Conroe previews were released, they were clearly marked as such.

    The Trinity reviews out now might more properly be termed previews, since the part is not available and will not be for a couple more weeks.

    At the end of the day, the performance will be nothing unusual or unexpected, since we've had mobile trinity parts benchmarked for a while.

    What worries me is that staged and staggered reviews under NDA and exclusivity clauses may become the norm if the rest of the industry picks up on this practice and runs with it.

    I also don't think this PR stunt has worked in AMD's favor - rather than highlighting its (well-earned) GPU prowess, it's raising questions about its x86 performance (which are probably very close to the mobile part's documented performance) and the (probably unintended) ethics of attempting to influence the review process.
    Reply
  • tecknurd - Thursday, September 27, 2012 - link

    After reading that article at techreport.com, I cannot believe that Anandtech has resorted to obeying AMD's rules for reviewing the latest products. I was wondering where the general-usage benchmarks are, and where the encoding benchmarks are. I thought this is a review site and not some teaser. If I were a reviewer, I would post the general-usage benchmarks, which means going against the email. A reporter has every right to post the truth. Where is the truth on this site?

    As I've commented on other sites, AMD's arrogance is ruining its reputation and the reputation of reviewers.

    BTW, I doubt the email would hold up in court, so it is OK to disobey it unless the contract is ambiguous.
    Reply
  • SleepyFE - Thursday, September 27, 2012 - link

    OMG.

    Forget the e-mail; they signed an NDA!!
    And there is truth in the article, just not the whole truth. You don't post everything that happens to you on Facebook, do you? Some things are left out for whatever reason. Consider the NDA AMD's privacy settings.
    Reply
  • Jamahl - Thursday, September 27, 2012 - link

    Scott also "previewed" Conroe under the kinds of limitations Anand spoke about.

    http://techreport.com/review/9538/intel-conroe-per...

    The main difference is he didn't complain bitterly about it. TechReport doesn't have the staff to get these kinds of reviews done in time, so instead they take the easy option of slagging off AMD.
    Reply
  • MrSpadge - Thursday, September 27, 2012 - link

    It's not a review of the product, but it is a review of the GPU performance. Period. Reply
  • dishayu - Thursday, September 27, 2012 - link

    Hate to be off-topic here, but I wanted to ask what happened to this week's podcast? I was really looking forward to a talk about IDF and Haswell. Reply
  • Ryan Smith - Thursday, September 27, 2012 - link

    Busy. Busy busy busy. Perhaps on the next podcast Anand will tell you what he's been up to and how many times he's flown somewhere this month. Reply
  • idealego - Thursday, September 27, 2012 - link

    I don't think the load GPU power consumption comparison is fair, and I'll explain why.

    The AMD processors are achieving higher frame rates than the Intel processors in Metro 2033, the game used for the power consumption chart. If you calculated watts per frame AMD would actually be more efficient than Intel.

    Another way of running this test would be to use game settings that all the processors could handle at 30 fps and then cap all tests at 30 fps. Under these test conditions each processor would be doing the same amount of work. I would be curious to see the results of such a test.

    Good article as always!
    Reply
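    (A quick sketch of the watts-per-frame idea suggested above, with made-up numbers rather than figures from the article:)

```python
# Normalizing power draw by performance, as idealego suggests.
# The wattage and fps numbers below are invented for illustration only.
def watts_per_frame(system_watts, fps):
    return system_watts / fps

def frames_per_watt(system_watts, fps):
    return fps / system_watts

# A chip that draws more at the wall can still be the more efficient one:
a = frames_per_watt(110, 45)   # higher draw, but a higher frame rate
b = frames_per_watt(75, 25)    # lower draw, lower frame rate
print(f"A: {a:.3f} fps/W  B: {b:.3f} fps/W")
```

    Under these assumed numbers, chip A delivers more frames per watt despite the higher absolute draw, which is the point: raw load wattage alone says nothing about efficiency.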
  • SleepyFE - Thursday, September 27, 2012 - link

    True.
    But you are asking for consumption/performance charts. You can make those yourself from the data given.
    They test consumption under max load because no one will cap their games at 30fps to keep consumption down. People use what they get, and that is what you would get if you played Metro 2033.
    Reply
  • idealego - Thursday, September 27, 2012 - link

    Some people want to know the max power usage of the processor to help them select a power supply or help them predict how much cooling will be needed in their case.

    Other people, like me, are more interested in the efficiency of the processor's architecture in general and as a comparison to the competition. This is why I'm more interested in frames per watt, or watts at a set fps; otherwise it's like comparing the "efficiency" of a dump truck and a van by looking only at fuel economy.
    Reply
  • CeriseCogburn - Thursday, October 11, 2012 - link

    LMAO - faildozer now a dump truck, sounds like amd is a landfill of waste and garbage, does piledriver set the posts for the hazardous waste of PC purchase money signage ?

    Since it's great doing 30fps in low-low mode so everyone can play and be orange-orange instead of AMD losing terribly sucking down the power station, just buy the awesome Intel Sandy Bridge with its super-efficient arch and undervolting and OC capabilities and be happy.

    Or is that like verboten for amd fanboys ?
    Reply
  • IntelUser2000 - Thursday, September 27, 2012 - link

    We can't even calculate it fairly because they are measuring system power, not CPU power. Reply
  • iwod - Thursday, September 27, 2012 - link

    I think Trinity is a pretty good chip for a low-cost PC, which seems to describe the majority of PCs sold today. I wonder why it is not selling well compared to Intel. Reply
  • Hardcore69 - Thursday, September 27, 2012 - link

    I bought a 3870K in February. I've now sold it and replaced it with a G540. APUs are rather pointless unless you are a cheap-ass gamer who can't afford a 7870 or above, or for an HTPC. Even there, I built an HTPC with a G540. You don't really need more anyway. Match it to a decent Nvidia GPU if you want all the fancy rendering. Personally I don't see the point of madVR, and I can't see the difference between 23.976 fps at 23.976 Hz and 23.976 fps at 50 Hz.

    All that being said, I bet that on the CPU side AMD has failed. Again. CPU grunt is more important anyway. A G620 can generally compete with a 3870K on the CPU side. That is just embarrassing. The 5800K isn't much of an improvement.

    Bottom line: a Celeron is better for a basic office/porn box; skip the Pentium, skip the i3, get an i5 if you do editing or encoding, an i7 if you want to splurge. GPU performance is rather moot for most uses. Intel's HD 1000 does the job. Yes, it can accelerate via Quicksync or DXVA; yes, it's good enough for YouTube. Again, if you want to game, get a gaming GPU. I've given up on AMD. Its CPU tech is simply too weak and its GPU side can't compensate.
    Reply
  • Fox5 - Thursday, September 27, 2012 - link

    A 7870 goes for at least $220 right now, that's a pretty big price jump.

    AMD has a market: the best possible gaming experience at a minimum price. You can't really beat the ~$100 price for decent CPU and graphics performance, when a graphics card of that performance level alone would cost you at least half that much (probably more). Also, in the HTPC crowd, form factor and power usage are critical, so AMD wins there; I don't want a discrete card in my HTPC if I can avoid it.
    Reply
  • parkerm35 - Monday, October 01, 2012 - link

    First of all, you have never owned a 3870K; you're just an Intel fanboy wanting some attention. The simple fact is you chose to seek out an AMD review to feed us this rubbish, which just shows how obsessed you are. If you don't like AMD parts, that's fine, but it's because of people like you that AMD is in this kind of mess to start with. I bet you're one of those people who went out and bought a P4 as well?

    This review has just shown this APU competing with discrete graphics cards, and doing a damn good job of it too. How much was your G620? Add the price of a discrete card capable of matching Trinity, maybe a GT 630 (which I think will be slightly slower), $70? Plus $65 for the CPU, that's $135 for a dual-core, slower CPU and an altogether more power-hungry setup. Do me a favor.

    " A G620 can compete generally with a 3870K on the CPU side. That is just embarrassing. The 5800K isn't much of an improvement."

    How do you know the 5800K isn't much of an improvement? This whole review is about GPUs, with no CPU data whatsoever.

    Could you please list these HD1000 parts with quicksync.
    Reply
  • kpo6969 - Thursday, September 27, 2012 - link

    Anand, if you went along with this, your standing as one of the most trustworthy review sites (if not the most) has gone way down. Just my opinion. Reply
  • rhx123 - Thursday, September 27, 2012 - link

    I agree. They should have done the same as TechReport and called AMD out on this.
    I have been a long time lurker, and I nearly posted about Anandtech's spin on the Enduro Update, but now it really feels like there's something going on between the two.

    It's obvious that by making reviewers hold this information back, AMD has confirmed to everyone in the know that Piledriver is going to be rubbish, and has probably done itself more damage than just letting people release the benchmarks.

    Just hoping a Chinese reviewer somewhere can get his hands on the parts and release some real CPU benchmarks.
    Reply
  • jaydee - Thursday, September 27, 2012 - link

    Fortunately, Anand has more class than to be a blatant hypocrite like The Tech Report, which happily previewed Intel's chips under certain parameters but complains when AMD does the same.

    http://techreport.com/review/9538/intel-conroe-per...
    Reply
  • cobalt42 - Thursday, September 27, 2012 - link

    You're simply pointing out the difference between a PRE-view and a RE-view, not pointing out any supposed hypocrisy.

    A preview is often done on the manufacturer's terms. Compare to what is often done in gaming; you get to see what they show you, and you're careful not to draw conclusions. (To quote TR's conclusions in that article you cite, they start with "Clearly, it's way too early to call this race.") Previews are also often done when you're offsite and in their controlled conditions. Plus, the article you write about it is called a "preview" in the title, not a "review". Look at the title of these articles versus the one you cite.

    What AMD is trying to do here is control the output of REviews.
    Reply
  • Visual - Thursday, September 27, 2012 - link

    The high-end GPU version seems nice; it's disappointing there are weaker versions, though. Especially the mobile version, which hasn't nearly enough performance to distinguish itself from the Intel offering. Reply
  • Jamahl - Thursday, September 27, 2012 - link

    Can you point out that the GT 640 in this review is in an Ivy Bridge-powered system? It would have been nice to have it running in the 5800K system, just to see how close the graphics portion of Trinity really is to it. Reply
  • Rick83 - Thursday, September 27, 2012 - link

    "Note that this test fails on all Intel processor graphics, so the results below only include AMD APUs and discrete GPUs."

    Well, down to the i5's they all have AES acceleration in the CPU pipeline.
    Would be interesting to see a direct comparison of that to the results in the table.

    Of course, for the i3s and below, this is a bit of a let-down.
    Reply
  • DanNeely - Thursday, September 27, 2012 - link

    What's with the pair of USB1 ports that AMD still puts on all their chipsets? Reply
  • jasomill - Thursday, September 27, 2012 - link

    A cost-saving measure, perhaps, intended for use with integrated devices? Many devices don't benefit from speeds in excess of 12Mbps: keyboards, pointing devices, digitizer tablets, Bluetooth adapters, infrared ports, fingerprint readers, GPS receivers, accelerometers, ambient light sensors, switches, buttons, blinkenlights, fax modems, floppy drives, . . . Reply
  • DanNeely - Thursday, September 27, 2012 - link

    Does AMD share chipsets between their desktop and mobile platforms? AMD's done this for years (all of their desktop chipsets?), and all the legacy embedded devices you listed are typically connected via the LPC (low pin count) bus, a semi-parallelized implementation of the '80s-era 8-bit ISA bus. Reply
  • jamyryals - Thursday, September 27, 2012 - link

    I liked the sneak peek. I don't care if AMD wants to hold off on the CPU benchmarks; they'll be out shortly anyway, and Anand already hinted in the article at what to expect. The only troublesome thing is the people who take this as an opportunity to besmirch someone's credibility. Take a deep breath, and in a few days you'll be able to justify your own viewpoint no matter what the numbers say.

    At this point, the direction AMD is headed is more interesting than this particular product. What is the target goal for this family of chips, and will that be more successful than competing head to head with Intel?
    Reply
  • Torrijos - Thursday, September 27, 2012 - link

    In August, an interesting article on the influence of the CPU was posted:
    http://techreport.com/review/23246/inside-the-seco...

    The idea was not to measure average FPS, but instead to measure milliseconds per frame for every frame in a benchmark, in order to see whether performance was consistent or fell harshly on some frames (with a clear impact on playability).

    The thing is, with the current wave of CPUs with iGPUs, it might be time to switch benchmarks to a similar methodology, in order to see which architectures handle the memory traffic better.
    Reply
  • taltamir - Thursday, September 27, 2012 - link

    I just did a double take, had to look twice, and indeed this is 100% a GPU benchmark without a single CPU test.

    The only test relevant to the CPU might have been the AES acceleration (a fixed function test) and the power test (where intel still spanks AMD).
    Reply
  • Jamahl - Thursday, September 27, 2012 - link

    This is what happens when you look at the graphs without actually reading anything. Reply
  • Torrijos - Thursday, September 27, 2012 - link

    I read that they can't talk about the CPU yet; I was trying to say that FPS is an antiquated metric...

    My point was that APUs tend to share memory bandwidth between the CPU and GPU, resulting in unreliable peak performance (even when coupled with a discrete GPU) while still maintaining a good average FPS.

    In the end, the FPS metric isn't the best available number for clearly evaluating the performance of these chips. A full plot of milliseconds per frame for the entire test run offers a clearer picture.

    An alternate measure would be the percentage of frames that took more than XX milliseconds to generate.
    Reply
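The frame-time metrics proposed above (a high-percentile frame time, and the share of frames beyond a threshold) can be sketched like this; the sample frame times below are invented for illustration:

```python
def frame_time_stats(frame_times_ms, threshold_ms=16.7):
    """Return the ~99th-percentile frame time and the fraction of frames
    slower than threshold_ms (16.7 ms corresponds to 60 FPS)."""
    times = sorted(frame_times_ms)
    p99 = times[min(len(times) - 1, int(0.99 * len(times)))]
    slow = sum(1 for t in times if t > threshold_ms)
    return p99, slow / len(times)

# Made-up per-frame render times in milliseconds:
sample = [14.2, 15.0, 15.1, 16.0, 16.9, 17.5, 33.4, 15.2, 14.8, 15.5]
p99, slow_share = frame_time_stats(sample)
print(f"99th-percentile frame time: {p99} ms, frames over 16.7 ms: {slow_share:.0%}")
```

Two runs can share an average FPS while one has a far worse tail, which is exactly what this kind of metric exposes and a plain FPS average hides.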
  • James5mith - Thursday, September 27, 2012 - link

    I know that the desktop CPU has seen more and more integration, but when did Anandtech decide to start calling them SoCs, as if they were the all-in-one packages inside a smartphone?

    It's still an APU, or CPU+GPU+IMC, or whatever you want to call it. It is not a complete system; it still needs a southbridge chipset for all the sundry interconnects.
    Reply
  • SleepyFE - Thursday, September 27, 2012 - link

    Will everyone please give up on the measuring contests (as in, mine is bigger). I'm using a Phenom II X2 555 and it has worked just fine for three years running. I'm an average, price-conscious gamer: I look for 100€ CPUs and 150€ GPUs (right now I have a Radeon 6870). Everything I do works just fine at very high settings with 2xAA. Having an i7 would make no difference in performance, because games don't put more cores to good use, and every other program I use can't even put a single core to good use.

    I will say again:"I AM AVERAGE!!" And it all works for me. ALL the CPU's right now are sufficient for the average man (or woman).

    The reason AMD is stressing the GPU side of APUs is that that's what matters. When you can buy a 200€ APU with a Radeon HD x870-class GPU in it (x being the generation number), it saves me money and eliminates one very loud fan. It's a win-win.
    Reply
  • jwcalla - Thursday, September 27, 2012 - link

    "Average" people don't need a GPU any more powerful than what you'd need to drive a simple display. Because "average" people are nowhere near interested in PC gaming.

    And this is why AMD's strategy is a little silly.

    The key to marketshare is making sweet deals with Dell, HP, etc.
    Reply
  • jaydee - Thursday, September 27, 2012 - link

    I noticed the motherboard has 3 digital video outputs and VGA. Can all three (DVI, HDMI, DP) be used at the same time with the APU? Reply
  • ganeshts - Thursday, September 27, 2012 - link

    Yes, they can be used simultaneously. Reply
  • mikato - Monday, October 01, 2012 - link

    Eyefinity! Reply
  • MrSpadge - Thursday, September 27, 2012 - link

    Seeing how the A10 has 1.5 times the raw GPU horsepower of the A8, it seems obvious that the A10 is badly choked for memory bandwidth, even at DDR3-1866. Since even DDR3-2400 is rather cheap these days, it would be interesting to see performance scaling with memory speed. I expect the value the A10 provides could be increased considerably. Reply
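For reference, peak theoretical DDR3 bandwidth is just the effective transfer rate times 8 bytes per 64-bit channel times the channel count:

```python
def ddr3_peak_bandwidth_gbps(mt_per_s, channels=2):
    """Peak theoretical bandwidth in GB/s (decimal): MT/s x 8 bytes x channels."""
    return mt_per_s * 8 * channels / 1000

for speed in (1866, 2133, 2400):
    print(f"DDR3-{speed}, dual channel: {ddr3_peak_bandwidth_gbps(speed):.1f} GB/s")
```

Going from DDR3-1866 to DDR3-2400 is roughly a 29% bump in theoretical bandwidth, which is why memory-speed scaling tests on the A10 would be telling.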
  • rscoot - Thursday, September 27, 2012 - link

    Wondering if the IMC is rated for DDR3-2400, though. And yeah, it's patently obvious that the GPU on these chips is starved for memory bandwidth. Reply
  • wwwcd - Thursday, September 27, 2012 - link

    Intel HD 4000 looks feeble against the graphics set into Trinity ;) Reply
  • formulav8 - Thursday, September 27, 2012 - link

    This preview showed me how pathetic Intel's junk is. It doesn't even come close overall. Reply
  • Aone - Thursday, September 27, 2012 - link

    Would you say which CPU you used for the GT 640? Reply
  • wheeqo - Thursday, September 27, 2012 - link

    AMD please take Trinity to Windows 8 Tablet! Reply
  • Roland00Address - Thursday, September 27, 2012 - link

    But is it possible for you to add these three cards before the final review next week?

    1) 6670 1gb gddr5
    2) 6570 ddr3 or 6670 ddr3
    3) 7750 1gb gddr5

    I ask because these are new cards that Trinity will be competing against in a similar price bracket. You can't find a 5570 on Newegg right now, since it has been discontinued and replaced with 6000- and 7000-series parts. The only 5570 on Newegg right now is a proprietary one from VisionTek with a custom output and a dongle that allows it to drive 4 monitors from a low-profile slot.

    Thank You
    Reply
  • herrdoktor330 - Friday, September 28, 2012 - link

    To add on to this one, can you test the APU with those cards IN CROSSFIRE?

    I seriously think that you could get decent HD gaming with some eye candy from this setup, with a little help from a discrete GPU. I've been waiting on this platform for a while, and while the CPU performance will be similar to the mobile variant (thus underwhelming), this APU is going to be awesome for an HTPC that's also gaming-capable.
    Reply
  • Arbie - Thursday, September 27, 2012 - link


    For all the reasons you listed, Crysis Warhead is very much worth keeping in the mix. Personally, it's one of the few games I return to and easily the best of all of them. I'm very interested in how the new chips run it.

    Thanks.
    Reply
  • SanX - Thursday, September 27, 2012 - link

    Make these processors capable of 2-, 4-, 6-, and 8-chip configurations, and make appropriate cheap motherboards, to sell the processors by the shovelful.

    They will be happy, we will be happy, and Intel will be in trouble.
    Indeed, a 32-core PC for less than $1000!
    Reply
  • calzahe - Thursday, September 27, 2012 - link

    The main issue with Trinity is that it is basically almost the same as Llano, with just cosmetic improvements in architecture, like VLIW5 -> VLIW4 in the GPU and the new x86 Piledriver cores... But the number of stream processors dropped from 400 to 384, and the memory controller still has only two channels.

    The problem for AMD is that they don't understand that people who might buy an APU to play games don't want to stick with low graphics settings, and prefer to add extra $ for an external graphics card and set everything to High. And the people who don't play games buy Intel Ivy Bridge because it consumes less energy and is less noisy.

    To make the next-gen Kaveri APU attractive, AMD should give it a minimum of 800 stream processors and a memory controller with 4 channels and DDR4 support. Otherwise Intel's Haswell will destroy AMD completely next year...

    As for the laptop market, Ivy Bridge has similar performance to Trinity but provides much longer battery life. So the solution for AMD, again, is to make an APU with 800 or more stream processors and a 4-channel memory controller - it will not give 10 hours of battery life, but combined with effective switching-off of idle cores it will be more power-efficient than a CPU + discrete graphics card. Many people would buy these laptops for gaming and HD movies.

    Regarding the tablet/smartphone market, AMD should accept that the GloFo/TSMC 32nm/28nm manufacturing processes are inferior to Intel's 22nm. So unless GloFo is on par with Intel at 14nm in 2014 (which is highly unlikely), AMD has no chance against Intel. That's why, instead of wasting a lot of money and resources on Brazos, they should license the ARM architecture and combine it with Radeon cores, which could be quite competitive with, or even better than, Tegra or Snapdragon.

    If AMD doesn't make improvements quickly, then in 1-2 years they will be sold off or bankrupt.
    Reply
  • silverblue - Thursday, September 27, 2012 - link

    Do you realise that once AMD implements its HSA initiative (along with, perhaps, on-die memory), it won't actually need a 4-channel memory bus? Faster-clocked RAM is a must, though.

    In any case, people who buy APUs aren't after bleeding-edge performance but something affordable that doesn't perform like a dog. Add an external GPU if you like, but that's really Vishera's area (and its dual-module CPUs have no GPUs and as such will overclock better; Trinity's CPU cores could be more of a hindrance here).
    Reply
  • calzahe - Thursday, September 27, 2012 - link

    HSA will not help much if used with 2-channel memory, and on-die memory or 3D memory stacking will happen, best case, at 14nm due to transistor budget restrictions. But AMD would be able to use faster-clocked DDR4 with a 4-channel memory controller even next year without much effort.

    Those who buy discrete graphics cards usually buy them together with Intel CPUs, and Vishera will not change this; lots of people also prefer Nvidia cards over AMD. So to mount some competition, AMD should combine mid-range or even high-end GPUs with 4-8 x86 cores in an APU, use faster-clocked DDR4 with a 4-channel memory controller, and sell the APUs for 200-400 USD. That would be more energy-efficient and cheaper than paying 200-300 USD for an Intel CPU plus 250-500 USD for a good graphics card, and most importantly, AMD has all the technologies and resources to make this happen even next year; just a correct management decision is required...
    Reply
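To put rough numbers on the four-channel proposal above (the DDR4-2666 speed is an assumed illustrative figure, not anything AMD had announced):

```python
def peak_bandwidth_gbps(mt_per_s, channels):
    """Peak theoretical bandwidth in GB/s: transfer rate x 8 bytes x channels."""
    return mt_per_s * 8 * channels / 1000

trinity = peak_bandwidth_gbps(1866, 2)   # Trinity's dual-channel DDR3-1866
proposed = peak_bandwidth_gbps(2666, 4)  # hypothetical quad-channel DDR4-2666
print(f"dual DDR3-1866: {trinity:.1f} GB/s; quad DDR4-2666: {proposed:.1f} GB/s")
```

Roughly tripling the theoretical bandwidth would indeed change the picture for an iGPU, though it says nothing about the board-cost and pin-count objections raised elsewhere in the thread.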
  • wenbo - Thursday, October 04, 2012 - link

    You make it sound so easy. If it were that easy, people would have done it already. Reply
  • wwwcd - Thursday, September 27, 2012 - link

    I agree that all of AMD's desktop platforms need a 4-channel memory controller, but I think this option must be released immediately... The fact is, DDR4 for the desktop will not arrive before 2015. Four channels of high-frequency DDR3 are enough for the present. Reply
  • silverblue - Friday, September 28, 2012 - link

    It's also not cheap to implement. One of the reasons the top-end Intel boards are so expensive, I expect. I think it'd be better to go for higher speed first and foremost.

    The extra bandwidth could let the CPU breathe a little easier as well as open up GPU performance at higher detail levels; however, I'm not sure it'll be the massive boost people are hoping for. Keeping a 384-shader GPU means you'll get potentially HD 4830/4770-level performance, with the added bonus of more RAM than either of those two cards, but Trinity isn't THAT bandwidth-constrained; adding more shaders would certainly alter that picture.
    Reply
  • kyuu - Friday, September 28, 2012 - link

    "The problem for AMD is that they don't understand that people who could buy APU to play games don't want to stick with low graphics settings in games and prefer to add extra $ to buy external graphics card and set everything in High in games."

    This sentence makes no sense. If someone is looking at buying an APU, then they aren't looking at a discrete GPU setup and obviously aren't looking to run games at max settings. And, contrary to what a lot of people seem to think, a lot of people don't care about running the latest-and-greatest at max settings.

    Obviously, for an enthusiast gamer, Trinity doesn't make a whole lot of sense on the desktop (unless possibly they get asymmetrical crossfire working really well). But in the mobile arena, Trinity makes a lot of sense, giving respectable gaming prowess for significantly cheaper than an Intel CPU and discrete GPU combination as well as superior gaming battery life.

    What I'm most looking forward to is a tablet of Surface quality with a low-voltage Trinity powering it.

    No doubt more memory bandwidth would greatly benefit AMD's APUs, but it's not as simple as just going to 4-channel memory. That increases the cost of the motherboard as well as requiring four sticks of memory, and it may not be practical in the mobile arena (which is where Trinity shines most, aside from HTPC duty).
    Reply
  • kyuu - Friday, September 28, 2012 - link

    "What I'm most looking forward to is a tablet of Surface quality with a low-voltage Trinity powering it."

    I should have said Trinity or, even better, one of its successors.
    Reply
  • calzahe - Friday, September 28, 2012 - link

    Memory is quite cheap now: you can find good DDR3-2133 4GB (2x2GB) kits for around 40 USD for current 2-channel APUs, so you would need to add just 40 USD for another 4GB (2x2GB) for a 4-channel APU, which done properly would be able to use 8GB of memory. That means for an extra 40 USD the new APU could address 8GB of memory, far more than the 3-4GB on current monster video cards that cost 500-600 USD. Also, for around 100-150 USD you can get 16GB (4x4GB) of DDR3-2133.

    Can you imagine the level of next-gen graphics if APUs could fully utilise 8GB, 16GB, or even 32GB of 4-channel system memory!!!
    Reply
  • Marburg U - Thursday, September 27, 2012 - link

    So, Anand, you've just called this a "Review".

    Yes, you named it "part 1", but the fact is that at the moment you are publishing a review containing only what AMD HAS TOLD YOU you are allowed to publish, and which they are pleased to read.

    How the hell can I trust this site's reviews anymore?
    Reply
  • silverblue - Thursday, September 27, 2012 - link

    You could always go to TechReport and join in the AMD bashing if you prefer. While I don't completely agree with the idea of partially lifting the NDA in this specific fashion, it's clear that AMD wants to highlight the strengths of Trinity without muddying the waters with middling x86 performance.

    Piledriver is not AMD's answer to Intel; even Vishera won't be an i7 competitor in most things and might sometimes struggle to stay with the i5s, and Zambezi was definitely underwhelming as a whole, so I can understand why they wouldn't want to focus on CPU performance. Additionally, if Vishera is due out at the same time as Trinity and you get an early idea of Trinity's CPU performance, then even though Vishera will generally be faster than Trinity, it may be classed at the same performance level.
    Reply
  • cmdrdredd - Thursday, September 27, 2012 - link

    What's clear is that AMD cannot compete in the benchmarks that matter to most people who read these sites (how fast does it transcode my video vs. an i5?), so they try to hide that behind GPU performance charts.

    It's like Apple misleading people about the performance of their CPUs back in the day.
    Reply
  • silverblue - Thursday, September 27, 2012 - link

    Amusingly, you'd think it would easily beat an i5 at transcoding... :P Reply
  • Taft12 - Sunday, September 30, 2012 - link

    Uhh the benchmarks the readers of this site care about are the ones that ARE here - the gaming benchmarks. AT readers are intelligent enough to know CPUmark, Sandra, etc mean less than nothing. Reply
  • torp - Thursday, September 27, 2012 - link

    The A10 65W looks like it has the same GPU and about a 10% lower CPU clock. Now THAT part could be really interesting for a low-cost PC... Reply
  • rarson - Thursday, September 27, 2012 - link

    Crossfire? Pairing one of these with a mid-range card in a hybrid Crossfire setup would be pretty awesome in an HTPC setup. Almost like a next-gen console, but much better. Reply
  • RU482 - Thursday, September 27, 2012 - link

    Looking to upgrade a couple of low-power SFF systems with one of those 65W CPUs. Wonder how much an ITX mobo will run. Reply
  • Kougar - Friday, September 28, 2012 - link

    Given that "preview" appeared nowhere in the title, it would have been nice if the Terms of Engagement section had been at the very top of the "review", to be completely forthright with your readership.

    I read down to that section and stopped, then went looking through the review for CPU benchmarks that didn't exist. I can thank The Tech Report for posting an editorial on AMD's "preview" clause before I realized what was going on.
    Reply
  • Omkar Narkar - Friday, September 28, 2012 - link

    Would you guys review the 5800K crossfired with an HD 6670?
    Because I've heard that when you pair it with a high-end GPU like the HD 7870, the integrated graphics cores don't work.
    Reply
  • TheJian - Friday, September 28, 2012 - link

    Why was this benchmark used in the two reviews before the 660 Ti launch, and here today, but not in the 660 Ti article Ryan Smith wrote? This is just more evidence of bias. He could easily have run it with the same patch as in the two reviews before the 660 Ti launch article. Is it because in both of those articles the 600 series dominated the 7970 GHz Edition and the 7950 Boost? This is, at the very least, hard to explain. Reply
  • plonk420 - Monday, October 01, 2012 - link

    are those discrete GPUs on the charts being run on the AMD board? or a Sandy/Ivy? Reply
  • seniordady - Monday, October 01, 2012 - link

    Please, can you run some tests on the CPU, not only the GPU? Reply
  • ericore - Monday, October 01, 2012 - link

    http://news.softpedia.com/news/GlobalFoundries-28n...

    Power leakage reduced by up to 550%; wow.
    What an unintended coup for AMD, haha, all because of GlobalFoundries.
    Take that, Intel.

    AMD is also the first one working on Java GPU acceleration.
    Reply
  • shin0bi272 - Tuesday, October 02, 2012 - link

    This is cool if you want to game at 1366x768 at low settings... but who does that anymore? When you bump up games like BF3 or Crysis 2 (which you didn't test but Tom's did), the FPS falls into the single digits. This CPU is fine if you don't really play video games or have a 17" CRT monitor. The thing I find funny is that in all the games a $100 Nvidia GPU beat the living snot out of this APU. Other than HTPC users who want video output without having to buy a video card, or someone who doesn't play FPS games but wants to play FarmVille or Minecraft, no one will buy this thing. Yet people are still trying to make it out to be a gaming CPU/GPU combo, and it's just not going to satisfy anyone who buys it to play games on; that's disingenuous. Reply
  • Shadowmaster625 - Tuesday, October 02, 2012 - link

    When you tested your GT 440, you didn't do it on this hardware, right? If you were to disable the Trinity GPU and put a GT 640 in its place, do you think it would still do better? Or would its score be pretty close to that of the iGPU? Reply
  • skgiven - Sunday, October 07, 2012 - link

    No idea what the Nvidia GT 440 is doing there; where are the old AMD alternatives?

    Given the all too limited review, I don't see the point in comparing this to Nvidia's discrete GT 640.
    Firstly, it's not clear whether you are comparing the APUs to a DDR3 GT 640 (of which there are two: April, 797MHz, and June, 900MHz) or to the GDDR5 version (all 65W TDP).
    Secondly, the GT 640 has largely been superseded by the GTX 650 (64W TDP).
    So was your comparison against the 612 GFLOPS model, the 691, or the 729 GFLOPS version?
    Anyway, the GTX 650 is basically the same card but is rated at 812 GFLOPS (30% faster than the April DDR3 model). Who knows, maybe you intended to add these details along with the GTX 650 Ti in a couple of days?

    If you are going to compare these APUs to discrete entry-level cards, you need to add a few more cards. Clearly the A10-5800K falls short against Intel's more recent processors for most things (nothing new there), but it totally destroys anything Intel has when it comes to gaming, so there is no point in over-analysing that. It wins that battle hands down, so the real question is: how does it perform compared to other gaming APUs and discrete entry-level cards?

    I'm not sure why you stuck to the same 1366x768 screen resolution. Can this chip not operate at other resolutions, or can the opposition not compete at higher ones?
    1366x768 is common for laptops, but I don't think these 100W chips are really intended for that market. They are for small desktops, home theatre, and entry-level (inexpensive) gaming systems.

    These look good for basic gaming systems and, in terms of performance per $ and watt, even for some office systems, but their niche is very limited. If you want a good home theatre/desktop/gaming system, throw in a proper discrete GPU and run at a more sensible 1680 or 1920 for real HD quality.
    Reply
