271 Comments

  • codedivine - Tuesday, May 15, 2012 - link

    Does the GPU support fp64?
  • JarredWalton - Tuesday, May 15, 2012 - link

    I would assume so, though like most consumer GPUs it's going to be 1/16 FP32 performance or something similarly dire. If you have a quick test I could run, I'll be happy to report back.
  • codedivine - Tuesday, May 15, 2012 - link

    Thanks. Can you post the relevant output from "clinfo.exe"? This utility should either be part of new Catalyst releases, or you can alternatively install AMD's APP SDK, which includes it as a prebuilt utility. It will list a lot of things, including the extensions supported in OpenCL, and it will show two available devices: one for the CPU and one for the GPU. If the GPU side lists cl_khr_fp64 (or the less-compliant cl_amd_fp64), then it supports FP64 in OpenCL.

    Not sure about how to test the rate.
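
    If you'd rather do the same check programmatically than read clinfo output, a minimal sketch against the standard OpenCL C API would look roughly like the following (untested; it just queries CL_DEVICE_EXTENSIONS on every device and searches for the FP64 extensions; it assumes CL/cl.h and the OpenCL library from the APP SDK are available, and error checking is omitted for brevity):

    /* Sketch: list every OpenCL device and whether its extension string
     * advertises FP64 support (cl_khr_fp64 or the AMD-specific cl_amd_fp64). */
    #include <stdio.h>
    #include <string.h>
    #include <CL/cl.h>

    int main(void) {
        cl_platform_id platforms[4];
        cl_uint np = 0;
        clGetPlatformIDs(4, platforms, &np);
        for (cl_uint p = 0; p < np; ++p) {
            cl_device_id devs[8];
            cl_uint nd = 0;
            clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_ALL, 8, devs, &nd);
            for (cl_uint d = 0; d < nd; ++d) {
                char name[256] = {0}, ext[8192] = {0};
                clGetDeviceInfo(devs[d], CL_DEVICE_NAME, sizeof(name), name, NULL);
                clGetDeviceInfo(devs[d], CL_DEVICE_EXTENSIONS, sizeof(ext), ext, NULL);
                /* FP64 is available if either extension shows up in the list. */
                int fp64 = strstr(ext, "cl_khr_fp64") || strstr(ext, "cl_amd_fp64");
                printf("%s: FP64 %s\n", name, fp64 ? "supported" : "not supported");
            }
        }
        return 0;
    }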
  • JarredWalton - Tuesday, May 15, 2012 - link

    Both cl_khr_fp64 and cl_amd_fp64 are listed as supported on both the CPU and GPU. Full CLinfo output is available here:

    http://images.anandtech.com/reviews/mobile/Trinity...
  • codedivine - Tuesday, May 15, 2012 - link

    That is great! Your help is greatly appreciated!
    So I guess I am buying a Trinity then, as it will simplify my OpenCL development workflow.

    Sidenote: I have some OpenCL code under development (as part of my grad research) that might be useful to you for benchmarking purposes as well. Will get back to you about that in a few weeks.
  • JarredWalton - Tuesday, May 15, 2012 - link

    Cool -- I'd love to see more OpenCL benchmarks, especially if they're actually meaningful to other people!
  • GullLars - Tuesday, May 15, 2012 - link

    This may not be relevant, but I think the next-generation APU with the GCN graphics architecture is where we will really start to see the benefit of using GPGPU on mainstream integrated GPUs.
    Various GPGPU benchmarks show a significant increase going from VLIW4 to GCN.

    I hope to get into OpenCL, GPGPU and parallel programming too at a later point. Currently I'm studying for a BS in Computer Engineering: Embedded Systems. I love it, as it gives insight into both software and hardware at all levels, from logic gates to complete systems.
  • Brutalizer - Tuesday, July 10, 2012 - link

    WOW! Does the Trinity only have two cores? That is brutal!

    AMD's version of Hyper-Threading is the Piledriver module: it duplicates several components, which is much better than Intel's Hyper-Threading. So one Piledriver module corresponds to one Intel core with Hyper-Threading.

    So what AnandTech is actually testing is a four-core Intel CPU vs. a two-core AMD CPU. It is no surprise that four cores beat two cores, but the cool thing is that AMD's two cores do very well compared to Intel's four!

    If I were AMD, I would say that Trinity only has two cores and still matches a four-core Intel CPU! That is much better marketing than claiming Trinity has four cores (which it does not) and getting beaten by Intel's four-core CPUs.
  • moozoo - Tuesday, May 15, 2012 - link

    Thanks!
    I too was after the CLInfo output for Trinity to find out its FP64 support.
  • MySchizoBuddy - Tuesday, May 15, 2012 - link

    So it only supports OpenCL 1.1, not the newer 1.2?
  • AlB80 - Tuesday, May 15, 2012 - link

    No.
    It's official information.
  • JarredWalton - Tuesday, May 15, 2012 - link

    Except according to CLInfo, it does. Nice try?
  • AlB80 - Tuesday, May 15, 2012 - link

    Oops. It was.
    Now it has fp64 = 1/16 fp32.
  • princehamlet - Tuesday, May 15, 2012 - link

    I was constantly refreshing the page at 12 AM! Couldn't wait for the reviews to be posted after the embargo was lifted :D.
  • BSMonitor - Tuesday, May 15, 2012 - link

    Why? Nothing earthshattering here. AMD is scalping resources from the CPU to add TDP room and die space for more of its bulky Radeon shaders.

    It's like buying a laptop from 2004, with a DX11 upgrade.

    AMD has the "good enough" part backwards. People want their laptop to be responsive when doing work, watching movies, browsing, etc. -- CPU-intensive tasks. The "good enough" part, with regard to laptops, would be the gaming. No one expects 60 fps at 1080p out of a laptop sitting on a plane flying somewhere.

    Way to capture the hearts of the 1% of the 1% of people looking for great gaming from their $500 laptop.
  • Articuno - Tuesday, May 15, 2012 - link

    Considering the CPU part is better than mobile Core 2 Duo parts (and thousands upon thousands of people are still using laptops with C2Ds) and the GPU part is several orders of magnitude better than Intel's best, I'd say buying an Intel laptop is like buying a laptop from 2004: expensive and extremely low price/performance for what you get.
  • JarredWalton - Tuesday, May 15, 2012 - link

    Whoa... several orders of magnitude? So, like, 1000X better? Because if anyone can offer up a GPU that's 1000 times faster than even GMA 4500, I'd take it! Turning down the hyperbole dial: AMD still has better drivers than Intel, but it's more like 20% better (just to grab a nice number out of thin air). Trinity's GPU is about 20% faster than HD 4000 as well, so that makes Trinity's GPU a whopping 44% better (1.2 x 1.2 = 1.44) than "Intel's best".

    Now if you want to talk about the best Core 2 era IGP, then we'd be looking at more like an order of magnitude improvement. GMA 4500MHD scores around 1000 in 3DMark06, in case you were wondering (http://www.anandtech.com/show/2818/6). I know, it's only 3DMark -- still, call it 500 as a penalty for lousy drivers and HD 7660G is still "only" about 20X better.

    /meaningless debate
  • Articuno - Tuesday, May 15, 2012 - link

    Fair enough, kind of a knee-jerk reaction out of me there. Though I'm guessing the APU will be cheaper than the i7s it's going up against even without a discrete card added on top of them, so it's got very nice price/performance potential.
  • jensend - Tuesday, May 15, 2012 - link

    Yes, his "orders of magnitude" was hyperbole- but Intel's benchmark scores esp 3dmark really haven't reflected how awful their GPUs have been. The performance difference in real games was usually much bigger than that in synthetic benchmarks. You already mentioned driver issues. Even if you could get halfway decent performance out of some games, image quality was often a huge problem. If AMD or nV had offered that crappy of image quality they would have been totally excoriated in the press for cheating in order to inflate benchmarks; people didn't do that to Intel- probably because it would have felt like beating a handicapped child.

    But Sandy had some real improvements and then Ivy Bridge really turned things around for Intel. Beyond the performance improvements, after years of making excuses for their AF and saying that AA was unnecessary, they finally stopped making excuses and fixed them. Trinity is faster, but anybody who says that Ivy Bridge's graphics don't offer Trinity's any competition is badly mistaken.
  • Spunjji - Tuesday, May 15, 2012 - link

    Indeed - I was honestly pleasantly surprised to see HD4000 sitting so high in the charts. Finally I won't need to start warning people against Intel notebooks!

    ...except for the small problem of HD2500. Still, improvement is improvement.
  • Wierdo - Tuesday, May 15, 2012 - link

    I'll start recommending an integrated Intel GPU once I feel more confident about their driver support, which is more important than performance.

    At least now the IGP in Ivy Bridge is a decent solution for basic gaming needs, but they really need to work on their drivers, no more of that "driver update five years after product is obsolete" bs.
  • Spunjji - Wednesday, May 16, 2012 - link

    I agree, although even that seems to have improved somewhat; at least from the base standard of "what works to begin with". Here's hoping for further progress.
  • medi01 - Wednesday, May 16, 2012 - link

    Except there is one thing which isn't visible on charts: quality. Check how horrible Intel's AF is in Tom's Trinity review.
  • fumigator - Tuesday, May 15, 2012 - link

    "AMD still has better drivers than Intel, but it's more like 20% better "

    Unluckily I had an Ivy Bridge HD 4000 notebook sitting here for a week, and out of 29 games, only 60% were barely playable (performance-wise), 15% crashed, and the rest ran with strange artifacts but were stable enough.

    While I don't worry a lot about gaming on a laptop, the fact is that Intel is way behind AMD here, and that's before we even get to 3D render quality. Oh my, you only have to take a look at that and you won't doubt it for a second. AMD and NVIDIA render better.

    While I still hoped for more from Trinity, I'm not going to make a judgment until I grab one, put a decent super-fast RAM module in it, and go testing.
  • JarredWalton - Tuesday, May 15, 2012 - link

    Could you please provide a list of the games that were unplayable and those that crashed? I'd love to be able to confirm problems.
  • frozentundra123456 - Tuesday, May 15, 2012 - link

    The gpu is "several orders of magnitude" better than intel's best? You do realize that an order of magnitude is 10x right? So how many orders of magnitude better 2, 3, 4 (100x, 1000x, 10000x). Overstate much?

    In fact trinity is on average what, 20-30% better than HD4000? Hardly one order of magnitude much less multiple orders of magnitude.

    Overall the chip is OK, but I was actually hoping for more improvement on the GPU side. Yes, it is improved, but not enough to really make a game go from unplayable to playable.
  • BSMonitor - Tuesday, May 15, 2012 - link

    Most laptops from that time did not use Intel's IGP on the chipset. Most had AMD or NVIDIA dGPUs. And these new Trinity APUs probably compete pretty closely with those.
  • RussianSensation - Tuesday, May 15, 2012 - link

    Core 2 Duo came out in 2006, not 2004.
    Intel doesn't sell Core 2 Duo laptops in mainstream segments anymore. They sell IVB laptops.

    Are you implying that a C2D user will want to upgrade to Trinity?

    Most users would be far better off getting an Intel laptop with a low-end Kepler GPU such as the GT 640 than this.
  • Lugaidster - Tuesday, May 15, 2012 - link

    Assuming equal pricing, sure. But there's no way an IVB laptop with a low-end Kepler will be anywhere near a Trinity laptop in terms of pricing. Most likely it will be a few hundred dollars more, which, depending on the target, can make a difference.

    Sure you can recommend that, but not everyone will see it as worth the extra couple of hundred dollars. Considering that, aside from CPU performance, a Trinity notebook is roughly equal to an IVB one, with price in its favour, they can sell lots of these.
  • JarredWalton - Tuesday, May 15, 2012 - link

    I expect we'll see dual-core IVB with Kepler going for around $800-$850 at the low end of the scale. I also expect we'll see a lot of the early Trinity laptops with A10-4600M selling for closer to $700. Hopefully I'm wrong on the Trinity side, but they did the same thing with Llano. "It's new! Charge more for it!" Not AMD's fault at all, obviously, but still irritating.
  • zepi - Tuesday, May 15, 2012 - link

    You've got it backwards.

    Stuff is priced according to the value it has for customers, to get as much money from the product as possible regardless of manufacturing costs -- or at least that's what everybody is aiming for. Trinity is going to be cheap only because it's not good enough to get sales if priced higher.

    The best possible outcome for everybody would have been for the cheapest Trinity-based laptops to cost about $1500, but be about as fast as an Ivy Bridge quad-core desktop with a GeForce GTX 680 while still achieving battery life of about 8 minutes per Wh. And performance and price would both have only gone upwards from there.

    That kind of performance dominance would force Intel and NVIDIA to drop their prices considerably (getting us the cheap laptops regardless of Trinity being pricey), and we'd still have the option to go for über Trinitys if we had the cash.

    And it would save AMD from bankruptcy, ensuring that we'd have competition in the future as well.

    Llano, Brazos and Bulldozer are all horrible products for AMD. A good product is characterized by the fact that it is worth considerably more to the customer than it costs to manufacture. If a product is good, it's easy to price it accordingly, and people will still buy it. AMD's CPUs are apparently very bad products, because AMD is making huge losses at the moment. And I don't think it's the GPU division that's causing those losses.
  • JarredWalton - Tuesday, May 15, 2012 - link

    Products are priced according to where the marketing folks think they'll sell. All you have to do is walk into Best Buy and talk to a sales person to realize that they'll push whatever they can on you, even if it's not faster/better. And I think the bean counters feel they can sell Trinity at $700 or more--and for many people, they're probably right. We'll see $600 and $500 Trinity as well, but that will be the A8 and A6 models, with less RAM and smaller HDDs.

    As far as competition, propping up an inferior product in the hope of having more competition isn't healthy, and if AMD has a superior product they simply charge as much as Intel. NVIDIA is the same. If someone came out with a chip that had the CPU performance of IVB and the GPU performance of a GTX card, all while using the power of Brazos...well, you can bet they'd charge an arm and a leg for it. They wouldn't sell it for $1500, they'd sell it for $2500--and some people would buy it.

    Ultimately, they're all big businesses, and they (try to) do what's best for the business, so I buy whatever product fits my needs best. I wish Trinity were more impressive, particularly on the CPU side of the equation. I think if Trinity's CPU were as fast as Ivy Bridge, the GPU portion would probably end up being 50% faster than HD 4000; unfortunately, there are titles that require more CPU work (Skyrim for instance) and that starts to level the playing field. But wishing for something that isn't here, or playing the "what if" game, just doesn't really accomplish anything.
  • Targon - Wednesday, May 16, 2012 - link

    And you can get a quad-core A6 laptop for under $500 right now. If you pay attention, you generally get what you pay for. For most users, going with an AMD quad-core laptop does provide a decent product for the price. For some, CPU power is more important, and for others, a more well rounded machine is more important. I expect that A10-4600 laptops will start closer to $600 than $700, unless you are looking at machines with a large screen, discrete graphics, or something else that increases the prices.
  • CeriseCogburn - Wednesday, May 23, 2012 - link

    What you're all missing is all the then-second-tier Optimus laptops that will have much deflated pricing, as well as the load of $599 AMD discrete laptops that will sell like wildfire and please those who waited -- just like the AMD fans constantly wait for NVIDIA to release so they can snag a second-tier, deflated-price AMD card.

    Since "the CPU doesn't matter!", as we have been told, there's no excuse not to snag a fine and cheap Optimus that won't have an IB.

    This is the "best time in the world" for all the AMD fans to forget all prior generations of laptops and pretend, quite unlike in the video card area, that nothing else exists.

    I love how AMD fans do that crap.
  • evolucion8 - Tuesday, May 15, 2012 - link

    Also remember that Penryn launched in 2007-2008, and Core 2 Duo laptops were still being released until late 2009. I have a Gateway MD7309u that launched in October 2009 and still feels very snappy with good battery life, but I hate its GMA 4500M with my whole heart.....
  • Nfarce - Tuesday, May 15, 2012 - link

    Yeah, well, I don't understand the point of buying a low-to-mid-range laptop expecting to enjoy playing games at a basic laptop 1366x768 resolution. What's the point?

    You can spend around $1,200 on a mid-range i5 Turbo Boost laptop with a discrete GPU and a 1600x900 screen that plays games decently without completely shutting down the eye candy sliders. Save up and get a better laptop -- an Intel with a dedicated AMD or NVIDIA GPU. If you can afford $600 now, you can afford $1,200 down the road and enjoy things much better.
  • CeriseCogburn - Wednesday, May 23, 2012 - link

    I agree, but the famdboy loves to torture itself and claim everyone else loves cheap frustrating crap -- often characterized as a "mobile employee on the road, in the airplane, or at the hotel spot" needing a "game fix"... (in other words someone flush enough to buy +discrete), as you pointed out.
    The rest of the tremendous and greatly pleased "light gamers" will purportedly be playing at work (no, scratch that) or on their couch at home (that sounds like the crew)... and then one has to ask why they aren't using one of the desktops at home for gaming... a $100 vidcard in that will smoke the crap out of "the light gamer".

    That leaves "enthusiasts" who just want to play with it and see for a few minutes if they can OC it and "how it does" with games... and after that they will want to throw it at a wall for how badly it sucks -- not to mention their online multiplayer avatar will get smoked so badly their stats will plummet... so that will last all of two days.

    So we get down to who this thing is really good for -- and I suppose that's the young teen to pre-teen brat -- as a way to get the kid off mommy's or daddy's system so they can have the reins uninterrupted... so the teeny bopper gets the crud-low-$ cheap Walmart lappy system, which should also keep them tamed, since being too rough with it means the thing snaps in half as the plastic crumbles.

    Yep -- there it is -- the teeny bopper punkster will just have to live with the jaggied, pixelized, low-end, no-eye-candy crawler -- and why not, they still love it much more than homework and have no problem eyeballing the screen.
  • Latzara - Tuesday, May 15, 2012 - link

    While I agree with the 'nothing earthshattering' part, I have to wonder what kind of average Internet browsing usage you are commenting on when you say 'People want their laptop to be responsive when doing work, watching movies and browsing'. Most of the CPUs on the entire board presented here are enough for work -- not graphics modelling, mind you, but Excel, DB, mail, presentations, average calculation loads, and even smaller programming projects -- which constitutes most of the workload an average worker is going to have. Movies stopped being an issue long ago, and what kind of browsing are we talking about that will make your platform unresponsive (I don't mean frozen)? 25 tabs at once? Because I've done that with a much weaker platform and had no issues...

    The main problem I see is that the platform hasn't moved as much as people hoped, but enough to be a new iteration in terms of progress -- and with the right pricing it could be the sweet spot for many of the broader average consumers, not just the '1% of the 1% of people looking for great gaming'...
  • BSMonitor - Tuesday, May 15, 2012 - link

    Load up a couple of Java runtime environments in those browsers. Some Flash. I did have an "etc." in there. I am a multi-tasker and cannot stand waiting any amount of time. For the majority of real laptop owners, a late Pentium M or Athlon 64/X2 is not enough power for any real work.
  • Spunjji - Wednesday, May 16, 2012 - link

    Please define a "real" laptop owner? I own an Alienware and I don't do any of that sort of crap. Mind you, most users I have met express more patience than you do, too. regardless, in none of these metrics do you appear to represent the majority, which is the target market for this chip.
  • medi01 - Thursday, May 17, 2012 - link

    That's simply BS, my dear.
    Most of the "starting" tasks are HDD-bottlenecked.
    Having a lot of apps loaded is RAM-constrained.

    It has ABSOLUTELY NOTHING to do with CPU power.
  • B3an - Thursday, May 17, 2012 - link

    Yeah because the CPU just does nothing, at all, ever. It just sits there totally idle all the time right.
  • medi01 - Friday, May 18, 2012 - link

    That's exactly what modern CPUs do (and have for quite a while) in the majority of PCs most of the time: IDLE. You didn't know that? Oh well. (Gamers being an exception.)

    I encode video a lot (so far the most CPU-intensive task besides gaming that you can imagine at home, though it seems to be shifting towards the GPU), and even that is a "batch" task; I couldn't care less whether it finishes at 2 or 3 at night.
  • sviola - Tuesday, May 15, 2012 - link

    Well, load up Eclipse or Visual Studio, a local DB server, a local application server, a few browser windows, some spreadsheets and documents, a couple of IM clients and some terminal windows, and you've got my usual work setup running. It can be demanding on the CPU and sometimes makes the system slow down. And waiting for the system to respond can be frustrating in those situations, not to mention having to compile the whole project...
  • mpschan - Tuesday, May 15, 2012 - link

    Are you really expecting to do all that processing on a 500-700 dollar laptop? You're clearly doing work-related tasks, and that's not the target consumer here.
  • Spunjji - Wednesday, May 16, 2012 - link

    +100. Tired of this ass-hattery.
  • mikato - Wednesday, May 16, 2012 - link

    I think your biggest problems there would be Eclipse/Visual Studio. Also if you're using MS SQL Server as the DB (with Management Studio?) then definitely that too. And I don't know what app server you mean. Compiling projects? This stuff can be slow on a desktop, come on. How about using a lighter IDE or Notepad++ or MySQL for the DB.... or doing this work on a desktop like everyone else.
  • medi01 - Thursday, May 17, 2012 - link

    The mentioned example sickens me. I do similar stuff, except I also have a virtual machine running on top of things.

    NOT ENOUGH RAM is what you run into in these situations, not a slow CPU!!!!
  • CeriseCogburn - Thursday, May 24, 2012 - link

    Well then, all Core 2 Duo users can just upgrade their RAM; no need for this Trinity laptop crap.
    No buying anything but some cheap RAM.
    Glad to hear it.
  • xd_1771 - Tuesday, May 15, 2012 - link

    BSMonitor, none of those tasks are CPU-intensive, and they will consume less than 50% of that CPU.

    Your analogy is completely backwards. With programs -- INCLUDING video playback programs (seriously, name a popular video codec that isn't going to be accelerated by this GPU; the standard has existed for a number of years), office programs (Office 2010 and up) and pretty much every major web browser in existence -- having moved on to making heavy use of GPU acceleration, the CPU will start to matter less and less. The Trinity APU is very well balanced for the performance it offers on both sides.

    With more TRUE cores than the competition to spread non-intensive CPU workloads around, it arguably offers much better multitasking ability.
  • Stas - Tuesday, May 15, 2012 - link

    I agree with xd_1771. A mid-range CPU from 2 years ago is plenty for any CPU requirements an average user might have (Office, browser, IM, pr0n). The one thing that's been limiting laptops for generations is the GPU. AMD has brought serious graphics to the laptop. Not only do you benefit through improvements in gaming and 3D software; with almost every resource-intensive application becoming GPU-accelerated, you get better performance all around.
  • zephxiii - Tuesday, May 15, 2012 - link

    I am using this old ThinkPad T60 with a 1.66GHz C2D, built 4/2007, and it is plenty fast enough for regular use, lol. The only thing that really sucks about it is the spinning HD.

    I use a T61p at home with a 2.2GHz C2D, a Quadro something HD, and an SSD, and that thing does everything I need it to (Photoshop, Lightroom, Internet, Flash video, etc.).

    Both running Windows 7.
  • BSMonitor - Tuesday, May 15, 2012 - link

    Nope, I promise that a 1.66GHz C2D lags in Flash-enhanced/Java runtime environments. Especially when multitasking.

    Please quit defining regular use as acceptance of slow. Drop even a Core 2 Quad Q9550 in that home PC, and I promise you would not go back.
  • Belard - Tuesday, May 15, 2012 - link

    I have a ThinkPad R61 with a 1.6GHz PDC (bottom-end Core 2 with missing cache). I bought it for $550 off the shelf, new, when Vista was about 8 months old. It came with XP Pro, 1GB RAM and, more importantly, a matte screen. I use it almost every day, and since then I've added 1GB and Windows 7, and it runs better than it ever did when it was new.

    It's slow compared to my C2Q Q6600, but the R61 does what I need from a mobile system. I sure don't like using Photoshop on it. But it's mostly for browsing, Office apps and transferring data/work.

    It's still faster than ANY Pentium 4-class CPU.

    I have an urge to go Ivy Bridge this year... but my Q6600 doesn't really keep me waiting much (other than video encoding) with 4GB / Win7. Nope, going on vacation this summer out-weighs a new computer. :)
  • BSMonitor - Tuesday, May 15, 2012 - link

    Stop telling everyone what CPU is GOOD enough. There truly is software out there that my Core 2 Duo at work lags behind on. My Core i7 system at home is remarkably smoother and more responsive. Neither has an SSD.
  • tfranzese - Tuesday, May 15, 2012 - link

    For a user who can't stand to wait, you've got your priorities screwed up if you're not using an SSD on those systems.
  • evolucion8 - Tuesday, May 15, 2012 - link

    I wonder which kind of software runs too slow on a C2D. I have an i7-2600K at 4.5GHz, which is much faster in WinRAR, media encoding, gaming, etc. But running everyday tasks like web browsing, office and media playback doesn't feel much different between my Core 2 T9300 and my i7 machine. My laptop does have decent encoding power, which is very tolerable, though my i7 definitely destroys it; but considering that my C2D has a 35W TDP, I don't mind losing some performance for the sake of lower heat dissipation and battery consumption.
  • vegemeister - Tuesday, May 15, 2012 - link

    We were just getting to the point where a CPU could be good for 6-8 years, but then the web developers started making applications and desktop environments. Not to mention the horrors of flash and Java. What Intel giveth, web 2.0 taketh away.
  • medi01 - Thursday, May 17, 2012 - link

    Bullshit.

    Most of web 2.0 nowadays "also has to run on tablets", and there's no way in hell it's "Java based", or "Flash based", or CPU-intensive.
  • seapeople - Tuesday, May 22, 2012 - link

    You people are a little crazy coming up with exotic applications that stress CPUs. It's much simpler than that.

    I'm running a Q2720m with an Intel SSD and fiber optic internet, and I notice immediately if I turn Turbo Boost off while browsing standard webpages with Chrome + Adblock. My browsing is noticeably CPU-limited, especially in cases where I'm clicking through dozens of large webpages to find a specific page I'm looking for (such as browsing backwards through poorly designed blogs).

    I would detest running something with the single-threaded speed of AMD's latest offerings. Of course, that's why I'm not in that target market.
  • CeriseCogburn - Thursday, May 24, 2012 - link

    And thus anyone who has a laptop already, and especially anyone with an old Optimus, has no need to spend $700 or $800 on a Trinity laptop -- just a few bucks on some RAM.
    If they have a lower-end MXM discrete they can just upgrade that and not have the hassle of setting up another whole system, spending days transferring data and configuring, and then crossing their fingers that the AMD Trinity doesn't glitch out.

    We can also forget any sort of "future proofing" with Trinity, unlike the Optimus systems or any discrete laptop, many available for a couple hundred less now because their Core 2 processors are considered "old" -- but thanks to you brilliant AMD fans we now know that doesn't matter at all, so might as well get a real laptop gaming machine.

    Thanks for the hot tips.
  • BSMonitor - Tuesday, May 15, 2012 - link

    Sure it is. If you browse Flash-enhanced, Java runtime pages with embedded Adobe Reader, etc., and do that while working in a Java runtime software package, Visual Studio, or an Office app with a bunch of fancy tables and drawing tools.

    It is NOT enough CPU. I would not tolerate it.
  • Taft12 - Tuesday, May 15, 2012 - link

    OK you win. You've convinced us that Trinity is too slow for 1% of users.

    In other words, you've convinced us that this product is a winner. I just knew you were an AMD fanboi at heart!
  • Spunjji - Wednesday, May 16, 2012 - link

    LOL
  • medi01 - Thursday, May 17, 2012 - link

    A minimized Visual Studio NEEDS NO CPU POWER.
    An inactive embedded Adobe Reader NEEDS NO CPU POWER.
    An inactive Office app with tables and whatnot NEEDS NO CPU POWER.
    A miraculous "Java runtime software package", as well as Flash, would only consume more than a tiny bit of CPU if it was written by drunken monkeys.

    RAM is what you need when multitasking. A lot of it.
    And a fast HDD.

    Slapping an SSD into your PC/notebook would speed the mentioned things up tremendously. A faster CPU would not.
  • kshong - Tuesday, May 15, 2012 - link

    I have read many of your comments on AMD and Intel and I have a slight suspicion that you're a hard-to-please person.
  • Spunjji - Tuesday, May 15, 2012 - link

    That was putting it kindly. ;)
  • BSMonitor - Tuesday, May 15, 2012 - link

    Should I expect a brand new PC to behave like one from 6 years ago for some applications??
  • Spunjji - Wednesday, May 16, 2012 - link

    Thing is, it doesn't. Thus your argument fails.

    Good day to you, sir! :D
  • mikato - Wednesday, May 16, 2012 - link

    Spunjji - Anandtech comments area would like to hire you as the new BS Monitor! We had a problem with the old one.
  • duploxxx - Tuesday, May 15, 2012 - link

    If you want a more responsive laptop then buy an SSD; no one needs a laptop like the recently released Intel quad-core that turbos all the way up to 3.xxx GHz... it's disk performance that is lacking most.
  • jabber - Tuesday, May 15, 2012 - link

    Exactly.

    My main laptop, which I use most of the day, is a 1.3GHz CULV dual-core from Intel. It does everything I need as a roving PC engineer. It benches about the same as an old Pentium D 2.8GHz.

    If I put an SSD in it (which I want to), I'd probably struggle to tell it apart day to day from one with an i5 CPU in it. The CPU makes no difference to me.

    Since dual cores came out in 2005/6, customers don't notice CPU upgrades all that much. Put an SSD in, though, and they notice that. Oh, and if they can play WoW or The Sims with the trimmings, then they love that too.

    Once again, is it possible to add the latest Sims game to the benchmark list? I don't play it myself, but nearly every teen's laptop I get has it on, and I do get asked about good laptops for that game.
  • BSMonitor - Tuesday, May 15, 2012 - link

    Roving PC engineer? You mean you go to meetings and look at Word Documents. That hardly makes you the "average" user. The rest of us do the actual work.
  • Spunjji - Wednesday, May 16, 2012 - link

    Look up the meaning of the word average.
  • CeriseCogburn - Thursday, May 24, 2012 - link

    I'll monitor this one -- since really old CPUs have way more than enough grunt (because the Trinity CPU sucks so badly), and AMD needs another gigantic hand of endless applause and praise from the many frenetic AMD fans flavoring up the place, we have learned one thing --

    All the old laptops are just fine. If you have a discrete GPU of any sort, just use that. Add a little cheap RAM, or add an SSD, and it will beat Trinity no problem, all the time.

    I thank all the Trinity lovers for the hot tip -- no one needs a Trinity laptop.
  • BSMonitor - Tuesday, May 15, 2012 - link

    Not for facebooking and trolling no. But I do real work on my PC.
  • Spunjji - Tuesday, May 15, 2012 - link

    Name the part of their CPU performance that doesn't meet the "good enough" metric. Seriously, convince me of your argument. So far it sounds like the usual crap I hear from people who don't understand the average user's needs.
  • BSMonitor - Tuesday, May 15, 2012 - link

    And you speak for the "average" user like you know 100 million business users. I work in an IT shop. It took years of convincing to get off our P4s and onto Core 2 Duos, because management takes the "good enough" stance like you. Waiting for an email client based on a Java runtime is NOT acceptable. Having Adobe Reader pages lag while browsing and resizing them is NOT acceptable. It STILL lags behind my Core i7 machine at home. And it is VERY noticeable.

    Again: eBay, FB, Yahoo... these are not the majority of users who care about performance.

    In regards to 14-year-olds and housewives, yes, Trinity rules. But then so did those laptops from 6 years ago. For them.
  • sumguy+4 - Tuesday, May 15, 2012 - link

    So you are claiming that Intel's performance has improved by only 20% over the past 6 years??!?

    The article's summary stated that Trinity's CPU deficit vs Intel is 20-25% (while being cheaper and offering better video perf)

    Show us the mobile CPU from 6 years ago that could even come close.
  • juampavalverde - Tuesday, May 15, 2012 - link

    Wait... you're using an email client? Haha... and it's based on Java? LOL! And you're using Adobe Reader? ROTFL!

    A great part of performance comes from using software that gets all the juice from the hardware you have; there is no need, even on the newest systems, to use such bloated software.

    By the way, I have an old HP 6910p with a C2D, 2GB DDR2 and GM965 graphics; it kicks butt, and I'm thinking of throwing a cheap SSD in it because it is good enough for any task I need to do (office apps, web apps, remote connections).
  • CeriseCogburn - Thursday, May 24, 2012 - link

    The average home user needs Adobe Reader, guy. They won't get by without it -- the home user always has some Adobe PDF they have to be able to open. So that's the whole base.
    You have failed.
  • Spunjji - Wednesday, May 16, 2012 - link

    You have quite massively misinterpreted the type of person I am. I would not have a laptop running an SB i7 if all I wanted was "good enough" and I hate management who ignore the genuine requirements of their users.

    The point I am making is that Trinity does *not* perform like a 6 year old CPU. It is about 2 years behind Intel, granted, but most users I have encountered are 2 years back from that in their requirements. There are edge cases and you appear to be one. Good for you. I do not wish for this to be a pissing match, I do not know 100 million business users, I have just spent a long time mediating between ordinary people and I.T. and I do it for a living.

    I will be recommending Trinity to people looking for a new laptop that is good for multi-use roles, won't break the bank, and oh yes is *new*. 6 year old processors are irrelevant when you're buying a new machine, in case you hadn't noticed, and people don't just replace laptops because they want something *faster*. Your inability or unwillingness to realise this puts you at a significant distance from the majority of buyers.
  • CeriseCogburn - Thursday, May 24, 2012 - link

    No, Core 2 Duos are still all over the place, brand new, so maybe your customers are getting really screwed.
  • yyrkoon - Tuesday, May 15, 2012 - link

    "It's like buying a laptop from 2004, with a DX11 upgrade."

    Not even close. I actually own a Llano system -- an A6-3400M that in reality only operates at 1400MHz, despite the marketing hype that it turbos to 2300MHz. Once the CPU heats up a little, 1400MHz is all you get. The lack of responsiveness you mention is all in your head.

    Besides, when / if there were application loading slowdowns, I would have to question the application itself, the storage medium I am using, or the fact that my own system is currently only running single-channel memory -- first.

    Even if there were some noticeable performance hit, it is not as if I have A.D.D. and could not stand to sit another 5-10 seconds.

    Key word: "priorities". We all have them. College tuition vs. a laptop that will play (next latest greatest game) *really fast*. You have your priorities. You make the call. For your own self.
  • potatochobit - Tuesday, May 15, 2012 - link

    You are doing it wrong.

    If you need more than 1500MHz you need to turn on AMD OverDrive.

    It is not a problem with the computer; it means your BIOS is probably locked by the manufacturer. You might also try Googling K10STAT.
  • yyrkoon - Tuesday, May 15, 2012 - link

    I never said anything about needing more than 1400MHz. I said the responsiveness "issue" was all in the original poster's mind. The point I was alluding to was that the A10 should be perfectly fine with a clock speed of 2300MHz -- even better if it can be clocked higher while remaining in a decent temperature range, which I am thinking it should.

    And yeah, K10STAT works just fine for running the clock up to 2300MHz. *If* I did not care about the laptop CPU temps running at 89C and above. Which, I would add, is the wrong way.

    @rarson

    I can retype it just for you, if my point was not clear. Sorry, the A6-3400M part must have thrown you. EL EL A EN OH -- better? Or still too confusing for you?

    Seriously? Is that the best you've got?
  • Roland00Address - Tuesday, May 15, 2012 - link

    But can we have a follow-up on how mobile Trinity and Ivy Bridge handle Diablo 3? Many laptop purchasers will be buying laptops with the express purpose of playing that game.
  • A5 - Tuesday, May 15, 2012 - link

    I can't imagine it'll be too different from Starcraft 2.
  • dwade123 - Tuesday, May 15, 2012 - link

    Meh. Just grab an Intel and at least a GT 640m. That's what most people would do.
  • kshong - Tuesday, May 15, 2012 - link

    The major problem that I will have with Trinity is price. If Jarred's price estimate is correct (600-700 dollars), I think I will have to buy an Ivy Bridge CPU + NVIDIA GeForce GT 640 computer instead.

    I wish AMD would sell an A10-powered laptop for 450-550 dollars. Now that is a price I am willing to pay. Otherwise, I think I will pay a bit more to get something else.
  • Kaggy - Tuesday, May 15, 2012 - link

    And at lower power consumption.
  • Pantsu - Tuesday, May 15, 2012 - link

    Unfortunately for AMD, the processor is only a small part of the laptop BOM. Even if Trinity is cheaper, the difference in the end product is not that big. Still, the mainstream 15" cheapo laptop is perhaps the last refuge AMD has.

    It looks like Trinity is still the way to go for low-end gaming, but it's disconcerting to see HD 4000 actually beat it in a few tests. AMD really needs to get a better CPU architecture going.

    HSA is still a long way off too, and I just don't see AMD pulling off the developer support necessary for it to really work, unless Intel goes along the same lines, in which case they might as well just give up.
  • iwod - Tuesday, May 15, 2012 - link

    I am guessing Haswell will have some CPU improvements with FMA, new AVX, etc., plus 3x faster graphics... I don't see Trinity or whatever comes after it having much hope.

    I am also wondering if Intel could further improve those games' performance with better drivers. Again, Intel may have caught up in the hardware department (the easier part), but their drivers are still lagging behind.
  • Spunjji - Tuesday, May 15, 2012 - link

    I'd like to think that an AMD CPU with their current GCN graphics on a 28nm-or-smaller process would at least be GPU-competitive with Intel. We live in interesting times, though.
  • duploxxx - Tuesday, May 15, 2012 - link

    You can always believe marketing that they will be able to make that kind of GPU leap, while in the past they have always struggled to improve. Sure, the HD 4000 is already getting better (near Llano performance), but it only shines in a few synthetic benchmarks and where AMD is limited by CPU performance.

    Perhaps try a few higher-res games and see how much is left of the Intel GPU...

    AMD is limited by memory bandwidth; Intel will suffer from the same.
  • cyrusfox - Tuesday, May 15, 2012 - link

    So when will I be able to find a good Trinity laptop with a backlit keyboard in a small 13" form factor, or preferably smaller? I would like to upgrade from Brazos, and Brazos 2.0 isn't going to hack it.
  • JarredWalton - Tuesday, May 15, 2012 - link

    This is the hard part: just about every OEM that makes an AMD laptop ends up targeting the budget market, and in that market they cut every corner possible. Forget quality LCDs (which would add around $10 to the BoM for most laptops), forget better build materials, and forget amenities like backlit keyboards. Oh, I'm sure someone will make a Trinity laptop with a backlit keyboard, but then they'll probably ask $800 for it. HP has their Envy 14 coming in "Sleekbook" trim that might get you a backlit keyboard, though--and its price starts at $700.
  • FrontBoard - Tuesday, May 15, 2012 - link

    Trinity is the name of a river in Texas, just as Comal and Brazos are. That is likely where the codenames come from.
  • kevith - Tuesday, May 15, 2012 - link

    Thanks!
  • JarredWalton - Tuesday, May 15, 2012 - link

    For the record, I *did* know that AMD is grabbing names of rivers. I guess it could be better stated: "I wonder if AMD was trying to have a secondary meaning with their codename." And it was a nice hook for the conclusion. :-)
  • raghu78 - Tuesday, May 15, 2012 - link

    AMD needs to do much better with their CPU performance, otherwise it's looking pretty bad from here.
    Intel's Haswell is going to improve graphics performance much more significantly, and with rumours of stacked DRAM making it to Haswell, it looks pretty grim. We also don't know the magnitude of the CPU performance improvements in Haswell. AMD runs the risk of becoming completely outclassed in both departments. AMD needs to have a much better processor with Steamroller or it's pretty much game over. AMD's efforts with HSA and OpenCL are going to be very crucial in differentiating their products. Also, when adding more GPU performance, AMD needs to address the bandwidth issue with some kind of stacked DRAM solution. AMD's Kaveri with 512 GCN cores is going to be more bottlenecked than Trinity if the CPU part isn't much more powerful and the bandwidth issues are not addressed. I am still hoping AMD does not become irrelevant, because competition is crucial for maximum benefit to the industry and the market.
  • Kjella - Tuesday, May 15, 2012 - link

    Well it's hard to tell facts from fiction but some have said Haswell will get 40 EUs as opposed to Ivy Bridge's 16. Hard to say but we know:

    1. Intel has the TDP headroom if they raise it back up to 95W for the new EUs.
    2. Intel has the die room, the Ivy Bridge chips are Intel's smallest in a long time.
    3. Graphics performance is heavily tied to number of shaders.

    In other words, if Intel wants to make a much more graphics-heavy chip - it'll be more GPU than CPU at that point - they can, and I don't really see a good reason why not. Giving AMD and nVidia's low end a good punch must be good for Intel.
  • mschira - Tuesday, May 15, 2012 - link

    Hellooouuu?
    Do I see this right? The new AMD part offers better battery life with a 32 nm part than Intel with a spanking new 22nm part?
    And CPU performance is good (though not great...)?
    AND they will offer a 25W part that will probably offer very decent performance but even better battery life?

    And you call this NOT earth shattering?

    I don't understand you guys.
    I just don't.
    M.
  • JarredWalton - Tuesday, May 15, 2012 - link

    Intel's own 32nm part beats their 22nm part, so no, I'm not surprised that a mature 32nm CPU from AMD is doing the same.
  • Spunjji - Tuesday, May 15, 2012 - link

    ...that makes sense if you're ignoring GPU performance. If you're not, this does indeed look pretty fantastic and is a frankly amazing turnaround from the folks that only very recently brought us Faildozer.

    I'm not going to chime in with the "INTEL BIAS" blowhards about, but I do agree with mschira that this is a hell of a feat of engineering.
  • texasti89 - Tuesday, May 15, 2012 - link

    "Intel's own 32nm part beats their 22nm part", how so?

    CPU improvement (clk-per-clk) = 5-10%
    GPU improvement around 200%
    Power efficiency (for similar models) = 20-30% power reduction.

  • JarredWalton - Tuesday, May 15, 2012 - link

    Just in case you're wondering, I might have access to some other hardware that confirms my feeling that IVB is using more power under light loads than SNB. Note that we're talking notebooks here, not desktops, and we're looking at battery life, not system power draw. So I was specifically referring to the fact that several SNB laptops were able to surpass the initial IVB laptop on normalized battery life -- nothing more.
  • vegemeister - Tuesday, May 15, 2012 - link

    Speaking of which, why aren't you directly measuring system power draw? Much less room for error than relying on manufacturer battery specifications, and you don't have to wait for the battery to run down.
  • JarredWalton - Wednesday, May 16, 2012 - link

    Because measuring system power draw introduces other variables, like AC adapter efficiency for one. Whether we're on battery power or plugged in, the reality is that BIOS/firmware can have an impact in these areas. While it may only be a couple of watts, for a laptop that's significant -- most laptops now idle at less than 9W, for example (unless they have an always-on discrete GPU).
  • vegemeister - Wednesday, May 16, 2012 - link

    You could measure on the DC side. And if you want to minimize non-CPU-related variation, it would be best to do these tests with the display turned off. At 100 nits you'll still get variation from the size of the display and the efficiency of the inverter and backlight arrangement.
  • texasti89 - Tuesday, May 15, 2012 - link


    A10-4600M's TDP = 35W
    i7-3720QM's TDP = 45W

    I'm pretty sure that Intel's 22nm is more power efficient than any 32nm process available in the industry. The efficiency of Intel's GPU architecture is what makes their graphics solution appear comparable to AMD's Fusion parts.
  • Lolimaster - Tuesday, May 15, 2012 - link

    As is obvious with the biased reviewers.

    Yeah, GJ. Compare a top-of-the-line UBER-expensive IB quad core with the highest TDP and the highest frequency vs. an A10 Trinity which costs 3 times less (if not more) than that i7-3720QM.

    HD 4000 performance is craptastic. Don't fool people with biased comparisons; at medium detail and low res, the CPU takes over. For mobile, every MHz toward 3GHz and above improves performance.

    BUT WE ARE TALKING ABOUT AN IB i7 THAT IS 3x MORE EXPENSIVE than Trinity, with WAY HIGHER MHZ. It's not the pathetic HD 4000 that is shining, it's just the CPU; you could put in an HD 6450M and it would appear "faster" than Trinity if you paired it with a high-end expensive CPU.

    It's like the moronic reviews with an i7-3770K ($300+) vs. an A8-3870K ($120).

    Everyone knows that the real competition is the dual-core i5 at a similar price.

    And again, medium detail, when APUs have proved to offer high quality in most games.
  • JarredWalton - Tuesday, May 15, 2012 - link

    http://www.anandtech.com/bench/Product/600?vs=580

    I've got Mainstream and Enthusiast performance results in there for the games, but there's not much point in running games at 1600x900 High settings at <30 FPS, is there?

    I have a whole section stating why we're including the systems we're including. Are you seriously delusional enough to suggest that we not show HD 4000 performance? There are no other HD 4000 results available for the time being, so either I use the i7-3720QM or I omit Ivy Bridge entirely. For you to imply its inclusion (with the note--italicized even!--that "these two laptops do not target the same market") is somehow biased is in fact far more bias than anything I've shown. And the pricing is twice as high for the ASUS system, not three times -- in fact I'd guess the Trinity laptop would be closer to $800 as configured, since it has Blu-ray and an SSD.

    What's more, throughout the review I've included dual-core i5-2410M results and discussed how AMD's Trinity stacks up. Judging by Sandy Bridge, dual-core Ivy Bridge will be within 10% of the quad-core scores for gaming--it's not like many games can use more than two CPU cores, so it's really just a matter of the HD 4000 clocks being slightly lower on i5 models. You fail to grasp this with your ranting and biased outlook, unfortunately.

    In other words, I think your "moronic reviews" comment reflects your reading comprehension skills--or lack thereof. Better luck next time. You might want to sign up for the remedial math and basic reading classes at the local community college.
  • kyuu - Tuesday, May 15, 2012 - link

    "I've got Mainstream and Enthusiast performance results in there for the games, but there's not much point in running games at 1600x900 High settings at <30 FPS is there?"

    Is that the FPS you get? Did you actually test this, or are you just assuming? Also, you can run 1600x900 without automagically turning up the detail settings to High at the same time. I, for one, am interested to see if the performance advantage over Llano/HD 4000 increases when you shift more of the burden to the GPU side. At 1366x768, it seems like the CPU would still be handling enough to be a substantial bottleneck.
  • JarredWalton - Tuesday, May 15, 2012 - link

    Yes, the scores in Mobile Bench are all actually tested -- including the 5 FPS average score of Trinity at 1920x1080 with 4xAA in Battlefield 3. (Yes, watching that made me feel a bit nauseous....) I could test 1600x900 at medium detail, but I don't expect any major changes from what the existing scores show.
  • Denithor - Tuesday, May 15, 2012 - link

    Actually those results are very interesting to some of us! They lay out what the system can and cannot handle in practical terms. Now, granted, BF3 at 1080p/4xAA is kind of an obvious fail scenario, but 1080p at medium detail might be good to know.

    One real question that I haven't seen mentioned yet -- how come there were no Intel CPU + NVIDIA GPU systems included in this testing? That seemed like a no-brainer to me...
  • JarredWalton - Wednesday, May 16, 2012 - link

    I thought the Acer TimelineU was a good choice. The only other recently tested laptops with Intel + NVIDIA are the Razer Blade (if people complain that N56VM is too expensive, what would they say about a $3500 laptop!?) and the Alienware M17x R3 (completely different class of hardware and again over $2000). The others like Dell XPS 15z came before we changed our game list, so we don't have some of the results for such laptops.
  • vegemeister - Tuesday, May 15, 2012 - link

    CPU speed doesn't become significant at low resolution because the resolution is low, but because the frame rate is high. The CPU must create the scene to be rendered at much higher temporal resolution.
  • bji - Tuesday, May 15, 2012 - link

    I think this was a well written article and that you laid out the facts about as clearly as could be laid out. I agree that Lolimaster has poor reading comprehension and needs some remedial education.
  • raghu78 - Tuesday, May 15, 2012 - link

    OEM laptop pricing is what changes the discussion. The Sandy Bridge stock-clearing fire sale is also a crucial factor, given that a Core i7-2630QM with an NVIDIA GT 555M is at USD 800 and entry-level Core i5 laptops are at USD 550:

    http://www.newegg.com/Product/Product.aspx?Item=N8...
    http://www.newegg.com/Product/Product.aspx?Item=N8...

    The A10 Trinity laptops need to come in at USD 600, with a max of 650 for the best designs, the A8 at 500-550, and the A6 / A4 at USD 400-450. Then they can clearly avoid competing with Core i7 + discrete GPU configs and be considered good alternatives to the low-end Intel Core i5, Core i3 and Pentium/Celeron dual cores with crappy Intel HD 3000 graphics. Not to forget the GPU driver advantage AMD has, very good image quality, and a rapidly growing ecosystem of GPU-accelerated apps.
  • Khato - Tuesday, May 15, 2012 - link

    Really? The A10-4600m is going to be a $126 chip? 'Cause that's what a third of the tray price for an i7-3720QM is.
  • BSMonitor - Tuesday, May 15, 2012 - link

    You get 1/3 the performance on the CPU side.
  • bji - Tuesday, May 15, 2012 - link

    I don't know why I am bothering to respond to you, because your comments are all worthless, but I'd like to point out, to anyone else who might be reading, that the CPU performance numbers are a lot closer to 1/2 to 2/3 of Intel's than 1/3.

    And 1/2 to 2/3 of Ivy Bridge CPU performance is *definitely* fast enough for 95% of users in 95% of circumstances, despite what trolls are claiming.
  • bji - Tuesday, May 15, 2012 - link

    Sorry, forget I said 2/3. That was just one benchmark. Let's just leave it at 1/2.

    I think my point is still valid. 1/2 of Ivy Bridge performance at 1/3 cost is going to be very acceptable to the vast majority of people.
  • JarredWalton - Tuesday, May 15, 2012 - link

    But the problem is you have to buy the whole laptop. If IVB goes for $350 and Trinity for $115, but the rest of the laptop ends up being $400, that means you get half the performance for 70% of the cost. And when Intel ships DC IVB chips for $150, we might be looking at 70% of the performance for 90% of the cost.

    My biggest fear with Trinity (if you couldn't tell from the conclusion) is that the laptop OEMs will price it too high. I think A10 is a decent part, provided you can get a reasonable set of laptop hardware for $600 or less. Anyway, we'll have to see what actually comes out and how much it costs.
  • bji - Tuesday, May 15, 2012 - link

    Very good points. Then we have to throw in the question of how much the extra performance is worth to the user. We'd all take extra performance for free (assuming it didn't come at a cost in heat or battery life or other features), but would you pay 10% more for performance you knew you didn't need? I don't think most consumers really think in these terms, of course; marketing will sell these parts, not logic. But if we're trying to make price and value comparisons, we need to be aware that the goal is to get what you need for the least money, not more than you need for slightly more money.
  • JarredWalton - Tuesday, May 15, 2012 - link

    I'd take Trinity with an SSD over Sandy Bridge with an HDD, provided I could get a good LCD and build quality thrown into the mix. Maybe HP will deliver with the upcoming Envy Sleekbooks?
  • mrdude - Tuesday, May 15, 2012 - link

    HP offered this with Llano, granted they charge $150 for a 1080p screen... You can also opt to buy an aftermarket 1080p screen and DIY. The ASUS Llano line was extremely popular because you could buy a $70 1080p matte-finish screen and upgrade a CrossFired Llano. For ~$600 you got great gaming performance and a 1080p screen. Those things sold like hotcakes, too.

    Jarred, I think you neglected quite a bit in this review. The improvements we've seen going from Llano to Trinity actually outweigh the improvements we've seen going from SB to Ivy, yet the latter also had the advantage of a die shrink. The perf-per-watt improvements are by far the biggest shocker here and are nothing short of unbelievable if you consider Bulldozer's power consumption.

    While I understand using the 3720QM for the HD 4000 benchmarks, why not delve into examining the Piledriver cores? There's very little info there with respect to what changed and what got better. What we got instead were synthetic benchmarks and a recap of the scores rather than actual analysis. Hell, a monkey can run a benchmark, but can that monkey run some meaningful benchmarks that test cache latency? AVX performance? Stress the IMC? Instead you're stating something that should be obvious (the weird multi-threaded Cinebench score actually makes sense when you consider that Trinity is a CMT design and therefore has two fewer FPUs than Llano) and that's supposed to be surprising?

    I can understand wanting to get a review out in time and giving us a rough idea of performance, but this is AnandTech. We expect a bit more than "these are the scores and these are the numbers. On to the next benchmark."
  • Spunjji - Wednesday, May 16, 2012 - link

    I hear this.
  • mikato - Wednesday, May 16, 2012 - link

    I agree too (though the monkey part was a bit much). Maybe we can see a more in depth analysis of results, similar to Anandtech's treatment of AMD's new architecture but with hard results leading the analysis.
  • medi01 - Thursday, May 17, 2012 - link

    Half of CPU performance, but better GPU performance and power consumption.

    And I doubt that for an average consumer the former matters more than the latter.
  • wsaenotsock - Tuesday, May 15, 2012 - link

    I mean to summarize all of this, it's a low performance processor with average graphics performance (comparing to an i7 and a discrete GPU), with above average battery life. I think with the right pricing AMD can put out a great rounded mobile product. The best thing it has going for it needs to be the price since it doesn't really lead the pack in any other area. I bought a Zacate netbook and I found it very useful for the price.

    I wonder if, a few generations from now, AMD can begin to outperform CPU + discrete GPU setups with its combined APU chips, or at least deliver unmatched performance on a per-watt basis. That would really make the AMD Fusion platform a new beast, but it doesn't seem ready for that yet. Still, it must be a goal for AMD, and one that will only become more important as computing solutions miniaturize every year.
  • Denithor - Tuesday, May 15, 2012 - link

    They won't ever surpass CPU + discrete GPU for one simple reason: TDP.

    You simply cannot match the performance of a 75-100W CPU + 100-200W GPU with a 75-100W APU.
  • Kaggy - Tuesday, May 15, 2012 - link

    Any chance of Windows 8 comparison someday, just curious.
  • Kaggy - Tuesday, May 15, 2012 - link

    I meant a processor comparison on Windows 8
  • EyelessBlond - Tuesday, May 15, 2012 - link

    I'm wondering how the memory clocks are affecting Trinity performance. We saw with Llano that the processor was memory constrained, such that you saw a significant improvement with DDR3 1866 as opposed to DDR3 1600; now that the chip is shuffling bits even faster I'm curious to see if more memory throughput (triple channel DDR3, the new DDR4, or even a relatively simple memory overclock) can boost APU performance even more with these new chips.
  • JarredWalton - Tuesday, May 15, 2012 - link

    Officially, mobile Trinity only supports up to DDR3-1600, which is what we have for testing. The clocks on the desktop version will likely be higher, making more bandwidth potentially useful, but unfortunately we can't test anything higher than DDR3-1600 on Trinity. And as an aside, I tried using DDR3-1600 with the Llano laptop, only to find that the BIOS won't allow anything higher than DDR3-1333 speeds. I wouldn't be surprised if this prototype Trinity laptop is the same way with regards to RAM speeds -- the BIOS on this sort of laptop is always pretty bare.
  • Khato - Tuesday, May 15, 2012 - link

    Does the prototype Trinity laptop BIOS support running the memory at DDR3-1333? That would at least offer an indication of whether certain benchmarks are memory bandwidth limited or not which would then imply whether or not higher memory frequency actually would help. Would also be interesting to see performance with just single channel memory seeing as how all too many manufacturers are still shipping laptops with that configuration.
  • JarredWalton - Tuesday, May 15, 2012 - link

    I don't think so, but I can always stick in DDR3-1333 RAM if needed. Might be interesting, but you'll have to wait a day or two for me to get something like that tested.
  • Khato - Tuesday, May 15, 2012 - link

    Good idea, so long as it doesn't either overclock the memory to 1600 anyway or simply refuse to boot, haha. Hopefully it'll work though since I'm quite curious to see the results! It'll not only give us an idea of whether higher speed memory will help, but also how much of a hit the lesser versions of the iGPU on the other SKUs will take.
  • KompuKare - Tuesday, May 15, 2012 - link

    What AMD really needs is a better IMC. Look at this recent computerbase article on IB and RAM:
    http://www.computerbase.de/artikel/arbeitsspeicher...
    http://translate.google.co.uk/translate?hl=en&...

    Unlike Llano, IB doesn't perform much differently as long as it's running at DDR3-1600. Which to me says that Intel's IMC is a lot more efficient than AMD's. Ironic, I know, since AMD brought the IMC to the x86 space, but the problem is that Intel has such a huge R&D budget compared to AMD.

    I'll be interested in seeing Jarred try that review sample with DDR3-1333 too.
  • Assimilator87 - Tuesday, May 15, 2012 - link

    A piece of my soul was crushed when I saw the HD 4000 ahead of Trinity. I don't care what the situation is, that should NEVER happen. 22nm and GCN can't come soon enough.
  • ET - Tuesday, May 15, 2012 - link

    While the HD 4000 still gets too low performance on occasion, I was generally impressed with it in this comparison. Intel GPUs have come a long way.

    As for Trinity, it looks like a well rounded solution. I'm more interested in the 17W version, as an alternative to Brazos. I like my Thinkpad x120e, and even run games on it occasionally, but it's still pretty weak.
  • ET - Tuesday, May 15, 2012 - link

    "PCI Express support in Trinity remains at PCEe 2.0"
  • althaz - Tuesday, May 15, 2012 - link

    I'd pay $700 for an ultrabook-type laptop with this in it - provided it had a 128GB Samsung SSD, a decent keyboard, and an above-average screen.

    Ideally I'd like to get a convertible tablet with Windows 8 on it powered by this. In laptop mode it's a portable gaming machine, in tablet mode a pretty nice tablet. Something 11.6"-ish akin to the Asus Transformer Prime in form factor.
  • Riek - Tuesday, May 15, 2012 - link

    Would it be possible to include power consumption (or battery life) while gaming?

    Would it also be possible to add a CPU frequency / GPU frequency graph during some workloads? To see how Turbo Core 3.0 shapes up and how well the GPU turbo is implemented and used?

    From my understanding the GPU gets priority in turbo, but a Trinity at 2.3GHz will just limit performance in gaming. I wonder if the turbo is shaped like that.
  • JarredWalton - Tuesday, May 15, 2012 - link

    I noted battery life while looping 3DMark06 was 77 minutes. If we calculate power draw, that works out to around 43W. I also just checked AC power draw at the outlet during the same test, and the average power use was around 50W (45-55W range, but mostly right around 50). Accounting for AC adapter inefficiencies, if it's 80-85% efficient the laptop is using around 40-43W.
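    For anyone who wants to sanity-check that arithmetic, here is a minimal sketch in Python. The battery capacity is an assumption (a roughly 55 Wh pack is what makes the ~43 W figure work out); only the 77-minute runtime and the ~50 W wall draw come from the measurements above.

    ```python
    # Sketch: reproduce the power-draw estimate from the battery runtime and the
    # AC draw measured at the outlet. battery_wh is an ASSUMPTION, not a figure
    # from the review; 77 minutes and ~50 W are the numbers quoted above.
    battery_wh = 55.0            # assumed pack capacity (Wh)
    runtime_h = 77 / 60          # 3DMark06 loop runtime (hours)
    ac_draw_w = 50.0             # average draw at the wall (W)

    print(f"drain from battery: ~{battery_wh / runtime_h:.0f} W")
    for eff in (0.80, 0.85):
        print(f"adapter {eff:.0%} efficient -> ~{ac_draw_w * eff:.0f} W at the laptop")
    ```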
  • Riek - Tuesday, May 15, 2012 - link

    Yeah, I looked over that. So gaming-wise Llano is still a better option in regards to battery life (a lot better).

    Sandy Bridge would be the worst choice (as it gives the worst performance).

    I'm surprised Trinity and IVB are so close in terms of 3DMark06 performance and battery life.
  • deathpony - Tuesday, May 15, 2012 - link

    Hi, I have an AMD A6-3500
    http://valid.canardpc.com/show_oc.php?id=2369937

    Gigabyte GA-A75M-UD2H

    AMD Radeon HD 6530D
    http://www.techpowerup.com/gpuz/63we6/

    For some reason I was able to undervolt the APU and push the clock to 2GHz while retaining stability. Also, by raising the clock this much I got a performance increase in the double digits. Can you test this out with Trinity? Thanks!
  • JarredWalton - Tuesday, May 15, 2012 - link

    Unfortunately, the Trinity BIOS is pretty much useless as far as undervolting/overclocking/etc. are concerned. I'd be interested in testing such items, but we'd need a far more capable BIOS/motherboard than this laptop has.
  • deathpony - Tuesday, May 15, 2012 - link

    Also, the GPU overclock wasn't limited by the chip itself but by a BIOS limitation; staying at stock voltage I could realistically have reached 2.5GHz. But with such high clocks I think the GPU was starved by main memory bandwidth.
  • Rookierookie - Tuesday, May 15, 2012 - link

    Let's not forget though, A New Hope came BEFORE The Empire Strikes Back...
  • JarredWalton - Tuesday, May 15, 2012 - link

    But it also came after the Phantom Menace, Attack of the Clones, and Revenge of the Sith. *Shudder* Thanks for making me remember baby Anakin and Jar Jar, you cruel, cruel person.... ;-)
  • wintermute000 - Tuesday, May 15, 2012 - link

    Looks like a nice, well-rounded product for the budget to mid range.
    The CPU is fine for 'normal' workloads, and the IGP can handle AAA titles @ medium detail 720p.
    Agreed the devil is in the pricing; it needs to be substantially cheaper than i5 + Optimus, otherwise there's no point.
  • phatboye - Tuesday, May 15, 2012 - link

    When referring to AMD Bulldozer based architectures please use the term "module" instead of "core" as it gets confusing as to which one you are talking about.
  • JarredWalton - Tuesday, May 15, 2012 - link

    If you know enough to post this comment, you should also know enough to understand from context whether we're talking about a Piledriver core (e.g. the INT core) or a Piledriver module (e.g. two INT cores and the shared FP). And trust me, it's just as confusing trying to write about Bulldozer architectures as it is to read about them. But if there's a specific place in the article where we say "core" when we should say "module", let me know where and I'll be happy to correct it.
  • phatboye - Tuesday, May 15, 2012 - link

    Actually I was confused at first. On the first page you stated that the A10 chip was a 4-core chip. Before reading this article I was under the impression that the first Trinity chips would have 8 cores (4 modules). It took a bit of reading before I realized that what you said was actually correct, in that it is a 4-core (2-module) CPU.

    Yes, the misunderstanding was totally my fault, but it would make things a lot easier if you adopted a convention of referring to AMD Bulldozer "cores" in terms of modules instead.
  • Cow86 - Tuesday, May 15, 2012 - link

    You seem to have forgotten to mention the 'resonant clock mesh' technology that AMD recently acquired and uses in Piledriver (and therefore Trinity)... said to account for roughly a 10% saving in power consumption at the same clock speed on its own.

    Anyway, I'm rather pleased to see this part arrive :) It seems to put AMD in a much better position in the mobile space again, and I'd be very interested to see how they can compete with ultrabooks this time around with their 17W chips. Now to wait and see how they measure up on the desktop as well... And of course the new FX CPUs later this year... the reduced power consumption bodes well for those too, as their high power draw is the biggest problem right now.
  • mikato - Wednesday, May 16, 2012 - link

    Yeah nice catch. I forgot that was going in Trinity.
  • ltcommanderdata - Tuesday, May 15, 2012 - link

    In one of the AMD slides on the first page, it mentions Trinity supports AVX1.1. I haven't heard of that standard before, only AVX and AVX2. What does AVX1.1 add over AVX and is AMD the only one to implement it right now?

    I also saw you tried WinZip 16.5 OpenCL and Ivy Bridge OpenCL which I commented on before. Thanks for that. Too bad the results didn't turn out that exciting.
  • meloz - Tuesday, May 15, 2012 - link

    Quite underwhelming. :\

    On the CPU side, IVB absolutely clobbers Trinity. People keep saying "CPU don't matter", but if you have an SSD the 'bottleneck' immediately shifts back to the CPU for certain tasks. It is always nice to have good single-thread performance, and Intel enjoys a huge advantage in this area.

    On the iGPU side, the much derided and seemingly impotent HD 4000 manages to actually narrow the lead AMD enjoyed on the graphics side. If only Intel drivers were written by people who had any clue, they would get even better performance out of their silicon.

    So it will all come down to pricing. This is as much in Intel's hand as AMD's. AMD can potentially sell a lot of these things, but at a price where they won't be making any profits.

    Congratulations to JW on another fine review, but a very underwhelming new product from AMD, as I said earlier.
  • jabber - Tuesday, May 15, 2012 - link

    Trust me, no one applauds if I hold up my laptop and shout "BEHOLD THE WONDROUS SINGLE-THREADED PERFORMANCE! BASK IN ITS POWER!!!"

    You are making too much of too little really.
  • meloz - Tuesday, May 15, 2012 - link

    "You are making too much of too little really. "

    The market share and sales performance of the two rival products immediately prove you wrong. But keep digging the hole. Need a Bulldozer (or is it time for an Excavator) to speed up the process?
  • jabber - Tuesday, May 15, 2012 - link

    You're right, I'm obviously just not taking the issue as seriously as you are.

    Enjoy the endless benchmarking!

    Ahem....
  • Taft12 - Tuesday, May 15, 2012 - link

    AMD can't produce enough chips to meet demand and is improving marketshare every quarter. Their <1% quarterly increases are more like digging with a spoon, but it's digging in the right direction.
  • medi01 - Thursday, May 17, 2012 - link

    Market share means nothing, silly. Prescott had much bigger share than Athlon 64, while being inferior on all fronts.
  • Spunjji - Wednesday, May 16, 2012 - link

    Who's deriding the HD 4000? :/
  • medi01 - Thursday, May 17, 2012 - link

    If you have an SSD, the bottleneck is your brain, which is too slow to distinguish between 22ms and 24ms.

    There is no way a 20% faster version of an ALREADY SUCCESSFUL product is "underwhelming".
  • cjs150 - Tuesday, May 15, 2012 - link

    I'm aiming to start an HTPC build in the next couple of months. Mostly for movies, but a little light gaming from time to time (Total War: Shogun on a 47" HD screen!)

    Low power of Trinity looked very promising...then I read Jarred's review.

    Back to the i7-3770T even though the CPU+motherboard will be at least twice the price of the AMD versions.
  • Assimilator87 - Tuesday, May 15, 2012 - link

    The desktop version of Trinity should fare better because of the TDP headroom, although it's extremely disappointing that it won't be released 'till Q3. I guess my mini-ITX case will have to sit in its box for another few months.
  • bji - Tuesday, May 15, 2012 - link

    What exactly about the review turned you off from Trinity for HTPC purposes?

    Was it the good-enough-for-HTPC CPU performance?

    The superior GPU performance?

    The better power efficiency?

    The lower power use?

    The cooler GPU?

    The lower price (than the i7 you mentioned)?

    Honestly just curious about why you so summarily concluded that Trinity wouldn't be a good choice for your HTPC when I can't see anything in the article that would allow you to draw that conclusion.

    Unless it was the CPU chart comparison against a CPU three times as expensive that you would never use in an HTPC anyway?
  • cjs150 - Wednesday, May 16, 2012 - link

    The main purpose will be as an HTPC, and video transcoding is very important; look back at Jarred's review, Intel is winning by a large margin.

    For the case I intend to use as long as TDP is under 50W there are no concerns.

    For me the i7-3770T will be the way to go, but you may want to try AMD. If I had built the HTPC last year there would have been no question that AMD would have been the right choice - Atom is/was useless. Maybe next year the tables will turn again.
  • Spunjji - Wednesday, May 16, 2012 - link

    Soooo... AMD are good against Atom, so you would have bought them then... but now you're buying an i7 with which they (obviously) cannot compete? That particular argument is a little difficult to follow, unless your requirements have drastically altered. :)

    To be fair to AMD, most people I know don't do transcodes while at the machine so speed isn't terribly relevant past a certain point, but I understand this may be different for you.
  • DellaMirandola13 - Tuesday, May 15, 2012 - link

    I have really been looking forward to this review, nice to see a reasonably competitive CPU-market. Thank you.

    I am not a (serious) enthusiast - my best suit is lurking - but I was very excited to learn about the heterogeneous aspect. Is it possible you could elaborate, or perhaps link to a comprehensive write-up on the prospects of heterogeneous computing?

    As far as I could tell, it seemed very useful in navigating within GIS-applications (particularly when you have to load roads or other kinds of grids on a map), but that's pure speculation.
  • Jedibeeftrix - Tuesday, May 15, 2012 - link

    I would be interested in a 14" HP Sleekbook if it comes with the 25W 4655M APU, as the full 384 shaders would be nice; otherwise I will wait till the 28nm APUs sporting GCN arrive in 2013.

    PS: test Trinity with Blender Cycles.
  • Veroxious - Tuesday, May 15, 2012 - link

    I am rather baffled by the fact that Ivy Bridge did not bring substantial battery life improvements with its 22nm process. Is it a result of the tri-gate transistor tech? By extension, when AMD moves to 22nm will it also be tri-gate, seeing that they no longer have their own foundry? Has tri-gate delivered the benefits that were touted when it was announced, or is it simply a case of the 22nm node not being mature enough?
  • JKnows - Tuesday, May 15, 2012 - link

    It would be nice to see those game tests at High settings! My personal tests show that at high detail settings Llano is faster than the HD 4000; Trinity should be much faster.
  • tipoo - Tuesday, May 15, 2012 - link

    This is true. The HD4000 falls further and further behind as you increase the detail settings; that's where AMDs implementation shines.
  • JarredWalton - Tuesday, May 15, 2012 - link

    Go look at Mobile Bench: the results for our "High 900p" testing are in there.
    http://www.anandtech.com/bench/Product/600?vs=580

    But just to summarize, at 900p "High" settings, Trinity's lead over HD 4000 grows to 26% in our seven "2012 Suite" games. Also worth noting is that Trinity is >30 FPS in three titles (DiRT 3, Portal 2, and Total War: Shogun 2) while Ivy Bridge is >30 FPS only in DiRT 3 (but comes very close in Skyrim).
  • Alexko - Tuesday, May 15, 2012 - link

    "I’d say AMD GPUs as well, but I’m still waiting for a better switchable graphics solution."

    You mean like Enduro?
  • Taft12 - Tuesday, May 15, 2012 - link

    He said "better".

    http://ir.amd.com/phoenix.zhtml?c=74093&p=irol...

    "Linux OS supports manual switching which requires restart of X-Server to switch between graphics solutions."

    They ain't there yet!
  • JarredWalton - Tuesday, May 15, 2012 - link

    Enduro sounds like it's just a renamed "AMD Dynamic Switchable Graphics" solution. I haven't had a chance to test it yet, unfortunately, but I can say that the previous solution is still very weak. And you still don't get separate driver updates from AMD and Intel.
  • Spunjji - Wednesday, May 16, 2012 - link

    Drivers are the big deal here. I like that I get standard drivers using my Optimus laptop.

    What I don't like is that it f#@!s up Aero constantly and occasionally performs other bizarre, unpredictable manoeuvres.
  • ToTTenTranz - Tuesday, May 15, 2012 - link

    Greetings,

    Is it possible to provide some battery life results with gaming?

    It's true that an Intel+nVidia Optimus solution should be better for both plugged-in gaming and wireless productivity (more expensive too, but that's been covered in the review).
    However, a 35W Trinity should consume quite a bit less power than a 35W Intel CPU + 35W nVidia GPU, so it might be a worthy tradeoff for some.

    Furthermore, when are we to expect Hybrid Crossfire results with Trinity+Turks? Is there any laptop OEM with that on the roadmap?
    That should give us a better comparison to Ivy Bridge + GK107 solutions, as it would provide better gaming performance at a rather small price premium ($50 at most?).
  • x264fan - Tuesday, May 15, 2012 - link

    Thanks for the nice review, but let me give you some very important information regarding your test.

    1. The x264 HD Benchmark Ver. 4.0 you used relies on quite an old x264.exe for encoding. For Bulldozer/Piledriver it is important to replace it with a newer one, which contains specific assembler optimizations that give a nice performance boost on AMD processors by using the new instructions introduced in those CPUs. You can see how many there are here:
    http://git.videolan.org/gitweb.cgi?p=x264.git;a=sh...

    I would suggest downloading a new x264 build from x264.nl and replacing it, then running the benchmark again. It would also show you how beneficial the new instructions are.

    Another suggestion would be to run this benchmark using the x64 build of x264 through the x86 AviSynth wrapper avs4x264mod.exe. That way you can see how much difference the x64 build makes (see the timing sketch after this comment).

    In fact, x264 is so nicely optimized it can be used for CPU testing.

    2. You have used Media Player Classic Home Cinema for measuring playback of H.264 streams and battery life. So do I; unfortunately, every time I want to use it with DXVA acceleration on my i7-2630 laptop I end up with terrible artifacts on lower bitrate content. Blocks float around and destroy picture quality. It is not as visible on Blu-ray content, where the picture is more recompressed than recreated using x264 transformations, but it is still there. My point is that if the Intel decoding/drivers are buggy enough to make this DXVA mode unusable, how can anyone measure battery life with this mode?
    Without DXVA, Intel's numbers would not be so good, but so far that is the only mode being used.

    3. I must say I am amazed how good the HD 4000 is, but what about picture quality? From time to time we see reports that NVIDIA or AMD has cheated in drivers, sacrificing picture quality, so how about Intel...

    I hope you read my comment and update your test.
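    To make the comparison in point 1 concrete, here is a minimal, hypothetical timing harness: time the same clip with whichever x264 build you drop in and compute fps from a known frame count. The binary path, clip, and frame count below are placeholders, not values from the benchmark; only standard x264 options are used.

    ```python
    # Hypothetical sketch: time an arbitrary x264 build on a fixed clip so that a
    # newer x264_64.exe can be compared against the x264.exe bundled with the HD
    # Benchmark. Paths, preset, and frame count are placeholders; adjust to taste.
    import subprocess
    import time

    X264 = r"C:\encoders\x264_64.exe"    # placeholder: the build under test
    CLIP = r"C:\clips\test_1080p.y4m"    # placeholder: clip with a known frame count
    FRAMES = 1442                        # placeholder: frames in that clip

    cmd = [X264, "--preset", "slow", "--crf", "22", "-o", "out.264", CLIP]
    start = time.perf_counter()
    subprocess.run(cmd, check=True)
    elapsed = time.perf_counter() - start
    print(f"{FRAMES / elapsed:.2f} fps ({elapsed:.1f} s)")
    ```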
  • JarredWalton - Tuesday, May 15, 2012 - link

    So, help me out here: where do I get the actual x264 executables if I want to run an updated version of the x264 HD test? We've tried to avoid updating to newer releases just so that we could compare results with previously tested CPUs, but perhaps it's time to cut the strings. What I'd like is a single EXE that works optimally for Sandy Bridge, Ivy Bridge, Llano, and Trinity architectures. And I'm not interested in downloading source code, trying to get a compiled version to work, etc. -- I gave up being a software developer over a decade ago and haven't looked back. :-)
  • x264fan - Wednesday, May 16, 2012 - link

    http://x264.nl is the newest semi-official build. It contains all current optimizations for every CPU, but since it's command line you can turn them on and off. I also heard that this week there will be a new HD Benchmark 5.0, which should have the newest build in it.
  • plonk420 - Monday, July 9, 2012 - link

    the problem with this is that then the test isn't strictly "x264 hd benchmark version x.00" ... and would be harder to compare to other runs of the same test.

    if they did this in ADDITION to v4.00 or whatever (and VERY clearly noted the changes), that might be some useful data.
  • jabber - Tuesday, May 15, 2012 - link

    ....how about adding a line/area to the benchmark graphs that stands for "Beyond this point performance is pointless/unnoticeable to the user".

    That way we can truly tell if we can save ourselves a boatload of cash. All-out performance is great and all, but I don't run benchmarks all day like some here, so it's not so important. I just need to know whether it will do the job.

    Or would that be bad for the sponsors?
  • bji - Tuesday, May 15, 2012 - link

    It is an interesting idea, but it would be such incredible fodder for fanboys to flame about, and even reasonable people would have a hard time deciding where that line should be drawn.

    I think the answer to your basic question is that, any mobile CPU in the Llano/Trinity/Sandy Bridge/Ivy Bridge lines will be more than sufficient for you or any other user *unless* you have a specific task that you know is highly CPU intensive and requires all of the CPU you can get.
  • mdeo - Tuesday, May 15, 2012 - link

    " For some of these applications, we don’t have any good way of measuring performance across a wide selection of hardware, and for some of those where benchmarks are possible I’ve run out of time to try to put anything concrete together"

    Please wait and spend the required time before you post results.
    Also, where are the graphs for WinZip and the GIMP filters (19 of them... you deemed 5 that you would use)? Graphs make it easy to see that Trinity beats Intel chips in GIMP and matches them in WinZip.

    This makes me wonder why I should trust Anandtech more than Tom's Hardware reviews...
  • JarredWalton - Tuesday, May 15, 2012 - link

    You apparently have no idea how much time goes into putting together a review and running all the benchmarks. Let's just say that after running (and rerunning) benchmarks for much of the last month on a variety of laptops, I finished a couple of graphs right at the 12:01 AM NDA time. That was after getting about ten hours of sleep total over the weekend, and never mind the fact that I've had a horrible cold the past week.

    Every new benchmark needs to be created and evaluated to see if it's useful. GIMP's new "Noise Reduction" and "Blur" functions can use OpenCL, but so can "Checkerboard". Um, really? We need OpenCL to fill an image with a checkerboard?

    Here are a few GIMP numbers (from Noise Reduction):
    A10 CPU: .396 MP/s
    A10 OCL: 4.10 MP/s
    IVB CPU: 1.49 MP/s
    IVB OCL: 4.04 MP/s
    SNB CPU: .689 MP/s
    SNB OCL: 3.56 MP/s
    DC SNB CPU: .586 MP/s
    DC SNB OCL: 2.01 MP/s
    Llano CPU: .321 MP/s
    Llano OCL: 2.39 MP/s

    I had a graph for WinZip, but then we pulled it because apparently WinZip's OpenCL performance is best using the legacy compression. I used their newer Zipx compression, which results in a smaller file but isn't as optimized (yet). So now I need to spend about two hours retesting WinZip and 7-zip. Thanks for understanding.
  • Beenthere - Tuesday, May 15, 2012 - link

    As expected Trinity delivers in all areas and should meet most people's needs quite well. Good job AMD. You get my money!
  • tipoo - Tuesday, May 15, 2012 - link

    I wonder what causes these odd results? The 7660G winning by a wide margin in most games, but losing by a small margin in some? Is it whether the games are pixel fill vs. pixel shader bound (the HD 4000 is good at the former, bad at the latter), or is there a driver issue with the 4000, or what?
  • Wolfpup - Tuesday, May 15, 2012 - link

    That has to be the most surprising thing in the review to me. While I know technically today's GPUs are really CPUs geared towards less branchy, more parallel code, it still caught me off guard that someone had thought to run a file compression utility on it!

    Also surprised Intel has OpenCL drivers at all...not surprised they're bad though. I wonder what they do? Like is their "GPU" portion of Sandy/Ivy bridge actually capable of doing that type of work, or are they mostly just using the CPU?

    Still hate "quicksync" and the graphics portion of those CPUs as it's wasting at least enough transistors for a 5th core.
  • jwcalla - Tuesday, May 15, 2012 - link

    File compression can be parallelized, but there are some unfortunate limitations... the final compressed size is generally less optimal with parallel compression, and compressing large volumes of data becomes memory-bound fairly quickly. But for "ordinary" compression tasks it's quite effective.

    The article didn't indicate if the CPU compression tests were single-threaded or multi-threaded.
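    To illustrate the size trade-off jwcalla describes, here is a minimal sketch (not the tool used in the review) that compresses the same buffer once as a single zlib stream and once as independent chunks in a process pool; the chunked output is usually a bit larger because matches can't span chunk boundaries.

    ```python
    # Sketch: single-stream vs. naive chunk-parallel compression of the same data.
    # Each parallel chunk starts with an empty dictionary plus its own header, so
    # the combined output is usually somewhat larger than the single stream.
    import os
    import zlib
    from concurrent.futures import ProcessPoolExecutor

    CHUNK = 1 << 20  # 1 MiB chunks, an arbitrary choice for this sketch

    def compress_chunk(chunk):
        return zlib.compress(chunk, 6)

    def main():
        data = os.urandom(512) * 65536   # ~32 MiB of repetitive test data
        single = zlib.compress(data, 6)
        chunks = [data[i:i + CHUNK] for i in range(0, len(data), CHUNK)]
        with ProcessPoolExecutor() as pool:
            parallel = b"".join(pool.map(compress_chunk, chunks))
        print(f"single-stream:  {len(single)} bytes")
        print(f"chunk-parallel: {len(parallel)} bytes")

    if __name__ == "__main__":
        main()
    ```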
  • Brazos - Tuesday, May 15, 2012 - link

    Can I assume the improvements seen here will be implemented in the next version of Bulldozer (Vishera)?
  • mikato - Wednesday, May 16, 2012 - link

    Yes, it's supposed to use Piledriver modules, not Bulldozer.
  • silverblue - Thursday, May 17, 2012 - link

    Plus, the Piledriver implementation in Trinity should, clock for clock, be faster than Bulldozer even without L3 cache. This isn't a repeat of Llano vs. Phenom II, where Husky cores were technically faster than Stars but lacked the L3 cache, which brought performance back down again.

    Some information about the caches plus their latency would be really appreciated; if it bodes well here, Vishera might be a very decent chip.
  • jwcalla - Tuesday, May 15, 2012 - link

    Is there really a large market for gaming on a lower-end laptop? I can see us techies being interested in that sort of thing, but what percentage of PC buyers is actually concerned about gaming performance, let alone on a laptop? In real world terms, I'm not seeing AMD's strategy giving it much of an advantage.

    I'm willing to entertain the reality that Intel has been "overselling performance" to casual users for some time now, and so maybe low-end is more than good enough... but, if true, AMD seems to be focusing on a segment that is going to have an enormous amount of competition in the next 2-3 years.
  • Beenthere - Tuesday, May 15, 2012 - link

    Many students game on laptops and that's a large market segment with many people desiring portability by necessity these days.
  • aliasfox - Tuesday, May 15, 2012 - link

    Your average kid going into undergrad who doesn't care enough to spend more than $1k (or more than $700) for a computer will be pleasantly surprised when he/she fires up some random game - and releasing these machines right now is perfect timing to get product on the shelves for college back-to-school season.

    It's really all about 'good enough' on SC2 or Portal or whatever else people will pick up for a few hours a week. They don't care about 100 fps at insane external monitor resolutions with megapixels worth of textures, but if they can get >20-30 fps at 768p or 900p on whatever they might throw at it, they're happy enough.

    I used to be one of them.
  • mikato - Wednesday, May 16, 2012 - link

    Me too. I played Quake 3 on an 8MB video card for quite a while on my desktop and everything worked well enough for me to kick butt with the rail gun :) It was great to play for a few minutes or an hour to unwind a bit.
  • Caltek9 - Tuesday, May 15, 2012 - link

    jwcalla,

    I'm actually totally interested in gaming on a lower-end laptop right now, and am trying to decide whether to wait for Trinity or not. I'm going to grad school (undergrad and building gaming towers was many years and 2 children ago), and need a light laptop with good battery that I can play recent games on (Diablo III, Kingdoms of Amalur, Saints Row the Third, Borderlands). The reason the Trinity setup is intriguing to me is because if the GPU works out, I can get (supposedly/hopefully) a very slim laptop that can do this, instead of a heavier one. I'll be going to grad school in Europe, and every pound I can shave off before I travel is a good thing! I've been a console gamer for far too long (since abandoning the PC after undergrad), so I'm used to not having the best looking graphics. As long as it can play a game smoothly, and at decent graphical settings, I'm fine with that.

    I get a bit sad when seeing the Trinity CPU numbers, but I keep trying to convince myself that it won't matter to me, since I'll mainly be typing papers and surfing the Internet, not transcoding anything, with gaming on the side. I'm writing this on an Intel-based Mac, so I'm not a fanboy of either company.

    Bottom Line: I want cheap, light, good battery, and the ability to play recent games at medium settings. Trinity seems to be able to do this better than Ivy Bridge, at least in these early reviews.

    SIDE NOTE: Until I see an actual laptop with Trinity in the wild, this is my current choice for a replacement laptop (Sager NP6110/Clevo W110ER): http://www.xoticpc.com/sager-np6110-clevo-w110er-p...

    I'm a bit worried about the screen and keyboard sizes for papers, but suppose I could hook them up to externals.
  • Gigantopithecus - Tuesday, May 15, 2012 - link

    PCPer shows the A10-4600M absolutely trouncing the i7-3720QM (http://www.pcper.com/files/review/2012-05-13/gamet...

    What explains the dramatic discrepancy between their results and Anandtech's?
  • Gigantopithecus - Tuesday, May 15, 2012 - link

    ...in Skyrim, and here's a working link: http://www.pcper.com/files/review/2012-05-13/gamet...
  • tipoo - Tuesday, May 15, 2012 - link

    Anandtech tested at low details; in your link it's medium. The HD 4000 has decent pixel fill rate but pretty bad pixel shader performance, so more detail = it falls further behind.
  • JarredWalton - Tuesday, May 15, 2012 - link

    Good try, but you're wrong. We test at Medium details, with FXAA disabled and anisotropic filtering set at 4x. We also have the high resolution texture pack installed, though I'm not sure if it's always active at low details. Finally, Skyrim is a massive game. I specifically ran around looking for areas (on my desktop system) where performance was lower so that we could give more of a "this is as bad as it gets" score. That ended up being near Whiterun. Go into dungeons and such and the game runs two or three times as fast as our benchmark section.
  • tipoo - Tuesday, May 15, 2012 - link

    My mistake, I read "value" and assumed low settings. So if they were both at medium, why are the winner and loser completely flipped? Your explanation would explain the lower framerate, but not a complete flipping of winner and loser.
  • JarredWalton - Tuesday, May 15, 2012 - link

    It could be that Whiterun has lower framerates because of CPU bottlenecks as opposed to GPU bottlenecks. I honestly don't know, and I don't know what areas others are using for testing. I suppose I could always try benchmarking a different section of the game to see what happens.

    It's also possible that FXAA and anti-aliasing in general is the cause of the discrepancy. I never turn on AA personally until I'm at the point where I've maxed out other settings and I still have room to spare. Jaggies just don't bother me all that much, particularly at native resolution on LCDs, and FXAA is basically a blur filter for the whole screen -- you lose jaggies as well as details.
  • Burticus - Tuesday, May 15, 2012 - link

    I wonder if they will release standalone mobile chips and if they are the same socket as the current Llano? Currently my laptop has an A8-3500 and I wouldn't mind upping to an A10.

    They did this in the past with the S1 socket, I wonder if it will be an option nowadays...

    For the most part I've been pretty impressed with the A8 for a $500 laptop (especially with some overclocking). Games are playable at moderate settings. Civ 5 still kicks it in the teeth though, and I see that the A10 got a 10fps jump which would be nice.
  • JarredWalton - Tuesday, May 15, 2012 - link

    The sockets are different: FS1r2 this time. I don't know precisely what changed, but apparently it's enough that AMD isn't making them backwards compatible.
  • Fallen Kell - Tuesday, May 15, 2012 - link

    The biggest problem with the design is that the OS doesn't know how to work with the CPU. Take the case where you have two of these Piledriver modules, with one floating-point-intensive job and one non-floating-point-intensive job already running; the OS will place the first job on one module and the second on the other. Then a user starts a new floating-point-intensive job, and the OS simply puts it on the next free core, which happens to be on the module already running a floating-point-intensive application, and thus you just bottlenecked both of those processes. The OS doesn't know whether a process is floating-point heavy or not, and thus cannot properly schedule it onto a core whose floating point unit isn't already in heavy use. That is why Bulldozer failed. It is also why my work will never purchase it, as they run floating-point-intensive applications.
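    A minimal, hypothetical sketch of the manual workaround (Linux-only, via os.sched_setaffinity): spread the FP-heavy workers across modules yourself instead of trusting the scheduler. It assumes logical CPUs 0-1 sit on module 0 and 2-3 on module 1, which is how a two-module chip usually enumerates, but check your own topology first.

    ```python
    # Sketch: pin two FP-heavy worker processes to different Piledriver modules so
    # they don't contend for a single shared FPU. The CPU-to-module mapping below
    # is an assumption; verify it on your system before relying on it.
    import os
    from multiprocessing import Process

    def fp_heavy(cpus):
        os.sched_setaffinity(0, cpus)    # restrict this worker to one module's cores
        x = 0.0
        for i in range(10_000_000):      # stand-in for a real FP-bound workload
            x += (i * 1.0000001) ** 0.5
        print("done on CPUs", cpus, round(x, 3))

    if __name__ == "__main__":
        workers = [
            Process(target=fp_heavy, args=({0, 1},)),  # assumed module 0
            Process(target=fp_heavy, args=({2, 3},)),  # assumed module 1
        ]
        for p in workers:
            p.start()
        for p in workers:
            p.join()
    ```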
  • Beenthere - Tuesday, May 15, 2012 - link

    Most every reviewer has indicated that Trinity is a significant jump in performance in both CPU and GPU with extended battery performance yet some reviewers seem hard pressed to admit that for 90% of the laptop market Trinity is superior to Intel's best offerings.

    Some reviewers are trying to pretend that Intel's faster CPU performance is somehow of importance to the majority of the laptop market when in fact it is not, unless all you do is crunch numbers. I think Trinity sales, just like Llano and Brazos, will drive home the point of who is leading the laptop market segment with what consumers actually desire.
  • JarredWalton - Tuesday, May 15, 2012 - link

    Beenthere, you have to be the biggest AMD fanatic I've seen around here. EVERY article where AMD comes up, you're there making things up to justify your worldview. As I indicate in the article, Trinity is 10-20% faster than Llano on CPU and 20% faster on GPU, which is a decent improvement. Unfortunately, a lot of places are quoting AMD's "up to 29% faster CPU and 56% faster GPU" and calling it a day. Those are results that just didn't show up in any testing that I conducted.

    Oh, wait, I've got one: using OpenCL in GIMP, Trinity is 72% faster than Llano! There, we now have one statistic you can point to where Trinity is better. For the 0.1% of the population that uses GIMP, and not even them really -- it's the 0.1% of people that use GIMP and will some day benefit when the next major release comes out and incorporates OpenCL. If you can't see the problem with that statement, I can't help you.

    For 90% of the market, Trinity might be enough, but to say it's "better than Intel's best" is pure fanaticism and nothing more. You are more biased than AMD's own marketing department. To pretend that moderately faster graphics with substantially less CPU performance is somehow more important than any other metric is insane. Sandy Bridge with GT 540M can be had for $600 right now, and it will beat Trinity in pretty much every single metric. Lucky for AMD, a lot of people like you will blindly purchase anything with AMD on it without regard for reality.
  • bji - Tuesday, May 15, 2012 - link

    While I agree with your points overall, I think there is a fine detail you need to consider:

    Benchmarks are only an approximation of the performance results that would be achieved on a whole variety of processor tasks. You can rightly point out that only a small fraction of tested programs benefitted greatly from improved OpenCL performance, but you can't claim that this only benefits the 0.1% of people that use GIMP and care about OpenCL, because there may be other programs available now, or in the future, that would see similar performance increases. What your benchmarking shows is that *most* programs don't see a huge OpenCL performance benefit, but that *some* do. This is likely to lead to a more significant performance benefit than would be enjoyed by 0.1% of the users of a particular application.

    However, I think that CPU reviewers are kind of in a hard place these days, since we're arguing over how big of an overkill one given processor is than another when considered for a wide variety of tasks, which starts to make any benchmarking about trying to find benchmarks where the performance difference would really matter. And that invites all kinds of debate about which kinds of performance actually matter to the average user, which is not a very fun or interesting argument.

    CPU performance can still matter for targeted tasks, but that kind of analysis requires a very different approach and is very user-specific, when compared to standard benchmarking.
  • JarredWalton - Tuesday, May 15, 2012 - link

    You're correct, and the real difficulty is first in finding anything where OpenCL is clearly faster, and then seeing similar techniques used in other software. Office for example isn't going to really get any faster because of your GPU or OpenCL -- and it doesn't need to be. Office spends its time waiting for user input. So what we really need are technologies that make the slow parts of using a computer faster. SSDs are a perfect example, because they make the initial boot and application load times all faster. OpenCL isn't doing that for the vast majority of applications, and neither is Quick Sync or DirectX or whatever other GPU related task you want to throw out there. They make graphics faster, but in my experience that's mostly important to gamers, or for high-end workstation stuff where you want OpenGL support.

    For many people, Core 2 Duo is fast enough, and Llano is fast enough, and Trinity is fast enough, etc. So for those users, it's about delivering the lowest cost. Trinity is twice the size of quad-core Ivy Bridge, so Intel could easily start a price war if they wanted, but they'd rather keep higher margins. Sandy Bridge laptops at $600 are still faster for general use than Llano and Trinity, particularly if they have an Optimus GPU around. Unless something is significantly faster in some important metric -- and I really don't see any single area where that's the case for Trinity -- then you just get whichever is the best price.
  • Beenthere - Tuesday, May 15, 2012 - link

    Wow, Jarred is having an unhappy day! :(

    Obviously AMD's testing is different than yours, as is other websites'. My comments were NOT in regard to your article, which I thought was pretty balanced. The website I was referring to is listed below.

    Your knee-jerk reaction to my comment, however, shows you're losing it. If you really believe that Intel's platform provides as good a result for mainstream consumers, you'd be in error, especially when Trinity ultrathins will be hundreds cheaper.

    It's pretty obvious you can't deal with differing POVs and you get upset when your opinion is not shared by others. Losing your objectivity makes it difficult for anyone to take your articles seriously - even though this one was pretty balanced. You should consider a CHILL PILL before over-reacting.

    You really should THINK before you react. In this case my comment had NOTHING to do with your story. If your article has merit then you should not need to go POSTAL even if my comment was about your story. Being a reactionary and calling people names for having a different POV than you shows immaturity. The really funny part about your knee-jerk reaction was my comment was in regard to another story on Trinity on a different website. (see below).

    You must have a guilty conscience? Below is the story I commented on. Oops, I'm sure you are embarrassed now, but it's OK? I don't hold grudges. <LOL>

    Maybe the Intel fanbois are just beating you up too much because Trinity is a far better choice for laptops than anything Intel has at the moment? They'll get over it.

    http://www.pcper.com/reviews/Mobile/AMD-A10-4600M-...

    Cheer up Jarred. You can look forward to Piledriver/Vishera in a few months and more hate from the Intel fanbois.
  • bji - Tuesday, May 15, 2012 - link

    Sorry, but when you start a paragraph with "Some reviewers are trying to pretend" you are VERY CLEARLY implying that the reviewer is being dishonest, trying to mislead readers of the review with intentionally false commentary.

    If you start with that kind of premise, then you deserve a response that, in kind, accuses you of doing the same, which is exactly what you got.

    Trying to then pretend that you're innocent and didn't deserve that response is just more lameness.
  • JarredWalton - Tuesday, May 15, 2012 - link

    Beenthere is your typical passive aggressive anonymous Internet poster. I called him on his post, and now he backpedals. You know what's hilarious, Beenthere? That article you link. Let me give you a quote from the conclusion to show what I'm talking about:

    "I can’t find a way to look at Trinity that paints a favorable picture. Though certainly an improvement over Llano, it’s not enough. AMD is way behind Intel in processor performance, and the graphics performance does not offer redemption. The only way systems based off Trinity will be made competitive is by slashing and burning the prices."

    Okay, that's pretty much what I said as well. Perhaps they're even more negative than I am. And yet... that paragraph is followed by a Silver Award? WTF is up with that? They're awarding something that they can't find a way to describe in a positive fashion? And then you suggest that "Some reviewers are trying to pretend that Intel's faster CPU performance some how is of importance to the majority of the laptop market when in fact it is not unless all you do is crunch numbers." I'd say the opposite: some reviewers are trying to kiss up to AMD with an award or backhanded praise when everything else they say is negative at best.

    But hey, let's not forget how open and unbiased Beenthere is. Here's a quote from page three of the comments that shows his amazing analytical skills and not-at-all-anti-Intel mindset:

    Subject: Excellent by Beenthere on Tuesday, May 15, 2012

    As expected Trinity delivers in all areas and should meet most people's needs quite well. Good job AMD. You get my money!


    Wow. Yup, Trinity is a far better choice for laptop than anything Intel has at the moment. Because Acer's AS4830TG with GT 540M and i5-2410M at $600 offers better CPU performance and better GPU performance. Yup. Far better. I like to pay more for less!
  • Spunjji - Wednesday, May 16, 2012 - link

    Ha! :D
  • medi01 - Friday, May 18, 2012 - link

    "Moderately better graphics" - referring to a 2 times (3D Vantage) faster GPU sounds very biased.
    On top of it, Intel's "HD" has rendering quality problems:
    http://media.bestofmicro.com/Trinity-A10-4600M-Rev...
  • medi01 - Friday, May 18, 2012 - link

    PS
    Things that are important for most users, and I think many users in this thread agree:

    1) Screen quality
    2) Battery life

    On top of that, a user is much more likely to play games than to do the kind of video/audio encoding _where they would actually care about performance_.
  • medi01 - Friday, May 18, 2012 - link

    "Trinity is 10-20% faster than Llano on CPU"
    Uhm, I beg to differ (from tom's review).

    Handbrake: +23%
    iTunes: +37%
    ABBYY FineReader: +24%
    WinRAR: +38%

    http://www.tomshardware.com/reviews/a10-4600m-trin...
  • potatochobit - Tuesday, May 15, 2012 - link

    Hold on, just to be clear, all you tested was the IGP?
    This laptop is not running dual graphics, correct?
    Because I think that would make a large difference.
  • JarredWalton - Tuesday, May 15, 2012 - link

    To test dual graphics you need to have a laptop with dual graphics, and the prototype does not have that. You can't test what doesn't exist. And frankly, after the fiasco that was dual graphics on Llano (at least on the prototype system), AMD made the right decision to skip that element this time. Besides, dual graphics just puts more of a load on the CPU -- the weakest link in Trinity.
  • e36Jeff - Tuesday, May 15, 2012 - link

    Is there any way to play with the BIOS to lower clock speeds/disable cores to simulate how the low power parts (specifically the A10-4655M and A6-4455M) will behave? I know the power levels will be wrong as this isn't a LV/ULV part, but I (and I'm betting quite a few other people) would be very interested in seeing what kind of performance they can get out of the parts aimed at the ultrathins.
  • JarredWalton - Tuesday, May 15, 2012 - link

    No, the BIOS on this laptop is very basic and has no options for adjusting anything really useful.
  • Beenthere - Tuesday, May 15, 2012 - link

    I replied to your knee-jerk reaction back a few comments Jarred.

    I hope tomorrow is a better day than today for U. ;)
  • bji - Tuesday, May 15, 2012 - link

    You are not important enough to demand responses like you are doing. Trying to redirect Jarred's attention to that other pointless thread? Lame and off-topic in this thread.
  • Spunjji - Wednesday, May 16, 2012 - link

    Go away, please.
  • silverblue - Thursday, May 17, 2012 - link

    Can you and sans2212 go into a room and fight it out, please? One of you hates AMD, the other wants its babies, and I've a sneaky suspicion that, like matter and antimatter, you might actually cancel each other out.

    (they also both cease to exist, but I thought that'd be a bit cruel)
  • e36Jeff - Tuesday, May 15, 2012 - link

    oh well, thanks for checking, and thanks for the reply. I guess I'll just have to wait for you guys to get your hands on an actual LV/ULV trinity chip.
  • plonk420 - Tuesday, May 15, 2012 - link

    Is the Dell V131 tested with 2, 4, or 6GB (i.e. single or dual channel)? And how about the other laptops?
  • JarredWalton - Tuesday, May 15, 2012 - link

    I actually stuck in 2x4GB DDR3-1600 from the IVY system to make things "equal". Sorry for not noting that. I did the same for Llano, Trinity, and QC SNB. Same SSD, same RAM -- though Llano and SNB ran the RAM at DDR3-1333.
  • Iketh - Tuesday, May 15, 2012 - link

    Awesome article Jarred!

    I caught only one mistake...

    "Power consumption is also improved over Llano, making Trinity is a win across the board for AMD compared to its predecessor."
  • SuperVeloce - Tuesday, May 15, 2012 - link

    "Llano was already faster in general use than Core 2 Duo and Athlon X2 class hardware."
    This is so wrong, it beggars belief. Just for comparison, my old C2D T8300 (2.4GHz, 3MB L2) is actually faster than an A4-3300 overclocked to 2.8GHz in every task and benchmark I throw at them. Even at 2.6GHz it's only on par with the T7500 (2.2GHz, 4MB L2, year 2006 my friends).

    Well yes, I guess the A4 (not overclocked) is less power-hungry and can do quite a bit of undervolting, but you see my point... Llano is slower than the equivalent mobile C2D if OpenCL on the GPU is not in use.
  • JarredWalton - Tuesday, May 15, 2012 - link

    Care to provide some specific benchmarks that prove this out? Because by my numbers, it doesn't look that way:
    http://www.anandtech.com/show/2585/6

    I didn't post Cinebench 10 in the article, but Trinity scores 2834 single-threaded and 8222 multi-threaded. That makes the single-threaded score basically tied with the Core 2 Duo X7900 (2.8GHz) and the multi-threaded score is 50% faster. Llano on the other hand scores 2037 and 6824 in the same tests -- slightly slower than P8400 on single threads but faster on multi-threaded.

    PCMark 05 I can provide results for as well, though the SSD certainly skews things on Trinity. Trinity = 10824, Llano (HDD) = 6236, P8400 = 6561 (close enough to Llano), and X7900 = 7544. But I'm not talking specific tasks; I simply said "faster in general use" -- depending on which version of Llano you're talking about. The fastest Core 2 vs. the slowest A8 would probably be a tossup on the CPU side.
  • eanazag - Tuesday, May 15, 2012 - link

    I am disappointed that the desktop versions are not available till Q3. I was thinking this would replace my Core i3 540 at home soon.
  • PolarisOrbit - Tuesday, May 15, 2012 - link

    I am wondering if there is a better way to communicate value other than pricing because throughout the article the reviewer estimates Trinity at $600, while in the comments to readers the same reviewer's estimates vary from $700-$800.

    What is to be made of this discrepancy? I am wondering if it wouldn't be better to just avoid price predictions altogether. Surely there is some other way of describing relative value. Maybe estimating which other setups would have similar performance is enough by itself.
  • JarredWalton - Tuesday, May 15, 2012 - link

    I think it *needs* to be at $600 to sell, because SNB + GT 540M is already at $600. However, HP has hinted that their sleekbooks with Trinity will start at $600 and $700 for the 15.6" and 14" models, respectively. "Start at" and "comes with a reasonable amount of RAM and an A10 APU" are not the same thing. Until HP actually lists full specs and a price, I have to assume that the $600 price tag for the 15" model is going to be 4GB RAM, 250GB HDD, and an A6-4400 APU. Hopefully I'm wrong, but the fact is we don't know Trinity's real price yet, so in the article I'm referring to the price I think it should be at in order to provide a good value.
  • hechacker1 - Tuesday, May 15, 2012 - link

    Since most people, I assume, are coming from Core 2 Duo style laptops, I would like to see a comparison of Trinity with those.

    I know Core i processors are fast, but I don't know if AMD has caught up with Core 2 performance.
  • tipoo - Tuesday, May 15, 2012 - link

    Even with Llano they had caught up; with Trinity the margin will only be larger. Use this to compare whatever you want:

    http://www.anandtech.com/bench/Product/399?vs=62
  • cosminmcm - Monday, May 21, 2012 - link

    How about comparing Llano to a Core 2 Quad? And at about the same frequency.
    Here you go:

    http://www.anandtech.com/bench/Product/399?vs=50
  • This Guy - Wednesday, May 16, 2012 - link

    Sorry to be rude. I really think you missed the point of this chip.

    The CPU in Trinity is close to a 17W CPU with a 17W GPU. It performs about the same as an Intel 17W chip. Its graphics engine is far better, and the CPUs should cost about the same. The only real disadvantage over 17W Sandy Bridge is that in a prototype chassis Trinity uses more power, but a few watts should be shaved off on production models.

    This means AMD has caught up to Intel again! Yes AMD is going to lose spectacularly when ULV Ivy Bridge comes out and I doubt Trinity is going to scale at higher power but at low power, AMD has caught up!

    (Yes, I know that Sandy Bridge includes a GPU, but if you look at your benchmarks, ULV Intel with a dGPU scores similarly to Trinity when transcoding [the only really CPU-limited test in this review].)
  • ET - Wednesday, May 16, 2012 - link

    Something I just read at The Tech Report: when using MediaEspresso to transcode video, the VCE result was much smaller than QuickSync or software, yet they didn't notice a difference in quality. I would like to know what your experience was. If that's really the case I'd prefer VCE over Intel's solution even if it's slower.
  • Riek - Wednesday, May 16, 2012 - link

    As far as I know, VCE is not yet supported or made available by AMD.

    All those tests are due to OpenCL and not VCE, since that part cannot be reached at this point in time. (Yes, blame AMD for that one; this has already taken 6 months and still there is nothing about VCE.)
  • Spunjji - Wednesday, May 16, 2012 - link

    You're mistaken, there.

    Quote from Page 2:
    "Trinity borrows Graphics Core Next's Video Codec Engine (VCE) and is actually functional in the hardware/software we have here today. Don't get too excited though; the VCE enabled software we have today won't take advantage of the identical hardware in discrete GCN GPUs"
  • karasaj - Wednesday, May 16, 2012 - link

    When you go to the Llano review, the HD 4000 gets stomped by Llano's desktop graphics offering. When you look at Trinity, the notebook version of Trinity barely beats Llano. Why is it that Intel can practically fit the full power of their IGP into notebooks (getting nearly the same performance as from the 3770K) but AMD's mobile part is drastically weaker?

    Also - will we see a weaker HD 4000 in the dual-core/cheaper IVB variants? I think the Trinity desktop GPU will stomp on the HD 4000 and might actually be a viable budget gaming solution as long as the CPU improvements are good enough. We could see it take down quite a bit of the discrete graphics market, I think, considering the HD 4000 already can do that.
  • JarredWalton - Wednesday, May 16, 2012 - link

    It's an odd move by Intel, perhaps, but I think it makes sense. The mobile Sandy Bridge and Ivy Bridge parts basically get the best IGP Intel makes (HD 3000/4000), and what's more the clocks are just as high and sometimes higher than the desktop parts. Yeah, how's that for crazy? The i7-3720QM laptop chips run HD 4000 at up to 1.25GHz while the desktop i7-3770/K/S/T runs the IGP at up to 1.15GHz. SNB wasn't quite so "bad" with HD 3000, as the 2600K could run HD 3000 at 1.35GHz compared to 1.3GHz on the fastest mobile chips.

    Anyway, the reason I say it kind of makes sense is that nearly all desktops can easily add a discrete GPU for $50-$100, and it will offer two or even three times the performance of the best IGP right now. On a laptop, you get whatever the laptop comes with and essentially no path to upgrade.

    For AMD, if you look at their clocks they have them cranked MUCH higher on desktops. The maximum Llano GPU clock on mobile chips is 444MHz, but the desktop parts are clocked up to 600MHz. What's even better for desktop is that Llano's GPU can often be overclocked further still -- 800MHz seems to be achievable on many systems. So basically, AMD lets their GPU really stretch its legs on the desktop, while laptops are far more power/heat constrained. It will be interesting to see what AMD does with desktop Trinity -- I'd think 900MHz GPU core speeds would be doable.
  • scope54 - Wednesday, May 16, 2012 - link

    I'd like to see how much faster Piledriver is compared to a Bulldozer CPU with roughly the same specs (like an FX-4100 under-clocked to match speeds), to get an idea of what we can expect when the desktop variant releases.
  • Arbie - Wednesday, May 16, 2012 - link


    I would really have liked to see this! Of course none of these chips will run the game at high settings, but what will they do at lower resolutions? Contrary to all the nonsense printed, especially at its release, Crysis was a great game even at low res AND was incredibly scalable both up and down. It remains a great yardstick for comparisons because (a) it is still challenging and (b) we have so much history with it.

    Can you please at least add some average and minimum framerates, if in fact any settings were playable?
  • MySchizoBuddy - Thursday, May 17, 2012 - link

    Is it still PCI-e 2?
  • JarredWalton - Wednesday, May 23, 2012 - link

    Yes. PCIe 3.0 is not needed for lower performance parts, and I don't think Trinity laptops are going to be paired with high-end GPUs. I feel like it's already hitting CPU bottlenecks in some games even with the HD 7660G!
  • ArteTetra - Thursday, May 17, 2012 - link

    "What’s Makes a Trinity?"

    What's this? A citation, or a kind of slang? I really don't understand the usage of two verbs (my native language is not English).
  • Drewdog343 - Sunday, May 20, 2012 - link

    A typo =P
  • Targon - Saturday, July 7, 2012 - link

    I suspect it is a play on words. Trinity is the code name for this new generation of AMD APUs, so there has been the question about why AMD gave it this code name.
  • silverblue - Thursday, May 17, 2012 - link

    Then don't read them. Really, if you don't value what he has to say, then you're just doing yourself a disservice by adding comments.

    There really isn't any need to become all aggressive over your disdain for Jarred.
  • silverblue - Thursday, May 17, 2012 - link

    If you've seen it in the article, then it's hardly "buried". He has also reiterated this point in the comments.

    In any case, Jarred is spot on - an SB laptop with a dGPU wouldn't be much more expensive (if at all) and would perform better, though Trinity would excel in terms of power consumption. For a lot of people, its CPU performance would be more than enough, and it has easily the strongest IGP.
  • edge929 - Thursday, May 17, 2012 - link

    WTB Trinity A10 for a new HTPC, ETA?
  • medi01 - Friday, May 18, 2012 - link

    Somehow, over at Tom's Hardware, nobody wonders whether comparing chips at vastly different prices is a bad idea.
  • TC2 - Sunday, May 20, 2012 - link

    AMD's CPUs are far behind Intel's! As for the GPU, NVIDIA is also far better in performance and software support, so for professionals and buyers on a budget there isn't much point in going with AMD. :)))
    But for AMD, this is as good as it gets :))) Sorry.
  • kevmanw430 - Tuesday, May 22, 2012 - link

    Jarred, what version of Skyrim was tested? It came up in a forum discussion and I'd like to know.

    Thanks!
  • JarredWalton - Wednesday, May 23, 2012 - link

    Running the Steam variant, so it's automatically updated. At time of testing, I believe it was version 1.4 (with high res texture pack). The tested sequence is in the area of Whiterun, as I found in initial playing (on a desktop) that it was one of the more demanding areas. I wanted to find the "worst case" situation for playing the game, so that if you get >30 FPS in our benchmark, you'll generally get >30 FPS throughout the game. There's a reasonable chance that other parts of the game are less CPU intensive and thus might show Trinity in a better light, though it's debatable whether that's more or less meaningful than the current results.
  • seapeople - Tuesday, May 22, 2012 - link

    Why do you even read computer reviews then? Every single review I've ever read in my life on any site ever created includes a new, top of the line product in the graph REGARDLESS of what's being reviewed. It's not bias, it's not inattentiveness, it's not even remotely absurd as you state, it's quite simply expected for computer reviews.
  • seapeople - Tuesday, May 22, 2012 - link

    This is quite silly. We're not comparing a $90,000 Mercedes to a $16,000 Hyundai, we're comparing a processor that might be in an $800 computer to one that might be in a $600 computer based on how sales are at any particular point in time.

    It's amazing that two or three negative posters account for roughly 50% of the comments on this thread. Somebody is trying too hard.
  • Stradi - Friday, May 25, 2012 - link

    I'm a bit confused about how this review was done. Are they comparing the APU without an extra discrete GPU against Intel Ivy Bridge with a discrete GPU from NVIDIA and saying Ivy Bridge is better for gaming? Or is this comparing the APU in CrossFire with an AMD discrete GPU against the Intel/NVIDIA setup?
    I assumed the CrossFire setup would actually prevail against Ivy Bridge/NVIDIA in terms of gaming. I was waiting for Trinity not just because it has a decent GPU in there, but because it gives you the option to CrossFire with an AMD discrete GPU. And while I like the NVIDIA discrete GPU option, I didn't think it could harness the Intel HD 4000's graphics power and add to it.
    I need some clarification on this. I hope they didn't mean to compare AMD's new APU against Intel's latest plus NVIDIA's latest. Why would any reviewer think that represents a fair comparison? Why would they put an extra discrete GPU in one machine and say that's a better gaming machine?
  • rarson - Tuesday, May 29, 2012 - link

    The AMD APU is not paired with a discrete GPU.

    Obviously the review is confusing and full of incongruous comparisons, but Jarred and Anandtech aren't going to fix it.
  • rarson - Tuesday, May 29, 2012 - link

    Anandtech now responds to criticism by simply deleting it. Or was that Jarred's doing?
  • R3MF - Friday, June 1, 2012 - link

    The slides say OpenCL 1.1; will full hardware support for 1.2 arrive?
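
    In the meantime, here's a rough sketch of how one could check what version the installed driver actually reports (only standard OpenCL 1.x host API calls; the array sizes are arbitrary, and you'd build it against the headers from AMD's APP SDK or any other OpenCL SDK):

        #include <stdio.h>
        #include <CL/cl.h>

        int main(void) {
            cl_platform_id platforms[8];
            cl_uint num_platforms = 0;
            clGetPlatformIDs(8, platforms, &num_platforms);
            if (num_platforms > 8) num_platforms = 8;   /* only 8 slots were provided */

            for (cl_uint p = 0; p < num_platforms; ++p) {
                cl_device_id devices[8];
                cl_uint num_devices = 0;
                clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_ALL, 8, devices, &num_devices);
                if (num_devices > 8) num_devices = 8;

                for (cl_uint d = 0; d < num_devices; ++d) {
                    char name[256] = {0}, version[256] = {0};
                    /* CL_DEVICE_VERSION is a string like "OpenCL 1.1 AMD-APP (...)" */
                    clGetDeviceInfo(devices[d], CL_DEVICE_NAME, sizeof(name), name, NULL);
                    clGetDeviceInfo(devices[d], CL_DEVICE_VERSION, sizeof(version), version, NULL);
                    printf("%s: %s\n", name, version);
                }
            }
            return 0;
        }

    That only tells you what the current driver exposes, of course, not whether 1.2 support will ever be enabled for this hardware.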
  • fmcjw - Monday, June 11, 2012 - link

    Jarred, you're a self-engrossed writer... you keep writing out your suppositions and thought processes like "if you think it's... then..." or "don't get excited..." The problem is, I never think what you think I'd think, nor do I get excited where you think I would. And I really don't get the section on the AMD marketing stickers. It's tedious things like this that make me think: just get on with the article!

    That said, I think you're improving and it's great you're covering topics together with excellent writers such as Anand and Brian Klug.
