
82 Comments


  • SmileMan. - Friday, June 28, 2013 - link

    I can't see the results, "file not found". Maybe a bad link?
  • OoklaTheMok - Friday, June 28, 2013 - link

    Images are missing... 404
  • mmrezaie - Friday, June 28, 2013 - link

    neither I!
  • UltraTech79 - Tuesday, July 02, 2013 - link

    neither u!
  • karasaj - Friday, June 28, 2013 - link

    Yeah... Images broken for me as well :(
  • mikk - Friday, June 28, 2013 - link

    Images are broken. Nothing to see here, move on.
  • Ryan Smith - Friday, June 28, 2013 - link

    Sorry about that, guys. The images have been fixed.
  • karasaj - Friday, June 28, 2013 - link

    Thank you. Nice review. Do we know when Kaveri launches? Like November? I've been hearing good things about it, so hopefully they can really come through here.
  • Alexvrb - Friday, June 28, 2013 - link

    Thanks for the review. That's impressive for a stopgap refresh, using the same architecture and process. I look forward to Part 2.
  • UltraTech79 - Tuesday, July 02, 2013 - link

    Too late, we're gonna have to let you go. Hand over your badge and gun ={|
  • monstercameron - Friday, June 28, 2013 - link

    I don't think it's fair comparing TDPs; we all know they mean slightly different things for the two companies. Intel's 17W parts blow past 30+ watts from time to time.
  • andrewaggb - Friday, June 28, 2013 - link

    Ultimately the power measurements that matter are battery life in various use cases and how much heat/active cooling is required. 35W under full GPU load is probably acceptable as long as the idle power usage is competitive with Haswell. (I'm doubtful that it is, though.)
  • monstercameron - Friday, June 28, 2013 - link

    Yeah, idle is where Haswell has Richland and Kabini beat...
  • Khato - Friday, June 28, 2013 - link

    While I doubt anyone would dispute the fact that Intel allows its processors to turbo past their specified TDP (considering that, ya know, it's kinda part of Intel's specifications) your argument would be better served by providing actual numbers instead of inflating them into the realm of fantasy. Specifically, Notebookcheck's power numbers for systems using the i7-3667U show a delta of roughly 23W between maximum idle and maximum load. Meanwhile their review of the A10-4600M shows a delta of 44W between idle and load.

    Regardless, going over specified TDP isn't really an issue, in fact it's typically a good thing. But it definitely means that actual power draw must be considered when comparing performance.
  • monstercameron - Friday, June 28, 2013 - link

    Really bro? http://www.notebookcheck.net/Review-Fujitsu-LifeBo...
    Idle 9W, maxed out 40W... and that is only an i5.
  • Darkstone - Friday, June 28, 2013 - link

    Download ThrottleStop. Click on the button that says 'TPL'. Expected output: power limit #1: 17W, power limit #2: 21W, turbo time: 28s.

    Review the stress test screenshot in this review:
    http://www.notebookcheck.net/Review-Acer-Aspire-M3...
    Max. package power: 19W; current package power: 16.5W.

    Although the power usage can definitely exceed the TDP, it only does so for a short time. The effect on games or benchmarks taking >5 minutes is too small to make a difference. A processor with a TDP of 45W can be cooled by a heatsink rated at exactly 45W, but it might not always enter turbo mode for the full amount of time.

    But that's not all: this review compares the graphics performance of a 17W TDP CPU with a 35W AMD part. The Intel ULV parts are heavily power constrained. A 35W i5 part would probably be ~30% faster in games. See Notebookcheck, for example: Starcraft 2 on a 17W TDP Ivy Bridge: 31 fps; a 45W part: 41 fps. Starcraft 2 does not scale above 2 cores. Check Dead Space or Hitman Absolution for more numbers that tell the same story.
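The PL1/PL2 behavior described above can be illustrated with a simple energy-budget simulation. This is a rough sketch: the 17W/21W/28s figures come from the comment, while the budget model itself is a simplified assumption about how the firmware enforces the limits (real hardware uses an exponentially weighted average), not Intel's exact algorithm:

```python
# Rough model of two-level power limiting (PL1/PL2).
# Assumption: the package may run at PL2 while an energy budget,
# refilled at the PL1 rate, remains positive.

PL1 = 17.0       # sustained limit, watts
PL2 = 21.0       # short-term limit, watts
TURBO_TIME = 28  # seconds the budget allows at PL2

def simulate(load_watts, seconds):
    """Return average package power for a constant load, 1s steps."""
    max_budget = (PL2 - PL1) * TURBO_TIME  # joules available above PL1
    budget = max_budget
    total_energy = 0.0
    for _ in range(seconds):
        cap = PL2 if budget > 0 else PL1
        draw = min(load_watts, cap)
        budget = min(budget + PL1 - draw, max_budget)
        total_energy += draw
    return total_energy / seconds

# A demanding load can hold 21W only briefly; over a 5-minute
# benchmark the average collapses toward the 17W sustained limit.
print(round(simulate(25.0, 30), 1))   # 20.7 (first 30s: mostly at PL2)
print(round(simulate(25.0, 300), 1))  # 17.4 (5 minutes: near PL1)
```

This is exactly the point made above: the turbo window matters for short bursts, but a long benchmark effectively runs at the sustained limit.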
  • Khato - Friday, June 28, 2013 - link

    Bravo at finding a single example of a 17W SKU behaving abnormally in their review database. If you notice I specified a particular part which outperforms the one included in your linked review, and all reviews notebookcheck has done with that part show a delta of roughly 23W between maximum idle and maximum load.

    Just because there's an outlying result doesn't mean that it's the norm.
  • wumpus - Friday, June 28, 2013 - link

    Are you seriously suggesting that Intel draws more power than TDP for long enough to soak the thermal mass of the CPU plus the recommended heatsink? TDP means "thermal design power", and it needs to be taken into account only by heatsink and mobile manufacturers.

    If you want to know the maximum current an Intel chip will draw, pull the datasheet (not the stuff made for marketing or consumers: it should be a PDF with several hundred pages; if you really want to build a motherboard, sign the NDA for the real one with the bugs included). Intel will list the requirements for the motherboard's power supply, and don't expect the chips to draw more than that (even then the spec may include a certain capacitance and frequency limits for the power supply; Intel is still free to draw however much it wishes as long as the average within one period of the switching frequency is within the limits and can be supplied by the capacitance).

    Breaking the specs is a huge sin in this industry. Breaking what consumers think the specs are is irrelevant.
  • name99 - Friday, June 28, 2013 - link

    This reminds me of the whining that occurred when Intel first added turbo support to its CPUs --- apparently it's "unfair" to use the laws of thermal physics to improve the performance of devices.

    Look, Intel has done an INCREDIBLE job of allowing its devices to run at short high speed bursts, for responsiveness, while generally using extremely low power. This is a tradeoff that meets most people's needs very well, even if it's not an appropriate tradeoff for a server chip that's going to be running at 95% utilization 24/7.
    Others should be emulating Intel, not complaining that what they are doing is "unfair".
  • MrSpadge - Friday, June 28, 2013 - link

    Excellent posts here by you, wumpus, darkstone and others!
  • FwFred - Friday, June 28, 2013 - link

    I am very interested to see 28W Haswell GT3 vs. 37W Haswell GT2 vs. 25W/35W Richland, and 17/19W Richland vs 15W Haswell GT2/GT3.
  • Gabik123 - Friday, June 28, 2013 - link

    Why not Boot Camp a 2013 MacBook Air to show Richland performance against a GT3 HD 5000?
  • takeship - Friday, June 28, 2013 - link

    Or just post the Tomb Raider Value number from that review: ~28 fps. This is really a chip in search of a market. Richland can't replace a discrete setup except at the margins, and has lost its DX11 lead over Intel as well. Battery life was not mentioned for a reason. I'm very curious, Dustin, what the performance of the MSI looked like before you populated the last RAM slot. It seems that most OEMs would rather save the few dollars than even deliver baseline performance with these chips. Also, is there any chance at all that the Richland ULV line will get a review from Anandtech sometime in the future?
  • xTRICKYxx - Friday, June 28, 2013 - link

    Battery life was omitted because the laptop came with a 7970M. A 100W GPU is going to skew the results.
  • wcg66 - Friday, June 28, 2013 - link

    The APU is a good niche for AMD. These gaming numbers are pretty impressive IMO given the cost versus the Intel competition. I hope they continue to improve to the point that they can offer midrange discrete graphics card performance in a single chip (say, Radeon 7790 levels of performance).
  • mikk - Friday, June 28, 2013 - link

    desktop Haswell is a joke

    I would say Dustin Sklavos is a joke. Bad reputation for Anandtech.
  • nathanddrews - Friday, June 28, 2013 - link

    While a bit harsh for a professional review, it's not wrong.
  • superjim - Tuesday, July 02, 2013 - link

    ^ this
  • solarisking - Friday, June 28, 2013 - link

    Actually I'm glad he put that in there. Somebody's telling it like it is. I was a little surprised Anand seemed as pleased as he was with the first Haswell performance article.
  • claysm - Friday, June 28, 2013 - link

    I agree. Desktop Haswell is a total snooze. There's no upgrade incentive whatsoever from IVB or even SNB in my opinion.
  • MrSpadge - Friday, June 28, 2013 - link

    It's only a joke or disappointment if your expectations have been set far too high.
  • kyuu - Saturday, June 29, 2013 - link

    If you're *not* disappointed by desktop Haswell, I'd say your expectations are far too low.
  • Klimax - Sunday, July 07, 2013 - link

    No, just knowledge of what can be done and what the limitations of computing are. (For a converse example, see GPUs, which increase performance through significant complexity; we all saw how that worked out for manufacturing and power consumption...)

    Note: I know I am quite late, but too many people comment without understanding.
  • TGressus - Friday, June 28, 2013 - link

    "waste heat"

    :P
  • iamezza - Saturday, June 29, 2013 - link

    Desktop Haswell IS a joke; it's clear Intel put very little effort into improving the desktop chips.

    Mobile Haswell is extremely impressive though, from a battery life point of view.
  • whyso - Friday, June 28, 2013 - link

    Wow, that IGP is much worse than I thought it would be. It seems to be roughly 30-50% faster than the ULV HD 4000. That means it's roughly as powerful as the mobile GT2 HD 4600 (the HD 4000 SV is about 30% faster than the ULV HD 4000, and the HD 4600 is about 20% faster than the HD 4000).

    AMD needs to release a 45W TDP chip for larger performance notebooks.
  • xenol - Friday, June 28, 2013 - link

    I was hoping to see some power consumption and temperature observations.
  • MooseMuffin - Friday, June 28, 2013 - link

    You're disagreeing that desktop Haswell is a joke? Ivy was a bummer too, but at least it brought some minor things like PCIe 3.0. Haswell has brought basically nothing to the table.
  • FwFred - Friday, June 28, 2013 - link

    I thought GT3e (-R) SKUs and AVX2 were pretty significant additions on the desktop. Intel has the highest performing desktop IGP. Isn't that news?

    Yes, the -K overclocking folk are probably better served with Ivy-E.
  • kyuu - Saturday, June 29, 2013 - link

    When it costs an arm and a leg and isn't available, well, anywhere: no, it's not news. Well, I take that back: it may be news, but it's not *good* news.
  • kyuu - Saturday, June 29, 2013 - link

    I'm referring specifically to the GT3e parts, of course, in case it wasn't clear.
  • trisct - Friday, June 28, 2013 - link

    AMD won't interest me again until there's a high end APU with GCN cores available. I wouldn't buy a CPU that is about to be surpassed by a game console.
  • Rontalk - Friday, June 28, 2013 - link

    "Unlike with Trinity, AMD didn't seed Richland reference notebooks to reviewers"

    Ha ha, I wonder why!
  • Shivansps - Friday, June 28, 2013 - link

    Why is it compared to a ULT Haswell? Shouldn't it be compared to mainstream mobile Haswell parts?
  • FwFred - Friday, June 28, 2013 - link

    To be fair, there are only i5 and i7 mobile Haswells released. The i5s only come in the 15W (GT2/GT3) and 28W (GT3) variety. The GT3 comparison would be very interesting, but I'm not sure Anand has any of these parts.
  • Beenthere - Friday, June 28, 2013 - link

    Richland appeals to those who want the best performance/value relationship in a laptop. Few people know or care if a laptop APU is properly rated at 35W or falsely advertised as 17W like Intel does. What most laptop buyers care about is running actual software and reasonable battery life. Richland delivers what the majority of consumers desire, and at a price that won't break the bank. When you compare performance based on retail price, Richland is the winner, not the $200+ more expensive Intel models.
  • junky77 - Friday, June 28, 2013 - link

    Thanks for the review.

    What about power consumption?
  • Khato - Friday, June 28, 2013 - link

    In the many years that I've been reading Anandtech this may well be the first article of genuinely disappointing quality. It's clearly a conscious decision on the part of the author to omit the Haswell GT3 and GT3e benchmarks of previous reviews, apparently justified by the opinion expressed on the third page of, "The only reprieve AMD seems to be getting on this front is the unusual rarity of GT3-enabled parts in the market." And since the Haswell SKUs with faster graphics were omitted from the benchmark results leaving only the low-end 15W GT2 SKU to fight for the title against AMD's top of the line 35W Richland we get the bizarre conclusion of, "AMD continues to offer superior mobile graphics." What's even worse is that the author then acknowledges the fact that AMD has markedly slower graphics in their A8 and A6 lines. (The commentary on desktop Haswell in a mobile review is also a tad bit grating.)

    Anyway, I've always liked Anandtech as articles typically include all the relevant information along with informed commentary. Hopefully this review is merely a random anomaly.
  • Vi0cT - Friday, June 28, 2013 - link

    I agree with you in the sense that Dustin can/should get higher quality stuff out, but are you taking into account the price difference for GT3/GT3e-enabled solutions?

    The last time I checked, an Iris Pro 5200-enabled SKU was around ~600 USD; with that price difference you can get a current-gen discrete mobile GPU (from AMD or NVIDIA), and then Intel doesn't stand a chance (on the GPU side).

    As far as results go, in the HD 4000 <> HD 5000 comparison article the only benchmark we can relate to is Futuremark 3DMark 11.

    Summarizing for you here:

    MSI GX60 (A10 5750M + HD8650G): 1336
    AMD Trinity (A10 4600M + 7660G): 1138
    Intel HD 5000 ( 2013 13-inch MBA): 1080

    So... the HD 5000 got lower results than Trinity, and considering that the HD 5100 just got +0.1GHz, I don't think it's close to Richland.

    Then there is Iris Pro 5200, and then we get into HD 8850M /GT 750M territory price-wise.

    Not to mention that right now the AMD APU is a better option in the price/performance department (for gaming).
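A quick way to sanity-check those gaps: the percentage differences implied by the quoted 3DMark 11 scores (scores copied from the comment above, nothing re-measured):

```python
# Percentage gaps implied by the 3DMark 11 scores quoted above.
scores = {
    "A10-5750M + HD 8650G": 1336,
    "A10-4600M + 7660G":    1138,
    "HD 5000 (2013 MBA)":   1080,
}

def pct_faster(a, b):
    """How much faster a is than b, in percent (one decimal)."""
    return round((scores[a] / scores[b] - 1) * 100, 1)

print(pct_faster("A10-5750M + HD 8650G", "HD 5000 (2013 MBA)"))  # 23.7
print(pct_faster("A10-5750M + HD 8650G", "A10-4600M + 7660G"))   # 17.4
print(pct_faster("A10-4600M + 7660G", "HD 5000 (2013 MBA)"))     # 5.4
```

So by these scores Richland leads the HD 5000 by roughly a quarter, with Trinity in between.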
  • esgreat - Friday, June 28, 2013 - link

    I think the MacBook Air was power constrained (15W part). Iris 5100 is a better comparison not because of its +0.1GHz nominal frequency change, but because of its ability to operate at higher power (28W). This means it doesn't have to throttle down on power as frequently.
  • Vi0cT - Friday, June 28, 2013 - link

    Still, I don't think the 5100 can do a lot against the AMD A10-5750M's GPU; I believe that at best it can offer equal performance, and AMD still wins on price/performance (take into account driver optimizations too). Also, VLIW is a more parallel architecture, so it should fare better in non-mainstream GPGPU stuff; however, who runs that kind of stuff on a mobile device is beyond me. The best usage case I can think of (for a notebook) is Nebula 3 and the Volterra kernels on CUDA (professional audio), but that is on NVIDIA hardware.

    Still, I believe the market for both is a bit different; I mean, Intel is focusing on getting into more mainstream devices, meanwhile AMD is kind of stuck in this kind of device (GX60).

    Even being better than the Intel solution price/performance-wise, any design aimed at mainstream will prefer Intel's lower TDP and power consumption.
  • whyso - Friday, June 28, 2013 - link

    Look at the 4770K IGP results from the Iris Pro article. It looks like the HD 4600 SV is quite on par with even the top range of AMD's APUs (the 4770K is around 10-15% faster than Trinity; Richland is marginally faster than Trinity [sometimes slower], so it looks like the 8650G is barely faster than the HD 4600).
  • Vi0cT - Saturday, June 29, 2013 - link

    On the desktop, Richland is 34% faster than the HD 4600 on average.

    The only comparison point we have here for now is 3DMark 11 (sadly :/), and that puts Richland at a 29% advantage vs the HD 5000 in the MBA.

    Again, the peak theoretical performance difference between the HD 4600 and the HD 5000 is 63%. Between the HD 5000 and the HD 5100 there's only 18.3%, so even taking throttling into account, the best I can see the HD 5100 doing is being on par with Richland.

    Also take into account driver optimization, and the optimization for AMD/NVIDIA's GPU architectures.

    Still, with Richland we're still talking about VLIW; with GCN (Kaveri), AMD will gain ground against Intel.

    If done right, AMD can get this performance into the 15-20W targets and beat Intel at 28-35W with fewer problems in CPU-intensive games.
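Those "peak theoretical performance" percentages can be reproduced from EU counts and clocks. A sketch, assuming Intel Gen7/7.5 EUs deliver 16 FLOPS per clock (two SIMD-4 FMA pipes) and the commonly cited maximum graphics clocks; treat the exact clocks as assumptions rather than guaranteed figures:

```python
# Peak single-precision throughput: EUs * FLOPS-per-clock * clock (GHz).
# Assumption: 16 FLOPS/clock/EU for Intel Gen7/7.5 (2 x SIMD-4 FMA);
# clocks are commonly cited maximums, not sustained frequencies.
FLOPS_PER_CLOCK = 16

gpus = {
    "HD 4600":   (20, 1.35),  # (EU count, assumed max clock in GHz)
    "HD 5000":   (40, 1.10),
    "Iris 5100": (40, 1.30),
}

def peak_gflops(name):
    eus, ghz = gpus[name]
    return eus * FLOPS_PER_CLOCK * ghz

for name in gpus:
    print(name, round(peak_gflops(name)), "GFLOPS")

# Gaps roughly match the percentages quoted in the comment above:
print(round((peak_gflops("HD 5000") / peak_gflops("HD 4600") - 1) * 100))    # 63
print(round((peak_gflops("Iris 5100") / peak_gflops("HD 5000") - 1) * 100))  # 18
```

Of course, peak FLOPS only bounds the comparison; sustained clocks under a 15W or 28W power cap are what the benchmark numbers actually reflect.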
  • Khato - Friday, June 28, 2013 - link

    Pricing is definitely a legitimate concern... too bad that we have no clue how much AMD's mobile parts even cost, isn't it? They don't even publish recommended pricing. If we compare similar models that differ only in their processor manufacturer (Lenovo Edge E531 vs E535), then you're talking a $50 premium for the Intel chip on the base offering... though Intel's recommended pricing is the same for the upgraded i5-3230M as the base i3-3110M, so let's figure that the i5-3230M actually costs that recommended $225. In that case we arrive at roughly $100 for the base AMD Trinity and then something around $200 for the A10.

    Regardless it doesn't matter too much. For comparison, Iris Pro models start at a recommended price of $440 for the i7-4750HQ. And then all the non-i7 28W Iris parts have a recommended price of $342. But yeah, who knows what actual prices are? The i3-4158U (28W Iris 5100 part) may well be cheaper than AMD's A10-5750m. And as esgreat already stated, expect Iris 5100 to be quite a bit faster than HD 5000.
  • Vi0cT - Friday, June 28, 2013 - link

    I'm pretty sure AMD is giving lower prices to OEMs (else they would get almost zero design wins), and many OEMs will try to get extra margins from the AMD models. Also, don't underestimate $50. In developing countries (being from one myself), what looks like "50 bucks" makes a huge difference for the average Joe.

    It's the difference between the guy actually buying the AMD one or not.
  • Khato - Saturday, June 29, 2013 - link

    Oh, I wouldn't be surprised if AMD is giving lower prices to OEMs for their low-end models. But they want to make more money on their 'high end' SKUs same as Intel. Heh, and if Intel's i3-4158U outperforms AMD's top end mobile offering then I'm sure that the AMD part will be priced a bit lower than it. But it's hard to claim that AMD has a massive lead in price/performance if that's the case. Regardless, it's all speculation since neither AMD nor Intel give real pricing information for mobile parts.
  • Vi0cT - Saturday, June 29, 2013 - link

    However we can see it like this, maybe both sell the SKUs at the same price (let's say 5% more or less whoever gets it) and the OEMs get those "50 bucks" just cause it says Intel (there is a lot of change that is happening), then it's not Intel's pricing problem :P
  • Vi0cT - Saturday, June 29, 2013 - link

    there is a lot of chance*
  • FwFred - Saturday, June 29, 2013 - link

    Why would you think the maximum turbo rate has anything to do with the average GPU clock for a 15W part vs 28W?
  • kyuu - Saturday, June 29, 2013 - link

    There is no mobile GT3e part, so what are you even talking about? Or does your definition of "fair" somehow involve comparing mobile Richland to desktop Haswell?
  • Khato - Saturday, June 29, 2013 - link

    Yes there is. Mobile is the primary market for GT3e - there are 3 mobile products with Iris Pro 5200 and only 1 desktop product.
  • kyuu - Saturday, June 29, 2013 - link

    Apologies. I actually missed that 47W Haswells were considered "mobile" when I read about it the first time. Still, given the price and TDP disparity, I don't think comparing Iris Pro to Richland is terribly interesting.
  • FwFred - Friday, June 28, 2013 - link

    I sure hope 'Part 2' has battery life... an essential part of any mobile CPU review.
  • Frenetic Pony - Saturday, June 29, 2013 - link

    We never needed Bulldozer to begin with. AMD shot itself in the face with Bulldozer, just after its former CEO hoodwinked the shareholders who stuck with AMD, and the company itself, by making off with the manufacturing arm in the form of GlobalFoundries just as he was simultaneously presenting the whole thing as a viable, moneymaking business.

    We needed Bulldozer gone a long time ago. AMD, and thus we, need a new, much less power-hungry architecture that can fit into an SoC-like structure on the low end and make Intel stop piddling around on the high-TDP end. But unless both Kabini and the PS4/Xbone do very well this holiday season, AMD might not get a chance to produce that.

    Still, maybe Qualcomm will be up for it. I could see them even buying out the GPU division if AMD goes under. And their updated Krait this year is perfectly competitive with ARM's new Cortex-A15 AND Apple's Swift. Maybe they can just keep getting better and start to put pressure on Intel from the bottom of the power scale up.
  • torp - Saturday, June 29, 2013 - link

    But where are the notebooks with *just* a Richland and no discrete graphics?
    If I get an A10, I'll get it for the 'good enough' performance at a low price using just the integrated GPU.
    Does anyone make such a notebook, preferably one that's not a piece of crap, build-quality-wise?
  • Kalelovil - Saturday, June 29, 2013 - link

    "In this reviewer's opinion, 35W isn't the target, it's the halo. 15W-17W is the target, and while AMD has offerings at those TDPs, they're woefully uncompetitive."

    True, their 17W offerings are disappointing, but AMD has the 19W A8-5545M and 25W A10-5745M, which offer reasonable specifications (http://www.anandtech.com/show/6979/2013-amd-elite-...).

    Any chance you will be having systems using either of those in for review?
  • Kalelovil - Saturday, June 29, 2013 - link

    Corrected link: http://www.anandtech.com/show/6979/2013-amd-elite-...
  • TerdFerguson - Saturday, June 29, 2013 - link

    All the "we need AMD" shtick is getting very long in the tooth. Maybe it wouldn't be so offensive if authors were able to provide some scholarly references to back up the claims that consumers are going to really be hurt by AMD's failure to compete. There are certainly plenty of industries with many vendors where competition isn't a huge factor in pricing, even in the tech sector. I'm nowhere near convinced that AMD should somehow derive credit for Intel's progress, and the baseless whining in this article has done nothing to convince me otherwise.
  • kyuu - Saturday, June 29, 2013 - link

    ... I generally dislike being so blunt, but: wow, that's clueless even for someone whose name is "Terd".
  • Pneumothorax - Saturday, June 29, 2013 - link

    Wow... somebody forgets the 'good ol' days' of the 90s, when Intel basically owned the x86 market completely and routinely released its latest chips at around $900. For example, the original Pentium in the 'cheaper' 60MHz variant was released in 1993 at $847. And no, that was not an 'Extreme Edition' either. In today's dollars that's close to $1400. That is the x86 world without a viable competitor to greedy Intel.
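The inflation adjustment above is easy to sanity-check; a quick sketch, assuming an average US inflation rate of roughly 2.5% per year over the 20 years from 1993 to 2013 (the rate is an approximation, not official CPI data):

```python
# Compound-inflation estimate: $847 in 1993 dollars carried to 2013.
price_1993 = 847.00
rate = 0.025          # assumed average annual inflation (approximation)
years = 2013 - 1993

price_2013 = price_1993 * (1 + rate) ** years
print(round(price_2013))  # 1388, in the ballpark of the ~$1400 quoted above
```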
  • drothgery - Saturday, June 29, 2013 - link

    That was also a world where $3000 desktops were in "reasonable high-end" space, not "if you don't have a serious business case where you're maxing out the resources on this thing -- and you probably don't -- only buy it if you've got more money than sense" space.

    AMD was only a viable competitor to Intel from the trailing end of the P3 era to the Core 2 launch. If Intel was going to jack up their prices when AMD stopped being a viable competitor, they've certainly taken their time at it. They released a dominant product 7 years ago, have only increased and broadened their performance lead, and still aren't doing it.
  • TerdFerguson - Sunday, June 30, 2013 - link

    I haven't forgotten those heady Socket 7 days in the least. As I recall, one could buy x86 chips from IBM, Cyrix, AMD, and others. The $2000+ machines you're talking about were perhaps not marketed as "extreme", but they certainly performed remarkably well compared to the nearly-as-expensive 486 machines from Intel and others that they slowly replaced. Fast-forward 20 years, and we're down to two manufacturers and CPU prices are pretty much at an all-time low. So, where's the correlation? There are, meanwhile, a dozen different motherboard manufacturers, and prices have been rising like mad during that same time period. Again, where's the correlation?

    If having a large number of vendors automatically precluded ludicrous pricing, there'd be no such thing as price fixing.
  • mitcoes - Saturday, June 29, 2013 - link

    I agree, and I always miss a price/performance note in benchmarks. Perhaps with a second bar.

    i7 vs A8/A10 for gaming: on price/performance it's a no-brainer choice.

    And we all know that gaming is almost the only thing that requires real desktop performance, as almost every other common desktop app will run well on almost any current CPU+GPU.
  • johnny_boy - Saturday, June 29, 2013 - link

    I would have liked to have seen a system running dual-channel 1866 memory, since that would have offered an additional small boost to graphics performance. I'm surprised how much this evolutionary development over Trinity results in significant performance gains. Waiting for Kaveri now.
  • dineshramdin - Sunday, June 30, 2013 - link

    For me, I need something with a high-end APU... I sometimes feel it's irritating to get your CPU occupied with some unnecessary game console... I am not gonna buy this.
  • mikato - Tuesday, July 02, 2013 - link

    Error: "PCMark 7 is always going to respond primarily to the storage system, so the GX60's SSD takes a bath." (page 2)

    I thought the GX60 didn't have an SSD, and that's why it takes a bath. Dustin needs to take a bath, actually, since I keep hearing about all this bathing of computer hardware from him lately.
  • medi02 - Wednesday, July 24, 2013 - link

    It's hard to get where this conclusion is coming from:

    "Graphics performance will at best be slightly above parity, while CPU performance takes a bath."

    Intel's HD graphics in this very article is roughly 2 times slower than AMD's APUs (while the gap between CPUs is about 1.5x).

    This means that if you occasionally play games you should avoid Intel notebooks without dedicated graphics cards, while you're fine with AMD's without. And I have yet to find an app that I would run on a notebook, besides games, that would seriously benefit from a faster CPU.
  • PsychoticFlamez - Thursday, August 22, 2013 - link

    OK, let me just say something: all these sites say the new CPU is the same as Trinity's, but it's not. Richland has improved the CPU and integrated GPU so much that it's comparable to your mid-range desktop. I would know; I upgraded not too long ago, and the speed increase is about 60+ fps in my games. P.S. I do not have a dedicated video card in my computer.
  • webcat62 - Tuesday, September 24, 2013 - link

    I just bought an HP AMD A10 laptop: 2.5GHz CPU, 8GB of RAM, 1TB hard drive, ATI Radeon 2500 with 768MB of memory, 8MB of L2 cache, Bluetooth, multi DVD writer, 2x USB 3.0, USB 2.0, glossy screen, 5-hour battery life, HDMI port, 10x card reader, loaded with Windows 8. I bought it at Future Shop; there were only 10 units available, for $399.00 + tax = $480.00. This laptop retails on the web for between $650 and $700. How is that for a great bargain? It does not overheat; I leave it on all day, and I play the most demanding games at medium resolution. For this price it does not get any better.
    webcat62
  • UtilityMax - Thursday, December 26, 2013 - link

    AMD needs to bring something new to the mobile APU market ASAP. If this APU were compared to a portable with Intel's 35W Haswell processor with HD 4600 or even HD 4000 graphics, the massive lead of the APU in 3D games would disappear. I mean, the A10 may still be a little faster, but not by a truly significant margin. At best, it competes with a Haswell i3, which will be priced aggressively, considering Haswell i5 portables can go for $600 or less.
  • touristguy87 - Wednesday, April 09, 2014 - link

    "AMD doesn't just need Kaveri. We need Kaveri."

    No, we don't. Maybe you need it, but the majority of the computer market certainly doesn't. This is why the desktop/laptop/notebook market is dying rapidly: people only upgrade their machines either when the machines die (because a component fails) or when they miraculously find themselves in need of a much faster machine. Short of that, all they do is buy a new one to fill a need for a new one, and then the hardware is plenty fast enough for most needs most of the time. The big issue becomes price, because there's no need to spend $2500 for a top-of-the-line laptop. Oh sorry, $1000. I'm thinking of 2000 prices, 2005 maybe.

    Phones. Tablets at most. That is where the market is. 35W laptops are an afterthought. Especially running Windows 8, especially for gaming.
  • limbo90 - Saturday, September 13, 2014 - link

    Currently I'm considering the MSI GX60 3CC Destroyer laptop that comes with the AMD A10-5750M and Radeon R9 M290X... Most reviews said that it is not a recommended laptop for gaming because of the A10-5750M bottlenecking the Radeon R9 M290X... Some said it is only suitable for single-player games, not online gaming/online multiplayer... Any opinion on this laptop's specs?
