
95 Comments


  • vitaminwater - Tuesday, November 08, 2011 - link

    1st time poster, long time lurker

    an AMD X2 in this day and age?? you can get an Athlon X4 955 Black Edition for $120 shipped

    for a couple of bucks more you get MONSTER performance for a budget build

    just my 2cents

    keep up the good work!
    Reply
  • JarredWalton - Tuesday, November 08, 2011 - link

    $120 is twice the cost of the Athlon X2, and at that price you could also get the Llano A8 processors or an i3-2100. You could also add a discrete GPU for the same $60 increase, or some other upgrades like a better display. Given the higher power requirements of the 955BE relative to other choices, I'm not sure it's any more sensible than an inexpensive Athlon II X2. Reply
  • jefeweiss - Tuesday, November 08, 2011 - link

    Part of the problem with the comparison of the low-end A4 part is that it should be compared against a system with a comparable combined cost of CPU and graphics, because you get both with the Llano parts. The cost of the Celeron plus a 5670 discrete card, or the Athlon II plus a 5670, would give you enough money to buy a much nicer Llano processor. Otherwise it doesn't make sense to compare the Llano to the other two with discrete graphics; you would have to use their integrated graphics. Part of the value proposition of Llano is that you can game on a system without having to buy a discrete card, or at least that was my understanding. Reply
  • Wierdo - Tuesday, November 08, 2011 - link

    Yeah that was strange, are we comparing products we can buy for X price, or not? Makes no sense. Reply
  • mczak - Tuesday, November 08, 2011 - link

    The GPU of the A4 is nowhere close to an HD 5670, however; there's a reason its GPU is called the HD 6410 (not even the A8-3850 is really close to that, fwiw; it's more like an HD 5570)... I'd guess it's quite comparable in performance to an HD 6450: it has significantly less shader power (thanks to a much lower clock), but it can possibly make up that loss with its higher memory throughput (not against the GDDR5 HD 6450, of course, but those aren't available anyway).
    Granted, the price difference between an HD 6450 and an HD 5670 is not really all that large, but the 3D performance of the A4 is going to be low enough that you can't really say it comes with both CPU and graphics while the Celeron does not (yes, the HD 2000 will be slower; an HD 3000, such as in the i3-2105, could possibly challenge it).
    Reply
  • mino - Wednesday, November 09, 2011 - link

    Well, the Celeron, or any Intel for that matter, has no proper 3D drivers.

    So no, it does not come with a GPU. Even a Radeon 9700 from 2002 has more "GPU" in it.
    Reply
  • Paul Tarnowski - Tuesday, November 08, 2011 - link

    Haven't yet read anything beyond the first page, but I'd say the problem in your thinking is assuming that budget builders are going to want to spend the extra dosh. In fact, most budget builds are extremely tight. When I'm wearing my system builder hat I have to be very careful when building these things. For the price of your 955 I can get both a CPU and a motherboard.

    It's just not going to be much of a gaming build. Budget PCs aren't meant for gaming.
    Reply
  • MonkeyPaw - Tuesday, November 08, 2011 - link

    I think Microcenter is probably the best kept secret in CPU pricing, as they actually regularly check and try to beat Newegg pricing. For example, they are selling the G530 for $39.99 as I type this. They are also selling OEM Phenom II X4 830s for $49.99. If you need a retail model, the 840 is $10 more. That would probably be my first choice for a "budget CPU" right now unless absolute efficiency was the goal. At least you can overclock them still. Reply
  • Roland00Address - Tuesday, November 08, 2011 - link

    The Phenom II X4 830 is a true Phenom II quad-core at 2.8GHz with 2MB of L2 cache and 6MB of L3 cache.
    The Phenom II X4 840 is a rebranded Athlon II X4 at 3.2GHz with 2MB of L2 cache and no L3 cache.

    Agreed that microcenter is awesome.
    Reply
  • TrackSmart - Tuesday, November 08, 2011 - link

    Agreed x 2. $50 for a 2.8 - 3.0 GHz quad-core is the way to go (assuming you have access to a MicroCenter store). I think they advertise that deal every week. It makes me want to build a system, even though I have no need for another one, just because it's such a good deal. And I guess that's the point. Reply
  • Taft12 - Tuesday, November 08, 2011 - link

    For a long time lurker, you could have done a lot better with your first contribution. You've completely missed the point of this article. Try harder for your second. Reply
  • mhahnheuser - Friday, November 11, 2011 - link

    ...IMO a very rude response. Maybe you should use your energy on something other than punching the keys on your keyboard. He was only pointing out that you could get much more CPU performance for relatively few extra dollars. I thought it a valid point myself. Reply
  • SleepyFE - Tuesday, November 08, 2011 - link

    My friend, you are thinking of the first-generation Athlon, which would be a bit slow and old. For anyone not needing more cores (since you can only play one game at a time and other things are less taxing), a good idea is to look at the Phenom II X2 555. I am from Slovenia (it's in Europe) and you can get it for 80€, while the Athlon at 3GHz is about 65€.
    Also, it is a good idea to buy a motherboard with USB 3.0, since USB 2.0 will get real slow real fast once you get a taste of USB 3.0.
    Reply
  • antef - Tuesday, November 08, 2011 - link

    Thanks for this, but could you do a mid-range guide by Christmas? :) I plan to build over my vacation. Can't decide if I should do 8 GB of RAM or more, if I should wait for new AMD video cards, or what SSD I should get... Reply
  • Beenthere - Tuesday, November 08, 2011 - link

    antef-

    8 GB is plenty for most folks. AMD vid cards should be available in Jan. SSDs are still immature technology and some what unreliable and having compatibility issues. If you can afford BSODs, lost data, weekly firmware updates and RMA'ed drives, then an SSD may work for you. If not you may want to wait another six months and see if they sort the problems out better.
    Reply
  • JarredWalton - Tuesday, November 08, 2011 - link

    Woah there tiger... SSDs are not 100% reliable, but let's not get carried away. I've yet to experience a BSOD that I'd attribute to my use of SSDs (haven't been using SF-2200 stuff, though), and firmware updates are really only necessary if you're an early adopter of a specific model. I've got a Vertex, Vertex 2, a couple Intel SSDs, and several 64GB Kingstons that are all running fine. Now, they're not inexpensive so I wouldn't necessarily force one into a budget build, but for midrange builds I would definitely try to get at least a 120GB in there for the OS and apps. Reply
  • StevoLincolnite - Tuesday, November 08, 2011 - link

    Never had a problem regarding system stability and SSDs.

    Had my OCZ Vertex 2 60gb drive for months. Never had a crash or RMA'd the drive, lost data... Or even updated the firmware. It just works and it's an upgrade I highly recommend to anyone.

    Currently counting 17 days of total up-time without a single reboot, crash or error even with a "buggy SSD".

    The great thing about an SSD is how much faster and more responsive Windows feels over a mechanical drive.
    Reply
  • slayernine - Tuesday, November 08, 2011 - link

    I just RMA'd my one-year-old OCZ Vertex that randomly died on a desktop system that saw very light use. Crossing my fingers that my replacement does not fail. A close friend of mine had that happen to him: bought the same or a similar model of OCZ, it died, got a replacement, and it died too. The claimed 3% failure rate is more like 30-40% by my estimation. I think many people just don't bother to RMA due to the shipping costs, effort and system downtime. Reply
  • Death666Angel - Tuesday, November 08, 2011 - link

    I have an OCZ Agility (first Indilinx generation) 60GB and an OCZ Vertex 2 120GB running, alive and well to this day. The 60GB was in my desktop and got transferred to the Acer Travelmate 8172 once the Vertex 2 got affordable. Have flashed them and done one or two secure erases, but other than the Vertex 2 giving the BIOS some SMART-data grief after a cold start, no issues. Also, I have been employing a 1.8" OCZ Onyx for about 2 months in my HTPC without issues. Reply
  • geniekid - Tuesday, November 08, 2011 - link

    "SSDs are still immature technology and some what unreliable and having compatibility issues."

    That I can somewhat agree with.

    " If you can afford BSODs, lost data, weekly firmware updates and RMA'ed drives, then an SSD may work for you."

    That's definitely hyperbole. Maybe you had a bad experience, but I counter your anecdotal evidence with my own - I've been using an Intel X-25M for a year now with no firmware updates or lost data.
    Reply
  • antef - Tuesday, November 08, 2011 - link

    I'm not worried about getting an SSD. I think reliability and such has really improved if you stick with the right brand. Even so, if I have a problem it won't matter much, since I won't be keeping any personal data on it, just my OS and apps. If it has a decent warranty I think it's absolutely worth the increased performance and responsiveness that everyone raves about.

    Everyone else agree on 8 GB memory? I have kept my last build for over 3 1/2 years so I could potentially keep this one for a similar amount of time.
    Reply
  • Paul Tarnowski - Tuesday, November 08, 2011 - link

    Right now 8 GB is right around $50-$60, and less if you look for deals (like $35 for 1333 on Newegg's shellshocker today). 16GB 1600 can be gotten for as little as $70 on special. So if you find a deal, and it's in your budget, I'd say go with 16GB. It's probably more than you need, but it'll mean you can get rid of your pagefile. If you have an SSD, that's always a plus.

    Either way, just stick with 4GB sticks as the 8GB sticks are overpriced.
    Reply
  • Lunyone - Tuesday, November 08, 2011 - link

    I generally think that Llano is really good for laptops and not as good in the desktop arena. This is just my opinion. Llano isn't that much more expensive than the 2 other budget builds above, but does better in games (at stock configuration) than either of them. Another note is that the Llano platform doesn't have much of an upgrade path, IMHO. The AM3+/1155 mobos I think are better options for upgrades than the FM1 socket/chipset. Isn't Piledriver a different socket/chipset?

    Now don't get me wrong, Llano chips aren't all that bad, but I just don't see them being a good/better long-term buy in the desktop arena. If I'm just building a regular "Desktop" they are fine, as are the other 2 budget builds listed above. But since I usually build more than just a regular Desktop, I usually opt for a system that works better with a dedicated GPU (mainly for better gaming capabilities). Of course this is just my preference and is a bit different from what the article was looking at. These builds look pretty good for what the design/budget is aiming for.
    Reply
  • StevoLincolnite - Tuesday, November 08, 2011 - link

    Rumors are that AMD may end up making Socket FM2 backwards compatible with socket FM1, just like they did with Socket AM2+ and AM3.
    Nothing solid yet, of course.
    Reply
  • Lunyone - Tuesday, November 08, 2011 - link

    Well, if that is the case, then the Llano build might not be too bad. I just see Llano as a good laptop chip and an okay desktop chip. For probably 90% of people, Llano will do just fine, because they probably wouldn't notice much difference from what they currently own. Most people will notice an SSD as the main boot drive more than they'd notice an i7 quad-core instead of a Llano chip. Now if you're doing encoding or some heavier CPU-related tasks, then you'll definitely notice the CPU difference with the i7 quad-core (of course you'll be paying for that option too). Reply
  • Taft12 - Tuesday, November 08, 2011 - link

    I seem to be in the minority, but I think desktop Llano is a great product that is the best choice for 80% or more of desktop systems. You sacrifice CPU capability (which has been "good enough" for just about everything ever since the Core 2 Duo and Athlon 64 X2 days) and in its place you get much (MUCH!) improved IGP performance, which will have no problem whatsoever with Flash games or 1080p YouTube.

    Hardly any of my family or coworkers play 3D games or use Photoshop, video transcoding, etc -- I would think this is a pretty accurate sample of the majority of PC users.
    Reply
  • JarredWalton - Tuesday, November 08, 2011 - link

    Have you tried Flash games and 1080p YouTube on Sandy Bridge? I haven't had any issues with either one, but I'd love to know what specific games/videos you've tried that didn't work properly. AFAIK, even desktop Core 2 Duo systems handle YouTube 1080p pretty well, doing all the decoding on the CPU.

    The other issue I have is that you imply all Llano are roughly equal, and obviously they are not. A quad-core Llano with the full on-die GPU is a far more interesting chip, but then it costs twice as much as the A4-3300. 160 shader cores is barely enough for most games at minimum detail and a reasonable resolution (hint: 1024x768 isn't "reasonable" in my book; 1366x768 for laptops and 1600x900 minimum for desktops is what most users are running).

    Saying Llano is the best choice for 80% of users is like saying 80% of users shouldn't have ever upgraded from a Core 2 Duo E6600. Hey, wait... I'm running one of those in my HTPC!
    Reply
  • QChronoD - Tuesday, November 08, 2011 - link

    Would you guys consider this level of hardware adequate for HTPC duty? I'm guessing that it might need a discrete video card since most of the boards don't have HDMI. What would you recommend as the minimum video card for good-quality playback & passing the audio to a real AV receiver? Reply
  • duploxxx - Tuesday, November 08, 2011 - link

    Any Llano build will satisfy that need. Reply
  • geniekid - Tuesday, November 08, 2011 - link

    I'm using an A6-3500 in my HTPC connected to my receiver via HDMI.

    In my honest opinion, you don't need a graphics card. 34 of the 42 FM1 motherboards on Newegg as of this moment have built-in HDMI.
    Reply
  • Taft12 - Tuesday, November 08, 2011 - link

    Not only do you not need a graphics card, you'll have much better thermals and acoustics from your small form factor HTPC case without an additional furnace in the case. Reply
  • Death666Angel - Tuesday, November 08, 2011 - link

    This was precisely why I chose an A6-3500 for my HTPC. I came from a Core i3 530 (which could OC to about 4GHz) and an HD4670. I used it for a few games (beat em ups, racing games etc.) and of course movies/tv series of all sorts. But once I decided to get a new, smaller case (I had a desktop HTPC case that fit 7 HDDs, but they moved to a dedicated file server, making the case too big for the living room), I decided against the SNB lineup because of the crappy game performance. My Llano setup saved me money compared to a SNB setup with equal graphics power. And then, such a system would not have allowed the case I ended up using (half height, no PCI slot in the back, 60W PSU). It is awesome to just have to worry about cooling one thing, not an extra graphics card and no IGP chip on the motherboard. :-)

    Overall, I think this article is interesting but a bit all over the place. As others have mentioned, you compare Llano sans graphics card to X2/Celeron with graphics card.... you don't show the power consumption of those two with graphics card.... While I agree with the overall theme of it, it is not very thorough or thought through. I think people not familiar with all the different tests will have a harder time figuring out which is the best setup for them.

    I also don't like the dual-core Llanos. The graphics part is too cut down. The triple-core for me was a great balance, because I needed to fit the 60W envelope of the PSU (works great with undervolting, I'm at 54W@2400MHz LinX) and I wanted a powerful graphics part. And the difference between the dual-core and triple-core parts for me was about 15 or 20€. If I wanted, the triple-core also looks to be easily overclockable with K10Stat from within Windows.

    But enough ranting for now. :D
    Reply
  • piroroadkill - Tuesday, November 08, 2011 - link

    Sure, the GPU may be a bit worse than the Llano system, but you shouldn't underestimate the usefulness of already having a socket 1155 board. You're immediately open to THE best upgrade path with nothing more than a flip, a click, and a pop of 4 pins. Reply
  • piroroadkill - Tuesday, November 08, 2011 - link

    Along those lines, I'd spend a teeny bit more to begin with on the PSU. Then you're all set. Reply
  • Lunyone - Tuesday, November 08, 2011 - link

    The PSU is just fine, IMHO. I have an e6700, a 6850 GPU, 4GB of DDR2, 2 DVD drives, & 1 HDD being powered by the Antec 380W PSU. It runs just fine with no issues whatsoever, so these systems have plenty of upgrade path with the PSUs selected. Now if you get a really power-hungry GPU then you should upgrade the PSU, but for most people the PSU is just fine. Reply
  • Dustin Sklavos - Tuesday, November 08, 2011 - link

    Honestly, people grossly overestimate the amount of PSU they need these days. My desktop is running an 80 Plus Gold 1.2kw PSU, but under heavy load only pulls maybe 510 watts? That's with an overclocked i7-990X, overclocked GTX 580, four hard drives, an SSD, a separate USB 3.0 card, AND a GeForce GT 430 driving two other monitors.

    So long as the PSU is quality (and mine is) and the amperage is right, people could be getting by with a LOT less these days than they actually use. On the flipside, if I ever completely lose my mind, at least I know my desktop'll take a second 580. ;)
    Reply
  • JarredWalton - Tuesday, November 08, 2011 - link

    Look who's talking about overestimating power requirements. LOL. 1.2kW PSU, Dustin? I'm running my GTX 580 off a "paltry" 750W model. ;-) Reply
  • ckryan - Tuesday, November 08, 2011 - link

    Man, my OC'd 2500K P67 system idles at 50W at the wall and pulls about 200W gaming... with a 650W Seasonic X series PSU. That's about the same ballpark of ridiculous, but as soon as someone makes an equally great 350W unit I'll get one. 1.2kW is pretty silly though... I hope the PSU was comped at least. I got the Seasonic for $100 shipped and the fan never comes on even when gaming, so that's one justification.

    Incidentally, I just purchased the G530 to drop in an H67 board for my SSD endurance testing rig. After playing with it for a while, I think it's well worth the premium the CPU is going for over MSRP. I could see just about everyone I know using a G530 and thinking it's the greatest thing ever. The problem with SB Celerons and Pentiums is the motherboards. H61 boards are just not that good, and H67 boards are a little too expensive for a $50 processor. I already had a Biostar TH67+ laying around, and the system is now sitting in my entertainment center trying to put an SSD in the dirt at a rate of over 10.5TB a day. If Intel actually cared about what I want, they'd release an i3 K series, as that's the SB I'd really like to see. I considered getting an i3, but I couldn't justify spending $130 for a processor that will spend its entire life at idle putting SSDs to the sword.
    Reply
  • Dustin Sklavos - Tuesday, November 08, 2011 - link

    Yeah, the PSU was comped. That's why I'm using it. 1.2kW is most definitely nuts. ;) Reply
  • jabber - Tuesday, November 08, 2011 - link

    Yeah, we really need a manufacturer to bring out the ultimate badass 450W PSU that delivers top quality, so most of us can use that instead of these frankly ridiculous mega supplies we feel we need to buy. Reply
  • DanNeely - Tuesday, November 08, 2011 - link

    Unless you're more concerned about squeezing every last watt of efficiency out of your system than about noise under load, you want a PSU that maxes out 200-300W above your peak consumption level, so that its fan never goes above idle speeds. For a gaming box that typically means an extra 10-20W drawn while the system is idle, since you're in the <20% load low-efficiency zone on most PSUs. Reply
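    DanNeely's sizing rule can be turned into a quick back-of-the-envelope calculation. The sketch below is illustrative only: the 250W headroom default, the common-size list, and the efficiency figures are assumptions standing in for a real PSU's measured curve, not data from any specific unit.

```python
# Rough PSU sizing sketch based on the rule of thumb above: pick a
# capacity ~200-300W over peak draw so the fan stays at idle speed.
# All numbers here are illustrative assumptions, not measured data.

def recommend_capacity(peak_draw_w, headroom_w=250):
    """Peak DC draw plus headroom, rounded up to a common PSU size."""
    common_sizes = [350, 450, 550, 650, 750, 850, 1000, 1200]
    target = peak_draw_w + headroom_w
    for size in common_sizes:
        if size >= target:
            return size
    return common_sizes[-1]

def efficiency(load_w, capacity_w):
    """Assumed efficiency by load fraction: poor below 20% load."""
    frac = load_w / capacity_w
    if frac < 0.20:
        return 0.75   # low-load penalty zone
    elif frac <= 0.60:
        return 0.87   # sweet spot around 50% load
    else:
        return 0.84

# Example: a gaming box idling at 80W DC, peaking at 400W DC.
cap = recommend_capacity(400)            # -> 650
idle_wall_w = 80 / efficiency(80, cap)   # ~107W at the wall
print(cap, round(idle_wall_w))
```

    The idle figure shows DanNeely's point: with a generously sized supply, the idle load sits in the low-efficiency zone, costing a handful of extra watts at the wall.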
  • piroroadkill - Tuesday, November 08, 2011 - link

    That's odd, because my i5-2500k @ 4.5 with a Radeon 6950, 8GB RAM and 7 hard drives pulls around 150w idle. Seasonic X-660. Under load, we're talking north of 300w easy.

    When I had a Q9550 @ 3.8 and a Radeon 4890, it pulled about 230w idle. That was with a Corsair HX520. I easily pushed 400w at the wall under CPU + GPU load, and I was actually pretty afraid to load both to the maximum.

    I have a power meter permanently hooked up to my PC.
    Reply
  • piroroadkill - Tuesday, November 08, 2011 - link

    I realise this is AC load, not DC load. However, I have been running pretty efficient PSUs.

    I do completely agree people overestimate vastly.
    Actually, my old Radeon HD 2900XT used MORE power than my 4890.
    Reply
  • Taft12 - Tuesday, November 08, 2011 - link

    It's the hard drives that are pushing up your idle power usage. WD Blacks or 2+ TB "green" drives use 6-8W each at idle. Reply
  • erple2 - Friday, November 11, 2011 - link

    No, it's the 4890 combo with the Q9550 that's pushing that kind of draw, even at idle. Drives typically consume "only" 7-8W each at idle, and only about 10W under load. So expect the drives alone to contribute 49-56W at idle. The top 2 consumers in that setup are clearly the GPU, then the CPU.

    My i7-950 + 6870 and one WD Black drive eats 200W at idle.

    My old computer (core2duo 6750, 4890, and similar drive) used to idle at 240W give or take.
    Reply
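    The idle-power bookkeeping in the last few comments can be tallied explicitly. The per-drive wattage is the ballpark figure quoted in this thread; the GPU, CPU, and board/RAM/fan numbers are illustrative assumptions chosen to match the era's hardware, not measurements.

```python
# Rough idle-power tally for the Q9550 + HD 4890 + 7-drive box discussed
# above. Per-component wattages are assumed ballpark figures, with the
# 7-8W-per-drive number coming from the comments in this thread.

idle_draw_w = {
    "HD 4890 (idle)":        55,        # older GPUs idle high
    "Q9550 @ 3.8GHz (idle)": 40,
    "7 HDDs @ ~7.5W each":   7 * 7.5,
    "board/RAM/fans":        35,
}

dc_total = sum(idle_draw_w.values())
psu_efficiency = 0.85                   # assumed, for AC at the wall
wall_total = dc_total / psu_efficiency

for part, watts in idle_draw_w.items():
    print(f"{part}: {watts:.1f}W")
print(f"DC total ~{dc_total:.0f}W, at the wall ~{wall_total:.0f}W")
```

    That lands in the same neighborhood as the ~230W at-the-wall idle figure reported above for that system, which is consistent with the GPU and CPU, not the drives, dominating the total.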
  • Iketh - Tuesday, November 08, 2011 - link

    That's funny, because I have a 2600K @ 4.2GHz converting to x264 as I type this, using a steady 170W. That's with 2 Seagate greens, an SSD, and a Radeon 6870. My power supply is an Antec 380W. If I game at the same time, it's at 250W. Idle is 92W. Sounds like there is a little tweaking you can do in your BIOS.

    I also have a power meter permanently hooked up to my PC.
    Reply
  • wifiwolf - Tuesday, November 08, 2011 - link

    Just as a side note: you're in the 50% spot, so max efficiency. Reply
  • DominionSeraph - Tuesday, November 08, 2011 - link

    $420??

    Inspiron Desktop 560 Mini-Tower
    Processor: Intel Pentium Dual Core E6700 (3.20GHz 2MB)
    Genuine Windows 7 Home Premium
    500 GB SATA Hard Drive (7200 RPM)
    3 GB DDR3 SDRAM 1333MHz (3 DIMMs)
    16X DVD +/- RW Optical Drive

    $289 at Dell outlet.

    Why the heck would anybody build a budget system?
    Reply
  • slayernine - Tuesday, November 08, 2011 - link

    For friends and family that mostly just Internet browse but want a system that could perhaps be upgraded in the future as most prebuilt systems don't allow.

    Also building your first entry level system lets you get into system building or gaming without breaking the bank. Your Dell system will not play any games without a dedicated card and would likely need a power supply upgrade if you wanted to install a dedicated video card. Also airflow in most consumer desktops is not suitable for a gaming system unless you buy something like a Dell XPS which then puts you into a much higher price bracket. At that point you will realise why custom built is better :)
    Reply
  • jabber - Tuesday, November 08, 2011 - link

    Indeed, I've had many a Dell Dimension or similar in for 'upgrades', and by the time you check them over, the HDD and RAM are about all that's worth upgrading. If you want to put another HDD in then you have to contend with their bafflingly overcomplicated HDD mounting systems. Why they have to use 8 parts when other cases use just a simple slot-and-screw method I don't know. Reply
  • Taft12 - Tuesday, November 08, 2011 - link

    Non-terrible graphics performance and a Windows install free of "value-add" bloatware for starters Reply
  • slayernine - Tuesday, November 08, 2011 - link

    Did I just hear you say Intel onboard graphics offer "Non-terrible graphics performance" ? I hope you are just trolling me because that may just be the most ridiculous statement I've seen all day. Reply
  • slayernine - Tuesday, November 08, 2011 - link

    This comment system lacks an oh crap how do I delete that last post function. I get what you are saying taft12, you are saying a budget system not using onboard graphics allows for non-terrible graphics. Reply
  • Esben84 - Tuesday, November 08, 2011 - link

    Thanks for an interesting article. I like that there's also focus on more budget-oriented systems. Not everyone needs high-end components like us enthusiasts. Though I find it difficult to recommend building budget systems when the prebuilt ones can be found so cheap. At work I've introduced the Vostro 460 basic config, and we are now up to 7 in total. They've been on offer for a long time now. We buy the basic config, which in the US costs $469 and includes a Core i5-2400 quad-core Sandy Bridge CPU, H67 chipset, 2 GB memory, 320 GB harddrive, GBit LAN, Win7Pro, an HDMI connector and Intel HD Graphics 2000. It's quiet, very cheap and it's a decent-looking case. Only 2 GB extra memory is needed, and if used with two digital monitors, a cheap graphics card. With such a fast system, it feels very bottlenecked by not having an SSD. If only they would add a DisplayPort and change the harddrive to a 64 GB SSD, this would be the perfect system.

    Esben
    Copenhagen, Denmark
    Reply
  • Halnerd - Tuesday, November 08, 2011 - link

    Why would you compare the APU ($70) vs a Celeron ($60) + HD 5670 ($70)? The Celeron has a built-in GPU, right? The APU would be expected to perform markedly slower than a CPU plus discrete GPU. You need to run both the APU and the Celeron with and without the 5670 for a proper comparison. This section of the article is ridiculously lopsided.

    Reply
  • frozentundra123456 - Tuesday, November 08, 2011 - link

    Well, to be fair, he did test the Athlon with a discrete GPU too. He did basically say that the integrated GPU on the Celeron was worthless for gaming except in the very undemanding L4D2.

    However, I agree with you in that I would have liked to see a comparison between the HD2000 of the Celeron and whatever integrated graphics the Athlon had (HD 4200??).

    Maybe test some older, non-demanding games, or whether the integrated GPUs could handle 1080p video or Netflix streaming, or whatever. Or at least address whether the APU of the Llano offered any improvement over the other two integrated solutions (beyond gaming) for the typical uses of such a budget system.
    Reply
  • Wierdo - Tuesday, November 08, 2011 - link

    Yeah, makes no sense; the article's writing is so unfocused, the goalposts keep moving all over the place. For example, showing benches of the Llano product vs the X2 product + video card, but then in the build section using integrated motherboard video with the X2, wth.

    Interesting collection of info if you ignore the goal of the article though.
    Reply
  • Halnerd - Tuesday, November 08, 2011 - link

    Yeah, it would have definitely been more informative had they run all three CPUs with and without a discrete GPU. That would have given us a good idea of the full capabilities of each, whether for HTPC or gaming. As it stands the test methodology is bunk. Reply
  • Taft12 - Tuesday, November 08, 2011 - link

    Agreed, it would allow a conclusion that points out the tradeoffs made between the 3 platforms. None are truly better or worse than the others, just different strengths and weaknesses.

    The article was actually terrific until the gaming benchmarks. That's when things went off the rails.
    Reply
  • Halnerd - Tuesday, November 08, 2011 - link

    I would really like to see some analysis of the socket FM1 Athlon II X4 631. That is a really interesting product at around the same price point. It might be a winner if we are going to look at discrete GPU solutions. A full review of the 631 chip (or any other FM1 Athlons you can get your hands on, i.e. the 641?) would be very awesome. Reply
  • buildingblock - Tuesday, November 08, 2011 - link

    I don't see how the 2.6GHz X4 631 can ever even begin to be a winner against the Intel opposition. It is no more than an A6-3650 with the graphics unit disabled. My local hardware dealer is listing it at around 10% more than the 2.8GHz Intel G840, which easily outperforms it and has a GPU. The budget end of the market is dominated by cheap Intel H61 motherboards and the socket 1155 Pentium G series, and the X4 631 brings nothing whatsoever to the table that's going to change that. Reply
  • Taft12 - Wednesday, November 09, 2011 - link

    What analysis do you need to see? It is an A6-3650 with the graphics lopped off. If you want to know how it would do in CPU benchmarks, just look at a A6-3650 review.

    Despite the GPU sacrifice, you get no TDP savings. It's a piece of shit through and through, not worthy of discussion.
    Reply
  • mhahnheuser - Friday, November 11, 2011 - link

    ...you are spot on. But I can tell you why. Because if he ran the right discrete card in the Llano it would crossfire with it and then there would be absolutely no point to the comparison. Reply
  • mhahnheuser - Friday, November 11, 2011 - link

    ...so the conclusion should have been: don't buy the Celeron over Llano unless you add a fast discrete DX-compatible GPU. He only tested the X2 so that he could validate the comparison to Llano by building a higher-performing, higher-cost Celeron system. The X2 is an old-generation processor, whereas the Celeron tested is in Intel's SB family... how is that a basis for fair comparison? Reply
  • mhahnheuser - Friday, November 11, 2011 - link

    Good observation. I second your opinion. Reply
  • Wierdo - Tuesday, November 08, 2011 - link

    Maybe I'm missing something, but how come the performance difference between the X2 at 3.0GHz and the A4 at 2.5GHz is so big?

    They're pretty much the same core, give or take a few tweaks and an added GPU block, right? I don't understand how a 500MHz drop can lead to 30 -> 19 sec in PPT to PDF conversion, for example.

    What am I overlooking here?
    Reply
  • slayernine - Tuesday, November 08, 2011 - link

    The A4 is a different processor; it is not from the same line as the X2 and thus performs quite differently. Also, 500MHz can make a huge difference in single-threaded applications. For example, if you tried to play back a 1080p video without GPU acceleration (relying entirely on the CPU), the A4 would stutter at 2.5GHz while the X2 should be OK at 3GHz. However, in reality the A4 is a much more well-rounded processor that allows light graphical capabilities for gaming and video performance.

    Also, some might point out that a significant portion of the A4 chip is dedicated to Radeon cores, thus limiting the CPU portion through purpose-built design.
    Reply
  • Taft12 - Tuesday, November 08, 2011 - link

    Wow, this is really coming out of your ass.

    The CPU part of Llano *IS* derived from good ol' K10 - Llano is/was referred to as K10.5

    It DOESN'T perform "quite differently", Anand found in the A8-3850 review that its performance was quite close to the Athlon II X4:

    <i>Although AMD has tweaked the A8's cores, the 2.9GHz 3850 performs a lot like a 3.1GHz Athlon II X4. You are getting more performance at a lower clock frequency, but not a lot more.</i>
    Reply
  • slayernine - Tuesday, November 08, 2011 - link

    It offers different features and thus is quite a different type of processor. One of those differences is the amount of die area dedicated to actual CPU functionality. I didn't say the CPU portion is built differently; I am fully aware it is based on the same architecture. Perhaps I confused you with my choice of words.

    FYI Taft12, we are talking about the X2 3.0GHz vs. the A4, not the Athlon II X4 vs. the A8. The reason this matters is that the X2 3.0GHz offers better CPU performance than the A4.

    Summary: Clock for clock the X2 isn't much different from the A4, but the A4 runs at a lower clock speed and is thus slower at CPU-intensive tasks, because half of the damn thing is a GPU! APUs at the same price point will generally be slower than the competing CPU. So perhaps the simple answer to Wierdo's question is: "Clock speed matters."
    Reply
  • Wierdo - Tuesday, November 08, 2011 - link

    Hmm... I don't see how a 500MHz difference causes super-linear scaling in performance. 2.5 vs. 3GHz is less than a 20% difference, so one would think it shouldn't be more than a 20% performance difference. With the cores being from the same family (K10) for both products, I'm missing how the die-area balance can affect non-multimedia workloads.

    It's quite interesting from an academic perspective. If it were primarily limited to graphics-type workloads I could understand it, but I don't see it for stuff like PPT->PDF conversions, for example.
    Reply
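To put rough numbers on the scaling question in this exchange: the clock delta alone predicts at most a ~20% speedup, while the conversion times quoted earlier in the thread imply closer to 60%, so something beyond frequency must be contributing. A quick sanity check of that arithmetic (the times and clocks are the figures quoted in these comments):

```python
# Sanity check: clock speed alone can't explain the reported gap.
# Figures come from the comments above (2.5GHz A4 vs 3.0GHz X2,
# ~30s vs ~19s for a PPT->PDF conversion).
a4_clock, x2_clock = 2.5, 3.0   # GHz
a4_time, x2_time = 30.0, 19.0   # seconds

clock_speedup = x2_clock / a4_clock    # 1.20 -> at most ~20% faster
observed_speedup = a4_time / x2_time   # ~1.58 -> ~58% faster

print(f"clock-only prediction: {clock_speedup:.2f}x")
print(f"observed speedup:      {observed_speedup:.2f}x")
# The gap suggests other factors (e.g. the A4's halved L2 cache,
# as noted later in the thread) are in play.
```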
  • buildingblock - Tuesday, November 08, 2011 - link

    We now have the curious situation where AMD is selling both the A6 3650 APU and the Athlon II X4 631 for socket FM1, the latter being the same unit with the graphics unit disabled. Because of the design constraints of Llano, and I suspect because the die-shrink to 32nm didn't really work out that well, the CPU part of the current Llano range is puny compared to the socket 1155 processors, even the low-end budget Gxxx range. At my local hardware dealer, the X4 631 is priced higher than the Intel G-series equivalent, but that seems to be the theme of AMD's current APU/CPU offerings - uncompetitive performance and uncompetitive pricing. Reply
  • Iketh - Tuesday, November 08, 2011 - link

    You have the 500mhz difference and also the A4 has half the L2 cache of the X2. 1MB of L2 cache with no L3 cache is anemic.

    Ignore slayernine, he's a babbling idiot.
    Reply
  • Wierdo - Wednesday, November 09, 2011 - link

    Ah, if the cache structure is different, then I could see one potential reason for variation in same-core performance. Thanks, I didn't spot that. Reply
  • slayernine - Tuesday, November 08, 2011 - link

    May I suggest an interesting alternative build that costs a bit more but is still within reach of most budgets. This system build is very tiny, good for those with limited space or in want of a portable machine:

    AMD A8-3850 2.9GHz $139
    ASRock A75M-ITX $94
    G.SKILL Ripjaws Series 8GB $34
    XFX HD-667X-ZHF3 6670 $83 (not including $25 MIR)
    SILVERSTONE SG05BB-450 (incl 450w PS) $129
    Crucial M4 CT064M4SSD2 64GB $119

    This system is tiny and takes advantage of AMD's Dual Graphics between the onboard GPU and the 6670. I normally shop at NCIX.ca, but I bought this system from NEWEGG.ca because they actually had AMD Mini-ITX boards. Please note these are Canadian prices as well. I would suggest a Momentus XT 500GB drive for this system if it weren't for the insane prices right now. In this build I'm actually not purchasing a new HD; I'm reusing a 60GB OCZ that I just got back from RMA. The RMA business is a big reason why I don't recommend OCZ; Intel and other brands are so much more reliable.
    Reply
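For anyone tallying the build above, the listed components sum as follows. Prices are the Canadian-dollar figures quoted in the comment, before the $25 mail-in rebate and excluding an OS:

```python
# Total the parts list quoted above (Canadian dollars, pre-rebate).
parts = {
    "AMD A8-3850 2.9GHz":      139,
    "ASRock A75M-ITX":          94,
    "G.SKILL Ripjaws 8GB":      34,
    "XFX HD-667X-ZHF3 6670":    83,  # before the $25 MIR
    "SILVERSTONE SG05BB-450":  129,
    "Crucial M4 64GB SSD":     119,
}
total = sum(parts.values())
print(f"Total: ${total} CAD")           # pre-rebate total
print(f"After $25 MIR: ${total - 25}")
```

That roughly $600 CAD total is the basis for the follow-up comment that this build costs about double the sub-$400 systems in the article.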
  • A5 - Tuesday, November 08, 2011 - link

    Your system costs double those in this article (once you take out the Win7 license). Also, the A8 is a waste if you're going to use a dGPU anyway. Reply
  • slayernine - Tuesday, November 08, 2011 - link

    1. Canadian prices are higher than American ones, e.g. a $60 mobo turns into a $90 mobo. This is not a currency-value issue; it's more that once things cross the border they magically cost more.

    2. The A8 processor is not a waste if you know about Dual Graphics. You effectively get a 6690D2, which offers performance similar to the 6770 without paying much more in money or power usage.

    Educate yourself on dual graphics (sorry for the non anandtech link):

    http://www.amd.com/us/products/technologies/dual-g...

    http://www.tomshardware.com/reviews/amd-a8-3850-ll...

    3. I think $400 is not enough to spend on a system even if it is a budget computer. Also I did forget about the OS as I had previously purchased one.
    Reply
  • silverblue - Tuesday, November 08, 2011 - link

    Asymmetric Crossfire is (or was... any change?) hit-and-miss. In some cases, it can actually harm performance to the point that the iGPU isn't much slower. However, in some cases, it does work very well. WoW works better, but Metro 2033 drops performance, if we consider your second link.

    The following AT link provides more data on aCF's performance (admittedly, things may have changed since then):

    http://www.anandtech.com/show/4476/amd-a83850-revi...
    Reply
  • slayernine - Tuesday, November 08, 2011 - link

    Thanks for the link; for some reason I couldn't find that article in my quick Google search. Check out this article, which actually reviews the 6690D2 configuration I've been talking about (I hate their graphs; love the AnandTech ones). Unfortunately rage3d doesn't compare enough games, but the ones it does use show 6690D2 > 6670:

    http://www.rage3d.com/reviews/video/sapphire_hd667...

    The other option I was also considering for this build was to go with Intel plus a 6770 which you can also find single slot cards for:

    http://www.newegg.ca/Product/Product.aspx?Item=N82...

    However, you will notice much higher power requirements on the 6770, and it needs a 4-pin power connector on the end of the card - something that caused me a lot of hassle when taking my 4850 out of my previous mini PC build.
    Reply
  • Paul Tarnowski - Tuesday, November 08, 2011 - link

    3. That is your choice. This is about building a budget system. When a client asks me to supply an office computer, putting in Hybrid Crossfire is not going to make them magically want to spend double. Likewise for home use, whether for the grandparents or so the little kids have something to write their homework on (they tend to play on iPads if they have them).

    Budget means that you have a low amount allotted to the project. Otherwise you miss the entire point of the article.
    Reply
  • slayernine - Tuesday, November 08, 2011 - link

    I'm looking at this from the perspective of a budget gamer. I realize that the average Joe who just surfs the web doesn't give a crap about crossfire or gaming performance.

    What I'm saying is that without breaking the bank you can get significantly improved performance with AMD's new Dual Graphics (Hybrid Crossfire, asymmetric Crossfire, whatever else people want to call it). Also note that some games see this benefit more than others, so it depends what you play.
    Reply
  • Hubb1e - Tuesday, November 08, 2011 - link

    I really don't understand why these Llano chips sport such low clock speeds. Llano has been shown to overclock past 3.5GHz at stock voltage, and yet they sell this A4 at 2.5GHz. Wouldn't this be a better chip at 3GHz? I suppose it would use more power at 3GHz, but not THAT much more. I have a hard time justifying a 2.5GHz Phenom II-speed chip, but at 3GHz and above, coupled with the 160 Radeon cores, this would be a decent performer. I just don't understand it at all. The chips will run faster, and on a desktop who cares about an extra 5W at load? I'm just confused as hell. It feels like AMD is shooting themselves in the foot. The Llano chips could be really good with some extra MHz. Reply
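The "not THAT much more power" intuition in the comment above can be roughed out with the classic CMOS approximation that dynamic power scales with frequency times voltage squared (P ∝ f·V²). This is a back-of-envelope sketch, not a measured figure, and the 5% voltage bump in the second example is a placeholder assumption:

```python
def dynamic_power_ratio(f_new, f_old, v_new=1.0, v_old=1.0):
    """Classic CMOS rule of thumb: dynamic power ~ f * V^2.

    With equal voltages (an overclock at stock voltage, as the
    comment describes), the ratio reduces to the frequency ratio.
    """
    return (f_new / f_old) * (v_new / v_old) ** 2

# 2.5GHz -> 3.0GHz at stock voltage: ~20% more dynamic power.
print(f"{dynamic_power_ratio(3.0, 2.5):.2f}x")

# If the same bump hypothetically needed a 5% voltage increase:
print(f"{dynamic_power_ratio(3.0, 2.5, 1.05, 1.00):.2f}x")
```

Even with a modest voltage bump, the model predicts only about a 20-32% rise in dynamic power, which at desktop TDPs is broadly consistent with the commenter's "extra 5W at load" framing.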
  • Prosthetic Head - Tuesday, November 08, 2011 - link

    The gaming section compares the Fusion part to a reasonable discrete GPU in a way I think is slightly misleading. If the competing systems are to be upgraded by adding a discrete GPU, then the Fusion system should also be upgraded, either with a discrete GPU in Hybrid Crossfire (or whatever they are calling it now) or to the top-end A8 part. Otherwise you are comparing substantially more expensive systems to the low-end Fusion system, at a price point where very little money goes a long way in terms of upgrades.

    Other than that a good article with some sensible recommendations, thanks.
    Reply
  • just4U - Tuesday, November 08, 2011 - link

    I was disgusted at recent pricing on hard drives. Seagate has been low man on the totem pole for some time, and their drives typically ran for $39 (500GB) and $54 (1TB) here in Canada. Now though? HAH - $99/$130 respectively. I've been meaning to send a 500GB WD Black in for RMA, and at these prices I certainly will (it's at $140, GAH!!!!)

    I tell you, SSDs don't look quite so expensive right now.
    Reply
  • Taft12 - Wednesday, November 09, 2011 - link

    Good lord man, read a news site once in a while! Reply
  • mino - Wednesday, November 09, 2011 - link

    What do you mean by that?

    THE ARTICLE was published with _completely_ obsolete HDD prices. Now you bitch at a reader for being "uninformed" when he notices it?
    Yeah, sure.

    With those HDD prices OEM systems start looking all the more appealing ...
    Reply
  • bgclevenger - Wednesday, November 09, 2011 - link

    A lot of good info here concerning hardware choices, but what about software? Since these are budget builds and not gaming machines, install Linux instead of Windows and save $100. Reply
  • scubba85 - Wednesday, November 09, 2011 - link

    $70 for an HDD? Reply
  • mino - Wednesday, November 09, 2011 - link

    more like $100 these days ... :GRRR Reply
  • madmachinist2 - Thursday, November 10, 2011 - link

    Were the power consumption measurements taken with the discrete GPUs installed in the Celeron and Athlon systems? If not, is it possible to find out what the correct power consumption figures would be? Reply
  • Schmich - Thursday, November 10, 2011 - link

    I know that gaming was covered, but in my opinion Linux should be mentioned as well. For those who don't game, using something like Ubuntu would be a lot better. Reply
  • countdooku2028 - Friday, November 11, 2011 - link

    I just decided to build a new computer for my workplace. I saw that Microcenter was giving me the motherboard with the purchase. Let me tell you I am extremely satisfied with my purchase. I opted for the ASUS motherboard.

    I put in 4GB of Corsair RAM and my old P128 SSD. This machine is incredible for my needs..... we do a ton of Corel Draw layouts for our trophy shop.

    Only thing missing is a dedicated video card...... once a 560 GTX or 6870 goes way down I will slap it in there. Oh and I will unlock the additional cores when I have the proper cooling.

    Oh, and my old machine was a Dell Vostro 200 into which I had dumped an old Raptor drive. It was zippier than the original 7200RPM drive...... but now my machine kicks ass.

    Peace.
    Reply
  • flpxjs - Saturday, November 12, 2011 - link

    H61 1156?
    Too funny!
    Reply
  • TGIM824 - Sunday, November 13, 2011 - link

    The table for Intel motherboards lists socket as 1156, and should be 1155. Reply
  • Artas1984 - Friday, November 25, 2011 - link

    Intel Celeron G530
    Gigabyte GA-H61M-S2V-B3
    Kingston Value DDR3 1333MHz C9 4GB
    Kingston SSDNow V100 64GB
    Tacens Radix 4 450W

    Advantages: cheap build, the MB has DVI, and the power supply is 10dBA and 80+ E.

    I do not recommend either Antec or Corsair power supplies: they are too expensive for a budget system, they are not silent, and they are less than 80+ E.
    Reply
  • Saikii - Thursday, June 12, 2014 - link

    I think Intel should have changed the name of their budget CPUs, Celeron. Most people still think Celerons are crap, but the newer Sandy Bridge-based dual-core Celerons are actually great. They are very good CPUs for the money. I have an Intel Celeron G530 (2.40GHz, dual core, 2MB L3 cache) paired with a GTX 650 (1GB, OC edition), 8GB RAM, and a 450W PSU (don't know the model). All I can say is that I'm very happy with this budget rig. It performs great in everything... from regular tasks to intensive gaming. I tried many games, from older titles such as WoW to Crysis 3 (on low -> 30-35 fps). I get decent FPS in ANY game on medium-high settings. Reply
