
  • mikk - Thursday, May 29, 2014 - link

    The G1820 is missing, or at least a cheap Haswell Pentium.
  • hojnikb - Thursday, May 29, 2014 - link

    Yes, that would be really great, since those chips are priced about the same.
  • jospoortvliet - Sunday, June 01, 2014 - link

    These do use far more power.

    On that note, why on earth did the reviewer compare power usage over idle (never seen that particular metric at AnandTech?!?) while not mentioning the idle power? (According to various other sites, the AMDs sport significantly lower idle power.) I don't like to think so, but this is probably the only power metric that makes the Atoms look remotely good... Why was it chosen?
  • savagemike - Thursday, May 29, 2014 - link

    I agree completely. If I were building a budget desktop right now, that is exactly the chip (or similar) which I'd be comparing these to.
  • MikeMurphy - Thursday, May 29, 2014 - link

    I can buy a G3220 Haswell Pentium running at 3.0GHz for $60. I was really hoping this would make it into this review!
  • Stuka87 - Thursday, May 29, 2014 - link

    The G3220 is a 53W chip. These are 25W chips. They do not compete with each other.
  • HisDivineOrder - Friday, May 30, 2014 - link

    Atom chips are 10W chips. These Semprons are 25W. They do not compete with each other.

    See how that doesn't impact the fact that people are talking about more than just wattage? ;) Some people just want to know what the best VALUE is per dollar and these low end options are all in the running.

    Why limit yourself to discussing only wattage-appropriate parts? Especially when these Semprons already draw over twice the watts of the Atom chips.
  • bsim500 - Friday, May 30, 2014 - link

    "The G3220 is a 53W chip. These are 25W chips. They do not compete with each other."

    Intel's TDP is way overstated on its dual cores. My "55W" i3 pulls about 32W in reality (measured at the wall, not calculated). I've seen Haswell Pentiums that are sub-30W (full speed, not the slow "T" variants). They are very definitely in the same bracket. In fact, at stock 3.4GHz with a -0.15V undervolt, I can get my "77W" i5-3570 down from a measured 59W (1.1V) to around 47W (0.95V). At 3.0GHz at 0.83V, you're looking at 36W 4T / 25W 2T (for an i5). AMD's Kabinis are still on 28nm vs Intel's 22nm, and you'd be surprised just how low you can go with undervolting the latter's "big cores".
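    These undervolt figures can be sanity-checked against the textbook CMOS dynamic-power relation (power roughly proportional to f·V²). A rough sketch only, since leakage and uncore power are ignored:

```python
# Ballpark check of the undervolting numbers quoted above, using the
# CMOS dynamic-power approximation P ~ f * V^2. Leakage and uncore
# power are ignored, so this gives only a rough lower estimate.

def scaled_power(p_measured, v_old, v_new, f_old=1.0, f_new=1.0):
    """Estimate package power after a voltage/frequency change."""
    return p_measured * (f_new / f_old) * (v_new / v_old) ** 2

# i5-3570 at stock 3.4GHz: measured 59W at 1.10V, undervolted to 0.95V
print(round(scaled_power(59, 1.10, 0.95), 1))  # 44.0
```

    The model predicts about 44W against the roughly 47W the commenter measured at the wall; the gap is about what you would expect leakage and uncore power to contribute.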
  • silverblue - Friday, May 30, 2014 - link

    The point is moot as AMD is known for overvolting its processors; an article on Kabini would be very interesting.
  • lyeoh - Sunday, June 01, 2014 - link

    Which is why this article needs some actual power consumption benchmarks.
  • haardrr - Saturday, May 31, 2014 - link

    But the i7-4765T is a 35-watt processor... does that mean that the i7-4765T competes with the 5350?
  • Namisecond - Monday, June 09, 2014 - link

    They are a lot closer in power usage than you might think, both idle and loaded.
  • BMNify - Saturday, May 31, 2014 - link

    Why would you use a dual core anything today?! And these chips do not have AVX SIMD, so they're underpowered before you even start in 2014/15.
  • BMNify - Saturday, May 31, 2014 - link

    Also, unless you are a large org looking to finally do mass signage in bulk, these chips/SoCs are not worth it today. If you want a few web-enabled apps per SoC device, then look to, for instance, the crowdfunded modules below; now that will be worth the cash for something you actually want to buy on a whim.

    Crowdfunding: 3 incredibly tiny PC modules starting at just $15
  • silverblue - Saturday, May 31, 2014 - link

    If you're including Kabini in that statement, the Jaguar architecture does indeed support AVX. In fact, it supports pretty much everything Piledriver does.
  • BMNify - Saturday, May 31, 2014 - link

    I wasn't, but again, why settle for only old AVX when you can have AVX2 plus a free hardware encoder and decoder for when you just want a quick conversion (or to make a new, correctly time-coded, fixed-rate V/A file)? The Intel Quick Sync decoder is a HW-accelerated FFDShow decoder with video processing, before you do the real x264 high-visual-quality encode.
  • Alexey291 - Monday, June 02, 2014 - link

    Why wouldn't you?

    The fewer the cores required to deliver equivalent performance, the better.

    Besides, dual core CPUs are perfect for certain very real-world applications - such as, for example, me typing out this message on a Chromebook sporting a dual core Haswell Celeron.
  • eanazag - Tuesday, June 03, 2014 - link

    Agree. I was looking for a cheap Haswell Pentium. Maybe you can compare them in Bench.
  • DanNeely - Thursday, May 29, 2014 - link

    The graphs in this article leave a lot to be desired. The huge number of high power/performance chips only adds clutter; at most 2 or 3 representative examples from each major vendor would suffice to show these processors are very low-end performers. Alternately, use the color coding not to call out AMD vs Intel, but to highlight the 4 AMD chips being reviewed along with the J1800/J1900 Celerons they're nominally competing with.
  • Ian Cutress - Thursday, May 29, 2014 - link

    I'll duplicate the graphs (after I eat dinner!) and limit the results data to a narrow band, and offer the option to switch between both.
  • Ian Cutress - Thursday, May 29, 2014 - link

    I've adjusted most of the severe cases into graphs that are easier to read :)
  • easp - Thursday, May 29, 2014 - link

    Better, but dude, the red bars stand out, and yet they represent something other than the focus of the article. Given the color palette, I'd assume that the black bars were the least significant numbers, the background information, and yet they actually represent the focal point of the article.
  • Ian Cutress - Thursday, May 29, 2014 - link

    I originally had blue for Intel and red for AMD. Black is the only other color I can add that doesn't look odd; once graphs have three-plus colors they start to look cluttered. In the future it might be worth greying everything out and just highlighting the important points, without an Intel/AMD distinction except in the labelling.
  • edzieba - Friday, May 30, 2014 - link

    How about adding 'cores' to bars that are immediately relevant to the article (e.g. an orange line in the centre of the bars for the socketed Kabinis and the Celerons)? This would highlight the bars being compared directly, while still keeping them in the context of all the other data, and keeping the expected blue/Intel, red/AMD bar colouring intact.
  • DanNeely - Friday, May 30, 2014 - link

    Thanks, that's much better.
  • pjkenned - Thursday, May 29, 2014 - link

    Ian - good to see you got similar results to mine. The other bit is that the J1900 can be passively cooled while the AM1 chips need active coolers. That helps lower power consumption, noise and points of failure.

    I think I had benchmarks with the Raspberry Pi also - these are MUCH faster.
  • buffhr - Thursday, May 29, 2014 - link

    Would have been nice if you could have included some HD video playback (1080p/720p/3D) tests and impressions.
  • vesoljc - Thursday, May 29, 2014 - link

    I second this!
  • nirolf - Friday, May 30, 2014 - link

    Me too! Should be fine, but what about 4K?
  • BMNify - Saturday, May 31, 2014 - link

    "but what about 4K?"
    What about it? It doesn't really exist for consumers. Perhaps you mean pseudo-colour UHD 2160p, or the real-colour UHD-1 (3840 pixels wide by 2160 tall, at 10-bit or 12-bit per pixel) content used today by several ARM Cortex SoCs.
  • Eeqmcsq - Thursday, May 29, 2014 - link

    "... Results are given in minutes, where the Wii itself scores 17.53; meaning that anything above this is faster than an actual Wii..."

    I think you mean anything "below" this number, since the measurement is in time.
  • Ian Cutress - Thursday, May 29, 2014 - link

    'Above' being above on the graph. I'll just change the wording. I thought I had changed my explanatory template; guess it didn't save...
  • Metalliax - Thursday, May 29, 2014 - link

    Why can't review sites start using sub-150W PSUs for these tests? You should delete your power chart completely unless you can show actual power draw with a PSU that would actually be used in such a setup. I can't believe you think you can post such nonsense with a 1250W PSU.
  • extide - Thursday, May 29, 2014 - link

    x2! I know you have your reason for using the 1250W PSU (so it's always the same), but the result is USELESS NUMBERS. It doesn't matter if they are comparable if they are utterly useless!
  • easp - Thursday, May 29, 2014 - link

    Good call!
  • CU - Thursday, May 29, 2014 - link

    Yes, adding Haswell Celerons like the G1820 and Pentiums like the G3220 would be very helpful. Those CPUs plus motherboards cost about the same as the Athlon 5350.
  • savagemike - Thursday, May 29, 2014 - link

    Agree completely. Due to similar costs, and the fact I can't passively cool the AMD stuff anyway, these are the Intels I'd be comparing them to for a budget build.
  • azazel1024 - Thursday, May 29, 2014 - link

    I guess you can consider the IGP benchmarks for something, because I assume it means some older and more casual games would be likely to run fine, whereas with the Bay Trail IGP they might not. However, at least based on the benchmarks used, even the AMD IGP in these low-end processors is not up to the task of playable frame rates. Unless I am missing something, this was 1280x1024 at LOWEST settings and they were hitting mid-20s average frame rates. That may be playable to some, and bumping it down to 1280x720 might get you into the low 30s... but that is still barely, barely playable at best.
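    As a rough check on that resolution point: if these games are purely GPU fill-rate bound (an assumption; a CPU-bound title won't scale like this), frame rate goes inversely with pixel count:

```python
# Rough estimate of fps gained by dropping from 1280x1024 to 1280x720,
# assuming fps scales inversely with pixel count (i.e. the GPU, not the
# CPU, is the bottleneck - an assumption, not a measured result).

def fps_at_resolution(fps_measured, old_res, new_res):
    old_w, old_h = old_res
    new_w, new_h = new_res
    return fps_measured * (old_w * old_h) / (new_w * new_h)

print(round(fps_at_resolution(25, (1280, 1024), (1280, 720))))  # 36
```

    Mid-20s at 1280x1024 scaling to the mid-30s at 720p is consistent with the "low 30s" guess above, give or take.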

    Which makes me wonder... what would a Pentium or Celeron Haswell, which might cost less, manage in terms of performance? The IGP is a fair amount more capable, especially the ones with HD 4400 on there... which is MASSIVELY more performance than either Bay Trail or Kabini at the same price. Granted, higher TDP, but based on what I know of the actual power consumption of the processors, a Haswell Celeron/Pentium might only be a literal handful of watts more (5-10) under load and probably around the same at idle.

    Low end board + low end Haswell processor runs ~$100 combined...same price as the high end Kabini and WAYYYYYY more performance.
  • ozzuneoj86 - Thursday, May 29, 2014 - link

    I don't really understand the inclusion of so many recent high end chips. For future reviews of this nature you could probably cut out all but one mid\high end CPU from each vendor, and you'd save a huge amount of benchmarking time.

    Along with that, it doesn't seem like I've really learned much from many of the benchmarks, since all we have to compare are high end, high TDP CPUs from the past 3 years, and these AM1 Kabini chips... as expected, the Kabinis are all similar in performance (with slight increases from least to most expensive), and the high end CPUs are significantly faster. Readers probably already knew that without the benchmarks though.

    It'd be far more useful to compare these to older CPUs and their low wattage variants, such as Core 2 Duos and Quads, AM2\AM3 Athlon X2 and X4 CPUs, as well as other low power chips that are common these days, like the Ivy Bridge based Celeron 1037u, Sandy Bridge Celeron 847, AMD E-350... even an old Atom (from the Ion platform era) would be helpful for those upgrading aging HTPCs. Not to mention any of the direct competition for these chips, like the Ivy\Haswell Celerons and Pentiums.

    It'd also be nice to have something to compare the dedicated GPU results to... like the Celeron J1900, or any of the CPUs mentioned above. There are many J1800\J1900 boards out there that can have a dedicated GPU installed.

    Also, as others have mentioned, the power consumption numbers with such a high wattage PSU aren't very helpful either.

    If it's a matter of not having the hardware on hand, we're talking about low-end hardware... you could probably get a decent example of almost everything I mentioned for a few hundred dollars total. Besides, I'm sure readers would gladly donate some hardware if it meant having it used in benchmarks for a few years.

    Thanks for the article, I just feel like it could have been much more focused on the products being reviewed and how they compare to the competition.
  • ozzuneoj86 - Thursday, May 29, 2014 - link

    Just remembered something else...

    The game benchmarks need to include older games. One or two examples of recent games that are nearly unplayable would be enough to let us know what we can't do with these... but a few old games, or even highly popular current games that don't require much power would be far more likely to influence someone's buying decisions.

    Will it run Minecraft at 1080P? How about 720P? How about a few games from 4-5 years ago?
  • Ian Cutress - Thursday, May 29, 2014 - link

    I've taken test data from all of my old testing as well, where I've run the same benchmarks on the same OS and SSD, hence there are many data points to choose from. I have adjusted several of the graphs to show a narrower band of data, so the differences are easier to see now. Unfortunately the even older data (pre-Core 2) is before my time at AnandTech.

    Regarding the J1800/J1900 motherboards, the two that I hurried in for testing were unfortunately limited in the GPU aspect, and a third one I have since received has the same limitation. Due to the hurried nature of getting the data out for the initial release (as well as other testing on hand), I had perhaps wrongly assumed that all J1800/J1900 motherboards were the same in this respect.

    I am shifting my test stations around somewhat this week, so when I come back from Computex I will have more of a low power/DRAM testing setup alongside the higher power systems I normally test. If you want to see anything specific, please feel free to email.
  • edwpang - Friday, May 30, 2014 - link

    It definitely makes better sense than the current review. As someone has already said, using a 1250W PSU on this low-end setup is fairly useless.
  • piroroadkill - Thursday, May 29, 2014 - link

    The TrueCrypt link seems like a bad idea right now, since the official TrueCrypt site is in a terrible state of limbo where nobody can figure out whether it's been discontinued by the devs or hacked. Benchmarks for 7.1a are relevant, but 7.2 is a gutted, useless pile of crap. Just saying.
  • A5 - Thursday, May 29, 2014 - link

    It's a good thing the bench is 7.1a then?
  • Ian Cutress - Thursday, May 29, 2014 - link

    Ha! I thought about taking the data out, given that I had already uploaded almost all of it before that announcement was made. However, 7.1a is still viable and I still have the installer, so it might still be relevant as long as the installer still floats around in cyberspace. I somehow doubt we will ever get a full explanation from the developers on why they took it down, though there are many theories about it.
  • Runamok81 - Thursday, May 29, 2014 - link

    Typo, second to last sentence: platgorm
  • someeeguy - Thursday, May 29, 2014 - link

    Ian, in the "dGPU Benchmarks with ASUS HD7970" portion of your review, it would have been interesting to see some Mantle results on these low power CPUs.
  • JBVertexx - Thursday, May 29, 2014 - link

    I think the value in having a socket solution is less about providing an upgrade path and more about lower carrying costs in the entire supply chain.

    If you look at having 4 CPU options across, let's say, 4 motherboard options, a BGA solution means that you need to source and stock 16 different items. With a socket solution, that cuts your inventory and carrying cost down to 8 items.

    The economics of this are huge. It impacts motherboard manufacturers, system builders, and businesses. It impacts the amount of up front investment required by every organization in the supply chain, and it impacts the inventory costs (or carrying capital).

    It especially impacts motherboard manufacturers, who must actually purchase the CPU in a BGA solution in order to sell a motherboard.

    In the face of those compelling economics, having an upgrade path is really small potatoes.
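    The SKU arithmetic in this comment is easy to make concrete; a toy sketch using the commenter's hypothetical 4 CPUs and 4 boards:

```python
# BGA vs socket: how many distinct stock items the supply chain carries.
cpus, boards = 4, 4

bga_skus = cpus * boards     # soldered-down: every CPU/board pairing is its own product
socket_skus = cpus + boards  # socketed: CPUs and boards stocked independently

print(bga_skus, socket_skus)  # 16 8
```

    The gap widens multiplicatively as either product line grows, which is the carrying-cost argument above in a nutshell.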
  • marvee - Thursday, May 29, 2014 - link

    The understanding of those economics could come from Rory Read's experience during his time with Lenovo.
  • Hrel - Thursday, May 29, 2014 - link

    Pretty disappointing you guys didn't include a CPU with HD 4600 on it in the gaming benchmarks. Why compare to last generation's hardware? Perhaps to show AMD in as favorable a light as possible? Hmmmm.....
  • Silver47 - Thursday, May 29, 2014 - link

    They are in the graphs, what are you smoking?
  • Novaguy - Thursday, May 29, 2014 - link

    I think using a weaker card for the dgpu would also have been interesting. Maybe something like an r7 270, 7750 or 7770.

    Also, what about Mantle benchmarks?
  • Novaguy - Thursday, May 29, 2014 - link

    I meant r7 250, not 270... can't seem to edit via mobile.
  • V900 - Friday, May 30, 2014 - link

    No, no Mantle benchmarks... It's useless, and only serves to clutter the article.

    Hardly anybody cared about Mantle when it was announced... And now that we know the next version of Direct X is coming, the only people that care the slightest, are a handful of AMD fanbois.
  • silverblue - Friday, May 30, 2014 - link

    You're entitled to your opinion. Let's look at it this way - DirectX 12 won't be here for another 18 months. Also, Mantle has been shown to perform better than AMD's own implementation of DX11 (and sometimes faster than NV's as well) and also helps with lower performing CPUs.

    The following link only shows the one game, but it should be enough to highlight the potential benefit of Mantle on a comparatively weak architecture such as Jaguar:

    If you didn't need to upgrade your CPU to play the latest and greatest, I think you'd care, too.
  • formulav8 - Friday, May 30, 2014 - link

    If Mantle was from NVidia or Intel, you would be a bigger stooge for them than anyone is for AMD.
  • Gauner - Thursday, May 29, 2014 - link

    I think it would be far more useful for these weak systems to have a different set of game tests.

    No one will buy one of these to play Tomb Raider or BioShock, but I'm guessing some people would consider buying one if you could play, for example, League of Legends or Dota 2 at minimum settings and 720/1080p.

    I understand it would mean an extra effort but right now the gaming tests don't really tell anything useful nor do they add useful information to other reviews by having an extra comparison.

    Just taking into account Steam games (since they are the easiest ones to get data about), the top 5 most played games daily always include Dota 2, CS:GO and TF2. Those 3 games used to run (poorly: 800x600, low detail, 20-30fps) on the old Intel GMA 4500M, so in my opinion they have a lot of people potentially interested in them and would run well enough on low-power systems to give a useful comparison between chips.
  • res057 - Thursday, May 29, 2014 - link

    Tests with high-end games and CPU-intensive tasks for processors such as these are like Car and Driver putting a Prius through a quarter-mile test and comparing the results with a Corvette. Pointless.
  • Gauner - Friday, May 30, 2014 - link

    I can understand the logic behind the CPU-intensive tasks: even if you won't use this kind of CPU to compress video with x264, you can look at the results and get a realistic estimate of the performance difference, because the test is the same for both weak CPUs and high-end CPUs.

    The problem with the gaming tests is that the comparison uses settings completely different from the high-end tests, so there is no parallel there, and a set of games that no one would think to play on weak systems, so no useful info is given.
    I can't look at the results and say "since Tomb Raider at 1280x1024 with low graphics ran at 20fps, I guess League of Legends will run at 1080p with medium graphics at 35fps"; there are too many differences to make that comparison (AAA engine vs engine designed for low-power computers, different resolution, completely different poly count and texture sizes, ...).
  • takeship - Friday, May 30, 2014 - link

    I'm curious why there's no comparison with the old E-450/E2-2000 Bobcat chips. Not so much for the CPU performance, but for the GPU improvements going to GCN.
  • serpretetsky - Thursday, May 29, 2014 - link

    I'm not sure I understand the power chart on page 2. Is the title correct? Power difference? So the numbers we are seeing are power differentials between idle and load, and not absolute values?
  • casteve - Friday, May 30, 2014 - link

    Ian uses a 1250W PSU in this setup, so absolute values are pretty meaningless when your system idles down in the low-efficiency (and high-slope) part of the power supply's curve. The delta power at least provides an idea of what's going on.
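    To illustrate the point (the efficiency figures below are invented, but shaped like a typical oversized PSU's low-load curve):

```python
# Why absolute wall readings mislead with a 1250W PSU on a ~15W system:
# wall power = DC power / efficiency, and efficiency collapses at the
# bottom of an oversized supply's load curve. All efficiency numbers
# here are illustrative assumptions, not measurements.

def wall_power(dc_watts, efficiency):
    return dc_watts / efficiency

idle_dc, load_dc = 15.0, 60.0  # hypothetical DC-side system draw

# Oversized 1250W unit (guessed: ~60% efficient at 15W, ~75% at 60W)
big_idle, big_load = wall_power(idle_dc, 0.60), wall_power(load_dc, 0.75)

# Right-sized 150W unit (guessed: ~82% at 15W, ~88% at 60W)
small_idle, small_load = wall_power(idle_dc, 0.82), wall_power(load_dc, 0.88)

# Absolute wall readings differ wildly between the two supplies, but the
# load-minus-idle delta is much closer - which is what the review reports.
print(round(big_idle, 1), round(small_idle, 1))                          # 25.0 18.3
print(round(big_load - big_idle, 1), round(small_load - small_idle, 1))  # 55.0 49.9
```

    The idle readings differ by nearly 7W between supplies under these made-up curves, while the deltas land within about 5W of each other; that is the rationale for reporting delta power, even if it hides the idle figure readers want.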
  • jospoortvliet - Sunday, June 01, 2014 - link

    Yet idle power is quite important - CPUs are idle most of the time...
  • coburn_c - Thursday, May 29, 2014 - link

    There's your future of AMD. Scaling up Jaguar will fix their core/module problems and improve their perf/watt. There is certainly no future in their big cores.
  • rootheday3 - Thursday, May 29, 2014 - link

    Ivy Bridge GT1 was 6 EUs; Bay Trail is 4 EUs.
  • jvp - Friday, May 30, 2014 - link

    What is missing in this review is that some Intel processors have stripped-down instruction sets, like the Celeron J1900 with which the Athlon 5350 is compared as a direct competitor.

    I'm also missing tests of performance when using virtual machines, and of hardware-accelerated encryption. These are areas that are becoming more and more important for systems.
  • R3MF - Friday, May 30, 2014 - link

    This is lovely, thanks, but how long before we get Beema in socket AM1?
  • R3MF - Friday, May 30, 2014 - link

    In addition, do I understand correctly that:
    Kabini is 28nm TSMC, whereas
    Beema is 28nm GF?
  • plonk420 - Friday, May 30, 2014 - link

    Thank you, thank you, THANK YOU for including the i3-4300 series on the dGPU page! I've been wondering how they stand up for gaming on a budget, as I've helped do 2 builds in the last couple of months for the first time in years!
  • Icehawk - Friday, May 30, 2014 - link

    I'm with the prior posters - testing these low-end machines with current games is all well and good, but also unlikely to reflect real life. Can these handle games I'd be more likely to want to play on a low-power device, like TF2 or Diablo 3? NO CLUE.
  • tarqsharq - Friday, May 30, 2014 - link

    Or League of Legends.... I have a friend whose little brother would benefit from a machine that could run League without barfing.

    Low budget, bench some free to play games!
  • HighTech4US - Friday, May 30, 2014 - link

    I have no need for any of these underpowered CPUs from either Intel or AMD. As shown in the x264 tests, the Intel Q8400 blows away all these low-end parts. My main need is video encoding. Yes, I know the Q8400 is a 95-watt part, but I recently spent $34 total to upgrade my 2008 system (an Asus P5K Deluxe motherboard) from a 105-watt Q6600 to a 50-watt Xeon L5420.

    $29 for the Intel L5420

    Intel specs on the L5420

    $5 for the Socket LGA 771-775 adaptor

    Intel specs on the Q6600

    For $34 I have an upgraded system: 100MHz higher speed (2.5 vs 2.4GHz), 4MB more cache (12 vs 8), a higher FSB (1333 vs 1066) and a much lower TDP (50 watts vs 105 watts), and this system puts the hurt on all these low-end parts.

    I also sold off the Q6600 on eBay for $40 so my upgrade was basically FREE.
  • ozzuneoj86 - Saturday, May 31, 2014 - link

    I've done this myself! I wouldn't necessarily recommend that most people run out and buy an obsolete motherboard, mod it, buy expensive DDR2 and then buy a dirt cheap Xeon and mod that... BUT I've modified a couple of systems like this and the performance per dollar is fantastic. I also like to use the L5240 3GHz dual core 40W Xeons. Strictly from a performance-per-dollar perspective they will stomp all over any of these modern value CPUs, and you can pick up a pair of them for $15 or less. I own six or eight of them... I lost count... I also had a pair of L5420s, but used one to upgrade an old Dell which I sold for a $300 profit. I have two dual-Xeon boards I picked up on eBay a year and a half ago for $60 each, and have two 3GHz L5240 duals in them. Performance in most cases should be similar to a desktop i3, but I only had to invest around $200 in the CPUs, coolers, motherboards and 8GB of (unbuffered) ECC DDR2.

    These systems don't sip power by any means, so I wouldn't recommend them as file servers but they aren't bad for normal usage cases. One of my Dual Xeon systems is my living room gaming\htpc, coupled with a 350W Seasonic 80plus PSU I got on ebay for $15 and a Radeon HD 7750 I got on sale for $40... it's really quite a powerful system for how inexpensive it was.
  • casteve - Friday, May 30, 2014 - link

    Thanks for the review, Ian.

    Perhaps this would have been better covered from a more likely scenario: web browsing/office productivity, and HTPC. I can't imagine anyone using these for scientific/big gaming/video encoding setups. Might be handy as an HTPC in the den - what's the HQV score, for example? How well does it perform with lower-need Steam games? Heck, how well does it perform with Steam streaming?
  • toyotabedzrock - Friday, May 30, 2014 - link

    No tests of the hardware encoding engines?
  • haardrr - Saturday, May 31, 2014 - link

    Back in the day I was rooting for AMD, and I was pissed that AnandTech used Intel processors that were 3-4 times more expensive than AMD's...
    I kept hoping that AMD would be equal to or faster than Intel. Sadly, never!
    The closest AMD came was an OC'd FX-8350 against a stock 4570k,
    but AMD needed double the power consumption!...
    Fast forward to right now: I bought the 5150 and an ASRock AM1H-ITX. The ASRock AM1 motherboard is awesome (maybe a bit expensive, but it has DC-in and all the standard outputs; no overclocking though).
    Disappointed by the 5150, and wondering whether the 25% clock speed bump of the 5350 would fix the "slow" feel I get from the 5150, I bought the 5350 on sale at $65.

    While AMD has priced the 5350 to "theoretically" match the performance of a G3420 (5350s can't match it in normal use),

    a Celeron G1610/G1810 on Ubuntu 14.04 has better MythTV deinterlacing than the 5350 does.
    There was a time when you needed an Nvidia GPU (VDPAU) for proper deinterlacing, but with Intel's open source work the Nvidia card is now not necessary, just an asset; whereas the 5350 requires an Nvidia card (GT 630 v2, GK208)... maybe someday the AMD engineers will decide that the hardware features they build into their processors should be usable under Linux...

    I bought the 5350 thinking that it would work well with MythTV... thinking that it could handle OpenGL high quality, Yadif 2x deinterlacing, or even Greedy HighMotion 2x... it can't! The best you can hope for is Yadif, and to get that you need to install the oibaf PPA to get VDPAU!... mind you, it does use a lot less power than the OpenGL deinterlacing. The main problem is that AMD just does not care about free open source OpenGL drivers (the PS4, while "Linux", is not open, thus not open source, so AMD cares about it), so MythTV sucks on the AMD 5350. The only advantage the 5350/ASRock AM1H-ITX has is that it idles at 16 watts (laptop 19V PSU), while the MSI H61I/G3420 idles at 19 watts (TFX Bronze 300W... surprisingly efficient at low load, equal to or 2 watts better than the gold Seasonic TFX at the 20 watt level).

    The problem is that the Intel G3420 on Trusty Tahr (14.04) can deinterlace OTA HDTV 1080i with Yadif 2x... for 30 dollars more. For the 5350 system to deinterlace properly requires an Nvidia GT 640 v2 at $70.

    To sum up: I regret buying the 5150 (waste of silicon); the 5350 can be useful in some scenarios, and the ASRock AM1H-ITX (while $75, it has DC-in) is absolutely required to make the 5350 useful.
    AMD has got to stop acting like a loser (cutting back, way back, because they are a console CPU maker and perhaps a server CPU maker) and produce a CPU that is better than Intel's.
  • jospoortvliet - Sunday, June 01, 2014 - link

    Well, good to know these AMD CPUs suck for a basic MythTV box... Too bad.
  • Soul_Est - Sunday, July 27, 2014 - link

    I was looking to do the same thing under Linux using the 5350. Arch Linux + ( MythTV || TVHeadEnd ) + XBMC. Well, I'll start my HTPC planning from square one again.

    The PS4 runs FreeBSD and not Linux IIRC.
  • Soul_Est - Sunday, July 27, 2014 - link

    Wish I could edit my own replies.

    Just did the pricing on the AMD (Athlon 5350, ASUS AM1I-A) and Intel (Celeron G1820, BIOSTAR Hi-Fi B85N) components and they are similar in price. Too similar, given what may be a large performance delta. Would the G1820 handle Yadif 2x much better, do you think?
  • Soul_Est - Sunday, July 27, 2014 - link

    Given that I don't know what your complete setup is, haardrr, what do you make of this?
  • ET - Sunday, June 01, 2014 - link

    I found the colours of the graphs confusing. Not only were they inconsistent (sometimes black, sometimes not) but I think it would have been better to also give the Bay Trail CPUs their own colour.

    And by the way, why 1280x1024? For historical reasons? 720p would be more reasonable. I can't say if it will get frame rates to playable levels, but at least they'll be closer for this more real-world resolution.
  • loimlo - Sunday, June 01, 2014 - link

    Two suggestions:
    1. I suggest adding power consumption and fan noise to the review.
    2. A few comments about the Gigabyte motherboard, like BIOS and fan control, would be nice.
  • carol45 - Monday, June 02, 2014 - link

    Really nice test. Maybe it will convince me to go AMD.
  • PPB - Wednesday, June 04, 2014 - link

    No LoL or any other actually popular game benchmarks, but instead some AAA games being tested? Just wow.

    Can someone tell me with a straight face they will be buying a Kabini to play Tomb Raider? :rolleyes:
  • HeyHey!!!345 - Monday, August 18, 2014 - link

    It seems the new chips greatly bottleneck high-end graphics cards. What about mid-range? I plan to get a Sempron 3850 as a cheap build for my cousin since he's on a tight budget. Would it bottleneck that much with graphics cards, say an HD 7770 or even a GTX 750?
  • Jeris Boltsin - Monday, November 24, 2014 - link

    Athlon 5350 + GTX 750 Ti ==>
    (Considering how "powerful" the CPU is, I personally would've liked to also see 720p results, to see what kind of difference that would've made.)

    Of course, if they'd used those ASUS mobos and overclocked that 5350, the results might've been a little better in some games. And before anyone even asks: there's a video on YouTube where some dude overclocked it to 2709MHz, using an Asus AM1I-A and the stock heatsink and fan.

    Some other people have managed to go up to 3 Ghz with 5350s, using water cooling - but "money doesn't grow on trees" and all that.
