  • meacupla - Wednesday, April 09, 2014 - link

    oh, god... semiporn is back...
  • ViRGE - Wednesday, April 09, 2014 - link

    There's a Duron joke in here somewhere...
  • Death666Angel - Wednesday, April 09, 2014 - link

    The good old days for AMD. That's when I got into PC stuff. Good times.
  • nathanddrews - Wednesday, April 09, 2014 - link

    Please tell me you are going to pair Kabini with 295x2. LOL
  • scottjames_12 - Wednesday, April 09, 2014 - link

    That WOULD be interesting! See how starved the monster GPU gets with an anemic CPU.
  • nathanddrews - Wednesday, April 09, 2014 - link

    I know that people are always trying to pair appropriate CPUs and GPUs together to avoid bottlenecks, but the honest truth is that a lot of people just leapfrog their builds every few years alternating between CPU and GPU upgrades. I don't know many people that buy/build a completely new system every 2 or 5 years. I know it happens plenty, but I know a lot of people running Conroe-era CPUs with modern GPUs. Up until a year ago, I was running an old GTX 470 on my 3570K. If money were no object, I would constantly be upgrading, but for now I wait until I absolutely need to.

    I'm probably just weird, but I spent a couple *hours* over the weekend watching YouTube videos of old PCI or AGP-based Pentium and Athlon systems playing modern games and modern OSs. Some are terrible while others are actually playable. Pushing old or slow hardware beyond the reasonable limits to see just what you can get away with is a fun pastime of mine.
  • NeatOman - Wednesday, April 09, 2014 - link

    I always tell people to go a little overkill on the CPU in case they upgrade the GPU later on, mostly because they won't be gaming on it most of the time, and I think it's money better spent since it won't give you much of a CPU bottleneck later. Also, if you got an i7 from 6 years ago (yeah, they came out in 2008), it would only be about 15%-20% slower. But a GPU that was $300 6 years ago? Lol, yeah right... an HD 3850 from around 6 years ago is about 1/3 the speed of an HD 7850, and twice the price. So over time a CPU holds its value, while a GPU can drop from year to year by up to half (for the high-end models).

    I've found that spending more than $250 ($200 being the sweet spot) is a point of diminishing returns, and the same goes for CPUs.
  • fokka - Wednesday, April 09, 2014 - link

    i'd still go with a fast i5 instead of an i7, unless i've really got money to burn. i5 is where you get the best bang for your buck at intel, with a level of performance sufficient for the following years. paying 50% more for 10-20% increased performance doesn't seem very effective to me. this might have been a bit different with the first core-i CPUs.

    you're right about the GPUs though.
  • jonathanharrison - Thursday, April 10, 2014 - link

    I wish they made more P-series chips, like the i5-3350P, which has the graphics core disabled, for gamers who are just going to get discrete card(s) anyway. An i3-4370P (you could speed bump the core to 3.8, maybe 4.0GHz since you would be disabling the graphics core) would be nice. So would an i5-3570PK :) But that would just be tantamount to intel admitting failure at delivering 3D performance. Oh, this is an AMD article? Heh, I almost forgot with everyone talking about intel.
  • tecknurd - Wednesday, April 09, 2014 - link

    You may think that going "a little overkill on the CPU" is a good suggestion, but it is not. I have found over the years that it is better to spend on the storage system and memory. Back in 2012, I went with an i3-3225. I could have gone with an i5-3570 processor, but that i5 would be a little overkill for what I do. The cost of that i5 just buys a lot of grunt that I would rarely use, and it would cut into my budget, so I could not go with a good storage system and/or a decent amount of memory. If I add a graphics card, the mid-range graphics models would not be bottlenecked by my processor. If I go with a high-end graphics model, then yes, the processor I picked will bottleneck it.

    A CPU only holds value if it is overclockable and the motherboard supports overclocking the processor. The storage system holds more value than any other hardware in your computer, which is exactly why people went from hard drives to solid-state drives. A graphics card has no value since graphics technology changes dramatically. Also, high graphics quality or detail during gameplay does not matter.
  • 5thaccount - Thursday, April 10, 2014 - link

    100% agree. Any Core 2 or newer system with plenty of memory and an SSD will work fine for most users out there (the ones that use office, browse the web, and watch youtube)... and for quite a few years to come. Heck, I'm still using my E5200 daily after 6 years. Works perfectly! Even looking at these benchmarks, the new Kabini is slower than a Core 2 Duo E8400. Which, oddly enough, can be picked up for under $20 on eBay... now that's a deal!
  • jonathanharrison - Thursday, April 10, 2014 - link

    I got a whole Vostro 200 system used with 20" widescreen LCD, 4GB DDR2 RAM (667), Core 2 Duo E6550 from a dude on Craigslist for a mere $45. Wife had gone to the Mac side and wanted the system out of the house ASAP. That's one way to do it - all but GIVE it away :)

    On-board intel graphics kinda don't cut it for gaming, so... when I could, I added a Sapphire Radeon HD 7770 GHz Edition 1GB video card ($109) and a 550W ThermalTake power supply ($40), and also plugged it into my other 20" monitor with an HDMI-DVI cable ($8) for a nice dual display... may go triple-head when I can afford to... wireless mouse/keyboard set ($30), Logitech gamepad F310 ($25)... and I've got a pretty nice light/older (DX9/10) gaming system for less than $220 of new purchases...
  • trueserve - Friday, April 18, 2014 - link

    "A CPU only holds value if it is overclockable and the motherboard supports overclocking the processor."

  • mikato - Friday, April 11, 2014 - link

    Yeah, I had a Core 2 Duo E7300 in my wife's gaming system, and then when the latest COD came out with higher requirements, I bought a nice GTX 760 video card to put in there (with bigger PSU), and a better Core 2 Duo E8500 on ebay to go with it. Unfortunately the CPU was still really limiting things though, even after overclocking as much as I could, so in the end I had to build a new system around the video card... got an i5 4670K. But your point is valid. The system was used for HTPC tasks, XBMC, Skype, etc as well and the Core 2 Duos performed just great in everything besides the latest COD game, which is apparently a resource hog.
  • phoenix_rizzen - Wednesday, April 09, 2014 - link

    Wouldn't the PCIe x4 slot be the bigger bottleneck?
  • nathanddrews - Wednesday, April 09, 2014 - link

    PCIe 3.0 x2 = PCIe 2.0 x4 = PCIe 1.0 x8 = 2GB/sec

    Even with just a PCIe 2.0 x4 electrical, it would still be massively CPU-bound. I wonder if any other manufacturers will bother to do a fully enabled x16 slot?
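    The equivalence above works out from per-lane signaling rates and encoding overhead; a quick sketch (the per-lane figures are standard PCIe numbers, the function name is mine):

```python
# Per-lane effective bandwidth in MB/s after encoding overhead.
# PCIe 1.0/2.0 use 8b/10b encoding (20% overhead); 3.0 uses 128b/130b (~1.5%).
LANE_MBPS = {
    "1.0": 2500 * 8 / 10 / 8,     # 2.5 GT/s -> 250 MB/s
    "2.0": 5000 * 8 / 10 / 8,     # 5.0 GT/s -> 500 MB/s
    "3.0": 8000 * 128 / 130 / 8,  # 8.0 GT/s -> ~985 MB/s
}

def link_bandwidth(gen: str, lanes: int) -> float:
    """Theoretical link bandwidth in MB/s, one direction."""
    return LANE_MBPS[gen] * lanes

for gen, lanes in [("3.0", 2), ("2.0", 4), ("1.0", 8)]:
    print(f"PCIe {gen} x{lanes}: {link_bandwidth(gen, lanes):.0f} MB/s")
```

    All three configurations land at roughly 2000 MB/s, i.e. the ~2GB/sec quoted above.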
  • extide - Thursday, April 10, 2014 - link

    They can't; there simply aren't enough lanes on the CPU. If they ditched the NIC and used every single lane for graphics, you could have x8.
  • nathanddrews - Thursday, April 10, 2014 - link

    Ah yes, you are correct.
  • jonathanharrison - Thursday, April 10, 2014 - link

    Hey, I resemble that. I've got a Radeon HD 7770 GHz being underfed by a Core 2 Duo E6550. Used system bought for cheap then added the video card, purchased new. I guess the 7750 would have been more than enough, but hey, the 7770 was a steal at $109. How's that for a CPU/GPU imbalance? Hey... at least it's in the right direction for gaming. I'd rather be CPU-limited than GPU-limited any day. I can still run 99% of the games out there, and do so with rather high settings and at least 4X AA most of the time. Maybe not 60fps in all games, but in a lot of them, especially the older ones, and getting 30fps or close in some of the newest (like MKKE).
  • etamin - Wednesday, April 09, 2014 - link

    In the "key points" on the first page, you listed two SATA 3gbps ports, but the Gigabyte comes with 2x SATA III. The block diagram shows two "SATA 2/3". In any case, two SATA ports is extremely restrictive. If mSATA and/or secondary SATA controllers can be had for the same $35, this would make for a terrific micro media server.
  • Medallish - Wednesday, April 09, 2014 - link

    Well, sadly it's not $35 but $59. It does come with two extra SATA III ports, though, and the ability to plug 19V DC directly into the board. It should be said that the cable that comes with it to supply drives with power directly from the motherboard only supports 2 drives, so if you do use that port you'll likely need to come up with an inventive way of supplying 2 additional drives.

    If I were building a mini server though, I'd likely go a little crazy and get this:
    6x Sata III ports and 1x mSata & 1x eSata. And then get some Richland or Kaveri APU as cheap and low power as they get with onboard GPU.

    I'm sure you'll get a lot of other replies with even better suggestions.
  • Communism - Wednesday, April 09, 2014 - link

    Lol @ those prices,

    37 USD motherboard:
    BIOSTAR H61MGV3 LGA 1155 Intel H61 Micro ATX

    43 USD CPU/GPU:
    Intel Celeron G1610 2.60GHz LGA 1155 Processor

    Desktop Kabini is DOA for its intended purpose.
  • Medallish - Wednesday, April 09, 2014 - link

    I love it when people trot out an H61-based board like it renders Kabini useless, completely missing the fact that, compared to Kabini, it lacks a lot: with H61 you only get USB 2.0 and SATA II.

    And then there's that wonderful 55W CPU you put in there that I'm sure doesn't even compare to the Kabini on GPU performance and is only slightly ahead in CPU benchmarks.

    I built a kabini system using this case:
    And the Asrock board:

    I got a board with a lot more functions, and an incredibly simple build thanks to the DC-In port, and still incredibly cheap. Kabini is like the perfect build for a lot of my family members, who can still get use out of newer standards like SATA III and USB 3.0.
  • meacupla - Wednesday, April 09, 2014 - link

    Do you really think the presence or absence of SATA3 matters in such a low-end system?
    USB 3.0 is nice, however.
  • Medallish - Wednesday, April 09, 2014 - link

    Telling someone with no clue that it has SATA III certainly might not be a selling point, but IO performance adds to the "feel", and on that front I would definitely say it can matter.
  • fokka - Wednesday, April 09, 2014 - link

    sure i'd prefer sata3 too, but the IO performance you're talking about isn't hindered much by sata2. max throughput is capped at half, yes, but once you go random rw, sata2 shouldn't be the limiting factor.
  • MrSpadge - Wednesday, April 09, 2014 - link

    Once you're transferring at 270 MB/s (SATA2) the feeling is pretty good. And during installs & loads the CPU has to keep up, too.
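    That ~270 MB/s figure lines up with SATA2's line rate once 8b/10b encoding is stripped off; a rough sketch (function name is mine; the gap between the 300 MB/s ceiling and ~270 MB/s observed is protocol overhead):

```python
# SATA sends every 8 data bits as 10 line bits (8b/10b encoding),
# so the payload ceiling is 80% of the raw line rate.
def sata_max_mbps(line_rate_gbps: float) -> float:
    """Max payload bandwidth in MB/s, before protocol overhead."""
    return line_rate_gbps * 1000 * 8 / 10 / 8

print(sata_max_mbps(3.0))  # SATA2: 300.0 MB/s ceiling, ~270 MB/s in practice
print(sata_max_mbps(6.0))  # SATA3: 600.0 MB/s ceiling
```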
  • rudolphna - Wednesday, April 09, 2014 - link

    Well. Here is the thing. I have a Celeron g1610 on a mATX b75m board that together cost me $105. I use it for network storage and as a Plex server, which is real-time h264 encoding. I tested it with Handbrake. Under Handbrake h264 encoding, using the iGP for display, the g1610 pulled a maximum of 17w according to CoreTemp, not 55 or anywhere close to it. It generally runs sub-10w when doing a single Plex encode.
  • Medallish - Wednesday, April 09, 2014 - link

    I really wouldn't count on software to get an accurate measure of that, and TDP != power use; it simply refers to the cooling requirement. I'm pretty sure the 5350 and 5150 don't use the same amount of power, despite them both having a TDP of 25W.
  • frozentundra123456 - Wednesday, April 09, 2014 - link

    Upgradability is nice, but the problem is upgrade to what. Right now there is no real upgrade path, and it is unknown what and when the next upgrade will be. Seems like another very niche product, like the rest of AMD's APUs, trying to use the graphics to leverage an advantage against higher power consumption and mediocre CPU performance. So far this strategy hasn't really been successful, as the marketplace shows. That could change as more apps use graphics, but IMO single core performance (and power consumption in a small envelope like this) is still king.
  • Medallish - Wednesday, April 09, 2014 - link

    Indeed, I loved building one, but it's kind of hard to advertise it as being insanely cheap while also advertising upgradeability; you're more likely to simply purchase something new entirely when upgrading. If they are able to improve AM1 CPUs greatly, say the next Mullins or Beema fits in effortlessly and introduces new stuff like dual-channel memory, DP 1.2a, or HDMI 2.0, and it worked on older boards, then I would say AMD had a nice idea. But the question really comes down to: at this price point, do people even care about upgradeability?
  • mikato - Friday, April 11, 2014 - link

    The good thing is that if/when you do upgrade and get a Beema, you're upgrading both the graphics and the CPU, hopefully both significantly.
  • mrdude - Wednesday, April 09, 2014 - link

    I'm actually fond of AMD's APUs and their direction with respect to heterogeneous computing (I think it'll succeed even if AMD goes out of business, personally), but I agree with you.

    Providing a socketed platform is great, but without a clear direction and upgrade path it's all for naught -- and I mean AMD needs to state that Beema will be available on X date and features Y improvements. Nobody with more than a few brain cells buys into a platform that offers nothing outside of 'wait and see.'

    AMD also needs to come to grips with the fact that on-die GPUs aren't going to be taken seriously until they can provide playable framerates at 1080p without substantially increasing platform cost. Kaveri does the first part very well, but the memory scaling and memory cost issues really hamstring the platform. Pushing for beefier graphics makes sense in the tablet space where Mullins is supposed to make headway, but on the laptop/desktop side it's next-to-worthless unless it can play modern titles at 720p or 1080p at medium settings.

    And Kabini desperately needs an aggressive turbo core and dual-channel memory. Kanter's article shows a very potent little microarchitecture, but those two points are really holding it back.

    I think Jaguar is a better microarchitecture than Steamroller, but it seems clear to me that AMD isn't giving it the sort of attention that I feel it deserves.
  • JDG1980 - Wednesday, April 09, 2014 - link

    Backtracking on GDDR5 for Kaveri was a big mistake. Without the added memory bandwidth, the iGPU is bottlenecked, and the result is that Kaveri barely offers any improvement over Richland at all.
  • Musafir_86 - Wednesday, April 09, 2014 - link

    -It's Mantle time!

    -On a serious note, please test any GCN 1.1 card (Bonaire & Hawaii), and also a GCN 1.0 card, with BF4 & Thief in Mantle mode, pretty please! Don't let that ×16 slot go to waste!

  • otherwise - Wednesday, April 09, 2014 - link

    Looking at those CPU benchmarks, I would argue that if you're even considering using an external GPU with this system, you should be looking elsewhere. It's not really cheaper than a low-end Celeron system, and even Atom is beating it in CPU power.
  • Flunk - Wednesday, April 09, 2014 - link

    I don't think he wants to see how it performs because he's thinking of building a rig like that. It would however be a good way to see how the reduced CPU overhead of Mantle affects GPU performance.
  • Medallish - Wednesday, April 09, 2014 - link

    Honestly, it would be kind of fun/interesting to see if Mantle has an impact on such a low-end system; I mean, the CPU will likely be the bottleneck even with just a 260X.
  • munim - Wednesday, April 09, 2014 - link

    How does it work for everyday use? Web browsing, 1080p youtube, things of that nature?
  • JDG1980 - Wednesday, April 09, 2014 - link

    They don't bother testing that since any modern system will do it just fine.

    There were a couple of JavaScript benchmarks (SunSpider and WebXprt) at the bottom of the 'CPU Productivity' page.
  • jabber - Wednesday, April 09, 2014 - link

    Yep, slap a 120GB SSD in there and you have a nice little office system for not a lot of outlay. Doesn't matter if it's SATA2, it's all in the access times.
  • meacupla - Wednesday, April 09, 2014 - link

    what would be nice is high bit rate software 1080p playback, for when DXVA doesn't work.

    I remember my E-350 getting choppy on those.
  • Medallish - Wednesday, April 09, 2014 - link

    Well, the one I built feels pretty responsive, even with a mechanical drive. You do need to install the drivers, but once they were on there it didn't feel at all sluggish. I tried loading up a bunch of heavily advertised pages in Chrome and didn't really notice any impact on performance. I must admit I haven't tried any 1080p YouTube videos, but I will give those a go; I imagine they run fine. I had a quad-core "Phenom II" mobile CPU that ran at 2GHz, and performance should be pretty close to this, I imagine. And yeah, I did a test on that running a YouTube video @ 1080p and a 1080p video in VLC at the same time. Now, that system was running a 6670M, which is bound to be a lot better than this, but the CPU wasn't holding it back at least.
  • gandergray - Wednesday, April 09, 2014 - link

    Hello Ian. Thank you for the article. In the second to last sentence of the paragraph that follows the Athlon v. Celeron table in the “Competition” section, the word “subsiding” should be “subsidizing”.
  • JDG1980 - Wednesday, April 09, 2014 - link

    The Cinebench 10 benchmark indicates that Kabini's IPC is only slightly below that of Intel's old Conroe architecture (E2180, running at the same 2GHz clock speed). For such a small die and low TDP, that's actually pretty good. Integer IPC seems to be better than Intel's Baytrail series, though floating-point lags a bit (at least if 3D Particle Motion is a good test - never heard of it before).

    IPC for AMD's big cores is still not much better than Conroe, and lags behind Nehalem. Steamroller was supposed to help, but didn't do much for IPC (even though it did reduce the CMT penalty). This calls into question why AMD's 'construction equipment' cores are even still being developed. If the small 'cat' cores can do as well as they do with only a dual-issue front end, and a bunch of other missing optimizations (e.g. no decoded micro-op cache), then beefing up the basic design for desktop/server use would probably beat Steamroller in IPC while having lower power usage. It wouldn't close the gap with Intel's big cores, but it would at least provide a solid foundation to build on. This would be the same thing Intel did themselves back when they ditched the unviable Netburst design in favor of one derived from the mobile Pentium-M.
  • mrdude - Wednesday, April 09, 2014 - link

    Yea, I think many people would agree with you. I like the Jaguar architecture quite a bit, as it seems to be very well balanced. The 4-way shared L2 and integer performance seem like very solid foundations to build upon. The fact that it has AVX whereas Intel's Atoms and low end Celerons don't is also an awesome feature.

    ...but it has no turbo. AMD's turbo core is actually really advanced, dynamically adjusting clock speeds and voltages depending on the CPU or GPU load or both. Even the steaming pile of poo that was Bulldozer had a great turbo, but for some reason AMD decided not to dedicate the resources and engineering talent to fit Kabini with the same feature.

    I'd love to see AMD dump the construction line and work on something similar to Jaguar that scales upward and wider more easily. It might not be a monster, but at least it offers decent perf-per-watt.
  • errorr - Wednesday, April 09, 2014 - link

    Pretty sure the 3d particle motion test was written by Ian and I vaguely recall it being related to his PhD Thesis.
  • lever_age - Wednesday, April 09, 2014 - link

    Just a small suggestion, but for comparative performance tables like the one on the last page, could you mark benchmarks by whether a higher or lower number is better (like on the larger graphs)? I guess this could be done either in the margin by the benchmark name or by coloring the winning column.
  • mr_tawan - Wednesday, April 09, 2014 - link

    Looks a lot like a P3 or Athlon board to me, especially with the CPU+cooler installed. Feels nostalgic.
  • errorr - Wednesday, April 09, 2014 - link

    Well, except for the lack of a northbridge and the absence of a diarrhea-brown AGP slot.
  • MikeMurphy - Wednesday, April 09, 2014 - link

    I'm excited to see this in an upgradeable NUC form factor.
  • jardows2 - Wednesday, April 09, 2014 - link

    Please, someone make a thin-mITX for this platform. This is going to be a limited use platform anyway, so there is no reason to put in the full gamut of I/O on the back.
  • jabber - Wednesday, April 09, 2014 - link

    Mmm vertically stacked and the size of a paperback book.

    But what's this... they put the VGA socket at the top! Noooooooooo!

    No analogue audio out on the back, just at the front! Noooooooooo!

    Yep seen that.
  • macs - Wednesday, April 09, 2014 - link

    Sorry Anandtech, but this is not a good review.
    Where is the power consumption data? Comparison with the Intel Haswell G1820 (a really affordable Haswell chip)? HTPC quality? Older generation (AMD E-350)?
  • Communism - Wednesday, April 09, 2014 - link

    They put the newfangled AMD part in the best possible light since they can't afford to have AMD mad at them, since AMD pays the bills :P

    It does make a mockery of this article that it doesn't even include the Intel Haswell G1820 that is its real competition.

    MSRP Intel Haswell G1820 is 42 USD
    You can get an H81 board for 49 USD

    You can also have real upgradability, since you can go up to a 4670 if you want to.
  • CiccioB - Wednesday, April 09, 2014 - link

    I agree; having a sponsored section on the site isn't really the best thing for avoiding any suspicion of bias or favouritism. Any glitch in a review can be seen as a favour to the sponsor, and no one will ever know if that's true or not.

    BTW, I have an Atom D510 (dual physical cores with HT) ITX server which is on 24/7, used for P2P, backup and some scripts, and as a storage and database server. No GFX capacity needed (it is connected to the main PC through Remote Desktop, thus "sharing" one of the monitors).
    What would be the best solution to replace it if it will eventually die tomorrow?
    That Pentium J1800@10W seems very good. Is there something better than that? Consider that power consumption and related fan noise are critical, being always on in my bedroom.
  • macs - Wednesday, April 09, 2014 - link

    I have a similar setup with an AMD e350 itx board. I'm trying to understand if I should upgrade to a newer platform but this review didn't help.
    Main concern is power consumption; it would be interesting to compare Kabini, Bay Trail-D and low-end Haswell.
  • Communism - Wednesday, April 09, 2014 - link

    I can understand not doing the Power Consumption tests, since their standard benching platforms are "Enthusiast" and "Overclocking" motherboards, resulting in higher power consumption of the boards masking the true TDP of the CPU/GPU as a result of the additional chips on the board adding other functionality.

    They would have to use very sparse basic boards to make the comparison anywhere close to realistic.
  • jospoortvliet - Thursday, April 10, 2014 - link

    Reviews on other sites show AMD consistently beating Bay Trail at virtually the same active power envelope and much lower idle. Bay Trail offers no advantage whatsoever, so you can cross that off the list already 😎
  • Beany2013 - Wednesday, April 09, 2014 - link

    Try reading the article rather than just looking at the pretty pictures:

    "As mentioned in our test setup, the benchmark results in this preliminary article are only a small fraction of our normal coverage. Due to other commitments we were unable to run every test on all comparison systems, but we have the other Athlon and Sempron APUs as well as comparable Intel counterparts coming in for review."

    Also, the title - Review *Part 1*.

    More numbers, compared to competing parts from Intel, will be upcoming. It says so in the article - at least twice. It's not AT's problem if you weren't gifted with the good grace to actually read an article before accusing the author of being a shill/being biased/being on the payroll.

    Steven R
  • macs - Wednesday, April 09, 2014 - link

    I know it's only part one. Anyway, it's pretty much useless. I think that on a low-cost platform like this, power consumption for a file server and HTPC quality are way more relevant than playing Tomb Raider or Borderlands...
  • DudemanX - Wednesday, April 09, 2014 - link

    I don't disagree with any of that, but then why have the game benches at all? Even if they just show us how they get unplayably crushed at 1920x1200 with High quality settings, at least we would be able to see how the IGP power compares to the discrete chips. I'm just wondering whom these 1280x1024 numbers are for.
  • hero4hire - Saturday, April 12, 2014 - link

    Bought an H81 for $60 and a Haswell G1830 Celeron for $30. There were cheap $40 boards available. I'm not sold on this chip, as graphics performance is rapidly increasing while CPU performance is not. Why hamstring myself to a graphics target in upgrades (socketed) when any $20 graphics "accelerator" will be better? Plus I now have a real upgrade path graphically if needed.
  • azazel1024 - Wednesday, April 09, 2014 - link

    For a really cheap, low power system, the AMD Kabinis might work.

    However, looking at what they offer, they can't replace my server with a lower power version. Only 2 SATA3 ports, which means I'd need to throw a RAID card on there of some sort; even a low-end one would add a lot to the price and a lot more to the power consumption.

    My G1610 based system ran $92 for CPU+board ($42 processor, $50 board on sale). It can run rings around even the best Kabini here. Yes, it is 55w versus 25w. That is one of the big things I see missing from this review: actual power consumption of those Sandy/Ivy Celeron systems, the J1800, and the Kabini. What are we talking about in practical power consumption? I use my server as a file server, iTunes server, Calibre server and download server. It doesn't need a ton of grunt... but in some use cases, like updating iTunes library entries and so forth, I could see it being bogged down pretty badly with something like a Kabini (or even the Bay Trail based ones).

    TDP isn't the only story though; idle power consumption is probably lower on the Kabini than mine... but my TOTAL system power consumption is 21w at idle, 33w streaming video and only 51w under max CPU load (with the HDDs spun up too, which add around 12w of load when not parked). So between idle and max, the CPU has a delta of only 20w, and possibly a little less, as the network cards are somewhat more active too, which might account for a watt or so there.
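    The delta quoted above can be sanity-checked from those wall readings (all figures are the measurements in the comment; variable names are mine):

```python
# Back-of-envelope: isolate the CPU's active power delta from wall power.
idle_w = 21        # whole-system idle, drives parked
max_load_w = 51    # whole-system max CPU load, HDDs spun up
hdd_spinup_w = 12  # extra draw attributed to the spun-up hard drives

# Subtract idle and drive power to estimate the CPU's idle-to-max delta.
cpu_delta_w = max_load_w - idle_w - hdd_spinup_w
print(f"CPU idle-to-max delta: ~{cpu_delta_w} W")
```

    This lands at ~18 W, consistent with the ~20 W ("possibly a little less") figure once NIC activity is counted.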

    I know the CPU is pretty power efficient at idle, though I suspect the CPU itself might be consuming 6-10w at idle and only hitting maybe 30w max under load.

    Now if the board itself that Kabini goes in to can also reduce power consumption a fair amount...

    Back to needing a RAID card though....sigh.

    I am really hoping that Cherry Trail Pentium and Celeron systems include RAID on the boards and also 4 SATA2 ports (or 1-2 SATA3 and 2-3 SATA2). One might just have a good shot at replacing my server, especially if it has onboard dual NICs.

    Probably be a Haswell or Broadwell Celeron/Pentium that replaces my server in a year or two though. Sigh.
  • mrdude - Wednesday, April 09, 2014 - link

    You may want to take a look at something like this:

    Provided you don't need IPMI, it seems a solid deal for under $180. Twice the cost of the best Kabini combination above, but it's lower power, passively cooled, and has dual Intel NICs with 6 SATA ports.

    I'd love to see AMD compete in that segment of the market since Kabini has a lot to offer there, but they seem to have completely abandoned the x86 server segment. A Kabini NUC-like form factor would be great to see too if they can maintain the socketability. NUCs are wasteful and idiotic in that you have to throw them away if you want to upgrade. An AM1 SFF with swappable motherboards and SoCs would certainly be interesting.
  • Shivansps - Wednesday, April 09, 2014 - link

    @ Newegg
    G1820 $54
    Asrock H81M-DGS $49

    Athlon 5350+AM1 mb = +/- $90

    I'm sorry, but I don't see why the G1820 is not included here... I want to see that comparison.
  • YuLeven - Wednesday, April 09, 2014 - link

    Because the Haswell G1820 would put Kabini's performance to shame, and arguments like power consumption wouldn't cut it as excuses. Well... that would hurt "AMD Center".
  • jospoortvliet - Thursday, April 10, 2014 - link

    Could you keep that nonsense to yourself please?
  • JDG1980 - Wednesday, April 09, 2014 - link

    You'll be paying more for the LGA 1150 solution if you want a Mini-ITX form factor. In contrast, AM1 has Mini-ITX boards as low as $35.
  • Torashin - Wednesday, April 09, 2014 - link

    "For non-GPU intensive tasks, on paper, the J1900 for $92 and 10W TDP would seem to be the choice if upgradability is not a concern."


    I'm starting to understand why some people call this place Inteltech. How can you justify what you just said? The CPU with half the power - yeah, you're keen to point out they're similar in single-threaded, but it has half the core count and it shows - and also more expensive, is somehow the better choice!
  • Shivansps - Wednesday, April 09, 2014 - link

    J1900 is quad, J1800 is dual.

    AnandTech did everything they could to make the 5350 look good; not including Haswells is one giant red flag right there. The 5350 MUST be tested vs. the G1820 as well as the J1900 and AMD A4s. None of those tests were done, to make the 5350 look good.
  • YuLeven - Wednesday, April 09, 2014 - link

    But, but, but... but... the G1820's power consumption is a solid 25W more! Think of the dolphins and the trees! It must be rubbish!
  • Communism - Wednesday, April 09, 2014 - link

    The funny part is that the G1820's true TDP isn't anywhere near 55w.

    It's only listed as 55w TDP because Intel doesn't want to make selling their more expensive low power parts even harder :P.
  • Communism - Wednesday, April 09, 2014 - link

    To put this in perspective, my 3570k supposedly has a TDP of 77w.

    It actually does about 70w for the CPU alone @ 4.4GHz running Intel's MKL Linpack (the highest heat output you're likely to ever see).

    That was running just 5-10 degrees from TJmax as well, so superior cooling wasn't a large factor since the stock CPU w/ stock cooler @ stock speeds would be about that temperature as well.
  • jospoortvliet - Thursday, April 10, 2014 - link

    Especially as it turns out that power usage compared to the Jaguar cores is barely a few percent lower, and actually much worse at idle. The final article will have to come back on this statement.
  • riottime - Wednesday, April 09, 2014 - link

    gosh. it's depressing looking at the stale fx line in that roadmap. :(
  • FriendlyUser - Wednesday, April 09, 2014 - link

    Great product! Ideal for an HTPC or NAS or even a casual desktop. It will be a hit in the weaker economies. Reply
  • Ranari - Wednesday, April 09, 2014 - link

    It's a great product for the price, and that's all it needs to be: Quad core, a powerful and compute-capable GPU, and reasonably feature rich. The only thing that seems to be holding the Cat cores back are their clockspeeds. Reply
  • robbertbobbertson - Wednesday, April 09, 2014 - link

    mini-PCIe slot cannot be used for cooking scrumptious bacon pancakes
    i am disappoint
  • QChronoD - Wednesday, April 09, 2014 - link

    Just wanted to point out that the description of the IGP Gaming page says the tests are being done at 1080P and Xtreme settings, but it lists 1280x1024 on the graphs. Are they being done at the highest settings as well? I would imagine the test would be more useful if the settings were turned down to medium or low since that's what anyone with the system would actually be using. Reply
  • kyuu - Wednesday, April 09, 2014 - link

    Yeah I was going to say something about that. The article text and the graphs are not in agreement. Reply
  • kallogan - Wednesday, April 09, 2014 - link

    ground breaking new tech ! Wow ! Reply
  • Alexey291 - Thursday, April 10, 2014 - link

    such tech
    much ground breaking
  • evonitzer - Wednesday, April 09, 2014 - link

    In addition to what you are already planning to test, I would like to see some of the low-end A4 or A6 APUs tested. They seem to be a blank spot in your reviews, making it tough to compare the cheap stuff against the previous generation. The A4s come with the same 2 CU GPU, but presumably better CPU performance, and are available pretty close to $50. Sure, they are higher TDP, but whatev.

    Anyway, interesting review. I'd be seriously tempted if I didn't just put together a cheap PC for my brother already. Maybe grandma can get a surprise upgrade ...
  • evonitzer - Wednesday, April 09, 2014 - link

    Oh hold on, there aren't any A4s or A6s available in Kaveri form, which means no GCN to be had for cheap. That's interesting. I wonder how much difference it makes on the low end. Well, either way, the A4-6300 (and below) is still interesting to compare. Reply
  • Glory2God - Wednesday, April 09, 2014 - link

    That Atom C2750 looks awesome in the multi threaded benchmarks. Reply
  • rogueninja - Wednesday, April 09, 2014 - link

    AMD dual core, quad core, octa core, 100 cores. Who gives a damn? They're as fast as a turtle. Reply
  • Nintendo Maniac 64 - Wednesday, April 09, 2014 - link

    Would have been nice if there were more older CPUs to compare to, like Athlon 64 x2, first gen Phenom X4, and Conroe Core 2 Duo (rather than Wolfdale). It'd be even better if said older CPUs were around 2.0-2.5GHz as well Reply
  • saiki4116 - Wednesday, April 09, 2014 - link

    Please add a comparison with the A4-4000 ($40) and the A4-6300; they cost only $10-20 more than Kabini. Reply
  • BushLin - Wednesday, April 09, 2014 - link

    Anandtech, where are the power consumption figures? Just quoting the 25W TDP feeds assumptions such as the one I'm replying to.

    Why is it that every time AMD has a CPU worth buying (doesn't happen that often) you guys manage to totally miss the point in the review? It's enough to make an objective person sound like a fanboy.

    I'll save you the trouble:

    Although they make the typical reviewer mistake of using a very high-wattage PSU on a low-power system, we can at least see something close to parity between an Athlon 5350 and a Celeron J1900 (the very same CPUs you reviewed).

    Makes for quite a different outcome, doesn't it?
  • ozzuneoj86 - Wednesday, April 09, 2014 - link

    I do hope that all of this works out for AMD. It seems like a pretty huge gamble to invest so much time and energy into creating a new socketed platform. I do like the idea of tiny yet powerful systems, for sure. I built a Pentium 4 based mini-ITX system that won Maximum PC's rig of the month in July 2007, since there were relatively few ITX systems available at the time. This was pre-Atom, so you were either spending $300+ on a board that took an equally expensive Core-based laptop CPU, or you were going with a Via C7 or some other abysmally slow chip built into a board that still cost $200. Our grand total for the whole project was under $500.

    That system is socketed, but with Intel Extreme Graphics 2 (before GMA) and only a PCI slot to work with, there was only so much it could be used for. It handles emulators perfectly fine, though.

    Anyway, I just hope they hang on to this platform for a while so that it makes sense for it to be socketed. Looking at the CPU options available, it doesn't look like it offers a huge range of possibilities. They are all quite firmly seated in the low end of performance and aren't all that power efficient. I could see a low end platform being useful if they offered significantly different combinations of CPU power, GPU power and wattages, but I'm not really seeing that with these.
  • ozzuneoj86 - Wednesday, April 09, 2014 - link

    For clarification, by "not all that power efficient" I mean the TDP, when compared to Intel's non-socketed chips. These are still fairly low wattage, but you can't exactly stuff one of these in a cigar box and run it passively cooled.

    Also, any chance we could get a Celeron 1037U system benchmarked as well? I'm curious how its CPU/GPU performance compares to the J series, as well as to Kabini.
  • Hardened - Thursday, April 10, 2014 - link

    I would like to support the request for a comparison with the Celeron 1037U. It is a very interesting CPU performance- and cost-wise. The Gigabyte GA-1037UN mobo is also a very nice offer, with 2x LAN and eSATA ports, all under 90 USD. Reply
  • Computer Bottleneck - Wednesday, April 09, 2014 - link

    Do we know if any manufacturers will be supporting ECC RAM on these boards? Or any upcoming AM1 boards?

    According to CPU-World, the Athlon 5350 does indeed support ECC.
  • TheCrustyCupcake - Wednesday, April 09, 2014 - link

    Water-cooled AM1 socketed APUs, anyone? If overclocking were available, I could see an EVGA super ultra FTW edition AM1 motherboard with an 8-pin supplemental CPU power connector, and CrossFire support! Reply
  • DudemanX - Wednesday, April 09, 2014 - link

    In part 2 and in future articles when reviewing IGPs, can you do some game benchmarks at more mainstream resolutions (1680, 1920, maybe even 1440) but with "Low" quality settings? Many might disagree, but not running my display at native resolution is one of the last compromises I make when using inadequate hardware. I understand these aren't mainstream chips, but they are desktop/media center chips. Do people really buy desktop monitors/TVs lower than 1680 these days? Certainly no one's running games in 4:3 anymore, are they? Reply
  • mariush - Wednesday, April 09, 2014 - link

    I just don't understand why they don't make the boards for these systems run on 12-19V laptop-adapter-type power supplies.

    It's really not that expensive to add a DC-DC converter on the motherboard to make 5V for the SATA drives and onboard stuff... but no, we still have to use a big fat 24-pin ATX cable AND a 4-pin CPU cable for a system that uses 30-60 watts (without a powerful video card).
  • srkelley - Wednesday, April 09, 2014 - link

    Can you post some modified benchmarks of the gaming performance please? I'm interested in seeing the average frame rate of the same games but run with Medium settings at 1280x720 please. It doesn't need to be in a fancy chart or done completely scientifically, just a few numbers would be helpful! Reply
  • shady28 - Thursday, April 10, 2014 - link

    Great review, this helps clear up some questions regarding AM1.

    Suggestion for your follow up -
    If possible, show more of the chips this succeeds and that it competes against.

    i.e.:
    J1800, J1900, 3770, and some of the older Atoms like the D525 or D510.

    It would also be interesting to see a couple of older reference desktop, like a P4 D chip. For some this would help answer the question, is this a viable upgrade for someone who wants a small low power replacement for an old power hog of a P4?

    The obligatory i5-2500k or i5-3570k is fine, but stacking up a huge number of 100W chips in the comparison doesn't help, it just clutters up the charts with products that in no way compete with something like Kabini.
  • Penti - Thursday, April 10, 2014 - link

    So it's a Kabini/Jaguar fabbed at GlobalFoundries in Germany? Why? Reply
  • azazel1024 - Thursday, April 10, 2014 - link

    Dear gods yes. Taking DC-DC from a laptop power brick would be awesome. Heck, especially the Intel systems are probably looking at 40-50w absolute max, even with a drive or two in there.

    Considering there are many 90W power bricks... seems like it could take 12V in with no problems.

    For the commenter who posted the Tom's review on power numbers and seems to be indicating little difference in power consumption: that 3.5W figure, looking at TOTAL PACKAGE power consumption, is greater than a 10% difference at idle, and the figures under load are more like a 30-40% difference in power consumption.

    Since I assume the board and other bits add a fair amount here, the CPU difference is looking like probably double the CPU idle power consumption and triple the CPU load power consumption for the Kabini versus Bay Trail.
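    The arithmetic behind that estimate can be sketched quickly (the wall readings and the platform overhead below are made-up illustrative numbers, not the actual figures from that review):

```python
# Back out a CPU-only power ratio from wall measurements, assuming a
# fixed platform (board + drives + PSU loss) overhead. All numbers here
# are hypothetical, chosen only to illustrate the arithmetic.
def component_ratio(total_a, total_b, platform_overhead):
    """Ratio of CPU-only power A:B after subtracting a fixed overhead."""
    return (total_a - platform_overhead) / (total_b - platform_overhead)

kabini_idle = 24.5    # watts at the wall (assumed)
baytrail_idle = 21.0  # watts at the wall (assumed)
overhead = 17.5       # watts of shared platform draw (assumed)

print(component_ratio(kabini_idle, baytrail_idle, overhead))  # -> 2.0
```

    With a fixed overhead subtracted out, a modest 3.5W gap at the wall turns into a 2x gap at the CPU level.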

    Also, the power consumption figures on both systems are crap. My full mATX G1610 uses less power at idle and under load than the Kabini system, and mine even uses less power than the Bay Trail one at idle. So either the boards and attached devices are total crap, or else they are using one huge, power-inefficient PSU to test with. Heck, mine isn't even all that great, a 380W Antec EarthWatts Bronze. Something like a 360W Seasonic Gold would probably drop my idle power under 20W from the 21W it is now, and load to under 45W.

    So my guess is a big, power inefficient PSU, at the least and maybe just crappy component selection too.

    No matter what, though, Kabini doesn't look good there, in comparison to Bay Trail or in comparison to an Intel "55W TDP" processor either... which has oodles more CPU performance.
  • mikato - Friday, April 11, 2014 - link

    I agree, the PSU is really important here. A big PSU will be inefficient for a low power system even if it's a 80 PLUS Gold or something. Reply
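    To put rough numbers on that effect (the efficiency curve below is an assumed typical shape, not measured data for any particular PSU; 80 PLUS only certifies efficiency at 20/50/100% load, and efficiency usually falls off steeply below the 20% point):

```python
# Sketch of why an oversized PSU inflates low-power measurements.
# The efficiency steps are assumptions, not specs for a real unit.
def wall_power(dc_load_w, psu_rated_w):
    """Estimate AC wall draw for a given DC load on a given PSU."""
    load_frac = dc_load_w / psu_rated_w
    if load_frac < 0.10:      # far below the certified range
        eff = 0.65
    elif load_frac < 0.20:
        eff = 0.80
    else:                     # inside the certified 20-100% band
        eff = 0.87
    return dc_load_w / eff

dc_idle = 18.0  # watts of actual DC draw at idle (assumed)
print(round(wall_power(dc_idle, 500), 1))  # 500W PSU: 27.7W at the wall
print(round(wall_power(dc_idle, 90), 1))   # 90W brick: 20.7W at the wall
```

    Same system, same 18W of real draw, yet the oversized supply reads 7W higher at the wall.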
  • ruthan - Thursday, April 10, 2014 - link

    Simply a slug; this HW is already dead.

    25W is too much for a netbook or tablet, and for a NAS or HTPC a Core i3 is a much better choice. GPU performance is also worse than the Intel HD 4000.
  • azazel1024 - Thursday, April 10, 2014 - link

    I am tossing around the idea of a core i3 for my next server depending on exactly what Cherry/Willow Trail and Broadwell/Skylake might hold.

    The extra cost might be worth it for what might be significantly better performance at not significantly higher power consumption during "normal work", which is idle or streaming.

    Otherwise, back to a Broadwell/Skylake-based entry-level Celeron, probably. The price and performance are hard to beat for the kind of basic server that I need.
  • Samus - Thursday, April 10, 2014 - link

    Wow, that's actually really fast for a 25W CPU. I mean, the A6-5200 is no slouch and it's right on par with it. Reply
  • beesbees - Thursday, April 10, 2014 - link

    My good ole AMD Athlon dual core at 3GHz and 7790 OC 2GB with 4GB DDR2 plays all games maxed. I spent like $60 on that CPU on Newegg back in 2009! Who needs 4 cores? lol Reply
  • HangFire - Friday, April 11, 2014 - link

    Moshi Monsters is great stuff, isn't it? Reply
  • abufrejoval - Thursday, April 10, 2014 - link

    Here in Germany the J1900 became available before the J1800 at my favorite retailer: I got a
    GIGABYTE GA-J1900N-D3V (quad core) some weeks ago, put it into a mini-ITX case with a 90W PicoPSU (needed the 12V 4-pin connector) and a 60W 12V notebook power supply.

    Added a Crucial C300 for storage and went ahead testing with Windows 8.1 (the only thing that worked with the initial BIOS), and then with Win7, CentOS 6.5, Fedora 20, and Android-x86 after the new BIOS made that possible.

    Did the same with a GIGABYTE GA-J1800N-D2H (dual core) two weeks later and benched them side by side.

    Main attraction was of course the fully passive cooling design and the main question was whether they would qualify as a credible desktop for office work or low power server.

    First off, both CPUs *always* work at their top speeds unless idle. So that's 2.41GHz for the J1900 and 2.58GHz for the J1800. The nominal speeds aren't ever used, and I guess their main reason for existence is that they make the chips look nicer in the Intel charts. And perhaps their predecessors were actually fixed-clocked at that value, and I guess you'd still get those if you disable turbo in the BIOS.

    Again, even running a Prime95/FurMark combo for hours won't get either of these CPUs to drop their speeds to nominal: turbo speeds aren't just for single-threaded loads.

    That mainly means that the normal clock difference between the J1900 and the J1800 isn't all that big, just 170MHz on the CPU, while the GPU on the J1900 is a notch above the J1800.

    That again means that the main difference between the two is the number of cores (2 vs. 4) and the amount of electricity they consume and turn into heat.

    It doesn't matter in terms of normal office applications or browsing: the J1800 typically came out ahead by its 170MHz clock advantage on things like Kraken or Octane, and both are fast enough at 1080p for most users. Yes, side by side with a top-notch 100W desktop CPU they are a tad slower, but nothing to lose hair over: again, not-an-Atom any more!

    I managed to get the J1800 to 6.3Watts at the power outlet (behind the 60Watt AC/DC and the 12V PicoPSU) with 8GB of LV DRAM, the Crucial SSD, video off on a 64-Bit Windows 8.1 idle desktop. The J1900 will take 3 Watts more (9.3) for the same setup, which seems to indicate that one half of the J1900 can't go to C7 if the other one is still more or less awake.

    There are of course also another Ethernet port and more USB 3.0 on the J1900, but none of them were used during the low-power tests.

    On the other end, a combined Prime95 and FurMark run results in 28 watts on the J1900 and 22 watts on the J1800. Core power consumption measured via CPUID's HWMonitor showed 2.29W for the J1900 cores and 2.4W for the J1800 cores, while package consumption was put at 6.85W for the J1900 vs. 6.54W for the J1800.

    This oddity was consistent and I can only explain it by HWMonitor only measuring one of the two CPU blocks on the J1900, but the full GPU block (and remainder of the SoC), which is clocked a little higher on the J1900 under load.

    The passive cooling solution on the Gigabyte J1900 board was not capable of dissipating all the heat generated by the Prime95/FurMark combination, which drew 28 watts at the socket. About 30 minutes into the test, at the threshold temp set in the BIOS (I used 90°C), the CPU started to throttle to 1.3GHz, and went back to 2.41GHz once the temperature had sunk sufficiently.

    The J1800 never reached or exceeded 50°C under the same load.

    That all points at the 10W TDP being bollocks, or only valid at nominal CPU clocks, but I'm not going to complain, because under any normal or reasonable load even the J1900 never throttles.

    I was most interested to compare the relative performance of Silvermont against the normal Intel architectures, and used a QX9100, a Core 2 mobile quad core at 2.26GHz, which the J1900 is basically replacing.

    For all ordinary CPU loads the Silvermont quad core reached around 80% of the performance of the 45nm QX9100 after adjusting for the clock speed difference (2.41 vs. 2.26GHz).

    That isn't bad at all for an "Atom" and clearly shows that the Silvermont architecture is quite capable, and an incredible value jump if you consider that the QX9100 alone was a four-digit-dollar CPU when it came out.
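    The clock-for-clock adjustment is simple division; the benchmark scores below are invented purely to show the calculation (only the two clock speeds come from the comparison above):

```python
# Clock-normalized performance comparison. Scores are hypothetical;
# 2.41GHz and 2.26GHz are the clocks of the two chips discussed above.
def per_clock_ratio(score_a, clock_ghz_a, score_b, clock_ghz_b):
    """Performance per GHz of system A relative to system B."""
    return (score_a / clock_ghz_a) / (score_b / clock_ghz_b)

ratio = per_clock_ratio(100, 2.41, 118, 2.26)
print(round(ratio, 2))  # -> 0.79, i.e. roughly 80% clock for clock
```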

    And it vastly exceeds the GPU performance and functionality of the GM45 chipset, even if it still doesn't qualify for gaming, except under Android-x86, where it kicks ass pretty well, even compared to my Nexus 10 or Galaxy Note 3.

    All in all an incredible value, which puts a little dent into the A10 Kaveri build I had finished just two weeks before.

    My *biggest gripe* about Silvermont so far is that Intel hasn't enabled QuickSync yet: it's a documented feature in ARK and one of the reasons I bought the J1900. I am assuming the VPU on Silvermont is just as capable as the one found on Haswells, and you can't get that speed and functionality cheaper anywhere (which may be precisely why Intel isn't enabling it).

    I could only just stop myself from also ordering this Kabini, which has become available over the last couple of days.
  • imeez - Friday, April 11, 2014 - link

    Interesting. I wonder, is there any chance to create something similar to the Raspberry Pi based on the AM1 platform? AM1 is not really for desktop usage. I don't know why they even started to name these as Athlon parts. The biggest problem with any x86 "for the masses" is the BIOS. Although coreboot does have initial support for the AMD Family 16h chips, the AMD guys are not yet ready to provide the VGA BIOS for free. But there is a bigger chance of AMD opening the flood gates than Intel. The last time I had a fully open x86-like platform was a ZX Spectrum clone :) Reply
  • Krautmaster - Friday, April 11, 2014 - link

    Well, 120 bucks for an Intel system for comparison?

    What about:

    -> an IB Celeron @ 2x1.8GHz and 17W TDP. It should be at least as fast as the 1.9GHz SB Celeron in the review, at a cheaper price point.

    or here

    a 2x1.8GHz Celeron with board and 3x SATA for
    $72.18 & FREE Shipping.
  • tomsworkshop - Friday, April 11, 2014 - link

    I think these chips are nice for a 12"-13" slim laptop, and I don't mind if they're soldered on board. It's about time they replaced the current E350 APU boards with Jaguar-core chips; maybe GDDR5 sideport memory onboard would be a nice feature. Reply
  • Hrel - Friday, April 11, 2014 - link

    I don't even know why they're bothering, these performance numbers are useless. Reply
  • Oscarcharliezulu - Saturday, April 12, 2014 - link

    PCs won't compete with tablets by being cheap; they compete by being awesome - 4K screens, fast SSDs, great 3D graphics, cloud storage so you can work anywhere, great games at ultra HD... Reply
  • hangfirew8 - Monday, April 14, 2014 - link

    ...and a full-sized keyboard and mouse. Reply
  • Antronman - Monday, April 21, 2014 - link

    Microsoft solved that problem.
    Heck, some of the Surface tablets have serious CPU computing power.
    PCs compete by having better price/performance and, yes, by being generally better.

    I'd like to see UE4 on an iPad air :P
  • plonk420 - Sunday, April 13, 2014 - link

    On my 19 watt E-350 (microATX, also see ), I removed the tiny, possibly whiny fan and secured a PWM 120mm Arctic Cooling fan over it. That, with a PC Power and Cooling Silencer, was so quiet I could hear my fridge and other PCs around the apartment. Reply
  • fteoath64 - Sunday, April 13, 2014 - link

    How could a dual-core chip clocked at less than 1.5GHz, with a GPU at 400MHz, have a TDP of 25 watts?! Even an ARM chip clocked at 1.9GHz has a TDP of about 5 watts. How can x86 really compete in terms of price, performance and power? These are throwaway chips with such power consumption levels. Reply
  • 0ldman79 - Sunday, April 13, 2014 - link

    Any way we could get some FX processors in the benches?

    I know that AM3+ is probably a dead line, but we've still got these machines out in the world. I'd like to know where my PC lines up without having to piece together separate comparisons of an FX-6300 to a Core i5 to an A8.
  • duploxxx - Monday, April 14, 2014 - link

    And yet again we find that the conclusion of the review focuses on CPU performance only, comparing against totally different types of CPU to make this Kabini look bad, especially in single-threaded performance, while it is on par with Intel's latest generation there and way better in multithreaded.

    Not a single word in the conclusions on graphics, which again shows how bad all these Intel parts really are... even the latest generations.

    I often wonder about reviewers' backgrounds, how neutral they actually are, and from what perspective they look at a certain system. Sure, my Ferrari will run faster than a Fiat, but look at the purpose and design...
  • TheinsanegamerN - Monday, April 14, 2014 - link

    Why doesn't anyone make a mini-ITX board that uses the laptop Fusion APUs, like the A10-5750M? Those only draw 10-15W more power, but have dual-channel memory controllers, are much more powerful overall, and are still pretty cheap. The Kabinis are just too slow, but the high-end Fusions are too expensive now. Reply
  • tyaty1 - Thursday, May 01, 2014 - link

    You can undervolt/underclock a 6600K just as well.
  • sheh - Friday, April 18, 2014 - link

    No noise, power, temperature figures? Reply
  • unseen1980 - Wednesday, July 16, 2014 - link

    My overclocked 5350 is not performing that badly Reply
  • TheFriz - Monday, October 19, 2015 - link

    I had an eMachines PC whose MB died. At first I was going to install an older P4 single-core MB/CPU, but the performance just wasn't enough for the online game my customer just had to play. Fry's had an MSI AM1I MB on closeout and an AMD 5350 on special, so for under $40 I had a replacement for the eMachines. Win 7 worked noticeably better, and the video was plenty fast. Plenty of USB ports for the printer & scanner. The MB had PS/2 ports, so we didn't have to throw away the KB & mouse. The MB didn't have any PCI slots, just a PCI Express slot for a video card, which I tried, but the onboard video was just as good as the 256MB card I had. The MB dropped in without a hitch.
    Only 2 SATA ports, which made cloning the hard drive a nuisance. Thank goodness for USB DVD-RWs.
    Just for grins, I tried Mint and Puppy Linux. It was really snappy, and the online game played on Mint. Win 10 probably would have worked, but I quit while I was ahead.
  • LikeClockwork64 - Monday, June 05, 2017 - link

    The problem with this review is the modern content you are trying to run. This processor can wreck old-school content. A little $150 PC with this processor could do things that a $1000 PC from the mid-2000s would struggle with. Reply
