IGP Gaming

Bioshock Infinite

Bioshock Infinite was Zero Punctuation’s Game of the Year for 2013; it uses Unreal Engine 3 and is designed to scale with both CPU cores and graphical prowess. We run the benchmark using the Adrenaline benchmark tool at the Xtreme (1920x1080, Maximum) setting, noting down the average and minimum frame rates.
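
The tool reports these numbers directly; purely as an illustration of how average and minimum frame rates fall out of a frame-time log, here is a minimal Python sketch. The file name and its format (one frame time in milliseconds per line) are assumptions, not the Adrenaline tool's actual output.

    # Minimal sketch: derive average and minimum FPS from a frame-time log.
    # The file name and its format (one frame time in milliseconds per line)
    # are assumptions for illustration, not the Adrenaline tool's real output.
    def fps_stats(path):
        with open(path) as f:
            frame_times_ms = [float(line) for line in f if line.strip()]
        # Time-weighted average: total frames over total elapsed time.
        avg_fps = len(frame_times_ms) * 1000.0 / sum(frame_times_ms)
        # Minimum FPS corresponds to the single slowest frame.
        min_fps = 1000.0 / max(frame_times_ms)
        return avg_fps, min_fps

    avg, minimum = fps_stats("bioshock_frametimes.log")
    print("Average: %.1f FPS, Minimum: %.1f FPS" % (avg, minimum))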

Bioshock Infinite: Performance [charts: average and minimum frame rates]

Tomb Raider

The next benchmark in our test is Tomb Raider, an AMD-optimized game lauded for its use of TressFX, which creates dynamic hair to increase in-game immersion. Tomb Raider uses a modified version of the Crystal Engine and enjoys raw horsepower. We test the benchmark using the Adrenaline benchmark tool at the Performance, Quality and Xtreme settings, noting down the average and minimum frame rates.

Tomb Raider: Performance [charts: average and minimum frame rates]

Sleeping Dogs

Sleeping Dogs has a highly complex benchmark that can bring the toughest setups down to single-digit frame rates at high resolutions. An extreme SSAO setting can do that, but at the right settings Sleeping Dogs is highly playable and enjoyable. We run the basic benchmark program laid out in the Adrenaline benchmark tool at the Xtreme (1920x1080, Maximum) setting, noting down the average and minimum frame rates.

Sleeping Dogs: Performance [charts: average and minimum frame rates]


126 Comments


  • azazel1024 - Thursday, April 10, 2014 - link

    Dear gods yes. Taking DC-DC from a laptop power brick would be awesome. Heck, the Intel systems especially are probably looking at 40-50W absolute max, even with a drive or two in there.

    Considering there are many 90W power bricks out there... it seems like it could take 12V in with no problems.

    For the commenter who posted the Tom's review on power numbers, who seems to be indicating little difference in power consumption... that 3.5W figure, looking at TOTAL PACKAGE power consumption, is greater than a 10% difference at idle, and the figures under load are more like a 30-40% difference in power consumption.

    Since I assume the board and other bits add a fair amount here, the CPU-only difference is looking like probably double the idle power consumption and triple the load power consumption for Kabini versus Bay Trail.
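
    As a rough sketch of that overhead argument (every wattage below is a hypothetical placeholder; only the ~3.5W delta comes from the discussion above):

        # Sketch of the overhead argument: subtract a fixed platform overhead and
        # a modest total-power gap becomes a large CPU-only gap. Every wattage
        # here is a hypothetical placeholder, not a measured value.
        overhead = 12.0                            # assumed board/PSU/drive idle draw (W)
        baytrail_total, kabini_total = 16.0, 19.5  # assumed idle totals, ~3.5W apart
        baytrail_cpu = baytrail_total - overhead   # 4.0W attributed to the CPU
        kabini_cpu = kabini_total - overhead       # 7.5W, roughly double
        print("CPU-only idle ratio: %.1fx" % (kabini_cpu / baytrail_cpu))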

    Also, the power consumption figures on both systems are crap. My full-up mATX G1610 uses less power at idle and under load than the Kabini system, and mine even uses less power than the Bay Trail one at idle. So either the boards and attached devices are total crap, or they are using one huge, power-inefficient PSU to test with. Heck, mine isn't even all that great: a 380W Antec EarthWatts, Bronze rated. Something like a 360W Seasonic Gold would probably drop my idle power under 20W from the 21W it is now, and load to under 45W.

    So my guess is a big, power-inefficient PSU at the least, and maybe just crappy component selection too.

    No matter what though, Kabini doesn't look good there, whether compared to Bay Trail or to an Intel "55W TDP" processor... which has oodles more CPU performance.
  • mikato - Friday, April 11, 2014 - link

    I agree, the PSU is really important here. A big PSU will be inefficient for a low-power system even if it's an 80 PLUS Gold or something.
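
    A toy model of why: conversion efficiency is a function of load fraction, and a low-power system parks an oversized PSU at the very bottom of its efficiency curve. The curve points below are illustrative guesses, not measured data:

        # Toy model: wall draw for a fixed DC load through PSUs of various sizes.
        # The efficiency-vs-load-fraction points are illustrative, not measured.
        CURVE = [(0.05, 0.60), (0.10, 0.75), (0.20, 0.85), (0.50, 0.90), (1.00, 0.87)]

        def efficiency(load_fraction):
            # Linear interpolation over the illustrative curve, clamped at the ends.
            if load_fraction <= CURVE[0][0]:
                return CURVE[0][1]
            for (x0, y0), (x1, y1) in zip(CURVE, CURVE[1:]):
                if load_fraction <= x1:
                    return y0 + (y1 - y0) * (load_fraction - x0) / (x1 - x0)
            return CURVE[-1][1]

        dc_load = 20.0  # watts the system actually draws on the DC side
        for psu_watts in (90, 380, 750):
            eff = efficiency(dc_load / psu_watts)
            print("%dW PSU: %.1fW at the wall" % (psu_watts, dc_load / eff))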
  • ruthan - Thursday, April 10, 2014 - link

    Simply a slug; this HW is already dead.

    25W is too much for a netbook or tablet, and for a NAS or HTPC a Core i3 is a much better choice. GPU performance is also worse than Intel's HD 4000.
  • azazel1024 - Thursday, April 10, 2014 - link

    I am tossing around the idea of a Core i3 for my next server, depending on exactly what Cherry/Willow Trail and Broadwell/Skylake might hold.

    The extra cost might be worth it for what might be significantly better performance at not significantly higher power consumption during "normal work", which is idle or streaming.

    Otherwise, it's probably back to a Broadwell/Skylake-based entry-level Celeron. The price and performance are hard to beat for the kind of basic server that I need.
  • Samus - Thursday, April 10, 2014 - link

    Wow, that's actually really fast for a 25W CPU. I mean, the A6-5200 is no slouch and it's right on par with it.
  • beesbees - Thursday, April 10, 2014 - link

    My good ole AMD Athlon dual core at 3GHz and an OC'd 7790 2GB with 4GB DDR2 plays all games maxed. I spent like $60 on that CPU on Newegg back in 2009! Who needs 4 cores? lol
  • HangFire - Friday, April 11, 2014 - link

    Moshi Monsters is great stuff, isn't it?
  • abufrejoval - Thursday, April 10, 2014 - link

    Here in Germany the J1900 became available before the J1800 at my favorite retailer: I got a GIGABYTE GA-J1900N-D3V (quad core) some weeks ago, put it into a mini-ITX case with a 90Watt PicoPSU (needed the 12V 4-pin connector) and a 60Watt 12V notebook power supply.

    Added a Crucial C300 for storage and went ahead testing with Windows 8.1 (the only thing that worked with the initial BIOS) and then with Win7, CentOS 6.5, Fedora 20 and Android-x86 once the new BIOS made that possible.

    Did the same with a GIGABYTE GA-J1800N-D2H (dual core) two weeks later and benched them side by side.

    The main attraction was of course the fully passive cooling design, and the main question was whether they would qualify as a credible desktop for office work or as a low-power server.

    First off, both CPUs *always* work at their top speeds unless idle. So that's 2.41GHz for the J1900 and 2.58GHz for the J1800. The nominal speeds aren't ever used, and I guess their main reason for existence is that they make the parts look nicer in the Intel charts. Perhaps their predecessors were actually clocked at that fixed value, and I guess you'd still get those speeds if you disabled turbo in the BIOS.

    Again, even running a Prime95/FurMark combo for hours won't get either of these CPUs to drop their speeds to nominal: turbo speeds aren't just for single-threaded loads.
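
    For anyone wanting to reproduce that clock observation on the Linux installs mentioned above, a small sketch that polls the kernel's standard cpufreq sysfs interface while the load runs (the sampling interval and count are arbitrary):

        # Sketch: sample per-core clocks via sysfs while Prime95/FurMark runs,
        # to check whether any core ever drops to the nominal frequency.
        # Assumes the standard Linux cpufreq sysfs layout.
        import glob, time

        paths = sorted(glob.glob(
            "/sys/devices/system/cpu/cpu[0-9]*/cpufreq/scaling_cur_freq"))
        for _ in range(10):                      # ten samples, one per second
            mhz = [int(open(p).read()) // 1000 for p in paths]
            print(" ".join("%dMHz" % m for m in mhz))
            time.sleep(1)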

    That mainly means the normal clock difference between the J1900 and the J1800 isn't all that big, just 170MHz on the CPU, while the GPU on the J1900 is a notch above the J1800's.

    That again means that the main difference between the two is the number of cores (2 vs. 4) and the amount of electricity they consume and turn into heat.

    It doesn't matter in terms of normal office applications or browsing: the J1800 typically came out ahead by its 170MHz on things like Kraken or Octane, and both are fast enough at 1080p for most users. Yes, side by side with a top-notch 100Watt desktop CPU they are a tad slower, but nothing to lose hair over: again, not-an-Atom any more!

    I managed to get the J1800 to 6.3Watts at the power outlet (behind the 60Watt AC/DC and the 12V PicoPSU) with 8GB of LV DRAM, the Crucial SSD, video off on a 64-Bit Windows 8.1 idle desktop. The J1900 will take 3 Watts more (9.3) for the same setup, which seems to indicate that one half of the J1900 can't go to C7 if the other one is still more or less awake.

    There is of course also another Ethernet port and more USB 3.0 on the J1900, but none of them were used during the low-power tests.

    On the other end, a combined Prime95 and FurMark run will result in 28Watts on the J1900 and 22Watts on the J1800. Core power consumption measured via CPUID's HWMonitor showed 2.29Watts for the J1900 cores and 2.4Watts for the J1800 cores, while the package consumption was put at 6.85Watts for the J1900 vs. 6.54Watts for the J1800.

    This oddity was consistent, and I can only explain it by HWMonitor measuring only one of the two CPU blocks on the J1900, but the full GPU block (and the remainder of the SoC), which is clocked a little higher on the J1900 under load.

    The passive cooling solution on the Gigabyte J1900 board was not capable of dissipating all the heat generated by the Prime95/FurMark combination, which drew 28Watts at the socket. About 30 minutes into the test, at the threshold temperature set in the BIOS (I used 90°C), the CPU started to throttle to 1.3GHz, going back to 2.41GHz once the temperature sank sufficiently.
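
    That throttle-and-recover cycling can be pictured with a crude lumped thermal model; the heat capacity, cooling coefficient and throttled power below are invented for illustration, and only the 90°C threshold, the 2.41/1.3GHz clocks and the ~28Watt load come from the observations above:

        # Crude lumped thermal model of threshold throttling: heat input depends
        # on the clock, cooling is proportional to the rise above ambient. The
        # heat capacity, cooling coefficient and throttled power are invented;
        # only the 90C threshold, the 2.41/1.3GHz clocks and the ~28W load match
        # the observations above.
        ambient = 25.0
        temp, clock = ambient, 2.41
        for minute in range(60):
            watts = 28.0 if clock == 2.41 else 14.0  # assumed package power per clock
            temp += (watts - 0.4 * (temp - ambient)) * 60.0 / 400.0
            if temp >= 90.0:
                clock = 1.3                          # throttle at the BIOS threshold
            elif clock == 1.3 and temp <= 80.0:
                clock = 2.41                         # recover once cooled down
            print("t=%2dmin  %5.1fC  %.2fGHz" % (minute, temp, clock))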

    The J1800 never reached or exceeded 50°C under the same load.

    That all points at the 10W TDP being bollocks, or only valid at nominal CPU clocks, but I'm not going to complain, because under any normal or reasonable load even the J1900 never throttles.

    I was most interested in comparing the relative performance of Silvermont against the normal Intel architectures, and used a QX9100, a Core2 mobile quad core at 2.26GHz, which the J1900 is basically replacing.

    For all ordinary CPU loads the Silvermont quad core reached around 80% of the performance of the 45nm QX9100 after adjusting for the clock speed difference (2.41 vs. 2.26GHz).
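
    The clock adjustment is simple arithmetic; as a sanity check (the raw ratio below is back-derived from the ~80% figure above, not an extra measurement):

        # Clock-normalized comparison: if Silvermont delivers ~80% of the QX9100's
        # performance per clock (the figure claimed above), the raw score ratio at
        # stock clocks follows directly; the rest here is derived, not measured.
        j1900_clock, qx9100_clock = 2.41, 2.26
        per_clock_ratio = 0.80
        raw_ratio = per_clock_ratio * (j1900_clock / qx9100_clock)
        print("Raw score ratio at stock clocks: %.2f" % raw_ratio)  # ~0.85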

    That isn't bad at all for an "Atom", and it clearly shows the Silvermont architecture is no slouch; it's also an incredible value jump if you consider that the QX9100 alone was a four-digit-dollar CPU when it came out.

    And it vastly exceeds the GPU performance and functionality of the GM45 chipset, even if it still doesn't qualify for gaming, except under Android-x86, where it kicks ass pretty well, even compared to my Nexus 10 or Galaxy Note 3.

    All in all an incredible value, which puts a little dent into the A10 Kaveri I had built just two weeks before.

    My *biggest gripe* about Silvermont so far is that Intel hasn't enabled QuickSync yet: it's a documented feature in ARK and one of the reasons I bought the J1900. I am assuming the VPU on Silvermont is just as capable as the one found on Haswell, and you can't get that speed and functionality cheaper anywhere (which may be precisely why Intel isn't enabling it).

    I could only just stop myself from also ordering this Kabini, which has likewise become available over the last couple of days.
  • imeez - Friday, April 11, 2014 - link

    Interesting. I wonder if there is any chance to create something similar to the Raspberry Pi based on the AM1 platform? AM1 is not really for desktop usage; I don't even know why they started naming these as Athlon parts. The biggest problem with any x86 "for the masses" is the BIOS. Although coreboot does have initial support for the AMD Family 16h, the AMD guys are not yet ready to provide the VGA BIOS for free. But there is a bigger chance of AMD opening the flood gates than Intel. The last time I had a fully open x86-like platform was a ZX Spectrum clone :)
  • Krautmaster - Friday, April 11, 2014 - link

    Well, 120 bucks for an Intel system for comparison?

    What about:
    biostar-nm70i-1037u

    -> IB Celeron @ 2x1.8GHz and 17W TDP. Should be at least as fast as the 1.9GHz SB Celeron in the review, at a cheaper price point.

    or here

    http://www.amazon.com/ECS-Elitegroup-NM70-I-Proces...

    A 2x1.8GHz Celeron with board and 3x SATA for $72.18 & FREE shipping.
