Modifying a Krait Platform: More Complicated

Modifying the Dell XPS 10 is a little more difficult than modifying Acer's W510 or Microsoft's Surface RT. In both of those products there was only a single inductor in the path from the battery to the CPU block of the SoC. The XPS 10, however, uses a dual-core Qualcomm solution. Ever since Qualcomm started doing multi-core designs, it has opted to use independent frequency and voltage planes for each core. While all of the A9s in Tegra 3 and both of the Atom cores in the Z2760 run at the same frequency/voltage, each Krait core in the APQ8060A can run at its own voltage and frequency. As a result, two separate power delivery circuits are needed to feed the CPU cores. I've highlighted the two inductors Intel lifted in orange:

Each inductor was lifted and wired with a 20 mΩ resistor in series. The voltage drop across the 20 mΩ resistor was measured and used to calculate CPU core power consumption in real time. Unless otherwise stated, the graphs here represent the total power drawn by both CPU cores.
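The conversion from a measured shunt drop to core power is simple Ohm's law arithmetic. Here's a minimal sketch of that calculation; the rail voltage and drop values are illustrative numbers, not measurements from this article:

```python
# Estimate CPU core power from the voltage drop across a series shunt resistor.
SHUNT_OHMS = 0.020  # 20 mOhm resistor wired in series with the lifted inductor

def core_power_watts(rail_voltage, shunt_drop_volts, r_shunt=SHUNT_OHMS):
    """Current through the shunt (I = V_drop / R) times the rail voltage."""
    current = shunt_drop_volts / r_shunt
    return rail_voltage * current

# Example: a 1.0 V core rail with a 10 mV drop across the shunt
# implies 0.5 A of current, i.e. 0.5 W for that core.
power = core_power_watts(1.0, 0.010)
```

In a setup like this, the drop would be sampled continuously by data acquisition hardware, and the two per-core results summed to produce the total CPU power shown in the graphs.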

Unfortunately, that's not all that's necessary to accurately measure Qualcomm CPU power. If you remember back to our original Krait architecture article, you'll know that Qualcomm puts its L2 cache on a separate voltage and frequency plane. While the CPU cores in this case can run at up to 1.5GHz, the L2 cache tops out at 1.3GHz. I remembered this little fact late in the testing process, and we haven't yet found the power delivery circuit responsible for Krait's L2 cache. As a result, the CPU-specific numbers for Qualcomm exclude any power consumed by the L2 cache. The total platform power numbers do include it, however, as they are measured at the battery.

The larger inductor, highlighted in yellow, feeds the GPU; it's instrumented using another 20 mΩ resistor.

Visualizing Krait's Multiple Power/Frequency Domains

Qualcomm remains adamant about its asynchronous clocking with multiple voltage planes. The graph below shows power draw broken down by each core while running SunSpider:

SunSpider is a great benchmark to showcase exactly why Qualcomm has each core running on its own power/frequency plane. For a mixed workload like this, the second core isn't totally idle/power gated, but it isn't exactly super active either. If both cores were tied to the same voltage/frequency plane, the second core would have higher leakage current than it does here. The counterargument is that if you ran the second core at its max frequency as well, it would complete its task quicker and go to sleep, drawing little to no power. That second approach would require a very fast microcontroller to switch between voltage/frequency states, and it's unclear which of the two would offer better power savings. It's just nice to be able to visualize exactly why Qualcomm does what it does here.
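The trade-off described above can be made concrete with a toy energy model for the lightly loaded second core: run it slowly at low voltage for the whole interval, or race to sleep at max voltage/frequency and idle the rest of the time. All numbers below are made up for illustration, not data from this article:

```python
# Toy model contrasting the two DVFS strategies for a lightly loaded core.
# Energy = power * time; all figures are illustrative, not measured.

def energy_low_freq(active_power_w, duration_s):
    """Core runs slowly at a low voltage/frequency for the whole task."""
    return active_power_w * duration_s

def energy_race_to_sleep(active_power_w, duration_s, idle_power_w, window_s):
    """Core runs at max voltage/frequency, finishes early, then idles."""
    busy = active_power_w * duration_s
    idle = idle_power_w * (window_s - duration_s)
    return busy + idle

# Light thread: 100 mW at low V/f for the full 1.0 s window...
slow = energy_low_freq(0.100, 1.0)
# ...versus 400 mW at max V/f for 0.3 s, then 10 mW idle for the remainder.
fast = energy_race_to_sleep(0.400, 0.3, 0.010, 1.0)
```

With these particular numbers the slow-and-steady strategy wins (0.100 J vs. 0.127 J), but the outcome flips as idle power shrinks or the active-power gap narrows, which is exactly why it's unclear which approach saves more in general.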

On the other end of the spectrum, however, is a benchmark like Kraken, where both cores are fairly active and the workload is balanced across them:


Here there's no real benefit to having two independent voltage/frequency planes; both cores would be served fine by running at the same voltage and frequency. Qualcomm would argue that the Kraken case is rare (single threaded performance still dominates most of the user experience), and that the power savings in situations like SunSpider are what make asynchronous clocking worth it. This is a much bigger philosophical debate that would require far more than a couple of graphs to settle, and it's not one that I want to get into here. I suspect that, given its current power management architecture, Qualcomm picked the best solution possible for delivering the best possible power consumption. Managing multiple power/frequency domains takes extra effort, effort that I doubt Qualcomm would put in without seeing some benefit over the alternative. That being said, what works best for a Qualcomm SoC isn't necessarily what's best for a different architecture.

Comments

  • tahyk - Friday, January 4, 2013 - link

    I've read that review. It's a total scam: they're comparing the Atom N270, aka the original 2008 netbook chip, to the latest and greatest ARM. This article is about the Atom Z2760, so it compares 2012 x86 to 2012 ARM, which is a whole lot more fair.
  • mugiebahar - Friday, January 4, 2013 - link

    While I always enjoy reading here, I have to admit this article is not one of those times. I'm not slinging mud or anything, but I think it's highly subjective to consider, at this point, that Intel is in good standing to make inroads. I agree Intel has the technology/money/resources/willpower to make a killer chip that sips power better than anyone. But as so many have pointed out, and it cannot be changed, Intel doesn't have the ability to do it for a cheap, competitive price. ARM always has been and always will be better at that. It's all about how a company is built. ARM doesn't need to finance a foundry, much less several like Intel. With overhead and size come problems changing business models, especially in manufacturing. The thing is, while we don't yet know what the future holds as to the amount of things we will do on a phone, I can guarantee I won't be ripping a DVD or making CAD drawings on it. So fundamentally we will hit a wall where the cost is not worth the money. Am I wrong? I know what the article is pointing to, which is a strong, class-leading, watt-sipping Intel. But they cannot win or be as noteworthy as the article suggests. You can't ask a company to devalue their products. Why? Because you lose either way: 1) you look desperate, or 2) you acknowledge that you were ripping people off before. While they may have had legitimate reasons for their pricing, or technology brought prices down, it's perception that's the killer. Is Atom bad? No. But ask regular Joe and he'll tell you it's crap. Why? Price and perception. Intel did some things right and some wrong. They should have realized a while back that desktops were good enough and pushed mobile chips toward better, lower-cost production. But a company so heavy on top can't switch just like that. Now either they will have to restructure to be competitive in the mobile arena or just play second fiddle.
But they can't right now (unless they change) be a mobile king as they are on the desktop, because of company structure and nothing else.
  • jemima puddle-duck - Friday, January 4, 2013 - link

    I'd echo this sentiment. I'm getting less and less interested that so-and-so has made something slightly better than so-and-so. The chances are this Atom will never see the inside of more than 1 or 2 phones. I want to know why! This is the insight Anandtech, with its extensive contacts, can deliver! I guess what I'm asking for is more politics and less technology :-)
  • vngannxx - Friday, January 4, 2013 - link

    AnandTech should rerun the Nexus 10 benchmarks with the AOSP browser.
  • mugiebahar - Friday, January 4, 2013 - link

    That Intel is the 800 lb gorilla, there is no doubt. The problem is it's not stuck in a room with a monkey and a chimpanzee (AMD and VIA) this time; it's in a room with a lion (Apple), a tiger (say, Samsung), a Siberian tiger (Qualcomm), a mountain lion (TI), a baby cub (AMD), and a litter full of Chinese cats. So the gorilla is the strongest, but he might just lose the hair off his ass cheeks if he doesn't watch it. Please feel free to reorder the cats to better resemble whoever, as I just thought of it quickly, but I think you can agree. True?
  • UpSpin - Friday, January 4, 2013 - link

    Intel PR, nothing more.
    This article is misleading, confusing, and compares totally different things. It's a shame to see such a badly written article on AnandTech, full of meaningless, misleading graphs which just sit there without any further descriptive text. If an image isn't worth some text, it isn't worth showing at all!

    1. There's no use in showing, let alone comparing, total power consumption numbers, because the systems are totally different. So don't show them! Everything else is misleading, most probably on purpose, because the absolutely low-end Intel device logically looks good in this comparison. But please, if you can't compare things, don't try to compare them. And if you can't compare them, also don't build further numbers on them, like Task Energy Total Platform. It's useless.
    You can't measure Qualcomm chips correctly, so you include total platform power draw instead? Poor excuse. If you can't measure it, don't post it; don't post false and misleading numbers.
    2. What is Average Power Draw? What's the use of it? You don't use those graphs in your article at all! Do you know what that means? Exactly: those graphs are useless and meaningless. Why do you post them? They are redundant given the energy graphs. And naturally, because of the much shorter run time of the A15 SoC, the average graph looks disadvantageous for ARM, which is simply misleading. But well, Intel is probably happy you posted it and thanked you with cash; why else would a sane person post such misleading stuff?
    3. GPU Power: What game? How did it run? Off-screen? The same resolution on every tablet? The same API? The same FPS? It's not surprising that a low-end GPU struggling to hold maybe 10 FPS consumes less power than a high-end GPU rendering 60 FPS at a higher resolution. You haven't said anything about this issue, yet happily compare meaningless numbers.

    I'm sorry, but this article is, right now, garbage. And the only reason for posting such a poorly written article is that Intel must have paid you a lot of money to do so.

    It's nice that you post such semi-scientific articles, but the way you've done it in this case isn't great.
    This article is very, very hard to read, because the reader has to do ALL the interpretation.
    You could remove 2/3 of all the graphs and the article would contain the same information.
    By just looking at the graphs, Intel is the overall winner, which is wrong if you do some further comparisons based on your article. At most Intel is, according to your graphs, on par with the A15, CPU-wise, which is still a nice outcome for Intel.
    The GPU in the Intel SoC is awful, the CPU competitive.
    The A15 GPU is perfect, the CPU at least as efficient as the Intel one, but much faster.
    Yet, because the article is so confusing and I don't want to waste any further time doing the work a good writer should have done, I see ARM as the clear winner.
    Same or more efficient CPU, much faster CPU, much better GPU, overall winner!
    Similar argumentation for Tegra and Krait.
    Intel has a good CPU, but the SoC looks awful.
  • powerarmour - Friday, January 4, 2013 - link

    "I see ARM as the clear winner.
    Same or more efficient CPU, much faster CPU, much better GPU, overall winner!
    Similar argumentation for Tegra and Krait.
    Intel has a good CPU, but the SoC looks awful."

    That was exactly my conclusion reading through it; I just couldn't be bothered to be eloquent enough to explain it like that, as it seemed obvious to me.

    I look at the SoC as a whole, and apart from a 'slight' advantage on the CPU side in a few select (and likely x86 optimized) browser benchmarks, the Clover Trail SoC is really quite lacklustre.
  • mfergus - Friday, January 4, 2013 - link

    The GPU in the Clover Trail SoC isn't even made by Intel. They could swap it out for anything they wanted, though they want it to be an in-house GPU.
  • Cold Fussion - Saturday, January 5, 2013 - link

    I concur; the article is pretty bad as it stands. Apart from all the poorly presented information, it should have included tests on an Android tablet running the same Krait SoC as the Windows tablet, so we could establish how the different operating systems affect power draw. Without that, I don't see how they can reasonably establish the differences between the A15 and the others.
  • wsw1982 - Friday, January 11, 2013 - link

    http://www.phonearena.com/news/Intel-Atom-powered-...

    Check this out... The Clover Trail+ in the Lenovo K900 smartphone scores more than 25,000 in AnTuTu on a 1080p display, which just crushes the Snapdragon Pro (4 Krait cores) in the Optimus G and the beloved Samsung Exynos 5440 (2 A15s) in the Nexus 10...

    So, what's gonna be the next far cry from the ARMy: "Intel cannot make low-power chips"? Oh no, that's already busted. Then I guess it's gonna be "Intel cannot sell smartphone chips as cheap as others" or "We don't care about performance and we don't care about battery life, we just care about compatibility with iOS."
