Determining the TDP of Exynos 5 Dual

Throughout all of our Cortex A15 testing we kept bumping into that 4W ceiling with both the CPU and GPU - but we rarely saw both blocks use that much power at the same time. Intel actually tipped me off to this test to find out what happens if we try to force both the CPU and GPU to run at max performance at the same time. The graph below is divided into five distinct sections, denoted by colored bars above them. On this chart I have individual lines for GPU power consumption (green), CPU power consumption (blue) and total platform power consumption, including display, measured at the battery (red).

In the first section (yellow), we begin playing Modern Combat 3 - a GPU-intensive first-person shooter. GPU power consumption is just shy of 4W, while CPU power consumption remains below 1W. After about a minute of play we switch away from MC3 and you can see both CPU and GPU power consumption drop considerably. In the next section (orange), we fire up a multithreaded instance of CoreMark - a small CPU benchmark - and allow it to loop indefinitely. CPU power draw peaks at just over 4W, while GPU power consumption is understandably very low.

Next, while CoreMark is still running on both cores, we switch back to Modern Combat 3 (pink section of the graph). GPU voltage ramps way up and power consumption hits around 4W, but note what happens to CPU power consumption. The CPU cores step down to a much lower voltage/frequency for the background task (~800MHz, down from 1.7GHz). Total SoC power jumps above 4W, but the power controller quickly responds by reducing CPU voltage/frequency to keep things under control at ~4W. To confirm that CoreMark is still running, we then switch back to the benchmark (blue segment) and CPU performance ramps up as GPU performance winds down. Finally we switch back to MC3; combined CPU + GPU power sits around 8W for a short period of time before the CPU is throttled.

Now this is a fairly contrived scenario, but it's necessary to understand the behavior of the Exynos 5250. The SoC is allowed to reach 8W, making that its max TDP by conventional definitions, but it seems to strive for around 4W as its typical power under load. Why are these two numbers important? With Haswell, Intel has demonstrated interest (and ability) to deliver a part with an 8W TDP. In practice, Intel would need to deliver about half that to really fit into a device like the Nexus 10, but all of a sudden that seems a lot more feasible. Samsung hits 4W by throttling its CPU cores when both the CPU and GPU subsystems are being taxed; I wonder what an 8W Haswell would look like in a similar situation...
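The throttling behavior described above - stepping the CPU down through its voltage/frequency states whenever combined CPU + GPU draw would exceed the sustained budget - can be sketched as a simple power-budget governor. All the state tables and wattage figures below are rough illustrative assumptions, not Samsung's actual DVFS tables or controller logic:

```python
# Illustrative sketch of a power-budget governor resembling the behavior
# observed on the Exynos 5250: when combined CPU + GPU power would exceed
# the ~4W sustained budget, the CPU is stepped down to a lower
# frequency/voltage state. Numbers are assumptions for illustration only.

CPU_STATES = [  # (frequency in MHz, approximate full-load power in watts)
    (1700, 4.0),
    (1400, 2.6),
    (1100, 1.7),
    (800, 1.0),
]

SUSTAINED_BUDGET_W = 4.0  # the ~4W typical-load target seen in the graph

def throttle_cpu(gpu_power_w):
    """Pick the highest CPU state whose power fits the remaining budget."""
    for freq_mhz, cpu_power_w in CPU_STATES:
        if gpu_power_w + cpu_power_w <= SUSTAINED_BUDGET_W:
            return freq_mhz, cpu_power_w
    # Even the lowest state busts the budget; run at the frequency floor.
    return CPU_STATES[-1]

# CPU-only load (CoreMark): the full 1.7GHz state fits under the budget.
print(throttle_cpu(gpu_power_w=0.0))   # -> (1700, 4.0)
# GPU pulling ~3.9W (MC3): CPU drops to its 800MHz floor, as in the graph.
print(throttle_cpu(gpu_power_w=3.9))   # -> (800, 1.0)
```

The brief 8W excursion before throttling kicks in would correspond to the short interval before a controller like this reacts; real governors also average power over a thermal time constant rather than reacting instantaneously.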


140 Comments


  • extide - Friday, January 4, 2013 - link

    When will you post an article about Bay Trail / Valley View?? Usually you guys are pretty fast to post stuff about topics like this yet I have seen some info on other sites already...
  • jpcy - Friday, January 4, 2013 - link

    ...which I bet CISC users thought had ended about 18 years ago...

    It's good to see a resurgence of this highly useful, extremely low-power and very hardy British CPU platform.

    I remember back in the day when ARMs were used in the Acorn computers (possibly too long ago for most to remember now - I still have an A7000 and a RISC PC with both a StrongARM and a DX2-66, lol). ARM was at war with Intel's Pentium CPU range and AMD's K6s, boasting an almost 1:1 ratio of MIPS:MHz - horsepower for your money (something Intel and AMD were severely lacking in, if I remember correctly.)

    And now, well, who'd have thought it... These ARM CPUs are now in nearly everything we use... Phones, smartphones, tablets, notebooks...

    Suppose I was right in the argument with my mate in school after all... RISC, superior technology (IMHO), may well take over yet!
  • nofumble62 - Friday, January 4, 2013 - link

    No performance advantage, no battery life advantage. Why would anyone bother with incompatible software?
  • sseemaku - Friday, January 4, 2013 - link

    Looks like people have changed religion from AMD to ARM. That's what I see from some comments.
  • mugiebahar - Saturday, January 5, 2013 - link

    Yeah and no. They wanted no paid opinions to skew the outcome. But Intel hype won over real life.

    Is Intel better, and will it get better? Yes.
    Any chance they will compete on performance and PRICE, with legacy support for phone apps? Not in the near future, which is the only time frame that matters for them.
  • tuxRoller - Saturday, January 5, 2013 - link

    Also, any chance for an actual performance comparison between the platforms?
    Apple's performance and power use look awesome. Better than I had imagined.
    I'd love to see how they compare on the same tests, however.
  • Kogies - Saturday, January 5, 2013 - link

    It appears the war has begun, well two wars in fact. The one you have articulately described, and the oft ensuing war-of-words...

    Thanks Anand, I appreciate the analysis you have given. It is excellent to see the level of granularity you have been able to achieve with your balance of art and science, and knowing where to hook into! I am very interested to see how the L2 cache power draw affects the comparison, just a little jitter in my mind. If nothing else, it looks as if the delicate balance of process tech and desired performance/power may have a greater bearing on this "war" than mere ISA.

    With Krait 300, Haswell, and more A15's this is going to be a tremendous year. Keep up the good work.
  • Torrijos - Saturday, January 5, 2013 - link

    Any chance we could see the same tests run on the latest Apple iPad?
    That way we could have a chance to see what Apple tried to improve compared to the A15 generation.
  • urielshun - Saturday, January 5, 2013 - link

    The whole discussion about ARM and x86 is not important when you look at the economics of each platform. ARM is dirt cheap and works well. It's 1/10th of the price of any current Atom with decent performance (talking about the RK3066).

    Don't underestimate the Chinese: they are having a field day with ARM's pricing model and have shown amazing chips.

    In 8 years from now all SoCs will have reached usable performance, and the only things that will matter will be power and cost of integration.
  • iwod - Saturday, January 5, 2013 - link

    Where are you getting 1/10th of the price from? Unless they're produced on a good old 40nm LP node with nothing else (or crap) included, there just aren't any Chinese SoCs selling for $4.
