Determining the TDP of Exynos 5 Dual

Throughout all of our Cortex A15 testing we kept bumping into that 4W ceiling with both the CPU and GPU - but we rarely saw both blocks draw that much power at the same time. Intel actually tipped me off to this test to find out what happens if we try to force both the CPU and GPU to run at max performance at the same time. The graph below is divided into five distinct sections, denoted by colored bars above them. On this chart I have individual lines for GPU power consumption (green), CPU power consumption (blue), and total platform power consumption, including display, measured at the battery (red).
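If you're curious what the data collection behind a chart like this looks like, here's a minimal Python sketch. The read_power_w() helper is a hypothetical stand-in: the real numbers come from instrumenting the board's power rails, not from anything exposed to software.

```python
import time

# The three traces on the chart: GPU rail, CPU rail, and total platform
# power measured at the battery (display included).
RAILS = ("gpu", "cpu", "battery")

def read_power_w(rail: str) -> float:
    # Hypothetical stand-in: in practice this would read an on-board
    # power monitor / sense resistor for the named rail, in watts.
    return 0.0

def sample_power(duration_s: float, interval_s: float = 0.1) -> dict:
    """Collect aligned power traces for each rail over duration_s seconds."""
    traces = {rail: [] for rail in RAILS}
    end = time.monotonic() + duration_s
    while time.monotonic() < end:
        for rail in RAILS:
            traces[rail].append(read_power_w(rail))
        time.sleep(interval_s)
    return traces
```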

In the first section (yellow), we begin playing Modern Combat 3 - a GPU-intensive first person shooter. GPU power consumption is just shy of 4W, while CPU power consumption remains below 1W. After about a minute of play we switch away from MC3, and you can see both CPU and GPU power consumption drop considerably. In the next section (orange), we fire up a multithreaded instance of CoreMark - a small CPU benchmark - and allow it to loop indefinitely. CPU power draw peaks at just over 4W, while GPU power consumption is understandably very low.

Next, while CoreMark is still running on both cores, we switch back to Modern Combat 3 (pink section of the graph). GPU voltage ramps way up and power consumption returns to around 4W, but note what happens to the CPU. The CPU cores step down to a much lower voltage/frequency for the background task (~800MHz, down from 1.7GHz). Total SoC power jumps above 4W, but the power controller quickly responds by reducing CPU voltage/frequency to keep things under control at ~4W. To confirm that CoreMark is still running, we then switch back to the benchmark (blue segment), and you can see CPU performance ramp back up as GPU performance winds down. Finally, we switch back to MC3; combined CPU + GPU power sits around 8W for a short period before the CPU is throttled.
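To make that throttling behavior concrete, here's a minimal sketch of a power-budget governor of the sort described above. The DVFS table, budget, and hysteresis margin are assumptions for illustration, not Samsung's actual controller logic:

```python
# Illustrative DVFS states (MHz), index 0 = fastest; an assumed table.
CPU_FREQS_MHZ = [1700, 1400, 1100, 800, 500]
BUDGET_W = 4.0  # the ~4W level the SoC appears to target under load

def govern(level: int, cpu_power_w: float, gpu_power_w: float) -> int:
    """One control-loop tick: return the new DVFS index for the CPU."""
    total = cpu_power_w + gpu_power_w
    if total > BUDGET_W and level < len(CPU_FREQS_MHZ) - 1:
        return level + 1  # over budget: step the CPU down one frequency bin
    if total < 0.9 * BUDGET_W and level > 0:
        return level - 1  # comfortable headroom: climb back toward max
    return level

# With the GPU pinned near 4W, repeated ticks walk the CPU down the table
# (CPU draw held constant here purely for simplicity):
level = 0
for _ in range(3):
    level = govern(level, cpu_power_w=3.0, gpu_power_w=3.9)
print(CPU_FREQS_MHZ[level])  # -> 800, roughly the step-down seen on the chart
```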

Now this is a fairly contrived scenario, but it's necessary to understand the behavior of the Exynos 5250. The SoC is allowed to reach 8W, making that its max TDP by conventional definitions, but it seems to strive for around 4W as its typical power under load. Why are these two numbers important? With Haswell, Intel has demonstrated the interest (and ability) to deliver a part with an 8W TDP. In practice, Intel would need to deliver about half that to really fit into a device like the Nexus 10, but all of a sudden it seems a lot more feasible. Samsung hits 4W by throttling its CPU cores when both the CPU and GPU subsystems are being taxed; I wonder what an 8W Haswell would look like in a similar situation...

Comments

  • metafor - Friday, January 4, 2013

    It matters to a degree. Look at the CPU power chart: the CPU is constantly being ramped from low to high frequencies and back.

    Tegra automatically switches the CPU to a low-leakage companion core below some frequency threshold (see the rough sketch after this comment). This helps in almost all situations except for workloads that constantly keep the CPU above that threshold - which, if you look at the graph, isn't the case here.

    That being said, it doesn't mean it'll be anywhere near enough to catch up to its Atom and Krait competitors.
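    A rough sketch of that switching rule in Python - the thresholds are made up for illustration, since the actual crossover point isn't public:

    ```python
    ENTER_COMPANION_MHZ = 500  # assumed: migrate to the low-leakage core below this
    EXIT_COMPANION_MHZ = 700   # assumed: higher exit point adds hysteresis

    def on_companion_core(demand_mhz: float, currently_on_companion: bool) -> bool:
        """True = run on the low-leakage companion core."""
        if currently_on_companion:
            return demand_mhz < EXIT_COMPANION_MHZ
        return demand_mhz < ENTER_COMPANION_MHZ
    ```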
  • jeffkro - Saturday, January 5, 2013

    The Tegra 3 is also not the most powerful ARM processor; Intel obviously chose it to make Atom look better.
  • npoe1 - Wednesday, January 9, 2013

    From one of Anand's articles: "NVIDIA recently revealed it was doing something similar to this with its upcoming Tegra 3 (Kal-El) SoC. NVIDIA outfitted its next-generation SoC with five CPU cores, although only a maximum of four are visible to the OS. If you’re running light tasks (background checking for email, SMS/MMS, twitter updates while your phone is locked) then a single low power Cortex A9 core services those needs while the higher performance A9s remain power gated. Request more of the OS (e.g. unlock your phone and load a webpage) and the low power A9 goes to sleep and the 4 high performance cores wake up."

    http://www.anandtech.com/show/4991/arms-cortex-a7-...
  • jeffkro - Saturday, January 5, 2013

    The A15 currently pulls too much power for a smartphone, but it makes for a great tablet chip and provides enough horsepower for basic laptops.
  • djgandy - Friday, January 4, 2013

    The most obvious thing here is that PowerVR graphics are far superior to Nvidia graphics.
  • Wolfpup - Friday, January 4, 2013

    Actually no, that isn't obvious at all. Tegra 3 is a two-year-old design on a process two generations old. The fact that it's still competitive today is just because it was so good to begin with. It'll be necessary to look at the performance and power usage of upcoming Nvidia chips on the same process to actually say anything "obvious" about them.
  • Death666Angel - Friday, January 4, 2013

    According to Wikipedia, the 545 is from January '10, so it's about 3 years old now. The only current-gen part here is the Mali. The 225 is just a 220 with a higher clock, so it's about 1.5 to 2 years old.
  • djgandy - Friday, January 4, 2013

    And a 4-5 year old Atom and the 2-3+ year old SGX545 aren't old designs?

    Look at Nvidia's power usage. It's way beyond what is acceptable for any SoC design. Phones from two years ago, on older processes, used far less power than the 40nm T3! Just look at the GLBenchmark battery life tests for the HTC One X and you'll see how poor the T3 GPU is. In fact, just take your Nvidia goggles off and re-read this whole article.
  • Wolfpup - Friday, January 4, 2013

    Atom's basic design is old; the manufacturing process is newer. Tegra 3 is by default at the biggest disadvantage here. You accuse me of bias when it appears you're actually the biased one.
  • Chloiber - Tuesday, January 8, 2013

    First of all, it's still 40nm.

    Second of all: you mentioned the battery benchmarks yourself. Go look at the Nexus 4 review and see how the international version of the One X fares. Battery life on the T3 One X is very good, considering it's built on 40nm (versus 28nm for the One XL) and uses four cores.
