Determining the TDP of Exynos 5 Dual

Throughout all of our Cortex A15 testing we kept bumping into that 4W ceiling with both the CPU and GPU - but we rarely saw both blocks use that much power at the same time. Intel actually tipped me off to this test to find out what happens if we try to force both the CPU and GPU to run at max performance at the same time. The graph below is divided into five distinct sections, denoted by colored bars above each one. On this chart I have individual lines for GPU power consumption (green), CPU power consumption (blue) and total platform power consumption, including display, measured at the battery (red).

In the first section (yellow), we begin playing Modern Combat 3 - a GPU intensive first person shooter. GPU power consumption is just shy of 4W, while CPU power consumption remains below 1W. After about a minute of play we switch away from MC3 and you can see both CPU and GPU power consumption drop considerably. In the next section (orange), we fire up a multithreaded instance of CoreMark - a small CPU benchmark - and allow it to loop indefinitely. CPU power draw peaks at just over 4W, while GPU power consumption is understandably very low.
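The per-section numbers quoted above (peak and average power for each rail) fall out of a sampled power trace with very little work. The sketch below is purely illustrative - AnandTech's actual instrumentation and data are not public - but it shows the shape of the analysis, assuming a trace of per-second (time, CPU watts, GPU watts) samples:

```python
# Hypothetical reconstruction of the per-section analysis: given a trace of
# (seconds, cpu_watts, gpu_watts) samples, report peak and average power for
# each rail over a labeled section of the run. All names and data are
# illustrative, not the article's real measurements.

def section_stats(trace, start, end):
    """Peak and mean CPU/GPU power over the samples in [start, end)."""
    window = [(c, g) for t, c, g in trace if start <= t < end]
    cpu = [c for c, _ in window]
    gpu = [g for _, g in window]
    return {
        "cpu_peak_w": max(cpu), "cpu_avg_w": sum(cpu) / len(cpu),
        "gpu_peak_w": max(gpu), "gpu_avg_w": sum(gpu) / len(gpu),
    }

# Synthetic trace mimicking the first (GPU-bound) section: GPU just shy of
# 4W, CPU under 1W, sampled once per second for about a minute of play.
trace = [(t, 0.8, 3.9) for t in range(60)]
stats = section_stats(trace, 0, 60)
```

Running the same function over each colored section of the trace is all it takes to build the comparison described in the text.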

Next, while CoreMark is still running on both cores, we switch back to Modern Combat 3 (pink section of the graph). GPU voltage ramps way up, power consumption is around 4W, but note what happens to CPU power consumption. The CPU cores step down to a much lower voltage/frequency for the background task (~800MHz from 1.7GHz). Total SoC TDP jumps above 4W but the power controller quickly responds by reducing CPU voltage/frequency in order to keep things under control at ~4W. To confirm that CoreMark is still running, we then switch back to the benchmark (blue segment) and you see CPU performance ramps up as GPU performance winds down. Finally we switch back to MC3, combined CPU + GPU power is around 8W for a short period of time before the CPU is throttled.
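The behavior in the pink section - CPU cores stepping down to ~800MHz so that combined power stays near the ~4W target - is classic budget-based throttling. The following is a toy sketch of that idea, not Samsung's actual power controller; the frequency table and the linear power model are invented for illustration (real CPU power scales closer to V²·f than linearly with frequency):

```python
# Toy power-capping governor: pick the highest CPU operating point that keeps
# combined CPU+GPU power under a fixed budget. This is an illustrative model,
# NOT the Exynos 5250's real controller.

BUDGET_W = 4.0
FREQS_MHZ = [1700, 1400, 1100, 800]   # hypothetical CPU operating points

def cpu_power(freq_mhz):
    # Invented linear model: ~3.5W at 1.7GHz down to ~1.0W at 800MHz.
    return 1.0 + 2.5 * (freq_mhz - 800) / (1700 - 800)

def govern(gpu_w):
    """Highest CPU frequency whose power fits in the remaining budget."""
    for f in FREQS_MHZ:
        if cpu_power(f) + gpu_w <= BUDGET_W:
            return f
    # Floor: even the lowest operating point busts the budget, so clamp
    # there - this is the moment combined power briefly exceeds the target.
    return FREQS_MHZ[-1]
```

Under this model, `govern(0.5)` (light GPU load) returns the full 1700MHz, while `govern(3.9)` (GPU near 4W) clamps the CPU to 800MHz - the same qualitative behavior the graph shows when Modern Combat 3 pushes CoreMark into the background.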

Now this is a fairly contrived scenario, but it's necessary to understand the behavior of the Exynos 5250. The SoC is allowed to reach 8W, making that its max TDP by conventional definitions, but it seems to strive for around 4W as its typical power under load. Why are these two numbers important? With Haswell, Intel has demonstrated interest in (and the ability to) deliver a part with an 8W TDP. In practice, Intel would need to deliver about half that to really fit into a device like the Nexus 10, but all of a sudden it seems a lot more feasible. Samsung hits 4W by throttling its CPU cores when both the CPU and GPU subsystems are being taxed; I wonder what an 8W Haswell would look like in a similar situation...



Comments

  • powerarmour - Friday, January 04, 2013 - link

    "Intel doesn't want to create a chip that cuts into it's very profitable mainstream CPU market."

    Indeed, they've left Cedar Trail to fester and die by totally withdrawing driver support.

    Quite a lot of desktop Atom hardware is still on the market, and they are trying their best to kill it off.
  • djgandy - Friday, January 04, 2013 - link

    All that says to me is that they don't care about Win7, i.e. non-tablets.
  • Krysto - Friday, January 04, 2013 - link

    Cortex A15 coupled with Cortex A7 will use half the power on average. Also, I told you before that the Mali T604 is more efficient than the PowerVR in the latest iPads, and that's how Apple managed to use a more powerful GPU even though it's less efficient. They sacrificed energy efficiency for performance, because they can use a very large battery in the iPad.

    I see you're trying hard to "prove something" about Intel lately, and I'm not sure why. Is Intel your biggest "client"? Do they pay you for reviews here? Is that why you're trying so hard to make them look good?

    You're also always making unbelievable claims about what Intel chips will do in the future. Even if they get Haswell to 8W (is that for the CPU only? The whole SoC? Is it peak TDP? Will it still need fans?), you do realize a Haswell chip costs as much as the whole BOM of an iPhone 5, right? Haswell chips will never arrive in smartphones, or in tablets that are competitive on price.
  • Tetracycloide - Friday, January 04, 2013 - link

    You're always making "unbelievable" claims about what corruption does here. Do you have anything that would back up your allegations to a normal person, or do you just view any excitement about future possibilities as some kind of damning evidence that the writer must be on the take? It's like you think everyone that doesn't share your opinion of Intel is paid to have that opinion or something.
  • trivik12 - Friday, January 04, 2013 - link

    Haswell ULV is an SoC, so the platform TDP will be < 8W. Like it or not, Intel has the best process technology, and ultimately they will produce a platform which is faster and lower TDP.

    That being said, ARM will dominate the smartphone market and even the majority of low-end laptops. I see Intel existing only in mid-to-high-end smartphones plus tablets > $500.

    I am personally waiting for a Broadwell-based tablet, which should hopefully cut power even more on the 14nm process.
  • djgandy - Friday, January 04, 2013 - link

    You'd hope two brand-new technologies would be better than two 3-4 year old ones, wouldn't you? Clearly you are blinded by your love for ARM in the same way many here are blinded by love for Nvidia and actually consider Tegra 3 a competitive SoC.

    I don't think many people would be astonished to find that the T604, an architecture only released a few months back, is more efficient than PowerVR Series 5, dating back to 2008.

    Why are people so shocked to find that Intel can make a low-power chip? It's not some kind of magic, it is a business goal. Power is a trade-off just like performance. On desktop systems, using more power is seen as an acceptable trade-off for a 40-50% performance gain.
  • mrdude - Friday, January 04, 2013 - link

    He's spot on about the pricing issue, though. Intel isn't going to start selling Haswell SoCs for $30, and if they do then they'll quickly go out of business. It's a completely different business model that they're trying to compete with. The Tegra 3 costs $15-$25 (and way closer to that $15 to date) while Intel charges $70+ for their CPU+GPU, and that's before you get to the chipset, WiFi and the rest. A low-TDP Haswell chip might offer great performance and fit in the same form factor (tablets), but if the tablet ends up costing $800+ and isn't Apple, well... nobody cares.

    It's not just a matter of performance but performance-per-dollar and design wins. Intel can't afford to drop prices to competitive levels on their Core products unless they can supplement it with very high volume. For very high volume you need to sell a lot of competitive SoCs that can do it all at a very reasonable price. The Tegra 3 was a big success not because it was an amazing performer, but because it offered decent performance for a very low price tag. Can Intel afford to do that with their cash cow business slipping? Remember that x86 is seeing drops in sales and PCs aren't exactly doing very well right now. Intel has already had to drop their margins, let fabs run idle and send home engineers at their 14nm fab in Ireland, all the while processor prices haven't decreased even a tiny bit. Those aren't signs of a company that's willing to compete on price.
  • Homeles - Friday, January 04, 2013 - link

    I'm more than willing to pay for the performance premium.
  • mrdude - Friday, January 04, 2013 - link

    While you may be willing to fork over that much cash, most people won't. If you don't believe me, check out the recent sales figures of Win8 devices. The Win8 tablets (excluding Surface RT) don't even make up 1% of all Win8 products sold. That's not poor, that's absolutely horrible. On the other hand, cheap Android tablets and smartphones have been gaining significant market share and outselling even the iPhone and iPads. Price matters. A lot. Furthermore, device makers/OEMs are more likely to go with the cheaper SoC if the experience is roughly equal. Remember that the majority of tablet and smartphone buyers don't browse AnandTech for benchmarks, but buy based on things like display quality or whether it's got a nice look (or brand name, in the case of Apple). If an OEM can take that $60 saved and put it towards a better display, a larger battery or more NAND, that means a lot more in differentiating yourself from the competition than being 10-15% faster in X benchmark.

    People forget that these are SoCs and not CPUs. They also forget that these aren't DIY computers but tablets. Think about how much people complain when they see a $900 Ultrabook with a crappy 1366x768 TN display but those same people don't utter a word about how Intel's ULVs cost the same as their 35W parts. If the Intel chip was cheaper you'd probably have a better display or a cheaper price tag. This same notion extends to tablets and smartphones.

    Qualcomm is in a place where they can offer something everybody wants; their LTE is second to none. What does Intel have to offer to warrant Intel prices? Currently Intel's chipsets cost as much as an entire Tegra 3 SoC. x86 PC/server and ARM SoCs are in a completely different universe when it comes to pricing, and unless you've got something special (see Qualcomm or Apple) or you're making and selling the device (Samsung), then you're going to have a very rough time of it.
  • jeffkro - Saturday, January 05, 2013 - link

    I paid $15 to "upgrade" my laptop and have since gone back to win 7. A lot of people simply don't want win 8 at any cost.
