CPU Performance

The big news with Tegra 3 is that you get four ARM Cortex A9 cores with NEON support instead of just two (sans NEON) in the case of Tegra 2 or most other smartphone-class SoCs. In the short time I had to test the tablet I couldn't draw many definitive conclusions, but I did come away with some observations.

Linpack showed us healthy gains over Tegra 2 thanks to full NEON support in Tegra 3:

Linpack - Single-threaded

Linpack - Multi-threaded
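Linpack's core workload is solving a dense system of linear equations via Gaussian elimination, and the score is derived from the floating-point operation count divided by the time taken. A minimal pure-Python sketch of what the benchmark times (illustrative only; the Android benchmark's kernel, problem sizes, and scoring differ):

```python
import random
import time

def solve(a, b):
    """Gaussian elimination with partial pivoting (modifies a and b in place)."""
    n = len(b)
    for k in range(n):
        # Pivot: swap in the row with the largest entry in column k.
        p = max(range(k, n), key=lambda i: abs(a[i][k]))
        a[k], a[p] = a[p], a[k]
        b[k], b[p] = b[p], b[k]
        for i in range(k + 1, n):
            m = a[i][k] / a[k][k]
            for j in range(k, n):
                a[i][j] -= m * a[k][j]
            b[i] -= m * b[k]
    # Back substitution.
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        s = sum(a[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (b[i] - s) / a[i][i]
    return x

def linpack_mflops(n=100):
    """Time a dense n x n solve; roughly 2/3*n^3 + 2*n^2 floating-point ops."""
    random.seed(0)
    a = [[random.uniform(-1, 1) for _ in range(n)] for _ in range(n)]
    b = [random.uniform(-1, 1) for _ in range(n)]
    start = time.perf_counter()
    solve(a, b)
    elapsed = time.perf_counter() - start
    return ((2 / 3) * n**3 + 2 * n**2) / elapsed / 1e6

print(f"{linpack_mflops():.1f} MFLOPS")
```

The inner elimination loop is exactly the kind of regular multiply-accumulate work that a SIMD unit chews through, which is why NEON support gives Tegra 3 such a clear edge over the NEON-less Tegra 2 here.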

As expected, finding applications and usage models that task all four cores is pretty difficult. That being said, it's not hard to use the tablet in a way that stresses more than two cores. You won't see 100% CPU utilization across all four, but there is a tangible benefit to having more than two. Whether or not that benefit is worth the cost in die area is largely irrelevant to the buyer; the extra silicon only means NVIDIA (and/or its partners) have to pay more, as the price of the end product to you is already pretty much capped.
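This kind of scaling is easy to measure with any embarrassingly parallel, CPU-bound workload; a quick sketch (the task and sizes below are arbitrary placeholders, not a real benchmark):

```python
import math
import time
from multiprocessing import Pool

def burn(n: int) -> float:
    """A small CPU-bound task: sum of square roots."""
    return sum(math.sqrt(i) for i in range(n))

def scaling(workers: int, tasks: int = 8, n: int = 200_000) -> float:
    """Wall-clock seconds to finish `tasks` CPU-bound jobs on `workers` processes."""
    start = time.perf_counter()
    with Pool(workers) as pool:
        pool.map(burn, [n] * tasks)
    return time.perf_counter() - start

if __name__ == "__main__":
    # On a quad-core part, the jump from 1 to 2 workers usually helps the most;
    # 2 to 4 still helps, but rarely shows a clean 2x.
    for w in (1, 2, 4):
        print(f"{w} worker(s): {scaling(w):.2f}s")
```

Real tablet workloads are rarely this uniform, which is why four cores show up as smoother multitasking rather than as 100% utilization on every core.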

SunSpider JavaScript Benchmark 0.9.1

Rightware BrowserMark

The bigger benefit I saw to having four cores vs. two is that you're pretty much never CPU limited in anything you do while multitasking. Per-core performance can always go up, but I found myself bound either by the broken WiFi or by NAND speed. In fact, the only thing that would bring the Prime to a halt was a lot of writing to NAND over USB. Keyboard and touch interrupts were a low priority at that point, something I hope to see addressed as we are finally entering an era of performance good enough to bring on some I/O-crushing multitasking workloads.

Despite the many cores at its disposal, Tegra 3 appears to err on the side of caution when it comes to power consumption. While I often saw the third and fourth cores fire up when browsing the web or just using the tablet, NVIDIA did a good job of powering them down when their help wasn't needed. Furthermore, NVIDIA also seems to prefer running more cores at lower voltage/frequency settings over fewer cores at a higher point on the v/f curve. This makes sense given the non-linear relationship between voltage and power.
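That preference falls out of the standard dynamic-power approximation for CMOS, P ≈ C·V²·f: because voltage enters squared, spreading work across more cores at a lower voltage/frequency point wins. A quick sketch with made-up numbers (these are not Tegra 3's actual operating points or capacitances):

```python
def dynamic_power(c: float, v: float, f: float) -> float:
    """Dynamic CMOS switching power: P = C * V^2 * f (farads, volts, hertz)."""
    return c * v**2 * f

# Illustrative operating points only, NOT real Tegra 3 figures.
C = 1e-9  # effective switched capacitance per core, farads

# Four cores at a low voltage/frequency point...
four_slow = 4 * dynamic_power(C, 0.9, 500e6)
# ...versus two cores at a higher point delivering the same aggregate clock.
two_fast = 2 * dynamic_power(C, 1.2, 1000e6)

# The four slow cores draw less power for the same total throughput,
# purely because of the V^2 term.
print(f"4 slow cores: {four_slow:.2f} W, 2 fast cores: {two_fast:.2f} W")
```

The same logic, run in reverse, is why racing a single core up the v/f curve is the expensive way to buy performance.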

From a die area perspective I'm not entirely sure having four (technically, five) A9 cores is the best way to deliver high performance, but without a new microprocessor architecture it's surely more efficient than just ratcheting up clock speed. I plan on providing a more thorough look at Tegra 3 SoC performance as I spend more time with a fixed Prime, but my initial impressions are that the CPU performance isn't really holding the platform back.

204 Comments

  • ATOmega - Tuesday, December 06, 2011 - link

    Exactly my reasoning, and it's easy enough to see that the Transformer Prime boasts better features like GPS, camera and display.

    Having owned an iPad 2, there must be a reason I want this thing, and it's all the features Apple decided to fleece me on so that they could jack up their margins.

    I never liked Apple in the first place, but it's obvious they don't want to truly compete with the Android OS.
    Reply
  • sigmatau - Thursday, December 01, 2011 - link

    I was really hoping NVIDIA would have something special with Kal-El. This is pretty horrible when compared to hardware that has been available for about a year.

    It seems that the NVIDIA-equipped tablets should be targeted at a much lower price point. They definitely should not be among the top-tier tablets. I can see them selling at $300 or less.
    Reply
  • UpSpin - Thursday, December 01, 2011 - link

    Don't forget the display brightness. I don't think the processor makes such a huge difference in power consumption, but the display on the Prime is much, much brighter than the iPad's, and that will consume a huge amount of power. It's also brighter than the first-gen Transformer's display, and yet the Prime has a quad core and gets the same or better battery life than the first gen; I think that's really a big improvement.
    And if you want to use a tablet on the go, outside the house, you really need the brightest display possible. As you can see in the picture, the difference in sunlight visibility between the iPad and the Prime is like night and day. So I, at least, will accept three hours less battery life.
    Reply
  • name99 - Thursday, December 01, 2011 - link

    The display surely only uses more power if you drive it at a brighter level?

    And a well-designed tablet should have a light sensor, and should do a good job of auto-calibrating the brightness to the environment, so that most of the time it is NOT running the screen in bright mode. After all, that's what we expect regarding the core --- we throttle CPU when it's not needed. So, sorry, I don't think this is an acceptable answer.

    I continue to state my original thesis --- I suspect that DRAM power is substantially more important than most people believe, and that one of Apple's advantages is that they ship iOS devices with minimal DRAM. This is obviously a hassle for developers, and even for some power users, but that's the tradeoff one has to make.

    (Also what's the story with Android and VM? iOS does NOT do any "write" swapping --- code is paged in, but data is not paged out, and I expect that this is a power issue, nothing else --- Apple doesn't want the power hit of swapping.
    I thought Android was like this --- did not write pages --- but I have read stuff recently that said no, it is now using standard desktop type VM, which is likely also a power sink.)
    Reply
  • metafor - Thursday, December 01, 2011 - link

    NVIDIA's solution is actually much more sophisticated than that. It's similar to a method Intel started using a while back for laptops.

    It's not just auto-dimming the entire screen; it's auto-dimming every pixel individually. There is fine-grained control over the LED backlight of the display. Areas of each frame that contain black will not only have the LCD crystal for those pixels in its blocking state, but will also have the backlight for those specific pixels dimmed.

    This actually produces benefits beyond battery life; one of the biggest problems with LCDs is that blacks aren't really black, because the crystal isn't able to block 100% of the backlight.

    By dimming the backlight, the parts of an image that are supposed to be black will be closer to true black. This improves contrast and makes for more accurate pictures.
    Reply
  • UpSpin - Friday, December 02, 2011 - link

    How would this work if the backlight is an LED array at the edge, as it is in almost all smaller (<20") displays? With an edge-lit backlight you can't reduce LED brightness zone by zone.

    It's different if the panel has a full array of LEDs, but that's expensive, more power-hungry and thicker, so a no-go on a mobile device this thin. And neither Intel nor NVIDIA has any influence on this, because it requires a different panel.
    Reply
  • TechAnandUser - Friday, December 02, 2011 - link

    Benchmarking apps are not yet optimized, so please wait!
    Reply
  • fteoath64 - Friday, December 02, 2011 - link

    Why would you want a Win8 tablet? Its slow, heavy, short battery-life and probably cost 50% more! You are better off with a mid-range slim laptop, which you probably already have. So unless you grab one of these or already have an iPad, you have no idea what a tablet can do for you.
    Reply
  • eddman - Friday, December 02, 2011 - link

    You might have a time machine then, because right now there are no Win8 tablets, let alone ARM-based ones.
    Reply
  • tipoo - Thursday, December 15, 2011 - link

    "Its slow, heavy, short battery-life and probably cost 50% more"

    Citation needed, citation needed, citation needed, and citation needed, respectively. None of us has tested finalized W8 tablet hardware yet, unless you are from the future.
    Reply
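The local-dimming scheme debated in the comments above is simple to model: split the frame into backlight zones, drive each zone's LED only as hard as its brightest pixel requires, then boost the LCD pixel values to compensate. A toy one-dimensional sketch (real controllers use 2-D zones, temporal filtering, and halo suppression; all names here are illustrative):

```python
def local_dimming(frame, zone_size=2):
    """Per-zone backlight dimming for a 1-D grayscale frame (values 0.0-1.0).

    Returns (backlight, compensated): backlight[z] is each zone's LED level,
    and compensated holds the boosted LCD pixel values, so that
    backlight_level * lcd_value reproduces the original pixel.
    """
    backlight = []
    compensated = []
    for i in range(0, len(frame), zone_size):
        zone = frame[i:i + zone_size]
        level = max(zone)  # LED only as bright as the zone's brightest pixel
        backlight.append(level)
        # Boost LCD transmittance so level * lcd == original pixel.
        compensated.extend(p / level if level > 0 else 0.0 for p in zone)
    return backlight, compensated

bl, lcd = local_dimming([0.0, 0.0, 0.5, 1.0])
print(bl)  # the all-black zone's LED is fully off; the bright zone stays at max
```

The power and contrast wins both come from the same place: zones that should be dark get their LEDs turned down or off, which is exactly what an edge-lit panel, with its single strip of LEDs, cannot do at fine granularity.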
