Tegra 3 GPU: Making Honeycomb Buttery Smooth

The bigger impact on the overall experience comes from Tegra 3's GPU. If you remember our initial analysis of Tegra 3, you'll know that the GPU is not only clocked higher but also has more execution resources at its disposal. To further improve performance, per-"core" efficiency is up thanks to larger internal data structures and other tweaks. The end result is much better gaming performance as well as a much smoother UI.

Tasks like bringing up the app launcher or even swiping between home screens finally run well above 30 fps. While Tegra 2 didn't have the fill rate to deal with some of the more complex overlays in Honeycomb, Tegra 3 does. The move to Tegra 3 makes the Honeycomb experience so much better. This is what it should've been like from the start.

Gaming performance is also significantly better as you can see from our standard collection of Android GPU benchmarks:

[Charts: GLBenchmark 2.1 - Egypt - Offscreen 720p; GLBenchmark 2.1 - Pro - Offscreen 720p; BaseMark ES2.0 - Hover (1024 x 768); BaseMark ES2.0 - Taiji (1024 x 768)]

Performance is still not quite up to par with the iPad 2, but in GLBenchmark's Egypt test Tegra 3 doesn't do too badly. The gap grows in the more texture-bound tests, but in a heavier shader environment Tegra 3 isn't too shabby. While it's clear that Tegra 2 wasn't enough to deal with the 1280 x 752 resolution of Honeycomb tablets, Tegra 3 seems well matched to it.

Note that the BaseMark ES2.0 tests run at FP16 on Tegra 2 and 3 vs. FP24 on the PowerVR SGX 543MP2.

Comments
  • medi01 - Thursday, December 1, 2011 - link

    So am I.
    But as I recently discovered, it's much easier to switch on tethering on my phone and connect via Wi-Fi than to swap a SIM card between devices.

    It was hard for me to justify having two internet-enabled SIM cards, but that might be just me.
  • Kegetys - Thursday, December 1, 2011 - link

    I find tethering to be a huge battery drain on the phone, and it's usually not practical for much beyond occasional "emergency" use. But I have three SIM cards from my carrier, all with unlimited use, so I don't need to do any SIM swapping. I guess if you have to pay extra for that, then tethering is a reasonable alternative.
  • MiSoFine - Thursday, December 1, 2011 - link

    3 SIM cards with unlimited use for no extra cost? Who's your carrier? I was going to suck it up and pay AT&T the extra money for tethering, but if that's an option, I'll take it!
  • Kegetys - Thursday, December 1, 2011 - link

    > Who's your carrier?

    Saunalahti :)
  • medi01 - Thursday, December 1, 2011 - link

    Hi,

    Could you include "time it takes to fully charge" please?
    On the Samsung tab it takes surprisingly long (about 4 hours); for some people it might matter.
  • metafor - Thursday, December 1, 2011 - link

    That's going to be true of any device that standardizes on a USB 2.0 connection -- which I think all Android tablets thus far use; it's just a different connector.

    iDevices sort of get around this by using a non-standard USB connection (up to 1A vs. the standard 500mA), which is why they can charge faster.

    It won't be until USB 3.0 becomes more common that charging speeds will really pick up.
  • Mugur - Friday, December 2, 2011 - link

    Well, this is not quite right. My two phones came with 1000mA USB chargers, even though the standard for a PC port is at most 500mA. My Nook Color has a non-standard but backwards-compatible USB charger rated at around 1900mA.

    I agree, though, that tablet recharge times are slow...
  • anandtech pirate - Thursday, December 1, 2011 - link

    The PowerVR SGX 543MP2 is a beast. I still remember waaaay back when PowerVR used to make PC graphics cards.
  • Death666Angel - Thursday, December 1, 2011 - link

    They still do Intel integrated graphics in the Atom, if I'm not mistaken. :-) They were the supplier of all Intel motherboard IGPs as well, though those aren't around anymore. :D
  • Penti - Friday, December 2, 2011 - link

    They were not the supplier of Intel's IGPs, only of the Atom US15W/L/US11L chipset graphics and some (not all) of the graphics integrated into Atom CPUs and SoC variants. Intel has made its own graphics since the i740, so Intel GMA is their own tech, with their own drivers and so on. Only the GMA500 and GMA600 (SoC), the newer GMA3600 and 3650, and likely the GMA5650 in the D2600/2700 are PowerVR. Those parts don't exactly have excellent drivers for Windows and GNU/Linux desktops.

    GMA3150 is Intel's own, and it's what runs in the latest Intel Atom N4XX and N5XX series, as well as the D4XX and D5XX.
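
A note on the charge-time discussion in the comments above: at 5 V, the current limit sets the input power, and battery capacity divided by that power gives a rough lower bound on charge time. Below is a minimal back-of-the-envelope sketch in Python; the ~25 Wh battery capacity and ~85% charging efficiency are illustrative assumptions, not measurements from this review.

    # Rough charge-time estimates for USB charging at 5 V.
    # Battery capacity and efficiency below are illustrative assumptions.
    def charge_time_hours(battery_wh, current_a, bus_voltage_v=5.0, efficiency=0.85):
        """Ideal hours to fill a battery of battery_wh watt-hours at the given current."""
        usable_power_w = bus_voltage_v * current_a * efficiency  # losses in charger/battery
        return battery_wh / usable_power_w

    BATTERY_WH = 25.0  # roughly a 10-inch Android tablet battery (assumed)

    for label, amps in [("USB 2.0 port (500 mA)", 0.5),
                        ("1 A charger", 1.0),
                        ("2 A wall charger", 2.0)]:
        print(f"{label}: ~{charge_time_hours(BATTERY_WH, amps):.1f} h")

At 500 mA this works out to well over ten hours even in the ideal case, which is why tablets ship with higher-current wall chargers and why a figure in the neighborhood of the four hours mentioned above is plausible even on a beefier 5 V charger.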
