Yesterday Apple unveiled its third generation iPad, simply called the new iPad, at an event in San Francisco. The form factor remains mostly unchanged, retaining the 9.7-inch display, but the new device is thicker at 9.4mm vs. 8.8mm for its predecessor. The added thickness was necessary to support the iPad's new 2048 x 1536 Retina Display.

Tablet Specification Comparison

| | ASUS Transformer Pad Infinity | Apple's new iPad (2012) | Apple iPad 2 |
|---|---|---|---|
| Dimensions | 263 x 180.8 x 8.5mm | 241.2 x 185.7 x 9.4mm | 241.2 x 185.7 x 8.8mm |
| Display | 10.1-inch 1920 x 1200 Super IPS+ | 9.7-inch 2048 x 1536 IPS | 9.7-inch 1024 x 768 IPS |
| Weight (WiFi) | 586g | 652g | 601g |
| Weight (4G LTE) | 586g | 662g | 601g |
| Processor (WiFi) | 1.6GHz NVIDIA Tegra 3 T33 (4 x Cortex A9) | Apple A5X (2 x Cortex A9, PowerVR SGX 543MP4) | 1GHz Apple A5 (2 x Cortex A9, PowerVR SGX 543MP2) |
| Processor (4G LTE) | 1.5GHz Qualcomm Snapdragon S4 MSM8960 (2 x Krait) | Apple A5X (2 x Cortex A9, PowerVR SGX 543MP4) | 1GHz Apple A5 (2 x Cortex A9, PowerVR SGX 543MP2) |
| Connectivity | WiFi, Optional 4G LTE | WiFi, Optional 4G LTE | WiFi, Optional 3G |
| Memory | 1GB | 1GB | 512MB |
| Storage | 16GB - 64GB | 16GB - 64GB | 16GB |
| Battery | 25Whr | 42.5Whr | 25Whr |
| Pricing | $599 - $799 est | $499 - $829 | $399, $529 |

Driving the new display is Apple's A5X SoC. Apple hasn't been too specific about what's inside the A5X other than to say it features "quad-core graphics". Upon further prodding, Apple did confirm that there are two CPU cores inside the SoC. It's safe to assume there's still a pair of Cortex A9s in the A5X, now paired with a PowerVR SGX 543MP4 instead of the 543MP2 used in the iPad 2. The chart below gives us an indication of the performance Apple expects to see from the A5X's GPU vs. what's in the A5:

Apple ran the PowerVR SGX 543MP2 in its A5 SoC at around 250MHz, which puts it at 16 GFLOPS of peak theoretical compute horsepower. NVIDIA claims the GPU in Tegra 3 is clocked higher than in Tegra 2, which ran at around 300MHz. In practice, Tegra 3 GPU clocks range from 333MHz on the low end for smartphones to as high as 500MHz on the high end for tablets. If we assume a 333MHz GPU clock in Tegra 3, that puts NVIDIA at roughly 8 GFLOPS, which rationalizes the 2x advantage Apple claims in the chart above. The real world performance gap isn't anywhere near that large of course - particularly if you run on a device with a ~500MHz GPU clock (12 GFLOPS), as the GLBenchmark results below show.
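If you want to sanity check those numbers, the math is simple enough. Here's a quick sketch (it assumes one MAD counts as two FLOPs and uses the estimated clocks above, none of which are officially confirmed):

```python
# Peak theoretical compute = MAD units per clock x 2 FLOPs per MAD x clock speed.
# The clock speeds here are the estimates discussed above, not confirmed figures.
def peak_gflops(mads_per_clock, clock_mhz):
    return mads_per_clock * 2 * clock_mhz / 1000.0

a5_mp2     = peak_gflops(32, 250)  # PowerVR SGX 543MP2 in the A5 -> 16.0 GFLOPS
tegra3_min = peak_gflops(12, 333)  # Tegra 3 at smartphone clocks -> ~8.0 GFLOPS
tegra3_max = peak_gflops(12, 500)  # Tegra 3 at tablet clocks     -> 12.0 GFLOPS

print(a5_mp2 / tegra3_min)  # ~2.0x - lines up with Apple's claimed advantage
print(a5_mp2 / tegra3_max)  # ~1.33x - closer to what GLBenchmark shows below
```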

GLBenchmark 2.1.1 - Egypt - Offscreen (720p)

GLBenchmark 2.1.1's Egypt offscreen test pegs the PowerVR SGX 543MP2 advantage at just over 30%, at least at 1280 x 720. Based on the raw FP numbers for a 500MHz Tegra 3 GPU vs. a 250MHz PowerVR SGX 543MP2, around a 30% performance advantage is what you'd expect from a mostly compute-limited workload. It's possible that the gap could grow at higher resolutions or with a different workload. For example, look at the older GLBenchmark PRO results and you'll see a 2x gap in graphics performance:

GLBenchmark 2.1.1 - PRO - Offscreen (720p)

For most real world gaming workloads I do believe that the A5 is faster than Tegra 3, but the advantage is unlikely to be 2x at non-Retina Display resolutions. The same applies to the A5X vs. Tegra 3 comparison. I fully expect there to be a significant performance gap at the same resolution, but I doubt it is 4x in a game.

Mobile SoC GPU Comparison

| | Apple A4 | Apple A5 | Apple A5X | Tegra 3 (max) | Tegra 3 (min) | Intel Z2580 |
|---|---|---|---|---|---|---|
| GPU | PowerVR SGX 535 | PowerVR SGX 543MP2 | PowerVR SGX 543MP4 | GeForce | GeForce | PowerVR SGX 544MP2 |
| MADs per Clock | 4 | 32 | 64 | 12 | 12 | 32 |
| Clock Speed | 250MHz | 250MHz | 250MHz | 500MHz | 333MHz | 533MHz |
| Peak Compute | 2.0 GFLOPS | 16.0 GFLOPS | 32.0 GFLOPS | 12.0 GFLOPS | 8.0 GFLOPS | 34.1 GFLOPS |

The A5X doubles GPU execution resources compared to the A5. Imagination Technologies' PowerVR SGX 543 is modular - performance scales simply by increasing "core" count. Apple tells us all we need to know about clock speed in the chart above: with 2x the execution resources delivering a claimed 2x the performance of the A5, Apple evidently hasn't changed the GPU clock of the A5X.

Assuming perfect scaling, I'd expect around a 2.6x performance gain over Tegra 3 in GLBenchmark (Egypt) at 720p. Again, not 4x, but at the same time hardly insignificant. It can take multiple generations of GPUs to deliver that sort of a performance advantage at a similar price point. Granted, Apple has no problem eating the cost of a larger, more expensive die, but that doesn't change the fact that the GPU advantage Apple will hold thanks to the A5X is generational.
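To put rough numbers to that expectation, here's a purely illustrative sketch built on two assumptions: the ~30% A5 lead measured in Egypt at 720p, and perfect scaling from two to four SGX 543 cores:

```python
# Illustrative only: assumes the ~30% A5 lead measured in Egypt at 720p and
# perfect scaling from the SGX 543MP2 (A5) to the SGX 543MP4 (A5X).
tegra3_score = 1.0                  # normalize Tegra 3's Egypt offscreen result
a5_score     = tegra3_score * 1.3   # A5 leads by roughly 30% at 720p
a5x_score    = a5_score * 2.0       # 2x the GPU resources at the same clock

print(a5x_score / tegra3_score)     # ~2.6x - short of 4x, but a generational gap
```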

I'd also point out that the theoretical GPU performance of the A5X is identical to what Intel is promising with its Atom Z2580 SoC. Apple arrives there with four SGX 543 cores, while Intel gets there with two SGX 544 cores running at ~2x the frequency (533MHz vs. 250MHz).

With the new iPad's Retina Display delivering 4x the pixels of the iPad 2, a 2x increase in GPU horsepower isn't enough to maintain performance. If you remember back to our iPad 2 review, however, the PowerVR SGX 543MP2 used there was largely overkill for the 1024 x 768 display, so a full 4x increase in GPU horsepower likely wasn't necessary to deliver a similar gaming experience. Also keep in mind that memory bandwidth limitations will keep many titles from running at the new iPad's native resolution. Remember that it takes huge GPUs with hundreds of GB/s of memory bandwidth to deliver high frame rates on 3 - 4MP PC displays. I'd expect many games to render at lower resolutions and possibly scale up to fit the panel.
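A quick bit of pixel math illustrates the problem. The resolutions come from the spec table above; the internal render resolution in the sketch is hypothetical, just to show the idea:

```python
# Pixel counts come from the spec table above; the internal render resolution
# below is a hypothetical example, not a figure any developer has confirmed.
new_ipad_pixels = 2048 * 1536   # ~3.1 million pixels
ipad2_pixels    = 1024 * 768    # ~0.8 million pixels

print(new_ipad_pixels / ipad2_pixels)   # 4.0x the pixels, with only ~2x the GPU

# A title that can't hold frame rate at native resolution could render to a
# smaller buffer and let the scaler fill the panel, e.g.:
render_w, render_h = 1440, 1080
fill_cost = (render_w * render_h) / new_ipad_pixels   # ~0.49 of the native fill cost
```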

What About the Display?

Performance specs aside, the new iPad's Retina Display does look amazing. The 1024 x 768 panel in the older models was simply getting long in the tooth, and the Retina Display ensures Apple won't need to increase screen resolution for a very long time. Apple also increased the color gamut by 44% with the new panel, but the increase in resolution alone is worth the upgrade for anyone who spends a lot of time reading on their iPad. The photos below give you an idea of just how sharp text and graphics are on the new display compared to its predecessor (iPad 2, left vs. new iPad, right):

The improvement is dramatic in these macro shots, but I do believe it's just as significant in normal use.

Apple continues to invest heavily in the aspects of its devices that users interact with most frequently. Spending a significant amount of money on the display makes a lot of sense, and kudos to Apple for pushing the industry forward here. The only downside is that supply of these greater-than-HD panels is apparently very limited, as Apple is buying up most of the production from as many as three different panel vendors. It will be a while before we see Android tablets with comparable resolutions, although we will see 1920 x 1200 Android tablets shipping in the first half of this year.

Comments
  • name99 - Friday, March 9, 2012 - link

    "We could still be looking at a 1GHz max operating frequency."

    In all the playing with demo models, was no-one able to sneak any sort of benchmark, or even get an "it feels faster" feeling?

    Ignore LTE, consider WiFi models.
    My iPad1 gets astonishing battery life (like 20hrs) if it is only playing movies. That tells me that the screen just doesn't consume much power (and neither do the h264 chip and flash).
    Reading PDFs in GoodReader gives me substantially less time, maybe 10hrs, which tells me the CPU (plus RAM) still uses a remarkably large fraction of the power (and, perhaps, that GoodReader really ought to do a better job of allowing the CPU to sleep frequently).

    OK, switch to iPad3. Why should we believe that the new screen uses substantially more power than the current screen, if run at the same brightness? Most of the power burned by a screen is in creating the actual light, not in the toggling of each LCD transistor.

    Yet we have basically twice the battery available. This suggests to me EITHER

    - Apple REALLY wants the device to have a long lifetime as a game machine, while the GPU is burning twice as much power as in iPad2. This is probably true --- Apple seem to be pushing the "replacement for XBox, PS and Wii, and their portable versions" angle pretty strongly, and maybe they have data somewhere that the number one complaint of parents who buy competing portable gaming systems is that they run out of juice half-way through an 8hr drive or flight, leaving junior screaming and whining.

    AND/OR

    - we have increased the CPU/RAM maximum clock by maybe 30% or so, allowing for higher speed than iPad2 with the same sort of battery life for CPU intensive tasks (and longer battery life for simpler tasks like movies or listening to audio)

    Why didn't Apple just say the CPU is 30% faster? For the same reason Apple never wants to trumpet specs. They want to give the impression that their users don't have to worry about and comparison-shop specs --- Apple will just make sure that what they are buying at any point is a reasonably well balanced compromise between cost, performance, and battery life. They usually choose one semi-technical headline item to trumpet as "rah rah, better than last year" while remaining silent about the rest --- look at these sorts of product announcements for the past ten years. So, for example, this year it was "ooh twice as powerful GPU" but they didn't, for example, mention twice as much RAM (that slipped out from a 3rd party on stage), let alone how fast it is. Likewise in the past for different phones and iPads they haven't mentioned when they switched to DDR3 RAM from DDR2. Occasionally (e.g. the 3GS) the CPU speed IS the headline item and then we'll get the bar graph of how it is twice as fast as its predecessor, but usually "faster CPU" is just taken for granted.

    Point is, to me there are multiple lines of evidence pointing to a CPU and/or RAM clock boost.

    Also it would be nice to know (if not possible while at the Apple press conference, then at least as soon as we have devices in the wild) the other two performance numbers
    - has the WiFi gained either 40MHz channels or 2x2:2, so it's faster than the current 65/72Mbps PHY?
    - is the flash any faster?
  • tipoo - Wednesday, March 21, 2012 - link

    Same CPU speed is confirmed now; it benchmarks exactly the same in anything CPU-bound.
  • Supa - Friday, March 9, 2012 - link

    Great review, easy to read, direct to the point yet quite informative.

    Some sites will bash Apple just to get attention; others write generic reviews that have little depth.

    It's been refreshing to read, to say the least.
  • WaltFrench - Friday, March 9, 2012 - link

    “Apple continues to invest heavily in the aspects of its devices that users interact with the most frequently. Spending a significant amount of money on the display makes a lot of sense. Kudos to Apple for pushing the industry forward here.”

    And AnandTech continues to emphasize the aspects of technology that end up actually mattering in the real world. Kudos to this fine site for not obsessing over features that nobody can get any benefit out of.

    Meanwhile, it'd be good to look at the usage pattern that is evolving. Apple's iMovie, for example, seems to have been unparalleled before they upgraded it this week. A customer can go into an Apple store and ask to see iMovie demo'd, but they are unlikely to get a good feeling AT ALL if they go into their Staples or Best Buy and ask to see what it'd be like to slap together a 60-second homebrew video for Facebook, on any other tablet. If music, photos and video are what drive users' buying decisions, then competitors are going to have to sink a fair amount of energy into finely-tuned apps for those areas.
  • spda242 - Saturday, March 10, 2012 - link

    Anand/Brian, could you please consider investigating/writing an article about why we Europeans are screwed by Apple when it comes to LTE support for our frequencies?

    I just don't get it: is it about hardware, software, marketing decisions, or antenna design?

    I have spent hours on the web trying to understand why Apple hasn't released a European LTE version of the iPad, but no one seems to know.
  • Pantsu - Saturday, March 10, 2012 - link

    US LTE is different from EU LTE - different frequencies, and in practice far slower too. On the other hand, LTE support isn't all that important at the moment in Europe since the operators aren't ready yet.

    The iPad does support DC-HSDPA in Europe, which is pretty much equivalent to US LTE.
  • spda242 - Saturday, March 10, 2012 - link

    I am from Sweden and we have quite good LTE coverage, and as far as I understand other European countries (Germany and the rest of Scandinavia, for example) are getting there, but I also understand that the UK, for example, is completely lost so far when it comes to LTE.

    To buy a new iPad (and later on the new iPhone?) without LTE support would feel like buying last year's product. I don't buy it for this year only; I want to use it for some years, and of course Apple has "sold" me an LTE device and now I want it.

    But my question was rather whether there are technical reasons (and if so, which ones) or whether it is a marketing decision.
  • Steelbom - Saturday, March 10, 2012 - link

    Anand, you said: "Also keep in mind that memory bandwidth limitations will keep many titles from running at the new iPad's native resolution. Remember that we need huge GPUs with 100s of GB/s of memory bandwidth to deliver a high frame rate on 3 - 4MP PC displays. I'd expect many games to render at lower resolutions and possibly scale up to fit the panel."

    However, Real Racing 2 supports 1080p output (not upscaled) on an HDTV at 30 FPS. That's 2 million pixels; 1536p is only another 1.1 million, and the new iPad has two additional PowerVR SGX543 cores to help it along. I don't know what the memory bandwidth of a PowerVR SGX543 is, or if it stacks across multiple cores, but wouldn't the two additional 543 cores mean it could handle 4 million pixels at 30 FPS?
  • tipoo - Saturday, March 10, 2012 - link

    Bandwidth and core performance are two separate things; keep in mind these SoCs use shared memory for both the CPU cores and the GPU. The iPad 2's memory read score was only 334.2 MB/s:

    http://www.anandtech.com/show/4215/apple-ipad-2-be...
  • Steelbom - Saturday, March 10, 2012 - link

    Ah, right. I see. What does that mean exactly? What would the 360's memory bandwidth be roughly? (Is that the bandwidth on the GPU?)

    Cheers
