The CPU

TI was one of ARM's earliest partners on the Cortex A15, and its silicon only came back from the fab at the beginning of this year. Even if Apple were similarly instrumental in defining the Cortex A15, it would be Q3 at the earliest before it could have working silicon available in volume. With no A15 design ready and presumably no desire to jump into the custom ARM CPU market quite yet, Apple once again turned to the Cortex A9 for the A5X.

Apple confirmed that there are only two Cortex A9 cores in the A5X, but it neglected to mention operating frequency. I suspect the lack of talk about CPU clocks means they haven't changed; we could still be looking at a 1GHz maximum operating frequency.

Although we've speculated that Apple moved to a 32nm process with the A5X, it's entirely possible that we're still dealing with mature 45nm silicon here. That would explain the relatively conservative GPU clocks, although the additional GPU cores would balloon the die to 150 - 160mm^2 (roughly twice the size of Tegra 3). If the A5X is 32nm, then assuming a relatively conservative 80% scaling factor, Apple would be able to maintain a die size of around 125mm^2, similar to the previous generation A5.
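As a rough sanity check (my own back-of-the-envelope math, not anything Apple has disclosed): ideal area scaling from 45nm to 32nm would be (32/45)^2 ≈ 0.51, so an 80% factor really is on the conservative side, and it lands the die right around the original A5's size:

```latex
A_{32\,\mathrm{nm}} \;\approx\; 0.80 \times A_{45\,\mathrm{nm}} \;\approx\; 0.80 \times 155\,\mathrm{mm}^2 \;\approx\; 124\,\mathrm{mm}^2
```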

A quad-core CPU design does make some sense in a tablet, but only one that is either running heavily threaded workloads or subjected to fairly intense multitasking. As we found in our iPhone 4S review, many iOS apps are still not well threaded and have a difficult time utilizing two cores, much less four. On the multitasking front, Apple has enabled task switching, but there's still no way to run two applications side by side. The most CPU intensive workloads on iOS still require that the app be active in the foreground. Apps can work in the background, but such work is neither constant nor common, and again, it isn't pegging multiple cores. Apple built a very efficient, low overhead platform with iOS - it had to, given the hardware of the original iPhone - and the expected result is low CPU utilization for most tasks. This is not to say that CPU performance isn't important under iOS, just that it's hard to find apps that regularly require more than a single core, and harder still to find those that can benefit from more than two.
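Under iOS, spreading work across cores means explicitly dispatching it, typically via Grand Central Dispatch. Below is a minimal sketch (my illustration, not code from any shipping app; decode_chunk is a hypothetical stand-in for real work) of the kind of fan-out an app has to opt into before a second, third or fourth core sees anything to do:

```c
#include <dispatch/dispatch.h>
#include <stdio.h>

/* Hypothetical stand-in for a parallelizable unit of work,
 * e.g. decoding one tile of a large image. */
static void decode_chunk(size_t i) {
    printf("chunk %zu done\n", i);
}

int main(void) {
    dispatch_queue_t q = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
    dispatch_group_t group = dispatch_group_create();

    /* Fan eight chunks out across however many cores are available.
     * Unless an app does something like this, extra cores sit idle. */
    for (size_t i = 0; i < 8; i++) {
        dispatch_group_async(group, q, ^{ decode_chunk(i); });
    }

    /* Block until every chunk has finished. */
    dispatch_group_wait(group, DISPATCH_TIME_FOREVER);
    return 0;
}
```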

I will say, though, that if Apple wanted to spend the die area, it could easily add more cores without a significant impact on power consumption. Remember that idle cores can be fully power gated, effectively reducing their power consumption to zero. Apple could also adopt a fairly conservative CPU governor and only wake up the third and fourth cores when absolutely necessary (similar to what we see happening with Tegra 3 on Android).
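To make that concrete, here's a minimal sketch of what such a conservative governor could look like; the thresholds and structure are invented for illustration and aren't Apple's (or NVIDIA's) actual policy:

```c
#include <stdio.h>

#define MAX_CORES      4
#define UP_THRESHOLD   90  /* % sustained load before waking another core */
#define DOWN_THRESHOLD 40  /* % load below which an extra core is gated   */

static int online_cores = 1;

/* Called once per sampling interval with the average load
 * across the cores that are currently online. */
static void governor_tick(int avg_load_pct) {
    if (avg_load_pct > UP_THRESHOLD && online_cores < MAX_CORES)
        online_cores++;        /* un-gate one more core */
    else if (avg_load_pct < DOWN_THRESHOLD && online_cores > 1)
        online_cores--;        /* power-gate it again: ~0W while idle */
}

int main(void) {
    int samples[] = { 95, 96, 30, 20 };
    for (int i = 0; i < 4; i++) {
        governor_tick(samples[i]);
        printf("load %d%% -> %d core(s) online\n", samples[i], online_cores);
    }
    return 0;
}
```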

What about the Next iPhone?

Apple has traditionally reused the iPad's SoC in the iPhone that follows later in the same year, so it's reasonable to assume we'll see a smartphone version of the A5X (at lower clocks) later this year. The A6? That will probably debut next year with the 4th generation iPad.

Memory Capacity

Apple wouldn't let us run any third-party applications on the new iPad, so we couldn't confirm the memory capacity of the new model. On stage at the event, Epic mentioned that the new iPad has more memory and a higher output resolution than the Xbox 360 or PlayStation 3. The Xbox 360 has 512MB of memory, and Apple's A5/A5X has a dual-channel LPDDR2 memory controller. Each channel needs to be populated evenly to maintain peak bandwidth, which greatly narrows the options for memory capacity on the new iPad. A 768MB configuration would imply 512MB on one channel and 256MB on the other, delivering peak performance for apps and data in the first 512MB but lower performance for the upper 256MB. Given the low cost of DRAM these days, I think it's safe to assume Apple simply went with two 512MB DRAM devices in a PoP configuration on the A5X, for a total of 1GB of LPDDR2 memory in the new iPad.
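Here's a quick sketch of why the asymmetric 768MB option is unattractive (the channel sizes and interleaving behavior below are my assumptions about a generic dual-channel controller, not Apple's documented design): the controller can only interleave across the region where both channels have capacity, so a 512MB + 256MB split leaves the top 256MB served by a single channel at half the peak bandwidth.

```c
#include <stdio.h>
#include <stdint.h>

#define MB ((uint64_t)1024 * 1024)
#define CH0_SIZE (512 * MB)  /* hypothetical larger channel  */
#define CH1_SIZE (256 * MB)  /* hypothetical smaller channel */

/* Interleaving only works where both channels have capacity:
 * 2 x min(CH0, CH1). Everything above that falls back to the
 * larger channel alone, at half the peak bandwidth. */
static int is_dual_channel(uint64_t phys_addr) {
    uint64_t interleaved = 2 * (CH0_SIZE < CH1_SIZE ? CH0_SIZE : CH1_SIZE);
    return phys_addr < interleaved;
}

int main(void) {
    uint64_t probes[] = { 128 * MB, 500 * MB, 600 * MB };
    for (int i = 0; i < 3; i++)
        printf("%4llu MB: %s-channel\n",
               (unsigned long long)(probes[i] / MB),
               is_dual_channel(probes[i]) ? "dual" : "single");
    return 0;
}
```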

4G LTE Support

Brian did an excellent analysis of the LTE baseband in the new iPad here. Apple appears to be using Qualcomm's MDM9600, a 40nm design, rather than the 28nm MDM9615. In hindsight, speculating that the new iPad would use a 28nm LTE baseband was shortsighted. Apple had to be in mass production on the new iPad somewhere in the January/February timeframe, and although 28nm silicon is shipping to customers today, that was likely too aggressive a schedule for an early-March launch.

Apple iPad Pricing
            16GB    32GB    64GB
WiFi        $499    $599    $699
WiFi + 4G   $629    $729    $829

Apple offers carrier-specific iPad 4G models for AT&T and Verizon, although both versions can roam on 3G networks around the world. The iPad 4G apparently isn't SIM locked, so you'll be able to toss in a SIM from another carrier with a compatible network. LTE data plans are available from AT&T and Verizon with no long-term contract:

iPad LTE Plan Pricing (Monthly)
          $14.99    $20     $30     $50
AT&T      250MB     -       3GB     5GB
Verizon   -         1GB     2GB     5GB


The Name

Apple surprised many by referring to the 3rd generation iPad simply as "the new iPad". The naming seems awkward today, but it's clearly a step towards what Apple does across many of its product lines. The MacBook Air, MacBook Pro and iPod all receive the same simple branding treatment; newer models are differentiated by a quietly spoken year or generation marker.

I still remember, several years ago, when PC OEMs were intrigued by the idea of selling desktops based on model year rather than specs. Apple has effectively attained that holy grail here.

Comments

  • name99 - Friday, March 9, 2012

    "We could still be looking at a 1GHz max operating frequency."

    In all the playing with demo models, was no-one able to sneak any sort of benchmark, or even get a "it feels faster" feeling?

    Ignore LTE, consider WiFi models.
    My iPad1 gets astonishing battery life (like 20hrs) if it is only playing movies. That tells me that the screen just doesn't consume much power (and neither do the h264 chip and flash).
    Reading PDFs in GoodReader gives me substantially less time, maybe 10hrs, which tells me the CPU (plus RAM) still uses a remarkably large fraction of the power (and, perhaps, that GoodReader really ought to do a better job of allowing the CPU to sleep frequently).

    OK, switch to iPad3. Why should we believe that the new screen uses substantially more power than the current screen, if run at the same brightness? Most of the power burned by a screen is in creating the actual light, not in the toggling of each LCD transistor.

    Yet we have basically twice the battery available. This suggests to me EITHER

    - Apple REALLY wants the device to have a long lifetime as a game machine, while the GPU is burning twice as much power as in iPad2. This is probably true --- Apple seem to be pushing the "replacement for XBox, PS and Wii, and their portable versions" angle pretty strongly, and maybe they have data somewhere that the number one complaint of parents who buy competing portable gaming systems is that they run out of juice half-way through an 8hr drive or flight, leaving junior screaming and whining.

    AND/OR

    - we have increased the CPU/RAM maximum clock by maybe 30% or so, allowing for higher speed than iPad2 with the same sort of battery life for CPU intensive tasks (and longer battery life for simpler tasks like movies or listening to audio)

    Why didn't Apple just say the CPU is 30% faster? For the same reason Apple never wants to trumpet specs. They want to give the impression that their users don't have to worry about and comparison-shop specs --- Apple will just make sure that what they are buying at any point is a reasonably well balanced compromise between cost, performance, and battery life. They usually choose one semi-technical headline item to trumpet as "rah rah, better than last year" while remaining silent about the rest --- look at these sorts of product announcements for the past ten years. So, for example, this year it was "ooh twice as powerful GPU" but they didn't, for example, mention twice as much RAM (that slipped out from a 3rd party on stage), let alone how fast it is. Likewise in the past for different phones and iPads they haven't mentioned when they switched to DDR3 RAM from DDR2. Occasionally (e.g. 3GS) the CPU speed IS the headline item and then we'll get the bar graph of how it is twice as fast as its predecessor, but usually "faster CPU" is just taken for granted.

    Point is, to me there are multiple lines of evidence pointing to a CPU and/or RAM clock boost.

    Also it would be nice to know (if not possible while at the Apple press conference, then at least as soon as we have devices in the wild) the other two performance numbers
    - has the WiFi gained either 40MHz channels or 2x2:2, so it's faster than the current 65/72Mbps PHY?
    - is the flash any faster?
  • tipoo - Wednesday, March 21, 2012

    Same CPU speed is confirmed now; it benchmarks exactly the same in anything CPU-bound.
  • Supa - Friday, March 9, 2012

    Great review, easy to read, direct to the point yet quite informative.

    Some sites will bash Apple just to get attention, others write generic reviews that have little depth.

    It's been refreshing to read, to say the least.
  • WaltFrench - Friday, March 9, 2012

    “Apple continues to invest heavily in the aspects of its devices that users interact with the most frequently. Spending a significant amount of money on the display makes a lot of sense. Kudos to Apple for pushing the industry forward here.”

    And AnandTech continues to emphasize the aspects of technology that end up actually mattering in the real world. Kudos to this fine site for not obsessing over features that nobody can get any benefit out of.

    Meanwhile, it'd be good to look at the usage pattern that is evolving. Apple's iMovie, for example, seems to have been unparalleled before they upgraded it this week. A customer can go into an Apple store and ask to see iMovie demo'd, but they are unlikely to get a good feeling AT ALL if they go into their Staples or Best Buy and ask to see what it'd be like to slap together a 60-second homebrew video for Facebook, on any other tablet. If music, photos and video are what drive users' buying decisions, then competitors are going to have to sink a fair amount of energy into finely-tuned apps for those areas.
  • spda242 - Saturday, March 10, 2012

    Anand/Brian, could you please consider investigating/writing an article about why we Europeans are screwed by Apple when it comes to LTE support for our frequencies?

    I just don't get it: is it about hardware, software, marketing decisions, or antenna design?

    I have spent hours on the web trying to understand why Apple hasn't released a European version of the iPad, but no one seems to know.
  • Pantsu - Saturday, March 10, 2012

    US LTE is different from EU LTE: different frequencies, and in practice far slower too. On the other hand, LTE support isn't all that important in Europe at the moment since the operators aren't ready yet.

    The iPad does support DC-HSDPA in Europe, which is pretty much equivalent to US LTE.
  • spda242 - Saturday, March 10, 2012

    I am from Sweden and we have quite good LTE coverage, and as far as I understand other European countries (Germany and the rest of Scandinavia, for example) are getting there, but I also understand that the UK, for example, is completely lost so far when it comes to LTE.

    To buy a new iPad (and later on, the new iPhone?) without LTE support would feel like buying last year's product. I don't buy it for this year only; I want to use it for some years, and of course Apple has "sold" me an LTE device, so now I want it.

    But my question was rather whether there are technical reasons (and if so, which ones) or whether it's a marketing decision.
  • Steelbom - Saturday, March 10, 2012

    Anand, you said: "Also keep in mind that memory bandwidth limitations will keep many titles from running at the new iPad's native resolution. Remember that we need huge GPUs with 100s of GB/s of memory bandwidth to deliver a high frame rate on 3 - 4MP PC displays. I'd expect many games to render at lower resolutions and possibly scale up to fit the panel."

    However, Real Racing 2 supports 1080p output (not upscaled) on an HDTV at 30 FPS. That's 2 million pixels; 1536p is only another 1.1 million, and it's got two additional PowerVR SGX543s to help it along. I don't know what the memory bandwidth of a PowerVR SGX543 is, or whether it scales across multiple cores, but wouldn't the two additional 543s mean it could handle 4 million pixels at 30 FPS?
  • tipoo - Saturday, March 10, 2012

    Bandwidth and core performance are two separate things; keep in mind these SoCs use shared memory for both the CPU cores and the GPU. The iPad 2's memory read score was only 334.2 MB/s:

    http://www.anandtech.com/show/4215/apple-ipad-2-be...
  • Steelbom - Saturday, March 10, 2012

    Ah, right. I see. What does that mean exactly? What would the 360's memory bandwidth be roughly? (Is that the bandwidth on the GPU?)

    Cheers
