  • MrPhilo - Wednesday, September 12, 2012 - link

    If it was the Cortex A15, I'm sure Apple would have included that in its presentation, since it would be the first company to launch an A15. It's just a rebranded Exynos chip IMO, with a higher-clocked GPU.
  • EnzoFX - Wednesday, September 12, 2012 - link

    I don't think so. However, if it were quad-core, that's easy to sell: "4 cores" is simple and easy. A15 doesn't mean a whole lot to the average user.
  • pravin_tavagad - Wednesday, September 12, 2012 - link

    A15 is a much better and more advanced architecture than the A9.
    It is so power efficient and fast that even a dual-core A15 can easily beat a quad-core A9 in every respect.
    TI released a video comparing the A15 against a quad-core A9.
  • Gich - Wednesday, September 12, 2012 - link

    Qualcomm does SoCs its own way, but at the end of the day isn't Krait an A15?
  • lowlymarine - Wednesday, September 12, 2012 - link

    Not quite. It's somewhere between A9 and A15, much like Scorpion was somewhere between A8 and A9. That said, dual-core Krait @ 1.5 GHz is still much more than twice as fast as dual-core A9 @ 800 MHz, so even taking Apple's statements at face value, this isn't going to top the frustratingly ubiquitous MSM8960.
  • Aenean144 - Wednesday, September 12, 2012 - link

    An Exynos GPU would not have 2x the GPU power of the A5's. It's either an SGX543MP4 or a Rogue 6x00 of some sort.

    This is an Apple SoC through and through, not a rebrand.
  • glenns - Wednesday, September 12, 2012 - link

    I would be surprised if it's an A15, but we'll soon see.
  • 1008anan - Wednesday, September 12, 2012 - link

    Looking forward to frequent updates, Anand and Brian.

    Unrelated, how does Apple keep some of their SoC tech from being stolen by their fab Samsung? Strictly as a practical matter?

    Any word yet that Apple might split SoC production between Samsung's 32nm LP HK+MG process and another foundry?

    Also, regarding the single-chip 28nm MDM9615 LTE baseband: what is the TDP on it?
  • lowlymarine - Wednesday, September 12, 2012 - link

    What's to steal? Apple is a fabless licensee. They buy pre-designed CPU cores from ARM and pre-designed GPU cores from Imagination Technologies and slap them together. They aren't like Qualcomm, who design their own architecture that implements the ARMv7 instruction set; all they "design" is the interconnects.

    Or as I saw it put once: they don't "design" the Ax SoCs, they just create "layouts." The same is true of Samsung and TI, of course.
  • 1008anan - Wednesday, September 12, 2012 - link

    This is true of previous A-series designs by AAPL. But Apple's ambition is to increasingly add its own customized IP.

    Let's see if the A6 turns out to be a rebranded Cortex-A15 design, taken almost straight from ARM without customization.

    Brian and Anand, any comments?

    I think you underestimate the amount of hardware customization that TXN does with OMAP beyond what comes from ARM. Not enough customization IMHO, but still.
  • raptorious - Wednesday, September 12, 2012 - link

    Anand, the Cortex A15 is significantly larger than the A9 (it's out-of-order, after all). Even with the shrink to 32nm, I don't see how Apple could double graphics perf (which almost always roughly doubles the GPU's transistor count) AND move to Cortex A15, all while making the die _22%_ smaller. Not possible.
  • Godofmosquitos - Wednesday, September 12, 2012 - link

    The A9 is also out-of-order. That's not what differentiates them.
  • raptorious - Wednesday, September 12, 2012 - link

    The A9 is only very limited out-of-order, not even close to the degree that the A15 is. In any case, that is beside the point: the A15 is a much beefier core, power- and area-wise.
  • smartypnt4 - Wednesday, September 12, 2012 - link

    They were unclear on the 22% figure. It could be the package or the die that has shrunk, or both. All they said was that the "chip" is 22% smaller. We won't know for a while yet, until Chipworks gets one to tear down and analyze the A6.

    However, a straight shrink from 45nm to 32nm would mean a 28.9% reduction in each dimension (assuming the architecture is identical). Since a die shrink is pretty much linear in both directions, that leaves roughly 50.6% of the original area. See the iPad 2,4 review.

    Given that, SOMETHING got way bigger here if we're talking about a 22% smaller die. Either the GPU exploded, which is possible if they used an SGX543MP4 like the iPad 3's, or the A15 cores are far larger than the A9's. I can't venture a guess on which of those is correct, since we know nothing about the GPU yet.

    So, basically: yes, it's entirely possible the cores are A15's. Anand wouldn't have said they were A15's unless he was sure; he doesn't make a habit of speculating on anything.
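    The shrink arithmetic above is easy to sanity-check (a rough sketch assuming an ideal linear 45nm-to-32nm scaling, which real layouts only approximate):

```python
# Ideal die-shrink arithmetic for a 45nm -> 32nm process move.
# Real layouts don't scale perfectly linearly, so treat these as rough bounds.

old_node = 45.0  # nm
new_node = 32.0  # nm

linear_scale = new_node / old_node  # each dimension shrinks to ~71.1%
area_scale = linear_scale ** 2      # area shrinks to ~50.6%

print(f"Per-dimension reduction: {(1 - linear_scale) * 100:.1f}%")  # ~28.9%
print(f"Area remaining:          {area_scale * 100:.1f}%")          # ~50.6%

# Apple claims the A6 is only 22% smaller than the A5, i.e. 78% of its area.
# Relative to an ideal shrink, that implies the new design grew by:
claimed_area = 0.78
growth = claimed_area / area_scale
print(f"Logic growth vs. ideal shrink: {growth:.2f}x")              # ~1.54x
```

    In other words, if the die really is just 22% smaller, the new design has roughly 1.5x the transistors of an ideally-shrunk A5, which is why something (CPU cores, GPU, or both) must have grown substantially.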
  • raptorious - Wednesday, September 12, 2012 - link

    I looked at the iPad 2,4 review just before you posted this, and given that the 32nm A5 is about 57% the size of the 45nm one, it does seem plausible that this is both dual-core A15 and an upgrade to an SGX543MP4.
  • smartypnt4 - Thursday, September 13, 2012 - link

    Brian just said on Twitter that he's pretty sure it's an SGX543MP3, which actually makes a lot of sense: it saves on die area and idle power, and you can clock the GPU higher to reach the 2x performance claim. I'd never thought of using a 3-core GPU, but it's not that far-fetched an idea.
  • lukarak - Wednesday, September 12, 2012 - link

    2GB in an S3?

    Also, with the iP4 and iP4S you have the horizontal resolution first, and with the iP5 and S3 you have the vertical first?
  • lukarak - Wednesday, September 12, 2012 - link

    NVM, that's the US one, not the international version, which has 1GB.
  • cspan123 - Wednesday, September 12, 2012 - link

    It's a Samsung-made chip; I think that much is undisputed. It's worth noting, then, that even Samsung hasn't yet announced (let alone shipped) a product with Cortex-A15 processors in it. The Galaxy Note II is quad-core A9.

    In this article, it was written:
    "The GPU side isn't entirely clear at this point, but the 2x gains could be had through a move to 4 PowerVR SGX543 cores up from 2 in the iPhone 4S."
    Well, the same logic could be applied to a move to quad-core Cortex A9s, which is exactly what the Exynos 4412 is. Apple didn't give the benchmarks behind their 2x claim, and they have been known to fudge things in their favor. Maybe they're claiming 4 cores is double the speed of 2?

    Finally, there's the claim that the A6 processor is "ARM Cortex-A9 cores with NEON SIMD accelerators, quad-core PowerVR SGX543MP2 graphics adapter."
  • Aenean144 - Wednesday, September 12, 2012 - link

    Apple quoted app launch times that were 1.5 to 1.9 times faster. You don't get that with more cores. You only get that with a higher performance core or faster storage.
  • lowlymarine - Wednesday, September 12, 2012 - link

    Application launch times are 95% about the speed of your storage and memory, which makes it a fantastically bullshit benchmark to talk about CPU performance with. A 1.5x increase in app launch speed could easily be accomplished by improvements in NAND and controller performance. You don't buy a faster CPU to make your programs launch faster, you buy a faster SSD. While I realize that current ARM CPUs are dog slow compared to even the lowest of low-end current x86 CPUs, I'm not going to believe the marketing noise until actual fully-CPU-dependent benchmarks can be run.
  • mavere - Wednesday, September 12, 2012 - link

    In terms of marketing speak, "quad core" is a more impressive selling line than "2x faster". I think Apple would have specifically written "quad core" in any promotional materials if they could have.

    Also, I was gonna mention something about preferring to defer to Anand over Xbit, but I looked at Xbit's article and they wrote:

    "iPhone sports brand-new Apple A6 system-on-chip with two ARM Cortex-A15 cores with NEON SIMD accelerators, quad-core PowerVR SGX543MP2 graphics adapter"

    They too are jumping on the A15 bandwagon.
  • lowlymarine - Wednesday, September 12, 2012 - link

    Then again, the presence of "quad-core PowerVR SGX543MP2 graphics adapter" (an MP2 is two cores) doesn't inspire confidence in their sources.
  • zzing123 - Thursday, September 13, 2012 - link

    Shipping Passbook but no NFC in the iPhone 5 is a massive setback to NFC adoption. The problem is that for retailers to make use of NFC, they need a large installed base of phones, and if the iPhone supported it, NFC would become a viable solution.

    Apple could also gain, because they could create a Google Wallet-style service on the back of iTunes that allows Apple to control the NFC transactions.

    Nevertheless, NFC is a feature that the iPhone 5 should have had, and it's pretty stupid that it doesn't, as that's to everyone's detriment, including Apple's.

    As for the rest of the iPhone 5? Beyond the A6 and LTE, it's rather underwhelming, and iOS 6 really only serves to lock in customers even more, tbh.

    I also think the new iPod touch should have had the same 8MP camera as the iPhone at the very least, if not *a better* camera than the iPhone, to make it a de facto point-and-shoot device that is also a gaming platform. I don't understand why Apple continues to make the iPhone the 'better' device and doesn't allow the iPod touch to carve out its own niche as a fully-enabled device.

    Missed opportunities and underwhelming.
  • name99 - Thursday, September 13, 2012 - link

    "Apple could also gain, because they could create a Google Wallet-style service on the back of iTunes that allows Apple to control the NFC transactions."

    Let's live in the real world here, as opposed to fairy land.

    The credit card companies (which means all the banks) make money off transactions right now. They aren't willing to give up that control and reduce their take by helping NFC move forward.
    The phone companies aren't interested in NFC either, and would rather implement their own half-assed, unsuccessful solutions (see how VZW has treated Google Wallet).

    So how do you move forward? Either Apple:
    - works through the credit card companies and makes no money on the transactions (so why bother?), OR
    - makes money on the transactions (so they cost more than credit card transactions).

    Neither of these seems like a particularly winning strategy. Google may be willing to lose a little money on Google Wallet because it fits their model of learning everything about everyone (in this case, their financial transactions) so they can sell more targeted advertising, but that doesn't fit Apple's business model.

    There are alternative models.
    * Apple could essentially become a bank --- you deposit some amount with them, and then use NFC as a debit card against that amount. But do you really think Apple wants all the hassle and regulatory scrutiny that would invite, just to get into a low-margin business?
    * Apple works with money-market companies to withdraw money from those accounts rather than bank accounts. This MIGHT work, but
    (a) only a fraction of the population has money market accounts, AND
    (b) many of those are held either AT banks, or at institutions with such close ties to/dependencies on banks that they aren't going to go against the banks' wishes.
  • Impulses - Friday, September 14, 2012 - link

    Last I checked, the credit card companies AND the carriers have mostly lined up behind the ISIS initiative to push NFC... It's taking forever, but it's got the most backers AFAIK. Then there's Google Wallet and half a dozen other competing players, because apparently we enjoy birthing new products and standards in the most painful and competitive way possible!
  • orresearch - Sunday, September 23, 2012 - link

    Does anyone have any insight into how the three microphones are used for noise reduction on the iPhone 5?

    Since one of the microphones is located so close to the earpiece, I believe that one is being used for ANC (Active Noise Cancellation), using the earpiece to cancel some of the ambient noise. The remaining two microphones are for voice-communication beamforming-type uses.

    Normally, you do not put a microphone so close to the earpiece, due to acoustic coupling of the earpiece signal back into the microphone; the resulting acoustic echo is difficult to cancel.
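    For anyone unfamiliar with the beamforming idea mentioned above, here's a minimal delay-and-sum sketch (purely illustrative; the mic spacing and all other parameters are hypothetical, and this says nothing about how Apple's actual audio pipeline works):

```python
import numpy as np

# Minimal two-microphone delay-and-sum beamformer.
# Signals arriving from the steered direction line up after the delay and
# add constructively; off-axis noise adds incoherently and is attenuated.

fs = 48000           # sample rate (Hz), hypothetical
mic_spacing = 0.10   # distance between the two mics (m), hypothetical
c = 343.0            # speed of sound (m/s)

def delay_and_sum(mic1, mic2, angle_deg):
    """Steer a two-mic array toward angle_deg (0 = broadside)."""
    delay_s = mic_spacing * np.sin(np.radians(angle_deg)) / c
    delay_samples = int(round(delay_s * fs))
    # Align mic2 to mic1 by shifting it, then average the two channels.
    aligned = np.roll(mic2, -delay_samples)
    return 0.5 * (mic1 + aligned)

# A coherent on-axis signal survives the averaging unchanged.
t = np.arange(fs // 100) / fs
tone = np.sin(2 * np.pi * 1000 * t)
out = delay_and_sum(tone, tone, angle_deg=0.0)  # broadside: zero delay
```

    Real implementations are adaptive and work per-frequency-band, but this captures the basic geometry: a known mic spacing plus a chosen delay defines the "look" direction.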
