Analysis of the new Apple iPad

by Anand Lal Shimpi on 3/9/2012 1:37 AM EST

161 Comments

  • Confusador - Friday, March 09, 2012 - link

    "It really is great to see Apple pushing display technology so aggressively and at reasonable price points. I do hope it's only a matter of time before we see a similar trend on the Mac side."

    I have to agree with this sentiment, as I've been hoping for higher resolution monitors for some time. Not that I use a Mac, but still it would be nice to see pressure coming from somewhere - I certainly expect to benefit from the pressures this display will put on Android tablets.
    Reply
  • antef - Friday, March 09, 2012 - link

    It matters a lot less on a desktop monitor that you're probably sitting a good couple feet away from. During normal usage of my 1920x1200 24" display I don't really feel like it needs extra resolution or sharpness. And yes there's an iPhone 4S in my household so I have that to compare to. Reply
  • ZeDestructor - Friday, March 09, 2012 - link

    I have used 1920x1200 displays (not mine), and I'd quite like a boost. Text in particular doesn't render properly at small sizes, and as someone who deals a lot with text (IRC, code), finding that one font size that's small enough to show a lot of code but big enough to render properly is annoying. Especially if you like having smooth edges. Reply
  • WaltFrench - Friday, March 09, 2012 - link

    I don't imagine that many people will buy an iPad for coding, even if it gives more visual bandwidth.

    But I'm a bit surprised that you don't find acceptable fonts for such high-pixel-count screens. Maybe it's a rendering issue???
    Reply
  • chemist1 - Sunday, March 11, 2012 - link

    "Especially if you like having smooth edges."

    Yes, I think that's precisely what it comes down to. I use my 24" Dell 2408WFP Ultrasharp, 1920x1200, principally for text, and I'm happy with the resolution (94 ppi) -- because, on my Mac, I defeat as much of the text smoothing as possible (for coding, I find unsmoothed Monaco is the sharpest-looking in Terminal). OTOH, even with the higher-res 129 ppi screen on my notebook, anti-aliased fonts look blurry to me. I don't know what pixel density would be necessary to make them look sharp -- I suspect even doubling the resolution of the 24" display would not be sufficient. Laser printer output looks sharp enough to me, but I understand that is 600-1800 dpi.
    Reply
  • dasgetier - Friday, March 09, 2012 - link

    As an enthusiast hobby photographer and developer, I would definitely love to see 24" or 27" monitors with 4K resolutions at 16:10 aspect ratios. Reply
  • B3an - Friday, March 09, 2012 - link

    I'd love 4K res on a 30" monitor. That would be so perfect for image editing... and playing games of course.

    And as for the iPad 3, totally not interested. Waiting for Windows 8 tablets, preferably one with a keyboard dock, because then it will also replace my laptop. A dockable Win 8 tablet is pretty much the ultimate portable device, and I've no idea why anyone would buy an iPad once these things are out later this year.
    Reply
  • french toast - Friday, March 09, 2012 - link

    Yes, I totally agree with you; WP7 Mango is already as slick and fast as iOS 5.1... on far less powerful hardware.

    W8 will bring the same level of slickness on comparable hardware, just with the added benefit of more freedom and more power... Microsoft is back in the game, baby.
    Reply
  • Michiel - Friday, March 09, 2012 - link

    Dreams are good, dreams are fun, dreams are interesting.

    Keep on dreaming.

    Microsoft is down and out!
    Reply
  • WaltFrench - Friday, March 09, 2012 - link

    Microsoft has a great opportunity in the Enterprise, because companies are already on board with management tools, development tools, training and the like.

    That's not what moves phones, nor is it what moves consumer products. Win8 doesn't seem to offer desktop users much of anything they don't already have, and the tablet-specific stuff comes with a bunch of new limitations (no Flash in the mobile IE; no legacy apps in Windows on ARM) that will require careful attention to deployment plans over the next couple of years. With PC sales on the wane in the developed world, and zero presence in the consumer space, the very nice and useful Win8 features are not likely to result in a lot of sales at least thru the first half of 2013.

    By which time it wouldn't be surprising to see “the new iPad (4th generation, early 2013)” together with even better tools to exploit Apple's expansion into business. And their so-far total domination of the non-subsidized tablet business. And their continuing success with consumer handsets.

    Microsoft is more than capable of first-class software development. Its mistakes in the phone area alone are enough that I'm surprised Ballmer still has his job. They have their work cut out for them to communicate to individuals what Win8 ARM/X86 tablets and multiple phone brands will do for consumers.
    Reply
  • robinthakur - Tuesday, March 13, 2012 - link

    I've noticed that the trend in organisations is to enable users' personal devices for company data, spearheaded by the iPad and iPhone. All of the last 4 companies I've worked for as a contractor were both issuing company iPads and enabling people's own devices, having changed their IT policy in response to user/management pressure. These are not just small companies either; BoA has now started doing it. Using Exchange you can enforce passcodes and remote-wipe them. This is the current trend, and I find it hard to believe that MS will have anything on the market which will remotely worry Apple. It's not like you can use all the old Windows software on a Windows 8 tablet running on ARM, and users still have to learn a new and somewhat unintuitive interface, so where's the win there? People have got used to the super-intuitive iOS since 2007 and they like it, hence the change. To most users, even in business, Windows is far too complicated for its own good. I see this every day.

    The only possible win is decent Office/SharePoint integration, but since Office will shortly be arriving on the iPad and there are already many solutions in the App Store for SharePoint integration, it's something of a moot point IMO. The only things Windows had going for it were compatibility and a familiar interface which people had grown up with, which the decent Windows 7 offers. This is why the phones failed, and it is also why Windows 8 will underperform. Windows 7 will survive in a similar way to XP, far past MS's predictions.
    Reply
  • wilmarkj - Sunday, March 11, 2012 - link

    All current 30" monitors are already 4 MP, but they cost around $1100. They are high quality (16-bit color and S-IPS), but that's at the same dpi as current 24" 1920x1200 monitors - not anywhere close to retina quality, which I thought was over 300 dpi. Reply
  • solipsism - Sunday, March 11, 2012 - link

    It's a factor of dpi/ppi plus distance from your retina. That's why an HDTV could be considered a Retina Display at far fewer pixels per inch than a handheld device.

    The equation for 20/20 vision is: 3438 * (1/x) = y, where x is the minimum distance away from your eyes it has to be placed and y is the number of pixels per inch.

    If you have a 46" 1080p HDTV, that is about 48 PPI, so the equation is: 3438 * (1/x) = 48, which means you need to sit about 72" (6 feet) away for the pixels to become indistinguishable. Of course, there are other factors involved, but that is the basis of the definition.
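    For the curious, that arithmetic can be sketched in a few lines (a rough sketch; the 3438 constant falls out of the one-arc-minute acuity limit, and the device PPIs below are the commonly cited figures, not measurements):

```python
import math

# 20/20 vision resolves roughly one arc-minute (1/60 of a degree).
# A pixel blends away once ppi * distance exceeds 1 / tan(1/60 deg) ~= 3438,
# so the minimum "retina" viewing distance in inches is roughly 3438 / ppi.
ARC_MINUTE = math.radians(1 / 60)

def min_retina_distance(ppi):
    """Distance (inches) beyond which pixels become indistinguishable."""
    return 1 / (ppi * math.tan(ARC_MINUTE))

for name, ppi in [("iPhone 4S", 326), ("new iPad", 264), ('46" 1080p HDTV', 48)]:
    print(f'{name}: about {min_retina_distance(ppi):.0f} inches')
```

    Plugging in 48 PPI gives roughly 72 inches, i.e. about 6 feet for the HDTV case.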
    Reply
  • Mitch89 - Tuesday, March 13, 2012 - link

    Hence why most people can't tell the difference between 720p and 1080p on their HDTVs. Reply
  • Mitch89 - Tuesday, March 13, 2012 - link

    4K does not equal 4 megapixel. A 4K display would have more than 7 megapixels.

    4K refers to the width of the display in pixels, likely 4096x2560 in a 16:10 aspect.

    To put that in perspective, most current 30" displays (Dell, Apple, etc) are 2560px wide.

    I for one am looking forward to that, with some better UI scaling. My 27" Dell 2709W looks great at 1920x1200 for readability, but if that resolution were doubled like the new iPad's (more pixels, same view), it would be incredibly smooth.
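    To put rough numbers on the megapixel comparison (the 16:10 "4K" resolution below is the guess from the comment above, not a product spec):

```python
def megapixels(width, height):
    """Total pixel count in millions."""
    return width * height / 1e6

# Current 30" panels vs. a hypothetical 16:10 "4K" panel vs. the new iPad.
panels = [
    ('30" Dell/Apple (2560x1600)', 2560, 1600),
    ('16:10 "4K" (4096x2560)', 4096, 2560),
    ('new iPad (2048x1536)', 2048, 1536),
]
for name, w, h in panels:
    print(f'{name}: {megapixels(w, h):.1f} MP')
```

    So a 16:10 4K panel would carry roughly 2.5x the pixels of today's 30" displays.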
    Reply
  • steven75 - Sunday, March 11, 2012 - link

    They'll buy iPads because Win8 (metro interface) won't have the 200,000 apps the iPad does. Reply
  • tdawg - Friday, March 09, 2012 - link

    Agreed. I'd love a high resolution 27" or 30" monitor, but I'm not willing to pay more than $400-$500 for a PC monitor. If panel prices can be driven down to my price point, I'd be happy. Reply
  • Mitch89 - Tuesday, March 13, 2012 - link

    I'm tempted to replace my Dell 2709W with a U2711 just for the relatively minor res bump (2560x1440 vs 1920x1200). The drop to 16x9 is a shame, but having recently purchased dual U2711's for an editing suite I built for a friend, they are awesome, not to mention sharply priced. Reply
  • RHurst - Friday, March 09, 2012 - link

    YES! Couldn't agree more. Reply
  • Sabresiberian - Tuesday, March 13, 2012 - link

    I agree, a pixel density higher than 200 PPI and 16:10, about 27" or 30" size.

    200Hz would be great, too. (I'll settle for 120 though.)

    You know, this technology isn't new. There were LCD panels made with that kind of density over a decade ago:

    http://en.wikipedia.org/wiki/IBM_T220/T221_LCD_mon...

    Of course, they were extremely expensive back then ($18,000), but tablets prove they don't have to be today.

    One thing I DO NOT want is a screen with a ratio less than 16:10, and a lot of these "4K" displays are worse than 16:9.
    Reply
  • Guspaz - Friday, March 09, 2012 - link

    You can get 1920x1200 in a 13" notebook monitor. Until recently, the highest resolution you could get on a 27" display was 1920x1200. Clearly this doesn't make any sense; there is a 4.3x increase in surface area there, but no increase in resolution...

    I had a 1080p 24" monitor, and it felt low-res. I've since upgraded to a 2560x1440 27" monitor, and it does seem a bit better. As it should, it's a 25% increase in surface area, but a 78% increase in pixel count.
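    A quick check of those ratios (a sketch assuming matching aspect ratios, so screen area scales with the diagonal squared):

```python
def area_ratio(diag_a, diag_b):
    """Screen area ratio for two diagonals at the same aspect ratio."""
    return (diag_b / diag_a) ** 2

def pixel_ratio(res_a, res_b):
    """Ratio of total pixel counts between two resolutions."""
    (w1, h1), (w2, h2) = res_a, res_b
    return (w2 * h2) / (w1 * h1)

print(f'13" -> 27" area: {area_ratio(13, 27):.1f}x')   # ~4.3x, same pixel count
print(f'24" -> 27" area: +{(area_ratio(24, 27) - 1):.0%}')
print(f'1080p -> 1440p pixels: +{(pixel_ratio((1920, 1080), (2560, 1440)) - 1):.0%}')
```

    The exact area bump works out to about 27% and the pixel bump to 78%, in the same ballpark as the figures above.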
    Reply
  • ZeDestructor - Friday, March 09, 2012 - link

    I suspect it stagnated because Windows offers extremely poor scaling by default, despite being given all the info via EDID/HWID... Reply
  • Exodite - Friday, March 09, 2012 - link

    Now if some enterprising soul could just cut away all the useless gunk adhered to the back of those displays, and ideally scale them up to 20-24", I'd be almost happy to pay ~499 USD!

    Imagine for a moment a 4x3 resolution desktop screen with large vertical resolution and low cost. *sigh*

    As a software engineer I have yet to see any reason to move past my two 1280x1024 displays from '06.

    I don't need color accuracy, widescreen formats, USB hubs or exotic display connectivity. I just want a display with high vertical resolution, high refresh rate and a good price.
    Reply
  • Guspaz - Friday, March 09, 2012 - link

    4:3? No thanks, even for coding.

    I'm sitting here on two 1280x1024 monitors, while at home I have a single 2560x1440 monitor. The difference is painful; lines of code don't fit in 1280 pixels, so stuff gets split between two, which isn't easy to work with. My single monitor at home is higher resolution than my two at work, I'd gladly trade these for a single higher res 27" monitor.
    Reply
  • Exodite - Friday, March 09, 2012 - link

    We obviously have very different coding styles then.

    Strictly speaking I don't need more horizontal space than 80 characters, though additional space for menus and toolbars is obviously an advantage.
    Reply
  • michael2k - Friday, March 09, 2012 - link

    Wait, what? The iPad has more lines in either direction than 1280, 1440! Reply
  • michael2k - Friday, March 09, 2012 - link

    Why not mount it on a stand, place it 10" away from you (so that it appears equivalent to a 22" screen), and use something like Parallels to run your desktop on it? Reply
  • Exodite - Friday, March 09, 2012 - link

    I'm sitting pretty much 10" away from two 19" screens as it is, unfortunately it'd have to be a bit bigger than 9.7" to be ideal. :P Reply
  • medi01 - Friday, March 09, 2012 - link

    -Our beloved customers are ready to give us more of their money for trendy gadget upgrade. We gotta sell them 'The New iPad', what have we got to "revolutionize"?
    -Nothing.
    -Eh, not even some app like Siri that we can remove from app store and shamelessly claim to make yet another breakthrough?
    -Nope.
    -Bah, let's play the 'retina display' card then.
    -Well, we've only got a 264 ppi display, and we claimed a "retina display" to be 326 ppi; can we still claim 264 ppi is retina?
    -We are Apple, dude, we can claim anything! (Besides, dudes like Anand will find excuses to demonstrate we were right.)
    -Right. Anyway, we'd need more powerful hardware to keep up old speeds at the new resolution (dudes like Anand will still make us look better by using off-screen benchmarks that are useless to customers), so we'd need a bigger battery.
    -So it will be bigger, heavier and take longer to charge?
    -Yep.
    -Oh well, but it will still have "retina display". Should still sell well.
    Reply
  • WaltFrench - Friday, March 09, 2012 - link

    “Retina” display was a pixel density such that <b>at the likely distance from the eye,</b> the user would have a hard time distinguishing pixels. Works out to about 1 arc-minute (1/60th of a degree) for human eyes.

    I see lots of people using iPads on my commute route, on my frequent flights, in the occasional coffee-shop stop. Nobody tries to hold them the 10" – 12" away from their eyes, as Jobs cited for the iPhone4S.

    In other words, by failing to know a damn thing about vision (or elementary trig), your comment blathers ignorantly.

    PS, a word to the wise: it wouldn't have been very clever even if your point was in the least valid. Don't give up your day job to be a comedy writer. Even if your local Android Klub fans like it.
    Reply
  • medi01 - Saturday, March 10, 2012 - link

    Don't iZombie much, please.

    I keep my phone and tablet at the same distance, I guess I "hold it wrong way" in Hypnosteve's books.

    The point of "retina" was that the density was so high that pixels were indistinguishable to the human eye at some magical distance (distance matters a lot here).

    Indeed by playing with distance one could reduce resolution yet claim "it's "retina"". But then one could apply that "retina" buzzword to many pieces of older hardware.

    Off-screen benchmarks show no practical results to customers and are only deceiving. Nobody uses a CPU/GPU on its own; it's used only with a particular resolution of screen, and decoupling them is just a way to deceive.
    Reply
  • doobydoo - Monday, March 12, 2012 - link

    How far you personally hold your tablet away is irrelevant. The 'retina' term isn't about you. It's about a typical user, with typical vision, holding the tablet at a typical distance, being unable to distinguish pixels.

    Typical users DO hold tablets further away, so it's perfectly logical.

    By 'Playing with the distance' you could indeed claim anything is retina - but that would make your claim incorrect because people don't hold the device at that distance, on average. The consensus amongst scientists and tech experts is that people DO hold tablets at the distance required to make this display retina.

    Off-screen benchmarks eliminate both resolution and v-sync as factors (v-sync in on-screen benchmarks is the only reason the iPad 2 was slower in any GPU benchmarks; it limits FPS). As a result, you are given an accurate comparison of GPU performance. The 'practical results' you describe are a very difficult metric to calculate. While you would seemingly advocate a raw FPS metric, that fails to take resolution into account.

    For example, is 100 FPS at 10 x 10 resolution better than 60 FPS at 2000 x 1000? Of course not.

    Whichever way you look at it, the new iPad has a GPU which is up to 4x faster than the fastest Android tablet. It also has the best resolution. Any games designed to run on that high resolution will be tested to make sure they run at a playable FPS so the 'real world' performance will be both higher resolution and just as fast as any Android tablet.

    You seem to be completely bitter and unable to admit Apple has the technological lead right now.
    Reply
  • seanleeforever - Monday, March 12, 2012 - link

    i didn't realize my 2 year old 1080p 65 inch TV was 'retina' display. Reply
  • Michiel - Friday, March 09, 2012 - link

    Envy eats you alive. Go see a shrink! Reply
  • medi01 - Saturday, March 10, 2012 - link

    Oh, sorry, I've forgotten it's a status thing.
    People paying 20-50 Euros less for a Samsung Galaxy obviously cannot afford these über-revolutionary devices, hence they could only envy.
    Reply
  • ripshank - Sunday, March 11, 2012 - link

    medi01: So sad. Your remarks only show your insecurity to the world.

    Relax, breathe and just let others enjoy their gadgets of choice rather than resorting to name calling and mockery. Realize these are friggin gadgets, not politics or religion. But from your comments, it's like Apple killed your family, took away your job and stole your wife.

    What is wrong with the world today when people get so worked up over an object?
    Reply
  • medi01 - Sunday, March 11, 2012 - link

    Ad hominem, eh?

    There is nothing wrong with objecting to lies.

    Reviewers "forgetting the iPhone in their pocket" for comparison photos where it would look pale, Nvidia's cherry-picked card pitted against AMD's stock card at the marketing department's request, and "off-screen benchmarks" all over the place - that's not simply bad, it stinks.
    Reply
  • stsk - Monday, March 12, 2012 - link

    Seriously. Seek help. Reply
  • doobydoo - Monday, March 12, 2012 - link

    1 - There is something wrong with objecting to lies INCORRECTLY. That's your own failing.

    2 - Ad hominem? I'll never understand why you Americans try to use that phrase all the time, as well as 'Straw man' - it not only makes you sound pretentious, trying to sound more intelligent than you are, it's also hypocritical:

    'Don't iZombie much, please.'

    Just say 'insults' - jeez.

    3 - Off-screen benchmarks are used by impartial review sites, as I explained above, because that is the only way to properly compare GPU performance. On-screen benchmarks have different resolutions and are limited by v-sync.

    4 - Claims of conspiracies on photos is just ridiculous.
    Reply
  • Greg512 - Monday, March 12, 2012 - link

    "you Americans"

    Way to be a pretentious hypocrite.
    Reply
  • cootang2 - Sunday, March 11, 2012 - link

    Display technology and Apple?? This display was designed, developed, and manufactured by another company. All Apple did for this display was buy it. Reply
  • stsk - Monday, March 12, 2012 - link

    And what "company" would that be? Please educate yourself about how contract manufacturing works before proclaiming your ignorance. If Foxconn assembles for a variety of different manufacturers, none of which have similar designs, does that make them all Foxconn devices? No, it doesn't. Most observers believe at least 3 separate manufacturers make displays for the new iPad (one of which apparently couldn't get sufficient yield from the emerging technology to stay in the game). Apple designed the device, spec'd the parts, contracted the assembly. If you can find another product with a similar display, buy it. Otherwise, STFU. Reply
  • ZeDestructor - Friday, March 09, 2012 - link

    Page 2 (CPU) "768MB would imply 512MB on one channel and 128MB on the other, delivering peak performance for apps and data in the first 512MB but lower performance for the upper 128MB"

    512MB + 128MB = 640MB. I think you meant to put in 256MB.
    Reply
  • JarredWalton - Friday, March 09, 2012 - link

    Fixed, thanks -- saw that as well when I was reading. :) Reply
  • quiksilvr - Friday, March 09, 2012 - link

    Another thing worth mentioning is that those Tegra 3 numbers come from a lower-end Tegra 3 on HONEYCOMB and not ICE CREAM SANDWICH. So please make sure this is addressed when the eventual battle occurs between the Asus Transformer Infinity 700 and the iPad 3. Reply
  • jjj - Friday, March 09, 2012 - link

    I would rather see a tablet with a sane screen and such a large battery; it would be way more useful.
    There is no reason to use this SoC in the next iPhone. It would be pointless to add such a GPU, and the vanilla A5 would be a better fit, so chances are we'll see a new one.
    The biggest drain on the battery should be the display's LEDs, since the new screen likely requires about 2x the backlight. Also, the extra bulk could be due to the screen being thicker; there was a lot of space to fit a bigger battery, so I doubt the battery is thicker.

    In the first table, for the new iPad's baseband, you listed the CPU/GPU instead of the Qualcomm chip used.
    Reply
  • ZeDestructor - Friday, March 09, 2012 - link

    Not any more than the iPad or iPad 2. Anand is correct here to talk about LTE. Radios do consume a fair bit of power when stressed. Especially with regards to data. Reply
  • jjj - Friday, March 09, 2012 - link

    Never mind about the supposed error in the first table; I guess for some reason that row is for the LTE model's SoC, not the baseband. Reply
  • Snowshredder102 - Friday, March 09, 2012 - link

    I tend to disagree; if you're going to be spending hours looking at something, it had better look good. The resolution is incredible, just a tad under the iPhone's PPI. Insane battery life isn't an issue with Apple's tablets; the battery life is already pretty good. Most people spend their time in areas where they have access to a charger: at work, in their car, at home. I think the performance increase far outweighs a larger battery. Reply
  • zanon - Friday, March 09, 2012 - link

    Thanks for the summary, it was thorough and covers everything that can be done until physical units are available for tear-down and benchmarking. It sounds like the iPad 3 is a good example of practicality in engineering. As you say, Apple has put a great deal of emphasis on a few key user-facing elements, but was willing to make a few sacrifices against the optimal in places in order to hit reasonable volume, price, and timelines. Or in other words, "Real artists ship". It would be nice if full 28nm for all chips, A15/A7 big.LITTLE and so on were ready now, but that's the tech world I guess, always something new just on the horizon. Have to draw the line somewhere.

    Apple does seem to be falling behind a bit in iOS UI paradigms. Beyond side-loading, I continue to think that the single biggest core flaw/missing feature in iOS is a replacement for the file system: a new and updated data interaction and interchange UI. So far, rather than address that, Apple has just gone with "nothing", and other attempts seem to just give up and use the traditional filesystem in one form or another, but there's a big opportunity there to push things forward, and it's key to actually making these devices real computer replacements IMO.

    Thanks again for this. I only had one minor quibble:
    >"The only downside is supply of these greater-than-HD panels is apparently very limited as a result of Apple buying up most of the production from as many as three different panel vendors."
    I think that's only a downside if the panels would have been available at the same time anyway. Often, though, Apple has gotten these deals by effectively fronting part or all of the immense capital needed to get new factories and manufacturing lines online, which brings forward when the parts can be produced in volume at all. Relatively quickly (6-12 months seems to be common), production will ramp up, exclusivity will expire, and everyone will have it. That doesn't seem any different than if no one had accelerated things in the first place, except that a few get it earlier.
    Reply
  • joelypolly - Saturday, March 10, 2012 - link

    Regarding the filesystem, I believe that Apple already showed us a bit with ML's Save To iCloud functionality. In reality people don't really care how it is stored as long as it is easy to access. Reply
  • ltcommanderdata - Friday, March 09, 2012 - link

    Any idea about what the memory clocks are? Memory bandwidth seems to be an important limiting factor in driving the Retina Display.

    Does iOS 5.1 include new GPU drivers? If there were performance improvements, Apple might not be stretching so much in claiming 2x performance difference between the A5 and Tegra 3.

    Hopefully, lack of mention about CPU clocks just means the difference is small rather than nonexistent. Every new Apple SoC has always improved CPU performance, but when the improvements were small, namely the 33% clock increase in the 2nd gen iPod Touch, it wasn't mentioned until discovered by developers. A small CPU clock speed bump to 1.2GHz for the A5X would be nice.
    Reply
  • ZeDestructor - Friday, March 09, 2012 - link

    I suspect it's similar to how the Samsung Exynos 3110 (Hummingbird/S5PC110) is configured, with the base clock driving everything else - in this case, 250MHz. Admittedly in a dual-channel configuration here. Reply
  • ltcommanderdata - Friday, March 09, 2012 - link

    Well Anand reported 400MHz LPDDR for the iPhone 4 and 800MHz LPDDR2 for the iPhone 4S. A bump to 1066MHz LPDDR2 seems reasonable. Reply
  • solipsism - Friday, March 09, 2012 - link

    I assume it's the same as detailed in their iPhone 4S article. If I recall correctly that is 3.8Gb/s. Reply
  • AMDJunkie - Friday, March 09, 2012 - link

    Under "A Much Larger Battery," the comparison chart says that the 11-inch MacBook Pro has a 63.5 Wh battery. It is the 13-inch MacBook Pro that has that capacity; I think the author may have mixed it up with the 11-inch MacBook Air. Reply
  • JarredWalton - Friday, March 09, 2012 - link

    Fixed (also goes for the comment below -- same error). Thanks! Reply
  • gorash - Friday, March 09, 2012 - link

    It says 11" MacBook Pro on the chart, I think that's a mistake. Reply
  • gorash - Friday, March 09, 2012 - link

    Yes, Apple is pushing high-res, but then again, the Galaxy Nexus already has 720p, and Transformer Infinity has 1200p. It's hard to say that it'll be a while before we start seeing high-res Android tablets. It seems that Samsung will easily make a 1200p tablet, hopefully on an OLED screen. Reply
  • ZeDestructor - Friday, March 09, 2012 - link

    There's a rumour at gsmarena (http://blog.gsmarena.com/samsung-galaxy-tab-11-6-w... that Samsung is planning to give us a 2560x1600 11.6" tablet. If they do that around the $500 mark, it will be quite the success. I for one just want 12 of those panels to build myself a 3x 5120x3200 desktop monitor. I'll also need 4 GPUs... Reply
  • Mike1111 - Friday, March 09, 2012 - link

    I think that a 2560x1600 11.6" Samsung tablet has a good chance of being announced in the next two months. And the specs will look great. But a 2560x1600 11.6" Samsung display will have a slightly lower ppi compared to the new iPad and it will be a Pen-Tile display, so the true ppi will be even lower. Reply
  • ZeDestructor - Friday, March 09, 2012 - link

    I'd say it's most likely an LCD, so most likely a standard RGB matrix. Reply
  • steven75 - Sunday, March 11, 2012 - link

    A Samsung Android tablet at those resolutions will make for a lot of extremely scaled up phone apps. Sounds great! Reply
  • Mike1111 - Friday, March 09, 2012 - link

    Well, that's the difference between announcing and selling. Asus announcing the Transformer Infinity first doesn't really mean anything. In just a week Apple will sell millions of new iPads in various countries around the world while Asus still hasn't even announced a price or date of availability of even one SKU in one country. Reply
  • gorash - Friday, March 09, 2012 - link

    They will be selling Transformer Infinity. Reply
  • iSayuSay - Saturday, March 10, 2012 - link

    Of course Asus and Samsung will sell their next-gen high-resolution tablets. The difference is that Apple announces the new iPad, or in fact any of their products, AFTER they are READY to sell it. They have real stock, ready to be distributed through their retail system, online or offline. Thus Apple makes money really fast, and here they are today, the most valuable company on earth!

    While Asus, Samsung, HP and most consumer technology companies are always babbling, panicking, and showing off about what they are going to do to face the competition, Apple or not. I wouldn't be surprised if Samsung came out with a 2560x1600 resolution on an 11" display; no doubt they're the king of LCD technology. But again, they just show you news, issues, press releases, maybe some bonky prototype, months and months BEFORE they are ready to sell it. And it comes directly from the company's PR. Again, they panic!!

    See the difference there, buddy? All will sell the product in the end, but not all will be bought by the market. One is full of $hi1t and big mouth; the other is consistent and only acts when ready. That's a big difference, dude!
    Reply
  • gorash - Saturday, March 10, 2012 - link

    Nice fanboyism, dude. Reply
  • Focher - Saturday, March 10, 2012 - link

    The "fanboy" / "fanboi" accusation is the new Godwin's Law. Reply
  • iSayuSay - Sunday, March 11, 2012 - link

    It's not fanboyism when it's true. Sure, Apple relies on Samsung for some chips, components and display technology. But the way they market and sell products is different.

    Maybe it's better for those tech companies to just shut up and steadily supply Apple for their iToy products, like Qualcomm does, because they'd make more money that way, in a more respectable and elegant way. Because so far, their marketing teams just get beat up and fail.
    Reply
  • medi01 - Sunday, March 11, 2012 - link

    It sure is better for Apple; it could increase their margins even further.

    But how on Earth is that better for customers? Does idiotic FUD have too much influence on you, so that you only consider buying from a single company? Oh well, good for you. But leave others alone.

    In smartphone market iOS to Android is about 1 to 3, and declining further. In a couple of years we'll have the same situation for tablets.
    Reply
  • steven75 - Sunday, March 11, 2012 - link

    Yes, when you see actual usage stats (website hits, etc.), it's quite clear that the majority of iPhone owners use their phone as a smartphone, while the majority of Android owners use their phone as a replacement for their dumbphone. In other words, Android is the new dumbphone, for whatever reason.

    Also, it's been TWO YEARS since the original iPad came out, and Android fans still say Android tablets are "eventually" going to overtake iPads in sales.

    It hasn't happened so far, and there are no signs of that changing anytime soon. There are still barely any Android tablet apps!
    Reply
  • iSayuSay - Sunday, March 11, 2012 - link

    I don't have problems with competition. In fact, competition is what makes iOS and Android being progressive as fast as they are today.

    Like I said earlier, I'm aware that Apple needs suppliers, and Samsung is one of them.

    But I really hate the way company like Samsung compete with Apple. No, I'm not talking about copying, it's an old story. But let's just see

    After the new iPad announced last week and turns out it has 2048x1536 resolution (and ready to sell it of course), oh yes Samsung come with another bluff that they will release (it's still prototype of course) another tablet with 2560x1600.. REALLY?

    I thought Android had settled on 16:9 as the standard aspect ratio for tablets? Why go with 16:10? I know why! Because Samsung wants everything just bigger than the iPad: at 2560x1440 (16:9), the vertical resolution would still be perceived as worse than the iPad's (1536 vs. 1440). Samsung wants everything to LOOK bigger than Apple's, even if it isn't always better.

    Companies like Samsung are so ambitious about taking down Apple that they no longer think much about ergonomics, careful design choices, or comfort. As long as it looks bigger or more powerful than the iToys, they'll sell it and boast about it. LOOK, our Galaxy Note or S II has a bigger screen, and it's 720p! Nothing like the small, inferior iPhone. REALLY? A design will make it to market as long as it makes the iToys look puny (while in fact they still sell extremely well no matter what Samsung does); other design considerations aren't all that important.
    Reply
  • doobydoo - Monday, March 12, 2012 - link

    'But how on Earth is that better for customers? '

    He didn't say it was. 'Straw man'!!! lol

    I do agree with you that it's better for customers to keep Samsung and others trying to compete with Apple - but the original post never said anything which contradicts this.

    iOS to Android is not 1 to 3, either. It's about 3:5 (iOS has roughly 30%, Android 48.6%). Google even admits it gets more search revenue from iOS users, and the iOS App Store is many times more profitable than Android's.
    Reply
  • doobydoo - Monday, March 12, 2012 - link

    Nice Apple-hating, dude?

    And good addressing of the points made.

    The obvious fact is that Apple is way ahead of both Samsung and Asus. The response is always 'but a new device out in x months will compete' - the point is that's too late. Everyone else is playing catchup right now.
    Reply
  • Penti - Saturday, March 10, 2012 - link

    Won't happen. Apple uses off-the-shelf parts, the same parts as anyone else. The tech does allow you to create a 9.7" 2048x1536 display, just as that resolution was done on the same tech, in lower-end/last-gen fabs, on 15" laptops ten years ago. It's a standard resolution, one normally supported by CRT monitors and VGA cards via their analog connections, at least since we had 300MHz RAMDACs, which takes you back to the 90s.

    Much higher than our current resolutions won't be supported on laptops, though, so you can forget about a 2880x1800 Macbook (there is no reason to go four times the resolution here; laptops haven't been stuck at one resolution and held back the way the iPad was). The integrated Intel graphics won't have the bandwidth to drive that resolution at 60Hz (or above), and GCN is the first architecture with the bandwidth to drive 4K displays over DP/HDMI, or anything over 2560x1600 @ 60Hz for that matter. You need enough LVDS, TMDS, or DisplayPort bandwidth to transmit a resolution like that, as well as video card support.

    Others have shown a range of resolutions up to 1920x1200, because that's what is in the supply chain or what they custom order; they of course have other considerations than marketing "as many pixels as the eye can see." While Apple was intentionally not upgrading its 1024x768 display, others were doing 1280x800 all the way down to 7.7" on the same display tech, and planning 1920x1200 on displays that are far less prototypes than the devices showing them off. But major product updates and releases won't come at just any time, so you would have to wait until they replace their top products, just as you have to wait for Apple's release cycle to put out new stuff. Of course Apple uses parts from these companies that are "babbling": displays, memory, NAND storage, batteries, SSDs, hard drives, GPUs, wireless hardware, mobile graphics vendors, and so on. The PowerVR SGX 543MP4(+) is already out in consumer products like the PS Vita, for example.

    Samsung is one of the major manufacturers of these new iPad LCD-panels. It's Samsung's, LG Display's or Sharp's own tech.
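    The link-bandwidth ceiling described above can be sanity-checked with a quick back-of-the-envelope calculation. This is only a sketch: the 24 bits per pixel and ~20% blanking overhead are rough assumptions, not exact CVT timings.

```python
# Approximate video-link bandwidth needed to drive the resolutions discussed
# above. 24 bpp and a ~20% blanking overhead are assumptions, not CVT timings.

def link_gbps(width, height, refresh_hz, bits_per_pixel=24, blanking=1.20):
    """Rough raw bandwidth needed to drive a panel, in Gbit/s."""
    return width * height * refresh_hz * bits_per_pixel * blanking / 1e9

for w, h in [(1920, 1200), (2048, 1536), (2560, 1600), (2880, 1800)]:
    print(f"{w}x{h} @ 60Hz needs roughly {link_gbps(w, h, 60):.1f} Gbit/s")
```

    With these assumptions, 2560x1600 @ 60Hz comes out near 7 Gbit/s and 2880x1800 near 9 Gbit/s, against the ~8.6 Gbit/s of video payload a 4-lane DisplayPort 1.1 link provides after 8b/10b encoding, which is the ceiling being pointed at.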
    Reply
  • vFunct - Friday, March 09, 2012 - link

    It's entirely possible for Apple to add an ARM cpu to a Macbook Air and allow for iOS operations to happen in a separate window.

    Actually, this could be a function of the monitor itself.
    Reply
  • tipoo - Friday, March 09, 2012 - link

    Intel's own SoCs can use binary translation to run ARM apps, so two chips would be unnecessary. They could do it for power savings, though, with an instant-on iOS mode or something, but I doubt they would since they already get near-instant-on with SSDs. Reply
  • macs - Friday, March 09, 2012 - link

    The sad part is that we probably won't see an A15 iPhone for the next 18 months... Reply
  • tipoo - Saturday, March 10, 2012 - link

    Cortex A15 wasn't due for another year anyways. Reply
  • Subzero0000 - Friday, March 09, 2012 - link

    Does it really look that bad ?

    'cus I tried really hard looking at my iPad 2 screen, and the Safari icon looks nothing like the image in this article.
    Maybe it's a good thing that I can't tell the difference...

    Text is a bit blurry, though, that's true.
    Reply
  • tipoo - Saturday, March 10, 2012 - link

    Maybe I'm more sensitive to it, but I can definitely see the pixels in the borders of things like the Safari icon. The AT images are zoomed in, of course, so you notice it more, but it's pretty easy to see at a glance on the iPad 1 and 2. Reply
  • ananduser - Friday, March 09, 2012 - link

    What you didn't mention in your analysis is that the "new" resolution is the result of a need, not a wish to trump the competition on specs. Apple couldn't have chosen 1600x1200 or 1920x1200 or standard 1080p because of iOS's lack of resolution independence. As they did with the iPhone, Apple invested in a custom screen size just so the ecosystem could introduce a 2x scale, and voila, instant upgrade. Reply
  • ZeDestructor - Friday, March 09, 2012 - link

    Not really custom. High-end 19"+ CRTs back in the late 90s did 2048x1536@72+Hz as a matter of routine. Reply
  • Roland00Address - Friday, March 09, 2012 - link

    Nobody is shipping a device with 1600x1200 or 1920x1200. Yes, there will be competitors later this year with those resolutions, but nobody is shipping an IPS panel at those resolutions in a 10-inch form factor right now. Thus Apple is not saving money by merely retooling some other panel.

    Yet there is a reason besides higher dots per inch for choosing something with more than 1200 pixels of height. When you turn the tablet so the skinnier side runs left to right, the maximum webpage it can draw is 1200 pixels wide without zooming or cutting off part of the site. Most webpages are designed for the following resolutions: 1024x768, 1280x1024, 1366x768. By picking something at least 1366 wide, you are guaranteed to display the entire webpage.

    Furthermore merely doubling the resolution makes it a lot easier to port apps to the higher screen resolution.

    If you need custom panels anyway, why not pick a resolution that makes sense from an app-development perspective as well as being more useful for displaying webpages? Why follow the resolutions that are traditional on desktops? An iPad is not a desktop device, so why saddle it with desktop baggage when you're starting from scratch?
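    The "doubling makes porting easier" point can be illustrated with a toy coordinate mapping (the coordinates here are hypothetical, not Apple's actual API):

```python
# Exact 2x scaling maps every legacy layout coordinate onto a clean 2x2 block
# of physical pixels, so old apps scale up with no fractional positions.

def points_to_pixels(x_pt, y_pt, scale=2):
    """Map a layout coordinate in points to a physical pixel origin."""
    return (x_pt * scale, y_pt * scale)

# A button at (100, 200) in a 1024x768-era layout lands exactly at (200, 400)
# on the 2048x1536 panel -- no rounding, no blurry resampling.
print(points_to_pixels(100, 200))

# A non-integer jump (1024 -> 1366 is ~1.33x) would leave the same button at
# fractional pixel positions, which is what forces resampling and blur:
print((100 * 1366 / 1024, 200 * 1366 / 1024))
```

    With an integer scale, every old asset and touch target stays pixel-aligned, which is why existing apps can be scaled up automatically.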
    Reply
  • joelypolly - Saturday, March 10, 2012 - link

    It isn't a new resolution; it's been a standard LCD resolution called QXGA for many years. In fact, IBM shipped a 15" ThinkPad with a QXGA screen, which looked awesome compared to the standard XGA screens at the time. Reply
  • doobydoo - Monday, March 12, 2012 - link

    It's not because of a 'lack of resolution independence' at all. The iPhone and iPad both have different pixel ratios and both run iOS.

    It's to make life easier for app developers, and means that any existing apps for the iPad 1 or 2 can be scaled up automatically.

    The Samsung Galaxy got this wrong by switching the ratio between generations. It quickly gets very messy for developers.
    Reply
  • prophet001 - Friday, March 09, 2012 - link

    Hey Anand,
    I was wondering if you could elaborate more on the differences between the usability of a tablet and that of a laptop. Perhaps write a short article. I know that there are substantial hardware differences. However, what are the OS level differences that restrict what things you can do on a tablet? Thanks for this article and all you guys do.
    Reply
  • classy - Friday, March 09, 2012 - link

    It looks like a true top-notch tablet, but the price just seems too high. I have found that, more often than not, unless you are a reader, a laptop is better. Reply
  • jihe - Friday, March 09, 2012 - link

    And if you are a serious reader, a Kindle is better. Reply
  • SixOfSeven - Friday, March 09, 2012 - link

    Depends on what you're a serious reader of. If it's technical literature, you need size and resolution. If it's a novel, pretty much anything will do (hence the Kindle). The first two iPads require too much panning and zooming for the sort of stuff I read; perhaps this one will be better. Reply
  • name99 - Friday, March 09, 2012 - link

    Buy yourself a copy of GoodReader NOW!

    Most of what I read on my iPad is technical PDFs, and GoodReader does a good job of allowing you to define crop margins so that the relevant text covers the entire page. For PDFs targeting A4 or US Letter it works really well.

    I assume you are currently reading PDFs in iBooks? That's garbage --- iBooks is pure crap when it comes to handling PDFs. Not to mention, GoodReader also does a much better job of allowing you to file a large number of PDFs in a hierarchical system.
    Reply
  • Michiel - Friday, March 09, 2012 - link

    Why do you think serious readers exclude serious gamers, mailers, tweeters, Facebookers, bloggers, movie fans, TV watchers, musicians, photographers, students, etc., etc.?

    Oh, by the way: the Kindle excludes all of them.

    If you want to mock the iPad, think first!
    Reply
  • Michiel - Friday, March 09, 2012 - link

    Are you all so clever, or am I an idiot?

    Why in the name of whatever is it so damn hard to figure out what the purpose of an iPad is?

    I'll go to the store on my bike.
    Wait ! A motorbike is better.
    Wait ! A car is better.
    Wait ! A Ferrari is better.
    Wait ! Give me an airplane.
    Wait ! Give me a 747.

    Is it so hard to see things in perspective?
    Reply
  • Lucian Armasu - Friday, March 09, 2012 - link

    If the CPU hasn't changed and the GPU doesn't even compensate for the increase in resolution, can you really say the iPad 3 is faster than the iPad 2? Somehow I think it isn't, and even if it were, as you said, some apps and games that worked on the iPad 2 won't work well on the iPad 3 at the new resolution (unless they keep the old one).

    As for Windows 8 tablets, to do that you'd need the x86 version of Windows 8. As far as I know, Atom doesn't support such high resolutions, so it won't be competitive with high-end ARM chips - unless you're really thinking of using Ivy Bridge Core i3-i7 chips in your $1000+ tablet. Not to mention that graphics performance would be terrible even then, and you'd need an even bigger battery than the one in the iPad 3 just to get half the battery life.

    So far I still see Android as the only one that can seamlessly work between a tablet and a laptop form factor, although it does need a few more improvements to allow for a more "desktop-like" feel when in laptop mode (even though Windows 8 is wrongly going in the opposite direction).
    Reply
  • vision33r - Friday, March 09, 2012 - link

    This is why Apple has sold so many millions of iPads: they're targeted at people who don't want to be stuck behind a PC or a notebook.

    Average folks just need a web pad for viewing sports scores, lottery numbers, and news, reading a few books, and playing a nifty game.

    They don't need a quad core i7 CPU overclocked and 1TB of disk space.

    iPad sells because the buyer does not want to understand specs.

    Specs are the reason Android tablets won't succeed. They're killing the Android tablet market: more vendors will bail as profits are razor thin and depreciation is fast and steep.

    When you Android folks only look at paper specs, 90% of Android tabs are already outdated in 3-4 months, since you guys will only buy the highest paper specs.

    There's zero incentive for a vendor to push out a high-quality tablet if the margins are so thin.
    Reply
  • Icehawk - Friday, March 09, 2012 - link

    I totally agree, as both a tech guy and an iPad owner. I use my PC for most "computing" tasks, but I use the iPad for reading, looking something up quickly in front of my TV, and casual gaming - and for that, specs don't matter, just how it works. I passed on the iPad 2 waiting for the resolution bump the iPad has desperately needed, along with increased memory - IME, RAM and poor coding are why the majority of apps that crash do so.

    I'm curious whether app sizes will increase significantly due to higher-res resources.
    Reply
  • gorash - Friday, March 09, 2012 - link

    Android tablets don't sell because Honeycomb is a temporary OS. With Ice Cream Sandwich and Jelly Bean they should do well. Reply
  • WaltFrench - Friday, March 09, 2012 - link

    When I look at the apps that *I* see people using on the iPad (in coffee shops, airplanes, BART, friends' homes…), essentially NONE are maxing the GPU, and CPUs are maxed only occasionally.

    That's cuz I don't bump into gamers. But the sales figures say iPad users are buying lots of games, so that's the only likely reason for the GPU upgrade: to support hi-res gaming, which was already no slouch on the iPad versus other tablets. Game devs can chip in, but with relatively little fragmentation of the iPad line (accentuated by the fair assumption that the really hardcore types will immediately upgrade to the iPad 3), developers have a pretty easy target to optimize for. That should result in extremely playable, very good-looking games.

    Operations such as surfing, which spend relatively little time rendering text or elementary graphics, won't be bottlenecked by speed, either.

    The new GPU power seems to be exploited very nicely in the iMovie and iPhoto apps. Here, the transformations are applied to the whole photo image, while only a relatively low-res screen view needs updating. Since the number of pixels on the screen is not the limiting factor, the net transformation should be quite a bit faster (assuming it's actually done in real time, as opposed to a series of filter codes stored with the base image).

    Already, iMovie was a killer app on the iPad 2; the only Android equivalent I saw was hopelessly buggy, and with the level of Android tablet sales over the past two years, there's exactly zero developer incentive to tweak for all the CPU/GPU variations. Again, Apple is extending its lead in quality of user experience.
    Reply
  • doobydoo - Monday, March 12, 2012 - link

    'If the CPU hasn't changed and the GPU doesn't even compensate for the increase in resolution, can you really say that iPad 3 is faster than iPad 2, then?'

    Arguably, yes. Is 100 fps at 10 x 10 better than 60 fps at 2000 x 1000? No.

    You have to take resolution into account when assessing speed. Also, bear in mind that the iPad 2 was not "too slow": it ran games at a perfect 60 FPS, since games were designed to run on it. The iPad 3 therefore doesn't have to be "faster" in raw FPS terms; its aim, clearly, was to increase gaming quality without sacrificing performance.

    I also disagree that the GPU necessarily doesn't compensate for the increase in resolution. Despite what Anand says in this article, FPS does not reduce proportionally with resolution, so it may well be that the MP4 compensates for the increase in resolution perfectly well.
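    One way to make the fps-vs-resolution comparison concrete is pixel throughput (fill rate): frames per second only mean something when multiplied by the pixels in each frame. A minimal sketch, using the numbers from the argument above:

```python
def fill_rate_mpix(width, height, fps):
    """Pixels drawn per second, in megapixels."""
    return width * height * fps / 1e6

print(fill_rate_mpix(10, 10, 100))      # the "100 fps at 10x10" case
print(fill_rate_mpix(2000, 1000, 60))   # the "60 fps at 2000x1000" case

# Holding 60 fps while going from 1024x768 to 2048x1536 is a 4x jump in
# pixels pushed per second -- the load the new GPU has to absorb.
print(fill_rate_mpix(2048, 1536, 60) / fill_rate_mpix(1024, 768, 60))
```

    The second case pushes four orders of magnitude more pixels per second than the first, which is why raw fps alone says little about speed.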

    'some apps and games that worked on iPad 2, will not work on iPad 3 with the new resolution'

    I suspect this is simply wrong. Either the iPad 3 will have been tested against mainstream older apps, or it may retain the ability to render games at the old resolution and upscale, giving developers the best of both worlds: better quality, better performance, or both.

    'So far I still see Android as the only one that can seamlessly work between a tablet and a laptop form factor'

    The problem is that Android tablets are nowhere near as polished or as high-performance, with lower-resolution screens and slower hardware. Using OnLive on the iPad you can get a complete desktop experience - not that that's what most people who buy tablets are looking for.
    Reply
  • JK6959 - Friday, March 09, 2012 - link

    The GPU upgrade was there to push the 4x increase in pixels at a usable speed, but the cost of more GPU and resolution seems to be a large fall in effective battery life per Wh. Given that they can't chunk up the phone for a fat battery, and that the doubling of GPU power would be wasted, I can't see the benefit of putting this into the phone.

    If they increase the screen to 3.7-4.0 inches and use the old iPad resolution to maintain retina DPI and some overlap between iPad and iPhone resolutions, the old A5 would still have more power than needed. For example, a full-HD gaming laptop may get a GTX 560, but you're not going to put that in a 1024x600 netbook; it's simply too much.

    Apple have shown they're willing to delay until a product is ready. I think they'll get something more efficient into the next iPhone, more than just a die shrink or extra GPU cores. I hope for a 28nm A15; as TI have shown, only 800MHz is needed for fast performance.
    Reply
  • WaltFrench - Friday, March 09, 2012 - link

    Methinks much of the extra battery drain is (1) LTE and, even more, (2) the hi-res screen, which appears less efficient. Assuming it's important enough to spend a few engineer-months on, turning off unused GPU capacity would seem to moot the question of extra units. Reply
  • tipoo - Friday, March 09, 2012 - link

    Possibly a stupid question, but are we sure they are the same? They specifically said saturation at the keynote, and they've used the term gamut before. Some Android phones' displays have lots of saturation but less gamut. Reply
  • gorash - Friday, March 09, 2012 - link

    It would suck if all they did was increase the saturation via software. That's kinda what I assumed. Reply
  • jabber - Friday, March 09, 2012 - link

    ...there seems to be far less press hyperbole this time around.

    Maybe folks are just bored with tablets now.
    Reply
  • Torrijos - Friday, March 09, 2012 - link

    First off I've been trying to find the anandtech article on mobile multi-threading analysis mentioned in the article to no avail so if anybody can point me in the right direction I would appreciate it.

    On the subject of multi-threading, it feels to me that, benchmarks aside, there aren't a lot of usage models that benefit from 4 vs. 2 cores on mobile platforms right now.

    While Apple keeps tight control over multitasking, leading to a lower load on the CPU, they have also been trying to facilitate real-life use of multiple cores with Grand Central Dispatch, which they were pushing even a year before the release of their first multi-core mobile device.

    My question would be: are the benefits of multi-core architectures real in mobile devices right now?
    Even on mainstream desktop computers we barely see multi-threaded mainstream software (not talking about pro or engineering tools), so how much software is optimized for multiple cores on each mobile platform (iOS, Android, W7)?
    Also, to benefit from better multi-threaded performance, what version of Android do you have to use?

    Even the web benchmarks used to test multi-threaded performance have to be taken with a grain of salt, since network performance would probably end up annihilating the benefits of a faster CPU in real-life usage.

    While it's always nice to talk about hardware specs, I feel we still lack a good usability measure for mobile platforms, beyond ideal-case webpage loads that would be hampered by real-life networks.
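    One way to probe the 2-vs-4-core question empirically is a toy experiment like the one below: parallel speedup only appears when the workload is CPU-bound and divisible, which is rarely the case for typical mobile apps. This is a loose stand-in for what dispatch-queue APIs such as Grand Central Dispatch let apps express; the task and its size are arbitrary illustrations.

```python
import time
from concurrent.futures import ProcessPoolExecutor

def busy(n):
    """A CPU-bound stand-in task: sum of squares up to n."""
    return sum(i * i for i in range(n))

def timed(workers, tasks):
    """Wall-clock time to run the task list with a given number of workers."""
    start = time.perf_counter()
    with ProcessPoolExecutor(max_workers=workers) as pool:
        list(pool.map(busy, tasks))
    return time.perf_counter() - start

if __name__ == "__main__":
    tasks = [500_000] * 4
    for w in (1, 2, 4):
        print(f"{w} worker(s): {timed(w, tasks):.2f}s")
```

    On a divisible CPU-bound load like this, 2 and 4 workers show real gains; replace busy() with waiting on I/O or user input and the extra cores buy nothing, which is the usage-model point above.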
    Reply
  • WaltFrench - Friday, March 09, 2012 - link

    Most of the time, most apps are twiddling their thumbs waiting for user events. Others may be waiting on network events. But if few iOS apps make use of multi-threading, it must be because the extra work and risk of introducing bugs doesn't result in an app that is perceptibly better to the user. Reply
  • tipoo - Friday, March 09, 2012 - link

    70% is a massive increase in battery capacity. I wonder if that was mainly for the LTE, or if the Retina display draws a lot more power too? For less intensive work where the CPU and GPU can idle a lot, like web browsing, would it likely last longer than the 2? Reply
  • tipoo - Friday, March 09, 2012 - link

    Also, I've been wondering: they didn't say anything about the CPU in the keynote, but would it require that new heat spreader just for moving from the MP2 to the MP4 graphics? The PS Vita does not have one, yet it has twice the CPU cores and the same MP4 graphics; maybe the iPad's CPU cores are clocked higher? Reply
  • solipsism - Friday, March 09, 2012 - link

    It's for the Retina display and associated components, as the system still gets 10 hours of web surfing, video, or music on the WiFi-only model, just like the previous iPads. They do omit the standby time this time around. With a 70% larger battery, isn't it safe to assume standby should have increased by around 70%? Reply
  • tipoo - Friday, March 09, 2012 - link

    I feel like if it went up by 70% they would have said so. But on the other hand, in the model without the wireless radios, I don't see why it wouldn't have. Reply
  • solipsism - Friday, March 09, 2012 - link

    The reason you typically don't state something is that it's not favourable, but it's hard to imagine it's lower. After all, we are talking about a 70% larger battery with no display or active system resources being used in standby mode.

    I hope AT or someone tests this but obviously this will take a very long time.
    Reply
  • joelypolly - Saturday, March 10, 2012 - link

    Might take a few weeks to get the results. Idle standby (radio & wifi off) on the iPhone 4 I had was around 14 days. Reply
  • Death666Angel - Friday, March 09, 2012 - link

    I'm no Apple fan (I own none of their products and would advise my family and friends to go with other brands 90% of the time). But if I had the spare change for a toy like this, I'd totally buy an iPad 3 over the Android competitors. As it stands now, I don't have the money (I need better graphics cards when the 28nm generation is fully launched, to drive my new 27" display), so I will wait a few years until such a device can replace my subnotebook. I'll be interested to see whether it will be running Android, iOS, or Windows. :-) Reply
  • berma001 - Friday, March 09, 2012 - link

    Any chance Flash will be viewable on the new iPad? Reply
  • vision33r - Friday, March 09, 2012 - link

    Flash is dead, Adobe already stopped development. You can still use it on Android but they won't do any new releases with new features. Reply
  • WaltFrench - Friday, March 09, 2012 - link

    hahahahaha.

    Yeah, for apps that a developer converts to Air.

    Adobe announced that they were NOT porting Flash to any new mobile platforms. And it's still Copyright © Adobe, Inc., so even if Apple had an, ahem, "change of heart," they still couldn't.

    There are now something like a billion mobile devices that can't view Flash. Smartphone sales outpaced PC sales last year, and with tablets (uhh, iPads) growing explosively into the notebook space, they are likely to outpace PCs in a year or two, also. Windows on ARM ain't gonna support it either.

    In other words, websites that rely exclusively on Flash will soon be unwatchable on the majority of web-accessing devices. The smart ones see that coming. Those that cater to people running Flash games or even (gasp!) work-related videos on their desktop office computers have a bit more time before they, too, watch page views go to zero.
    Reply
  • Michiel - Friday, March 09, 2012 - link

    People who wait for Flash on a tablet, iPad, make me sad somehow.

    It started as Macromedia Flash about 150 years ago. Back then it was just as bad as it is now.

    Hurray for the death of Flash ! Long live HTML-5 and thank you so much Apple for helping Flash out of its misery !
    Reply
  • doobydoo - Monday, March 12, 2012 - link

    If you're desperate for flash, try Skyfire, iSwifter, or Onlive, all of which offer flash on iPad. Reply
  • flyguy29 - Friday, March 09, 2012 - link

    More power, new form factor. Reply
  • xype - Friday, March 09, 2012 - link

    "It's clear to me that Apple is trying to move the iPad closer to the MacBook Air in its product line, but it's unclear to me whether (or when) we'll see convergence there."

    Uhm, can you elaborate on that? How is that "clear" in any way? Because both use aluminium and are thin while also sporting a display?

    If anything, Apple has been very vocal about their thinking that these are two very different kinds of devices—they even call it PC vs Post-PC. Do you think that’s just marketing? That it’s not actually how they feel about it?

    This feels a bit like an argument from someone who would _want_ the devices to all be the same, ultimately converging into a PC-like experience again. I don't think that's really going to happen with Apple, at least not in a big way like Microsoft is attempting. Sure, they'll port some of iOS to OS X and vice versa, but I don't think they'll consider merging the two for at least a couple of years.

    Microsoft is the first doing it (and hats off to them, the Metro UI is a very bold step), but they were the first doing tablets, too, remember?
    Reply
  • c4v3man - Friday, March 09, 2012 - link

    Why not spend another $5-10 on components and make a $600 32GB Transformer the base model? That way you still maintain most of the profit margin you want, while also being competitive on cost. I can appreciate that you are using some components that may be considered better than the new iPad's, but you are also using some that can be considered worse. Past experience shows that tablets priced higher than Apple's fail in the marketplace, since people can't accept a reality where Apple isn't the "premium offering". Reply
  • Lucian Armasu - Friday, March 09, 2012 - link

    By the way, Anand: is there any way to test graphics performance at native resolutions anymore? I think you should bring that back and show the fixed-resolution and native-resolution tests side by side. I actually think the iPad 3 suffered a very significant performance drop due to the new resolution, just like the iPhone 4 was always at the bottom of the graphics tests because of its retina display.

    So I'm aware that the chip itself should be faster when comparing everything at the same 720p resolution. But that doesn't mean much for the regular user, does it? What matters is real-world performance, and that means it matters how fast the iPad is at its *own* resolution, not a theoretical lower resolution that has nothing to do with it.
    Reply
  • WaltFrench - Friday, March 09, 2012 - link

    Let's think this through a bit. Users don't run GLBench or that sort of stuff; they run games that a developer has tweaked for a platform. Subject to budget — I haven't had the pleasure myself, but hear tell that it requires good, solid engineering and lots of it — the developer puts out the best mix of resolution & speed that will please the customers the most. (Who would do otherwise?)

    Obviously, if you don't have the resolution, you go for fps. So it's conceivable that a 720p device could show better speed. But that'd only be true if the dev was pushing so hard on the iPad's 4X of pixels that he sacrificed play speed. If it came to that, he'd pull back on AA or other detail/texture quality efforts. Right? Wouldn't you?

    So what I think it comes to is how hard a given game dev will work on a particular platform's capabilities. Here, fragmentation and total sales come to play, big time. Anand might be able to give you a theoretical tradeoff that a dev faces, but it might be quite the challenge to translate that into how well gamers would like a given device for stuff they can actually play.
    Reply
  • medi01 - Saturday, March 10, 2012 - link

    In other words, did Apple's marketing department forbid you doing native resolution benchmarks? Reply
  • doobydoo - Monday, March 12, 2012 - link

    Native resolution test is a flawed test.

    As I've explained in reply to your other comments, performance has to take resolution into account. I.e., 100 fps at 10 x 10 is clearly worse than 60 fps at 2000 x 1000.

    It's very telling that you make this suggestion now that Apple has come out with the highest-resolution device - not something you requested previously, when Android tablets had higher resolutions.

    The iPhone 4 was never bottom of any sensible benchmarks because of its retina display. The tests, as always, were done at the same resolution as they always should be. The iPhone 4 was low down in the benchmarks because it had a slow GPU.
    Reply
  • rashomon_too - Friday, March 09, 2012 - link

    If displaysearchblog.com is correct (http://www.displaysearchblog.com/2012/03/ipad-3-cl... ), most of the extra power consumption is for the display. Because of the lower aperture ratio at the higher pixel density, more backlighting is needed, requiring perhaps twice as many LEDs. Reply
  • jacobdrj - Friday, March 09, 2012 - link

    I am no coder, but even on my gaming rig I have a 27" 1920x1200 display (that I admittedly paid too much for), flanked by two inexpensive 1080p displays rotated into portrait mode for Eyefinity and web browsing. Reply
  • IHateMyJob2004 - Friday, March 09, 2012 - link

    Save your money.

    Buy a Playbook
    Reply
  • tipoo - Friday, March 09, 2012 - link

    Save your money. Buy a toaster.

    Wait no, I forgot the part where they do different things :P
    Reply
  • KoolAidMan1 - Monday, March 12, 2012 - link

    Not to mention that toasters are actually useful Reply
  • name99 - Friday, March 09, 2012 - link

    "We could still be looking at a 1GHz max operating frequency."

    In all the playing with demo models, was no one able to sneak in any sort of benchmark, or even get an "it feels faster" impression?

    Ignore LTE; consider the WiFi models.
    My iPad 1 gets astonishing battery life (like 20hrs) if it is only playing movies. That tells me the screen just doesn't consume much power (and neither do the h.264 decoder and flash).
    Reading PDFs in GoodReader gives me substantially less time, maybe 10hrs, which tells me the CPU (plus RAM) still uses a remarkably large fraction of the power (and, perhaps, that GoodReader really ought to do a better job of letting the CPU sleep frequently).
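    Put into numbers (the iPad 1's roughly 25 Wh battery is real; the runtimes are the observations above, and the resulting power split is an inference for illustration, not a measurement):

```python
# Average power draw implied by observed runtimes on a ~25 Wh battery.
BATTERY_WH = 25.0

def avg_draw_w(hours):
    """Average system power draw implied by a given runtime, in watts."""
    return BATTERY_WH / hours

movie_w = avg_draw_w(20)  # screen + h.264 decode + flash
pdf_w = avg_draw_w(10)    # same screen, but CPU/RAM kept busy
print(f"movies: {movie_w:.2f} W, PDFs: {pdf_w:.2f} W")
print(f"implied extra CPU/RAM draw while reading: {pdf_w - movie_w:.2f} W")
```

    If the screen dominated, the two runtimes would be close; the 2x gap is what points the finger at the CPU and RAM.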

    OK, switch to the iPad 3. Why should we believe the new screen uses substantially more power than the current screen if run at the same brightness? Most of the power burned by a screen goes into creating the actual light, not into toggling each LCD transistor.

    Yet we have basically twice the battery available. This suggests to me EITHER

    - Apple REALLY wants the device to have a long runtime as a game machine while the GPU is burning twice as much power as in the iPad 2. This is probably true --- Apple seem to be pushing the "replacement for Xbox, PlayStation, and Wii, and their portable versions" angle pretty strongly, and maybe they have data somewhere showing the number-one complaint of parents who buy competing portable gaming systems is that they run out of juice halfway through an 8-hour drive or flight, leaving junior screaming and whining.

    AND/OR

    - we have increased the CPU/RAM maximum clock by maybe 30% or so, allowing for higher speed than iPad2 with the same sort of battery life for CPU intensive tasks (and longer battery life for simpler tasks like movies or listening to audio)

    Why didn't Apple just say the CPU is 30% faster? For the same reason Apple never wants to trumpet specs. They want to give the impression that their users don't have to worry about specs or comparison-shop --- Apple will just make sure that what they are buying at any point is a reasonably well balanced compromise between cost, performance, and battery life. They usually choose one semi-technical headline item to trumpet as "rah rah, better than last year" while remaining silent about the rest --- look at these sorts of product announcements for the past ten years. So, for example, this year it was "ooh, twice as powerful GPU", but they didn't, for example, mention twice as much RAM (that slipped out from a 3rd party on stage), let alone how fast it is. Likewise, in the past, for different phones and iPads they haven't mentioned when they switched from DDR2 to DDR3 RAM. Occasionally (e.g. the 3GS) the CPU speed IS the headline item, and then we get the bar graph of how it is twice as fast as its predecessor, but usually "faster CPU" is just taken for granted.

    Point is, to me there are multiple lines of evidence pointing to a CPU and/or RAM clock boost.

    Also it would be nice to know (if not possible while at the Apple press conference, then at least as soon as we have devices in the wild) the other two performance numbers
    - has the WiFi gained either 40MHz channels or 2x2:2 MIMO, so it's faster than the current 65/72Mbps PHY?
    - is the flash any faster?
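    For context, either WiFi upgrade asked about would roughly double the top link rate. A sketch of the standard 802.11n maximum PHY rates (MCS7 for one spatial stream, MCS15 for two, values from the 802.11n rate tables; current iPads sit at one stream, 20 MHz):

```python
# Standard 802.11n top PHY rates in Mbps, keyed by
# (spatial_streams, channel_width_mhz, short_guard_interval).
phy_rate_mbps = {
    (1, 20, False): 65.0,   # current iPad class, long GI
    (1, 20, True): 72.2,    # current iPad class, short GI
    (1, 40, False): 135.0,  # wider channel
    (1, 40, True): 150.0,
    (2, 20, False): 130.0,  # second spatial stream
    (2, 20, True): 144.4,
    (2, 40, False): 270.0,  # both upgrades
    (2, 40, True): 300.0,
}
```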
    Reply
  • tipoo - Wednesday, March 21, 2012 - link

    Same CPU speed is confirmed now; it benchmarks exactly the same in anything CPU-bound. Reply
  • Supa - Friday, March 09, 2012 - link

    Great review, easy to read, direct to the point yet quite informative.

    Some sites will bash Apple just to get attention, others write generic reviews that have little depth.

    It's been refreshing to read, to say the least.
    Reply
  • WaltFrench - Friday, March 09, 2012 - link

    “Apple continues to invest heavily in the aspects of its devices that users interact with the most frequently. Spending a significant amount of money on the display makes a lot of sense. Kudos to Apple for pushing the industry forward here.”

    And AnandTech continues to emphasize the aspects of technology that end up actually mattering in the real world. Kudos to this fine site for not obsessing over features that nobody can get any benefit out of.

    Meanwhile, it'd be good to look at the usage pattern that is evolving. Apple's iMovie, for example, seems to have been unparalleled before they upgraded it this week. A customer can go into an Apple store and ask to see iMovie demo'd, but they are unlikely to get a good feeling AT ALL if they go into their Staples or Best Buy and ask to see what it'd be like to slap together a 60-second homebrew video for Facebook, on any other tablet. If music, photos and video are what drive users' buying decisions, then competitors are going to have to sink a fair amount of energy into finely-tuned apps for those areas.
    Reply
  • spda242 - Saturday, March 10, 2012 - link

    Anand/Brian, could you please consider investigating/writing an article about why we Europeans are screwed by Apple when it comes to LTE support for our frequencies?

    I just don't get it. Is it about hardware, software, marketing decisions, or antenna design?

    I have spent hours on the web trying to understand why Apple hasn't released a European LTE version of the iPad, but no one seems to know.
    Reply
  • Pantsu - Saturday, March 10, 2012 - link

    US LTE is different from EU LTE: different frequencies, and in practice far slower too. On the other hand, LTE support isn't all that important in Europe at the moment, since the operators aren't ready yet.

    The iPad does support DC-HSDPA in Europe, which is pretty much equivalent to US LTE.
    Reply
  • spda242 - Saturday, March 10, 2012 - link

    I am from Sweden and we have quite good LTE coverage, and as far as I understand other European countries (Germany and the rest of Scandinavia, for example) are getting there, though the UK, for example, is completely lost so far when it comes to LTE.

    To buy a new iPad (and later on the new iPhone?) without LTE support would feel like buying last year's product. I don't buy it for this year only; I want to use it for some years, and of course Apple has "sold" me an LTE device and now I want it.

    But my question was rather whether there are technical reasons (and if so, which ones), or whether it's a marketing decision.
    Reply
  • Steelbom - Saturday, March 10, 2012 - link

    Anand, you said: "Also keep in mind that memory bandwidth limitations will keep many titles from running at the new iPad's native resolution. Remember that we need huge GPUs with 100s of GB/s of memory bandwidth to deliver a high frame rate on 3 - 4MP PC displays. I'd expect many games to render at lower resolutions and possibly scale up to fit the panel."

    However, Real Racing 2 supports 1080p output (not upscaled) on an HDTV at 30 FPS. That's 2 million pixels; 1536p is only another 1.1 million, and the new chip has two additional PowerVR SGX543 cores to help it along. I don't know what the memory bandwidth of a PowerVR SGX543 is, or whether it scales across multiple cores, but wouldn't the two additional 543s mean it could handle 4 million pixels at 30 FPS?
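    The pixel arithmetic in the question checks out (assuming 1920x1080 for the 1080p output and the new iPad's 2048x1536 panel):

```python
# Pixel counts behind the comparison above.
p_1080p = 1920 * 1080      # 2,073,600: the "2 million pixels"
p_ipad3 = 2048 * 1536      # 3,145,728: the new iPad's native panel
extra = p_ipad3 - p_1080p  # 1,072,128: roughly "another 1.1 million"
print(p_1080p, p_ipad3, extra)
```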
    Reply
  • tipoo - Saturday, March 10, 2012 - link

    Bandwidth and core performance are two separate things; keep in mind that these SoCs use shared memory for both the CPU cores and the GPU. The iPad 2's memory read score was only 334.2 MB/s:

    http://www.anandtech.com/show/4215/apple-ipad-2-be...
    Reply
  • Steelbom - Saturday, March 10, 2012 - link

    Ah, right. I see. What does that mean exactly? What would the 360's memory bandwidth be roughly? (Is that the bandwidth on the GPU?)

    Cheers
    Reply
  • solipsism - Sunday, March 11, 2012 - link

    Doesn't the 360 only output 720p? Reply
  • Steelbom - Sunday, March 11, 2012 - link

    It varies from 1024x600 to 1080p but the majority are 720p. (I'd imagine the few at 1080p would be less impressive graphically.) Reply
  • Ryan Smith - Monday, March 12, 2012 - link

    The 360 has 22.4GB/sec of main memory bandwidth (the eDRAM is even higher). That's 1.4GHz on a 128bit bus. The A5 meanwhile is 800MHz on a 64bit bus, which on paper is 6.4GB/sec. Reply
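    Ryan's on-paper numbers follow directly from effective clock times bus width; a quick sketch:

```python
def peak_bandwidth_gbs(effective_mhz, bus_bits):
    """Peak memory bandwidth in GB/s: effective data rate x bus width in bytes."""
    return effective_mhz * 1e6 * (bus_bits // 8) / 1e9

xbox360 = peak_bandwidth_gbs(1400, 128)  # 22.4 GB/s main memory
a5      = peak_bandwidth_gbs(800, 64)    # 6.4 GB/s on paper
print(xbox360, a5, xbox360 / a5)         # the 360 has 3.5x the A5's bandwidth
```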
  • Steelbom - Tuesday, March 13, 2012 - link

    Ah right, I see. Thanks. Reply
  • tipoo - Wednesday, March 21, 2012 - link

    So 3.5X the bandwidth for far less resolution. I think what we're bound to see is games that advertise native retina resolution, but the textures and whatnot will still be the old resolution. Games like Infinity Blade have separate resolutions for menus, shadows, etc. Reply
  • thebeastie - Saturday, March 10, 2012 - link

    I am hoping to see you do some oversized Blu-ray rip MKV benchmarks. I have my Avatar 3D Blu-ray MKV, and the iPad 2 just can't handle it, locally or over the network; very annoying when I want to watch a movie in the other room on my Sony HMZ-T1 via Apple's HDMI out connector.

    I want a tablet that can at least handle this kind of workload, and it should be able to, as this still falls under content consumption, nothing to do with content creation...

    Thanks, been a big anandtech fan for over 10 years!
    Reply
  • realbabilu - Sunday, March 11, 2012 - link

    Dear beastie, iOS devices do support 1080p playback; even the iPhone 3GS can do it. How? The QuickTime framework inside iOS is powerful enough to play those specs. There is a caveat, however: you must use an MP4 container with an H.264 (AVC) video track and stereo or 6-channel AAC audio, rather than MKV.

    I assume you have an MKV of the BD-ripped movie. Check that the movie has an H.264 (AVC) video track and AAC audio. Usually an MKV has an AC3 soundtrack, or stereo AAC (like the Nero HE-AAC codec produces). From there you can remux (video-copy) the MKV into an MP4 container; the process takes only a minute or so. AC3 has to be converted, but that takes no longer than five minutes.

    How to remux it? If you are on a Mac, try Subler, the best free MKV-to-MP4 tool for the Mac: it handles metadata like a pro, AC3 conversion, multiple subtitles, and multiple soundtracks. If you are on a PC, you can remux with the free XMedia Recode, copying the stereo AAC or converting the AC3 to stereo AAC. Unfortunately iOS doesn't accept muxed-in subtitles, including those from MP4Box/srtiphone.

    If you want to retain the AC3, you can put it in as soundtrack 2 (as a pass-through track) and the stereo AAC as soundtrack 1; editing multiple soundtracks needs Subler. Or, if you want to convert rather than remux from a DVD or Blu-ray, HandBrake (PC/Mac) can handle video, multiple soundtracks, and multiple subtitles like Subler does, but it takes hours since it re-encodes instead of remuxing.

    The resulting 1080p file can be opened directly with a third-party player like OPlayerHD (with the QuickTime plugin enabled), or with the iPad's default Videos app. To get 1080p into the iPad's movie library you have to use the free CopyTrans Manager, which is only available on the PC; be sure "manually manage music and videos" is enabled for the iPad in desktop iTunes.

    http://www.youtube.com/watch?v=yRWOUcgdXvw
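    For anyone comfortable with a command line, the same remux can also be done with ffmpeg instead of the GUI tools above (a sketch; the filenames are placeholders, and older ffmpeg builds may need `-strict experimental` for the built-in AAC encoder):

```shell
# Remux an MKV into an MP4 container for iOS playback:
# copy the H.264 video track untouched, convert the audio to stereo AAC.
ffmpeg -i movie.mkv -c:v copy -c:a aac -ac 2 movie.mp4
```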
    Reply
  • ueharaf - Saturday, March 10, 2012 - link

    How about the Atrix and Atrix 2 with Ubuntu?
    Lapdocks, multimedia docks, etc.
    Android and Linux have an alternative to Windows 8... don't forget it!
    Reply
  • wilmarkj - Sunday, March 11, 2012 - link

    This has got to be one of the best summaries I have read online about the whole tablet issue and its future. I am fed up with all those 'pundits' out there who are jumping on the "tablets are going to take over everything" bandwagon. I don't see how tablets in their current state could be used for writing documents, programming, graphics, serious gaming, spreadsheets, animation, or any 'real work'. They're questionable even for casual things like gaming, browsing the internet, or writing an email. Once text is involved, I would much prefer a system with a PROPER keyboard. That said, tablets have brought many casual users into the computing fray. And I agree with your sentiments on Windows 8. I am wondering where we 'serious' users will get our next OS from. Reply
  • AnnonymousCoward - Sunday, March 11, 2012 - link

    Does the new iPad still require you to perform the pointless "slide to unlock" and "slide to power down"? Dumbest requirement ever. Reply
  • doobydoo - Monday, March 12, 2012 - link

    Yeah it's so dumb. Because nobody ever leaves their iPad in their bag whilst, for example, listening to music, and because no screen has ever been 'pressed' accidentally when in a pocket or a bag. Reply
  • tipoo - Wednesday, March 21, 2012 - link

    Agreed, just imagine the tens of seconds a week you would save! Reply
  • ffletchs - Monday, March 12, 2012 - link

    I much prefer the AMOLED screen technology, like the one in the Samsung Galaxy Tab 7.7. Much better colors and insane black levels. Reply
  • Bearman - Tuesday, March 13, 2012 - link

    Then you should buy that. Reply
  • star-affinity - Wednesday, March 14, 2012 - link

    The new iPad is supposed to have 44% wider color gamut, so you have to compare it to that.

    I've always thought the opposite: that AMOLED screens have oversaturated colors. So I prefer the way iOS devices handle colors, even if screen color can vary quite a lot between devices, especially iPhones. Some have a blueish tone and some are more yellow.
    Reply
  • star-affinity - Wednesday, March 14, 2012 - link

    Hmm... Oversaturated is one word I guess. Reply
  • deathwalker - Tuesday, March 13, 2012 - link

    OK, so I am not an Apple fanboy. But having admitted that up front, the improvements the iPad 3 brings to the table are very tempting. I still have one major gripe with Apple on the iPad series, and that is the lack of storage expansion (if I am wrong, I will accept correction). Just about every Android tablet on the market offers affordable storage expansion via memory card. Why, after three generations, is this still not available on iPads? Is it a compatibility issue, or just Apple's way of maximizing profits by forcing you to pay twice the value of the additional storage to go from 16 to 32? Help me understand this. Reply
  • bobsmith1492 - Tuesday, March 13, 2012 - link

    Why allow consumers the option of a $10 micro SD card when they can charge $100 for the same thing and pocket $90? It's the Apple way. Actually it's the way to do business if you can; for example Dell charges $200 for a 4GB RAM upgrade when you can buy 16GB of better RAM for $75 at Newegg. Reply
  • KaRRiLLioN - Sunday, March 18, 2012 - link

    I totally agree that this thing needs removable storage. I've had an iPhone and didn't much care for having to use iTunes for everything when it came to backing up photos, etc. so I ended up going with a Droid 4 this time around. Plus, I much prefer a physical keyboard.

    The unfortunate reality is that I haven't seen an Android tablet that looks quite as good as the iPad, although honestly, I'd just use a tablet for some light web browsing which is why I am, to this date, still tablet-less.

    So far, I prefer using my Dell Latitude 13 since I can do almost anything with it I could do with a tablet or a Windows desktop, aside from playing Angry Birds or Bunny Shooter.

    In any case, Apple's strategy is smart, targeting the less tech-savvy people who would be completely lost on an Android tablet. Obviously the vast majority of users on this website are far more in the know about tech than the average consumer who purchases an iPad or any other Apple product for that matter.
    Reply
  • Autisticgramma - Wednesday, March 14, 2012 - link

    Let's stop this madness and just call it the iPad 2S with wings. Reply
  • worldbfree4me - Friday, March 16, 2012 - link

    After giving the new iPad much thought, I think Apple is conspiring to ultimately charge those who purchase it more down the road. Let me explain. If you compare a DVD to a Blu-ray, you have a difference in space of at least 4:1 (4.7 GB versus 25 to 50 GB for Blu-ray); granted, a lot of that space includes the much-lauded Dolby Digital audio tracks, but let me finish. Think of the iPad 2 as a DVD and the iPad 3 as a Blu-ray. Right now developers are scrambling to enhance or optimize their applications to display at the proper resolution on the new iPad, and that, my friend, is going to require more data. So if your 16 GB iPad 1 or 2 was barely storing all your apps, then as each and every developer updates their app, your storage space will slowly diminish, and you will need more storage either locally or remotely. Apple will be more than happy to sell you a new 32 GB or 64 GB iPad, or better still, more cloud storage, for more money! Reply
