Since Apple launched the first iPad two years ago, the tablet market has evolved rapidly. While slate tablets were nothing new, the original iPad was the first serious tablet to be built around smartphone components and a user interface designed specifically for touchscreen input. The hardware was enough to run the OS smoothly while maintaining good battery life, the thin and light form factor lent itself to easy portability, and the touch-based user experience was miles better than earlier devices based on desktop operating systems. 

We take it for granted now, but this was all news back in 2010, and the iPad was practically in a category of its own, with no real competitors to speak of. After Apple started shipping the iPad, the segment basically exploded—we had Google jump in with Honeycomb, HP got into it (and then out of it) with webOS, RIM had a go with the PlayBook, Amazon pushed the Kindle line into the tablet space, and Microsoft created its next release of Windows with tablets in mind. Along the way, Apple updated the iPad, both on the software side, with multitasking, a new notifications system, and a myriad of UI updates, and on the hardware side, with the second-generation iPad. The iPad 2 was a comprehensive update, bringing a dual-core processor, unrivaled graphics performance, cameras fore and aft, and a ground-up redesign that made for a thinner and lighter form factor.

The iPad 2 was a significant improvement over the original—faster, more portable, and generally a far more polished device. Not that it was perfect: iOS 4 still had issues with smooth multitasking and an archaic notifications system, the cameras were mediocre, and the XGA display, while a great quality panel, didn’t have the kind of pixel density expected of a premium mobile device. The iPad 2 hit the market around the same time as Honeycomb (in Motorola’s Xoom) early last year, and at first Apple still held a major edge in hardware. As more impressive Honeycomb devices like Samsung’s Galaxy Tab 10.1 and the ASUS Transformer Prime launched, and with Ice Cream Sandwich looming on the horizon, Android became a much more viable tablet alternative to iOS. And with Microsoft planning a major push later this year for ARM-based Windows 8 tablets centered around the Metro UI, Apple has never faced such stiff competition in the tablet space. Which brings us to the third generation of iPad hardware.

It has a display resolution that dwarfs most high-end desktop displays. The panel also puts a real emphasis on quality, not just resolution. For a computing device targeted squarely at the consumer market, both of these things are rarities.

Its SoC is the absolute largest ever squeezed into an ARM based tablet. The chip itself is even bigger than what you find in most mainstream notebooks. It’s expensive, it puts out a ton of heat and it offers a tremendous GPU performance advantage over anything else in its class.

And it has a battery that’s larger than what ships in the current crop of similarly sized ultraportables and Ultrabooks.

The new iPad doesn’t significantly change the tablet usage paradigm, but it does put all previous attempts at building hardware in this space to shame. It’s the sort of no-holds-barred, performance-at-any-expense design that we’re used to seeing from enthusiast PC component vendors—but in a tablet...from Apple.

Welcome to the new iPad.


  • doobydoo - Saturday, March 31, 2012 - link

    Lucian Armasu, you talk the most complete nonsense of anyone I've ever seen on here.

    The performance is not worse, by any stretch of the imagination, and let's remember that the iPad 2 runs rings around the Android competition graphically anyway. If you want to run the same game at the same resolution, it won't look worse at all (it would look exactly the same) and it will run at 2x the FPS or more (upscaled). Alternatively, for games where it is beneficial, you can quadruple the quality and still run the game at perfectly acceptable FPS, since the game will be specifically designed to run on that device. Attempting anything like that quality on any other tablet is impossible not only because of their inferior screens; they don't have the necessary GPU either.

    In other words, you EITHER have a massive improvement in quality or a massive improvement in performance, over a device (the iPad 2) which still had the fastest-performing tablet GPU even a year after it came out. The game developers get to make this decision - so they just got two great new options on a clearly much more powerful device. To describe that as not worth an upgrade is quite frankly ludicrous; you have zero credibility from here on in.
  • thejoelhansen - Wednesday, March 28, 2012 - link

    Hey Anand,

    First of all - thank you so much for the quality reviews and benchmarks. You've helped me build a number of servers and gaming rigs. :)

    Secondly, I'm not sure I know what you mean when you state that "Prioritizing GPU performance over a CPU upgrade is nothing new for Apple..." (Page 11).

    The only time I can remember Apple doing so is when it kept the 13" MacBooks/MBPs on C2Ds w/ Nvidia graphics until eventually relying on Intel's (still) anemic "HD" graphics... Is that what you're referring to?

    I seem to remember Apple constantly ignoring the GPU in favor of CPU upgrades, other than that one scenario... Could be mistaken. ;)

    And again - thanks for the great reviews! :)
  • AnnonymousCoward - Wednesday, March 28, 2012 - link

    "Retina Display" is a stupid name. Retinas sense light, which the display doesn't do.
  • xype - Thursday, March 29, 2012 - link

    GeForce is a stupid name, as the video cards don’t have anything to do with influencing the gravitational acceleration of an object or anything close to that.

    Retina Display sounds fancy and is lightyears ahead of "QXGA IPS TFT Panel" when talking about it. :P
  • Sabresiberian - Thursday, March 29, 2012 - link

    While I agree that "Retina Display" is a cool-enough-sounding name, and that's pretty much all you need for a product unless it's totally misleading, it's not an original use of the phrase. The term has been used in various science fiction stories and tends to mean a display device that projects an image directly onto the retina.

    I always thought of "GeForce" as being an artist's licensed reference to the cards being a Force in Graphics, so the name had a certain logic behind it.

    ;)
  • seapeople - Tuesday, April 3, 2012 - link

    Wait, so "Retina Display" gets you in a tizzy but "GeForce" makes perfect sense to you? You must have interesting interactions in everyday life.
  • ThreeDee912 - Thursday, March 29, 2012 - link

    It's basically the same concept as Gorilla Glass or UltraSharp displays. It obviously doesn't mean that Corning makes glass out of gorillas, or that Dell will cut your eyes out and put them on display. It's just a marketing name.
  • SR81 - Saturday, March 31, 2012 - link

    Funny I always believed the "Ge" was short for geometry. Whatever the case you can blame the name on the fans who came up with it.
  • tipoo - Thursday, March 29, 2012 - link

    iPad is a stupid name. Pads collect blood from...Well, never mind. But since when are names always literal?
  • doobydoo - Saturday, March 31, 2012 - link

    What would you call a display which had been optimised for use by retinas?

    Retina display.

    They aren't saying the display IS a retina, they are saying it is designed with retinas in mind.

    The scientific point is very clear, and as such I don't think the name is misleading at all. The point is that the device has sufficient PPI at typical viewing distance that a person with typical eyesight won't be able to discern the individual pixels.

    As it happens, strictly speaking, the retina itself is capable of discerning more pixels at typical viewing distance than the PPI of the new iPad, but the other elements of the human eye introduce loss in the quality of the image which is then ultimately sent on to the brain. While scientifically this is a distinction, to end consumers it is a distinction without a difference, so the name makes sense in my opinion.
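The threshold described above is easy to sanity-check with a little trigonometry. Here's a minimal sketch in Python; note that the ~1 arcminute acuity figure and the 15-inch viewing distance are common rule-of-thumb assumptions, not Apple's published numbers:

```python
import math

def min_ppi(distance_in, acuity_arcmin=1.0):
    """Pixel density (PPI) above which a viewer with the given visual
    acuity can no longer resolve individual pixels at this distance."""
    pixel_angle_rad = math.radians(acuity_arcmin / 60.0)  # angular size of one pixel
    return 1.0 / (distance_in * math.tan(pixel_angle_rad))

# At a typical ~15 in viewing distance the threshold works out to roughly
# 229 PPI, comfortably below the new iPad's 264 PPI panel.
print(round(min_ppi(15)))  # -> 229
```

Held closer, the required density rises (about 344 PPI at 10 inches), which is why the same reasoning yields a higher threshold for phones than for tablets.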
