
232 Comments


  • sciwizam - Wednesday, March 28, 2012 - link

    Did the test suite change from your review of the Galaxy Tab 10.1 LTE? Reply
  • Anand Lal Shimpi - Wednesday, March 28, 2012 - link

    It did indeed, let me see if I can't run the updated suite on the Galaxy Tab 10.1 LTE though...

    Take care,
    Anand
    Reply
  • jadawgis732 - Wednesday, March 28, 2012 - link

    Hey Anand, I always wanted to say how much I admire your reviews, and how unbiased and untouched by corporate dollars they are. Keep up the great work! Also, it's very nice to see you responding to replies in the comments! Reply
  • Anand Lal Shimpi - Thursday, March 29, 2012 - link

    Thank you for the kind words and for reading the site :)

    Take care,
    Anand
    Reply
  • medi01 - Thursday, March 29, 2012 - link

    Samsung Galaxy Tab randomly disappearing from charts where it would look favorable indeed leaves an "unbiased" impression. (For instance, color gamut is on par with the "iPad 3" and way above the iPad 2.) Reply
  • Anand Lal Shimpi - Friday, March 30, 2012 - link

    I promise it's not that sinister :) I simply opted to compare to the best of the best out there today as far as Android tablets are concerned: the ASUS Transformer Prime.

    I understand the desire for more results so I'll be working on them in the background over the weekend.

    Take care,
    Anand
    Reply
  • xytc - Wednesday, March 28, 2012 - link

    I've heard that Apple's strong marketing campaign penetrates pussies, so if you have an Apple product you're nothing but a pussy. LMFAO Reply
  • PeteH - Wednesday, March 28, 2012 - link

    Really? Reply
  • MobiusStrip - Thursday, March 29, 2012 - link

    "LMFAO"?

    Wow, low humor threshold.
    Reply
  • Wardawg - Thursday, April 05, 2012 - link

    Well, most people adept in technology or who use tech in their business use Apple products: DJs, programmers, and others. And frankly, I see more iPhones being used than Androids. So apparently most of the world is pussies by your definition. Please do research before you bash a company's products Reply
  • sjael - Wednesday, March 28, 2012 - link

    On the 'A5X vs Tegra 3 In the Real World' page, you mention Modern Warfare 3 as an iOS+Android game.

    I think, since I haven't seen this game ported to phones/tablets, you *might* be thinking of Modern Combat 3.

    And then of course you show the market page for it further down..
    Reply
  • Anand Lal Shimpi - Wednesday, March 28, 2012 - link

    Correct - thanks for the heads up!

    Take care,
    Anand
    Reply
  • Celestion - Wednesday, March 28, 2012 - link

    Looks like the 3rd gen iPad was CPU limited in that first GLBenchmark test. Reply
  • Anand Lal Shimpi - Wednesday, March 28, 2012 - link

    That would be vsync :)

    Take care,
    Anand
    Reply
  • Celestion - Saturday, March 31, 2012 - link

    I see. Thanks! Reply
  • Kevin G - Wednesday, March 28, 2012 - link

    Memory bandwidth tests just seem to be off from what you'd expect a quad-channel, 128-bit wide memory bus to deliver. Performance didn't move from the dual-channel, 64-bit wide bus in the iPad 2. Could there be a software bug (Geekbench or iOS) limiting performance there? It'd be nice to revisit the memory tests after the next major revision of iOS and in conjunction with a later release of Geekbench.

    Any chance of getting the exact resolution that Infinity Blade 2 runs on the rev 3 iPad? I'm assuming it'd be either 1536 x 1152 or 1368 x 1024 for quick scaling purposes.
    Reply
  • slashbinslashbash - Wednesday, March 28, 2012 - link

    They addressed this in the article.

    "It would appear that only the GPU has access to all four channels." - Page 12

    The GPU is hooked up to the RAM controllers. The CPU communicates to the RAM through the GPU. The GPU gets all 4 channels, the CPU only gets 2. The benchmark measures CPU-RAM bandwidth, not GPU-RAM bandwidth.

    It's actually kind of interesting, as it's an inversion of the typical architecture that we're all used to from PCs. But it makes sense, since the new iPad is basically a very nice screen with a smartphone CPU attached. The very nice screen requires a very nice GPU to drive it, so the GPU is more important (and would be memory starved with only 64 bits). The CPU just has to be "good enough" while any shortcomings in the GPU would be magnified at this resolution.
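    The back-of-the-envelope numbers line up if you assume 32-bit LPDDR2-800 channels (the exact memory speed is my assumption, not something Apple has confirmed):

```python
def peak_bandwidth_gbps(channels, bus_width_bits=32, transfer_rate_mtps=800):
    """Peak theoretical bandwidth: channels x bytes-per-transfer x MT/s."""
    return channels * (bus_width_bits // 8) * transfer_rate_mtps / 1000.0

print(peak_bandwidth_gbps(4))  # 12.8 GB/s -- all four channels (GPU path)
print(peak_bandwidth_gbps(2))  # 6.4 GB/s -- two channels (CPU path)
```

    The two-channel figure is right where the iPad 2 already sits, which would explain why the CPU benchmark didn't move.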
    Reply
  • tipoo - Wednesday, March 28, 2012 - link

    Which way is the PS Vita configured? That has the same quad core GPU and a quad core CPU as well. Reply
  • tipoo - Wednesday, March 28, 2012 - link

    Huh, the Vita actually has 128MB of dedicated video memory; can't find the bandwidth though. Reply
  • pickica - Monday, April 02, 2012 - link

    We should also consider a possible higher clock on Vita. Reply
  • antef - Wednesday, March 28, 2012 - link

    Yes it's nice, no one will argue that. But I don't see it as the huge advancement the authors indicate. Using it in the store it seemed fine, but honestly just walking right up to it, I wasn't even sure if I was using the new or old iPad. I had to go over to the iPad 2 to recognize the difference. And even then, after being back at the new iPad for a couple minutes, I completely forgot about it. If you are looking for pixels, sure, you'll notice. If you're just using your device and thinking about other things, probably not so much. Reply
  • PeteH - Wednesday, March 28, 2012 - link

    Eh, I think it depends on what application you use the iPad for. Web browsing and Tweeting? You're probably right, you wouldn't notice the difference in displays. But if you use it to view images I could see it being a big deal. Reply
  • zorxd - Wednesday, March 28, 2012 - link

    I am pretty sure extra resolution is more noticeable when reading text than when looking at images Reply
  • PeteH - Wednesday, March 28, 2012 - link

    I didn't mean "notice" as in you couldn't tell the difference, just that the difference wouldn't be something that you would constantly be aware of if you were simply web browsing.

    If you were reading an e-book? Absolutely, but if that's your only use case I'd get a Kindle and save the money.

    Regularly viewing quality images is something that can't be done on an e-ink reader, but for which the improved display would make a huge difference.
    Reply
  • Sabresiberian - Thursday, March 29, 2012 - link

    I would say this is a perfect example of why it's better to use "I" statements than statements like "YOU won't notice, YOU won't care, there isn't that much difference." ("I didn't notice much of a difference; it wasn't a big change in MY experience...")

    Displays can be very, very personal in experience, and things that bug the heck out of me may not be a problem for someone else. For example, a pixel pitch of around .270mm is just too big for me in a monitor, and it bugs me. Always.

    Frame rates are a good example of something I'm not consciously aware of all the time, but I can sure tell the difference on some level, and some displays are more affected than others. There are extra factors in LCD screens that can make the problem worse for some of us - others don't notice so much, or it's just not a problem for them.

    One thing I believe is that as more people use genuinely better screens, they'll understand why some of us call for them every chance we get.

    ;)
    Reply
  • darkcrayon - Wednesday, March 28, 2012 - link

    I can *immediately* notice the difference in web browsing, which is primarily focused on reading text... Reply
  • tipoo - Wednesday, March 28, 2012 - link

    I found it a noticeable difference, just not neuron-melting like some reviews led me to think. For $100 or more less I'd still be plenty happy with an iPad 2, especially given the CPU and battery life performance are about the same. Reply
  • MobiusStrip - Thursday, March 29, 2012 - link

    Unfortunately the iPad 2's camera is a disgrace. It should've had the iPhone 4 camera, which was already out by that time. Reply
  • repoman27 - Thursday, March 29, 2012 - link

    The iPad 2 was also thinner than the iPhone 4. Now that the new iPad is the same thickness, it has the same camera. It's not really Apple's style to add thickness to a device just to support one feature that isn't heavily used anyway (tablets are not a very good form factor for a camera). Reply
  • zanon - Wednesday, March 28, 2012 - link

    Human vision varies significantly from person to person, as do use patterns for machines. Someone who is more near sighted or simply has better vision in general, and/or uses their system at a closer distance, may see a truly dramatic change. To take my personal example, I have excellent color vision and am also near sighted, and tend to hold my devices relatively close (or use glasses at my machine). I can see the pixels on the iPhone 4 screens (326 ppi) if I focus a bit, and for the older screens (or old iPads) they're massively pixelated to me (not that that made them useless). The High DPI screens are a night/day difference personally, making all types of reading in particular (be it on a terminal session, the web, PDF manuals, ebooks, or whatever) massively more functional (and everything else more beautiful).

    But that's just me, and is that awesome? No, it's kind of meh, I'd love it if I didn't need glasses to use my desktop without being hunched over the keyboard to drive. But understand that you'll see raves about the screen that are completely justified, just not for you. 20/20 vision puts the critical distance around 13" I think, but in the end everyone will need to take a look for themselves.
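    For what it's worth, that ~13" number falls out of simple geometry if you take 20/20 vision as resolving about one arcminute (the exact acuity threshold here is a textbook assumption, and real eyes vary, as said above):

```python
import math

def retina_distance_inches(ppi, acuity_arcmin=1.0):
    """Distance beyond which an eye resolving `acuity_arcmin` arcminutes
    can no longer pick out individual pixels on a `ppi` display."""
    return 1.0 / (ppi * math.tan(math.radians(acuity_arcmin / 60.0)))

print(round(retina_distance_inches(264), 1))  # ~13.0 in for the new iPad (264 ppi)
print(round(retina_distance_inches(326), 1))  # ~10.5 in for the iPhone 4 (326 ppi)
```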
    Reply
  • zanon - Wednesday, March 28, 2012 - link

    In the article:
    Alternatively, we're used to a higher resolution enabling us to see more on a screen at one time. In the case of the new iPad, the higher resolution just makes things look sharper.

    The higher resolution does make smaller fonts readable. For something like an SSH session, that really will mean significantly more stuff can be on a screen at once.
    Reply
  • MobiusStrip - Thursday, March 29, 2012 - link

    A more useful change would be abandoning the ridiculous glossy screens. It's sad that Apple takes its cues from the plastic schlock being peddled at Best Buy, and participates in this fraud of shoving glossy screens down customers' throats. Reply
  • repoman27 - Thursday, March 29, 2012 - link

    The plastic schlock at Best Buy has a glossy plastic film applied to a cheap TN panel. Apple puts a piece of glass in front of their much more expensive IPS panels to protect them. The only way to make that glass (or the glass of the LCD panel itself) matte would be to apply an antiglare plastic film coating to the glass. These films have drawbacks (they block and scatter light making small details and text blurry.) The drawbacks become more exaggerated the farther the front surface of the glass is from the plane of the actual LCD.

    But you're right, it's probably Apple copying the design language of sub $500 laptops in order to somehow defraud the general public and force their customers to buy the products they actually produce.

    And seeing as how this discussion is about the new iPad screen, I'd like to point out that you're complaining about the lack of an antiglare coating on a touchscreen device... Strong work.
    Reply
  • Sabresiberian - Thursday, March 29, 2012 - link

    How is it fraud? Apple isn't, like, saying their screens are anti-reflective and then giving you totally reflective glossy screens.

    Many people prefer a glossy screen and simply aren't bothered by background reflections.

    ;)
    Reply
  • Henk Poley - Monday, April 02, 2012 - link

    Yes, Apple really should use Schott Conturan/Amiran/Mirogard antireflective technology.

    btw, not-glossy does not mean matte. Air is not matte either. Glass can be see-through too ;)
    Reply
  • Watwatwat - Monday, April 02, 2012 - link

    Nope, Steve Gibson has tested even using screen protectors on the new iPad vs. not, and they seem to affect the resolution at that level; matte might not be a good idea at all for a high-density display. Reply
  • KoolAidMan1 - Thursday, March 29, 2012 - link

    I wasn't initially blown away, but after a day of using it every other display seemed bad in comparison. It's one of those things you didn't realize you needed until using it; now I want very high DPI in all of my monitors. Reply
  • menting - Wednesday, March 28, 2012 - link

    Is it just me, or do the Shadowgun and GTA screenshots look more detailed on the Transformer Prime than on the iPad? Reply
  • menting - Wednesday, March 28, 2012 - link

    nm.. I just noticed that it's scaled up on the new iPad, so it's definitely not as sharp.
    However, how can fps be fairly compared in this case, then?
    Reply
  • TheJian - Wednesday, March 28, 2012 - link

    Basically, because of the way Nvidia and Apple approach games so far, you can expect games on Tegra 3 to just look better, as Nvidia seems to aim for more graphics and fewer games (they spend money on fewer projects that produce better results), as opposed to Apple, which spreads the wealth but just ends up with more cannon fodder if you ask me :) You should get more variety on Apple, I'd guess, but a better experience with fewer choices on Tegra 3/Android. I like QUALITY over QUANTITY personally and hope Apple leans the way of Nvidia in the future. I would rather have 10 games that I'd play for weeks or months (if I'm playing on my HDTV through one of these I want better water, buildings, etc.) than games I fire up for less than 20 minutes because they're just another Angry Birds variant and arguably useless on your TV.

    I want these devices to KILL the consoles next year and make MS/Nintendo etc. give it up in 2015 or whenever the next revs should come. I hope they just realize we won't buy them anymore. DirectX 11 on my phone/tablet and probably standard 2560x1440 resolutions by then (won't all be retina by 2015?) make a console purchase STUPID. This could be the merging of console/PC we need, since phones/tablets rev yearly like PCs instead of 10-year consoles stuck in stone, stagnating gaming. Your phone as a portable console with Xbox/PS3/PC gamepad support would be excellent. Pump it out to a monitor and keyboard/mouse setup and you have a notebook replacement too... LOL. Now if they'd just put in a few extra cores by then that disable on the device's own screen but turn on when on a larger display like a TV/monitor, and we have exactly what we want in both cases :)

    Pipe dreams? Retina is here now, and gamepads sort of. Next stop cores that only turn on depending on display output :) Awesome battery on the road, and great power in the dock at home pushing your 27in monitor. :) The 28nm versions by xmas of everyone's chips should come close to console power or surpass them. Interesting times.
    Reply
  • seanleeforever - Thursday, March 29, 2012 - link

    Correction: "YOU won't buy it" doesn't mean the rest of us won't buy it.

    The PS3/Xbox came out around 2005, or about 7 years ago now. I have no issues buying the latest games and still playing.

    What phone or pad did you have 7 years ago? Oh, you had nothing... heck, the phones/pads you bought 3 years ago probably won't be able to run today's games.
    Reply
  • tipoo - Wednesday, March 28, 2012 - link

    Tegra Zone enhancements, the article mentions that. Reply
  • PeteH - Wednesday, March 28, 2012 - link

    What mechanism is being used to upscale legacy (1024x768) apps? Pixel doubling? Bi-cubic? Bi-linear? Something else? Reply
  • Guspaz - Wednesday, March 28, 2012 - link

    At the most basic level, pixel doubling. However, text that is rendered through iOS gets a free resolution boost as long as the app was compiled with the latest version of Xcode. It's pretty common on the iPad 3 to see apps where the interface elements are low-res but all the text is high-res. And in apps that are predominantly text (like an SSH client, for example), that's all that really matters. Who cares if the triangle picture on a button isn't high-res?

    For stuff like games, it's just pixel doubled.
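    Pixel doubling is just nearest-neighbor upscaling by an integer factor: every legacy pixel becomes a 2x2 block, so a 1024x768 framebuffer maps exactly onto 2048x1536. A toy sketch of the idea (not Apple's actual compositor, obviously):

```python
def pixel_double(img):
    """2x nearest-neighbor upscale: each pixel expands to a 2x2 block."""
    out = []
    for row in img:
        doubled_row = [p for p in row for _ in (0, 1)]  # stretch horizontally
        out.append(doubled_row)
        out.append(list(doubled_row))                   # repeat the row vertically
    return out

print(pixel_double([[1, 2]]))  # [[1, 1, 2, 2], [1, 1, 2, 2]]
```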
    Reply
  • PeteH - Wednesday, March 28, 2012 - link

    I'm not saying you're wrong, but how do you know games (for example) utilize pixel doubling? Reply
  • Guspaz - Wednesday, March 28, 2012 - link

    I know because I can look at a game that doesn't support the new screen (such as Plants vs. Zombies HD) on my iPad 3 and see that it's using pixel doubling. It does the same for iPhone apps when you use the 2x zoom option. One thing I have not tried is an old 320x480 iPhone app; I'm curious, since that would require 4x zoom.

    Newer games may choose to render at a lower resolution and then upscale using some sort of filter (perhaps even on the GPU), but at that point they are specifically targeting the new display. An older game that is completely oblivious to the newer display is scaled by the OS using pixel doubling without any interaction from the game.
    Reply
  • Steelbom - Thursday, March 29, 2012 - link

    Actually, when using iPhone apps, the iPad uses the 640x960 version rather than the 320x480 version, if available. Reply
  • mosu - Wednesday, March 28, 2012 - link

    A person in his 50s doesn't care about a 300dpi res because he only sees 200dpi, so the retina display is just for kids? I really don't get why Apple did not use a standard-res panel like 1920x1200 if they wanted a greater quality image. It means they're stuck with a single form factor? Reply
  • PeteH - Wednesday, March 28, 2012 - link

    It's much easier to stick with a single aspect ratio, especially for the developers. Your app looks the same on every device (albeit sharper on higher DPI displays), no need to tweak things for multiple aspect ratios. Reply
  • Sabresiberian - Thursday, March 29, 2012 - link

    Umm, where do you get this idea?

    Generalized statements about vision limitations in humans are usually taken out of context, at best.

    ;)
    Reply
  • Ammaross - Wednesday, March 28, 2012 - link

    "It has the fastest and best of nearly every component inside and out."

    Except the CPU is the same as in the iPad 2, and by far not the "best" by any stretch of the imagination. Hey, what's the problem though? I have this nice shiny new tower: loads of RAM, Blu-ray, SSD, and terabytes of hard drive space. Oh, don't mind that Pentium D processor; it's "good enough," or you must be using it wrong.
    Reply
  • tipoo - Wednesday, March 28, 2012 - link

    What's better that's shipping today? Higher clocked A9s, or quad core ones like the T3? Either would mean less battery life, worse thermal issues, or higher costs. Krait isn't in a shipping product yet. Tegra 3's additional cores still have dubious benefit. These operating systems don't have true multitasking, you basically have one thing running at a time plus some background services like music, and even on desktops after YEARS few applications scale well past four cores outside of the professional space. The next iPad will be out before quad core on tablets becomes useful, that I assure you of. Reply
  • zorxd - Wednesday, March 28, 2012 - link

    I'd gladly trade GPU power for CPU power.
    That GPU is power hungry too, probably more than two extra A9 cores, and the benefit is even more dubious unless you are a hardcore tablet gamer.
    Reply
  • TheJian - Wednesday, March 28, 2012 - link

    LOL, the problem is you'll have to buy that new iPad to take advantage because YOURS doesn't have those cores now. Once apps become available that utilize these cores (trust me, they're coming; anyone making an app today knows they'll have at least quad CPU and GPU in the phones they're programming for next year, heck, end of this year), the Tegra 3 won't need to be thrown away to multitask. Google just has to put out the next rev of Android and these Tegra 3s etc. should become even better (I say etc. because everyone else has quad coming at 28nm).

    The writing is on the wall for single/dual. The quad race on phones/tablets is moving FAR faster than it did on PCs. After Win8 these things will start playing a lot more nicely with our current desktops. Imagine an Intel x86-based quad (hopefully) with someone else's graphics running the same stuff as your desktop without making you cringe over the performance hit.

    I'm not quite sure how you get to Tegra 3 costing more or having higher thermals (umm, the iPad 3 is hot, not Tegra 3). The die is less than half the size of the A5X. Seems they could easily slap in double the GPUs and come out about even, with a QUAD CPU too. If NV doubled the GPUs, what would the die size be? 162mm or smaller, I'd say. They should have gone 1920x1200, which would have made it faster than the iPad 2 no matter what game etc. you ran. Unfortunately the retina screen makes it slower (which is why Apple isn't pushing Tegra Zone-quality graphics in their games for the most part... just Blade?). They could have made this comparison a no-brainer if they had gone 1920x1200. I'm still waiting to see how long these last running HOT for a lot of people. I'm not a fan of roasted nuts :) Too bad they didn't put it off for 3 months and die-shrink it; at least 32nm or even 40nm would have helped the heat issue, plus upclock the CPU a bit to make up for 2 cores, etc. More options to even things out. Translation: everything at Xmas or later will be better... just wait if you can, no matter what you want. I'm salivating over a Galaxy S2, but it's just not quite powerful enough until the shrinks for the S3 etc.
    Reply
  • tipoo - Wednesday, March 28, 2012 - link

    I didn't say the Tegra 3 is more expensive or has higher thermals; I said the A5X with higher-clocked cores or more cores would be, and we all know Apple likes comfortable margins. Would I like a quad-core A5X? Sure. Would I pay more for it? Nope. Would I trade reduced battery life and an even hotter chip than what Apple already made? Nope. With the retina display, the choice to put more focus on the GPU made sense; with Android tablets' resolutions, maybe Tegra 3 makes more sense, so you can stop attacking straw-man arguments I never made. There are still only a handful of apps that won't run on the first iPad, and that's two years old; "only" two cores won't hold you back for a while, plus iOS devs have less variation in specs to deal with, so I'm sure compatibility with this iPad will be assured for at least two or three years. If I were buying one today, which I am not, I wouldn't be worried about that.

    Heck, even the 3GS runs most apps still and gets iOS updates.
    Reply
  • pickica - Monday, April 02, 2012 - link

    The new iPad's successor is probably gonna have a dual A15, which means dual cores will stay. Reply
  • Peter_St - Monday, April 02, 2012 - link

    The problem here is that most people have no idea what they are talking about. It was just a few years ago that we all used dual-core CPUs in our desktop computers and ran way more CPU-intensive applications, and now all of a sudden some marketing bonzo from HTC and Samsung is telling me that I need a quad-core CPU and 2+ GB of RAM for tablets and mobile devices.
    If you really need that hardware to run your mobile OS, then I would recommend you fire all your OS developers, get a new crew, and start from scratch...
    Reply
  • BSMonitor - Wednesday, March 28, 2012 - link

    If you were to run the same applications a tablet is designed for, then yes, your Pentium D would actually be overkill. Reply
  • PeteH - Wednesday, March 28, 2012 - link

    The point made in the article is that it would be impossible to provide the quad GPU (necessary to handle that display) AND a quad CPU. Given you can only do one or the other, the quad GPU is the right choice. Reply
  • zorxd - Wednesday, March 28, 2012 - link

    was it also the right choice to NOT upgrade the GPU when going from the iPhone 3GS to iPhone 4? Reply
  • PeteH - Wednesday, March 28, 2012 - link

    No idea. Was it necessary to upgrade the GPU to get an equivalent experience on the larger screen in that case, or was performance on the 3GS limited by the CPU (or RAM, or something else)? Reply
  • zorxd - Thursday, March 29, 2012 - link

    just look at the benchmarks on this website

    The iPhone 3GS gets more FPS in 3D games because of the lower resolution.

    So in short, yes, it would have been necessary to upgrade the GPU to keep the same performance.

    But no matter what Apple does, people will always say it's the right choice.
    Reply
  • PeteH - Thursday, March 29, 2012 - link

    I looked for a game FPS comparison between the 3GS and the 4 and couldn't find anything. Can you point me to it?

    I'm looking for hard numbers because just increasing the resolution doesn't necessarily mean a GPU upgrade is necessary. If (and this is completely hypothetical) the 3GS was performance limited because of its CPU, improving the CPU in the 4 could allow it to achieve the same performance at a higher resolution.

    I'm not remotely saying this is the case, just that I've seen no numbers demonstrating a drop in frame rate from the 3GS to the 4.
    Reply
  • dagamer34 - Friday, March 30, 2012 - link

    I believe the GPU got a clock speed increase when it went from the 3GS to the 4. Reply
  • Peter_St - Monday, April 02, 2012 - link

    Oh wait, let me rephrase this: I have this nice shiny tower with 2GB of RAM and newest CPU out there but shitty OS with java hogs and memory leaks, but who cares, I'll just go and jerk off on the specs.

    I think that's what you wanted to say...
    Reply
  • tipoo - Wednesday, March 28, 2012 - link

    GPUs which consume hundreds of times more watts than SoCs like this, and have much more memory bandwidth at their disposal, still struggle with the resolution this thing is displaying. The Xbox 360 GPU has, if I recall, around 25GB/s vs 6 in this, and it struggles to run games at 720p at a constant 30FPS. So far, it seems like the retina-compatible games do display at native res, but there aren't any improvements in textures, effects, etc. So would the additional GPU power effectively be negated by the resolution for native apps, still constrained to games that look straight out of 2003-4? Or is Imagination Tech's video memory compression that much more advanced than AMD's/Nvidia's, so bandwidth doesn't matter as much? Reply
  • zorxd - Wednesday, March 28, 2012 - link

    It's not only about the resolution. You could probably play Doom just fine with the SGX543MP4 at this resolution. The problem is when you have more complex level of details, shaders, etc. The iPad couldn't play a game like Crysis even at half resolution. But even at 2048x1536, Doom will still look like a game of the 90s. Reply
  • tipoo - Wednesday, March 28, 2012 - link

    *12.8GB/s, my mistake Reply
  • BSMonitor - Wednesday, March 28, 2012 - link

    What's the battery life watching a bunch of movies... say, from New York to Hawaii? Will I be able to get 9 hours?

    Can run all the compute benchies we want, but primarily these are portable entertainment devices. The simplest use being the most common.
    Reply
  • PeteH - Wednesday, March 28, 2012 - link

    Depends how bright you want the display, but from the numbers they're posting you should be fine at < 70% max brightness.

    I would argue that the most common use case is probably web browsing though, not movie watching. Unless... how often are you on these flights from New York to Hawaii?
    Reply
  • BSMonitor - Thursday, March 29, 2012 - link

    Common might have been the wrong word. But for the mass of users (mostly the consumer space), internet browsing is probably the most frequent task. By overall time spent on the devices, though, I would guess movies/videos are #1. Reply
  • mavere - Wednesday, March 28, 2012 - link

    The iPad's h.264 decoder has always been especially efficient.

    If Apple's battery life claim is 10 hours, I'd expect 11-12 hours non-streaming video playback.
    Reply
  • Openmindeo - Wednesday, March 28, 2012 - link

    On the second page it says that the iPad 1 has 256GB of memory.

    Otherwise the article was fine.
    Regards.
    Reply
  • Anand Lal Shimpi - Wednesday, March 28, 2012 - link

    Thanks for the correction!

    Take care,
    Anand
    Reply
  • isoJ - Wednesday, March 28, 2012 - link

    Good points about the future and a clear demand for low-power bandwidth. Isn't PS Vita already shipping with Wide-IO? Reply
  • BSMonitor - Wednesday, March 28, 2012 - link

    I only have an iphone so I have not played around with the home sharing and all that in iTunes.

    Can they communicate locally via WiFi for movies, music, etc. (without a PC or iCloud)? My impression is that they must be connected to iTunes or iCloud to access/transfer media content.

    i.e. I have a 64GB iPhone 4 loaded with 20 movies and GBs of music, and say a 16GB iPad. Can I transfer a movie from the iPhone to the iPad without it going to the PC/Mac or iCloud first?
    Reply
  • darkcrayon - Wednesday, March 28, 2012 - link

    You can transfer movies if you have a movie player app that supports it, several apps support file transfers over wifi (for example GoodReader can copy any of its files to GoodReader running on another device). You could use GoodReader and similar apps instead of Music to play songs, though it's not as well integrated into the OS (but songs will still play and switch in the background, etc).

    You cannot transfer things you've previously loaded into Music or Videos (i.e. the built-in "Apple apps") between two iPads, though.
    Reply
  • BSMonitor - Thursday, March 29, 2012 - link

    Thanks. I figured as much. Reply
  • ltcommanderdata - Wednesday, March 28, 2012 - link

    On page 11 you say:
    "Perhaps this is why Apple forbids the application from running on a first generation iPad, with only one CPU core."

    I don't think the single CPU core is the primary reason why iPhoto isn't supported on the first-gen iPad. After all, the same single-core A4 in the iPhone 4 supports iPhoto. What's more, the iPhone 4's A4 is clocked lower than the first-gen iPad's, so CPU performance isn't the primary reason. RAM appears to be the main concern with iPhoto, since every supported device has at least 512MB of RAM.
    Reply
  • Anand Lal Shimpi - Wednesday, March 28, 2012 - link

    Very good point, updated :)

    Take care,
    Anand
    Reply
  • Jamezrp - Wednesday, March 28, 2012 - link

    Didn't want to cause Verizon too much trouble? Heh, very funny. I'm amazed at how good a Wi-Fi hotspot the iPad ends up being. It almost seems like business users should opt to get an iPad for that function alone. I know plenty of people who would be happy to keep it in their bag with the hotspot feature enabled constantly while travelling about. Even at the price, there is nothing close that can compare.

    Plus, you know, you get the tablet too.
    Reply
  • supertwister - Wednesday, March 28, 2012 - link

    "It’s a quantum leap from the noisy, 0.7MP mess that was the iPad 2 camera."

    Interesting choice of words, considering a quantum is the smallest possible division of a quantity...
    Reply
  • omion - Wednesday, March 28, 2012 - link

    Quantum leap:
    (n) an abrupt change, sudden increase, or dramatic advance

    The phrase comes from the ability of particles to make a sudden jump between two energy levels. It is a leap (of any amount) between two quantization levels, not a leap of the smallest possible amount.
    Reply
  • drwho9437 - Wednesday, March 28, 2012 - link

    A large fraction of the die doesn't seem to have a known use? Wondering what could be taking up all that area if not GPU, CPU and memory interfaces/caches... Most other I/O would have small footprints... Reply
  • tipoo - Wednesday, March 28, 2012 - link

    The 4S had a larger than usual die partly because of the voice-cancellation hardware needed to make Siri work well. The iPad doesn't have Siri, but it does have voice dictation, so some space is probably devoted to that. Reply
  • PeteH - Wednesday, March 28, 2012 - link

    A big chunk of it is probably the ISP they talked about when the 4S debuted. Reply
  • Lucian Armasu - Wednesday, March 28, 2012 - link

    So this is what I assumed. The new iPad is in fact slower than the iPad 2 if games actually start using the 2048x1536 resolution, which everyone seems to be encouraging developers to do. Once they do, the graphics will either look poorer or run slower than they did at the old resolution, even compared to the iPad 2.

    Add to that the fact that apps are much bigger in size with retina assets, and the CPU is the same as last year's. The new display might look great, but it's obvious the new iPad is a step back in terms of performance, whether it's the GPU or CPU we're talking about. Hardly worth an upgrade, especially for iPad 2 owners.
    Reply
  • xype - Thursday, March 29, 2012 - link

    Blah blah blah performance blah not worth it.

    I don’t give a shit about theoretical performance that I might be getting if DNA folding software was available for tablets. I really, really give a shit about being able to read website and ebook text without my eyes straining after an hour.

    One would think that 10 years after "No wireless. Less space than a nomad. Lame." and Apple raking in millions and billions of profit, those Geek Metrics™ that people are so fond of here (nothing wrong with that, it’s interesting stuff!), would be recognized as completely and utterly worthless to the average population. But apparently not.

    The iPad was never meant to replace PCs and consoles as a hardcore gaming device, and it was never meant as a render farm server replacement. It would be really nice if people realized that, at some point. In the next 5 years, perhaps.
    Reply
  • tipoo - Thursday, March 29, 2012 - link

    It seems a bit like the 3GS-4 transition, it used the same GPU despite higher resolution and so performed worse at native, although in this case the CPU is unchanged and the GPU is "only" 2x better for 4x the pixels. Developers got around that on the 4 by making games for the old resolution and using upscaling mode. I'd imagine they will do the same here once games hit the limits of the GPU at native. Games like Infinity Blade 2 also use separate resolutions for things like the menus vs shadows vs terrain textures. Reply
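The pixel math being argued over in these comments is easy to sanity-check. A minimal back-of-envelope sketch; the "2x" GPU figure is the commonly cited SGX543MP2-to-MP4 scaling (same core at roughly similar clocks, twice the cores), an assumption here rather than a measured result:

```python
# Back-of-envelope: how the per-pixel GPU budget changes from iPad 2 to the new iPad.
ipad2_pixels = 1024 * 768    # iPad 2 native resolution
ipad3_pixels = 2048 * 1536   # new iPad native resolution

pixel_ratio = ipad3_pixels / ipad2_pixels  # 4x the pixels
gpu_ratio = 2.0                            # SGX543MP4 vs. MP2: ~2x the shader hardware (assumed)

# At native resolution, each pixel gets roughly half the GPU budget it had on iPad 2.
per_pixel_budget = gpu_ratio / pixel_ratio
print(pixel_ratio, per_pixel_budget)  # 4.0 0.5
```

Which is why rendering at the old resolution and upscaling, as described above, roughly doubles the effective per-pixel budget instead.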
  • darkcrayon - Thursday, March 29, 2012 - link

    I guess if the only thing you bought an iPad for were games, and you could only consider a game worthwhile if it were drawn natively at 2048x1536, you'd have a point. But of course the new iPad can play games at the "iPad 2" resolution with much higher detail, or at a slightly higher resolution with the same detail, and so on.

    It doesn't make sense to say it's a step backward in performance overall; it simply has the option to display much higher resolution graphics that the old model didn't have. The new iPad displays 2048 x 1536 text at "0 MHz," so to speak. It's not like you are losing anything by having the option of ultra high resolution when the type of game (or app) can use it within the hardware's capabilities.
    Reply
  • doobydoo - Saturday, March 31, 2012 - link

    Lucian Armasu, you talk the most complete nonsense of anyone I've ever seen on here.

    The performance is not worse by any stretch of the imagination, and let's remember that the iPad 2 already runs rings around the Android competition graphically. If you run the same game at the same resolution (upscaled), it won't look any worse; it will look exactly the same and run at 2x the FPS or more. Alternatively, for games where it's beneficial, you can quadruple the quality and still run at perfectly acceptable FPS, since the game will be designed specifically for the device. Attempting anything like that quality on any other tablet is impossible: their screens can't show it, and they don't have the necessary GPU either.

    In other words, you get EITHER a massive improvement in quality or a massive improvement in performance, over a device (the iPad 2) that still had the fastest tablet GPU a year after it came out. Game developers get to make that call, so they just got two great new options on a clearly more powerful device. To describe that as not worth an upgrade is, quite frankly, ludicrous; you have zero credibility from here on in.
    Reply
  • thejoelhansen - Wednesday, March 28, 2012 - link

    Hey Anand,

    First of all - thank you so much for the quality reviews and benchmarks. You've helped me build a number of servers and gaming rigs. :)

    Secondly, I'm not sure I know what you mean when you state that "Prioritizing GPU performance over a CPU upgrade is nothing new for Apple..." (Page 11).

    The only time I can remember Apple doing so is when keeping the 13" Macbook/ MBPs on C2Ds w/ Nvidia until eventually relying on Intel's (still) anemic "HD" graphics... Is that what you're referring to?

    I seem to remember Apple constantly ignoring the GPU in favor of CPU upgrades, other than that one scenario... Could be mistaken. ;)

    And again - thanks for the great reviews! :)
    Reply
  • AnnonymousCoward - Wednesday, March 28, 2012 - link

    "Retina Display" is a stupid name. Retinas sense light, which the display doesn't do. Reply
  • xype - Thursday, March 29, 2012 - link

    GeForce is a stupid name, as the video cards don’t have anything to do with influencing the gravitational acceleration of an object or anything close to that.

    Retina Display sounds fancy and is lightyears ahead of "QXGA IPS TFT Panel" when talking about it. :P
    Reply
  • Sabresiberian - Thursday, March 29, 2012 - link

    While I agree that "Retina Display" is a cool enough sounding name, and that's pretty much all you need for a product, unless it's totally misleading, it's not an original use of the phrase. The term has been used in various science fiction stories and tends to mean a display device that projects an image directly onto the retina.

    I always thought of "GeForce" as being an artist's licensed reference to the cards being a Force in Graphics, so the name had a certain logic behind it.

    ;)
    Reply
  • seapeople - Tuesday, April 03, 2012 - link

    Wait, so "Retina Display" gets you in a tizzy but "GeForce" makes perfect sense to you? You must have interesting interactions in everyday life. Reply
  • ThreeDee912 - Thursday, March 29, 2012 - link

    It's basically the same concept with Gorilla Glass or UltraSharp displays. It obviously doesn't mean that Corning makes glass out of gorillas, or that Dell will cut your eyes out and put them on display. It's just a marketing name. Reply
  • SR81 - Saturday, March 31, 2012 - link

    Funny I always believed the "Ge" was short for geometry. Whatever the case you can blame the name on the fans who came up with it. Reply
  • tipoo - Thursday, March 29, 2012 - link

    iPad is a stupid name. Pads collect blood from...Well, never mind. But since when are names always literal? Reply
  • doobydoo - Saturday, March 31, 2012 - link

    What would you call a display which had been optimised for use by retinas?

    Retina display.

    They aren't saying the display IS a retina, they are saying it is designed with retinas in mind.

    The scientific point is very clear, and as such I don't think the name is misleading at all. The point is that the device has sufficient PPI at a typical viewing distance that a person with typical eyesight won't be able to discern the individual pixels.

    As it happens, strictly speaking, the retina itself is capable of discerning more pixels at typical viewing distance than the PPI of the new iPad, but the other elements of the human eye introduce loss in the quality of the image which is then ultimately sent on to the brain. While scientifically this is a distinction, to end consumers it is a distinction without a difference, so the name makes sense in my opinion.
    Reply
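The "can't discern individual pixels" claim above can be checked with simple trigonometry. A rough sketch, assuming the conventional 1-arcminute acuity figure for 20/20 vision and a 15-inch viewing distance; both are textbook assumptions, not figures from the review:

```python
import math

# Angular resolution of a typical eye: ~1 arcminute for 20/20 vision (assumed).
acuity_rad = math.radians(1 / 60)

viewing_distance_in = 15.0  # assumed typical tablet viewing distance, in inches

# Smallest pixel pitch the eye can resolve at that distance, and the resulting PPI threshold.
min_pitch_in = viewing_distance_in * acuity_rad
threshold_ppi = 1 / min_pitch_in

# The new iPad: 2048x1536 on a 9.7" (diagonal) panel.
ipad_ppi = math.hypot(2048, 1536) / 9.7

print(round(threshold_ppi), round(ipad_ppi))  # 229 264
```

By this estimate the panel's ~264 PPI comfortably clears the ~229 PPI threshold at 15 inches, which is the substance of Apple's marketing claim.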
  • jjj - Wednesday, March 28, 2012 - link

    Testing battery life only in web browsing? Maybe that would be OK for a $100 device. As it is, the battery tests are pretty poor: you do video playback when every SoC out there has a dedicated decode unit, so that test is only representative of video playback. The most important test here should have been battery life with both the GPU and CPU loaded, and not including that seems like an intentional omission to avoid making the device look bad.
    There are a lot of other things to say about the review, too many, but one thing has to be said.
    This is a plan B or plan C device. The screen is the selling point; it's what had to go in. They didn't have 28/32nm in time and had to go for a heavier, thicker, hotter device with a huge chip (CPU speed is most likely limited by heat rather than power consumption, though of course the two are directly related). Apple had to make way too many compromises to fit in the screen; no way this was plan A.
    Reply
  • tipoo - Thursday, March 29, 2012 - link

    I would have liked a gaming battery life test as well. Reply
  • PeteH - Thursday, March 29, 2012 - link

    Beyond even that, I'd like to see a worst-case battery life (i.e. gaming, max brightness, LTE up, etc).

    Also, it'd be really interesting to see how brightness impacts battery life. Maybe the web browsing test at 20%, 40%, 60%, 80%, and 100% brightness. Of course that would probably delay the review by several days, so it might not be worth it.
    Reply
  • Anand Lal Shimpi - Thursday, March 29, 2012 - link

    We did a max brightness test, however a gaming test would be appropriate as well. I will see if I can't run some of that in the background while I work on things for next week :)

    Take care,
    Anand
    Reply
  • SimpleLance - Wednesday, March 28, 2012 - link

    The biggest drain for the battery comes from the display. So, if the iPad will be used for hotspot only (with display turned off), you will get a lot of hours from it because it has such a huge battery.

    But then, using the iPad just as a hotspot would be a waste of that gorgeous display.

    Very nice review of a very nice product.
    Reply
  • thrawn3 - Wednesday, March 28, 2012 - link

    Am I the only one who feels that max brightness matters more in day-to-day use of a highly portable device than DPI and color accuracy?
    I would absolutely love for all three to be excellent, but for a tablet or small laptop I think max brightness and DPI are higher priorities than color accuracy. This is exactly what the ASUS Transformer Infinity is supposed to deliver, though I would prefer it on a real laptop.
    I care about color accuracy too, but I am perfectly fine with needing a desktop monitor for that and trading away brightness there, since it sits in a stable environment, at least until we reach the technological level that allows all these elements to be combined. Maybe quantum dot display technology in the future?

    One thing I have to give all these new displays is that they FINALLY have gotten the wide viewing angles thing right and I will be so happy to get this into the rest of the market.
    Reply
  • seapeople - Tuesday, April 03, 2012 - link

    Would you really prefer a bright 1366x768 TN panel with a 200:1 contrast ratio on a 15" laptop over a less bright IPS iPad screen with much better resolution, DPI, color accuracy, and viewing angles? Reply
  • vision33r - Wednesday, March 28, 2012 - link

    The screen is really gorgeous when you shoot raw with any DSLR and view it in iPhoto. Reply
  • ol1bit - Wednesday, March 28, 2012 - link

    I just bought an Asus Transformer Prime, and your review was spot on with what I decided. I couldn't live with iOS after using Android for 3 years.

    Just the simple stuff was my decision:
    1. Freedom of Android, file transfers, etc. No Itunes requirement.
    2. MicroSD
    3. Kewl keyboard
    4. Live Wallpaper.
    5. A real desktop, separate from my applications.
    6. 32GB versus 16GB
    7. Gorilla Glass (yes, true. My original droid lived in my pocket 2 years no scratches, my HTC Rezound scratches the first 2 weeks).
    8. Asus (love their MBs)
    9. Nvidia (love their GPUs)

    What I will miss:
    1. The iPad 3 display.
    Reply
  • darkcrayon - Thursday, March 29, 2012 - link

    1. iTunes is no longer ever needed for an iOS device. I consider the option of a first party desktop sync solution to be an advantage now that it's not a requirement.
    7. It seems likely the new iPad uses Gorilla Glass or Gorilla Glass 2...
    9. Odd that you'd love nVidia's GPUs when they've been pretty much the bottom of the performance barrel for ARM device graphics, even excluding Apple's SoCs (which have lately been using the fastest GPUs in the industry by far).
    Reply
  • ananduser - Thursday, March 29, 2012 - link

    Imagination does own the fastest mobile GPU available today. The current Tegra offerings can't match it, but nvidia is going a different way: they will start piling on CUDA cores like they do on the desktop GPU front. Say in the future you have a better quad-core GPU from Imagination; nvidia's GPU will consist of something like 64 CUDA cores. Reply
  • zorxd - Thursday, March 29, 2012 - link

    The problem is that you can't plug an iDevice into a computer and transfer files as you would with a USB thumb drive without iTunes. That's a major disadvantage.

    Also, if you exclude Apple's SoCs, what company makes a better mobile GPU than Nvidia? The Mali-400 MP4 is good too (about on par, it seems), but I wouldn't say Tegra 3 is the bottom of the performance barrel. You seem to be forgetting major players like Qualcomm and TI.
    Reply
  • darkcrayon - Thursday, March 29, 2012 - link

    You can however transfer files to any number of apps via WIFI or with cloud solutions without needing iTunes though. I'd call it a "disadvantage" but not a major one.

    You're right the Tegra 3 isn't, I was speaking more generally considering how the Tegra 2 performed vs. the competition as well. It just seemed out of place to choose a tablet because you "love nVidia GPUs" when nVidia has not necessarily put out a spectacular GPU in any ARM SoC.
    Reply
  • merajaan - Wednesday, March 28, 2012 - link

    You guys must be commended on this review. You covered all the areas that I wanted to know about and really didn't leave one stone unturned. I applaud the depth and detail and appreciate that you didn't rush your review out for launch day like many other sites. I also appreciate the unbiased nature in which the review was written and your honest viewpoints! Reply
  • Anand Lal Shimpi - Thursday, March 29, 2012 - link

    Thank you for the kind words :)

    Take care,
    Anand
    Reply
  • repoman27 - Thursday, March 29, 2012 - link

    So you've gone and included a lovely example of how the AnandTech icon looks at 72 and 144 dpi... How's about including <link rel="apple-touch-icon" sizes="144x144" href="icon144.png"/> on this site so we can have a proper icon when we add an AnandTech web clip to our home screens? Reply
  • adrien - Thursday, March 29, 2012 - link

    Still reading the review (and liking it), I'm wondering about thermals.

    What was the temperature of the room? Could you try with different room temperatures? (I'm wondering how it'll change when it gets 15°C hotter and how it'll fare with sun shining on it).

    Is there CPU or GPU throttling when it starts heating? Do you know the SoC temp?

    Thanks. :-)
    Reply
  • Anand Lal Shimpi - Thursday, March 29, 2012 - link

    The ambient temperature in the room was approximately 23C. An overheating condition will trigger an OS-wide warning, which I believe causes the system to shut down.

    I unfortunately don't have access to anything that could read the SoC temp.

    Testing at different room temperatures is an interesting idea but one that would be difficult to accurately control without some serious equipment. I ran these tests side by side at the same time to avoid issues with a changing ambient temperature.

    Take care,
    Anand
    Reply
  • adrien - Thursday, March 29, 2012 - link

    Ok, thanks. With summer approaching (and very quickly in France), I guess we'll see real-world tests for the temperature in a few weeks anyway. ;-) Reply
  • scribby - Thursday, March 29, 2012 - link

    Nice review :)

    I'm also wondering about thermals,

    What was the brightness level when measuring the thermals on the new ipad?

    Thank you.
    Reply
  • Anand Lal Shimpi - Thursday, March 29, 2012 - link

    Max brightness.

    Take care,
    Anand
    Reply
  • h4stur - Thursday, March 29, 2012 - link

    I use it every day, but I don't see enough improvement in the new version to warrant an upgrade. I view the high res as an actual downgrade, as the machine will have to upscale the majority of the content. Reply
  • mavere - Thursday, March 29, 2012 - link

    text text text text.

    If that means nothing to you, then the upgrade won't do anything for you. For the rest of us, this screen is a godsend.
    Reply
  • darkcrayon - Thursday, March 29, 2012 - link

    I'm guessing the machine will have to upscale very little content other than images on the web in a month or two. Every major app will be updated for the higher resolution, no new app will be caught dead not supporting the new resolution, and text based apps get a "free" upgrade to the higher resolution. If your primary concern is whether images on the web will be updated, then that's an area for disappointment. Otherwise... Reply
  • adityarjun - Thursday, March 29, 2012 - link

    I love this site and most of the reviews. Since the ipad has been released I have been coming here 6-7 times a day just for this review. Glad to see it finally put up. I just registered here specifically to ask a few questions.

    While I was more than impressed with the review, I was hoping to read something about the use of the iPad as an educational tool. This section was sadly missing.

    I am an engineering grad student currently looking for a good PDF reader. The only viable options for me are the new iPad or the Kindle DX (the 6" Kindle is too small). While the Kindle does sound good, the problem is that some of my PDF books are over 100MB and full of mechanical drawings. Will the Kindle be able to handle that, especially if I want to jump pages frequently or refer to multiple books side by side? I have never seen a Kindle in person, so anyone who has used one, please comment.

    Reading ebooks on my laptop is a pain. I often read through the night, and that is not comfortable on a laptop: the vertical height is too small, and I often end up turning the laptop 90 degrees to read. Not to mention that carrying a laptop around in your hand is impossible for long durations, and the zoom options in Adobe Reader are just weird. In short, I am really uncomfortable reading on a laptop. I have tried both a 14" 1366x768 screen and a 17" 1920x1080 screen.

    On the other hand, the iPad gives me the advantage of iOS. I will be able to watch OCW videos on it as well as view my college slides (PPT). iPad owners, please comment: can I play .avi or RealMedia files on it, directly or through an app? I can also use educational apps like Khan Academy, and it can serve as a note-taking device. The disadvantage of the iPad is that reading on it through the night will probably leave me blind in a year or so. I have myopia and my prescription is -8D. That is one BIG disadvantage, or so I have read. I have never used an iPad, so perhaps someone who reads on one for hours at a stretch can share their experience.

    I am really confused about this, so I hope the collective intelligence of this site will help me make an informed decision. I would also really like to see a page in tablet reviews that talks about reading and note-taking abilities and the educational purposes these devices can serve.
    Reply
  • Monobazus - Thursday, March 29, 2012 - link

    I understand your disappointment with the omission here of any specific discussion of the advantages or disadvantages of using the iPad as a book reader. After all, that is probably one of the main uses of the iPad, apart from browsing the web or checking email or Facebook posts. But anandtech.com is mainly a tech site for geeks and technically oriented people, and we must understand that a special emphasis on specs and speeds is more interesting for the majority of its readers. For an analysis of your question, you could perhaps look at one of the various sites that deal with ebook readers. Unfortunately most of them, as far as I can tell, don't have the level of expertise or care that anandtech.com puts into its analysis (see http://www.the-ebook-reader.com/ipad-3.html as an example).
    Now to your questions. I have no direct experience with the new iPad or the Kindle DX. I have an iPad 1 and a Kindle 3 (the one with the 6" screen and no touch controls). I haven't yet seen the new retina display of the iPad, but from what I've been reading it's much better for text than the previous editions. I doubt, however, that it is as easy on the eyes as eInk screens are; those are reflective and, as such, closer to paper than LCD screens. From my experience (I'm an intensive reader and use glasses, due to my advanced age), eInk screens don't put as much stress on the eyes as emissive screens do. If you are planning to read through the night with an LCD screen, use indirect ambient light and plan for frequent rest periods.
    On the other hand, handling PDFs on the Kindle is an awful experience. A DX is certainly better than a 6" model, no doubt, because the bigger screen allows for larger type; on a 6" screen you can forget PDFs. You can't read them. If your typical PDFs can be accommodated on a 9.7" screen without zooming, then a DX could be the eReader for you. But be careful with the illustrations: I think the DX has the same controls as the 6" non-touch Kindle. If that's the case, be prepared for a bad experience with illustrations, especially if they are detailed and need zooming (or if they are in colour). The DX is a non-touch machine; the iPad's touch controls are much better.
    You can't view two documents side by side on either of these readers: not on the iPad and not on the Kindle. For that you need a laptop. On the iPad you can use a trick: open one document in one app (say, the eBook app) and the other in another app (say, the Kindle reader). By switching rapidly between them, you can see the two documents in rapid succession. You can't do that on the Kindle. But this is a trick, a compromise, and not the same as seeing two documents side by side.
    As for viewing PPTs and videos, the iPad is the way to go. There are apps for that; the Kindle has no such capability.
    In the end, my advice is this: try to get access to an iPad before buying, and see if it meets your expectations for reading clarity and comfort. Getting access to a DX before buying may be more difficult, because few people have them; I have yet to see one, and they have been around for several years.
    I'm sorry if these considerations haven't been useful to you.
    Reply
  • Monobazus - Thursday, March 29, 2012 - link

    See this YouTube analysis of the Kindle DX with pdf's: http://www.youtube.com/watch?v=bVPBCD0GgBw&fea... Reply
  • adityarjun - Thursday, March 29, 2012 - link

    Thank you very much for your reply.

    It does seem as if neither of the two fits my needs perfectly, so I will have to make a compromise.
    A 6" Kindle or 7" tablet is out of the question; it is just too small to read on comfortably.

    The Kindle DX's screen and size seemed good to me, but if you say it can't handle PDFs comfortably then it is of no use to me. I will not be viewing any newspapers or magazines, nor will I be surfing the net with it.

    The only other option that remains is the iPad. The pro is that it should be able to handle large PDFs (going by videos on YouTube) as well as all my videos.
    The con is the eye strain.

    Is it really as bad as some sites make it out to be, especially compared to an e-ink reader?

    I will try to get my hands on an iPad and use it for a day or two, but come to think of it, the screen can't be that much more stressful than a normal laptop's, can it? And I have been reading SoC reviews on Anandtech since morning...

    Damn, I am really gonna go blind at this rate. *summons immense willpower and tries to close anandtech* * fails :-) *
    Reply
  • mr_ripley - Thursday, March 29, 2012 - link

    I keep and read all my technical pdf files on the ipad (textbooks, reports, memos, drawings, etc). I use an app called GoodReader which is absolutely amazing with all kinds of pdfs.

    Regarding eye strain, I usually keep my brightness setting at around 50% and zoom in to make the font large, which strains my eyes a little less and definitely less than a desktop screen. The sharp font on the new retina screen helps as well. That said I will admit it is not as easy on the eyes as an e-ink display.
    Reply
  • tbutler - Thursday, March 29, 2012 - link

    Honestly? I think the iPad's screen (even the first iPad, let alone the new one) gives me significantly *less* eyestrain than eInk, and I've owned a couple of Sony eInk readers.

    For me, the key eyestrain issue between the two is contrast. eInk displays are a light grey background with dark grey text, and in bright lighting the contrast is fine. But in less than bright lighting - for example, an indoor room without either a ceiling light fixture or multiple floor lamps - I start having trouble with distinguishing the text. Even a 40-year-old yellowing paperback is easier for me to read under those conditions. While you can use a clip-on reading light, I find that both clunky and less effective than it would be on paper.

    The iPad (and really, any backlit LCD screen) has the 'stare into backlight' issue; but honestly, this is rarely a problem for me, and in particular it's much less of a problem than eInk contrast issues. Backlit color LCDs also wash out in bright sunlight, but not in even the most brightly-lit interior room, in my experience - however, for me this isn't a significant issue, since I spend much more time reading indoors than outdoors.

    So just in terms of legibility, I'd pick the iPad (or the nook Color/Tablet) over any of the eInk readers I've used. And that's leaving out issues of software and PDF handling.
    Reply
  • Riseofthefootclan - Thursday, March 29, 2012 - link

    I entered the tablet market this year in hopes of enhancing my school experience. I was looking for a device that would do the following: reading textbooks, slides, notes, watching video etc.

    I too looked at the kindle, but I will tell you now that for what you want I'd avoid it.

    I first purchased a Samsung Galaxy Tab 10.1 LTE. I wanted Internet everywhere I went, but soon became frustrated with the Android operating system (inconsistent, chunky, etc.).

    After playing with an iPad 2 in the store, I realized it was a much better experience. Fluid and problem free.

    A month later the iPad 3 (new iPad) is released. After playing with it I realized how much better the screen was, and how much that impacted the experience (especially for someone who primarily uses the device for text consumption).

    So now, here I sit with a 32GB LTE iPad 3. I don't regret the purchase one bit. Armed with a Bluetooth keyboard, or just the on-screen variant, I can also take notes quite competently (I wrote this entire thing with the on-screen keyboard).

    Best educational tool I have ever purchased. In my hands I can carry my one stop shop for web browsing, email, textbooks, fictional books, course materials, lectures and even games.

    Coming from an iPad 2, I'd go so far as to say it was well worth the upgrade.

    I highly recommend picking one of these up, as I believe it will fit your bill of requirements to a tee.
    Reply
  • adityarjun - Thursday, March 29, 2012 - link

    Thanks, dude! And all the others who replied. I guess I will be picking up a 32GB LTE version of the iPad.
    Do you guys know whether the iPad has an international warranty? If I were to buy it in the US and import it here, would I have warranty coverage?
    And how many years of warranty does it have? Is it a replacement warranty, i.e., if anything is broken they give you a new iPad, or a normal warranty?
    This is another aspect the review didn't cover. A paragraph detailing the warranty and tech support should have been there, imo.
    Reply
  • adityarjun - Thursday, March 29, 2012 - link

    Oops, forgot to add this in the comment above: which keyboard are you using? I think I will pick the Logitech one.
    And any good stylus?
    Also, for protection I guess I will go with a Zagg shield and the Smart Cover. Will that be enough?
    Reply
  • OCedHrt - Thursday, March 29, 2012 - link

    How come the review starts with the 10.1-inch 1920 x 1200 Super IPS+ tablet but all the comparisons are with the 1280 x 800 tablet? Reply
  • adityarjun - Thursday, March 29, 2012 - link

    I am not sure, but I don't think those tablets are on the market yet; that was just a comparison of specs. Later on there is a comparison with the other major tablets currently available, i.e. the iPad 2 and the Transformer Prime. Reply
  • OCedHrt - Thursday, March 29, 2012 - link

    Says 40 nm on page 2 and 45 nm on page 6. Reply
  • g1011999 - Thursday, March 29, 2012 - link

    On the page "The A5X SoC", in the table "ARM Cortex A9 Based SoC Comparison":

    The cell for "A5X" / "Memory Interface to the CPU" should read "Quad channel (128-bit)".
    Reply
  • Ryan Smith - Thursday, March 29, 2012 - link

    Actually that's correct as it stands. The memory interface to the CPU is 64bit on the A5X. The other two memory channels go to the GPU, rather than the CPU. Reply
  • g1011999 - Thursday, March 29, 2012 - link

    No, those memory controllers are multi-port AXI controllers, connected to the L2 cache controller, the system fabric, and the GPU.

    The L2 cache controller is connected to all of those DRAM controllers (128 bits in total), either through a direct connection (a memory adapter, as in the OMAP4470) or through the system AXI bus, so the CPU can access all of the memory.

    The A5X is an SoC coupled with a 128-bit, quad-channel DRAM interface, regardless of whether the bandwidth from the CPU (L2 cache) to memory is sufficient or not.

    The IP blocks on the SoC (CPU, video codec, display controller, GPU, camera interface, ...) can take advantage of the 128-bit memory interface with less chance of congestion.
    Reply
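For context, the peak bandwidth of such a quad-channel interface is simple to work out. A sketch assuming LPDDR2-800 (800 MT/s), which is what the A5X was widely reported to use; the memory speed is an assumption here, not something stated in this thread:

```python
# Theoretical peak bandwidth of a quad-channel (4 x 32-bit) LPDDR2 interface.
channels = 4
bus_width_bits = 32    # per channel
transfer_rate = 800e6  # LPDDR2-800: 800 MT/s (assumed)

bytes_per_transfer = channels * bus_width_bits / 8  # 16 bytes across the full 128-bit bus
bandwidth_gbps = bytes_per_transfer * transfer_rate / 1e9

print(bandwidth_gbps)  # 12.8
```

That ~12.8 GB/s figure is why the wide interface matters for driving the retina display, whichever blocks sit behind the controllers.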
  • PeteH - Friday, March 30, 2012 - link

    And how do you know the internal system bus is AXI? Reply
  • damianrobertjones - Thursday, March 29, 2012 - link

    " It's got everything but the kitchen sink. "

    SD card reader?
    USB port?
    Ethernet port?
    Removable battery?
    Move files straight from the device to a pc without ANY software?

    The list could go on.
    Reply
  • darkcrayon - Thursday, March 29, 2012 - link

    But there is WiFi, optional LTE, Bluetooth 4, and a dock connector (which includes USB) to handle many similar functions. Reply
  • mr_ripley - Thursday, March 29, 2012 - link

    Oh, and also:

    a slot to plug in my punch cards and floppy discs
    and it would be nice if it had a disc drive so I could play my audio CDs
    maybe connect to dial-up modems too
    ......

    all because I could not move on to better ways of doing things. Everything you mention is either obsolete or redundant!!
    Reply
  • dagamer34 - Friday, March 30, 2012 - link

    I think you want a laptop.... Reply
  • jihe - Thursday, March 29, 2012 - link

    "I said I wanted to give it a shot at being a real productivity device"
    That is where you went wrong. Pads are toys and nothing more.
    Reply
  • repoman27 - Thursday, March 29, 2012 - link

    You might want to try running that last statement by a pilot, doctor or teacher. They're not terribly optimized for content creation at this point, but if you think that their value does not extend beyond mere entertainment, then you really haven't considered the possible use cases for these devices. Reply
  • mavere - Friday, March 30, 2012 - link

    Don't forget lawyers.

    Lots and lots of trees have been saved since the iPad's introduction.
    Reply
  • neoabraxas - Thursday, March 29, 2012 - link

    I find it absolutely ridiculous that someone who does not appreciate the tablet form factor is offering their thoughts on the new iPad.

    Is there really nobody at Anandtech who genuinely enjoys tablets and can write a summary that is aimed more at tablet enthusiasts?

    Bloggers like you do tend to write a lot on their devices. Most people don't. For them tablets are media consumption devices. I'm a programmer. When I get home the last thing I want to do with my computing equipment is type more. For me a tablet is ideal.
    Reply
  • vol7ron - Thursday, March 29, 2012 - link

    To be honest, the last thing I want to do on a tablet/smartphone is type. Unless it's with the Transformer (or similar) keyboard, but even then it's still not what I want to use a tablet for.

    The main thing, for me, is reading.
    Reply
  • MrCromulent - Thursday, March 29, 2012 - link

    Great review! I was looking forward to reading it when I saw it posted yesterday evening.

    One point I'm missing from every review though: Has the touchscreen sensitivity / resolution changed in any way? The doubling of display resolution does not imply doubling of the touch input resolution, right? I love the iPad, but I always found it almost unusable for any kind of handwritten input (be it finger or stylus).
    Reply
  • SixOfSeven - Thursday, March 29, 2012 - link

    Any chance the glass is less likely to shatter on this one than on its predecessors?

    I didn't think so.
    Reply
  • darkcrayon - Thursday, March 29, 2012 - link

    The iPad 2's glass was much more resilient than that on the 1, so who knows. I wonder if it's Gorilla Glass 2 and that's where a bit of the weight savings came from on the new iPad, considering the battery is so much larger yet the device is only slightly heavier. Reply
  • pdjblum - Thursday, March 29, 2012 - link

    So you convinced yourself there is a use for it as it is "the world's greatest netbook" to your mind. Yet you can get a much more powerful intel notebook for about the same price or less. It is nothing more than an expensive indulgence. It is anything but enthusiast gear. Oh, I forgot, this has become another gadget site that loves crapple. Reply
  • kepler - Thursday, March 29, 2012 - link

    That isn't true at all. Wait until you read the review for the Transformer Prime Infinity, it will be just as detailed, and I'm sure they'll like it just as much (or more) than the iPad3.

    I dislike Apple for a number of reasons, but I don't feel AnandTech has shown any bias.
    Reply
  • darkcrayon - Thursday, March 29, 2012 - link

    Not that you're anywhere close to reality, but I wonder why all these "gadget sites" seem to "love" Apple? Maybe because they make good products that people enjoy using? Naah, must be some hidden conspiracy. You're the one that knows the real deal, right?

    Everyone knows you can get a more powerful intel netbook for the same price. You can also get a more powerful intel netbook than smartphones which also cost more than the iPad. Oddly enough, you will not find an intel netbook on the market now with a screen anywhere near as nice as the iPad. Cool times we live in, eh?
    Reply
  • doobydoo - Sunday, April 01, 2012 - link

    Can you tell me which 'more powerful' Intel notebook you can get which is capable of gaming at 2048 x 1536 at 60 fps, has a battery life over 9 hours, is ultra-portable and light, instantly turns on, and has a camera and built-in 4G, at less than or the same price as the iPad?

    Good luck.
    Reply
  • Lil Cheney - Thursday, March 29, 2012 - link

    Wondering why, as you review the A5X, you never use a Snapdragon chip for comparison, in addition to the Tegra 2 and 3? Reply
  • PeteH - Thursday, March 29, 2012 - link

    Might be the lack of a shipping product to benchmark it against. The only performance numbers I've seen for an S4 came from Qualcomm's reference design. Reply
  • dagamer34 - Friday, March 30, 2012 - link

    I'm actually not aware of any major tablets that use a Snapdragon chip. Most went with Tegra 2 early on, then moved to the OMAP 4 platform later in the year last year. Reply
  • siddharth7 - Thursday, March 29, 2012 - link

    Well, the review is just amazing! Though it's later than other sites, it was worth the wait. You went into so much detail that I was just blown away. Photos are also great. Waiting for more reviews like this.
    Keep up the good work.

    BTW, am I the only Indian commenting here expect the staff.

    Thanks Anandtech.
    Reply
  • vol7ron - Thursday, March 29, 2012 - link

    Think you meant "except" :) Reply
  • adityarjun - Friday, March 30, 2012 - link

    No, you aren't Reply
  • siddharth7 - Friday, March 30, 2012 - link

    Yeah! Kind of a typo :). Also forgot the question mark. :-)
    Wanted to edit it, but was not able to after posting. :)
    Reply
  • Xyraxx - Thursday, March 29, 2012 - link

    Ok, in gaming the TF clearly came out ahead. Why the backhanded commentary in that section? I don't see that for the sections that the iPad clearly won. The TF takes the overall top spot for its gaming performance. But instead of commentary on that, we get aggressive talk about how they should be pushing even further ahead, and how they are failing at it.

    The controller compatibility is an absolute win for the Android side, but instead of talk about that, we get this "Yeah, but who says controllers will win over touch". It's like every advantage the iPad doesn't win gets trotted out and downplayed as if to say it doesn't mean anything, or somehow doesn't matter.
    Reply
  • darkcrayon - Thursday, March 29, 2012 - link

    Interesting. I got a different impression entirely. It seems like games specifically optimized for the Tegra 3 by NVIDIA were somewhat better visually, but the iPad has a more extensive game library, and considering the GPU is far more powerful than the Tegra 3's, it's only a matter of time before there are far better graphics to be had on iPad games.

    Though there is no OS-level controller support in iOS, both Bluetooth and dock connector controllers are possible (hence the iControlPad, iCade, and a few others). It may be that more games support them now on Android, but nothing is stopping developers from supporting them on iOS at this point.

    Finally, it's pretty important that ~100% of the iPad games in the App Store will run on the new iPad, which can't be said for the TF as shown in the review, or probably for any other individual Android tablet.
    Reply
  • mr_ripley - Thursday, March 29, 2012 - link

    It seems to me that the iPad does charge when I plug it into the USB port on my MacBook Pro. In fact I was surprised to see that not only did it show the "plugged-in" icon rather than "not charging", it also seemed pretty fast (I will have to try it again to see if it was as fast as the power charger).

    However, when I tried this morning to plug it into my Lenovo laptop it showed "not charging". Does this only work when plugged into Apple products?
    Reply
  • vol7ron - Thursday, March 29, 2012 - link

    This question seems better suited for the Apple support forums. Reply
  • doobydoo - Sunday, April 01, 2012 - link

    This reply seems better suited to the kids-r-us forum. Reply
  • Aenean144 - Thursday, March 29, 2012 - link

    Modern Macs have special USB ports that output 7 to 8 Watts of power (~5V at ~1.6A). Most Windows machines, other non-Macs, and older Macs output about 4.5 Watts max. There's a USB battery charging specification, but I'm not aware of any computers that have implemented it.

    So most Macs should be able to charge an iPod, iPad, or iPhone relatively quickly. A PC with the normal USB specs will typically do it a bit slower, on the order of 40 to 50% slower. 4.5 Watts is basically the bare minimum: doable on an iPad 2 with the screen off, but the 2012 iPad will be tough. If you have the screen off and turn off WiFi and Bluetooth, it'll charge OK if Apple lets it. The screen has to be off with a 4 Watt power source.
    Reply
  • mr_ripley - Thursday, March 29, 2012 - link

    Ah ha! Thanks for the clarification. When I saw that I was able to charge my new ipad while using it with my Macbook pro USB, I neglected to bring along my charger to work. Seems like that was unwise.

    But I still have around 50% power left so should be fine for today. And yes I do use it at work.
    Reply
  • name99 - Friday, March 30, 2012 - link

    Just to clarify, this is NOT some Apple proprietary thing. The Apple ports are following the USB charging spec. This is an optional part of the spec, but any other manufacturer is also welcome to follow it --- if they care about the user experience. Reply
  • darkcrayon - Thursday, March 29, 2012 - link

    All recent Macs (last 2-3 years) can supply additional power via their USB ports which is enough to charge an iPad that's turned on (though probably not if it's working very hard doing something). Most non-Mac computer USB ports can only deliver the standard amount of USB power, which is why you're seeing this.

    Your Lenovo *should* still recharge the iPad if the iPad is locked and sleeping, though it will do so very slowly.
    Reply
  • dagamer34 - Friday, March 30, 2012 - link

    I did the calculations and it would take about 21 hours to recharge an iPad 3 on a normal, non-fast-charging USB port from dead to 100%. Keep in mind, we're talking about a battery that's larger in capacity than the 11" MacBook Air's. Reply
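The 21-hour figure is easy to sanity-check. A rough back-of-the-envelope in Python, assuming the new iPad's ~42.5 Wh battery (Apple's published spec), a standard 2.5 W USB 2.0 port (5V at 500mA), and an assumed ~80% charging efficiency:

```python
# Rough charge-time estimate for the 2012 iPad on a standard USB 2.0 port.
# Assumptions: 42.5 Wh battery, 5 V * 500 mA port, ~80% charge efficiency.

battery_wh = 42.5            # new iPad battery capacity in watt-hours
usb_power_w = 5.0 * 0.5      # 5 V * 500 mA = 2.5 W from a standard port
efficiency = 0.8             # assumed fraction of input power stored

hours = battery_wh / (usb_power_w * efficiency)
print(f"about {hours:.0f} hours")  # about 21 hours
```

With the device awake and the screen on, most of that 2.5 W goes to running the tablet instead of the battery, which is why some ports report "not charging" entirely.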
  • snoozemode - Thursday, March 29, 2012 - link

    http://www.qualcomm.com/media/documents/files/snap... Reply
  • Aenean144 - Thursday, March 29, 2012 - link

    Anandtech: "iPhoto is a very tangible example of where Apple could have benefitted from having four CPU cores on A5X"

    Is iPhoto really the kind of app that can actually take advantage of 2 cores? If there is batch image processing functionality, certainly, though I don't know if iPhoto for iOS has that type of functionality. The slowness could just be from a 1.0 product, and further tuning and refinement will fix it.

    I'm typically highly skeptical of the generic "if the app is multithreaded, it can make use of all of the cores" line of thought. Basically all of the threads, save one, are typically just waiting on user input.
    Reply
  • Anand Lal Shimpi - Thursday, March 29, 2012 - link

    It very well could be that iOS iPhoto isn't well written, but in using the editing tools I can typically use 60 - 95% of the A5X's two hardware threads. Two more cores, at the bare minimum, would improve UI responsiveness as it gives the scheduler another, lightly scheduled core to target.

    Alternatively, a 50% increase in operating frequency and an improvement in IPC could result in the same net benefit.

    Take care,
    Anand
    Reply
  • shompa - Friday, March 30, 2012 - link

    *hint* Run top on an iOS/Android device and you will see 30-60 processes at all times. The single-threaded, single-program thinking is Windows specific and has been solved on Unix since the late 1960s. Today's Windows phones are all single threaded because the Windows kernel is not good at multithreading.

    With many processes running, it will always be beneficial to have additional cores. Apple has also solved it in OS X by adding Grand Central Dispatch to their development tools, making multithreaded programs easy.

    iPhoto for iPad: editing 3 million pixels demands a huge amount of CPU/GPU time + memory. Apple has so far been able to program elegant solutions around the limits of ARM CPUs by using NEON SIMD extensions and GPU acceleration. An educated guess is that iPhoto is not fully optimized and will be at a later time.

    (The integrated approach gives Apple a huge advantage over Android since Apple can accelerate stuff with SIMD. Google does not control the hardware and can therefore not optimize its code. That is one of the reasons why the single-core A4 was almost as fast as dual-core Tegras. I was surprised when Google managed to implement their own acceleration in Android 4.x. Instead of SIMD, Google uses GL, since all devices have graphics hardware. This is the best feature in Android 4.x.)
    Reply
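The batch-editing point in this thread can be made concrete with a small sketch. This is plain Python (not iOS/GCD code) using the standard-library multiprocessing pool; the filter function and tile sizes are hypothetical, but it shows how an edit made of independent work items scales across however many cores the scheduler has:

```python
from multiprocessing import Pool, cpu_count

def apply_filter(pixels):
    # Stand-in for a per-tile edit (here, a simple brightness bump);
    # real iPhoto-style filters would do far more work per pixel.
    return [min(255, p + 10) for p in pixels]

def edit_image(tiles):
    # Farm the independent tiles out to one worker process per core.
    with Pool(processes=cpu_count()) as pool:
        return pool.map(apply_filter, tiles)

if __name__ == "__main__":
    # A toy "image": 64 independent tiles of 4096 pixel values each.
    tiles = [[i % 256] * 4096 for i in range(64)]
    edited = edit_image(tiles)
    print(len(edited), "tiles processed")  # 64 tiles processed
```

On iOS the analogous pattern would be dispatching blocks onto a concurrent GCD queue; the point is simply that independent work items let the OS use every core it has, while a UI thread waiting on input benefits little from extra cores.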
  • name99 - Thursday, March 29, 2012 - link

    [quote]
    Apple’s design lifespan directly correlates to the maturity of the product line as well as the competitiveness of the market the product is in.
    [/quote]

    I think this is completely the wrong way to look at it. Look across the entire Apple product line.
    I'd say a better analysis of chassis is that when a product first comes out, Apple can't be sure how it will be used and perceived, so there is some experimentation with different designs. But as time goes by, the design becomes more and more perfected (yes yes, if you hate Apple we know your feelings about the use of this word) and so there's no need to change until something substantial drives a large change.

    Look, for example, at the evolution of the iMac from the Luxo Jr. version to the white all-in-one flatscreen, to the current aluminum-edged flatscreen, which is largely unchanged for what, five or six years now. Likewise for the MacBook Pro.
    Look at the MacBook Air. The first two revs showed the same experimentation, trying different curves and angles, but Apple (and I'd say customers) seems to feel that the current wedge shape is optimal --- a definite improvement on the previous MBA models, and without anything that obviously needs to be improved. (Perhaps the sharp edges could be rounded a little, and if someone could work out the mechanicals, perhaps the screen could tilt further back.)

    And people accept and are comfortable with this --- in spite of the "people buy Apple as a fashion statement" idiocy. No-one will be at all upset if the Ivy Bridge iMacs and MBAs and Mac Minis look like their predecessors (apart from minor changes like USB3 ports) --- in fact people expect it.

    So too for the iPhone and iPad. Might Apple keep using the same iPhone 4 chassis for the next two years, with only minor changes? Why not? There's no obvious improvement it needs.
    (Except, maybe, a magnet on the side like iPad has, so you could slip a book-like case on it that covered the screen, and switched it on by opening the book.)
    Likewise for iPad.

    New must have features in phones/tablets (NFC? near-field charging? waterproof? built-in projector like Samsung Beam?) might change things. But absent those, really, the issue is not "Apple uses two year design cycles", it is "Apple perfects the design, then sticks with it".
    Reply
  • mr_ripley - Thursday, March 29, 2012 - link

    "In situations where a game is available in both the iOS app store as well as NVIDIA's Tegra Zone, NVIDIA generally delivers a comparable gaming experience to what you get on the iPad... The iPad's GPU performance advantage just isn't evident in those cases..."

    Would you expect it to be if all the games you compare have not been optimized for the new iPad yet? They run at great frame rates but suffer in visuals, or are only available at iPad 2 resolutions. The Tegra Zone games are clearly optimized for Tegra while their iOS counterparts are not optimized for the A5X, so of course the GPU advantage is not evident.

    This comparison does not seem fair unless there is a valid reason to believe that the Tegra Zone games cannot be further enhanced/optimized to take advantage of the new iPad hardware.

    I suspect that games optimized for the A5X will offer tangibly superior performance and experience. And the fact that real-world performance suffers today does not mean we will not see it shortly.
    Reply
  • Steelbom - Thursday, March 29, 2012 - link

    Exactly this. Reply
  • Steelbom - Thursday, March 29, 2012 - link

    I'm curious why we didn't see any graphics benchmarks from the UDK like with the iPhone 4S review? Reply
  • Craig234 - Thursday, March 29, 2012 - link

    Wow, this is good to buy... 'if you are in desperate need for a tablet'?

    That's a pretty weak recommendation, I expected a much stronger endorsement based on the review.
    Reply
  • Chaki Shante - Friday, March 30, 2012 - link

    Great, thorough review, thanks Anand et al.

    Given the sheer size of the SoC (like 4x larger than the Tegra 2 or OMAP4430, and 2x the Tegra 3), you'd bet Apple has the fastest current SoC, at least GPU-wise.

    This SoC is just huge and Apple's margin is certainly lowered. Is this sustainable in the long run?

    I wonder if any other silicon manufacturer could make devices of the same size (not technologically, but from a price perspective) and expect to sell them.
    Reply
  • dagamer34 - Friday, March 30, 2012 - link

    No one else needs to crank out so many chips that are the same. Also, other companies will be waiting long enough to use 28nm, so there's little chance they'll be hitting the same size as the A5X on 45nm. Reply
  • Aenean144 - Friday, March 30, 2012 - link

    Since Apple is both the chip designer/licensee and the hardware vendor, it saves them the cost of paying a middleman. I.e., NVIDIA has to make a profit on a Tegra sale; Apple does not, and can afford a more expensive chip from the fab compared to the business component chain from ASUS to NVIDIA to GF/TSMC and other IP licensees.

    I bet there is at least 50% margin somewhere in the transaction chain from ASUS to NVIDIA to GF/TSMC. Apple may also have sweetheart IP deals from both ARMH and IMGTEC that competitors may not have.
    Reply
  • shompa - Friday, March 30, 2012 - link

    @Aenean144

    Tegra 2 cost 25 dollars for OEMs and 15 dollars to manufacture. The A5 cost Apple 25 dollars to manufacture. By designing its own SoC, Apple got a 30% larger SoC at the same price as Android OEMs.

    Tegra 3 is huge. That is a problem for NVIDIA. It costs at least 50% more to manufacture. NVIDIA is rumored to charge 50 dollars for the SoC.

    The A5X is 50%+ larger than Tegra 3. Depending on yields, it costs Apple 35-50 dollars per SoC.

    The integrated model gives Apple cheaper SoCs, and also ones custom designed for its needs. Apple has a long history of accelerating stuff in its OS. Back in 2002 it was AltiVec: encoding a DVD on a 667MHz PowerBook took 90 minutes, while on the fastest x86, a 1.5GHz AMD, it took 15 hours (and it was almost impossible to keep XP from bluescreening for 15 hours under full load). Since 2002 Apple has accelerated OS X with Quartz Extreme. Both of these techniques are now used in iOS, with SIMD acceleration and GPU acceleration. It's much more elegant than the brute-force x86 approach. Integration makes it possible to use slower, cheaper and more efficient designs.
    Reply
  • shompa - Friday, March 30, 2012 - link

    The A5X SoC is a disaster. It's a desperation SoC that had to be implemented when TSMC's 28nm process slipped by almost 2 years. That is the reason why Apple did not tape out a 32nm A5X at Samsung. PA Semi had to crank out a new tapeout fast with existing assets, so they took the A5 and added 2 more graphics cores.

    The real A6 SoC has probably been ready for a long time, but TSMC can't deliver enough wafers. The rumored tapeout for the A6 was mid-2011. Apple got test wafers from TSMC in June and another batch of test wafers in October. Even at that point, Apple believed they would use TSMC for the iPad 3.

    ARM is about small, cheap and low-power SoCs. That is the future of computing. The A5X is larger than many x86 chips. Technically, Intel manufactures many of its CPUs more cheaply than Apple manufactures the A5X SoC. That is insane.
    Reply
  • stimudent - Friday, March 30, 2012 - link

    Product reviews are fun to look at, but where there's a bright side, there is always a dark side. Maybe product scoring should also reflect how a manufacturer treats its employees. Reply
  • name99 - Friday, March 30, 2012 - link

    You mean offers them a better wage than they could find in the rest of China, and living conditions substantially superior to anywhere else they could work?
    Yes, by all means let's use that scoring.

    Or perhaps you'd like to continue to live your Mike Daisey dystopia because god-forbid that the world doesn't conform to your expectations?
    Reply
  • Craig234 - Friday, March 30, 2012 - link

    I'm all for including 'how a company treats its employees' and other social issues; but I'd list them separately, not put them in a product rating. Reply
  • mr_ripley - Friday, March 30, 2012 - link

    It's a shame some people argue against the workers when over a hundred of them have committed suicide over the working conditions. How can you still say that they are being offered a better deal here??

    On the other hand, it is also unfair that Apple is being singled out here. The world of Chinese manufacturing is a dirty one and all major corporations have a part in it. I'd trust Apple over most other companies to make a difference, and I'm happy to see something is being done in that regard. Ever heard of McDonald's CEO touring the slaughterhouses of the meat packing companies??
    Reply
  • name99 - Friday, March 30, 2012 - link

    Reporting suicides as a number not as a rate shows you to be either a fool or a deliberate liar. How many people, over how many years, comprise the pool from which this suicide number is drawn? Everything I have read says that the actual suicide rate is not only lower than the average rate for China, it is lower than the average rate for the US. Reply
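The number-versus-rate distinction above can be made concrete. In this quick sketch the workforce size and national rates are rough outside figures, not from this thread (Foxconn's workforce was widely reported as being on the order of a million people, and the national rates are approximate), so treat the output as illustrative only:

```python
# Number-vs-rate sketch. All inputs below are assumptions for
# illustration, not figures established in this comment thread.

deaths = 14                  # completed suicides reported for 2010
workforce = 1_000_000        # assumed order-of-magnitude headcount

foxconn_rate = deaths / workforce * 100_000
print(f"Foxconn: {foxconn_rate:.1f} per 100k per year")  # 1.4 per 100k

# Approximate national rates widely cited at the time, per 100k per year:
china_rate = 22.0
us_rate = 12.0
print(foxconn_rate < us_rate < china_rate)  # True
```

A raw count with no denominator can make a below-average rate look like an epidemic, which is the statistical point being argued here; it says nothing either way about working conditions.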
  • mr_ripley - Friday, March 30, 2012 - link

    In 2010, 18 workers attempted suicide; 14 succeeded. To me even one in a whole year is not acceptable. If you think that is ok, I hope that statistic turns out to be you!! Reply
  • name99 - Saturday, March 31, 2012 - link

    The argument was NOT that suicide is a tragedy; it was a claim that Foxconn employees specifically tied to Apple production have such lousy lives that they commit suicide in higher numbers than other people around the world.

    You have done NOTHING to prove this claim; all you have done is bring up a very different issue.
    Reply
  • mr_ripley - Saturday, March 31, 2012 - link

    There is no disputing the fact that these deaths are related to working conditions. I'm pretty sure this has been well established and documented. However, I did say in my previous post that Apple is unfairly singled out. It could have been any other company.

    Comparison between suicide rates is irrelevant. Higher suicide rates elsewhere do not justify this problem. Again, the fact remains that many people have died and it is directly related to the working conditions.

    Apple happens to be in a position to directly influence their lives and make them better; after all, they profit in billions from the work these people do. Corporations typically place little value on human life and living conditions (IBM sold equipment to the Nazis to track the Jews in concentration camps). Somehow, I feel Apple is different.
    Reply
  • doobydoo - Sunday, April 01, 2012 - link

    Dude, sorry but you're talking no sense at all.

    First of all, pretty much any product you want to buy, electronics-wise, uses parts from China, where conditions are far worse on average than in Apple's factories. So if you actually factored working conditions into the product review, it would look favourable for Apple.

    Secondly, your argument that comparison between suicide rates is irrelevant is absurd. Higher suicide rates in places where legislation ensures no job has conditions so terrible that suicide is the only option prove that even where working conditions are good, you still get some depressed people. Your argument, therefore, is with the people who committed suicide. You say it is 'directly related to the working conditions', but where have you evidenced this at all? You simply haven't. The fact that the suicide rates at Apple's factories are lower than at some American workplaces further backs up my point on this.

    Every company is in a position to change lives and make them better. You too are in a position to do this. But guess what: you, just like companies, can do WHATEVER YOU LIKE with your OWN MONEY and have NO OBLIGATION WHATSOEVER to solve the world's problems. Apple already has among the best factory conditions in China. The amount of profit they make is absolutely irrelevant; if you say Apple should be putting money into this, then a lot more manufacturers should also put a lot more money into this. It's very easy to decide what other people 'should' do with their money, isn't it?

    Corporations don't have to adhere to moral values --- they are not people. They are there solely to make money. Nothing else. Don't confuse them with people. And I hope you donate every single spare penny to charity and spend every spare second of your time building homes in the third world. Oh wait, you're on here crying that other people should do it instead.

    Get a hold of yourself, you illogical fool.
    Reply
  • mr_ripley - Sunday, April 01, 2012 - link

    Like I have said before it is a shame some people argue with great zeal against others who are suffering and devalue human life. Fortunately, Tim Cook is not one of them.

    If scores of people killing themselves citing poor working conditions is not enough proof, what is? If your claim that there are work environments in America with higher suicide rates because of working conditions is true, that needs to be investigated and rectified as well.

    You give charity to people who are in need and cannot earn for themselves. If you think giving someone fair compensation for hard work is charity, you are delusional.

    If working in those factories is such a pleasant experience, I suggest you try it out for yourself. Maybe the experience might broaden your perspective.

    Although I don't see the point, I will attempt to educate you. Legally, a corporation is considered a person, that is right, just like a live human being. Regardless of that, corporations are run by people, and the actions of a corporation reflect upon the morality of the people running them.

    I will stop here as there is no point in continuing but you can respond with more insults and accusations of what I do or have done which frankly is no concern of yours.
    Reply
  • PeteH - Monday, April 02, 2012 - link

    I've not seen a single report of people killing themselves and citing "poor working conditions" as the reason. Can you provide a link?

    There have been reports of people killed because of unsafe working conditions, but that's a different issue. Maybe you're confusing the two.
    Reply
  • mr_ripley - Tuesday, April 03, 2012 - link

    Here's a Wikipedia link: you can read some of the circumstances and judge for yourself.

    They may not have said it in so many words, but it is clear they were unhappy with their work environment.

    Imagine your boss coming and beating you up because you lost an iPhone prototype!!!
    Reply
  • mr_ripley - Tuesday, April 03, 2012 - link

    http://en.wikipedia.org/wiki/Foxconn_suicides Reply
  • PeteH - Tuesday, April 03, 2012 - link

    I have to be honest, after reading through that link I didn't see anything that even implied working conditions had anything to do with the suicides of the factory workers. The only suicide for which there was any real information provided was that of the worker who killed himself after losing the iPhone prototype, and in that case the victim wasn't a factory worker, but someone in logistics.

    Did working conditions have anything to do with the factory worker suicides? Maybe, maybe not. There doesn't appear to be evidence either way.
    Reply
  • mr_ripley - Wednesday, April 04, 2012 - link

    I posted the Wikipedia link for all the links in the reference section.

    Here's a more direct report: http://sacom.hk/wp-content/uploads/2010/11/report-...

    And a companion video: http://vimeo.com/17558439

    The video includes an interview of a survivor who is now paralyzed waist down.

    You can choose to patiently read and watch this report or just turn a blind eye like a lot of people do.
    Reply
  • PeteH - Thursday, April 05, 2012 - link

    I did read the report. It details unbelievably miserable working conditions in the factories, which I don't think anyone is disputing, and concludes that the way to change those conditions is to pressure the electronics companies making the bulk of the profits. None of the above comments dispute any of this. However it does not link working conditions to suicides among factory workers.

    And yet you continue to insist that there is a link, with no evidence to back it up. You make statements like, "over a hundred of them have committed suicide over the working conditions," "...scores of people killing themselves citing poor working conditions," and "there is no disputing the fact that these deaths are related to working conditions," but you provide only conjecture to back it up, no proof. When you do this, people start dismissing everything you say out of hand, even the things that are accurate. And worse than that, you run the risk that other people arguing for better working conditions will be tarred with the same brush. Look at what happened to Mike Daisey.

    Again, I'm not saying working conditions didn't contribute to the suicides, I'm saying there is no evidence one way or the other. Until you have evidence (in the form of suicide notes, higher suicide rates among factory workers, etc.) please stop. You may actually be hurting the very movement you're trying to help.
    Reply
  • mr_ripley - Thursday, April 05, 2012 - link

    Well, I'm sorry if it is inconvenient for you that these individuals have not said it in so many words. Should we expect them to?... Hey, by the way, I know you're going to kill yourself, but why don't you write down an explanation first so we can conclusively say what the reasons are. And even though you are under a lot of stress right now and are clearly not thinking straight, SPELL it out for me please...

    Evidence can come in different forms. Not all of it is directly incriminating, in which case the attention turns to the circumstances. So if these reports don't establish a reasonably clear correlation to you, then I am sorry, but I disagree.

    You can nitpick specific words in my comments and quibble about terms such as "evidence". But what are you accomplishing here? Are you justifying your own guilt over purchasing a device manufactured there? Are you an Apple or Foxconn mouthpiece? Do they pay you for spreading lies like "Foxconn factories are actually a good place to work" (which has been said in the previous comments)? Really, it is people like you who need to STOP.

    I'm not going to stop saying what I believe is right!! And unlike Mike Daisey, I have not fabricated any evidence. At most, you can complain that I have drawn incorrect conclusions, and I am saying the same about you.
    Reply
  • PeteH - Thursday, April 05, 2012 - link

    It's inconvenient for me that you are lying. You're the one saying that there are, "...scores of people killing themselves citing poor working conditions," not me. Either show me a case where those people who killed themselves cited poor working conditions as the reason, or cease claiming it is fact. You do damage to the movement that's trying to improve things.

    People hear the news reports about Mike Daisey lying to Ira Glass, and what they take away is not the specific lies (claiming to have witnessed things that had actually happened, but that he had only read about), it's that he's a liar and the story wasn't true. They dismiss the whole issue of poor working conditions out of hand. That's what you risk when you lie to get people to listen.
    Reply
  • mr_ripley - Thursday, April 05, 2012 - link

    Go ahead, nitpick on specific phrases and completely lose the meaning. But the problem is easily corrected. One can argue that citing something does not have to be done on paper as you would in a professional article. To me the workers "cite" the existence of a problem through their actions as words have failed them.

    Still, if you want me to rephrase, I'll say "scores of people killing themselves in the midst of poor working conditions..." Can you prove that this statement is inaccurate??

    And while you ask me for evidence, have you ever bothered to see if you can find evidence that these deaths are not related to working conditions? Prove it to me and I'll take back everything I said.
    Reply
  • PeteH - Thursday, April 05, 2012 - link

    I think you missed the places above where I stated, "I'm not saying working conditions didn't contribute to the suicides, I'm saying there is no evidence one way or the other." That was my whole point. And I did explicitly state that there's no disputing the poor working conditions. So no, I have no problem with your revised statement.

    However, I don't think what I did was nitpicking at all. Nitpicking would be pointing out that a score is 20, so scores would imply at least 40, and I've only seen documentation of 17 suicides (I haven't seen numbers pre-2010). But that's not what I did.
    Reply
  • shompa - Friday, March 30, 2012 - link

    Manufacturing employees?

    Look at the world! There are about 20 countries in the world that are democratic and have great living standards. It's just 100 years since these countries had child workers and harsh conditions.

    BTW. My country is among the "top countries" in the world. Still, we have the highest suicide rate in the world. Why are you not fighting against the Swedish government that drives thousands to kill themselves each year? We live like slaves here with 80% taxes.

    BTW. Do you care if other companies use HonHai/FoxConn or is it just Apple? Are you writing the same thing about Dell/HP and all other companies that use FoxConn?

    What have you done?
    Have you donated money to a Chinese worker? Or is trolling the only thing you manage to do?

    Reply
  • grave00 - Friday, March 30, 2012 - link

    I was curious about this statement. Could you elaborate. What inconsistency is there?

    "On the iPhone Apple has been entirely too lax about maintaining consistency between suppliers. If it wants to be taken seriously in this space Apple needs to ensure a consistent experience across all of its component vendors."
    Reply
  • loboracing - Friday, March 30, 2012 - link

    I remember an ad that touted something new to "see and touch". The retina screen is the "see" part but what about the "touch"? Was that just a gimmick meaning you could touch the screen, or is there some sort of different feel to the screen? Reply
  • name99 - Friday, March 30, 2012 - link

    Compared to the iPad 1, the screen is, IMHO, slightly smoother and a lot more oleophobic (i.e. it's a lot easier to clean off fingerprints by wiping a cloth over it). I never had an iPad 2 so I don't know if these improvements are new or came with the iPad 2. Reply
  • shompa - Friday, March 30, 2012 - link

    See = Apple TV
    Touch = iPad.

    But there were rumors about touch feedback from the screen. Probably in the next iPad.
    Reply
  • rakez - Friday, March 30, 2012 - link

    as long as they stick with 4:3 i will never buy it. Reply
  • darkcrayon - Friday, March 30, 2012 - link

    Conversely, that's one of the best things about the iPad. I can't see using a widescreen tablet in portrait mode; there is pretty much no popular content that works well there. On the other hand, 4:3 isn't as good for video, but the net effect is that the video is just smaller. I'll take properly positioned and scaled documents and smaller video over larger video and tiny documents. Reply
  • shompa - Friday, March 30, 2012 - link

    You know that 16:9 is only interesting if movies are the only thing you want to do.
    If you want to work on a tablet, 16:9 does not work. You can't use landscape mode and see enough of the screen when you type. The 4:3 screen is a bold move against tech nerds. I bet you are one of the tech nerds that screams when there are black bars on the sides of your 16:9 TV: "why aren't the TV shows using the whole screen?"
    Then stupid TV people listen to you and crop 4:3 TV shows to fit 16:9, cutting off a large part of the picture.

    The whole 16:9 debacle is actually a step backwards for the computing industry. Apple introduced widescreen displays in the early 2000s. Steve made a great choice in 16:10. In 2004 Apple introduced the 2560x1600 screen: 16:10. Today it's almost impossible to get a 16:10 screen. We all use TV LCDs for our computers = 16:9, 2560x1440. You lose 10% of the real estate.
    Reply
  • KoolAidMan1 - Saturday, March 31, 2012 - link

    4:3 is better for web browsing and applications on a screen that size, the vertical room in landscape is great. It also makes for a much better balanced feel when holding it in portrait mode.

    Do you also like 16:9 on a desktop monitor? I sure don't, not unless it's 27" 2560x1440.
    Reply
  • rakez - Saturday, March 31, 2012 - link

    it's hard to argue with isheep and their products designed by god. i am pretty sure i know what i like more than someone else would know what i like. that being said, once again i prefer not to have 4:3 on my tablet. to each his own. Reply
  • Formul - Saturday, March 31, 2012 - link

    starting with isheep and ending with "to each his own" ... you do love your bipolarity, don't you? Reply
  • rakez - Saturday, March 31, 2012 - link

    sounds like i hit a nerve. go ahead keep following the herd. in the meantime i will buy what i want. Reply
  • PeteH - Monday, April 02, 2012 - link

    Out of curiosity, what do you dislike about the 4:3 aspect ratio, and what's your preferred aspect ratio? Reply
  • kwamayze - Friday, March 30, 2012 - link

    WOW!!! What a nice review!!! Well done Reply
  • michalkaznowski - Saturday, March 31, 2012 - link

    Just to say, as always, a brilliant review. Your site is a must-read for any enthusiast here in the UK. I have also appreciated your wireless router reviews of the Airport Extreme Base Station. Only you have pointed out that it is a quantum leap in stability when compared to other makes of routers, something that a group of us had to find out the very hard, frustrating and long way!

    Michal
    Reply
  • x0rg - Saturday, March 31, 2012 - link

    I have a suggestion. Instead of taking pictures, you could take screenshots of these devices when you show how beautiful the screen is while working with Remote Desktop. Pictures taken with the camera look terrible, and the whole concept of taking pictures instead of screenshots seems unprofessional for a site like AnandTech. Things like focus, gamma, and aperture do not affect the picture quality when you just take a screenshot (Home+Power on iPad, you know that). Please replace these terrible pictures with screenshots. Thank you. Reply
  • slashbinslashbash - Sunday, April 01, 2012 - link

    You missed the whole point of that part of the review. The point of the photos was to show that the text over Remote Desktop is actually readable in real-world use. A screenshot wouldn't convey that information.

    Imagine this. Say you took an iPhone 4 screenshot of the same scene in Remote Desktop, and you posted it on the site. This would be a 640x960 pixel image. Text would be readable on a desktop monitor, but it would probably not be readable on the actual 3.5" iPhone screen. That is the question, and it applies equally to the iPad 3 review. A screenshot just shows you what pixels the iPad is showing; a photo shows you how those pixels look in real life.
    Reply
  • x0rg - Thursday, April 05, 2012 - link

    I agree, my bad. Reply
  • TekFanChris - Sunday, April 01, 2012 - link

    Thank you Anand and Vivek! You guys always take the iPad reviews to the next level. Comprehensive and complete.

    Cheers.
    Reply
  • Death666Angel - Monday, April 02, 2012 - link

    That kinda reminded me of the PS2 vs PC quality debates back in the day. :D Reply
  • josemonmaliakal - Monday, April 02, 2012 - link

    Hi, your article seems to be so good. And I have got something about the upcoming iPhone 5 of Apple here @ http://wp.me/p2gN9B-lq Reply
  • Wardawg - Thursday, April 05, 2012 - link

    You forget the new iPad just came out; 95% of the apps have not been upgraded for the new Retina Display yet. So all of these comparisons are very inaccurate. It doesn't matter that the iPad has higher res and 3.1 million pixels; if the app isn't upgraded for the Retina Display, it won't display as you would expect. I expect you guys to make a new article soon addressing these concerns of mine with this article. Reply
  • Noobuser45 - Monday, April 09, 2012 - link

    Anand, you're the only tech expert that I trust so I would love to have my mind put at ease with a definitive answer from you. Is it fine to charge the iPad whenever you want? Can I charge without running it down first? Can I charge for a while and unplug it before it has reached a full charge? Can I use it while it's charging? I just don't want to screw up the battery life. Reply
  • JasperJanssen - Sunday, April 15, 2012 - link

    The iPad uses the same battery technology as the iPhone and the MacBook Air: flat LiPo cells. As an owner of all three (iPad 1 and 3, iPhone 3G, 3GS, 4, 4S, MacBook Air 13" mid-2011) I can tell you that yes, this is fine. The absolute least degradation of your battery capacity would come from leaving it around 70% full and never using the device.

    Second best is to not let it drain down too far, say not under 20-30%. Third best from a capacity standpoint, but by far the best in user experience, is to not worry about it. All of my devices (iPhone in front, of course) drain to under thirty percent on a regular basis. The one I've had and used longest, the iPad (1st gen), hasn't had a perceptible decrease in battery life after two years, although I admit I haven't run actual tests.

    If you do manage to use it so much that the battery gets tired, a replacement out of warranty from Apple costs only $99 plus shipping, slightly more than DIY but a lot less hassle. Currently that service is available for all iPhones including the 2G, so it is not very likely to be unavailable during the useful life of an iPad.
    Reply
  • evolucion8 - Thursday, August 01, 2013 - link

    I love your articles and site. I wish I could say the same about your forums; most admins there just do whatever they feel like, threatening and offending people with private messages and turning your forums into a monkey sling cr*p fest. Reply
