POST A COMMENT

81 Comments

Back to Article

  • masimilianzo - Thursday, May 15, 2014 - link

    I would literally love a review of this device, and in general reviews of Chinese devices like Meizu, Nubia, Lenovo, Vivo and so on. You would need a Chinese editor, I guess. Reply
  • Mondozai - Thursday, May 15, 2014 - link

    Most smartphones released by Chinese OEMs are for the Chinese market only. Only the major players like Huawei, ZTE and others release their phones to other markets, and even then it is done selectively. All of them have many models which are only for China.

    Xiaomi themselves are slowly rolling out their international presence but not in the West, at least not yet.

    Having an editor covering the Chinese market only makes sense if most of the phones/tablets being sold there are also being sold in the Western world.
    Reply
  • alexvoda - Thursday, May 15, 2014 - link

    Not so.
    I believe Anandtech as a technology blog reports on the latest technology, not just technology available to some markets.
    Even if such devices are not available to the west it is important to have reviews of them in order to compare them to available ones.
    Reply
  • masimilianzo - Thursday, May 15, 2014 - link

    I agree, reviews of Chinese products would be great; since these brands are emerging at a very fast rate, they are very interesting. Reply
  • spider623 - Tuesday, June 03, 2014 - link

    You can actually get all Xiaomi products from their not-so-official dealer (official according to the international translation teams); check miuiandroid (UK, now xiaomi.eu) for the name of the dealer and his website. Reply
  • sciwizam - Thursday, May 15, 2014 - link

    From Engadget: http://www.engadget.com/gallery/xiaomi-tablet-laun...

    Looks like it's 2x faster than the Apple A7 in GFXBench 3.0, according to Xiaomi
    Reply
  • nathanddrews - Thursday, May 15, 2014 - link

    Impressive... that the A7 is that good compared to a company that's been doing graphics exclusively for 20 years. The A8 will be out this year as well and will likely close the gap a bit.

    I got so excited when I saw "AUO Optronics". I used to follow SED/FED technologies intently until almost every company backed off except AUO.
    Reply
  • ams23 - Thursday, May 15, 2014 - link

    NVIDIA and Xiaomi deserve some credit here. In a tablet that has roughly the same external dimensions and internal battery capacity as the iPad mini Retina, the MiPad with Tegra K1 has 2.3x better GPU performance! The Kepler GPU in Tegra K1 is not just the fastest and most power-efficient ultra mobile GPU, but also the most advanced ultra mobile GPU for graphics/compute workloads (full support for DX12, OpenGL 4.4, CUDA 6, etc.). This is arguably one of the most revolutionary ultra mobile graphics processors the world has ever seen. And it doesn't end here: the Tegra Erista Maxwell GPU will be ~1.6x faster than the Tegra K1 Kepler GPU at the same power consumption levels, and first silicon should start sampling sometime over the next few months. Reply
  • grahaman27 - Thursday, May 15, 2014 - link

    Can't wait for the Denver version to hit the market. I really would like to see NVIDIA integrate their LTE modem into the Maxwell version next year. Reply
  • Flunk - Thursday, May 15, 2014 - link

    You're looking at this wrong. It's very impressive that a new $240 tablet can outperform last year's top-end iPad Air in any benchmark. Reply
  • michael2k - Thursday, May 15, 2014 - link

    It is! The only real question is that when it hits the market in June, it will be competing against the almost year-old A7, so its real competition won't be the year-old iPad mini but the upcoming A8 iPad mini.

    In other words, the A7 iPad mini will likely drop in price to $299, which is much more competitive with the $240 MiPad, while the much more powerful A8 iPad mini will likely slot in at $399 and possibly be significantly faster.
    Reply
  • Morawka - Thursday, May 15, 2014 - link

    There's no way Apple will sell the iPad mini with A7 at $299. They might go down $50, or they will do away with the aluminum unibody and go polycarbonate plastic like the iPhone 5C. Apple is not in the habit of dropping the price of their tablets that much. The 1st-gen iPad mini only got a $50 reduction when the Retina mini was released.

    @nathanddrews Apple has also been making computers and devices for 20+ years, and when they designed the A7, it's not like they were starting from step 1. They bought CPU design firms that had decades of experience. It's still impressive, but your comment makes it sound like Apple designed the A7 with virtually 1-2 years of experience, when that is not true.

    I really hope this device makes it to the US market soon, because nobody else has a Retina-like tablet on the market, or even coming soon, with this kind of GPU in it.
    Reply
  • nathanddrews - Thursday, May 15, 2014 - link

    NVIDIA has specialized in only graphics for 20 years while Apple has had its hands in many different technologies. Apple has been learning GPUs quickly, but I would still completely expect K1 to decimate A7, and while a 2X fps advantage sounds great on paper, in reality it's a 2X fps advantage in one benchmark against a one year old chip with no knowledge of battery life/power usage. Don't get me wrong, both totally kick the crap out of my current tablet and I'd be happy to have either, but that's not the point.

    I need more data about K1 before I wet my pants over this, that's all.
    Reply
  • Bob Todd - Thursday, May 15, 2014 - link

    The GPU in the A7 isn't Apple's design anyway so the comparison here doesn't make much sense. The comparison would be between Nvidia and Imagination Technologies, which has been focused on mobile GPUs for quite a while. Reply
  • Spunjji - Friday, May 16, 2014 - link

    A 2x FPS increase on the same process in the same form factor is massive. Huge deal. You're downplaying something that doesn't really need to be downplayed. Reply
  • michael2k - Thursday, May 15, 2014 - link

    They've been dropping the price of their iPads every year since 2010. The issue is really by how much. So far they've set the floor at $299 for an almost 2 year old iPad mini or $399 for the 2 year old iPad with Retina.

    So the question is whether they release a non-Retina iPad with A7 to displace the iPad mini at $299, keep the iPad mini unchanged at $299, or move the iPad mini with A7 to the $299 price point (or possibly $329, $349, whatever) while using a plastic chassis to increase the profit margin.

    They might do all of those things; leave the iPad mini unchanged except for switching to a plastic enclosure and selling at the $299 pricepoint, as an example.
    Reply
  • Senor Mysterioso - Sunday, May 18, 2014 - link

    It's not that competitive, actually. Even if it drops to $299, Apple still blatantly gouges their customers on memory upgrades. A 64 GB iPad will be almost double the price of a 64 GB MiPad. It's disgusting that after so many years Apple still charges 200 bucks more for a 64 GB version when memory prices have fallen so much in that time. Xiaomi only charges 32 bucks more for the same upgrade. I would grab the 64 GB MiPad in a heartbeat if it was available here. Reply
  • Yojimbo - Thursday, May 15, 2014 - link

    You're leaving information out. "Outperform" does not do it justice. Granted it's their own chosen benchmark, but it's more than double. Semiconductor performance does not, in general, double in a year. Reply
  • alexvoda - Thursday, May 15, 2014 - link

    I wish someone would invest in SED/FED technology again and bring it to market.
    SED/FED screens would be the best of most worlds combining advantages of CRTs, LCDs and Plasmas.
    Meanwhile Sony gave up on OLED screens.
    Will we really be stuck with LCDs???
    Reply
  • Penti - Thursday, May 15, 2014 - link

    Sony doesn't have any screen manufacturing at all; they sold Sony Mobile Display to JDI along with the large panel plant they partnered on (S-LCD), and closed that transaction in early 2012. So they can't really invest in any R&D if they don't intend to run any manufacturing plants. JDI does some OLED, though. Sony basically gave up on everything there, so like most they have to source from the open market. Reply
  • nafhan - Thursday, May 15, 2014 - link

    Uhm... PowerVR has been building graphics chips for about 20 years, too. Reply
  • errorr - Thursday, May 15, 2014 - link

    You mean Imagination Technologies, which has been focused on low-power graphics since 1994. Qualcomm's Adreno is the evolution of ATI's embedded graphics product, which they purchased. ARM's Mali came from Falanx, a small GPU firm founded in 1998 in Norway and bought by ARM in 2006.

    All of these companies have been doing low-power graphics forever. It is on NV to prove their arch isn't a power-hog/hand-warmer, which I'm worried about.
    Reply
  • Joel Kleppinger - Thursday, May 15, 2014 - link

    That's an ignorant statement. PowerVR / Imagination Technologies is the graphics company that makes the IP for A7's graphics. They have been selling 3D graphics products since 1997 and graphics products in general since 1985. Nvidia was founded in 1993. They are roughly the same age and have roughly the same experience in graphics, though Nvidia's solutions were better for the desktop / high performance use so PowerVR focused on mobile. Reply
  • Spunjji - Friday, May 16, 2014 - link

    The A7 graphics are by a company that has been doing graphics exclusively for nearly as long, though. ;) Reply
  • TheJian - Friday, May 16, 2014 - link

    It will be facing a 20nm version of K1 by then too, so the gap won't change. NV's GPUs will start to take over now that they can put their desktop GPU into an SoC with each yearly rev. The NV driver team's experience and game developers' experience working with this hardware will start to show its teeth this year or next. M1 comes at the same time next year (probably a quad Denver with Maxwell at 20nm). NV pays to develop once for desktop but ends up with the SoC R&D pretty much done already. This is perfect timing, since Intel's payments, which have funded all the T1-T4 R&D yearly up to this point, run out in 2016, giving NV essentially four free revs of experience to get it right for K1. T1-T4 were losing money, just a way into the market to hold on until this year's K1/Denver/modem (still waiting for T4i to show, but the modem only got certified in Nov 2013). They won't need that now that the desktop GPU IS the SoC GPU, just with fewer SMX.

    NV is about to start taking Qcom share as gaming really kicks into high gear on mobile. I hope AMD gets a phone/tablet SoC out soon to compete, as they would have the same advantages as NV, and there is room in the market for both to easily add a billion or two in PROFIT to their sheets. Qcom makes $6.5B a year and there is a TON more in the market they could get too. If Google/Amazon/Samsung don't start using NV in more devices, they should put out more of their own ref designs or just sell them directly for good. Just don't piss off Google by forking; keep it straight Android. Also, ARM will be stealing desktop share as Denver and all the others come with in-house CPUs, just as they have already done on the really low end in notebooks (21% gone to ARM already). I wouldn't be surprised to see 10-20% of desktops go to ARM by Xmas 2015 (showing up in the 2016 Q1 report, probably, after Xmas sales) in the form of a 100-500W box with a GPU from NV in the high-end models (a 100-300W general internet/email box, I guess, but discrete GPUs in the 400-500W boxes for gamers). How fast would an SoC run with a PC-CPU-type heatsink on it and 100W to work with instead of 6-10W? Good enough to take out $300 Dell desktops with no Win8 license, no Intel CPU, etc. So AMD/NV both have a shot at Intel's $10-12B profits also. ARM IP is good enough that AMD could take share back via A57s etc. on the lower end even with no custom cores.

    We need more money at AMD/NV. GPUs, like it or not, are becoming more and more important in everything we watch, play, etc. There were 100 companies testing NV GRID two quarters ago. Last quarter it doubled to 200. This quarter it's 600 companies testing GRID. It's clear we're heading to GRID for lots of work (movies, game dev, etc. all require rendering farms today), so both AMD/NV have a great growth area, or they wouldn't be getting so many people to test this stuff. Clearly they want it.

    I couldn't imagine how happy engineers would be if they were waiting for a 5-hour drawing/model to render with a single $4000-6000 Quadro etc., and then they get access to a GPU farm that is 1000x faster and does it in seconds or minutes. It is much easier to maintain a single box with the hardware and all the pro app licenses than it is thousands of users, so I can see why they want this and would test it. Everybody gets faster rendering, and management gets a break.

    2x faster and you think A7 is impressive? It was when it came out, yes. But it's not a company that is new. IMG.L, the people that make the GPU in nearly all Apple mobile devices (all? - Imagination), have been in the business as long as NV, so really it's a total failure. They were driven OUT of desktops by NV/AMD/Intel. They ran to mobile, and now you're seeing the kings moving to mobile, and half as fast isn't a good result in that light, right? Founded in 1985, so longer than NV (1993), but losing. You're mistakenly thinking Apple makes the GPU themselves. Even if they do some of the work, they have a 30-year GPU company backing them with help, right? They don't call it an Apple GPU.

    https://en.wikipedia.org/wiki/Imagination_Technolo...
    $12 mil in profits last year vs. ~$500 mil for NV, and IMG.L is in every Apple device. Management can't figure out how to price their product right if you're in the most expensive devices from the company making the most money and you're still broke. They had to borrow money to buy MIPS. NV's licensing of their IP now will take its toll soon; IMG.L had no real competition until NV started licensing. I picture Samsung/Apple licensing NV IP for K1/M1/V1 at some point if they are not able to field their own GPU that can beat them. IMG.L is dead unless someone buys them. It's just a question of how long it takes for people to get out chips on NV IP. I suspect that won't happen until more games get ported to Android/Tegra and we see what K1 games can do. By Xmas 2015 we'll see people start to realize mobile is a great replacement for consoles. Raise your hand if you think Portal and Half-Life 2 were great games? The audience on 1.2 billion mobile units sold yearly will think the same thing as all of us. Get a few dozen more great titles over there and it will snowball, followed by NEW gaming IP funded by the quick, easy ports to the huge mobile audience. People will want desktop graphics in their phone/tablet, and Apple, Samsung, etc. will give them whatever sells best.
    Reply
  • name99 - Thursday, May 15, 2014 - link

    By itself this (2x faster than Apple A7) is not especially interesting. It's trivial to make a faster GPU --- if you are willing to spend the power.
    The complaint by the few who have had the chance to experiment with K1 is not that it is slow, it's that it's a power hog, unbalanced for the target market. Simply saying that it's 2x as fast without battery life numbers is not helpful.

    I find it significant that in the entire Engadget gallery there is no photo giving any sort of battery life information. The whole thing looks exactly like an Apple presentation (no surprise) except it's missing two pieces --- the part at the beginning where Apple tells us they have sold their 10 billionth song, 1 billionth movie, have upgraded iOS 7 to 97% of the customer base etc etc; and the part during the presentation of a new device where they tell us it can watch movies for ten hours, listen to music for two weeks, etc etc.
    Make of that what you will...
    Reply
  • fivefeet8 - Thursday, May 15, 2014 - link

    That's funny, I noticed the battery life information in one of those slides. You can extrapolate what they mean from them fairly easily or go to another site where they did it for you. Reply
  • name99 - Thursday, May 15, 2014 - link

    My bad. That's what comes of not being able to read Chinese!
    Those are impressive numbers. Well done, NV.
    Reply
  • skypacer - Thursday, May 15, 2014 - link

    Battery life information for the MiPad (from Xiaomi.com):
    6700mAh battery,
    11 hours online-video,
    17 hours reading ebooks,
    86 hours music,
    1300 hours idle.
    Reply
  • Spunjji - Friday, May 16, 2014 - link

    I am legitimately excited about this product! Reply
  • GC2:CS - Monday, May 19, 2014 - link

    None of these really stresses the SoC, and I would like them to give me a solid battery run time that holds up no matter what I do. Reply
  • ams23 - Thursday, May 15, 2014 - link

    You aren't making any sense. Xiaomi clearly had a slide describing battery life of MiPad (80 hours music listening, 17 hours movie watching, and 11 hours of web browsing): http://www.engadget.com/gallery/xiaomi-tablet-laun...

    And no, Tegra K1 is obviously not a power-hungry beast. The GPU is at least 1.5x more power efficient than the A7 GPU. The app processor + memory power consumption on a mobile-optimized platform should be no more than ~5 W with very compute-heavy workloads that push all CUDA cores. Remember that this is >300 GFLOPS of GPU throughput, which is unheard of so far in the ultra mobile space!
    Reply
  • grahaman27 - Thursday, May 15, 2014 - link

    nice catch! And thanks for the translation. Reply
  • Anders CT - Thursday, May 15, 2014 - link

    I don't think the K1 is a power hog exactly. Mine will draw roughly 6 watts after power conversion, factory-clocked at 2.3 GHz with all four cores running and the GPU saturated with compute (doing around 350 GFLOPS; the GPU appears to run at 900 MHz, but I could be wrong). That's pretty decent I think, and clocked more moderately it should easily come down below 2 watts while still performing very well. Unfortunately it's not straightforward to reduce voltage on the Jetson board. Reply
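    The ~350 GFLOPS figure above lines up with Kepler's theoretical peak. A quick sanity check, assuming Tegra K1's commonly quoted 192 CUDA cores and counting a fused multiply-add as two floating-point ops (a back-of-the-envelope sketch, not NVIDIA's own accounting):

```python
# Peak single-precision throughput: cores x ops-per-clock x clock (GHz).
# Each Kepler CUDA core can retire one FMA (2 floating-point ops) per cycle.
def peak_gflops(cores, clock_ghz, ops_per_clock=2):
    return cores * ops_per_clock * clock_ghz

# 192 CUDA cores at the ~900 MHz observed above:
print(peak_gflops(192, 0.9))  # ~345.6 GFLOPS, close to the quoted ~350
```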
  • GC2:CS - Monday, May 19, 2014 - link

    6W ? That's more than enough to consider that a power hog. Reply
  • Anders CT - Monday, May 19, 2014 - link

    @ GC2:CS

    I don't think so. Most mobile SoCs will use 6-8 watts when clocked and volted to their design limits with throttling disabled. The Tegra 3 and Snapdragon S4 boards I have worked with have been more power-hungry than the Jetson TK1.
    Reply
  • supgk - Thursday, May 15, 2014 - link

    You're all taking a benchmark from a Chinese company of little reputation way too seriously. Especially with NVIDIA in the mix...

    Let's see how it fares in real-world testing.
    Reply
  • deppman - Thursday, May 15, 2014 - link

    I have spent weeks with the Jetson TK1 board and have analyzed its power fairly extensively. I can confirm NVIDIA's power claims range from accurate to conservative. And my estimates for this tablet, made prior to knowing of the slide, were almost spot-on.

    ===========
    I recently spent some quality time with a multimeter and did all sorts of power assessments on the Jetson TK1. The "shocking" number of watts I calculated for SOC + RAM:

    1. Idle: 0.6W
    2. Web browsing, office apps: 2.19W
    3. Demanding gaming (xonotic): 4.74W.

    This tablet has a 6700 mAh battery. That's close enough to the 6500 mAh of my earlier analysis provided below.

    A 6500 mAh battery is enough juice for 36.2 hours of web browsing, or 16.6 hours of gaming *for the SOC + RAM alone*.

    Now of course a tablet requires a screen and other bits like storage. Figure with a high res screen like that and sound we might see about 3W system overhead. So these estimates then look like:

    Web browsing: 15hr
    Gaming: 10.2hr

    If the screen draws another watt or so, we might see 75% of those numbers:

    Web browsing: 11.3hr
    Gaming: 7.5hr

    I expect those last numbers are a reasonable estimate for this tablet.
    ===========
    Reply
  • grahaman27 - Thursday, May 15, 2014 - link

    Very interesting numbers. Did you take into account the clock speed differences? I would like to see what the TK1 pulls at 2.2 GHz, which looks like what NVIDIA was shooting for in tablets. Reply
  • deppman - Thursday, May 15, 2014 - link

    I did not take into account frequency scaling or the difference in clock rates. However, in this case there was only a 0.1 GHz difference. The analysis was originally done for the Mocha tablet. Reply
  • Spunjji - Friday, May 16, 2014 - link

    Thanks for that. Your actual experienced input is an unusually valuable insight in a comments thread. Reply
  • GC2:CS - Monday, May 19, 2014 - link

    If the power consumption numbers you calculated are accurate, it looks bad for Tegra... But the problem is not the power consumption of the SoC itself, it's those battery lifetime numbers.

    Honestly, I don't want to embarrass you here on a forum, but it looks like you have problems with the basics of physics. So let me explain... Power drawn from a battery is the electrical current multiplied by the voltage (P = U*I). What that means is that you can't get meaningful battery run-time numbers by dividing the capacity in mAh by the power draw in watts; you need either the current flowing through your electrical machine (Tegra in this case) or the battery capacity in Wh.

    So how to fix this: I know that 99.9 percent of batteries in tablets have a 3.7 V nominal voltage. That means the energy they can deliver from full charge to zero is this voltage multiplied by the capacity. So we have a 6.5 Ah battery (6500 mAh) at an estimated 3.7 V, which works out to about 24.05 Wh.

    Now we can divide this number by the power consumption of your electrical machine, and we get about 40 hours of idling, 10 hours 59 minutes of web browsing, and 5 hours 5 minutes of gaming *just for the SoC and RAM alone*. Yeah, it's very bad, especially if we compare this to the incredible efficiency of the A7. And I don't think NVIDIA will take over with this kind of power consumption like it was written in a very long post above.
    Reply
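    The conversion being argued over here is easy to write down. A minimal sketch using the assumptions from this sub-thread (3.7 V nominal cell voltage, the 6500 mAh figure from the earlier analysis, and the measured SoC + RAM wattages):

```python
def battery_energy_wh(capacity_mah, nominal_v=3.7):
    # Stored energy (Wh) = capacity (Ah) x nominal voltage (V).
    return capacity_mah / 1000 * nominal_v

def runtime_hours(capacity_mah, draw_w, nominal_v=3.7):
    # Runtime = stored energy (Wh) / average draw (W).
    return battery_energy_wh(capacity_mah, nominal_v) / draw_w

# 6500 mAh at 3.7 V is about 24.05 Wh; divide by the measured draws:
print(round(runtime_hours(6500, 0.60), 1))  # idle:         ~40.1 h
print(round(runtime_hours(6500, 2.19), 1))  # web browsing: ~11.0 h
print(round(runtime_hours(6500, 4.74), 1))  # gaming:       ~5.1 h
```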
  • jjj - Thursday, May 15, 2014 - link

    "Today Xiaomi introduced its latest tablet: the MiPad"

    It's their first tablet, not their latest. Also worth mentioning that the WiFi ac is supposed to be 2x2, and the benchmarks they shared look interesting. Too bad about the moronic aspect ratio.
    Reply
  • kwrzesien - Thursday, May 15, 2014 - link

    Holy Batman! A 48GB storage upgrade for $32! Please Apple, bring us some love in the next round of devices. Reply
  • Flunk - Thursday, May 15, 2014 - link

    Won't happen until the market demands it. As long as people pay the stupid prices, they stay stupid. Reply
  • supgk - Thursday, May 15, 2014 - link

    But can you run apps from that storage? Reply
  • Morawka - Thursday, May 15, 2014 - link

    Yes you can. KitKat supports a "move to SD card" feature which moves all of the app's assets to the card and runs them from it. Reply
  • tviceman - Thursday, May 15, 2014 - link

    I so wish this would come stateside. :( Reply
  • Devo2007 - Thursday, May 15, 2014 - link

    nVIDIA introduced Tegra K1 at CES 2014, not 2013. Reply
  • Brandon Chester - Thursday, May 15, 2014 - link

    Just fixed that up, thanks for pointing that out. Reply
  • Arnulf - Thursday, May 15, 2014 - link

    Apple users get shafted with pretty much every mobile gadget purchase. Isn't this enough love for them? Reply
  • steven75 - Thursday, May 15, 2014 - link

    They get best resale value, best support (including in person), etc. Oh wait, you're trolling. Reply
  • phoenix_rizzen - Thursday, May 15, 2014 - link

    Will be interesting to see if this gets imported to North America, or offered for sale to NAs. And how long until someone gets plain-jane Android running on it.

    Could this, finally, be an nVidia-based device worth getting?

    Cautiously optimistic; experience with Tegra2 and Tegra3 doesn't leave me very hopeful at this point.
    Reply
  • grahaman27 - Thursday, May 15, 2014 - link

    What was wrong with the Tegra 2? What device did you have? Reply
  • frostyfiredude - Thursday, May 15, 2014 - link

    It was a janky mess with the 720p displays found in tablets, and it was missing the NEON FP unit, so certain applications simply didn't work or were slow. Tegra 3 was a definite improvement, but the low memory bandwidth made it inconsistent too, while it ate power like mad on its 40nm process. Reply
  • grahaman27 - Thursday, May 15, 2014 - link

    Tablets were a janky mess in general at that time. The Atrix 4G was a very fast phone at the time and had great battery life. Reply
  • Impulses - Thursday, May 15, 2014 - link

    Honeycomb wasn't terribly janky, and many of those Tegra 2 tablets later saw ICS... The NEON issue and lack of Skype support for many months (amongst other things) fell mostly on NV's shoulders tho, as did the lack of drivers for future OS revisions (the Atrix suffered particularly badly there compared to the other dual-core phones that followed). Reply
  • tviceman - Thursday, May 15, 2014 - link

    Hey Google, offer Xiaomi North American distribution and rebrand this here as the Nexus 8. Reply
  • grahaman27 - Thursday, May 15, 2014 - link

    With a 1440p display and a 16:9 aspect ratio. Reply
  • steven75 - Thursday, May 15, 2014 - link

    So they can be (rightly) sued into oblivion? Reply
  • ams23 - Thursday, May 15, 2014 - link

    While I am skeptical about this specific tablet coming to North America, what exactly could they be sued for? The MiPad's exterior color scheme and materials are totally different than any iPad. The SoC hardware, camera hardware, and underlying operating system are totally different than any iPad. The screen size and screen resolution are the same, but that is hardly something that can be considered exclusive technology. The custom MI user interface is somewhat like the iPad's, but that is just customized MI icons sitting on the main screen, and it has been used for ages on their MI phones. Reply
  • Penti - Thursday, May 15, 2014 - link

    The only thing that is clone-like is that they use the same screen as the iPad mini, or at least a related one, but that's obviously because the manufacturer hasn't gone into any exclusive contract with Apple there. Others have done similar in Europe, and there is no reason why they would have any trouble there. It's not like Apple was the first with a 4:3 slate (Tablet PCs and LCD-based e-readers had them), for that matter; they existed before Apple sought any design patents. Neither did Samsung infringe on anything in that way. It's not like China ignores IP either; their companies are heavily invested in R&D and participate in the development of tech/standards, which they, like any of their competitors, protect with patents. Reply
  • Morawka - Thursday, May 15, 2014 - link

    Speakers on the back: Apple does not do that
    MicroSD: Apple does not have that either
    Polycarbonate tablet: everyone does that
    Camera in the left side corner: wow, you gonna patent camera placement?
    Pastel colors: you gonna patent colors too?
    Pastel UI: again with the colors
    Clock on lockscreen: dammit, stop copying us, that was an ingenious idea
    Same panel resolution: what, you expect us to make a custom panel just so we don't match your panel resolution?

    Seriously though, they are noticing what the market likes and then putting it into their products.

    They can sue but it will be fruitless (no pun intended).

    I really hope this makes Apple reduce NAND upgrade prices tho. It probably will if it comes to the US, because at these prices and performance, Apple will turn into a fashion-statement product like Beats by Dre.

    The only thing they have going for them is aluminium and a closed, locked-down ecosystem, and the latter will probably lead to their demise.
    Reply
  • Abelard - Thursday, May 15, 2014 - link

    Possibly the most blatant ripoff yet! Look closely at the screen and you can see a small Control Center style arrow ^ at the bottom, and something that looks like Apple's distinctive Home button underneath. Not to mention the pastel color palette, narrow typeface, camera at top left corner, etc. The Tegra is interesting though. Reply
  • name99 - Thursday, May 15, 2014 - link

    Except that it comes without all the bitching regarding Apple's use of "plastic" and "girly colors".

    Like so many other things, it's the end of the world when Apple does something new (no SD card, built-in battery, fingerprint reader...) but not worth mentioning as soon as someone else copies them.
    Reply
  • Morawka - Thursday, May 15, 2014 - link

    It's a design philosophy. You could say they are not copying, they are just doing what the market demands. Reply
  • ams23 - Thursday, May 15, 2014 - link

    Obviously the iPad mini retina greatly influenced the MiPad design (especially exterior dimensions, screen, and resolution). That said, there are many differences too. The MiPad is completely different than iPad mini retina when it comes to CPU, GPU, underlying OS, choice of colors, exterior finish, SD card slot, cost for additional built-in internal storage, etc. Reply
  • Anders CT - Friday, May 16, 2014 - link

    It is in no way a rip off. It uses the same panel, hence the same form factor. Nothing more. Reply
  • UpSpin - Saturday, May 17, 2014 - link

    Arrow: Search for 'kit kat lock screen' and you'll see the arrow, indication for Google Now, too.
    Home Button: Well, people seem to understand the rectangle as home, so why shouldn't others use the same symbol? I mean, someone also had to introduce the house as a home symbol.
    Color: Honestly, colors? How can someone copy colors? And why don't you say that Apple copied Sony or Nokia or any other company which offered their devices in a plethora of different colors, including the ones used by Apple? (Sony VAIO P, VAIO C, Acer Aspire One, Nokia Lumia, ...)
    Narrow typeface: You mean something like Arial Narrow or any other narrow font? Brilliant, totally new. Never seen before. First of its kind.
    Camera: Ok, that's even more brilliant. Never ever thought of placing a camera not in the middle. That's genius.

    Yes, it's similar to the iPad, that's obvious. But have they copied it? No. And is it bad that it looks similar? No.
    It's the same reason cars from different manufacturers look pretty similar, too: people like the current design trend.
    And the same reason pop music sounds similar; music of a specific genre sounds alike, but it's still different music. No one can patent rock, or dubstep, or whatever Apple lawyers/sheep think they could/should patent.
    Reply
  • Achtung_BG - Thursday, May 15, 2014 - link

    Maybe Nexus 8 or Tegra Note 2 will come with 4GB RAM.
    http://blog.gsmarena.com/nvidia-tegra-k1-benchmark...
    Reply
  • andrewwilliams1985 - Thursday, May 15, 2014 - link

    I dont want this http://bit.ly/1myG31g Reply
  • Penti - Thursday, May 15, 2014 - link

    Nice to see them expand their business. While I'm a bit critical of their MIUI fork-like Android OS, it sells hugely in China, where they basically have to have a fork; they do seem to lag a bit on the Android version, but they do great in China, and last quarter they were one of the top ten smartphone makers in the world. Spec-wise it seems fine; OpenGL ES 3.0 means it's kinda future-proof, though I don't know how good the NVIDIA BSP/Android support really is yet. Styling reminds me of the Asus MeMO Pad, but mostly because they also sold pink ones :). Looks like they source the 7.9" "Retina" screen Apple uses here; they even call it that. They might as well have gone with something else there. I don't get why they use physical/capacitive buttons on this form factor either. Reply
  • errorr - Thursday, May 15, 2014 - link

    What I'm curious about is battery life. Every other mobile GPU architecture does some form of tile-based deferred rendering, versus the immediate-mode rendering of Kepler. (Technically, Adreno can do immediate mode when asked, or a heuristic switches modes dynamically.)

    While TBDR has some drawbacks, it is way more efficient in terms of memory reads and writes, as well as in fully utilizing the whole architecture at once. (The deferred part means the GPU waits until an entire scene is ready before it renders, so it can optimize resource scheduling, which also reduces power. It also makes certain types of AA virtually free.)

    The problem Tegra could have is that all that extra bandwidth uses a lot of power. That may not be as big of a deal on the desktop, where power constraints are loose and bandwidth is cheap. But on a mobile platform you are going to use a lot more power no matter what.

    One solution would be a huge cache with a really good caching algorithm, like the Apple A7 uses to reduce reads from RAM, but that doesn't seem economically feasible for Tegra.

    So is Kepler power efficient enough to counteract its architectural disadvantages? Kepler was designed without the absolute need to minimize those pesky memory reads and writes.

    Also, it may depend on app quality, as I know a lot of people are unaware of good coding practices for mobile architectures, and I remember ARM complaining about how unrealistic the benchmarks meant to stress mobile GPUs are.

    Will be interested, but I have serious doubts about the power efficiency of Kepler even if it is more powerful overall.
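    To make the bandwidth point concrete, here's a toy Python sketch of the traffic difference between immediate-mode rendering (IMR) and TBDR. All names and numbers are made up for illustration; this isn't any real GPU's behavior, just the accounting idea:

```python
# Toy model: count external (off-chip) memory accesses for four opaque,
# overlapping full-screen draws under IMR vs. TBDR.

def imr_traffic(draws, num_pixels):
    """IMR: every draw depth-tests and writes against a framebuffer in RAM."""
    depth = [float("inf")] * num_pixels
    traffic = 0
    for d in draws:                      # each draw covers every pixel here
        for p in range(num_pixels):
            traffic += 1                 # read depth from RAM
            if d < depth[p]:
                depth[p] = d
                traffic += 2             # write depth + color back to RAM
    return traffic

def tbdr_traffic(draws, num_pixels):
    """TBDR: visibility is resolved in on-chip tile memory, then each
    pixel's final color is written to RAM exactly once."""
    # Binning and on-chip depth testing cost no external bandwidth
    # in this simplified model.
    return num_pixels

draws = [5.0, 3.0, 4.0, 1.0]             # depths of four overlapping layers
print(imr_traffic(draws, 64))            # 640 external accesses
print(tbdr_traffic(draws, 64))           # 64 external accesses
```

    Even in this tiny model, deferring until the scene is complete turns per-draw read/write traffic into a single final write per pixel, which is where the power saving comes from.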
    Reply
  • ams23 - Thursday, May 15, 2014 - link

    The Kepler.M GPU in Tegra K1 is ~1.5x more power efficient than the best ultra-mobile GPUs available today, on the same fab process node to boot: http://cdn.pcper.com/files/imagecache/article_max_... . According to the Tegra K1 whitepaper, this GPU has many features to help improve power efficiency, including hierarchical on-chip Z-cull, primitive culling, early Z culling, texture/Z/color compression, and a relatively large unified L2 cache. And on a side note, Adreno GPUs are not tile-based deferred renderers. Reply
  • fivefeet8 - Friday, May 16, 2014 - link

    The device is not as power hungry as you seem to think; battery life is fine. Reply
  • jasonkcarter - Thursday, May 15, 2014 - link

    I bet Apple's already got their lawyers primed if this thing becomes successful. This thing is a lawsuit waiting to happen. Seriously, they couldn't think of a name for the thing that didn't end in "iPad"? Reply
  • skypacer - Friday, May 16, 2014 - link

    Seriously, the "MiPad" has no official name in English yet; its name in Chinese is Xiaomi Pad, which is pretty natural, since it's a pad from Xiaomi.com.
    BTW, xiaomi means millet in Chinese.
    Reply
  • TheJian - Wednesday, May 21, 2014 - link

    http://arstechnica.com/gaming/2014/03/opengl-es-3-...
    It's Kepler; it should support OpenGL ES 3.1 without issue. They need to update their spec sheet.
    "While some mobile GPUs, such as NVIDIA's Tegra K1, will support 3.1 with nothing more than driver updates, other GPU families might need new chips."

    I don't believe there is anything in ES 3.1 that isn't already in OpenGL 4.4, which is already supported here on K1. ES comes from its bigger brother: ES 3.0 took from OpenGL 3.1, and it seems ES 3.1 probably takes from OpenGL 4.4.
    http://www.weand.com/news/2014-04-22/Nvidia_second...
    You can translate that from Chinese to English at Google etc. They say:
    "Earlier, at the CES 2014 conference, Nvidia released its latest-generation processor, the Tegra K1. It comes in 32- and 64-bit versions, both with GPUs built from 192 Kepler CUDA cores, and the chip is the first in the history of the mobile space to support the OpenGL ES 3.1 standard."

    That was last month.
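    For what it's worth, the runtime check an app does for this is just a parse of the driver's GL_VERSION string; here's a minimal Python sketch of that comparison (the driver strings below are made-up examples, not real K1 output):

```python
import re

def parse_gles_version(version_string):
    """Extract (major, minor) from a GL_VERSION string of the form
    'OpenGL ES <major>.<minor> <vendor info>'."""
    m = re.match(r"OpenGL ES (\d+)\.(\d+)", version_string)
    if not m:
        raise ValueError("not an OpenGL ES version string")
    return int(m.group(1)), int(m.group(2))

def supports_es31(version_string):
    # Tuple comparison: (3, 1) >= (3, 1) but (3, 0) is not.
    return parse_gles_version(version_string) >= (3, 1)

print(supports_es31("OpenGL ES 3.1 NVIDIA 343.00"))  # True
print(supports_es31("OpenGL ES 3.0 V@66.0"))         # False
```

    So a driver update that bumps the reported version string is all an ES 3.1-capable chip like K1 needs for apps to pick it up.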
    Reply
  • Brakken - Friday, May 30, 2014 - link

    Xiaomi is great! I love how honest they are about their total lack of creativity and outright theft. They even do blue T-shirts and Apple fonts and effects in their very own keynotes. SO innovative! Reply
