Intel this week confirmed that it has decided to close down its New Devices Group, which developed various wearable electronics, such as smartwatches, health/fitness monitors, smart/AR glasses, and so on. The group was created five years ago by then-incoming CEO Brian Krzanich, who wanted to ensure that Intel's chips would be inside millions of emerging devices. While wearables have become relatively popular, their adoption remains far below that of smartphones. Meanwhile, wearables made by Intel have never been among the market's bestsellers. Thus, the chip giant is pulling the plug.

Over the five-year history of NDG, Intel made two significant acquisitions to bring necessary expertise to the group: the company took over Basis (a maker of fitness watches) in 2014 and Recon (a maker of wearable heads-up displays) in 2015. Most recently, Intel's NDG showcased its Vaunt (aka Superlight) smart glasses, which looked like "normal" glasses yet used laser beams to project information directly onto the wearer's retina, justifying their "smart" moniker. While NDG had cutting-edge technologies, the group never managed to produce a truly popular product. Moreover, when problems with one of its Basis smartwatches showed up on a limited number of devices, Intel preferred to halt sales and refund customers rather than fix the problems and replace the faulty units.

In the second half of 2015, Intel folded the New Devices Group into the New Technology Group, a signal that the company was hardly satisfied with NDG's performance. Since then, we have seen multiple reports about layoffs at Intel's NDG and have heard multiple rumors that the unit would be axed. Because making finished devices has never come naturally to Intel, it was only a matter of time before the chip giant pulled the plug, and it apparently decided to do so this month.

Since Intel's New Technology Group remains in place, all of Intel's ongoing research projects for smart devices remain intact. More importantly, Intel's other divisions continue to work on products for wearables and ultra-low-power devices that will become widespread in the looming 5G era. The only products that are not going to see the light of day are those designed by Intel's New Devices Group (e.g., the Vaunt glasses). Considering that none of NDG's products became popular, it is unclear whether they will be missed.

It is noteworthy that Intel canceled its Galileo, Joule, and Edison product lines, aimed at the Internet of Things, last summer.

Source: CNBC


  • HStewart - Friday, April 20, 2018 - link

    It sounds like Intel is trimming some fat - probably a non-productive group. But I had never actually heard of this product. Intel does a lot of R&D, and some things just don't make it. Possibly a newer technology, like the FPGA group's technology, is a better fit.
  • Wilco1 - Saturday, April 21, 2018 - link

    Well, this is another huge market lost. Intel went in with a slightly modified 80486 - however, x86 is just too complex, large, slow, and power hungry. We're starting to see this now that detailed comparisons with Arm servers are available.
  • Ryan Smith - Sunday, April 22, 2018 - link

    Administrative note: a user has been banned for bigoted comments.

    This is a tech site, not a politics site, so please leave the latter at the door.
  • Hifihedgehog - Sunday, April 22, 2018 - link

    It was all a dog and pony show, anyway, merely to appease ignorant investors, since their bread and butter has been stuck on 14nm with an outdated architecture for far too long now.
  • wumpus - Friday, April 20, 2018 - link

    Why do I get the feeling that these companies had marginal products (not bad ones, but certainly niches that Apple, Samsung, and the like couldn't be bothered with) to begin with, and that once Intel replaced their ARM chips with Atom, all hope was lost?

    Younger readers might not be familiar with the horrible kludge that is x86, but the whole architecture is warts on top of warts, run by a modern OoO execution engine that powers through all the problems. The "x86 penalty" might be only a few mm^2 and less than a watt of power on a desktop or laptop chip, but it has knocked Intel clean out of the running for phones, and wearables are even more hopeless.
  • HardwareDufus - Friday, April 20, 2018 - link

    True... and then we had 20-bit addressing. Now x64 uses 40-bit addressing...
    Intel tried to leave it behind with the EPIC architecture of the Itanium, but it's really hard to get a whole generation of programmers to write explicitly parallel code.

    But then, of course, the CISC instruction set paired with the x86/x64 architectures has the advantage of pretty decent IPC when it's humming along, compared to its ARM counterparts. But as ARM matures, and good productivity apps (accounting software, word processors, spreadsheets, databases, CAD) are created from the ground up and compiled to take advantage of native ARM architectures and instruction sets, we will see that IPC advantage narrow.
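The 20-bit addressing mentioned above refers to real-mode segmentation on the 8086: a 16-bit segment value is shifted left four bits and added to a 16-bit offset, yielding a 20-bit physical address and the classic 1 MB limit. A minimal sketch in Python (the function name is mine, for illustration):

```python
def real_mode_address(segment: int, offset: int) -> int:
    """Compute an 8086 real-mode physical address.

    The 16-bit segment is shifted left by 4 bits (i.e. multiplied
    by 16) and added to the 16-bit offset, producing a 20-bit
    address in 0x00000-0xFFFFF -- the classic 1 MB limit.
    """
    assert 0 <= segment <= 0xFFFF and 0 <= offset <= 0xFFFF
    # Mask to 20 bits: addresses wrap past 1 MB, as on a real 8086.
    return ((segment << 4) + offset) & 0xFFFFF

# Example: segment 0xF000, offset 0xFFF0 is the 8086 reset vector.
print(hex(real_mode_address(0xF000, 0xFFF0)))  # -> 0xffff0
```

Note that many segment:offset pairs alias the same physical address (e.g. 0x0000:0x0010 and 0x0001:0x0000), one of the quirks that made the scheme awkward to program against.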
  • FunBunny2 - Friday, April 20, 2018 - link

    "really hard to get a whole generation of programmers to write explicitely parallel code."

    well, it's really hard to get any bunch of programmers to create explicitly parallel user-space programs.
  • HStewart - Friday, April 20, 2018 - link

    "Younger readers might not be familiar with the horrible kludge that is x86, but the whole architecture is warts on top of warts, run by a modern OoO execution engine that powers through all the problems"

    As a person with 30 years of x86 development experience, I feel offended that someone would be naïve enough to state that the x86 architecture is a kludge. It is just different from ARM - it's basically RISC (ARM) vs. CISC (x86). RISC architectures like ARM have been around for a long time, and yes, taking the x86 architecture to 64-bit may be considered a kludge - but in other ways it is a natural evolution. Will we need to go to, say, 128-bit or 256-bit CPUs? I am not sure, but in the days before 32-bit, people were not sure we would ever need more than a megabyte of memory, either. As the technology advances, first the hardware evolves and then the software follows.

    Most people are not knowledgeable about other parts of the x86 CPU that can make a huge difference. In the early days, besides x86 going to 32-bit, the big change was the introduction of virtual 8086 mode, which allowed early versions of Windows to run virtual DOS sessions, for example. I see going to 64-bit as a natural evolution, allowing access to more than 4 GB of memory.

    But there is more to a CPU than just expanded memory addressing. There are extensions - now especially AVX2 and AVX-512, with vector array calculations - that a RISC design like ARM would take many instructions to handle.

    The big difference between x86 (CISC) and ARM (RISC) comes down to the basics of CISC vs. RISC. In CISC you can have a single instruction that would require many instructions in RISC - but RISC has the advantage that execution can be split up and parallelized better, because the instructions are smaller. However, at least for Intel - and I believe also for AMD - the larger CISC instructions are broken down into smaller micro-ops, which can then be executed in parallel just as well as RISC instructions.

    The problem today is that people think of their phones so much. Yes, larger instructions are not the best thing for a phone - but companies like Intel and AMD realize that customers increasingly want smaller and lighter machines. Thus Intel came out with the low-power Atom - but I believe Atom was a testing ground for getting the whole computer onto a chip, and this has now been merged into Intel's Y-series chips, with more mobile chips likely to follow, especially once Intel perfects 10nm.

    As a developer, the biggest reason I don't see ARM replacing the x86 processor is simple: look at the Apple iPad Pro. It claims to be a desktop replacement, but it still requires (as far as I know) a Mac to create code for it.

    Also, take the latest game technology - say, "Rise of the Tomb Raider." Can that run on an ARM machine? The answer is no. But it can run on the Xbox One and PS4, and those are not ARM CPUs.
    Yes, the GPU plays a big part in this - but so does the CPU, which runs the code that drives the GPU and the game.
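The instruction "cracking" described above can be sketched with a toy decoder: a single CISC-style read-modify-write add against memory is decomposed into RISC-like load / add / store micro-ops, roughly the way a modern x86 front end feeds its out-of-order core. All mnemonics and the micro-op format here are invented for illustration, not real x86 encodings:

```python
# Toy illustration of CISC instruction cracking: a single
# memory-operand ADD is decomposed into three RISC-like micro-ops.
# Mnemonics and tuple format are made up for this sketch.

def crack(instruction):
    """Decompose a CISC-style 'ADD [mem], reg' into micro-ops."""
    op, dest, src = instruction
    if op == "ADD_MEM":  # ADD [dest], src: read-modify-write memory
        return [
            ("LOAD", "tmp", dest),    # tmp <- memory[dest]
            ("ADD", "tmp", src),      # tmp <- tmp + src
            ("STORE", dest, "tmp"),   # memory[dest] <- tmp
        ]
    return [instruction]              # simple ops pass through unchanged

# One architectural instruction becomes three schedulable micro-ops.
for uop in crack(("ADD_MEM", 0x1000, "rax")):
    print(uop)
```

Once cracked, the load can be scheduled independently of older work, which is how a CISC front end gets the scheduling flexibility the comment attributes to RISC.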
  • anymous456 - Friday, April 20, 2018 - link

    Take a look at the Geekbench single-core scores of the 2015 MacBook Pro 15-inch (which supposedly plays Rise of the Tomb Raider fairly decently) vs. the A11 in the iPhone X. Just because ARM traditionally has less power and heat-dissipation headroom to work with does not mean that it is architecturally weaker; in fact, the Qualcomm Centriq 2400 can supposedly offer the same performance as an Intel Xeon while using less power.
  • HStewart - Friday, April 20, 2018 - link

    Geekbench, in my opinion, is one of the worst benchmarks out there - especially the part with web benchmarks. In any case, the 2015 MacBook Pro is Intel-based, not ARM.

    The Qualcomm chip also has 48 cores - and what kind of tests? I am pretty sure that on tests requiring vector math it will come nowhere near the Xeon. Maybe for some web processing that requires no real computation it can handle it.

    Of course, we all went through yet another claim of Windows on ARM emulation - with less power than an Atom - so why didn't they just name it Windows RT?

    ARM uses simpler instructions, so of course it uses less power, and for some web services that is fine.
