Intel Shuts Down New Devices Group: No More Intel-Made Wearables
by Anton Shilov on April 20, 2018 11:00 AM EST - Posted in
- Wearables
- Peripherals
- Intel
- Atom
- Basis
Intel this week confirmed that it had decided to close down its New Devices Group, which developed various wearable electronics, such as smartwatches, health/fitness monitors, smart/AR glasses, and so on. The group was created five years ago by then-incoming CEO Brian Krzanich, who wanted to ensure that Intel’s chips would be inside millions of emerging devices. While wearables have become relatively popular, their adoption remains far below that of smartphones. Meanwhile, wearables made by Intel have never been among the market's bestsellers. Thus, the chip giant is pulling the plug.
Over the five-year history of NDG, Intel made two significant acquisitions to bring the necessary expertise to the group: the company took over Basis (a maker of fitness watches) in 2014 and Recon (a maker of wearable heads-up displays) in 2015. Most recently, Intel’s NDG showcased its Vaunt (aka Superlight) smart glasses, which looked like “normal” glasses yet used laser beams to project information onto the retina, justifying their “smart” moniker. While NDG had cutting-edge technologies, the group never managed to produce a truly popular product. Moreover, when problems with one of its Basis smartwatches showed up on a limited number of devices, Intel preferred to stop sales and refund customers rather than fix the problems and replace the faulty units.
In the second half of 2015, Intel folded the New Devices Group into the New Technology Group, a signal that the company was hardly satisfied with NDG’s performance. Since then, we have seen multiple reports about layoffs at Intel’s NDG and have heard multiple rumors that the unit would be axed. Because making actual devices has never come naturally to Intel, it was only a matter of time before the chip giant pulled the plug, and it apparently decided to do so this month.
Since Intel’s New Technology Group remains in place, all of Intel’s ongoing research projects for smart devices remain intact. More importantly, Intel’s other divisions continue to work on products for wearables and ultra-low-power devices that are expected to become widespread in the looming 5G era. The only products that are not going to see the light of day are those designed by Intel’s New Devices Group (e.g., the Vaunt glasses). Considering that none of NDG’s products ever became popular, it is unclear whether they will be missed.
It is noteworthy that Intel canned its Galileo, Joule, and Edison product lines aimed at the Internet of Things last summer.
Source: CNBC
55 Comments
mode_13h - Sunday, April 22, 2018 - link
I wasn't arguing CISC vs. RISC, but I'm amused by the historical fiction involving C. If there's anything to it, you shouldn't have difficulty finding sources to cite.

> do you really think that Intel could have built that RISC/X86 machine if they hadn't gone through the Itanium learning curve?
In a word... yes. Not least because the Pentium Pro (their first architecture to translate x86 to RISC micro-ops) launched 6 years before it. And if THAT was substantially influenced by anything else they did, I'd first look to the i860 and i960. Not that I have any evidence to support that it was, but at least my speculation is both qualified and not refuted by basic causality.
mode_13h - Sunday, April 22, 2018 - link
The thing is, your premise is just wrong:

> writing user level applications in a RISC assembler was always a non-starter.
I don't think user-level apps should've been written in asm since ... I don't know exactly when. But there's nothing especially bad about RISC assembler. I've written RISC assembly on the job, and you can find plenty of it kicking around in kernels and device drivers. You can make life easier with macros and subroutines, as with any assembly language.
Perhaps you're confusing it with VLIW, because there it's legitimately hard to write any substantial quantity of efficient code by hand. You can't even use tiny macros or small subroutines if you care about keeping your instruction slots and pipelines filled. And then all of the registers you have to juggle to keep everything in flight make the exercise especially tedious. Any time you need to add a feature or fix a bug, you get to reschedule the entire block and redo the register allocation.
Are you sure you weren't thinking of VLIW? But that didn't even really hit the scene until after RISC went superscalar and out-of-order, at which point people started thinking it might be a good idea to do the scheduling at compile-time. Again, this was so long after C was already established that C might've been a prerequisite, but you can't call it a game-changer.
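For what it's worth, the "scheduling at compile-time" idea boils down to the compiler packing independent operations into fixed-width issue bundles. Below is a minimal, hypothetical Python sketch of that kind of greedy bundle packing — a toy three-slot machine with a made-up Op type and example program, not any real VLIW toolchain — just to show why slot limits and dependencies make this tedious bookkeeping to redo by hand every time the code changes.

```python
# Toy sketch of compile-time VLIW scheduling: pack independent operations
# into fixed-width bundles while respecting read-after-write dependencies.
# Everything here (Op, BUNDLE_WIDTH, the 5-op program) is made up for illustration.
from dataclasses import dataclass

BUNDLE_WIDTH = 3  # issue slots per bundle on our imaginary machine

@dataclass
class Op:
    name: str         # mnemonic, for printing only
    dst: str          # register this op writes
    srcs: tuple = ()  # registers this op reads

def schedule(ops):
    """Greedy list scheduling: an op may issue only once every register it
    reads has been written by an op in an *earlier* bundle."""
    unscheduled = list(ops)
    not_yet_available = {op.dst for op in unscheduled}  # results still pending
    bundles = []
    while unscheduled:
        bundle = []
        for op in list(unscheduled):
            if len(bundle) == BUNDLE_WIDTH:
                break  # all slots in this bundle are taken
            if all(src not in not_yet_available for src in op.srcs):
                bundle.append(op)
                unscheduled.remove(op)
        if not bundle:
            raise ValueError("unsatisfiable dependencies")
        for op in bundle:
            not_yet_available.discard(op.dst)  # visible from the next bundle on
        bundles.append(bundle)
    return bundles

if __name__ == "__main__":
    program = [
        Op("load a", "r1"),
        Op("load b", "r2"),
        Op("add",    "r3", ("r1", "r2")),
        Op("load c", "r4"),
        Op("mul",    "r5", ("r3", "r4")),
    ]
    for i, bundle in enumerate(schedule(program)):
        print(f"bundle {i}: " + " | ".join(op.name for op in bundle))
```

A real compiler additionally has to allocate registers and model latencies across bundles; a hand-coder gets to redo all of that every time a feature is added or a bug is fixed, which is the point above.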
FunBunny2 - Sunday, April 22, 2018 - link
"Are you sure you weren't thinking of VLIW?"no. Intel's 8XXX chips grew from the late 60s, a time when much code really was written in assembler (C hadn't yet fully escaped Bell Labs). said assemblers were CISC. the later IBM 360 (or 370, can't find a link) even added COBOL assist instructions to the ISA. in today's terms, COBOL and FORTRAN were DSLs, not real programming languages. real coders used assembler, and had since the beginning of electronic computing. RISC came about just because ISAs/assemblers had gotten unwieldy, real estate hungry, and sloooooooow. one might argue that V(VVVVVVV)LSI is what made assembler passe`. memory, transistor, and speed budgets that were not imagined in 1960. if you can be profligate with resources, then the application building paradigm shifts.
anyone who used 1-2-3 through the transition from pure assembly to C saw the difference. if memory (mine, not this computer's) serves, that fact generated lots o PC upgrades.
or, to ask the question from the other end: if you expect your machine to only run 3/4GL, why would you need CISC in the first place? application coders will never be on the metal. the compiler/OS writers need to understand the machine, but nobody else does.
mode_13h - Sunday, April 22, 2018 - link
There were plenty of other programming languages back then. Lisp dates back to 1958; SNOBOL to 1962. It's pretty remarkable how quickly new languages developed and gained sophistication.

You talk like C was the only game in town. Sure, if you're writing an OS, it was going to be C or asm or maybe a small handful of other options (Mac OS was written in Pascal, which dates back to 1970; Multics - the inspiration for UNIX - used PL/I).
I'm not exactly a programming language historian, but I'm just not buying the idea that CPU designers were building out their instruction sets because programmers lacked better tools and were too lazy to write subroutines or use macros. I think they did it simply because each time they got more transistors to play with, they tried to speed up programs by implementing ever higher level functionality in hardware.
StevoLincolnite - Friday, April 20, 2018 - link
Pretty sure modern x86 processors are all RISC these days internally anyway.

HStewart - Friday, April 20, 2018 - link
Actually, CISC and RISC both eventually come down to microcode.

mode_13h - Saturday, April 21, 2018 - link
No. You can't turn one into the other simply by replacing the microcode.

More troll-worthy nonsense.
FunBunny2 - Saturday, April 21, 2018 - link
"No. You can't turn one into the other simply by replacing the microcode."but you can by swapping the "decoder". that's the whole point of RISC on the hardware. calling a "micro code engine" is just obfuscation. it's a RISC machine. whether any (past or current) X86 machines shared any specific hardware with Itanium I leave to the discussion.
for those old enough you know that the 360/30, bottom end of the family, implemented the ISA purely in firmware/microcode/whatever. that was 1965. https://en.wikipedia.org/wiki/IBM_System/360_Model...
mode_13h - Sunday, April 22, 2018 - link
I wouldn't over-generalize from the example of modern x86.

Wilco1 - Sunday, April 22, 2018 - link
No, an ISA is not implemented just in the decoder, so you can't swap the decoder and implement a different ISA. The ISA affects *everything* - the registers, ALUs, flags, control logic, caches, memory model, etc. Just think about it for one second. It's simply impossible unless the ISAs are virtually identical (think Arm and Thumb-2).

Calling a CISC a RISC machine is plain wrong - RISC vs CISC is about the user-visible ISA, not about the internal implementation. Micro-ops on modern implementations are very similar to actual instructions. There are complex micro-ops which take multiple cycles to execute.